
WARNING: LLM_local.original_IS_CHANGED() got an unexpected keyword argument 'system_prompt' System role not supported #142

Open
emircanerkul opened this issue Jan 14, 2025 · 3 comments

Comments

@emircanerkul

WARNING: LLM_local.original_IS_CHANGED() got an unexpected keyword argument 'system_prompt'
System role not supported

I am trying to use models/LLM/gemma-2-27b-it-Q5_K_L.gguf, but there is no option in the widget to disable the system_prompt input.
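For what it's worth, the warning itself is the standard Python error raised when a keyword argument is passed that the callee does not declare. A minimal sketch of the mismatch, with hypothetical function names standing in for the real node code:

```python
# Hypothetical reproduction of the warning's cause (names are illustrative,
# not the actual comfyui_LLM_party code): a caller passes system_prompt as
# a keyword argument, but the wrapped IS_CHANGED function does not declare
# that parameter.

def original_is_changed(model, prompt):
    return hash((model, prompt))

try:
    original_is_changed(model="gemma", prompt="hi", system_prompt="be nice")
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'system_prompt'

# Declaring **kwargs makes the signature tolerant of extra widget inputs,
# which is one common way this class of warning is avoided:
def tolerant_is_changed(model, prompt, **kwargs):
    return hash((model, prompt))

tolerant_is_changed(model="gemma", prompt="hi", system_prompt="be nice")
```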

@heshengtao
Owner

I'm very sorry. There is a lot of code related to the system role in my nodes, scattered across the repository. Although I really want to solve this problem, I'm afraid an update would break many people's workflows. This problem is genuinely quite challenging.

@emircanerkul
Author

I see. Anyway, @heshengtao, thank you for all of comfyui_LLM_party.

@heshengtao
Owner

I tried adding an is_enable_system_role attribute to the local universal link node. When it is disabled, the system role is replaced with user before the input is passed to the model, and swapped back to system on output. This minimizes the risk of the node becoming unusable because of the system replacement. You can try it and see whether it runs normally.
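A minimal sketch of the swap described above, assuming OpenAI-style message dicts; the helper names and the `_was_system` marker are hypothetical, not the actual node code:

```python
# Sketch of the workaround: models such as Gemma reject the "system" role,
# so downgrade it to "user" before inference and restore it afterwards.
# Helper names and the _was_system marker are illustrative assumptions.

def downgrade_system_role(messages):
    """Replace 'system' with 'user', remembering which messages were changed."""
    out = []
    for msg in messages:
        if msg.get("role") == "system":
            msg = dict(msg, role="user", _was_system=True)
        out.append(msg)
    return out

def restore_system_role(messages):
    """Swap the role back to 'system' for the messages changed earlier."""
    out = []
    for msg in messages:
        msg = dict(msg)
        if msg.pop("_was_system", False):
            msg["role"] = "system"
        out.append(msg)
    return out

history = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hello"},
]
safe = downgrade_system_role(history)   # no "system" role remains
restored = restore_system_role(safe)    # original roles come back
```

Round-tripping through the two helpers leaves ordinary user/assistant messages untouched, so existing workflows that never use a system prompt are unaffected.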
