Using open-webui as backend, this is my config file:
CHAT_CACHE_PATH=/tmp/chat_cache
CACHE_PATH=/tmp/cache
CHAT_CACHE_LENGTH=100
CACHE_LENGTH=100
REQUEST_TIMEOUT=60
DEFAULT_MODEL=llama3.1:latest
DEFAULT_COLOR=magenta
ROLE_STORAGE_PATH=/home/peng/.config/shell_gpt/roles
DEFAULT_EXECUTE_SHELL_CMD=false
DISABLE_STREAMING=false
CODE_THEME=dracula
OPENAI_FUNCTIONS_PATH=/home/peng/.config/shell_gpt/functions
OPENAI_USE_FUNCTIONS=false
SHOW_FUNCTIONS_OUTPUT=false
API_BASE_URL=http://XX.XX.XX.XX:3000/api
PRETTIFY_MARKDOWN=true
USE_LITELLM=false
SHELL_INTERACTION=true
OS_NAME=auto
SHELL_NAME=auto
OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXX
Running a simple command, I get this error:
It seems to come from this script:
shell_gpt/sgpt/handlers/handler.py
Line 121 in aac2f54
This sounds related and could point to a fix, since open-webui's streamed responses apparently don't include a delta:
run-llama/llama_index#16570 (comment)
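If that is indeed the cause, a defensive guard in the streaming loop might help. The sketch below is only an illustration, not shell_gpt's actual handler code: the `Delta`/`Choice`/`Chunk` dataclasses are stand-ins for the OpenAI SDK's streamed response objects, and `stream_text` is a hypothetical helper showing how chunks with missing choices or deltas could be skipped instead of raising.

```python
# Hypothetical sketch: tolerate streamed chunks that lack choices or a
# delta (as open-webui reportedly emits), rather than assuming every
# chunk carries delta.content. Dataclasses below are mock stand-ins
# for the OpenAI SDK's streaming response objects.
from dataclasses import dataclass
from typing import Iterator, List, Optional


@dataclass
class Delta:
    content: Optional[str] = None


@dataclass
class Choice:
    delta: Optional[Delta] = None


@dataclass
class Chunk:
    choices: List[Choice]


def stream_text(chunks) -> Iterator[str]:
    for chunk in chunks:
        # Guard 1: some backends send chunks with an empty choices list.
        if not chunk.choices:
            continue
        delta = chunk.choices[0].delta
        # Guard 2: a choice may arrive without a delta, or with no content.
        if delta is None or delta.content is None:
            continue
        yield delta.content


# Malformed chunks are silently skipped; well-formed ones stream through.
chunks = [
    Chunk(choices=[Choice(delta=Delta(content="Hello"))]),
    Chunk(choices=[]),                    # no choices at all
    Chunk(choices=[Choice(delta=None)]),  # choice without a delta
    Chunk(choices=[Choice(delta=Delta(content=" world"))]),
]
print("".join(stream_text(chunks)))  # -> Hello world
```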