Add the ability to start Ollama when it's stopped #3653
base: main
Conversation
Signed-off-by: Fred Bricon <fbricon@gmail.com>
@fbricon thanks for opening the PR! 🚀 I believe the e2e tests are failing because you don't have access to the env variables team members have access to. Let me know whenever this PR is ready for review and I'll run the tests to make sure they pass. In the meantime, you might be able to run them locally.
It was almost ready, until I found a bug in ide.runCommand :-/ It reuses a terminal that might still be busy (e.g. running a tsc:watch task). I want to change the runCommand API to add an optional options object, where I can declare that a specific terminal should be used.
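To make the proposed API change concrete, here is a hypothetical sketch of what an optional options object for runCommand could look like. The names (TerminalOptions, reuseTerminal, terminalName, resolveTerminalName) are illustrative assumptions, not the actual Continue API:

```typescript
// Hypothetical sketch: runCommand gains an optional options object so a
// caller can request a dedicated terminal instead of reusing one that may
// still be busy (e.g. running a tsc:watch task). All names are illustrative.
interface TerminalOptions {
  reuseTerminal?: boolean; // default true: reuse the shared terminal
  terminalName?: string;   // request a specific, named terminal
}

// Decide which terminal a command should run in, given the options.
function resolveTerminalName(options?: TerminalOptions): string {
  if (options?.terminalName) {
    return options.terminalName;
  }
  return options?.reuseTerminal === false ? "Continue (dedicated)" : "Continue";
}
```

A caller that must not share a busy terminal would then pass `{ reuseTerminal: false }` or a dedicated `terminalName`.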
Signed-off-by: Fred Bricon <fbricon@gmail.com>
@tomasz-stefaniak It should be ready to review. I'll still build a vsix on Windows to double-check tomorrow, but since it works on Mac and Linux already, I'm fairly confident about it. Famous last words ;-)
Signed-off-by: Fred Bricon <fbricon@gmail.com>
core/llm/index.ts
Outdated
"Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai.",
);
const message = (await isOllamaInstalled()) ?
  "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
Three sentences is a bit hard to read and comes across as abrupt. Reducing it to two sentences would be easier to digest.
Suggested change:
- "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
+ "Unable to connect to local Ollama instance. Ollama may not be running." :
core/llm/index.ts
Outdated
);
const message = (await isOllamaInstalled()) ?
  "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
  "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai."
The notification text is a bit long here. Personally I'd drop the download link - it's too much detail for a notification and you'd hope that someone could figure out what to do.
Suggested change:
- "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai."
+ "Unable to connect to local Ollama instance. Ollama may not be installed or may not be running."
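Taken together, the two review suggestions yield a simple two-way message selection. The sketch below illustrates that logic; `installed` stands in for the PR's async `isOllamaInstalled()` check, simplified to a plain boolean for illustration:

```typescript
// Illustrative sketch of the error-message selection after applying both
// review suggestions. The real code awaits isOllamaInstalled(); here the
// result is passed in as a boolean to keep the example self-contained.
function ollamaConnectionError(installed: boolean): string {
  return installed
    ? "Unable to connect to local Ollama instance. Ollama may not be running."
    : "Unable to connect to local Ollama instance. Ollama may not be installed or may not be running.";
}
```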
Signed-off-by: Fred Bricon <fbricon@gmail.com>
Ollama detection now works on Windows as well. Updated the error messages per @allanday's suggestions. @tomasz-stefaniak @sestinj feel free to review.
Fixes #3318
Contributing the ollama_start.sh script from vscode-paver.
Checked it works from a vsix on all platforms:
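One way an installation check like `isOllamaInstalled()` can work cross-platform is to ask the shell to resolve the binary. This is a hedged sketch under that assumption, not the PR's actual implementation or the ollama_start.sh logic; the `isInstalled` helper is hypothetical:

```typescript
import { execFileSync } from "node:child_process";

// Illustrative sketch (not the PR's actual code): detect whether a binary
// is on the PATH by resolving it with `where` on Windows, `which` elsewhere.
function isInstalled(binary: string): boolean {
  const lookup = process.platform === "win32" ? "where" : "which";
  try {
    execFileSync(lookup, [binary], { stdio: "ignore" });
    return true;
  } catch {
    return false;
  }
}

// Hypothetical wrapper matching the name used in the diff above.
const isOllamaInstalled = (): boolean => isInstalled("ollama");
```

Keeping the lookup in a generic helper makes the platform branch (`where` vs `which`) testable independently of whether Ollama itself is present.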