
Add the ability to start Ollama when it's stopped #3653

Open
wants to merge 4 commits into main

Conversation

fbricon (Contributor) commented Jan 8, 2025

Fixes #3318

Contributing the ollama_start.sh script from vscode-paver.

[Screenshot: 2025-01-08 at 20:04:04]

Checked it works from a vsix on all platforms:

  • Mac
  • Linux
  • Windows
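To illustrate the feature being contributed, here is a minimal sketch of a cross-platform "start Ollama" helper. The per-platform commands are assumptions for illustration only, not the actual contents of ollama_start.sh; `startCommandFor` and `startOllama` are hypothetical names.

```typescript
import { spawn } from "child_process";

// Hypothetical helper: pick a launch command per platform.
// (Command choices are assumptions, not the actual ollama_start.sh logic.)
export function startCommandFor(platform: string): string[] {
  if (platform === "darwin") {
    // On macOS the desktop app owns the daemon, so launch the app bundle.
    return ["open", "-a", "Ollama"];
  }
  // On Linux and Windows the CLI can start the daemon directly.
  return ["ollama", "serve"];
}

export function startOllama(): void {
  const [cmd, ...args] = startCommandFor(process.platform);
  // Detach so the daemon outlives the editor process that spawned it.
  const child = spawn(cmd, args, { detached: true, stdio: "ignore" });
  child.unref();
}
```

Detaching and calling `unref()` matters here: without it, closing the editor could take the freshly started daemon down with it.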

Signed-off-by: Fred Bricon <fbricon@gmail.com>

netlify bot commented Jan 8, 2025

Deploy Preview for continuedev canceled.

Latest commit: bbd5d3d
Latest deploy log: https://app.netlify.com/sites/continuedev/deploys/677fefc38d05a20008b9705e

tomasz-stefaniak (Collaborator) commented

@fbricon thanks for opening the PR! 🚀

I believe the e2e tests are failing because you don't have access to the env variables team members have access to.

Let me know whenever this PR is ready for review and I'll run the tests to make sure they pass. In the meantime, you might be able to run them locally.

fbricon (Contributor, Author) commented Jan 8, 2025

It was almost ready, until I found a bug in ide.runCommand :-/ It reuses a terminal that might still be busy (e.g. running a tsc:watch task). I want to change the runCommand API to add an optional options object, where I can declare that we need to use a specific terminal.
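A sketch of the kind of API extension described above. The option names (`RunCommandOptions`, `terminalName`, `reuseTerminal`) are hypothetical, not the final shape merged into the extension:

```typescript
// Hypothetical options bag for runCommand: lets a caller request a
// dedicated terminal instead of reusing one that may be busy
// (e.g. running a tsc:watch task).
interface RunCommandOptions {
  reuseTerminal?: boolean; // reuse an existing idle terminal when true
  terminalName?: string;   // find or create a terminal with this name
}

function resolveTerminalName(options?: RunCommandOptions): string {
  // Fall back to a shared default terminal when the caller does not care.
  return options?.terminalName ?? "default";
}
```

With something like this, the Ollama start command could ask for its own named terminal and avoid clobbering a watch task.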

fbricon marked this pull request as ready for review on January 8, 2025 at 19:12
fbricon (Contributor, Author) commented Jan 8, 2025

@tomasz-stefaniak It should be ready for review. I'll still build a vsix on Windows to double-check tomorrow, but since it works on Mac and Linux already, I'm fairly confident about it. Famous last words ;-)

      "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai.",
    );
    const message = (await isOllamaInstalled()) ?
      "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
Three sentences feel a bit hard to read and come across a bit abruptly. Reducing this to two sentences would be easier to digest.

Suggested change:

    -  "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
    +  "Unable to connect to local Ollama instance. Ollama may not be running." :

    );
    const message = (await isOllamaInstalled()) ?
      "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
      "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai."

The notification text is a bit long here. Personally I'd drop the download link: it's too much detail for a notification, and you'd hope that someone could figure out what to do.

Suggested change:

    -  "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai."
    +  "Unable to connect to local Ollama instance. Ollama may not be installed or may not be running."

fbricon (Contributor, Author) commented Jan 9, 2025

Ollama detection now works on Windows as well.
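One way to make install detection portable is to avoid `which` vs `where` differences entirely and just probe the binary itself. This is a hedged sketch, not the PR's actual implementation; `isBinaryInstalled` is a hypothetical name:

```typescript
import { execFile } from "child_process";

// Hypothetical cross-platform install check: run the binary with a
// harmless version query and treat any spawn/exit error as "not installed".
function isBinaryInstalled(binary: string): Promise<boolean> {
  return new Promise((resolve) => {
    execFile(binary, ["--version"], (error) => resolve(!error));
  });
}

// Example: const installed = await isBinaryInstalled("ollama");
```

`execFile` fails with ENOENT when the binary is not on PATH, which the callback surfaces as a non-null error on every platform, so no OS-specific lookup command is needed.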

Updated the error messages as per @allanday's suggestions:

[Screenshots: updated error notifications, 2025-01-09]

@tomasz-stefaniak @sestinj feel free to review

Successfully merging this pull request may close these issues.

When ollama is stopped, propose to start it, instead of downloading it