Add the ability to start Ollama when it's stopped #3653

Open · wants to merge 4 commits into base: main
Changes from 3 commits
9 changes: 7 additions & 2 deletions core/index.d.ts
@@ -630,7 +630,7 @@ export interface IDE {

openUrl(url: string): Promise<void>;

-  runCommand(command: string): Promise<void>;
+  runCommand(command: string, options?: TerminalOptions): Promise<void>;

saveFile(fileUri: string): Promise<void>;

@@ -1230,4 +1230,9 @@ export type PackageDocsResult = {
} & (
| { error: string; details?: never }
| { details: PackageDetailsSuccess; error?: never }
);

export interface TerminalOptions {
  reuseTerminal?: boolean;
  terminalName?: string;
}
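The new optional `options` argument can be exercised like this — a minimal sketch against a stubbed IDE (the `FakeIde` class and `demo` function are mine, for illustration only; they are not part of the PR):

```typescript
// Sketch: how a caller might pass the new TerminalOptions parameter.
interface TerminalOptions {
  reuseTerminal?: boolean;
  terminalName?: string;
}

// Hypothetical stand-in for the IDE interface, recording the last call.
class FakeIde {
  lastCommand = "";
  lastOptions?: TerminalOptions;

  async runCommand(command: string, options?: TerminalOptions): Promise<void> {
    this.lastCommand = command;
    this.lastOptions = options;
  }
}

async function demo(): Promise<string | undefined> {
  const ide = new FakeIde();
  // Reuse a named terminal so repeated starts don't pile up new ones.
  await ide.runCommand("ollama serve\n", {
    reuseTerminal: true,
    terminalName: "Start Ollama",
  });
  return ide.lastOptions?.terminalName;
}
```

Both fields stay optional, so existing `runCommand(command)` call sites compile unchanged.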
9 changes: 6 additions & 3 deletions core/llm/index.ts
@@ -55,6 +55,7 @@ import {
toCompleteBody,
toFimBody,
} from "./openaiTypeConverters.js";
import { isOllamaInstalled } from "../util/ollamaHelper.js";

export abstract class BaseLLM implements ILLM {
static providerName: string;
@@ -414,9 +415,11 @@ export abstract class BaseLLM implements ILLM {
e.code === "ECONNREFUSED" &&
e.message.includes("http://127.0.0.1:11434")
) {
-        throw new Error(
-          "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai.",
-        );
+        const message = (await isOllamaInstalled()) ?
+          "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
Review comment: Three sentences feels a bit hard to read, and reads a bit abruptly. Reducing it down to two sentences would be easier to digest.

Suggested change:
-          "Failed to connect to local Ollama instance. Ollama appears to be stopped. It needs to be running." :
+          "Unable to connect to local Ollama instance. Ollama may not be running." :

+          "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai."
Review comment: The notification text is a bit long here. Personally I'd drop the download link - it's too much detail for a notification and you'd hope that someone could figure out what to do.

Suggested change:
-          "Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai."
+          "Unable to connect to local Ollama instance. Ollama may not be installed or may not be running."

;
throw new Error(message);
}
}
throw new Error(e.message);
3 changes: 2 additions & 1 deletion core/protocol/ide.ts
@@ -14,6 +14,7 @@ import type {
Range,
RangeInFile,
Thread,
TerminalOptions
} from "../";

export interface GetGhTokenArgs {
@@ -28,7 +29,7 @@ export type ToIdeFromWebviewOrCoreProtocol = {
showVirtualFile: [{ name: string; content: string }, void];
openFile: [{ path: string }, void];
openUrl: [string, void];
-  runCommand: [{ command: string }, void];
+  runCommand: [{ command: string; options?: TerminalOptions }, void];
getSearchResults: [{ query: string }, string];
subprocess: [{ command: string; cwd?: string }, [string, string]];
saveFile: [{ filepath: string }, void];
5 changes: 3 additions & 2 deletions core/protocol/messenger/messageIde.ts
@@ -13,6 +13,7 @@ import type {
Problem,
Range,
RangeInFile,
TerminalOptions,
Thread,
} from "../..";

@@ -151,8 +152,8 @@ export class MessageIde implements IDE {
await this.request("openUrl", url);
}

-  async runCommand(command: string): Promise<void> {
-    await this.request("runCommand", { command });
+  async runCommand(command: string, options?: TerminalOptions): Promise<void> {
+    await this.request("runCommand", { command, options });
}

async saveFile(fileUri: string): Promise<void> {
3 changes: 2 additions & 1 deletion core/util/filesystem.ts
@@ -13,6 +13,7 @@ import {
Problem,
Range,
RangeInFile,
TerminalOptions,
Thread,
ToastType,
} from "../index.js";
@@ -190,7 +191,7 @@ class FileSystemIde implements IDE {
return Promise.resolve();
}

-  runCommand(command: string): Promise<void> {
+  runCommand(command: string, options?: TerminalOptions): Promise<void> {
return Promise.resolve();
}

41 changes: 41 additions & 0 deletions core/util/ollamaHelper.ts
@@ -0,0 +1,41 @@
import path from "node:path";
import { IDE } from "..";
import { exec } from "node:child_process";

export async function isOllamaInstalled(): Promise<boolean> {
return new Promise((resolve, _reject) => {
const command = process.platform === "win32" ? "where.exe ollama" : "which ollama";
exec(command, (error, _stdout, _stderr) => {
resolve(!error);
});
});
}

export async function startLocalOllama(ide: IDE): Promise<void> {
let startCommand: string | undefined;

  switch (process.platform) {
    case "darwin": // macOS
      startCommand = "open -a Ollama.app\n";
      break;

    case "win32": // Windows
      startCommand = `& "ollama app.exe"\n`;
      break;

    default: { // Linux...
      const startScriptPath = path.resolve(__dirname, "./start_ollama.sh");
      // file URIs need the empty-authority form: file:///abs/path
      if (await ide.fileExists(`file://${startScriptPath}`)) {
        startCommand = `set -e && chmod +x ${startScriptPath} && ${startScriptPath}\n`;
        console.log(`Ollama Linux startup script at: ${startScriptPath}`);
      } else {
        void ide.showToast("error", `Cannot start Ollama: could not find ${startScriptPath}!`);
      }
    }
  }
if (startCommand) {
return ide.runCommand(startCommand, {
reuseTerminal: true,
terminalName: "Start Ollama"
});
}
}
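The `which`/`where.exe` probe in `isOllamaInstalled` can be sketched in isolation — this standalone version generalizes it to any binary name (the `findOnPath` name is mine, not the PR's):

```typescript
import { exec } from "node:child_process";

// Resolves true when `name` is found on PATH; never rejects, since a
// non-zero exit from which/where.exe simply means "not found".
function findOnPath(name: string): Promise<boolean> {
  const command =
    process.platform === "win32" ? `where.exe ${name}` : `which ${name}`;
  return new Promise((resolve) => {
    exec(command, (error) => resolve(!error));
  });
}
```

Resolving instead of rejecting keeps the caller's error-message branch (`await isOllamaInstalled() ? … : …`) free of try/catch.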
84 changes: 84 additions & 0 deletions core/util/start_ollama.sh
@@ -0,0 +1,84 @@
#!/bin/bash

# Timeout for waiting for Ollama to start (in seconds)
readonly TIMEOUT=60
readonly OLLAMA_API_URL="http://localhost:11434/api/version"

# Check if a command exists
command_exists() {
command -v "$1" >/dev/null 2>&1
}

# Check if Ollama is already running
ollama_is_running() {
VERSION=$(curl -s --fail --max-time 1 "$OLLAMA_API_URL" | grep '"version"' | awk -F: '{ print $2 }' | sed -e 's/.*"\([^"]*\)".*/\1/')
test -n "$VERSION"
}

# Determine if running in a container environment
in_container() {
[[ -f /.dockerenv || -f /run/.containerenv ]]
}

# Start Ollama service or background process
start_ollama() {
local start_method="${1:-background}"
echo "Starting Ollama ($start_method)..."

if [[ "$start_method" == "service" ]]; then
if in_container; then
service ollama start || return $?
else
sudo systemctl start ollama || return $?
fi
else
nohup ollama serve >/dev/null 2>&1 &
fi

# Wait for Ollama to start. TIMEOUT * 4, since we sleep 1/4 sec on each iteration
  for i in $(seq 1 $((TIMEOUT * 4))); do
    if ollama_is_running; then
      echo -e "\nOllama started successfully."
      return 0
    fi
    # Use a named loop variable: bash overwrites $_ after every command
    (( i % 2 == 0 )) && printf "."
    sleep 0.25
  done

echo -e "\nTimeout: Failed to start Ollama."
return 1
}

# Main script execution
main() {
# Early exit if Ollama is already running
if ollama_is_running; then
echo "Ollama is already running."
return 0
fi

# Try starting via service if possible
if in_container; then
if service --status-all 2>&1 | grep -qw 'ollama'; then
start_ollama service || return $?
return 0
fi
elif command_exists systemctl; then
if systemctl list-unit-files ollama.service >/dev/null 2>&1; then
start_ollama service || return $?
return 0
fi
fi

# Fallback to background process
if command_exists ollama; then
start_ollama || return 1
else
echo "Error: Ollama is not installed or not in the PATH." >&2
return 1
fi
}

# Run the main function and exit with its status
main
exit $?
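The script's wait loop — poll `/api/version` until it answers or a timeout elapses — translates to a small TypeScript helper. This is a sketch of the same readiness check, not code from the PR; the function name and defaults are assumptions:

```typescript
// Poll an HTTP endpoint until it responds OK, or give up after timeoutMs.
// Mirrors start_ollama.sh: TIMEOUT=60s total, sleeping 250ms per iteration,
// with each probe capped at 1s (the script's `curl --max-time 1`).
async function waitForEndpoint(
  url: string,
  timeoutMs = 60_000,
  intervalMs = 250,
): Promise<boolean> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(url, { signal: AbortSignal.timeout(1_000) });
      if (res.ok) return true;
    } catch {
      // Not up yet (connection refused or probe timed out); keep waiting.
    }
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  return false;
}
```

Usage would be `await waitForEndpoint("http://localhost:11434/api/version")` after launching the server.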
4 changes: 4 additions & 0 deletions extensions/vscode/scripts/prepackage-cross-platform.js
@@ -25,6 +25,7 @@ const {
installNodeModuleInTempDirAndCopyToCurrent,
downloadSqliteBinary,
copyTokenizers,
copyScripts,
} = require("./utils");

// Clear folders that will be packaged to ensure clean slate
@@ -108,6 +109,9 @@ async function package(target, os, arch, exe) {
// copy llama tokenizers to out
copyTokenizers();

// Copy Linux scripts
await copyScripts();

// *** Install @lancedb binary ***
const lancePackageToInstall = {
"darwin-arm64": "@lancedb/vectordb-darwin-arm64",
1 change: 1 addition & 0 deletions extensions/vscode/scripts/prepackage.js
@@ -222,6 +222,7 @@ const exe = os === "win32" ? ".exe" : "";
"../../../core/llm/llamaTokenizerWorkerPool.mjs",
"../../../core/llm/llamaTokenizer.mjs",
"../../../core/llm/tiktokenWorkerPool.mjs",
"../../../core/util/start_ollama.sh",
];

for (const f of filesToCopy) {
21 changes: 21 additions & 0 deletions extensions/vscode/scripts/utils.js
@@ -505,6 +505,26 @@ async function installNodeModuleInTempDirAndCopyToCurrent(packageName, toCopy) {
}
}

async function copyScripts() {
process.chdir(path.join(continueDir, "extensions", "vscode"));
console.log("[info] Copying scripts from core");
await new Promise((resolve, reject) => {
ncp(
path.join(__dirname, "../../../core/scripts"),
path.join(__dirname, "../out"),
{ dereference: true },
(error) => {
if (error) {
console.warn("[error] Error copying script files", error);
reject(error);
} else {
resolve();
}
},
);
});
}

module.exports = {
copyConfigSchema,
installNodeModules,
@@ -519,4 +539,5 @@ module.exports = {
downloadSqliteBinary,
downloadRipgrepBinary,
copyTokenizers,
copyScripts,
};
25 changes: 15 additions & 10 deletions extensions/vscode/src/VsCodeIde.ts
@@ -5,9 +5,9 @@
import { EXTENSION_NAME } from "core/control-plane/env";
import { GetGhTokenArgs } from "core/protocol/ide";
import { editConfigJson, getConfigJsonPath } from "core/util/paths";
import * as vscode from "vscode";

import * as URI from "uri-js";

import { executeGotoProvider } from "./autocomplete/lsp";
import { Repository } from "./otherExtensions/git";
import { VsCodeIdeUtils } from "./util/ideUtils";
@@ -25,6 +25,7 @@
Location,
Problem,
RangeInFile,
TerminalOptions,
Thread,
} from "core";

@@ -398,24 +399,28 @@
);
}

-  async runCommand(command: string): Promise<void> {
-    if (vscode.window.terminals.length) {
-      const terminal =
-        vscode.window.activeTerminal ?? vscode.window.terminals[0];
-      terminal.show();
-      terminal.sendText(command, false);
-    } else {
-      const terminal = vscode.window.createTerminal();
-      terminal.show();
-      terminal.sendText(command, false);
-    }
-  }
+  async runCommand(
+    command: string,
+    options: TerminalOptions = { reuseTerminal: true },
+  ): Promise<void> {
+    let terminal: vscode.Terminal | undefined;
+    if (vscode.window.terminals.length && options.reuseTerminal) {
+      if (options.terminalName) {
+        terminal = vscode.window.terminals.find(
+          (t) => t?.name === options.terminalName,
+        );
+      } else {
+        terminal = vscode.window.activeTerminal ?? vscode.window.terminals[0];
+      }
+    }
+
+    if (!terminal) {
+      terminal = vscode.window.createTerminal(options?.terminalName);
+    }
+    terminal.show();
+    terminal.sendText(command, false);
+  }
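The terminal-selection policy in `runCommand` reduces to a pure function, which makes it easy to reason about outside VS Code. Here `NamedTerminal` is a minimal stand-in for `vscode.Terminal` and `pickTerminal` is a hypothetical name — a sketch of the logic, not the PR's code:

```typescript
interface NamedTerminal {
  name: string;
}

interface SelectOptions {
  reuseTerminal?: boolean;
  terminalName?: string;
}

// Returns an existing terminal to reuse, or undefined to signal
// "create a new one" — the same branching as the VsCodeIde change.
function pickTerminal(
  terminals: NamedTerminal[],
  active: NamedTerminal | undefined,
  options: SelectOptions,
): NamedTerminal | undefined {
  if (!terminals.length || !options.reuseTerminal) {
    return undefined;
  }
  if (options.terminalName) {
    // A requested name that matches nothing also falls through to "create".
    return terminals.find((t) => t.name === options.terminalName);
  }
  return active ?? terminals[0];
}
```

Note the default `{ reuseTerminal: true }` preserves the old behavior of reusing the active terminal when no options are passed.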

async saveFile(fileUri: string): Promise<void> {
await this.ideUtils.saveFile(vscode.Uri.parse(fileUri));
}

private static MAX_BYTES = 100000;

async readFile(fileUri: string): Promise<string> {
try {
5 changes: 5 additions & 0 deletions extensions/vscode/src/autocomplete/completionProvider.ts
@@ -5,14 +5,14 @@
} from "core/autocomplete/util/types";
import { ConfigHandler } from "core/config/ConfigHandler";
import { v4 as uuidv4 } from "uuid";
import * as vscode from "vscode";

const Diff = require("diff");

import { showFreeTrialLoginMessage } from "../util/messages";
import { VsCodeWebviewProtocol } from "../webviewProtocol";

import { getDefinitionsFromLsp } from "./lsp";
import { RecentlyEditedTracker } from "./recentlyEdited";
import {
StatusBarStatus,
getStatusBarStatus,
@@ -23,6 +23,7 @@

import type { IDE } from "core";
import type { TabAutocompleteModel } from "../util/loadAutocompleteModel";
import { startLocalOllama } from "core/util/ollamaHelper";

interface DiffType {
count: number;
@@ -44,6 +45,8 @@
const options = ["Documentation"];
if (e.message.includes("https://ollama.ai")) {
options.push("Download Ollama");
} else if (e.message.includes("Ollama appears to be stopped")) {
options.unshift("Start Ollama"); // We want "Start" to be the default choice
}

if (e.message.includes("Please sign in with GitHub")) {
@@ -65,6 +68,8 @@
);
} else if (val === "Download Ollama") {
vscode.env.openExternal(vscode.Uri.parse("https://ollama.ai/download"));
    } else if (val === "Start Ollama") {
      void startLocalOllama(this.ide);
}
});
}