feat: add support for llamafile as ai model provider (#14281)
Support LLM execution via llamafiles.

- The configured llamafiles are stored in the preferences
- As a user, you have to provide:
  - a custom model name
  - the file URI of the llamafile
  - the port to run the llama.cpp server on
- Llamafiles can be started and stopped via Theia commands
- The llamafile output is sent to a new output channel named <mymodel>-llamafile

The current implementation does not support tools/functions, so agents like the
workspace agent do not work well with models provided via llamafiles.

fixes #14286
eneufeld authored Oct 16, 2024
1 parent 8bff722 commit 7e51b89
Showing 18 changed files with 726 additions and 0 deletions.
1 change: 1 addition & 0 deletions examples/browser/package.json
@@ -27,6 +27,7 @@
"@theia/ai-code-completion": "1.54.0",
"@theia/ai-core": "1.54.0",
"@theia/ai-history": "1.54.0",
"@theia/ai-llamafile": "1.54.0",
"@theia/ai-ollama": "1.54.0",
"@theia/ai-openai": "1.54.0",
"@theia/ai-terminal": "1.54.0",
3 changes: 3 additions & 0 deletions examples/browser/tsconfig.json
@@ -23,6 +23,9 @@
    {
      "path": "../../packages/ai-history"
    },
    {
      "path": "../../packages/ai-llamafile"
    },
    {
      "path": "../../packages/ai-ollama"
    },
1 change: 1 addition & 0 deletions examples/electron/package.json
@@ -31,6 +31,7 @@
"@theia/ai-code-completion": "1.54.0",
"@theia/ai-core": "1.54.0",
"@theia/ai-history": "1.54.0",
"@theia/ai-llamafile": "1.54.0",
"@theia/ai-ollama": "1.54.0",
"@theia/ai-openai": "1.54.0",
"@theia/ai-terminal": "1.54.0",
3 changes: 3 additions & 0 deletions examples/electron/tsconfig.json
@@ -26,6 +26,9 @@
    {
      "path": "../../packages/ai-history"
    },
    {
      "path": "../../packages/ai-llamafile"
    },
    {
      "path": "../../packages/ai-ollama"
    },
10 changes: 10 additions & 0 deletions packages/ai-llamafile/.eslintrc.js
@@ -0,0 +1,10 @@
/** @type {import('eslint').Linter.Config} */
module.exports = {
    extends: [
        '../../configs/build.eslintrc.json'
    ],
    parserOptions: {
        tsconfigRootDir: __dirname,
        project: 'tsconfig.json'
    }
};
57 changes: 57 additions & 0 deletions packages/ai-llamafile/README.md
@@ -0,0 +1,57 @@
# AI Llamafile Integration

The AI Llamafile package provides an integration that allows users to manage and interact with Llamafile language models within the Theia IDE.

## Features

- Start and stop Llamafile language servers.

## Commands

### Start Llamafile

- **Command ID:** `llamafile.start`
- **Label:** `Start Llamafile`
- **Functionality:** Allows you to start a Llamafile language server by selecting from a list of configured Llamafiles.

### Stop Llamafile

- **Command ID:** `llamafile.stop`
- **Label:** `Stop Llamafile`
- **Functionality:** Allows you to stop a running Llamafile language server by selecting from a list of currently running Llamafiles.

## Usage

1. **Starting a Llamafile Language Server:**

- Use the command palette to invoke `Start Llamafile`.
- A quick pick menu will appear with a list of configured Llamafiles.
- Select a Llamafile to start its language server.

2. **Stopping a Llamafile Language Server:**
- Use the command palette to invoke `Stop Llamafile`.
- A quick pick menu will display a list of currently running Llamafiles.
- Select a Llamafile to stop its language server.

## Dependencies

This extension depends on the `@theia/ai-core` package for AI-related services and functionalities.

## Configuration

Make sure to configure your Llamafiles properly within the preference settings.
This setting is an array of objects, where each object defines a llamafile with a user-friendly name, the file URI, and the port to start the server on.

Example Configuration:

```json
{
  "ai-features.llamafile.llamafiles": [
    {
      "name": "MyLlamaFile",
      "uri": "file:///path/to/my.llamafile",
      "port": 30000
    }
  ]
}
```
50 changes: 50 additions & 0 deletions packages/ai-llamafile/package.json
@@ -0,0 +1,50 @@
{
  "name": "@theia/ai-llamafile",
  "version": "1.54.0",
  "description": "Theia - Llamafile Integration",
  "dependencies": {
    "@theia/ai-core": "1.54.0",
    "@theia/core": "1.54.0",
    "@theia/output": "1.54.0",
    "tslib": "^2.6.2"
  },
  "publishConfig": {
    "access": "public"
  },
  "theiaExtensions": [
    {
      "frontend": "lib/browser/llamafile-frontend-module",
      "backend": "lib/node/llamafile-backend-module"
    }
  ],
  "keywords": [
    "theia-extension"
  ],
  "license": "EPL-2.0 OR GPL-2.0-only WITH Classpath-exception-2.0",
  "repository": {
    "type": "git",
    "url": "https://github.com/eclipse-theia/theia.git"
  },
  "bugs": {
    "url": "https://github.com/eclipse-theia/theia/issues"
  },
  "homepage": "https://github.com/eclipse-theia/theia",
  "files": [
    "lib",
    "src"
  ],
  "scripts": {
    "build": "theiaext build",
    "clean": "theiaext clean",
    "compile": "theiaext compile",
    "lint": "theiaext lint",
    "test": "theiaext test",
    "watch": "theiaext watch"
  },
  "devDependencies": {
    "@theia/ext-scripts": "1.54.0"
  },
  "nyc": {
    "extends": "../../configs/nyc.json"
  }
}
92 changes: 92 additions & 0 deletions packages/ai-llamafile/src/browser/llamafile-command-contribution.ts
@@ -0,0 +1,92 @@
// *****************************************************************************
// Copyright (C) 2024 EclipseSource GmbH.
//
// This program and the accompanying materials are made available under the
// terms of the Eclipse Public License v. 2.0 which is available at
// http://www.eclipse.org/legal/epl-2.0.
//
// This Source Code may also be made available under the following Secondary
// Licenses when the conditions for such availability set forth in the Eclipse
// Public License v. 2.0 are satisfied: GNU General Public License, version 2
// with the GNU Classpath Exception which is available at
// https://www.gnu.org/software/classpath/license.html.
//
// SPDX-License-Identifier: EPL-2.0 OR GPL-2.0-only WITH Classpath-exception-2.0
// *****************************************************************************
import { AICommandHandlerFactory } from '@theia/ai-core/lib/browser/ai-command-handler-factory';
import { CommandContribution, CommandRegistry, MessageService } from '@theia/core';
import { PreferenceService, QuickInputService } from '@theia/core/lib/browser';
import { inject, injectable } from '@theia/core/shared/inversify';
import { LlamafileEntry, LlamafileManager } from '../common/llamafile-manager';
import { PREFERENCE_LLAMAFILE } from './llamafile-preferences';

export const StartLlamafileCommand = {
    id: 'llamafile.start',
    label: 'Start Llamafile',
};
export const StopLlamafileCommand = {
    id: 'llamafile.stop',
    label: 'Stop Llamafile',
};

@injectable()
export class LlamafileCommandContribution implements CommandContribution {

    @inject(QuickInputService)
    protected readonly quickInputService: QuickInputService;

    @inject(AICommandHandlerFactory)
    protected readonly commandHandlerFactory: AICommandHandlerFactory;

    @inject(PreferenceService)
    protected preferenceService: PreferenceService;

    @inject(MessageService)
    protected messageService: MessageService;

    @inject(LlamafileManager)
    protected llamafileManager: LlamafileManager;

    registerCommands(commandRegistry: CommandRegistry): void {
        commandRegistry.registerCommand(StartLlamafileCommand, this.commandHandlerFactory({
            execute: async () => {
                try {
                    const llamaFiles = this.preferenceService.get<LlamafileEntry[]>(PREFERENCE_LLAMAFILE);
                    if (llamaFiles === undefined || llamaFiles.length === 0) {
                        this.messageService.error('No Llamafiles configured.');
                        return;
                    }
                    const options = llamaFiles.map(llamaFile => ({ label: llamaFile.name }));
                    const result = await this.quickInputService.showQuickPick(options);
                    if (result === undefined) {
                        return;
                    }
                    this.llamafileManager.startServer(result.label);
                } catch (error) {
                    console.error('Something went wrong during the llamafile start.', error);
                    this.messageService.error(`Something went wrong during the llamafile start: ${error.message}.\nFor more information, see the console.`);
                }
            }
        }));
        commandRegistry.registerCommand(StopLlamafileCommand, this.commandHandlerFactory({
            execute: async () => {
                try {
                    const llamaFiles = await this.llamafileManager.getStartedLlamafiles();
                    if (llamaFiles === undefined || llamaFiles.length === 0) {
                        this.messageService.error('No Llamafiles running.');
                        return;
                    }
                    const options = llamaFiles.map(llamaFile => ({ label: llamaFile }));
                    const result = await this.quickInputService.showQuickPick(options);
                    if (result === undefined) {
                        return;
                    }
                    this.llamafileManager.stopServer(result.label);
                } catch (error) {
                    console.error('Something went wrong during the llamafile stop.', error);
                    this.messageService.error(`Something went wrong during the llamafile stop: ${error.message}.\nFor more information, see the console.`);
                }
            }
        }));
    }
}
59 changes: 59 additions & 0 deletions packages/ai-llamafile/src/browser/llamafile-frontend-application-contribution.ts
@@ -0,0 +1,59 @@
// *****************************************************************************
// Copyright (C) 2024 EclipseSource GmbH.
//
// This program and the accompanying materials are made available under the
// terms of the Eclipse Public License v. 2.0 which is available at
// http://www.eclipse.org/legal/epl-2.0.
//
// This Source Code may also be made available under the following Secondary
// Licenses when the conditions for such availability set forth in the Eclipse
// Public License v. 2.0 are satisfied: GNU General Public License, version 2
// with the GNU Classpath Exception which is available at
// https://www.gnu.org/software/classpath/license.html.
//
// SPDX-License-Identifier: EPL-2.0 OR GPL-2.0-only WITH Classpath-exception-2.0
// *****************************************************************************

import { FrontendApplicationContribution, PreferenceService } from '@theia/core/lib/browser';
import { inject, injectable } from '@theia/core/shared/inversify';
import { LlamafileEntry, LlamafileManager } from '../common/llamafile-manager';
import { PREFERENCE_LLAMAFILE } from './llamafile-preferences';

@injectable()
export class LlamafileFrontendApplicationContribution implements FrontendApplicationContribution {

    @inject(PreferenceService)
    protected preferenceService: PreferenceService;

    @inject(LlamafileManager)
    protected llamafileManager: LlamafileManager;

    private _knownLlamaFiles: Map<string, LlamafileEntry> = new Map();

    onStart(): void {
        this.preferenceService.ready.then(() => {
            const llamafiles = this.preferenceService.get<LlamafileEntry[]>(PREFERENCE_LLAMAFILE, []);
            this.llamafileManager.addLanguageModels(llamafiles);
            llamafiles.forEach(model => this._knownLlamaFiles.set(model.name, model));

            this.preferenceService.onPreferenceChanged(event => {
                if (event.preferenceName === PREFERENCE_LLAMAFILE) {
                    // keep only new values that are actual LlamafileEntries
                    const newModels = event.newValue.filter((llamafileEntry: unknown) => LlamafileEntry.is(llamafileEntry)) as LlamafileEntry[];

                    const llamafilesToAdd = newModels.filter(llamafile =>
                        !this._knownLlamaFiles.has(llamafile.name) || !LlamafileEntry.equals(this._knownLlamaFiles.get(llamafile.name)!, llamafile));

                    const llamafileIdsToRemove = [...this._knownLlamaFiles.values()].filter(llamafile =>
                        !newModels.find(a => LlamafileEntry.equals(a, llamafile))).map(a => a.name);

                    this.llamafileManager.removeLanguageModels(llamafileIdsToRemove);
                    llamafileIdsToRemove.forEach(model => this._knownLlamaFiles.delete(model));

                    this.llamafileManager.addLanguageModels(llamafilesToAdd);
                    llamafilesToAdd.forEach(model => this._knownLlamaFiles.set(model.name, model));
                }
            });
        });
    }
}
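
Note: the `LlamafileEntry` type guard and equality helper used above are declared in `common/llamafile-manager.ts`, which is not shown in this excerpt. A minimal sketch of the shape implied by the README configuration and the calls above (the actual declaration may differ):

```ts
// Sketch only: inferred from the README configuration schema and the
// LlamafileEntry.is / LlamafileEntry.equals calls above; the real declaration
// lives in packages/ai-llamafile/src/common/llamafile-manager.ts.
export interface LlamafileEntry {
    name: string;
    uri: string;
    port: number;
}

export namespace LlamafileEntry {
    export function is(arg: unknown): arg is LlamafileEntry {
        return typeof arg === 'object' && arg !== null
            && typeof (arg as LlamafileEntry).name === 'string'
            && typeof (arg as LlamafileEntry).uri === 'string'
            && typeof (arg as LlamafileEntry).port === 'number';
    }

    export function equals(a: LlamafileEntry, b: LlamafileEntry): boolean {
        return a.name === b.name && a.uri === b.uri && a.port === b.port;
    }
}
```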
45 changes: 45 additions & 0 deletions packages/ai-llamafile/src/browser/llamafile-frontend-module.ts
@@ -0,0 +1,45 @@
// *****************************************************************************
// Copyright (C) 2024 EclipseSource GmbH.
//
// This program and the accompanying materials are made available under the
// terms of the Eclipse Public License v. 2.0 which is available at
// http://www.eclipse.org/legal/epl-2.0.
//
// This Source Code may also be made available under the following Secondary
// Licenses when the conditions for such availability set forth in the Eclipse
// Public License v. 2.0 are satisfied: GNU General Public License, version 2
// with the GNU Classpath Exception which is available at
// https://www.gnu.org/software/classpath/license.html.
//
// SPDX-License-Identifier: EPL-2.0 OR GPL-2.0-only WITH Classpath-exception-2.0
// *****************************************************************************
import { CommandContribution } from '@theia/core';
import { FrontendApplicationContribution, RemoteConnectionProvider, ServiceConnectionProvider } from '@theia/core/lib/browser';
import { ContainerModule } from '@theia/core/shared/inversify';
import { OutputChannelManager, OutputChannelSeverity } from '@theia/output/lib/browser/output-channel';
import { LlamafileManager, LlamafileManagerPath, LlamafileServerManagerClient } from '../common/llamafile-manager';
import { LlamafileCommandContribution } from './llamafile-command-contribution';
import { LlamafileFrontendApplicationContribution } from './llamafile-frontend-application-contribution';
import { bindAILlamafilePreferences } from './llamafile-preferences';

export default new ContainerModule(bind => {
    bind(FrontendApplicationContribution).to(LlamafileFrontendApplicationContribution).inSingletonScope();
    bind(CommandContribution).to(LlamafileCommandContribution).inSingletonScope();
    bind(LlamafileManager).toDynamicValue(ctx => {
        const connection = ctx.container.get<ServiceConnectionProvider>(RemoteConnectionProvider);
        const outputChannelManager = ctx.container.get(OutputChannelManager);
        const client: LlamafileServerManagerClient = {
            error: (llamafileName, message) => {
                const channel = outputChannelManager.getChannel(`${llamafileName}-llamafile`);
                channel.appendLine(message, OutputChannelSeverity.Error);
            },
            log: (llamafileName, message) => {
                const channel = outputChannelManager.getChannel(`${llamafileName}-llamafile`);
                channel.appendLine(message, OutputChannelSeverity.Info);
            }
        };
        return connection.createProxy<LlamafileManager>(LlamafileManagerPath, client);
    }).inSingletonScope();

    bindAILlamafilePreferences(bind);
});
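
For orientation, here is a sketch of the service contract that the module above proxies over `LlamafileManagerPath`. It is reconstructed from the calls visible in this diff; the path string and the exact return types are assumptions, and the real declaration in `common/llamafile-manager.ts` is not shown in this excerpt:

```ts
// Sketch only: reconstructed from usage in the frontend contributions above.
// The path value and return types are assumptions, not the actual declaration.
export const LlamafileManagerPath = '/services/llamafile-manager'; // hypothetical path

export interface LlamafileServerManagerClient {
    // Both callbacks route server output into the `<name>-llamafile` output channel.
    log(llamafileName: string, message: string): void;
    error(llamafileName: string, message: string): void;
}

export const LlamafileManager = Symbol('LlamafileManager');
export interface LlamafileManager {
    startServer(name: string): Promise<void>;
    stopServer(name: string): void;
    getStartedLlamafiles(): Promise<string[]>;
    addLanguageModels(llamafiles: LlamafileEntry[]): void;
    removeLanguageModels(modelIds: string[]): void;
}
```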