fix: Support Azure OpenAI #14722

Open · wants to merge 1 commit into `master`
54 changes: 54 additions & 0 deletions packages/ai-openai/README.md
@@ -27,12 +27,66 @@ You can configure the end points via the `ai-features.openAiCustom.customOpenAiModels`
url: string
id?: string
apiKey?: string | true
apiVersion?: string | true
supportsDeveloperMessage: boolean
enableStreaming: boolean
}
```

- `model` and `url` are mandatory attributes, indicating the end point and model to use
- `id` is an optional attribute which is used in the UI to refer to this configuration
- `apiKey` is either the key to access the API served at the given URL or `true` to use the global OpenAI API key. If not given, 'no-key' will be used.
- `apiVersion` is either the API version to access an Azure-hosted API at the given URL or `true` to use the global OpenAI API version.
- `supportsDeveloperMessage` is a flag that indicates whether the model supports the `developer` role. `true` by default.
- `enableStreaming` is a flag that indicates whether the streaming API shall be used. `true` by default.

### Azure OpenAI

To use a custom OpenAI model hosted on Azure, the `AzureOpenAI` class needs to be used, as described in the
[openai-node docs](https://github.com/openai/openai-node?tab=readme-ov-file#microsoft-azure-openai).

Requests to an OpenAI model hosted on Azure require an `apiVersion`. To configure a custom OpenAI model in Theia, you therefore need to specify the `apiVersion` together with the endpoint.
Note that if you don't configure an `apiVersion`, the default `OpenAI` client is used for initialization, and a connection to an Azure-hosted OpenAI model will fail.
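
Under the hood, the backend picks the client based on whether an `apiVersion` resolves for the model. The following TypeScript sketch mirrors the logic added in `openai-language-model.ts` in this change; the standalone function and its parameter names are simplifications for illustration, not the actual API:

```ts
import { OpenAI, AzureOpenAI } from 'openai';

// Simplified sketch of the client selection added in this change:
// if an apiVersion is resolved, an AzureOpenAI client is created,
// otherwise the regular OpenAI client is used.
function createClient(apiKey: string | undefined, url: string | undefined, apiVersion: string | undefined): OpenAI {
    // Some key must always be passed, even for endpoints that are not key
    // protected, because the OpenAI client throws without one.
    const key = apiKey ?? 'no-key';
    if (apiVersion) {
        return new AzureOpenAI({ apiKey: key, baseURL: url, apiVersion });
    }
    return new OpenAI({ apiKey: key, baseURL: url });
}
```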

An OpenAI model version deployed on Azure might not support the `developer` role. In that case you can disable the `developer` role via the
`supportsDeveloperMessage` option, which defaults to `true`.

The following snippet shows a possible configuration to access an OpenAI model hosted on Azure. The `AZURE_OPENAI_API_BASE_URL` needs to be given without the `/chat/completions`
path and without the `api-version` parameter, e.g. _`https://<my_prefix>.openai.azure.com/openai/deployments/<my_deployment>`_

```json
{
"ai-features.AiEnable.enableAI": true,
"ai-features.openAiCustom.customOpenAiModels": [
{
"model": "gpt4o",
"url": "<AZURE_OPENAI_API_BASE_URL>",
"id": "azure-deployment",
"apiKey": "<AZURE_OPENAI_API_KEY>",
"apiVersion": "<AZURE_OPENAI_API_VERSION>",
"supportsDeveloperMessage": false
}
],
"ai-features.agentSettings": {
"Universal": {
"languageModelRequirements": [
{
"purpose": "chat",
"identifier": "azure-deployment"
}
]
},
"Orchestrator": {
"languageModelRequirements": [
{
"purpose": "agent-selection",
"identifier": "azure-deployment"
}
]
}
}
}
```

## Additional Information

@@ -91,6 +91,8 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicationContribution {
model.model === newModel.model &&
model.url === newModel.url &&
model.apiKey === newModel.apiKey &&
model.apiVersion === newModel.apiVersion &&
model.supportsDeveloperMessage === newModel.supportsDeveloperMessage &&
model.enableStreaming === newModel.enableStreaming));

this.manager.removeLanguageModels(...modelsToRemove.map(model => model.id));
@@ -113,6 +115,8 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicationContribution {
id: id,
model: modelId,
apiKey: true,
apiVersion: true,
supportsDeveloperMessage: !openAIModelsWithoutDeveloperMessageSupport.includes(modelId),
enableStreaming: !openAIModelsWithDisabledStreaming.includes(modelId),
defaultRequestSettings: modelRequestSetting?.requestSettings
};
@@ -136,6 +140,8 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicationContribution {
model: pref.model,
url: pref.url,
apiKey: typeof pref.apiKey === 'string' || pref.apiKey === true ? pref.apiKey : undefined,
apiVersion: typeof pref.apiVersion === 'string' || pref.apiVersion === true ? pref.apiVersion : undefined,
supportsDeveloperMessage: pref.supportsDeveloperMessage ?? true,
enableStreaming: pref.enableStreaming ?? true,
defaultRequestSettings: modelRequestSetting?.requestSettings
}
@@ -160,3 +166,4 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicationContribution {
}

const openAIModelsWithDisabledStreaming = ['o1-preview', 'o1-mini'];
const openAIModelsWithoutDeveloperMessageSupport = ['o1-preview', 'o1-mini'];
12 changes: 12 additions & 0 deletions packages/ai-openai/src/browser/openai-preferences.ts
@@ -50,6 +50,10 @@ export const OpenAiPreferencesSchema: PreferenceSchema = {
\n\
- provide an `apiKey` to access the API served at the given url. Use `true` to indicate the use of the global OpenAI API key.\
\n\
- provide an `apiVersion` to access the API served at the given url in Azure. Use `true` to indicate the use of the global OpenAI API version.\
\n\
- specify `supportsDeveloperMessage: false` to indicate that the developer role shall not be used.\
\n\
- specify `enableStreaming: false` to indicate that streaming shall not be used.\
\n\
Refer to [our documentation](https://theia-ide.org/docs/user_ai/#openai-compatible-models-eg-via-vllm) for more information.',
@@ -73,6 +77,14 @@ export const OpenAiPreferencesSchema: PreferenceSchema = {
type: ['string', 'boolean'],
title: 'Either the key to access the API served at the given url or `true` to use the global OpenAI API key',
},
apiVersion: {
type: ['string', 'boolean'],
title: 'Either the API version to access the API served at the given url in Azure or `true` to use the global OpenAI API version',
},
supportsDeveloperMessage: {
type: 'boolean',
title: 'Indicates whether the model supports the `developer` role. `true` by default.',
},
enableStreaming: {
type: 'boolean',
title: 'Indicates whether the streaming API shall be used. `true` by default.',
@@ -32,10 +32,18 @@ export interface OpenAiModelDescription {
* The key for the model. If 'true' is provided the global OpenAI API key will be used.
*/
apiKey: string | true | undefined;
/**
* The version for the API. If 'true' is provided the global OpenAI API version will be used.
*/
apiVersion: string | true | undefined;
/**
* Indicate whether the streaming API shall be used.
*/
enableStreaming: boolean;
/**
* Flag to configure whether the OpenAI model supports the `developer` role. Default is `true`.
*/
supportsDeveloperMessage: boolean;
/**
* Default request settings for the OpenAI model.
*/
@@ -44,6 +52,7 @@
export interface OpenAiLanguageModelsManager {
apiKey: string | undefined;
setApiKey(key: string | undefined): void;
setApiVersion(version: string | undefined): void;
createOrUpdateLanguageModels(...models: OpenAiModelDescription[]): Promise<void>;
removeLanguageModels(...modelIds: string[]): void
}
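
As the implementation's comment further below notes, `createOrUpdateLanguageModels` can also be called directly from backend code when no frontend drives the configuration. The following is a hedged sketch of such a call using the new Azure-related fields; the concrete values and the way the manager instance is obtained are illustrative assumptions, not part of this change:

```ts
// Illustrative only: values are placeholders and the manager is assumed to be
// injected or otherwise obtained from the application's DI container.
async function registerAzureModel(manager: OpenAiLanguageModelsManager): Promise<void> {
    await manager.createOrUpdateLanguageModels({
        id: 'azure-deployment',
        model: 'gpt4o',
        url: 'https://<my_prefix>.openai.azure.com/openai/deployments/<my_deployment>',
        apiKey: true,         // resolve via the global OpenAI API key
        apiVersion: true,     // resolve via setApiVersion or OPENAI_API_VERSION
        supportsDeveloperMessage: false,
        enableStreaming: true
    });
}
```
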
26 changes: 15 additions & 11 deletions packages/ai-openai/src/node/openai-language-model.ts
@@ -24,7 +24,7 @@ import {
LanguageModelTextResponse
} from '@theia/ai-core';
import { CancellationToken } from '@theia/core';
import OpenAI from 'openai';
import { OpenAI, AzureOpenAI } from 'openai';
import { ChatCompletionStream } from 'openai/lib/ChatCompletionStream';
import { RunnableToolFunctionWithoutParse } from 'openai/lib/RunnableFunction';
import { ChatCompletionMessageParam } from 'openai/resources';
@@ -38,6 +38,8 @@ export class OpenAiModel implements LanguageModel {
* @param model the model id as it is used by the OpenAI API
* @param enableStreaming whether the streaming API shall be used
* @param apiKey a function that returns the API key to use for this model, called on each request
* @param apiVersion a function that returns the API version to use for this model, called on each request
* @param supportsDeveloperMessage whether the model supports the `developer` role
* @param url the OpenAI API compatible endpoint where the model is hosted. If not provided the default OpenAI endpoint will be used.
* @param defaultRequestSettings optional default settings for requests made using this model.
*/
@@ -46,6 +48,8 @@ export class OpenAiModel implements LanguageModel {
public model: string,
public enableStreaming: boolean,
public apiKey: () => string | undefined,
public apiVersion: () => string | undefined,
public supportsDeveloperMessage: boolean,
public url: string | undefined,
public defaultRequestSettings?: { [key: string]: unknown }
) { }
@@ -164,7 +168,7 @@ export class OpenAiModel implements LanguageModel {
protected toOpenAiRole(message: LanguageModelRequestMessage): 'developer' | 'user' | 'assistant' {
switch (message.actor) {
case 'system':
return this.supportsDeveloperMessage() ? 'developer' : 'user';
return this.supportsDeveloperMessage ? 'developer' : 'user';
case 'ai':
return 'assistant';
default:
@@ -185,13 +189,6 @@ export class OpenAiModel implements LanguageModel {
].includes(this.model);
}

protected supportsDeveloperMessage(): boolean {
return ![
'o1-preview',
'o1-mini'
].includes(this.model);
}

protected async handleStructuredOutputRequest(openai: OpenAI, request: LanguageModelRequest): Promise<LanguageModelParsedResponse> {
const settings = this.getSettings(request);
// TODO implement tool support for structured output (parse() seems to require different tool format)
@@ -235,7 +232,14 @@
if (!apiKey && !(this.url)) {
throw new Error('Please provide OPENAI_API_KEY in preferences or via environment variable');
}
// We need to hand over "some" key, even if a custom url is not key protected as otherwise the OpenAI client will throw an error
return new OpenAI({ apiKey: apiKey ?? 'no-key', baseURL: this.url });

const apiVersion = this.apiVersion();
if (apiVersion) {
// We need to hand over "some" key, even if a custom url is not key protected as otherwise the OpenAI client will throw an error
return new AzureOpenAI({ apiKey: apiKey ?? 'no-key', baseURL: this.url, apiVersion: apiVersion });
} else {
// We need to hand over "some" key, even if a custom url is not key protected as otherwise the OpenAI client will throw an error
return new OpenAI({ apiKey: apiKey ?? 'no-key', baseURL: this.url });
}
}
}
26 changes: 26 additions & 0 deletions packages/ai-openai/src/node/openai-language-models-manager-impl.ts
@@ -23,6 +23,7 @@ import { OpenAiLanguageModelsManager, OpenAiModelDescription } from '../common';
export class OpenAiLanguageModelsManagerImpl implements OpenAiLanguageModelsManager {

protected _apiKey: string | undefined;
protected _apiVersion: string | undefined;

@inject(LanguageModelRegistry)
protected readonly languageModelRegistry: LanguageModelRegistry;
@@ -31,6 +32,10 @@ export class OpenAiLanguageModelsManagerImpl implements OpenAiLanguageModelsManager {
return this._apiKey ?? process.env.OPENAI_API_KEY;
}

get apiVersion(): string | undefined {
return this._apiVersion ?? process.env.OPENAI_API_VERSION;
}

// Triggered from frontend. In case you want to use the models on the backend
// without a frontend then call this yourself
async createOrUpdateLanguageModels(...modelDescriptions: OpenAiModelDescription[]): Promise<void> {
@@ -45,6 +50,15 @@ export class OpenAiLanguageModelsManagerImpl implements OpenAiLanguageModelsManager {
}
return undefined;
};
const apiVersionProvider = () => {
if (modelDescription.apiVersion === true) {
return this.apiVersion;
}
if (modelDescription.apiVersion) {
return modelDescription.apiVersion;
}
return undefined;
};

if (model) {
if (!(model instanceof OpenAiModel)) {
@@ -55,6 +69,8 @@ export class OpenAiLanguageModelsManagerImpl implements OpenAiLanguageModelsManager {
model.enableStreaming = modelDescription.enableStreaming;
model.url = modelDescription.url;
model.apiKey = apiKeyProvider;
model.apiVersion = apiVersionProvider;
model.supportsDeveloperMessage = modelDescription.supportsDeveloperMessage;
model.defaultRequestSettings = modelDescription.defaultRequestSettings;
} else {
this.languageModelRegistry.addLanguageModels([
@@ -63,6 +79,8 @@ export class OpenAiLanguageModelsManagerImpl implements OpenAiLanguageModelsManager {
modelDescription.model,
modelDescription.enableStreaming,
apiKeyProvider,
apiVersionProvider,
modelDescription.supportsDeveloperMessage,
modelDescription.url,
modelDescription.defaultRequestSettings
)
@@ -82,4 +100,12 @@ export class OpenAiLanguageModelsManagerImpl implements OpenAiLanguageModelsManager {
this._apiKey = undefined;
}
}

setApiVersion(apiVersion: string | undefined): void {
if (apiVersion) {
this._apiVersion = apiVersion;
} else {
this._apiVersion = undefined;
}
}
}