Add configuration for Azure Llama models #39

`cli.md`: 16 changes (15 additions, 1 deletion)
@@ -97,7 +97,9 @@ In general each tool result flag accepts a comma-separated list of paths to files […]

**NOTE:** It is _not_ allowed to provide multiple SARIF inputs _for the same tool_ in a single invocation of the codemodder. For example, it is not possible to provide two Semgrep SARIF files, although it would be possible to provide e.g. a Semgrep SARIF file and a CodeQL JSON file in the same invocation.

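The per-tool rule above can be checked up front. The following is a minimal sketch of such a check, assuming an implementation collects tool results as (tool name, path) pairs from the CLI flags; the function and argument names are illustrative, not part of this specification.

```python
from collections import Counter


def validate_tool_inputs(inputs: list[tuple[str, str]]) -> None:
    """Reject invocations that pass more than one result file for the same tool.

    `inputs` holds (tool_name, path) pairs gathered from the tool result flags.
    """
    counts = Counter(tool for tool, _ in inputs)
    repeated = sorted(tool for tool, n in counts.items() if n > 1)
    if repeated:
        raise SystemExit(
            f"multiple result inputs provided for the same tool: {', '.join(repeated)}"
        )


# Allowed: one Semgrep SARIF file plus one CodeQL result file.
validate_tool_inputs([("semgrep", "semgrep.sarif"), ("codeql", "codeql.json")])

# Not allowed: two Semgrep SARIF files in a single invocation.
# validate_tool_inputs([("semgrep", "a.sarif"), ("semgrep", "b.sarif")])
```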
## Configuring LLM Support

### Using OpenAI

You can optionally allow codemods to access OpenAI by setting the following environment variable during execution:
```bash
@@ -123,6 +125,18 @@ CODEMODDER_AZURE_OPENAI_GPT_4_TURBO_2024_04_12_API_DEPLOYMENT=<DEPLOYMENT_NAME>
- If using Azure OpenAI and a codemod requests access to a model for which there is no corresponding `CODEMODDER_AZURE_OPENAI_(MODELNAME)_API_DEPLOYMENT` variable, the deployment name will be assumed to be the name of the model itself (e.g., "gpt-4o").
- If both the Azure OpenAI and OpenAI configurations are available, Azure will be preferred, as sketched below.

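The following sketch shows how an implementation might apply the two rules above. The way the model name is mangled into the variable name (upper-cased, with dashes and dots replaced by underscores) is an assumption inferred from the example variable above, not something this document specifies.

```python
import os


def azure_openai_deployment(model: str) -> str:
    """Resolve the Azure deployment name for a requested model (e.g. "gpt-4o").

    Uses the CODEMODDER_AZURE_OPENAI_<MODELNAME>_API_DEPLOYMENT override when it
    is set and otherwise assumes the deployment is named after the model itself.
    """
    # Assumed mangling: "gpt-4-turbo-2024-04-12" -> "GPT_4_TURBO_2024_04_12"
    model_key = model.upper().replace("-", "_").replace(".", "_")
    override = os.environ.get(f"CODEMODDER_AZURE_OPENAI_{model_key}_API_DEPLOYMENT")
    return override or model


def choose_openai_client(azure_configured: bool, openai_configured: bool) -> str | None:
    """Azure OpenAI is preferred whenever both providers are configured."""
    if azure_configured:
        return "azure-openai"
    if openai_configured:
        return "openai"
    return None
```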
### Using Llama (Azure)

Llama models hosted within Azure can be used with the following environment variables:

```bash
CODEMODDER_AZURE_LLAMA_API_KEY=<KEY>
CODEMODDER_AZURE_LLAMA_ENDPOINT=<ENDPOINT>
```

- Providing `CODEMODDER_AZURE_LLAMA_API_KEY` without `CODEMODDER_AZURE_LLAMA_ENDPOINT` (and vice versa) will cause a failure on startup.
- Configuring the Azure Llama client is orthogonal to configuring the Azure OpenAI client; both can be used in the same codemodder run, as sketched below.

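A minimal sketch of the startup behaviour described above, using only the two variables defined in this section (the function name is illustrative):

```python
import os


def load_azure_llama_config() -> tuple[str, str] | None:
    """Return (api_key, endpoint) when the Azure Llama client is fully configured.

    The two variables must be provided together: supplying only one of them is a
    startup failure, while supplying neither simply leaves the Llama client
    disabled. Azure OpenAI configuration is handled separately and is unaffected.
    """
    api_key = os.environ.get("CODEMODDER_AZURE_LLAMA_API_KEY")
    endpoint = os.environ.get("CODEMODDER_AZURE_LLAMA_ENDPOINT")
    if api_key and endpoint:
        return api_key, endpoint
    if api_key or endpoint:
        raise SystemExit(
            "CODEMODDER_AZURE_LLAMA_API_KEY and CODEMODDER_AZURE_LLAMA_ENDPOINT "
            "must be provided together"
        )
    return None
```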
## Console output

The codemodder CLI output is described in [its own specification](logging.md).