Add responseLogProbs and logProbs parameters to generateContentReq (#266)

* Add responseLogProbs and logProbs parameters to generateContentReq

* update docs & test

* Update docs and add avglogprobs and logprobsresult as output

* update variable names in responses.ts

* Move parameters to GenerationConfig

* Update test cases for new parameters to test generationConfig

* Updated generatecontentresponse testcase

* Update case of logprobs

* put back parameters in test case
shilpakancharla authored Sep 26, 2024
1 parent dda0b5c commit b3d1e01
Showing 30 changed files with 420 additions and 68 deletions.
2 changes: 1 addition & 1 deletion .changeset/cyan-pants-move.md
@@ -2,4 +2,4 @@
"@google/generative-ai": minor
---

Add `frequencyPenalty` and `presencePenalty` parameters support for `generateContent()`
Add `frequencyPenalty`, `presencePenalty`, `responseLogprobs`, and `logprobs` parameter support for `generationConfig`. Added `avgLogprobs` and `logprobsResult` to `GenerateContentCandidate`. Updated test cases.
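
For orientation, here is a minimal end-to-end sketch of the new options and response fields. The model name (`gemini-1.5-flash`), API-key handling, and prompt are illustrative placeholders, not part of this commit:

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Hypothetical API key source and model name, for illustration only.
const genAI = new GoogleGenerativeAI(process.env.API_KEY ?? "");

async function run(): Promise<void> {
  const model = genAI.getGenerativeModel({
    model: "gemini-1.5-flash",
    generationConfig: {
      responseLogprobs: true, // return logprobs with the response
      logprobs: 3,            // top candidates per decoding step
    },
  });

  const result = await model.generateContent("Say hello in one word.");
  const candidate = result.response.candidates?.[0];
  console.log(candidate?.avgLogprobs);    // average log probability score
  console.log(candidate?.logprobsResult); // per-step token log-likelihoods
}

run().catch(console.error);
```
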
30 changes: 24 additions & 6 deletions common/api-review/generative-ai.api.md
@@ -6,10 +6,8 @@

// @public
export interface BaseParams {
frequencyPenalty?: number;
// (undocumented)
generationConfig?: GenerationConfig;
presencePenalty?: number;
// (undocumented)
safetySettings?: SafetySetting[];
}
@@ -371,6 +369,7 @@ export interface FunctionResponsePart {

// @public
export interface GenerateContentCandidate {
avgLogprobs?: number;
// (undocumented)
citationMetadata?: CitationMetadata;
// (undocumented)
@@ -381,6 +380,7 @@ export interface GenerateContentCandidate {
finishReason?: FinishReason;
// (undocumented)
index: number;
logprobsResult?: LogprobsResult;
// (undocumented)
safetyRatings?: SafetyRating[];
}
@@ -429,8 +429,12 @@ export interface GenerateContentStreamResult {
export interface GenerationConfig {
// (undocumented)
candidateCount?: number;
frequencyPenalty?: number;
logprobs?: number;
// (undocumented)
maxOutputTokens?: number;
presencePenalty?: number;
responseLogprobs?: boolean;
responseMimeType?: string;
responseSchema?: ResponseSchema;
// (undocumented)
@@ -460,17 +464,13 @@ export class GenerativeModel {
cachedContent: CachedContent;
countTokens(request: CountTokensRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<CountTokensResponse>;
embedContent(request: EmbedContentRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<EmbedContentResponse>;
// (undocumented)
frequencyPenalty?: number;
generateContent(request: GenerateContentRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<GenerateContentResult>;
generateContentStream(request: GenerateContentRequest | string | Array<string | Part>, requestOptions?: SingleRequestOptions): Promise<GenerateContentStreamResult>;
// (undocumented)
generationConfig: GenerationConfig;
// (undocumented)
model: string;
// (undocumented)
presencePenalty?: number;
// (undocumented)
safetySettings: SafetySetting[];
startChat(startChatParams?: StartChatParams): ChatSession;
// (undocumented)
@@ -577,6 +577,19 @@ export interface InlineDataPart {
text?: never;
}

// @public
export interface LogprobsCandidate {
logProbability: number;
token: string;
tokenID: number;
}

// @public
export interface LogprobsResult {
chosenCandidates: LogprobsCandidate[];
topCandidates: TopCandidates[];
}

// @public
export interface ModelParams extends BaseParams {
// (undocumented)
@@ -730,6 +743,11 @@ export interface ToolConfig {
functionCallingConfig: FunctionCallingConfig;
}

// @public
export interface TopCandidates {
candidates: LogprobsCandidate[];
}

// @public
export interface UsageMetadata {
cachedContentTokenCount?: number;
2 changes: 0 additions & 2 deletions docs/reference/main/generative-ai.baseparams.md
@@ -16,8 +16,6 @@ export interface BaseParams

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [frequencyPenalty?](./generative-ai.baseparams.frequencypenalty.md) | | number | _(Optional)_ Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far. |
| [generationConfig?](./generative-ai.baseparams.generationconfig.md) | | [GenerationConfig](./generative-ai.generationconfig.md) | _(Optional)_ |
| [presencePenalty?](./generative-ai.baseparams.presencepenalty.md) | | number | _(Optional)_ Presence penalty applied to the next token's logprobs if the token has already been seen in the response. |
| [safetySettings?](./generative-ai.baseparams.safetysettings.md) | | [SafetySetting](./generative-ai.safetysetting.md)<!-- -->\[\] | _(Optional)_ |

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.generatecontentcandidate.avglogprobs.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md) &gt; [avgLogprobs](./generative-ai.generatecontentcandidate.avglogprobs.md)

## GenerateContentCandidate.avgLogprobs property

Average log probability score of the candidate.

**Signature:**

```typescript
avgLogprobs?: number;
```
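
If `avgLogprobs` is the mean of the per-token natural-log probabilities (an assumption about how the score is computed, not something this commit states), exponentiating it yields a geometric-mean per-token probability, which can be easier to read:

```typescript
// Sketch under the assumption that avgLogprobs is the mean natural-log
// probability per generated token.
function meanTokenProbability(avgLogprobs: number): number {
  return Math.exp(avgLogprobs); // geometric mean of per-token probabilities
}

console.log(meanTokenProbability(-0.25).toFixed(3)); // ≈ 0.779
```
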
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.generatecontentcandidate.logprobsresult.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md) &gt; [logprobsResult](./generative-ai.generatecontentcandidate.logprobsresult.md)

## GenerateContentCandidate.logprobsResult property

Log-likelihood scores for the response tokens and top tokens.

**Signature:**

```typescript
logprobsResult?: LogprobsResult;
```
2 changes: 2 additions & 0 deletions docs/reference/main/generative-ai.generatecontentcandidate.md
@@ -16,10 +16,12 @@ export interface GenerateContentCandidate

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [avgLogprobs?](./generative-ai.generatecontentcandidate.avglogprobs.md) | | number | _(Optional)_ Average log probability score of the candidate. |
| [citationMetadata?](./generative-ai.generatecontentcandidate.citationmetadata.md) | | [CitationMetadata](./generative-ai.citationmetadata.md) | _(Optional)_ |
| [content](./generative-ai.generatecontentcandidate.content.md) | | [Content](./generative-ai.content.md) | |
| [finishMessage?](./generative-ai.generatecontentcandidate.finishmessage.md) | | string | _(Optional)_ |
| [finishReason?](./generative-ai.generatecontentcandidate.finishreason.md) | | [FinishReason](./generative-ai.finishreason.md) | _(Optional)_ |
| [index](./generative-ai.generatecontentcandidate.index.md) | | number | |
| [logprobsResult?](./generative-ai.generatecontentcandidate.logprobsresult.md) | | [LogprobsResult](./generative-ai.logprobsresult.md) | _(Optional)_ Log-likelihood scores for the response tokens and top tokens. |
| [safetyRatings?](./generative-ai.generatecontentcandidate.safetyratings.md) | | [SafetyRating](./generative-ai.safetyrating.md)<!-- -->\[\] | _(Optional)_ |

docs/reference/main/generative-ai.baseparams.frequencypenalty.md → docs/reference/main/generative-ai.generationconfig.frequencypenalty.md
@@ -1,8 +1,8 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [BaseParams](./generative-ai.baseparams.md) &gt; [frequencyPenalty](./generative-ai.baseparams.frequencypenalty.md)
[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [frequencyPenalty](./generative-ai.generationconfig.frequencypenalty.md)

## BaseParams.frequencyPenalty property
## GenerationConfig.frequencyPenalty property

Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far.

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.generationconfig.logprobs.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [logprobs](./generative-ai.generationconfig.logprobs.md)

## GenerationConfig.logprobs property

Valid only if responseLogprobs is set to true. Sets the number of top logprobs to return at each decoding step in logprobsResult.

**Signature:**

```typescript
logprobs?: number;
```
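
A short sketch of the dependency described above: `logprobs` only takes effect when `responseLogprobs` is enabled (the value 5 is illustrative):

```typescript
import type { GenerationConfig } from "@google/generative-ai";

// logprobs is only honored when responseLogprobs is true.
const generationConfig: GenerationConfig = {
  responseLogprobs: true,
  logprobs: 5, // return the top 5 candidates at each decoding step
};
```
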
4 changes: 4 additions & 0 deletions docs/reference/main/generative-ai.generationconfig.md
@@ -17,7 +17,11 @@ export interface GenerationConfig
| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [candidateCount?](./generative-ai.generationconfig.candidatecount.md) | | number | _(Optional)_ |
| [frequencyPenalty?](./generative-ai.generationconfig.frequencypenalty.md) | | number | _(Optional)_ Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far. |
| [logprobs?](./generative-ai.generationconfig.logprobs.md) | | number | _(Optional)_ Valid only if responseLogprobs is set to true. Sets the number of top logprobs to return at each decoding step in logprobsResult. |
| [maxOutputTokens?](./generative-ai.generationconfig.maxoutputtokens.md) | | number | _(Optional)_ |
| [presencePenalty?](./generative-ai.generationconfig.presencepenalty.md) | | number | _(Optional)_ Presence penalty applied to the next token's logprobs if the token has already been seen in the response. |
| [responseLogprobs?](./generative-ai.generationconfig.responselogprobs.md) | | boolean | _(Optional)_ If true, export the logprobs results in the response. |
| [responseMimeType?](./generative-ai.generationconfig.responsemimetype.md) | | string | _(Optional)_ Output response mimetype of the generated candidate text. Supported mimetype: <code>text/plain</code>: (default) Text output. <code>application/json</code>: JSON response in the candidates. |
| [responseSchema?](./generative-ai.generationconfig.responseschema.md) | | [ResponseSchema](./generative-ai.responseschema.md) | _(Optional)_ Output response schema of the generated candidate text. Note: This only applies when the specified <code>responseMIMEType</code> supports a schema; currently this is limited to <code>application/json</code>. |
| [stopSequences?](./generative-ai.generationconfig.stopsequences.md) | | string\[\] | _(Optional)_ |
docs/reference/main/generative-ai.baseparams.presencepenalty.md → docs/reference/main/generative-ai.generationconfig.presencepenalty.md
@@ -1,8 +1,8 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [BaseParams](./generative-ai.baseparams.md) &gt; [presencePenalty](./generative-ai.baseparams.presencepenalty.md)
[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [presencePenalty](./generative-ai.generationconfig.presencepenalty.md)

## BaseParams.presencePenalty property
## GenerationConfig.presencePenalty property

Presence penalty applied to the next token's logprobs if the token has already been seen in the response.

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.generationconfig.responselogprobs.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerationConfig](./generative-ai.generationconfig.md) &gt; [responseLogprobs](./generative-ai.generationconfig.responselogprobs.md)

## GenerationConfig.responseLogprobs property

If true, export the logprobs results in the response.

**Signature:**

```typescript
responseLogprobs?: boolean;
```
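
The flag can also be set per request through the request-level `generationConfig` (assuming, as elsewhere in this SDK, that `GenerateContentRequest` carries a `generationConfig`); a sketch with a placeholder prompt:

```typescript
import type { GenerateContentRequest } from "@google/generative-ai";

// Request-level override: ask for logprobs on this call only.
const request: GenerateContentRequest = {
  contents: [{ role: "user", parts: [{ text: "Write a haiku about rain." }] }],
  generationConfig: {
    responseLogprobs: true,
    logprobs: 2, // top 2 alternatives per decoding step
  },
};
```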

This file was deleted.

2 changes: 0 additions & 2 deletions docs/reference/main/generative-ai.generativemodel.md
@@ -24,10 +24,8 @@ export declare class GenerativeModel
| --- | --- | --- | --- |
| [apiKey](./generative-ai.generativemodel.apikey.md) | | string | |
| [cachedContent](./generative-ai.generativemodel.cachedcontent.md) | | [CachedContent](./generative-ai.cachedcontent.md) | |
| [frequencyPenalty?](./generative-ai.generativemodel.frequencypenalty.md) | | number | _(Optional)_ |
| [generationConfig](./generative-ai.generativemodel.generationconfig.md) | | [GenerationConfig](./generative-ai.generationconfig.md) | |
| [model](./generative-ai.generativemodel.model.md) | | string | |
| [presencePenalty?](./generative-ai.generativemodel.presencepenalty.md) | | number | _(Optional)_ |
| [safetySettings](./generative-ai.generativemodel.safetysettings.md) | | [SafetySetting](./generative-ai.safetysetting.md)<!-- -->\[\] | |
| [systemInstruction?](./generative-ai.generativemodel.systeminstruction.md) | | [Content](./generative-ai.content.md) | _(Optional)_ |
| [toolConfig?](./generative-ai.generativemodel.toolconfig.md) | | [ToolConfig](./generative-ai.toolconfig.md) | _(Optional)_ |

This file was deleted.

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.logprobscandidate.logprobability.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md) &gt; [logProbability](./generative-ai.logprobscandidate.logprobability.md)

## LogprobsCandidate.logProbability property

The candidate's log probability.

**Signature:**

```typescript
logProbability: number;
```
22 changes: 22 additions & 0 deletions docs/reference/main/generative-ai.logprobscandidate.md
@@ -0,0 +1,22 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md)

## LogprobsCandidate interface

Candidate for the logprobs token and score.

**Signature:**

```typescript
export interface LogprobsCandidate
```

## Properties

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [logProbability](./generative-ai.logprobscandidate.logprobability.md) | | number | The candidate's log probability. |
| [token](./generative-ai.logprobscandidate.token.md) | | string | The candidate's token string value. |
| [tokenID](./generative-ai.logprobscandidate.tokenid.md) | | number | The candidate's token id value. |

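To make the field shapes above concrete, a hypothetical `LogprobsCandidate` value (the token text, id, and score are invented for illustration):

```typescript
import type { LogprobsCandidate } from "@google/generative-ai";

// Illustrative values only; real tokens, ids, and scores come from the API.
const candidate: LogprobsCandidate = {
  token: "Hello",
  tokenID: 9259,         // hypothetical token id
  logProbability: -0.12, // log probability of this token (illustrative)
};
```
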
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.logprobscandidate.token.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md) &gt; [token](./generative-ai.logprobscandidate.token.md)

## LogprobsCandidate.token property

The candidate's token string value.

**Signature:**

```typescript
token: string;
```
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.logprobscandidate.tokenid.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsCandidate](./generative-ai.logprobscandidate.md) &gt; [tokenID](./generative-ai.logprobscandidate.tokenid.md)

## LogprobsCandidate.tokenID property

The candidate's token id value.

**Signature:**

```typescript
tokenID: number;
```
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.logprobsresult.chosencandidates.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md) &gt; [chosenCandidates](./generative-ai.logprobsresult.chosencandidates.md)

## LogprobsResult.chosenCandidates property

Length = total number of decoding steps. The chosen candidates may or may not be in topCandidates.

**Signature:**

```typescript
chosenCandidates: LogprobsCandidate[];
```
21 changes: 21 additions & 0 deletions docs/reference/main/generative-ai.logprobsresult.md
@@ -0,0 +1,21 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md)

## LogprobsResult interface

Logprobs Result

**Signature:**

```typescript
export interface LogprobsResult
```

## Properties

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [chosenCandidates](./generative-ai.logprobsresult.chosencandidates.md) | | [LogprobsCandidate](./generative-ai.logprobscandidate.md)<!-- -->\[\] | Length = total number of decoding steps. The chosen candidates may or may not be in topCandidates. |
| [topCandidates](./generative-ai.logprobsresult.topcandidates.md) | | [TopCandidates](./generative-ai.topcandidates.md)<!-- -->\[\] | Length = total number of decoding steps. |

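A small sketch of how the two arrays line up, assuming (per the descriptions above) that `chosenCandidates` and `topCandidates` are both indexed by decoding step:

```typescript
import type { LogprobsResult } from "@google/generative-ai";

// Print the chosen token at each decoding step alongside the top
// alternatives the model considered at that step.
function printLogprobs(result: LogprobsResult): void {
  result.chosenCandidates.forEach((chosen, step) => {
    const alternatives = (result.topCandidates[step]?.candidates ?? [])
      .map((c) => `${c.token} (${c.logProbability.toFixed(2)})`)
      .join(", ");
    console.log(`step ${step}: chose "${chosen.token}" | top: ${alternatives}`);
  });
}
```
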
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.logprobsresult.topcandidates.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md) &gt; [topCandidates](./generative-ai.logprobsresult.topcandidates.md)

## LogprobsResult.topCandidates property

Length = total number of decoding steps.

**Signature:**

```typescript
topCandidates: TopCandidates[];
```
3 changes: 3 additions & 0 deletions docs/reference/main/generative-ai.md
@@ -74,6 +74,8 @@
| [GenerationConfig](./generative-ai.generationconfig.md) | Config options for content-related requests |
| [GenerativeContentBlob](./generative-ai.generativecontentblob.md) | Interface for sending an image. |
| [InlineDataPart](./generative-ai.inlinedatapart.md) | Content part interface if the part represents an image. |
| [LogprobsCandidate](./generative-ai.logprobscandidate.md) | Candidate for the logprobs token and score. |
| [LogprobsResult](./generative-ai.logprobsresult.md) | Logprobs Result |
| [ModelParams](./generative-ai.modelparams.md) | Params passed to [GoogleGenerativeAI.getGenerativeModel()](./generative-ai.googlegenerativeai.getgenerativemodel.md)<!-- -->. |
| [PromptFeedback](./generative-ai.promptfeedback.md) | If the prompt was blocked, this will be populated with <code>blockReason</code> and the relevant <code>safetyRatings</code>. |
| [RequestOptions](./generative-ai.requestoptions.md) | Params passed to getGenerativeModel() or GoogleAIFileManager(). |
@@ -85,6 +87,7 @@
| [StartChatParams](./generative-ai.startchatparams.md) | Params for [GenerativeModel.startChat()](./generative-ai.generativemodel.startchat.md)<!-- -->. |
| [TextPart](./generative-ai.textpart.md) | Content part interface if the part represents a text string. |
| [ToolConfig](./generative-ai.toolconfig.md) | Tool config. This config is shared for all tools provided in the request. |
| [TopCandidates](./generative-ai.topcandidates.md) | Candidates with top log probabilities at each decoding step |
| [UsageMetadata](./generative-ai.usagemetadata.md) | Metadata on the generation request's token usage. |

## Variables
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.topcandidates.candidates.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [TopCandidates](./generative-ai.topcandidates.md) &gt; [candidates](./generative-ai.topcandidates.candidates.md)

## TopCandidates.candidates property

Sorted by log probability in descending order.

**Signature:**

```typescript
candidates: LogprobsCandidate[];
```
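
Because the candidates at each step are sorted by log probability in descending order, index 0 is the model's most likely token for that step; a minimal sketch:

```typescript
import type { TopCandidates } from "@google/generative-ai";

// The array is sorted by log probability in descending order, so
// index 0 is the most probable token at this decoding step.
function mostLikelyToken(step: TopCandidates): string | undefined {
  return step.candidates[0]?.token;
}
```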