
fix: Update openinference-vercel semantic convention mapping for output #1203

Merged · 2 commits · Jan 18, 2025

Conversation

luke-moehlenbrock (Contributor)

When using openinference-vercel with the Vercel AI SDK, spans don't have the output attribute.

This is due to Vercel updating their semantic conventions to use ai.response.text for the model response instead of ai.result.text.

ai.response.text was added to the semantic conventions and ai.result.text was marked as deprecated in this commit; ai.result.text was then removed entirely in this commit.

With the current mapping, there's no output attribute which results in the Phoenix UI only showing the input for the span:
[Screenshots: span detail in the Phoenix UI showing only the input, with no output]

Updating the prefix from result to response fixes this:
[Screenshots: span detail in the Phoenix UI showing both the input and the output]
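
The fix is a one-word change in the prefix map. A minimal, self-contained sketch of the idea (names mirror AISemanticConventions.ts, but this is illustrative, not the actual package source):

```typescript
// Illustrative sketch: the package derives Vercel AI SDK attribute keys from
// a prefix map. With the old prefix "result", it looked for "ai.result.text",
// which newer SDK versions no longer emit.
const AIPrefixes = {
  telemetry: "telemetry",
  prompt: "prompt",
  toolCall: "toolCall",
  result: "response", // was "result"; the SDK now emits ai.response.* attributes
} as const;

// The output attribute key the package will now read from a span:
const outputTextAttribute = `ai.${AIPrefixes.result}.text`;
console.log(outputTextAttribute); // "ai.response.text"
```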

@luke-moehlenbrock luke-moehlenbrock requested a review from a team as a code owner January 15, 2025 22:42

github-actions bot commented Jan 15, 2025

CLA Assistant Lite bot All contributors have signed the CLA ✍️ ✅

@luke-moehlenbrock (Author)

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Jan 15, 2025
@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Jan 15, 2025
@mikeldking (Contributor)

Thanks @luke-moehlenbrock ! Great catch.

@@ -13,7 +13,7 @@ const AIPrefixes = {
   telemetry: "telemetry",
   prompt: "prompt",
   toolCall: "toolCall",
-  result: "result",
+  result: "response",
Suggested change:
-  result: "response",
+  response: "response",

Shouldn't this be the case here? @Parker-Stafford


Yeah, for consistency let's do that. Then you'll want to change these
https://github.com/Arize-ai/openinference/blob/main/js/packages/openinference-vercel/src/AISemanticConventions.ts#L53-L60
to response as well, and key off AIPrefixes.response.
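
The suggested follow-up can be sketched as below, renaming the key itself so lookups read AIPrefixes.response and keying the response attributes off it (illustrative names; the real list lives in AISemanticConventions.ts):

```typescript
// Illustrative sketch of the consistent rename, not the exact package source.
const AIPrefixes = {
  telemetry: "telemetry",
  prompt: "prompt",
  toolCall: "toolCall",
  response: "response", // key renamed from "result" for consistency
} as const;

// Response-related attribute keys built off AIPrefixes.response
// (a hypothetical subset for illustration):
const AISemanticConventions = {
  RESPONSE_TEXT: `ai.${AIPrefixes.response}.text`,
  RESPONSE_TOOL_CALLS: `ai.${AIPrefixes.response}.toolCalls`,
} as const;

console.log(AISemanticConventions.RESPONSE_TEXT); // "ai.response.text"
```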


@luke-moehlenbrock (Author)

Thanks for the suggestions @Parker-Stafford and @mikeldking! I went through and changed result to response throughout the package.

@dosubot dosubot bot added size:M This PR changes 30-99 lines, ignoring generated files. and removed size:XS This PR changes 0-9 lines, ignoring generated files. labels Jan 16, 2025
@mikeldking

Thanks for the fix!

@mikeldking mikeldking merged commit 365a3c2 into Arize-ai:main Jan 18, 2025
3 checks passed