fix: Update openinference-vercel semantic convention mapping for output #1203
Conversation
CLA Assistant Lite bot: All contributors have signed the CLA ✍️ ✅
I have read the CLA Document and I hereby sign the CLA.
Thanks @luke-moehlenbrock ! Great catch.
```diff
@@ -13,7 +13,7 @@ const AIPrefixes = {
   telemetry: "telemetry",
   prompt: "prompt",
   toolCall: "toolCall",
-  result: "result",
+  result: "response",
```
```suggestion
-  result: "response",
+  response: "response",
```
Shouldn't this be the case here? @Parker-Stafford
Yeah, for consistency let's do that. Then you'll want to change these:
https://github.com/Arize-ai/openinference/blob/main/js/packages/openinference-vercel/src/AISemanticConventions.ts#L53-L60
to response as well, and key off AIPrefixes.response.
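A rough sketch of what this suggestion amounts to (the actual entries at L53-L60 of AISemanticConventions.ts are not shown in this thread, so the attribute suffixes below are assumptions for illustration):

```typescript
// Assumed shape of the prefix map after renaming the key to "response".
const AIPrefixes = {
  telemetry: "telemetry",
  prompt: "prompt",
  toolCall: "toolCall",
  response: "response",
} as const;

// Hypothetical convention entries keyed off AIPrefixes.response, so that
// they resolve to Vercel's renamed `ai.response.*` attribute names.
const AISemanticConventions = {
  RESPONSE_OBJECT: `ai.${AIPrefixes.response}.object`,
  RESPONSE_TEXT: `ai.${AIPrefixes.response}.text`,
  RESPONSE_TOOL_CALLS: `ai.${AIPrefixes.response}.toolCalls`,
} as const;

console.log(AISemanticConventions.RESPONSE_TEXT); // "ai.response.text"
```

Keying every entry off the single prefix constant means a future rename only has to touch one place.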
Thanks for the suggestions @Parker-Stafford and @mikeldking! I went through and changed `result` to `response` throughout the package.
Force-pushed from 51f6b92 to d944a82.
Thanks for the fix!
When using openinference-vercel with the Vercel AI SDK, spans don't have the output attribute. This is due to Vercel updating their semantic conventions to use `ai.response.text` for the model response instead of `ai.result.text`. `ai.response.text` was added to the semantic conventions and `ai.result.text` was marked as deprecated in this commit, and `ai.result.text` was removed entirely in this commit.

With the current mapping, there's no output attribute, which results in the Phoenix UI only showing the input for the span.

Updating the prefix from `result` to `response` fixes this.
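The effect of the prefix change can be sketched as follows (a minimal reconstruction: only the keys shown in the diff above come from the source; the `aiAttribute` helper is hypothetical, used here just to show how the prefix feeds into attribute names):

```typescript
// Prefix map after the fix: the old `result: "result"` entry becomes a
// "response" prefix, matching Vercel's renamed `ai.response.*` attributes.
const AIPrefixes = {
  telemetry: "telemetry",
  prompt: "prompt",
  toolCall: "toolCall",
  response: "response", // was: result: "result"
} as const;

// Hypothetical helper: build a full Vercel attribute name from a prefix.
const aiAttribute = (prefix: string, suffix: string): string =>
  `ai.${prefix}.${suffix}`;

// With the updated prefix, the output attribute resolves to the current
// convention name rather than the removed `ai.result.text`.
console.log(aiAttribute(AIPrefixes.response, "text")); // "ai.response.text"
```

Since `ai.result.text` was removed from the SDK entirely, any mapping still built on the `result` prefix silently matches nothing, which is why the output attribute disappeared.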