Cannot exclude temperature param (it is unsupported by OpenAI o1) #909

Open
sansari opened this issue Jan 23, 2025 · 0 comments

sansari commented Jan 23, 2025

Describe the bug
OpenAI's o1 models (e.g. o1-2024-12-17) do not support the temperature parameter. However, it is currently not possible to call the chat method without a temperature being sent, because the library falls back to the value in DEFAULTS:

[Screenshot: library code falling back to the DEFAULTS temperature]

To Reproduce

chat(messages: [...], model: "o1-2024-12-17", temperature: nil)
chat(messages: [...], model: "o1-2024-12-17")

Both of the above end up using the DEFAULTS temperature value, which is set up inside the library's internals and cannot easily be overridden.
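
A minimal sketch of the pattern (an assumed simplification, not the library's actual code) shows why passing nil or omitting the argument makes no difference: the merge with DEFAULTS reinstates the temperature either way.

# Illustrative sketch only — assumed simplification of the parameter handling.
DEFAULTS = { temperature: 0.0 }.freeze

def build_params(model:, temperature: nil, **rest)
  # A nil or missing temperature falls back to DEFAULTS, so the key is always sent.
  { model: model, temperature: temperature || DEFAULTS[:temperature] }.merge(rest)
end

build_params(model: "o1-2024-12-17", temperature: nil) # => { model: "o1-2024-12-17", temperature: 0.0 }
build_params(model: "o1-2024-12-17")                   # => { model: "o1-2024-12-17", temperature: 0.0 }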

Expected behavior
We probably want something similar to the way the dimensions parameter is removed for a particular model: https://github.com/patterns-ai-core/langchainrb/blob/main/lib/langchain/llm/openai.rb#L86
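
For illustration only, a hypothetical guard along the lines of that dimensions handling might look like this (method and constant names are assumptions, not langchainrb's API):

# Hypothetical sketch — names are illustrative, not from the library.
O1_MODEL_PREFIX = /\Ao1/

def scrub_unsupported_params(parameters)
  # o1-family models reject temperature, so drop it before sending the request.
  parameters.delete(:temperature) if parameters[:model].to_s.match?(O1_MODEL_PREFIX)
  parameters
end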


Desktop (please complete the following information):

  • OS: Linux
  • Ruby version: 3.3.5
  • Langchain.rb version: 0.19.2