
Integration Support for GPT-4 Turbo and Updated GPT 3.5 Turbo Models #707

Closed

alex-feel opened this issue Nov 7, 2023 · 3 comments

@alex-feel

I'd like to request the integration of the new GPT-4 Turbo (gpt-4-1106-preview) and Updated GPT 3.5 Turbo (gpt-3.5-turbo-1106) models as announced in OpenAI's recent blog post.

The current litellm documentation mentions support only for gpt-4-1106-preview. It would be beneficial for us to support both of the models mentioned above.

Please note that this might require updating the openai package to pick up the bug fixes from this PR. However, the latest litellm release (0.13.2) is only compatible with openai up to version 0.28.1, which should be taken into account so compatibility is maintained.
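
For reference, a minimal sketch of how the two requested models could be called through litellm's completion API once supported. The prompt is arbitrary, and an OPENAI_API_KEY environment variable is assumed to be set; this is not Danswer's actual configuration path.

```python
from litellm import completion

# The two model identifiers requested in this issue.
for model in ("gpt-4-1106-preview", "gpt-3.5-turbo-1106"):
    # Assumes OPENAI_API_KEY is set in the environment.
    response = completion(
        model=model,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(model, "->", response["choices"][0]["message"]["content"])
```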

@yuhongsun96
Contributor

Good suggestion, will look into it and also ask the LiteLLM folks

@photonn

photonn commented Jan 9, 2024

In Azure, gpt-4-1106-preview IS gpt-4 turbo. I've been using it with danswer without issues for a few weeks now:

https://techcommunity.microsoft.com/t5/ai-azure-ai-services-blog/azure-openai-service-launches-gpt-4-turbo-and-gpt-3-5-turbo-1106/ba-p/3985962

It's curious, though: Microsoft announced a 120k token window at Ignite, but I only have 80k available. Maybe they made a mistake with the window of 3.5-turbo, which is 120k.
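
For anyone trying to reproduce this, a rough sketch of pointing litellm at an Azure OpenAI deployment. The deployment name, resource endpoint, API version, and environment variable name below are placeholders, not values from this thread.

```python
import os

from litellm import completion

# Placeholders: substitute your own Azure deployment name and resource endpoint.
response = completion(
    model="azure/<your-gpt-4-1106-deployment>",
    api_base="https://<your-resource>.openai.azure.com",
    api_version="2023-07-01-preview",  # assumed; use the version your resource supports
    api_key=os.environ["AZURE_OPENAI_API_KEY"],  # assumed env var name
    messages=[{"role": "user", "content": "Hello from Azure GPT-4 Turbo"}],
)
print(response["choices"][0]["message"]["content"])
```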

@yuhongsun96
Contributor

Yup, gpt-4-turbo has been supported for quite some time, going to close this one now!
