I'd like to request the integration of the new GPT-4 Turbo (`gpt-4-1106-preview`) and updated GPT-3.5 Turbo (`gpt-3.5-turbo-1106`) models announced in OpenAI's recent blog post.
The current litellm documentation only mentions support for `gpt-4-1106-preview`; it would be beneficial to support both of the models listed above.
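For reference, here is a minimal sketch of how both models could be exercised through litellm's standard `completion` call once they are recognized. The prompt and the response-access style are illustrative only, not taken from the litellm docs:

```python
import os
from litellm import completion

# Assumes OPENAI_API_KEY is already set in the environment.
assert "OPENAI_API_KEY" in os.environ

# The two model names from OpenAI's announcement.
for model in ("gpt-4-1106-preview", "gpt-3.5-turbo-1106"):
    response = completion(
        model=model,
        messages=[{"role": "user", "content": "Say hello in one word."}],
    )
    print(model, "->", response.choices[0].message.content)
```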
Please note that this might require updating the `openai` package to pick up the bug fixes from this PR. However, the latest `litellm` release (0.13.2) is only compatible with `openai` versions up to 0.28.1, so that constraint should be kept in mind to maintain compatibility.
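Until a litellm release that supports a newer `openai` lands, something like the following pin (versions taken from the compatibility note above) would keep the two packages aligned:

```
# requirements.txt
litellm==0.13.2
openai<=0.28.1
```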
It's curious, as Microsoft announced a 120k token window at Ignite, but I only have 80k available. Maybe they made a mistake with the window for 3.5-turbo, which is 120k.