The /chat/completions endpoint accepts a list of messages, so maybe sending the full chat history improves quality?
curl -X POST 'https://litellm.netzbegruenung.verdigado.net/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer SECRET' \
-d '{ "model": "llama3.3", "messages": [ { "role": "system", "content": "You are a helpful assistant." }, { "role": "user", "content": "Who won the world series in 2020?" }, { "role": "assistant", "content": "The Los Angeles Dodgers won the 2020 World Series, defeating the Tampa Bay Rays in the series 4 games to 2. This was the Dodgers first World Series title since 1988. The final game was played on October 27, 2020, at Globe Life Field in Arlington, Texas." }, { "role": "user", "content": "And who won in 1992?" } ]}'| jq .
To accomplish this, we need to support a list of messages in our chatanswers endpoint. Additionally, we need to fetch the data in the Integreat CMS and pass it to the chat back end. The get_messages() method can be used, and its result passed into process_user_message() (see the sketch below).
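A rough sketch of how this could fit together; the exact signatures of get_messages() and an extended process_user_message() are assumptions here, not the current API:

# Sketch only: the shapes of get_messages() and process_user_message()
# are assumptions, not the current chat back end API.
import requests

LLM_URL = "https://litellm.netzbegruenung.verdigado.net/chat/completions"

def process_user_message(messages, api_token, model="llama3.3"):
    """Send the full chat history (not just the latest message) to the LLM."""
    payload = {
        "model": model,
        "messages": [{"role": "system", "content": "You are a helpful assistant."}]
        + messages,
    }
    response = requests.post(
        LLM_URL,
        headers={"Authorization": f"Bearer {api_token}"},
        json=payload,
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

# In the Integreat CMS, the history could then be collected and forwarded, e.g.:
# messages = chat.get_messages()   # assumed to return role/content dicts
# answer = process_user_message(messages, api_token)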
If we do this for languages that are not supported by the LLM, translations will be quite costly, so we should think about caching translations in the chat back end.
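One possible shape for such a cache: key translated history messages by source text and target language, so unchanged messages are not re-translated on every request. The translate() helper below is a placeholder for whatever translation call the back end actually uses:

# Sketch of a translation cache; translate() is a hypothetical stand-in.
import hashlib

_translation_cache: dict[str, str] = {}

def cached_translate(text: str, target_language: str, translate) -> str:
    """Translate text once per (text, language) pair and reuse the result
    for unchanged history messages on later requests."""
    key = hashlib.sha256(f"{target_language}:{text}".encode()).hexdigest()
    if key not in _translation_cache:
        _translation_cache[key] = translate(text, target_language)
    return _translation_cache[key]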