
Evaluate sending full message history #126

Open
svenseeberg opened this issue Jan 9, 2025 · 1 comment
Labels
analysis (Analyse/comparative study of features), enhancement (New feature or request), prio:low

Comments

@svenseeberg
Member

svenseeberg commented Jan 9, 2025

The /chat/completions endpoint accepts a list of all previous messages. Maybe sending the full chat history improves answer quality?

curl -X POST 'https://litellm.netzbegruenung.verdigado.net/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer SECRET' \
-d '{
    "model": "llama3.3",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Who won the world series in 2020?"
      },
      {
        "role": "assistant",
        "content": "The Los Angeles Dodgers won the 2020 World Series, defeating the Tampa Bay Rays in the series 4 games to 2. This was the Dodgers first World Series title since 1988. The final game was played on October 27, 2020, at Globe Life Field in Arlington, Texas."
      },
      {
        "role": "user",
        "content": "And who won in 1992?"
      }
    ]
}' | jq .

To accomplish this, we need to support a list of messages in our chatanswers endpoint. Additionally, we need to fetch the history in the Integreat CMS and pass it to the chat back end. The get_messages() method can be used and its result passed into process_user_message().
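A minimal sketch of how the history could be assembled before calling the back end. `get_messages()` and `process_user_message()` are named in this issue, but their exact signatures, the `is_user_message` attribute, and the message dict shape are assumptions, not the actual CMS API:

```python
# Hedged sketch: build the full /chat/completions message list from a chat's
# stored history plus the newly submitted user message. The chat object's
# interface (get_messages(), .is_user_message, .content) is assumed here.

def build_message_history(chat, new_user_message):
    """Assemble the message list to send to the /chat/completions endpoint."""
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    for message in chat.get_messages():  # assumed: past messages in chronological order
        role = "user" if message.is_user_message else "assistant"
        messages.append({"role": role, "content": message.content})
    messages.append({"role": "user", "content": new_user_message})
    return messages
```

The resulting list could then be handed to process_user_message() instead of a single string.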

For languages that are not supported by the LLM, translating the full history will be quite costly. So we may have to think about some caching in the chat back end.
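One possible caching shape, sketched below: key translations by a hash of the message content and the target language, so resending the full history only translates messages that have not been seen before. This is an illustration only; the `translate` callable and cache layout are assumptions, not the implementation referenced later in #127:

```python
# Hedged sketch: a content-addressed translation cache. Unchanged history
# messages hit the cache; only new messages incur a translation call.
import hashlib

_translation_cache = {}

def translate_cached(text, target_language, translate):
    """Return a cached translation, calling `translate` only on a cache miss."""
    key = (hashlib.sha256(text.encode("utf-8")).hexdigest(), target_language)
    if key not in _translation_cache:
        _translation_cache[key] = translate(text, target_language)
    return _translation_cache[key]
```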

@svenseeberg svenseeberg added enhancement New feature or request analysis Analyse/comparative study of features labels Jan 9, 2025
@svenseeberg svenseeberg changed the title Evaulate sending full message history Evaluate sending full message history Jan 9, 2025
@svenseeberg
Member Author

Caching is already implemented, see #127
