
Support specification of llm model in Bee UI #177

Open
planetf1 opened this issue Jan 9, 2025 · 0 comments

Related Slack thread

Related issues

  • Link any previous issues that might be related here.

Description

Allow the LLM model used by an agent to be modified via the UI when running the bee stack.

Motivation

An agent created via the Bee UI always uses the default LLM for the provider configured via LLM_BACKEND.

A new agent can be created via the API, as documented in the bee-stack README, i.e.:

➜  bee-stack git:(main) curl -X POST \
  "${BEE_API:-localhost:4000}/v1/assistants" \
  -H "Authorization: Bearer ${BEE_API_KEY:-sk-proj-testkey}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Granite assistant",
    "model": "granite3.1-dense:8b"
  }'
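For completeness, and assuming the bee API follows the OpenAI-style Assistants API shape that the create call above suggests (this is an assumption, not confirmed from the bee-stack docs), the model of an existing assistant could presumably be inspected or changed with calls like the following; asst_abc123 is a placeholder ID:

# List assistants to find the ID and current model (assumed endpoint)
curl "${BEE_API:-localhost:4000}/v1/assistants" \
  -H "Authorization: Bearer ${BEE_API_KEY:-sk-proj-testkey}"

# Update the model on an existing assistant (assumed endpoint; asst_abc123 is a placeholder)
curl -X POST \
  "${BEE_API:-localhost:4000}/v1/assistants/asst_abc123" \
  -H "Authorization: Bearer ${BEE_API_KEY:-sk-proj-testkey}" \
  -H "Content-Type: application/json" \
  -d '{"model": "granite3.1-dense:8b"}'

Having to script this by hand is exactly the friction the UI change would remove.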

However, I think this would be more user-friendly if available through the UI, especially as different agents work better or worse with different LLMs, so exposing it would facilitate experimentation.

Ideas

  1. Consider supporting an OLLAMA_MODEL environment variable (or similar) to specify the default model Bee uses when creating an agent (see the sketch after this list).
  2. Additionally, allow the model to be updated in the Bee UI.
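A minimal sketch of idea 1, assuming bee-stack takes its configuration from a .env file and that a hypothetical OLLAMA_MODEL variable (not currently read by bee-stack) were wired through as the default when an agent is created:

# .env (hypothetical addition; OLLAMA_MODEL is not currently supported)
LLM_BACKEND=ollama
OLLAMA_MODEL=granite3.1-dense:8b   # default model for agents created via the UI

This would keep the current behaviour when the variable is unset (fall back to the provider's default model) while letting users pick a different default without touching the API.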

Open Questions

Additional context

Add any other context or screenshots here.
