Does Eidolon-AI support the use of self-hosted local LLMs? This would provide users with the flexibility to keep their data within their organization, avoiding external transmission.

Replies: 1 comment

Hi, sorry about the late response. We haven't yet set up automatic tagging on discussions. Eidolon does support local LLMs via Ollama; our documentation covers how to configure the Ollama LLM provider.
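For anyone finding this thread later, here is a rough sketch of what that configuration could look like. This is a hypothetical YAML fragment: the component name `OllamaLLMUnit` and the `apu.llm_unit` key path are assumptions on my part, so treat the Eidolon documentation's page on the Ollama LLM provider as the source of truth for the actual schema.

```yaml
# Hypothetical fragment of an Eidolon agent spec.
# The component name (OllamaLLMUnit) and key layout are assumptions;
# see the Eidolon docs on the Ollama LLM provider for the real schema.
spec:
  apu:
    llm_unit:
      implementation: OllamaLLMUnit   # assumed name of the Ollama provider
      model: llama3                   # any model already pulled into local Ollama
      # Ollama serves on http://localhost:11434 by default, so inference
      # runs entirely on the local machine.
```

The only prerequisite on the host is a running Ollama server with the chosen model pulled (for example, `ollama pull llama3`). Since Ollama listens on localhost by default, prompts and completions never leave the machine, which is the data-residency property the question asks about.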