I have seen many projects, big and small, implement AI in a way that lets people run models locally with ollama, or access 100+ LLMs through litellm, without being forced into or limited to a specific provider like yours. This is a pretty basic thing, so I wonder why it isn't implemented this way in this project.
To be honest, it doesn't look good when great open source projects themselves don't integrate well with other powerful and capable open source projects.
Open source projects like Logseq, and even closed source but quite reputable projects like Obsidian, have integrated beautifully with local LLMs through litellm and ollama, so hopefully in the future we will see a better AI implementation in this project.
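For concreteness, here is a minimal sketch (not tied to this project's code) of what that provider-agnostic setup looks like: litellm exposes one OpenAI-style call and routes it to a local ollama server. The model name and endpoint below are just illustrative ollama defaults.

```python
# Minimal sketch: litellm presents a single OpenAI-style interface and routes
# the call to whichever backend the model string names -- here a local ollama
# server. Assumes ollama is running on its default port with a model already
# pulled; the model name "llama3" is illustrative.
from litellm import completion

response = completion(
    model="ollama/llama3",              # provider prefix selects the backend
    messages=[{"role": "user", "content": "Summarize this note in one sentence."}],
    api_base="http://localhost:11434",  # default local ollama endpoint
)
print(response.choices[0].message.content)
```

Swapping in a hosted provider would only mean changing the `model` string, which is the point of routing through litellm rather than hard-coding one vendor.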