
[Question] What's the need for logging in to AppFlowy Cloud to access AI settings when we have local models? #7234

Open
Greatz08 opened this issue Jan 17, 2025 · 0 comments
[Image attached]

I have seen many projects, big and small, implement AI in a way that lets people run models locally with Ollama, or reach 100+ LLMs through LiteLLM, without being forced into or limited to a specific provider like AppFlowy Cloud. This is a pretty basic expectation, so I wonder why it isn't implemented this way in this project.
To be honest, it doesn't look good when a great open source project doesn't integrate well with other powerful and capable open source projects.
Open source projects like Logseq, and even closed source but quite reputable ones like Obsidian, have integrated beautifully with local LLMs through LiteLLM and Ollama, so hopefully we will see a better AI implementation in this project in the future.
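For context, both Ollama and a LiteLLM proxy expose OpenAI-compatible chat endpoints, so a local-only setup really only needs a user-configurable base URL and model name rather than a cloud login. Below is a minimal sketch, assuming Ollama is running on its default port (11434) and that a model such as `llama3` has already been pulled; the model name and prompt are placeholders, not anything AppFlowy ships today.

```python
import requests

# Ollama serves an OpenAI-compatible API under /v1 on its default port (11434).
# A LiteLLM proxy works the same way; only the base URL changes
# (e.g. http://localhost:4000/v1). The model name is an assumption:
# use any model already pulled with `ollama pull <name>`.
BASE_URL = "http://localhost:11434/v1"
MODEL = "llama3"  # placeholder; substitute a locally available model

def chat(prompt: str) -> str:
    """Send a single chat message to the local model and return its reply."""
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Summarize this note in one sentence: AppFlowy is an open source workspace."))
```

The same request shape works against a LiteLLM proxy fronting any of its supported providers, which is why a single configurable endpoint would cover both fully local and hosted models without requiring an AppFlowy Cloud account.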
