
Add support for openai-compatible APIs that do not support streaming #353

Open
jackmpcollins opened this issue Oct 20, 2024 · 1 comment

@jackmpcollins
Owner

Currently OpenaiChatModel sets stream=True and parses streamed responses. Some openai-compatible APIs do not support streaming, which means they cannot be used with magentic.

For example, ollama does not support streaming: #207 Also, openai's o1-preview does not currently support streaming (or function calling), so it cannot be used with magentic even just to generate string responses.

The implementation could probably just convert a non-streamed response into the streamed format (as a single chunk, or one chunk per content/tool-call section) and then reuse the existing parsing code.
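A minimal sketch of that conversion, using plain dicts shaped like the OpenAI chat completion and chunk payloads (the helper name and exact fields are assumptions, not magentic's actual internals):

```python
def response_to_chunks(response: dict):
    """Yield chunk-shaped dicts from a non-streamed chat completion response.

    Each choice's full message is emitted as a single "delta", so existing
    stream-parsing code can consume it unchanged.
    """
    for choice in response.get("choices", []):
        message = choice.get("message", {})
        delta = {"role": message.get("role")}
        if message.get("content") is not None:
            delta["content"] = message["content"]
        if message.get("tool_calls"):
            delta["tool_calls"] = message["tool_calls"]
        yield {
            "id": response.get("id"),
            "object": "chat.completion.chunk",
            "model": response.get("model"),
            "choices": [
                {
                    "index": choice.get("index", 0),
                    "delta": delta,
                    "finish_reason": choice.get("finish_reason"),
                }
            ],
        }


# Example: a non-streamed response becomes a one-chunk "stream".
response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "model": "o1-preview",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello!"},
            "finish_reason": "stop",
        }
    ],
}
chunks = list(response_to_chunks(response))
```

A variant could emit one chunk per content or tool-call section instead of one per choice, if the downstream parser expects those separated.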

@mnicstruwig
Contributor

Ah nice -- I ran into this issue while implementing a custom copilot for SambaNova: OpenBB-finance/copilot-for-openbb#28

We had to abandon magentic's chat models for this reason (but we still borrowed the function-schema machinery).
