Currently `OpenaiChatModel` always uses `stream=True` and parses streamed responses. Some OpenAI-compatible APIs do not support streaming, which means they cannot be used with magentic.

For example, ollama does not support streaming: #207. OpenAI's `o1-preview` also does not currently support streaming (or function calling), so it cannot be used with magentic even just to generate string responses.

The implementation could probably just convert a non-streamed response into the streamed format (with a single chunk, or one chunk per content/tool-call section) and then reuse the existing parsing code, as in the sketch below.
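A minimal sketch of that idea, assuming the openai Python SDK's response types: call the API with `stream=False` and wrap the single `ChatCompletion` in a one-chunk iterator shaped like a streamed response, so the existing chunk-parsing code path can be reused unchanged. The adapter name `completion_to_chunks` is hypothetical and not part of magentic or the openai SDK.

```python
from typing import Iterator

from openai.types.chat import ChatCompletion, ChatCompletionChunk
from openai.types.chat.chat_completion_chunk import (
    Choice as ChunkChoice,
    ChoiceDelta,
    ChoiceDeltaToolCall,
    ChoiceDeltaToolCallFunction,
)


def completion_to_chunks(completion: ChatCompletion) -> Iterator[ChatCompletionChunk]:
    """Yield a non-streamed completion as a single streamed-style chunk."""
    choices = []
    for choice in completion.choices:
        message = choice.message
        tool_calls = None
        if message.tool_calls:
            # Copy each tool call into the delta format, preserving its index
            # so downstream code can group arguments by tool call as usual.
            tool_calls = [
                ChoiceDeltaToolCall(
                    index=i,
                    id=tool_call.id,
                    type=tool_call.type,
                    function=ChoiceDeltaToolCallFunction(
                        name=tool_call.function.name,
                        arguments=tool_call.function.arguments,
                    ),
                )
                for i, tool_call in enumerate(message.tool_calls)
            ]
        choices.append(
            ChunkChoice(
                index=choice.index,
                delta=ChoiceDelta(
                    role=message.role,
                    content=message.content,
                    tool_calls=tool_calls,
                ),
                finish_reason=choice.finish_reason,
            )
        )
    # Everything arrives in one chunk; splitting content and tool calls into
    # separate chunks would also work if the parser expects that shape.
    yield ChatCompletionChunk(
        id=completion.id,
        object="chat.completion.chunk",
        created=completion.created,
        model=completion.model,
        choices=choices,
    )
```

With an adapter like this, `OpenaiChatModel` could fall back to `stream=False` for models or backends that reject streaming and still feed the result through the same streamed-response handling.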