Does Autogen 0.4 support streaming mode? #5154
easonktsai asked this question in Q&A (unanswered)
-
Look down a bit and you will find it.
-
How can I enable streaming responses for the AzureOpenAIChatCompletionClient in RoundRobinGroupChat?

```python
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient

model_client = AzureOpenAIChatCompletionClient(
    azure_endpoint="",
    api_version="2024-08-01-preview",
    api_key="",
    model="gpt-4o",
)


async def get_weather(location: str) -> str:
    return f"The weather in {location} is sunny."


assistant = AssistantAgent(
    "Assistant",
    model_client=model_client,
    tools=[get_weather],
)

termination = TextMentionTermination("TERMINATE")
team = RoundRobinGroupChat([assistant], termination_condition=termination)

# Top-level await works in a notebook; in a plain script, wrap this in asyncio.run().
await Console(team.run_stream(task="What's the weather in New York?"))
```
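For reference, continuing from the snippet above: `Console(team.run_stream(...))` already streams messages as the team produces them, while token-by-token output from the model is switched on at the agent level. A minimal sketch, assuming the 0.4 `AssistantAgent` accepts a `model_client_stream` flag; treat the flag name as an assumption if your installed version differs:

```python
# Sketch, reusing model_client, get_weather, and termination from the snippet above.
# Assumption: autogen-agentchat 0.4's AssistantAgent takes model_client_stream=True,
# which makes it emit streaming chunk events that Console prints as they arrive.
streaming_assistant = AssistantAgent(
    "Assistant",
    model_client=model_client,
    tools=[get_weather],
    model_client_stream=True,  # assumed flag for token-by-token streaming
)
streaming_team = RoundRobinGroupChat(
    [streaming_assistant], termination_condition=termination
)

# run_stream() is an async generator of messages; Console renders them incrementally.
await Console(streaming_team.run_stream(task="What's the weather in New York?"))
```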
-
I noticed that in the `BaseOpenAIChatCompletionClient` class, the `stream` parameter passed to `self._client.chat.completions.create` is hardcoded to `False`. Does this mean that streaming is not currently supported?
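If it helps, the hardcoded `stream=False` only applies to the non-streaming `create` path; the 0.4 OpenAI clients also expose a separate `create_stream` method intended for streaming. A minimal sketch under that assumption (partial text chunks arrive as `str`, followed by a final `CreateResult`); the names follow autogen-ext 0.4 and should be double-checked against your installed version:

```python
from autogen_core.models import UserMessage
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient


async def stream_once(client: AzureOpenAIChatCompletionClient) -> None:
    # Assumption: create_stream() is an async generator that yields partial
    # text chunks (str) and finishes with a CreateResult carrying the full content.
    async for chunk in client.create_stream(
        [UserMessage(content="Write a haiku about streaming.", source="user")]
    ):
        if isinstance(chunk, str):
            print(chunk, end="", flush=True)  # print tokens as they arrive
        else:
            print("\nfinish_reason:", chunk.finish_reason)
```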