Added Azure AI Chat Completion Client #4723
Conversation
@yanivvak can you review this?
@ekzhu @rohanthacker
@lspinheiro could you help review this PR?
Looks quite good and is consistent with the openai client. I have a minor comment about the config validation. @jackgerrits may have more options, since a lot of the design decisions here are driven by his original implementation of the openai client. If anything doesn't make sense in this context, he would be the best person to evaluate.
* Added: object-level usage data
* Added: doc string
* Added: check existing response_format value
* Added: _validate_config and _create_client
Branch force-pushed from d53421b to daf43de.
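For readers following the commit list above, here is a minimal sketch of what splitting config validation from client construction can look like. This is an illustration only, not the PR's actual code: the required keys and the ChatCompletionsClient wiring are assumptions modeled on the azure-ai-inference SDK.

```python
# Hypothetical sketch only; the real _validate_config/_create_client in the PR may differ.
from typing import Any, Dict

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential


def _validate_config(config: Dict[str, Any]) -> Dict[str, Any]:
    """Fail fast on missing required settings before constructing the client."""
    for key in ("endpoint", "credential", "model"):
        if key not in config:
            raise ValueError(f"Required config key '{key}' is missing")
    return config


def _create_client(config: Dict[str, Any]) -> ChatCompletionsClient:
    """Build the underlying azure-ai-inference client from a validated config."""
    return ChatCompletionsClient(
        endpoint=config["endpoint"],
        credential=AzureKeyCredential(config["credential"]),
    )
```

Keeping validation separate from construction lets configuration errors surface immediately with a clear message, rather than as an SDK error at first request time.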
@rohanthacker Is this PR ready to review? It looks good from a code perspective. @srjoglekar246 could you help use it in an assistant agent and run some teams on it to test it out?
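As a rough illustration of the smoke test being requested, one could wire the new client into an AssistantAgent along these lines. This is a hedged sketch: the AzureAIChatCompletionClient constructor arguments shown (endpoint, credential, model_info) are assumptions based on how azure-ai-inference deployments are typically configured, not a confirmed API from this PR, and the endpoint/key values are placeholders.

```python
# Hedged sketch: constructor arguments are assumptions, not the PR's final API.
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.azure import AzureAIChatCompletionClient
from azure.core.credentials import AzureKeyCredential


async def main() -> None:
    # Placeholder endpoint and key; replace with your Azure AI Inference deployment details.
    client = AzureAIChatCompletionClient(
        model="Phi-4",
        endpoint="https://<your-endpoint>.inference.ai.azure.com",
        credential=AzureKeyCredential("<your-api-key>"),
        model_info={
            "json_output": False,
            "function_calling": False,
            "vision": False,
            "family": "unknown",
        },
    )
    # Run a single-turn task through an assistant agent backed by the new client.
    agent = AssistantAgent("assistant", model_client=client)
    result = await agent.run(task="What is 2 + 2?")
    print(result.messages[-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```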
@rohanthacker I made this PR ready for review.
@ekzhu For context, this works for Azure AI Inference. I tested on a Phi-4 deployment I created.

```python
from semantic_kernel import Kernel
from semantic_kernel.memory.null_memory import NullMemory
from semantic_kernel.connectors.ai.azure_ai_inference import AzureAIInferenceChatCompletion
from semantic_kernel.connectors.ai.azure_ai_inference import AzureAIInferenceChatPromptExecutionSettings
from autogen_core.models import SystemMessage, UserMessage, LLMMessage
from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter

kernel = Kernel(memory=NullMemory())

execution_settings = AzureAIInferenceChatPromptExecutionSettings(
    max_tokens=100,
    temperature=0.5,
    top_p=0.9,
)

chat_completion_service = AzureAIInferenceChatCompletion(ai_model_id="Phi-4")
model_adapter = SKChatCompletionAdapter(sk_client=chat_completion_service)

messages: list[LLMMessage] = [
    SystemMessage(content="You are a helpful assistant."),
    UserMessage(content="What is 2 + 2?", source="user"),
]

azure_result = await model_adapter.create(
    messages=messages,
    extra_create_args={"kernel": kernel, "prompt_execution_settings": execution_settings},
)
print("Azure result:", azure_result.content)
```
To avoid confusion and prevent overwriting each other's work, I am closing this pull request in favor of #5153. However, I would really appreciate a callout (Reddit, docs, etc.) if and when this goes live.
Thanks @rohanthacker, I had just fixed the typing/linting issues in your PR but I couldn't push the changes to your branch. If I had known you would pick it up again, I could have made a PR in your fork. Anyway, I will try to get this done today.
@lspinheiro No worries, thank you for the guidance and suggestions on the PR.
Thanks for all of your work to get it to this point, @rohanthacker! We'll make sure to call it out, and since @lspinheiro's PR builds on your commits, you will also be credited as a co-author in the commit history.
Related issue number
#4683 Adds initial support for Azure AI Chat Completion Client
Checks