With current source code getting error openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}} on running the example_userproxy.py #4936
Comments
When you get a 404 error from the model API endpoint, it means the resource you are accessing is not available. Can you make sure you have passed in the correct deployment name and the right endpoint URL?
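(A quick way to confirm the deployment name and endpoint independently of autogen is to call Azure OpenAI directly with the openai SDK; a 404 from this probe means the problem is in the Azure configuration, not the framework. This is a minimal sketch; the endpoint, key, and deployment name below are placeholders, not values from this issue.)

from openai import AzureOpenAI

# Probe the Azure resource directly; all values below are placeholders.
client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-KEY",
    api_version="2024-08-01-preview",
)
# For Azure, "model" is the *deployment name* you created in the portal,
# not necessarily the underlying model name.
response = client.chat.completions.create(
    model="gpt-35-turbo-16k",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)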
@ekzhu As I mentioned above, the source code cloned on 18 Nov 2024 works fine with the same variable values. The problem is with the current (master) copy.
The instructions for model configuration have been updated. Please read: https://github.com/microsoft/autogen/tree/main/python/packages/autogen-magentic-one#azure-openai-service Also, we are phasing out the autogen-magentic-one package.
@ekzhu I did update the configuration as per the 0.4.0.dev13 documentation, but I am still getting the same error. Below are the details; please let me know what is missing.
set CHAT_COMPLETION_PROVIDER=azure
set CHAT_COMPLETION_KWARGS_JSON={"api_version": "2024-11-20","azure_endpoint": "https://*********.openai.azure.com/","model_capabilities": {"function_calling": true,"json_output": true,"vision": true},"azure_ad_token_provider": "DEFAULT","model":"gpt-4o"}
az login
python example_userproxy.py
You need to update the JSON as @ekzhu mentioned. https://github.com/microsoft/autogen/tree/main/python/packages/autogen-magentic-one#environment-configuration-for-chat-completion-client
Where is that JSON supposed to go? I hacked it into example.py along with the call to create the client, but I suspect that's not the right place. Also, that doesn't match the example in the readme: I don't see anything like the readme example in AI Studio.
You can save it to a file and load it when you run it. The input config should be a dictionary which you load from the JSON. Again, we are phasing out the package. Please use the
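(A minimal sketch of the save-to-file-and-load approach described above. The file name model_config.json is an assumption; the import path is the one implied by the traceback below (autogen_ext\models\openai\_openai_client.py), though it may differ between 0.4.0 dev releases. The "DEFAULT" handling is a guess at what the package's env loader does with that marker.)

import json

from autogen_ext.models.openai import AzureOpenAIChatCompletionClient
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

# Load the same dict you would otherwise put in CHAT_COMPLETION_KWARGS_JSON.
with open("model_config.json") as f:
    config = json.load(f)

# If the JSON uses the "DEFAULT" marker for AAD auth, replace it with a real
# token provider before constructing the client.
if config.get("azure_ad_token_provider") == "DEFAULT":
    config["azure_ad_token_provider"] = get_bearer_token_provider(
        DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
    )

client = AzureOpenAIChatCompletionClient(**config)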
The example is using the Azure OpenAI endpoint, not AI Studio. For AI Studio, we are adding a different model client: #4723 |
The config below worked for me; detailed instructions in the setup documentation would be very helpful and save a lot of time.
set CHAT_COMPLETION_PROVIDER=azure
What happened?
I am getting the error below with the current (master) source code, whereas it works fine with the source code downloaded on 18 Nov 2024.
My environment: Windows 11
Example used: example_userproxy.py
Error: openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
Steps used:
git clone https://github.com/microsoft/autogen.git
cd autogen/python
uv sync --all-extras
set CHAT_COMPLETION_PROVIDER=azure
set CHAT_COMPLETION_KWARGS_JSON={"model":"gpt-3.5-turbo-16k", "api_key":"**********************", "api_version":"2024-08-01-preview", "base_url":"https://**************.openai.azure.com", "api_type":"azure", "azure_deployment":"gpt-35-turbo-16k", "model_capabilities": {"function_calling": true,"json_output": true,"vision": true}}
cd .venv/Scripts
activate
cd ..
cd ..
cd packages/autogen-magentic-one
pip install -e .
cd examples
python example_userproxy.py
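(One thing worth noting about the JSON above: it uses the old-style keys base_url and api_type, while the updated instructions and the config that eventually worked use azure_endpoint. If the current client expects azure_endpoint, the old keys would leave it pointing at the wrong URL, which is consistent with a 404. A hedged sketch of constructing the Azure client directly with the new-style keys; the import path and parameter names are assumptions for the 0.4.0 dev releases, and the endpoint/key are placeholders:)

from autogen_ext.models.openai import AzureOpenAIChatCompletionClient

client = AzureOpenAIChatCompletionClient(
    model="gpt-3.5-turbo-16k",
    azure_deployment="gpt-35-turbo-16k",  # must match your Azure deployment name
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # note: azure_endpoint, not base_url
    api_version="2024-08-01-preview",
    api_key="YOUR-KEY",
    model_capabilities={"function_calling": True, "json_output": True, "vision": True},
)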
Detailed error:
(python) C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\examples>python example_userproxy.py
User input ('exit' to quit): who is modi
[2025-01-08T12:07:12.214499], UserProxy:
who is modi
[2025-01-08T12:07:12.215653], orchestrator (thought):
Next speaker Coder
Error processing publish message for orchestrator/default
Traceback (most recent call last):
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_single_threaded_agent_runtime.py", line 409, in _on_message
    return await agent.on_message(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 149, in wrapper
    return_value = await func(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 81, in handle_incoming_message
    await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 47, in _process
    await self._handle_broadcast(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_orchestrator.py", line 95, in _handle_broadcast
    await self.send_message(request_reply_message, next_agent.id, cancellation_token=ctx.cancellation_token)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_base_agent.py", line 130, in send_message
    return await self._runtime.send_message(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_single_threaded_agent_runtime.py", line 232, in send_message
    return await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_single_threaded_agent_runtime.py", line 321, in _process_send
    response = await recipient_agent.on_message(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 149, in wrapper
    return_value = await func(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 81, in handle_incoming_message
    await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 45, in _process
    await self._handle_request_reply(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_worker.py", line 42, in _handle_request_reply
    request_halt, response = await self._generate_reply(ctx.cancellation_token)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\coder.py", line 55, in _generate_reply
    response = await self._model_client.create(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-ext\src\autogen_ext\models\openai\_openai_client.py", line 494, in create
    result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\resources\chat\completions.py", line 1720, in create
    return await self._post(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\_base_client.py", line 1843, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\_base_client.py", line 1537, in request
    return await self._request(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\_base_client.py", line 1638, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
Exception ignored in: <function _ProactorBasePipeTransport.__del__ at 0x0000025BC91FBE20>
Traceback (most recent call last):
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\proactor_events.py", line 116, in __del__
    self.close()
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\proactor_events.py", line 108, in close
    self._loop.call_soon(self._call_connection_lost, None)
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 745, in call_soon
    self._check_closed()
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 510, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
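(The trailing RuntimeError: Event loop is closed is a separate, well-known asyncio teardown quirk with proactor transports on Windows and is unrelated to the 404. If it is noisy, a common general-purpose workaround, offered here as an assumption rather than guidance from this repo, is to switch to the selector event loop policy before running the example:)

import asyncio
import sys

# General Windows workaround for "Event loop is closed" noise at shutdown;
# not specific to autogen.
if sys.platform == "win32":
    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())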
What did you expect to happen?
The example should run without errors.
How can we reproduce it (as minimally and precisely as possible)?
git clone https://github.com/microsoft/autogen.git
cd autogen/python
uv sync --all-extras
set CHAT_COMPLETION_PROVIDER=azure
set CHAT_COMPLETION_KWARGS_JSON={"model":"gpt-3.5-turbo-16k", "api_key":"**********************", "api_version":"2024-08-01-preview", "base_url":"https://**************.openai.azure.com", "api_type":"azure", "azure_deployment":"gpt-35-turbo-16k", "model_capabilities": {"function_calling": true,"json_output": true,"vision": true}}
cd .venv/Scripts
activate
cd ..
cd ..
cd packages/autogen-magentic-one
pip install -e .
cd examples
python example_userproxy.py
AutoGen version
Current
Which package was this bug in
Magentic One
Model used
gpt-3.5-turbo-16k
Python version
Python 3.10.1
Operating system
Windows 11
Any additional info you think would be helpful for fixing this bug
No response