Issues getting agents based on non-OpenAI models to use local function/tool calls #1323
Replies: 2 comments 11 replies
-
First of all, regarding the function not being available: it seems the user proxy agent is in code-execution mode, not function-calling mode. I have seen this happen with non-OpenAI models because they don't follow OpenAI's function/tool-call message format. To make the user proxy use the registered functions, the message from the assistant must include a structured tool/function call rather than plain text.
-
Are skills in AutoGen Studio the same as function calling/tool calling? @ekzhu @sonichi It seems the model does not support function calling, so why does it still run in AutoGen Studio?
-
I'm fairly new to AutoGen, but I can't really make local function calls work. With OpenAI GPT-3.5 Turbo it appears to work, but when running mistral-7b-v0.1.Q6_K.gguf or mistral-7b-instruct-v0.2.Q4_K_M.gguf, the model appears to be completely unaware of the local functions; maybe there is something I'm missing. So first I added a Python function that can call our ERP system and get the description of a machine ID.
I import the skills
I then created 2 models like this
So far so good: when calling OpenAI 3.5 it works, for whatever reason. But with Mistral the LLM just returns:
It appears that Mistral isn't able to pick up the function list I passed. Is that correct? Is it "just" the model having fewer features than OpenAI's?
Update
So I just told the system message that it should know the functions (I added another function to the original post).
Then it works, in the sense that the generated code uses those functions, BUT I get the following errors:
Why isn't the function available?