Replies: 2 comments
-
Thanks for the issue. It looks like you can set a default reply message (`default_auto_reply`) in the user proxy.
-
If you would like it to terminate when `default_auto_reply` is reached, you can set `default_auto_reply` to "TERMINATE", or set `is_termination_msg` on the recipient to check for empty messages. If you would like the agent to call the LLM instead, add an `llm_config` to the user proxy.
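A minimal sketch of the two options above, assuming the `pyautogen` API (`UserProxyAgent` and its `default_auto_reply` / `is_termination_msg` parameters); the agent construction is shown as a commented configuration fragment, and the termination predicate `is_empty_message` is a hypothetical helper name:

```python
def is_empty_message(msg: dict) -> bool:
    """Termination predicate: treat a missing or blank 'content' as terminal."""
    content = msg.get("content")
    return content is None or str(content).strip() == ""

# Usage sketch (requires `pip install pyautogen`; verify parameter names
# against your installed version):
# from autogen import UserProxyAgent
# user_proxy = UserProxyAgent(
#     name="user_proxy",
#     human_input_mode="NEVER",
#     default_auto_reply="TERMINATE",       # option 1: auto-reply that ends the chat
#     is_termination_msg=is_empty_message,  # option 2: stop when a message is empty
#     code_execution_config={"use_docker": True},
# )
```

With `is_termination_msg` set, the conversation ends as soon as the user proxy receives an empty message, instead of sending an empty reply back to the assistant.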
-
Describe the bug
When user_proxy uses the NEVER human input mode, in the second round of dialogue the user_proxy sends an empty message to the assistant:
Steps to reproduce
The code is from agentchat_teaching.ipynb:
Model Used
GLM-4
Expected Behavior
I think that when the message is empty, the agent should stop, or return some prompt to the caller.
Screenshots and logs
The result is:
Additional Information
The environment is Docker.