feat: add developer message for o1 models #4923
base: main
Changes from all commits: c0cb82e, e4de2b7, 2f218bb, 27046ed, 5dfa51d, 508496d, 1870250, 0a63c3b, e1885fa

python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py

@@ -24,6 +24,7 @@
     FunctionExecutionResult,
     FunctionExecutionResultMessage,
     SystemMessage,
+    DeveloperMessage,
     UserMessage,
 )
 from autogen_core.tools import FunctionTool, Tool

@@ -238,15 +239,20 @@
         system_message: (
             str | None
         ) = "You are a helpful AI assistant. Solve tasks using your tools. Reply with TERMINATE when the task has been completed.",
+        developer_message: (str | None) = None,
         reflect_on_tool_use: bool = False,
         tool_call_summary_format: str = "{result}",
     ):
         super().__init__(name=name, description=description)
         self._model_client = model_client
-        if system_message is None:
+        if system_message is None or developer_message is not None:
             self._system_messages = []
         else:
             self._system_messages = [SystemMessage(content=system_message)]
+        if developer_message is None:
+            self._developer_messages = []
+        else:
+            self._developer_messages = [DeveloperMessage(content=developer_message)]
         self._tools: List[Tool] = []
         if tools is not None:
             if model_client.model_info["function_calling"] is False:

Review comment (on the changed if-condition): Rather than checking which system message type we've received, we should be checking which model family we were given. We can then load defaults accordingly, and we can turn the system message into a developer message as needed (again, I might recommend a name like
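
A possible reading of that suggestion, sketched below as a standalone helper. The helper name, the "family" key on model_info, the "o1" prefix check, and the import path are assumptions for illustration, not part of the PR:

```python
# Sketch only: derive the instruction message type from the model family
# instead of from which keyword argument the caller happened to pass.
from autogen_core.models import DeveloperMessage, SystemMessage  # import path assumed


def build_instruction_messages(model_info: dict, system_message: str | None) -> list:
    """Return the instruction prefix appropriate for the given model."""
    if system_message is None:
        return []
    if str(model_info.get("family", "")).startswith("o1"):
        # o1-style models do not accept system messages, so reuse the same
        # text as a developer message rather than requiring a second parameter.
        return [DeveloperMessage(content=system_message)]
    return [SystemMessage(content=system_message)]
```

The constructor could then call such a helper once instead of maintaining parallel _system_messages and _developer_messages lists.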

@@ -326,7 +332,10 @@
         inner_messages: List[AgentEvent | ChatMessage] = []

         # Generate an inference result based on the current model context.
-        llm_messages = self._system_messages + await self._model_context.get_messages()
+        if len(self._developer_messages) > 0:
+            llm_messages = self._developer_messages + await self._model_context.get_messages()
+        else:
+            llm_messages = self._system_messages + await self._model_context.get_messages()
         result = await self._model_client.create(
             llm_messages, tools=self._tools + self._handoff_tools, cancellation_token=cancellation_token
         )
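
This hunk and the reflection-path hunk below add the same four-line branch. Since the constructor leaves at most one of the two lists non-empty, the branch simply picks whichever instruction list was configured; a possible refactor, shown here with a hypothetical helper name, keeps that decision in one place:

```python
# Hypothetical helper method, not part of the PR. "developer or system"
# selects whichever instruction list was populated in the constructor,
# matching the behaviour of the branch added in the diff above.
async def _prepare_llm_messages(self) -> list:
    instructions = self._developer_messages or self._system_messages
    return instructions + await self._model_context.get_messages()
```

Both call sites would then read llm_messages = await self._prepare_llm_messages().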

@@ -379,7 +388,10 @@

         if self._reflect_on_tool_use:
             # Generate another inference result based on the tool call and result.
-            llm_messages = self._system_messages + await self._model_context.get_messages()
+            if len(self._developer_messages) > 0:
+                llm_messages = self._developer_messages + await self._model_context.get_messages()
+            else:
+                llm_messages = self._system_messages + await self._model_context.get_messages()
             result = await self._model_client.create(llm_messages, cancellation_token=cancellation_token)
             assert isinstance(result.content, str)
             # Add the response to the model context.

Review comment: At this level (an agent), I think we can combine system and developer messages. Maybe it needs a new name or something, but ideally by the time we've made it this high up the stack, we've decided that either kind of model can be used OR we've created a separate variant of this agent for the o1 models -- whatever testing shows is appropriate.
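
Concretely, that combination could happen once in the constructor. The sketch below uses a hypothetical _instruction_messages attribute and assumes a "family" key on model_info; both inference paths would then prepend self._instruction_messages with no branching:

```python
# Hypothetical constructor fragment, not part of the PR: decide once which
# message type carries the agent's instructions, based on the model rather
# than on which keyword argument the caller supplied.
if system_message is None:
    self._instruction_messages = []
elif model_client.model_info.get("family", "").startswith("o1"):
    self._instruction_messages = [DeveloperMessage(content=system_message)]
else:
    self._instruction_messages = [SystemMessage(content=system_message)]
```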

Review comment: Maybe at this level of abstraction it is an "instruction" message, or simply agent instructions.
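
For reference, a usage example of the API as this PR currently defines it. The model client class, the model name, and the task are illustrative assumptions; only the developer_message keyword comes from the diff above. Under the reviewers' suggestion, the system_message/developer_message pair would collapse into a single instructions-style argument.

```python
# Illustrative usage of the PR's developer_message parameter with an o1 model.
# The client class and model name are assumptions, not taken from the PR.
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="o1")
    agent = AssistantAgent(
        name="assistant",
        model_client=model_client,
        # With a developer message set, the constructor leaves the system-message
        # list empty, so no SystemMessage is ever sent to the o1 model.
        system_message=None,
        developer_message="Solve the task step by step. Reply with TERMINATE when done.",
    )
    result = await agent.run(task="What is 2 + 2?")
    print(result.messages[-1].content)


asyncio.run(main())
```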