feat: add developer message for o1 models #4923
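For context: OpenAI's o1-series models take their instructions under the `developer` role rather than the `system` role used by earlier chat models, which is what motivates the new message type in this PR. A minimal sketch of the payload difference, using a hypothetical helper that is not part of this PR:

```python
# Hypothetical helper (not part of this PR): choose the instructions role
# expected by the target model family when building a raw chat payload.
def instructions_message(model: str, text: str) -> dict:
    # o1-series models expect instructions under the "developer" role;
    # earlier chat models use the "system" role.
    role = "developer" if model.startswith("o1") else "system"
    return {"role": role, "content": text}
```

For example, `instructions_message("o1-mini", ...)` yields a `developer`-role dict, while `instructions_message("gpt-4o", ...)` yields a `system`-role dict.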

Open · wants to merge 9 commits into base: main
python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py

@@ -24,6 +24,7 @@
     FunctionExecutionResult,
     FunctionExecutionResultMessage,
     SystemMessage,
+    DeveloperMessage,
     UserMessage,
 )
 from autogen_core.tools import FunctionTool, Tool
@@ -238,15 +239,20 @@
         system_message: (
             str | None
         ) = "You are a helpful AI assistant. Solve tasks using your tools. Reply with TERMINATE when the task has been completed.",
+        developer_message: (str | None) = None,
Review comment (Member):
At this level (an agent), I think we can combine system and developer messages. Maybe it needs a new name, but ideally, by the time we've made it this high up the stack, we've either decided that either model can be used, or we've created a separate variant of this agent for the o1 models -- whatever testing shows is appropriate.

@jackgerrits (Member) commented on Jan 9, 2025:
Maybe at this level of abstraction it is an "instruction" message, or simply agent instructions

         reflect_on_tool_use: bool = False,
         tool_call_summary_format: str = "{result}",
     ):
         super().__init__(name=name, description=description)
         self._model_client = model_client
-        if system_message is None:
+        if system_message is None or developer_message is not None:
@afourney (Member) commented on Jan 9, 2025:
Rather than checking which system message type we've received, we should be checking which model family we were given. We can then load defaults accordingly, and we can turn the system message into a developer message as needed (again, I might recommend a name like sys_or_dev_message -- but that would be a breaking change I guess)

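A sketch of the review suggestion above: branch on the model family reported by the client, rather than on which constructor argument was supplied, and build the matching message type. The `model_family` string and the dataclasses below are illustrative stand-ins for autogen's own types, not its actual API:

```python
from dataclasses import dataclass


# Stand-ins for autogen_core's SystemMessage / DeveloperMessage, for illustration.
@dataclass
class SystemMessage:
    content: str


@dataclass
class DeveloperMessage:
    content: str


def make_instruction_message(model_family: str, content: str):
    # Pick the message type from the model family, as the review suggests,
    # so callers only ever supply one "instructions" string.
    if model_family.startswith("o1"):
        return DeveloperMessage(content=content)
    return SystemMessage(content=content)
```

This keeps the defaulting logic in one place and would let the constructor keep a single instructions parameter regardless of the target model.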
             self._system_messages = []
         else:
             self._system_messages = [SystemMessage(content=system_message)]
+        if developer_message is None:
+            self._developer_messages = []
+        else:
+            self._developer_messages = [DeveloperMessage(content=developer_message)]

[Codecov: added line L255 was not covered by tests]

         self._tools: List[Tool] = []
         if tools is not None:
             if model_client.model_info["function_calling"] is False:
@@ -326,7 +332,10 @@
         inner_messages: List[AgentEvent | ChatMessage] = []

         # Generate an inference result based on the current model context.
-        llm_messages = self._system_messages + await self._model_context.get_messages()
+        if len(self._developer_messages) > 0:
+            llm_messages = self._developer_messages + await self._model_context.get_messages()
+        else:
+            llm_messages = self._system_messages + await self._model_context.get_messages()
         result = await self._model_client.create(
             llm_messages, tools=self._tools + self._handoff_tools, cancellation_token=cancellation_token
         )

[Codecov: added line L336 was not covered by tests]
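The precedence rule in the hunk above (developer messages, when present, replace the system messages at the front of the LLM message list) can be isolated as a small pure function. Names here are illustrative, not autogen API:

```python
def build_llm_messages(system_msgs: list, developer_msgs: list, context_msgs: list) -> list:
    # Mirrors the diff above: developer messages, when present, take the
    # place of the system messages ahead of the model-context history.
    prefix = developer_msgs if len(developer_msgs) > 0 else system_msgs
    return prefix + context_msgs
```

Factoring it out this way would also avoid repeating the same if/else in both call sites of this diff.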
@@ -379,7 +388,10 @@

         if self._reflect_on_tool_use:
             # Generate another inference result based on the tool call and result.
-            llm_messages = self._system_messages + await self._model_context.get_messages()
+            if len(self._developer_messages) > 0:
+                llm_messages = self._developer_messages + await self._model_context.get_messages()
+            else:
+                llm_messages = self._system_messages + await self._model_context.get_messages()
             result = await self._model_client.create(llm_messages, cancellation_token=cancellation_token)
             assert isinstance(result.content, str)
             # Add the response to the model context.

[Codecov: added line L392 was not covered by tests]
(next file)

@@ -9,6 +9,7 @@
     LLMMessage,
     RequestUsage,
     SystemMessage,
+    DeveloperMessage,
     TopLogprob,
     UserMessage,
 )

@@ -17,6 +18,7 @@
     "ModelCapabilities",
     "ChatCompletionClient",
     "SystemMessage",
+    "DeveloperMessage",
     "UserMessage",
     "AssistantMessage",
     "FunctionExecutionResult",
(next file)

@@ -12,6 +12,11 @@ class SystemMessage(BaseModel):
     type: Literal["SystemMessage"] = "SystemMessage"


+class DeveloperMessage(BaseModel):
+    content: str
+    type: Literal["DeveloperMessage"] = "DeveloperMessage"
+
+
 class UserMessage(BaseModel):
     content: Union[str, List[Union[str, Image]]]

@@ -42,7 +47,8 @@ class FunctionExecutionResultMessage(BaseModel):


 LLMMessage = Annotated[
-    Union[SystemMessage, UserMessage, AssistantMessage, FunctionExecutionResultMessage], Field(discriminator="type")
+    Union[SystemMessage, DeveloperMessage, UserMessage, AssistantMessage, FunctionExecutionResultMessage],
+    Field(discriminator="type"),
 ]
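The `type: Literal["DeveloperMessage"]` field added above is what lets pydantic's `Field(discriminator="type")` route a serialized dict to the right class when deserializing an `LLMMessage`. The same dispatch idea, sketched with a plain stdlib registry instead of pydantic (hypothetical, for illustration only):

```python
from dataclasses import dataclass


@dataclass
class SystemMessage:
    content: str


@dataclass
class DeveloperMessage:
    content: str


# Registry keyed by the "type" discriminator, mirroring the Literal tags above.
MESSAGE_TYPES = {"SystemMessage": SystemMessage, "DeveloperMessage": DeveloperMessage}


def parse_message(raw: dict):
    # Look up the concrete class from the discriminator and construct it.
    cls = MESSAGE_TYPES[raw["type"]]
    return cls(content=raw["content"])
```

Because every new message variant must carry a unique `type` tag, adding `DeveloperMessage` to the `Union` is all that is needed for round-trip serialization to keep working.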
python/packages/autogen-ext/src/autogen_ext/models/openai/_openai_client.py

@@ -43,6 +43,7 @@
     ModelInfo,
     RequestUsage,
     SystemMessage,
+    DeveloperMessage,
     TopLogprob,
     UserMessage,
 )

@@ -58,6 +59,7 @@
     ChatCompletionMessageToolCallParam,
     ChatCompletionRole,
     ChatCompletionSystemMessageParam,
+    ChatCompletionDeveloperMessageParam,
     ChatCompletionToolMessageParam,
     ChatCompletionToolParam,
     ChatCompletionUserMessageParam,

@@ -172,6 +174,13 @@
 )


+def developer_message_to_oai(message: DeveloperMessage) -> ChatCompletionDeveloperMessageParam:
+    return ChatCompletionDeveloperMessageParam(
+        content=message.content,
+        role="developer",
+    )
+
+
 def func_call_to_oai(message: FunctionCall) -> ChatCompletionMessageToolCallParam:
     return ChatCompletionMessageToolCallParam(
         id=message.id,

[Codecov: added line L178 was not covered by tests]
@@ -212,6 +221,8 @@
 def to_oai_type(message: LLMMessage) -> Sequence[ChatCompletionMessageParam]:
     if isinstance(message, SystemMessage):
         return [system_message_to_oai(message)]
+    elif isinstance(message, DeveloperMessage):
+        return [developer_message_to_oai(message)]
     elif isinstance(message, UserMessage):
         return [user_message_to_oai(message)]
     elif isinstance(message, AssistantMessage):

[Codecov: added line L225 was not covered by tests]