
feat: add developer message for o1 models #4923

Open · wants to merge 10 commits into base: main
@@ -209,6 +209,27 @@
)


def prepare_o1_messages(messages: Sequence[LLMMessage]) -> Sequence[LLMMessage]:
    system_content = ""
    user_messages = []

    # Separate system and user messages
    for msg in messages:
        if isinstance(msg, SystemMessage):
            system_content += msg.content + "\n"
        elif isinstance(msg, UserMessage):
            user_messages.append(msg)

    # Ensure there's at least one user message to attach the system content
    if not user_messages:
        raise ValueError("No UserMessage found to prepend SystemMessage content to.")

    # Prepend the collected system content to the first user message
    user_messages[0].content = f"{system_content.strip()}\n\n{user_messages[0].content.strip()}"

    return user_messages
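The merging behavior of `prepare_o1_messages` can be exercised in isolation. The sketch below uses hypothetical dataclass stand-ins for autogen-core's `SystemMessage` and `UserMessage` rather than the real types, but the function body mirrors the diff:

```python
from dataclasses import dataclass

# Illustrative stand-ins, not the real autogen-core message types.
@dataclass
class SystemMessage:
    content: str

@dataclass
class UserMessage:
    content: str

def prepare_o1_messages(messages):
    system_content = ""
    user_messages = []
    # Collect system text and keep user messages in order
    for msg in messages:
        if isinstance(msg, SystemMessage):
            system_content += msg.content + "\n"
        elif isinstance(msg, UserMessage):
            user_messages.append(msg)
    if not user_messages:
        raise ValueError("No UserMessage found to prepend SystemMessage content to.")
    # Fold the accumulated system text into the first user message
    user_messages[0].content = f"{system_content.strip()}\n\n{user_messages[0].content.strip()}"
    return user_messages

msgs = [SystemMessage("You are terse."), UserMessage("Hi"), UserMessage("Bye")]
out = prepare_o1_messages(msgs)
print(out[0].content)  # "You are terse.\n\nHi"
print(len(out))        # 2
```

Note that the function mutates the first user message in place and silently drops any message that is neither a `SystemMessage` nor a `UserMessage`, which is worth keeping in mind for multi-turn histories.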


def to_oai_type(message: LLMMessage) -> Sequence[ChatCompletionMessageParam]:
    if isinstance(message, SystemMessage):
        return [system_message_to_oai(message)]

@@ -427,6 +448,8 @@
        if self.capabilities["json_output"] is False and json_output is True:
            raise ValueError("Model does not support JSON output")

        if self.model_info["family"] == "o1":
            messages = prepare_o1_messages(messages)

        oai_messages_nested = [to_oai_type(m) for m in messages]
        oai_messages = [item for sublist in oai_messages_nested for item in sublist]
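The guard above is a simple family-based dispatch: only models reporting family `"o1"` have their messages rewritten before conversion to OpenAI chat format. A minimal sketch of that routing, where `route_messages` and the dict-shaped `model_info` are illustrative stand-ins rather than autogen-ext API:

```python
# Only "o1"-family models get the rewrite; all others pass through unchanged.
def route_messages(model_info: dict, messages: list, prepare) -> list:
    if model_info["family"] == "o1":
        return prepare(messages)
    return list(messages)

# A toy "prepare" step that merges everything into one message.
merge_all = lambda ms: [" ".join(ms)]

merged = route_messages({"family": "o1"}, ["sys", "hi"], merge_all)
passthrough = route_messages({"family": "gpt-4o"}, ["sys", "hi"], merge_all)
print(merged)       # ['sys hi']
print(passthrough)  # ['sys', 'hi']
```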
