
feat: add developer message for o1 models #4923

Open · wants to merge 9 commits into base: main

Changes from 1 commit

File: python/packages/autogen-ext/src/autogen_ext/models/openai/_openai_client.py
@@ -207,6 +207,25 @@
name=message.source,
)

def prepare_o1_messages(messages: Sequence[LLMMessage]) -> Sequence[LLMMessage]:
    system_content = ""
    user_messages = []

    # Separate system and user messages
    for msg in messages:
        if isinstance(msg, SystemMessage):
            system_content += msg.content + "\n"
        elif isinstance(msg, UserMessage):
            user_messages.append(msg)


    # Ensure there's at least one user message to attach the system content
    if not user_messages:
        raise ValueError("No UserMessage found to append SystemMessage content.")


    # Prepend the collected system content to the first user message.
    # Note: this mutates the caller's message object in place and assumes
    # str content (UserMessage content may also be a list of parts).
    user_messages[0].content = f"{system_content.strip()}\n\n{user_messages[0].content.strip()}"


    return user_messages
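For context, here is a minimal runnable sketch of how `prepare_o1_messages` behaves, using hypothetical stand-in dataclasses in place of the real `SystemMessage`/`UserMessage` types (which live in `autogen_core` and carry more fields):

```python
from dataclasses import dataclass


# Hypothetical stand-ins for the autogen message types, reduced to the
# one field this helper touches.
@dataclass
class SystemMessage:
    content: str


@dataclass
class UserMessage:
    content: str


def prepare_o1_messages(messages):
    # Accumulate all system content, keep user messages in order.
    system_content = ""
    user_messages = []
    for msg in messages:
        if isinstance(msg, SystemMessage):
            system_content += msg.content + "\n"
        elif isinstance(msg, UserMessage):
            user_messages.append(msg)
    if not user_messages:
        raise ValueError("No UserMessage found to append SystemMessage content.")
    # Merge the collected system text into the first user message.
    user_messages[0].content = f"{system_content.strip()}\n\n{user_messages[0].content.strip()}"
    return user_messages


msgs = [SystemMessage("You are terse."), UserMessage("Hi"), UserMessage("Bye")]
out = prepare_o1_messages(msgs)
print(out[0].content)  # "You are terse.\n\nHi"
print(len(out))        # 2 -- system messages are dropped from the list
```

The net effect is that system instructions survive as a preamble to the first user turn, which matters because o1-family models reject the `system` role.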


def to_oai_type(message: LLMMessage) -> Sequence[ChatCompletionMessageParam]:
    if isinstance(message, SystemMessage):
@@ -426,6 +445,8 @@
        if self.capabilities["json_output"] is False and json_output is True:
            raise ValueError("Model does not support JSON output")

        if self.model_info["family"] == "o1":
            messages = prepare_o1_messages(messages)

        oai_messages_nested = [to_oai_type(m) for m in messages]
        oai_messages = [item for sublist in oai_messages_nested for item in sublist]
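The nested/flatten pattern above can be sketched in isolation: `to_oai_type` returns one sequence of OpenAI-format params per message, and the second comprehension concatenates those sequences while preserving order. A toy stand-in (the real `to_oai_type` handles several message types):

```python
# Hypothetical simplified converter: each input message expands to a
# list of OpenAI chat-format dicts.
def to_oai_type(message: str) -> list[dict]:
    return [{"role": "user", "content": message}]


messages = ["hello", "world"]
nested = [to_oai_type(m) for m in messages]          # list of lists
flat = [item for sublist in nested for item in sublist]  # flattened, order kept
print(flat)
# [{'role': 'user', 'content': 'hello'}, {'role': 'user', 'content': 'world'}]
```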
