
[Bug]: openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}} #17395

Open
lori-kuo opened this issue Dec 31, 2024 · 11 comments
Labels: bug (Something isn't working), triage (Issue needs to be triaged/prioritized)

Comments


lori-kuo commented Dec 31, 2024

Bug Description

from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="o1-preview")
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)
response = chat_engine.chat(prompt_temp)  # prompt_temp is a string prompt

Error:
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

Version

0.12.8

Steps to Reproduce

from llama_index.core import Settings
from llama_index.core.node_parser import SentenceSplitter
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="o1-preview")
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-small")
Settings.node_parser = SentenceSplitter(chunk_size=512, chunk_overlap=20)
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)
response = chat_engine.chat(prompt_temp)  # prompt_temp is a string prompt
print(response)
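For context on where the rejected role comes from: recent llama-index-llms-openai releases appear to map the system prompt onto OpenAI's "developer" role for o1-family reasoning models, and o1-preview rejects that role. A minimal, hypothetical pre-flight rewrite of the wire-format messages illustrates the workaround (`downgrade_roles` is an illustrative helper, not a llama-index API):

```python
# Hypothetical sketch: downgrade roles that o1-preview rejects before the
# request reaches the API. Messages are plain dicts in OpenAI chat format.
def downgrade_roles(messages, unsupported=("developer", "system")):
    """Rewrite unsupported roles into 'user' messages, preserving content."""
    fixed = []
    for msg in messages:
        if msg.get("role") in unsupported:
            fixed.append({**msg, "role": "user"})
        else:
            fixed.append(msg)
    return fixed

messages = [
    {"role": "developer", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
]
print(downgrade_roles(messages))
```

Folding instructions into a user message loses the priority the system/developer role normally carries, but it is the only option when the model accepts no instruction role at all.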

Relevant Logs/Tracebacks

No response

@lori-kuo lori-kuo added bug Something isn't working triage Issue needs to be triaged/prioritized labels Dec 31, 2024

dosubot bot commented Dec 31, 2024

Hey @lori-kuo! I'm here to help you with any bugs, questions, or contributions. Let's tackle this together while we wait for a human maintainer.

The error you're encountering is due to an unsupported value for messages[0].role. When using the 'o1-preview' model in the LlamaIndex framework, the valid values for messages[0].role are MessageRole.SYSTEM, MessageRole.USER, and MessageRole.ASSISTANT [1]. The role 'developer' is not supported, which is causing the BadRequestError.

To resolve this issue, ensure that the role specified in your messages aligns with one of the supported values. Adjust your code to use one of the valid roles for messages[0].role.
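The check described above can be sketched as a small guard over the wire-format messages before they are sent; failing early gives the offending index instead of a 400 from the API (`SUPPORTED_ROLES` and `check_roles` are illustrative names, not llama-index APIs):

```python
# Roles the error message implies this model accepts; "developer" is absent.
SUPPORTED_ROLES = {"system", "user", "assistant"}

def check_roles(messages):
    """Raise early, naming the offending index, instead of a 400 from the API."""
    for i, msg in enumerate(messages):
        if msg["role"] not in SUPPORTED_ROLES:
            raise ValueError(
                f"messages[{i}].role '{msg['role']}' is not supported"
            )
    return messages

check_roles([{"role": "user", "content": "hi"}])  # passes silently
```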


lori-kuo (Author) commented:

@dosu so how should the code be written?


dosubot bot commented Dec 31, 2024

To ensure the correct role is used for messages[0].role in the LlamaIndex framework with the 'o1-preview' model, you should avoid using the 'developer' role, as it is not supported. Instead, use one of the valid roles such as MessageRole.SYSTEM, MessageRole.USER, or MessageRole.ASSISTANT. Here's how you can adjust your code:

from llama_index.llms.openai import OpenAI

# Set the model to 'o1-preview'
Settings.llm = OpenAI(model="o1-preview")

# Create the chat engine
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

# Ensure the prompt uses a supported role
response = chat_engine.chat(prompt_temp)  # prompt_temp should be structured with valid roles
print(response)

Make sure that the messages you are sending to the chat engine are structured with one of the supported roles. If you have a system message, ensure it is correctly identified and processed, or defaults to a predefined system prompt if no system message is present.


lori-kuo (Author) commented:

@dosu give me the full code


dosubot bot commented Dec 31, 2024

Here's a complete example of how you can structure your code to use the o1-preview model with the LlamaIndex framework, ensuring that the roles in your messages are valid:

from llama_index.core import Settings
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

# Initialize the OpenAI model
Settings.llm = OpenAI(model="o1-preview")

# Define your prompt with valid roles
prompt_temp = [
    ChatMessage(content="This is a system message.", role=MessageRole.SYSTEM),
    ChatMessage(content="Hello, how can I assist you today?", role=MessageRole.USER),
    # Add more messages as needed
]

# Create the chat engine
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

# Get a response from the chat engine
response = chat_engine.chat(prompt_temp)
print(response)

Ensure that your prompt_temp variable is structured with ChatMessage objects using valid roles like MessageRole.SYSTEM, MessageRole.USER, or MessageRole.ASSISTANT [1].


lori-kuo (Author) commented:

@dosu
but your code didn't work, the return as follows

pydantic_core._pydantic_core.ValidationError: 1 validation error for Task
input
  Input should be a valid string [type=string_type, input_value=[ChatMessage(role=<Messag..."","evidence": ""} ')])], input_type=list]
    For further information visit https://errors.pydantic.dev/2.10/v/string_type


dosubot bot commented Dec 31, 2024

The error you're encountering suggests that the chat method is expecting a string input rather than a list of ChatMessage objects. Here's how you can adjust your code to resolve this issue:

from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Initialize the OpenAI model
Settings.llm = OpenAI(model="o1-preview")

# Create the chat engine
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

# Define your prompt as a string
prompt_temp = "Hello, how can I assist you today?"

# Get a response from the chat engine
response = chat_engine.chat(prompt_temp)
print(response)

Ensure that prompt_temp is a string, as the chat method in this context expects a string input for the message parameter [1].
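Since chat() takes a plain string here, any system-style instructions have to travel inside that string when the model exposes no system role. A hedged sketch of that folding step (`build_prompt` is an illustrative helper, not part of llama-index):

```python
def build_prompt(instructions, question):
    """Fold system-style instructions into a single user string,
    for models like o1-preview that reject system/developer roles."""
    if instructions:
        return f"{instructions}\n\n{question}"
    return question

prompt_temp = build_prompt(
    "Answer concisely and cite evidence.",
    "What does the report say about Q3 revenue?",
)
```

The resulting string can then be passed directly to `chat_engine.chat(prompt_temp)`.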


lori-kuo (Author) commented:

@dosu
So the problem goes back to the beginning

openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}} 


dosubot bot commented Dec 31, 2024

The error you're encountering is due to the use of an unsupported role, 'developer', in your message configuration. When using the 'o1-preview' model in the LlamaIndex framework, the valid roles for messages are SYSTEM, USER, ASSISTANT, FUNCTION, TOOL, CHATBOT, and MODEL [1]. Ensure that your message roles are set to one of these valid options to avoid the unsupported_value error. Adjust your code to replace 'developer' with one of these supported roles.


logan-markewich (Collaborator) commented:

@lori-kuo o1-preview does not support system prompts. I thiiiiink it should be doing its best to avoid any built-in system prompts

What version of llama-index-llms-openai are you on?
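To answer the version question without guessing, the installed distribution can be read from package metadata (`installed_version` is just a convenience wrapper around the standard library):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(pkg):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

print(installed_version("llama-index-llms-openai"))
```

`pip show llama-index-llms-openai` from a shell reports the same information.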

lori-kuo (Author) commented:

@logan-markewich
llama-index-llms-openai 0.3.12
