[Bug]: openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}} #17395
Comments
Hey @lori-kuo! I'm here to help you with any bugs, questions, or contributions. Let's tackle this together while we wait for a human maintainer.

The error you're encountering is due to an unsupported value for `messages[0].role`: the `o1-preview` model rejects the `developer` role. To resolve this, ensure that every role in your messages is one of the values this model supports, and adjust your code to use only those roles.
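For context, the same 400 can be reproduced (and avoided) outside LlamaIndex with the raw OpenAI SDK. The sketch below is illustrative only and assumes `OPENAI_API_KEY` is set in the environment; the point is that `o1-preview` rejects `developer` (and `system`) messages but accepts plain `user` messages.

```python
# Minimal sketch (assumes OPENAI_API_KEY is set): shows where the 400 comes from
# and that sending only 'user' messages avoids it with o1-preview.
import openai

client = openai.OpenAI()

try:
    # A 'developer' (or 'system') message triggers the 400 on o1-preview.
    client.chat.completions.create(
        model="o1-preview",
        messages=[
            {"role": "developer", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello"},
        ],
    )
except openai.BadRequestError as e:
    print("Rejected as expected:", e)

# Folding the instructions into the 'user' message is accepted.
resp = client.chat.completions.create(
    model="o1-preview",
    messages=[{"role": "user", "content": "You are a helpful assistant. Hello"}],
)
print(resp.choices[0].message.content)
```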
@dosu so how should the code be written?
To ensure the correct role is used for the `o1-preview` model, configure the LLM and chat engine like this:

from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Set the model to 'o1-preview'
Settings.llm = OpenAI(model="o1-preview")

# Create the chat engine from an existing index
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

# Ensure the prompt uses a supported role
response = chat_engine.chat(prompt_temp)  # prompt_temp should be structured with valid roles
print(response)

Make sure that the messages you are sending to the chat engine are structured with one of the supported roles. If you have a system message, ensure it is correctly identified and processed, or that the engine falls back to a predefined system prompt when no system message is present.
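One hedged way to handle the system-message caveat above is to fold any system content into the first user message before it reaches the model. The helper below is a sketch, not built-in LlamaIndex behavior, and `fold_system_into_user` is a hypothetical name.

```python
# Sketch only: make a message list safe for o1-preview, which rejects
# 'system'/'developer' roles, by merging system content into the first user message.
from llama_index.core.llms import ChatMessage, MessageRole


def fold_system_into_user(messages: list[ChatMessage]) -> list[ChatMessage]:
    """Merge any system messages into the first user message (illustrative helper)."""
    system_parts = [m.content for m in messages if m.role == MessageRole.SYSTEM and m.content]
    kept = [m for m in messages if m.role != MessageRole.SYSTEM]
    if system_parts and kept and kept[0].role == MessageRole.USER:
        kept[0] = ChatMessage(
            role=MessageRole.USER,
            content="\n".join(system_parts) + "\n\n" + (kept[0].content or ""),
        )
    return kept


messages = [
    ChatMessage(role=MessageRole.SYSTEM, content="Answer concisely."),
    ChatMessage(role=MessageRole.USER, content="What is LlamaIndex?"),
]
print(fold_system_into_user(messages))
```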
@dosu give me the full code
Here's a complete example of how you can structure your code to use the `o1-preview` model with valid message roles:

from llama_index.core import Settings
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

# Initialize the OpenAI model
Settings.llm = OpenAI(model="o1-preview")

# Define your prompt with valid roles
prompt_temp = [
    ChatMessage(content="This is a system message.", role=MessageRole.SYSTEM),
    ChatMessage(content="Hello, how can I assist you today?", role=MessageRole.USER),
    # Add more messages as needed
]

# Create the chat engine from an existing index
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

# Get a response from the chat engine
response = chat_engine.chat(prompt_temp)
print(response)

Ensure that your prompt_temp only contains messages with roles the model supports.
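As a point of comparison, the chat engine's `chat()` is normally called with a message string, with earlier turns supplied via `chat_history`, rather than with a list of `ChatMessage` objects as the prompt. The sketch below mirrors the setup from this thread under that assumption; the toy `VectorStoreIndex`, the sample document, and the history contents are illustrative, an OpenAI API key is assumed, and on affected versions the call may still hit the same 400 if a built-in system prompt is sent.

```python
# Sketch: chat() takes a message string; earlier turns go in chat_history.
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="o1-preview")

# Toy index so the example is self-contained; replace with your own index.
index = VectorStoreIndex.from_documents([Document(text="LlamaIndex connects LLMs to your data.")])

chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

history = [
    ChatMessage(role=MessageRole.USER, content="Hi, I have questions about the docs."),
    ChatMessage(role=MessageRole.ASSISTANT, content="Sure, ask away."),
]
response = chat_engine.chat("What does LlamaIndex do?", chat_history=history)
print(response)
```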
@dosu
The error you're encountering suggests that the chat engine's `chat` method expects a plain string message rather than a list of `ChatMessage` objects. You can adjust your code as follows:

from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Initialize the OpenAI model
Settings.llm = OpenAI(model="o1-preview")

# Create the chat engine from an existing index
chat_engine = index.as_chat_engine(chat_mode="openai", llm=Settings.llm, verbose=True)

# Define your prompt as a string
prompt_temp = "Hello, how can I assist you today?"

# Get a response from the chat engine
response = chat_engine.chat(prompt_temp)
print(response)

Ensure that prompt_temp is a plain string and that any system-level instructions use a role the model supports.
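A quick way to narrow this down is to call the LLM directly with only a `user` message, bypassing the chat engine and its built-in prompts. This is a diagnostic sketch assuming an OpenAI API key is configured; if it succeeds while the chat engine fails, the rejected `developer`/`system` message is almost certainly being injected by the engine rather than by your own prompt.

```python
# Diagnostic sketch: a direct chat call with only a 'user' message.
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="o1-preview")
resp = llm.chat([ChatMessage(role=MessageRole.USER, content="Say hello in one sentence.")])
print(resp.message.content)
```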
@dosu
The error you're encountering is due to the use of an unsupported role, 'developer', in your message configuration. When using the 'o1-preview' model through the LlamaIndex framework, the roles this model actually accepts are limited to 'user' and 'assistant'; 'system' and 'developer' messages are rejected, so any system-style instructions need to be folded into a user message.
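To see which roles the framework itself defines (as opposed to which ones `o1-preview` will accept), you can enumerate the `MessageRole` enum. This is a small inspection sketch; the claim that only `user` and `assistant` content gets through is based on the error in this issue and OpenAI's documented o1 restrictions, not on anything in this snippet.

```python
# Sketch: list the message roles LlamaIndex defines.
from llama_index.core.llms import MessageRole

for role in MessageRole:
    print(role.name, "->", role.value)
```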
@lori-kuo o1-preview does not support system prompts. I think it should be doing its best to avoid any built-in system prompts. What version of `llama-index-llms-openai` are you using?
@logan-markewich
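Since the fix for this class of error typically depends on the installed integration rather than on user code, a quick way to answer the version question is to print the installed package versions. The sketch below uses only the standard library; the assumption that the o1-style role handling lives in `llama-index-llms-openai` is an inference from this thread, not confirmed here.

```python
# Sketch: report installed versions of the relevant packages.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("llama-index", "llama-index-core", "llama-index-llms-openai", "openai"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```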
Bug Description
Error:
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
Version
0.12.8
Steps to Reproduce
Relevant Logs/Tracebacks
No response