How to use Custom Prompts on Chat Engine? #17459
-
Hey @andreabrgit! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together!

Yes, it is possible to use custom templates with the Chat Engine in LlamaIndex when constructing a RAG system. LlamaIndex also provides default prompt templates for various tasks, including SQL queries with PGVector, which can be customized or used as a reference for creating your own templates. This flexibility allows you to define custom behavior for how the chat engine interacts with the user and processes information, which can be particularly useful when integrating with Django and Postgres PGVector in a RAG system [2].

For your specific case, ensure that the custom prompt template is correctly integrated into the chat engine's initialization and that the logic for handling responses aligns with your requirements. If the Chat Engine is still providing answers when it shouldn't, you may need to adjust the logic or conditions under which the responses are generated.
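As a concrete illustration, here is a minimal sketch of how a custom `text_qa_template` can be passed to a chat engine. The names `index` and `qa_prompt_str` come from the question below; `condense_question` is a real LlamaIndex chat mode, but exactly how keyword arguments are forwarded can vary by version, so treat this as a starting point rather than the definitive fix:

```python
from llama_index.core import ChatPromptTemplate

# qa_prompt_str and index are defined in the question below.
text_qa_template = ChatPromptTemplate.from_messages([("user", qa_prompt_str)])

# In "condense_question" mode the chat engine rewrites the conversation into a
# standalone question and runs it through a regular query engine, so extra
# keyword arguments such as text_qa_template are forwarded to that query
# engine and the guardrail in the prompt applies.
chat_engine = index.as_chat_engine(
    chat_mode="condense_question",
    text_qa_template=text_qa_template,
)

# Agent-based modes ("best", "react", "openai") let the LLM answer from its
# own knowledge without calling the retrieval tool at all, which would explain
# a guardrail prompt appearing to be ignored.

# To discover which prompt slots a given engine exposes, list its prompt keys
# (assuming your version exposes get_prompts() on chat engines):
print(chat_engine.get_prompts().keys())
```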
-
Hi,
I'm using LlamaIndex with Django and Postgres PGVector to construct a RAG system. I'd like to customize the chat prompts, especially because my native language is not English, and I'd like to add some extra guardrails and information too.
I created this prompt, for example:
```python
qa_prompt_str = (
    "Below is the information where you should look for an answer\n"
    "If the answer to the question is not explicitly in the information provided, "
    "always respond this way: 'We were unable to find an answer to the question "
    "you asked. It may be that the information is not in the documents or it may "
    "be necessary to supplement the prompt of the question'\n"
    "{context_str}\n"
    "Question: {query_str}\n"
)
```
When I use the PromptTemplate together with the Query Engine like this:

```python
from llama_index.core import ChatPromptTemplate

chat_text_qa_msgs = [
    ("user", qa_prompt_str),
]
text_qa_template = ChatPromptTemplate.from_messages(chat_text_qa_msgs)
query = index.as_query_engine(text_qa_template=text_qa_template)
```
and ask a question that is not in the context, the response is correct according to what the prompt asks for: 'We were unable to find an answer to the question you asked. It may be that the information is not in the documents or it may be necessary to supplement the prompt of the question'.
But when I use the PromptTemplate together with the Chat Engine (`chat = index.as_chat_engine(text_qa_template=text_qa_template)`), even if I ask a question that is not in the context, the LLM answers anyway.
Using LlamaDebugHandler inside a CallbackManager, I can see in the chat's event stack (sequential events) that at one point the answer is correct, as expected, but the stack continues for some more steps until it finds an answer.
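For reference, a minimal sketch of that debug setup (the Settings-based wiring assumes a recent llama_index version; the sample question is made up):

```python
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager, LlamaDebugHandler

llama_debug = LlamaDebugHandler(print_trace_on_end=True)
Settings.callback_manager = CallbackManager([llama_debug])

chat = index.as_chat_engine(text_qa_template=text_qa_template)
chat.chat("a question that is not covered by the documents")

# Walk the recorded events in order to see where the guardrail answer appears
# and which later step replaces it.
for pair in llama_debug.get_event_pairs():
    print(pair[0].event_type, pair[0].payload)
```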
I'd like to know if it is possible to use custom templates with the Chat Engine.
Sorry for the long text.