Support for different Prompt Presets/Formats #432
TahaScripts started this conversation in Feature Requests
Expanded support for defining your own prompt formats or importing prompt presets (like in LM Studio). Or is that already supported and I'm just missing it? I know I can pre-format the prompt myself before sending it to the server, but enforcing a prompt format at the server endpoints would be awesome (e.g. an error response from the server: "Does not comply with ChatML format"). A rough sketch of what I mean follows the examples below.
Some examples of prompt formats:
Llama-2-Chat
[INST] <<SYS>> {system_message} <</SYS>> {prompt} [/INST]
Mixtral 8x7B Instruct
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
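For illustration only, here is a minimal sketch of what user-defined presets plus server-side enforcement could look like. None of this is an existing API; the PromptPreset class, the PRESETS registry, and the enforce_format helper are hypothetical names, and the compliance check is just a regex heuristic.

```python
# Hypothetical sketch of the requested feature: user-defined prompt presets
# plus a server-side compliance check. Not an existing API.
import re
from dataclasses import dataclass


@dataclass
class PromptPreset:
    name: str
    template: str            # format string with {system_message} and {prompt}
    compliance_pattern: str  # regex a raw prompt must match to pass enforcement

    def render(self, system_message: str, prompt: str) -> str:
        # Fill the template with the caller's system message and prompt.
        return self.template.format(system_message=system_message, prompt=prompt)

    def check(self, raw_prompt: str) -> bool:
        # Crude structural check: does the raw prompt contain the preset's markers?
        return re.search(self.compliance_pattern, raw_prompt, re.DOTALL) is not None


PRESETS = {
    "llama-2-chat": PromptPreset(
        name="Llama-2-Chat",
        template="[INST] <<SYS>> {system_message} <</SYS>> {prompt} [/INST]",
        compliance_pattern=r"\[INST\].*\[/INST\]",
    ),
    "chatml": PromptPreset(
        name="ChatML",
        template="<|im_start|>system\n{system_message}<|im_end|>\n"
                 "<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n",
        compliance_pattern=r"<\|im_start\|>.*<\|im_end\|>",
    ),
}


def enforce_format(raw_prompt: str, preset_key: str) -> None:
    """What a server endpoint could do before inference: reject prompts
    that do not match the configured preset."""
    preset = PRESETS[preset_key]
    if not preset.check(raw_prompt):
        raise ValueError(f"Does not comply with {preset.name} format")


if __name__ == "__main__":
    rendered = PRESETS["llama-2-chat"].render("You are helpful.", "Hello!")
    enforce_format(rendered, "llama-2-chat")   # passes silently
    try:
        enforce_format("Hello!", "chatml")     # rejected: no ChatML markers
    except ValueError as err:
        print(err)
```

The idea is that presets could be shipped as importable definitions (template plus a validation rule), and the server would only need to look up the active preset and run the check before inference.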