Would this help? https://microsoft.github.io/autogen/docs/Use-Cases/enhanced_inference#templating
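Roughly, the templating described there lets you keep the model-specific prompt wrapping in one place and fill it from a context dict. An untested sketch, assuming a local OpenAI-compatible endpoint; the model name, port, and context fields are placeholders:

```python
import autogen

# Placeholder config for a local OpenAI-compatible server such as LM Studio;
# depending on the autogen version the key may be "base_url" instead of "api_base".
config_list = [
    {
        "model": "stablebeluga-13b",
        "api_base": "http://localhost:1234/v1",
        "api_key": "not-needed",
    }
]

# The prompt is a template: {fields} are filled in from `context` at call time,
# so the "### System:/### User:/### Assistant:" wrapping lives in one place.
response = autogen.Completion.create(
    context=dict(
        role="You critically review plans and claims from other agents.",
        task="Review this research plan for missing verifiable sources.",
    ),
    prompt="### System:\n{role}\n\n### User:\n{task}\n\n### Assistant:\n",
    allow_format_str_template=True,
    stop=["### User:"],  # assumed stop string for this prompt style
    config_list=config_list,
)
print(autogen.Completion.extract_text(response)[0])
```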
I've now encountered public models that sound like they would fit agent styles/roles I've found interesting, but these models require a particular prompt format. I'll use StableBeluga as an example; at a glance it seems quite apropos to autogen:
https://huggingface.co/stabilityai/StableBeluga-13B
It uses a multi-section prompt format (the section headers also act as stop markers); per the model card it's roughly:
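```
### System:
{system prompt / role description}

### User:
{user message}

### Assistant:
```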
I guess this is similar to Google Vertex's "Context" component.
Assuming I have the model available via a local, OpenAI-compatible endpoint (e.g. through LM Studio), how might I integrate it, say, for the Critic role in the research group-chat example (https://github.com/microsoft/autogen/blob/main/notebook/agentchat_groupchat_research.ipynb)?
Note that I'm not specifically looking to use Beluga, and I realize I could just prefix the messages with '### System:\nThis is your role...', but I'm also considering something more nuanced than that, if that isn't a bit more than autogen is ready for yet :) A rough, untested version of the "just prefix" approach I have in mind is below; the model name, port, and config keys are guesses for my LM Studio setup.
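```python
import autogen

# Local OpenAI-compatible endpoint (e.g. LM Studio); all values are placeholders.
local_config_list = [
    {
        "model": "stablebeluga-13b",             # whatever name the local server exposes
        "api_base": "http://localhost:1234/v1",  # may be "base_url" on newer autogen versions
        "api_key": "not-needed",
    }
]

# Same Critic as in the groupchat research notebook, but pointed at the local model,
# with the role description wrapped in the model's expected "### System:" style.
critic = autogen.AssistantAgent(
    name="Critic",
    system_message=(
        "### System:\n"
        "Double check plan, claims, code from other agents and provide feedback. "
        "Check whether the plan includes adding verifiable info such as source URL."
    ),
    llm_config={"config_list": local_config_list},
)
```

The other agents in the group chat could keep their existing llm_config, so only the Critic would talk to the local model.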