Allow specifying the OpenAI API URL to enable compatible services #134
Replies: 4 comments
-
This is something I will also implement soon 🙂 Thank you for the kind words.
-
One step further: a lot of us homelabbers use a local OpenAI-compatible API to run open-source models rather than Ollama. Personally I use vLLM, since it's easier to keep a single model loaded in VRAM rather than loading/unloading the way Ollama tends to do. It would be great to be able to specify any address for OpenAI, as well as the model name, rather than the dropdown with three options that's currently implemented. I think it would also be easier to maintain, since Ollama itself can serve an OpenAI API, so a single configuration option would cover both.
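To illustrate what this request amounts to, here is a minimal sketch using the official `openai` Python client pointed at a local server. The port and model name are assumptions about a typical setup: vLLM serves its OpenAI-compatible API on port 8000 by default, and Ollama exposes one at `/v1` on port 11434.

```python
from openai import OpenAI

# Point the standard client at a local OpenAI-compatible server.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # e.g. "http://localhost:11434/v1" for Ollama
    api_key="not-needed",  # local servers usually ignore the key, but the client requires one
)

response = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # illustrative; use whatever model your server loaded
    messages=[{"role": "user", "content": "Hello from my homelab!"}],
)
print(response.choices[0].message.content)
```

Because both vLLM and Ollama speak this same protocol, an app that lets the user set `base_url` and `model` needs no backend-specific code.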
-
Yes, this would be extremely helpful, also for using proxy services such as OpenRouter.ai, which offers an OpenAI-compatible API.
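The same configurable-URL mechanism would cover this case. A sketch, assuming OpenRouter's documented OpenAI-compatible base URL and its `provider/model` identifier style:

```python
from openai import OpenAI

# Same client, different base URL: a hosted proxy instead of a local server.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="sk-or-...",  # placeholder for your OpenRouter key
)

response = client.chat.completions.create(
    model="openai/gpt-4o",  # illustrative OpenRouter "provider/model" slug
    messages=[{"role": "user", "content": "ping"}],
)
```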
-
Done, my friends :)
-
I would like to use the Microsoft Azure OpenAI API instead of the OpenAI API.
I believe only the URL needs to change; everything else should be the same.
It would be sufficient to specify an API URL when selecting AI Provider: OpenAI (ChatGPT).
<3 the project
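One caveat for implementers: with Azure, slightly more than the URL changes. Azure's service also requires an `api-version` and addresses models by deployment name rather than model id. A minimal sketch using the `AzureOpenAI` class from the official `openai` Python package; the endpoint, version, and deployment name below are placeholders:

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # hypothetical Azure resource
    api_version="2024-02-01",  # whichever version your resource supports
    api_key="...",
)

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",  # Azure deployment name, not the raw model id
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```

So a single "API URL" field gets most of the way there, but Azure support may also need an API-version setting.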