This PR introduced new manifest files and a README to support a remote inference endpoint, but the helm chart deployment does not support this yet. opea-project/GenAIExamples#1149
We should support this feature, as it is common for users to choose a cloud-provided endpoint.
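As a starting point for the helm chart discussion, here is a hypothetical `values.yaml` fragment showing how a remote endpoint could be wired in. The key names (`remoteEndpoint`, `authSecretName`) are illustrative only, not the chart's actual schema:

```yaml
# Hypothetical values.yaml fragment -- key names are illustrative,
# not the chart's existing schema.
llm:
  # When set, skip deploying a local TGI/vLLM service and point the
  # megaservice at the remote endpoint instead.
  remoteEndpoint: "https://api.example.com/v1"
  # OAuth credentials for the remote endpoint, sourced from a
  # Kubernetes Secret rather than from plain values.
  authSecretName: "remote-llm-oauth"
```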
Another thing is to add environment variables for passing the token. In the above commit, they are CLIENTID, CLIENT_SECRET and TOKEN_URL for OAuth.
I'm not familiar with this and haven't figured out how these three environment variables are passed from the code to the specified llm endpoint.
We should also consider the following:
- A secure way to protect the SECRET.
- Other authentication methods?
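If those three variables follow the standard OAuth2 client-credentials flow, the token fetch would look roughly like the sketch below. This is an assumption about how the code uses CLIENTID, CLIENT_SECRET and TOKEN_URL, not a description of the actual implementation in the commit:

```python
import os
import urllib.parse
import urllib.request

def build_token_request() -> urllib.request.Request:
    """Build an OAuth2 client-credentials token request from the
    CLIENTID, CLIENT_SECRET and TOKEN_URL environment variables
    (variable names taken from the commit referenced above)."""
    payload = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": os.environ["CLIENTID"],
        "client_secret": os.environ["CLIENT_SECRET"],
    }).encode()
    return urllib.request.Request(
        os.environ["TOKEN_URL"],
        data=payload,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

# The bearer token from the response would then be attached to each
# request sent to the remote llm endpoint. In Kubernetes, CLIENTID and
# CLIENT_SECRET should come from a Secret, not from plain env values.
```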
Priority
P2-High
OS type
Ubuntu
Hardware type
Xeon-GNR
Running nodes
Single Node