- The original idea was to build a platform where we could prompt multiple LLMs simultaneously.
- Especially Gemini and GPT, gathering as much information as possible to solve a problem.
- But since the GPT APIs are less generous 🥲 and not free to use even for developers, I built this instead.
- Enhances the chatbot's responses by using the previous conversation to understand context (see the sketch after this list).
- This improves the LLM's ability to solve your problem.
- Conversations persist across sessions.
- Users can revisit their old conversations even in offline mode (see the persistence sketch below).
- Enables low-latency, real-time chat conversations with large language models.
- Elevates the user experience with syntax highlighting and a typing effect.
- Automatic scrolling for generated content.
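
As a rough sketch of how the context and low-latency streaming pieces can fit together with the `@google/generative-ai` SDK (the model name, history contents, and import path here are illustrative assumptions, not necessarily what this repo uses):

```js
import { GoogleGenerativeAI } from "@google/generative-ai";
import apiKey from "./gen-ai/apiKey.js";

const genAI = new GoogleGenerativeAI(apiKey);
// Model name is an example; use whichever Gemini model the app targets.
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

// Earlier turns are replayed as history so the model understands the context.
const chat = model.startChat({
  history: [
    { role: "user", parts: [{ text: "How do I reverse a string in JS?" }] },
    { role: "model", parts: [{ text: "Use str.split('').reverse().join('')." }] },
  ],
});

// Streaming keeps latency low: chunks can be rendered as soon as they arrive.
const result = await chat.sendMessageStream("Turn that into a reusable function.");
for await (const chunk of result.stream) {
  process.stdout.write(chunk.text());
}
```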
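Persistence across sessions and offline access to old conversations can be sketched with `localStorage`; the storage key and message shape below are assumptions for illustration only:

```js
// Minimal persistence sketch: conversations survive reloads and can be read offline.
const STORAGE_KEY = "conversations"; // assumed key name

function loadConversations() {
  return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
}

function saveMessage(conversationId, role, text) {
  const conversations = loadConversations();
  let convo = conversations.find((c) => c.id === conversationId);
  if (!convo) {
    convo = { id: conversationId, messages: [] };
    conversations.push(convo);
  }
  convo.messages.push({ role, text, at: Date.now() });
  localStorage.setItem(STORAGE_KEY, JSON.stringify(conversations));
}
```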
`apiKey.js`

```js
const apiKey = "YOUR_GEMINI_API_KEY";

export default apiKey;
```
Add your Gemini API key inside the gen-ai folder and name the file apiKey.js. Remember to export the apiKey variable. Alternatively, you can set the key as an environment variable and read it from there.
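
If you go the environment-variable route instead, a minimal sketch (assuming a Vite-style build; the variable name is just an example) could look like:

```js
// gen-ai/apiKey.js — environment-variable variant (illustrative sketch).
// VITE_GEMINI_API_KEY is an example name; Vite only exposes variables
// prefixed with VITE_ to client code. A Node backend would read
// process.env.GEMINI_API_KEY instead.
const apiKey = import.meta.env.VITE_GEMINI_API_KEY;

export default apiKey;
```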
- Type-safe version of the code.
- Support for images and audio.
- Backend in Python for more features.
- Copy exact code blocks or markdown rather than the full response.