Welcome to the documentation for the LLM Interface package. This documentation provides comprehensive guides on how to set up, configure, and use the LLM Interface with various Language Model providers.
- Introduction
- Installation
- API Keys
- Usage
  - LLMInterface
    - getAllModelNames()
    - getEmbeddingsModelAlias(interfaceName, alias)
    - getInterfaceConfigValue(interfaceName, key)
    - getModelAlias(interfaceName, alias)
    - setApiKey(interfaceNames, apiKey)
    - setEmbeddingsModelAlias(interfaceName, alias, name)
    - setModelAlias(interfaceName, alias, name)
    - configureCache(cacheConfig = {})
    - flushCache()
    - sendMessage(interfaceName, message, options = {}, interfaceOptions = {})
    - streamMessage(interfaceName, message, options = {})
    - embeddings(interfaceName, embeddingString, options = {}, interfaceOptions = {})
    - chat.completions.create(interfaceName, message, options = {}, interfaceOptions = {})
  - LLMInterfaceSendMessage
  - LLMInterfaceStreamMessage
  - Message Object
  - Options Object
  - Interface Options Object
  - Caching
- Supported Providers
- Model Aliases
- Jailbreaking
- Glossary
- Examples
The LLM Interface npm module provides a unified interface for interacting with various large language models (LLMs). This documentation covers setup, configuration, usage, and examples to help you integrate LLMs into your projects efficiently.
To interact with different LLM providers, you will need API keys. Refer to API Keys for detailed instructions on obtaining and configuring API keys for supported providers.
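Conceptually, key configuration is just a map from interface name to API key. The sketch below illustrates the idea behind the `setApiKey(interfaceNames, apiKey)` method listed above; it is a standalone illustration, not the package's implementation, and the assumption that `interfaceNames` may be either a single name or an object of name/key pairs is inferred from the plural parameter name:

```javascript
// Minimal sketch of per-provider API key registration, mirroring the
// setApiKey(interfaceNames, apiKey) signature listed above. This is an
// illustration of the idea, not the package's implementation.
const apiKeys = {};

function setApiKey(interfaceNames, apiKey) {
  if (typeof interfaceNames === 'string') {
    // Single provider: setApiKey('openai', key)
    apiKeys[interfaceNames] = apiKey;
  } else {
    // Several at once (assumed form): setApiKey({ openai: keyA, groq: keyB })
    Object.assign(apiKeys, interfaceNames);
  }
}

setApiKey('openai', process.env.OPENAI_API_KEY || 'sk-demo');
setApiKey({ groq: 'gsk-demo', cohere: 'co-demo' });
console.log(Object.keys(apiKeys).length); // → 3
```

In practice, keeping keys in environment variables (as above) avoids committing secrets to source control.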
The Usage section contains detailed documentation on how to use the LLM Interface npm module, including the classes, methods, and option objects listed in the contents above.
This is a legacy function and will be deprecated.
This is a legacy function and will be deprecated.
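As a hedged sketch of the Message Object and Options Object documented in the Usage section, both can be written as plain JavaScript objects; the model name and option values below are illustrative assumptions, not the package's defaults:

```javascript
// Sketch of an OpenAI-style message object and an options object as
// accepted by sendMessage(); the model name and values are illustrative.
const message = {
  model: 'gpt-3.5-turbo',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain recursion in one sentence.' },
  ],
};

const options = {
  max_tokens: 150, // cap the length of the generated response
};

// With these in hand, a call per the signature listed above looks like:
//   const response = await LLMInterface.sendMessage('openai', message, options);
console.log(message.messages.length); // → 2
```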
A complete list of supported providers is available here.
The LLMInterface supports multiple model aliases for different providers. See Models for a list of model aliases and their descriptions.
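As a sketch of the idea behind model aliases (the alias names and model identifiers here are hypothetical, not the package's actual defaults), an alias simply maps a short name to a provider-specific model identifier:

```javascript
// Illustrative alias map: short names resolved to provider model IDs.
// These aliases are hypothetical, not the package's actual defaults.
const aliases = {
  openai: { default: 'gpt-3.5-turbo', large: 'gpt-4', small: 'gpt-3.5-turbo' },
  groq: { default: 'llama3-8b-8192' },
};

// Resolve an alias for a provider, falling back to the name itself so
// that explicit model identifiers still pass through unchanged.
function resolveModelAlias(interfaceName, alias) {
  const providerAliases = aliases[interfaceName] || {};
  return providerAliases[alias] || alias;
}

console.log(resolveModelAlias('openai', 'large')); // → gpt-4
console.log(resolveModelAlias('openai', 'gpt-4o')); // → gpt-4o
```

This fall-through behavior is a common design choice: callers can use either a stable alias or a concrete model name without changing code paths.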
For more detailed information, please refer to the respective sections in the documentation.
If you'd like to attempt to jailbreak your AI model, you can try a version of the message object found here.
Thanks to Shuttle AI for the original concept!
A glossary of terms is available here.
Check out Examples for practical demonstrations of how to use the LLM Interface npm module in various scenarios.