
# LLM Interface Documentation

Welcome to the documentation for the LLM Interface package. These pages provide comprehensive guides on how to set up, configure, and use LLM Interface with various Language Model providers.

## Table of Contents

- [Introduction](#introduction)
- [Installation](#installation)
- [API Keys](#api-keys)
- [Usage](#usage)
- [Supported Providers](#supported-providers)
- [Model Aliases](#model-aliases)
- [Jailbreaking](#jailbreaking)
- [Glossary](#glossary)
- [Examples](#examples)

## Introduction

The LLM Interface npm module provides a unified interface for interacting with various large language models (LLMs). This documentation covers setup, configuration, usage, and examples to help you integrate LLMs into your projects efficiently.

## Installation
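The module is installed from npm. Assuming the package is published under the name `llm-interface` (verify the name on npm), installation is:

```bash
npm install llm-interface
```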

## API Keys

To interact with different LLM providers, you will need API keys. Refer to API Keys for detailed instructions on obtaining and configuring API keys for supported providers.
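As a minimal sketch, keys can typically be supplied at runtime from environment variables; the `setApiKey` helper and the provider key names shown here are assumptions, so confirm both against the API Keys guide.

```javascript
// Sketch only — setApiKey and the provider key names ('openai', 'groq') are
// assumptions; confirm them against the API Keys documentation.
const { LLMInterface } = require('llm-interface');

LLMInterface.setApiKey({
  openai: process.env.OPENAI_API_KEY,
  groq: process.env.GROQ_API_KEY,
});
```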

## Usage

The Usage section contains detailed documentation on how to use the LLM Interface npm module (a brief sketch follows the list below). This includes:

- LLMInterface
- LLMInterfaceSendMessage (legacy; this function will be deprecated)
- LLMInterfaceStreamMessage (legacy; this function will be deprecated)
- Message Object
- Options Object
- Interface Options Object
- Caching
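To show how these pieces fit together, here is a minimal sketch of sending a message. It is an illustration rather than a verbatim example from the docs: the provider name (`openai`), the option names, and the response handling are assumptions, so consult the LLMInterface, Message Object, and Options Object pages for the exact shapes.

```javascript
// Minimal sketch (assumptions noted in comments) — not a verbatim doc example.
const { LLMInterface } = require('llm-interface');

// Assumes the provider key name 'openai' and an API key in the environment.
LLMInterface.setApiKey({ openai: process.env.OPENAI_API_KEY });

async function main() {
  // Send a plain-string prompt with a small options object; option names such
  // as max_tokens are assumptions — see the Options Object documentation.
  const response = await LLMInterface.sendMessage(
    'openai',
    'Explain the importance of low-latency LLMs in one sentence.',
    { max_tokens: 100 }
  );

  // Log the full response; see the Message Object documentation for its shape.
  console.log(response);
}

main().catch(console.error);
```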

## Supported Providers

A complete list of supported providers is available here.

## Model Aliases

The LLMInterface supports multiple model aliases for different providers. See Models for a list of model aliases and their descriptions.
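As a hedged illustration only: the alias value used below and the way the model is supplied inside the message object are assumptions, so check the Models page for the aliases each provider actually defines and the exact request shape.

```javascript
// Hypothetical sketch — the alias value ('small') and passing `model` inside
// the message object are assumptions; see the Models page for real aliases.
const { LLMInterface } = require('llm-interface');

LLMInterface.setApiKey({ openai: process.env.OPENAI_API_KEY });

const message = {
  model: 'small', // alias that LLM Interface maps to a provider-specific model
  messages: [{ role: 'user', content: 'Summarize this paragraph in one line.' }],
};

LLMInterface.sendMessage('openai', message)
  .then((response) => console.log(response))
  .catch(console.error);
```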

For more detailed information, please refer to the respective sections in the documentation.

## Jailbreaking

If you'd like to attempt to jailbreak your AI model, you can try a version of the message object found here.

Thanks to Shuttle AI for the original concept!

## Glossary

A glossary of terms is available here.

## Examples

Check out Examples for practical demonstrations of how to use the LLM Interface npm module in various scenarios.