NearAI LangChain Integration

nearai_langchain provides seamless integration between NearAI and LangChain, allowing developers to use NearAI's capabilities within their LangChain applications.

🚧 Development Status

This library is currently in active development.

🎯 Key Purposes

  1. Optional NearAI Inference Integration

    • Access model inference through NearAI's optimized infrastructure
    • Maintain compatibility with standard LangChain for other use cases
    • Seamlessly switch between NearAI and LangChain inference
  2. NearAI Registry Integration

    • Register and manage agents in the NearAI registry
    • Optionally, enable agent-to-agent interaction and make your agents callable by other agents in the NearAI registry
    • Auto-generate or validate agent metadata
    • Example metadata.json (the framework field chooses "langchain" or "nearai" inference; a minimal sketch of reading this file follows this list):
      {
        "category": "agent",
        "name": "example_agent",
        "namespace": "user.nearai",
        "tags": ["example"],
        "details": {
          "agent": {
            "defaults": {
              "model": "llama-v3p1-70b-instruct",
              "provider": "fireworks"
            },
            "framework": "langchain"
          }
        }
      }
  3. Agent Intercommunication

    • Upload agents so they can be used by other agents
    • Call other registry agents from within your own agents
    • Framework-agnostic: works with both NearAI and LangChain inference
  4. Benchmarking and Evaluation

    • Run popular or user-owned benchmarks on agents
    • Optionally, upload evaluation results to the NearAI evaluation table
    • Support for both NearAI and LangChain inference frameworks
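
As a concrete illustration of the metadata format shown in purpose 2, here is a minimal sketch of reading a metadata.json and picking out the configured framework and model defaults. It is illustrative only and not the library's own loading logic:

import json

# Assumes metadata.json sits next to the agent code (illustrative sketch,
# not part of the nearai_langchain API).
with open("metadata.json") as f:
    metadata = json.load(f)

agent = metadata["details"]["agent"]
framework = agent["framework"]  # "langchain" or "nearai"
defaults = agent["defaults"]    # e.g. {"model": "...", "provider": "..."}

print(framework, defaults["model"], defaults["provider"])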

🌟 Features

  • Drop-in replacement for LangChain chat models
  • Support for multiple model providers
  • Flexible framework switching between LangChain and NearAI
  • Type-safe interfaces

🚀 Quick Start

from langchain_core.messages import HumanMessage, SystemMessage

# The import path below is assumed; adjust it to the installed package layout.
from nearai_langchain import InferenceProvider, NearaiLangchain

# Reads the agent's metadata and initializes either LangChain or NearAI
# inference. Supports the "langchain" and "nearai" frameworks.
NearaiLangchain.init()

model = InferenceProvider()

messages = [
    SystemMessage("Translate the following from English into Italian"),
    HumanMessage("hi!"),
]

model.invoke(messages)
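
For comparison, the same messages can be sent to a stock LangChain chat model; only the model construction changes, which is what makes switching between NearAI and plain LangChain inference straightforward. The snippet below is a sketch assuming the langchain-openai package is installed and an OpenAI API key is available:

import getpass
import os

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

# Prompt for an API key if one is not already set in the environment.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")

model = ChatOpenAI(model="gpt-4o-mini")

messages = [
    SystemMessage("Translate the following from English into Italian"),
    HumanMessage("hi!"),
]

model.invoke(messages)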

📦 Installation

pip install nearai-langchain

🛠️ Development Setup

  1. Clone the repository:

    git clone https://github.com/nearai/nearai_langchain.git
    cd nearai_langchain
  2. Install dependencies:

    ./install.sh
  3. Development tools:

  • Run format check: ./scripts/format_check.sh
  • Run linting: ./scripts/lint_check.sh
  • Run type check: ./scripts/type_check.sh
