Novita AI & LangChain Integration Guide
Integrate Novita AI with LangChain to build intelligent, language model-driven applications. Learn setup, API usage, and function calling workflows.
This guide will walk you through the process of integrating Novita AI with LangChain. You’ll be able to use Novita AI’s powerful language models with LangChain’s robust tools for building language model-driven applications.
What is LangChain?
LangChain is a framework for developing applications powered by language models. It enables applications that:
- Are context-aware: LangChain connects a language model to sources of context, such as prompt instructions, few-shot examples, or content to ground its response in.
- Reason: LangChain lets language models reason, whether that means determining how to answer based on the provided context or deciding what actions to take.
With LangChain, you can build complex workflows, enhance model behavior with external knowledge, and create intelligent systems that can interact dynamically with users and data sources.
Prerequisites
Before you start, make sure you have the following:
- Novita AI LLM API Key:
  - Visit Novita AI’s website and create an account.
  - After logging in, go to the Key Management page to generate your API Key. This key is required to connect Novita AI’s models to LangChain.
- A basic understanding of Node.js, JavaScript, and how to use environment variables.
Integration
Step 1: Set the API Key
For most environments, set the NOVITA_API_KEY environment variable as follows:
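For example, in a bash-compatible shell (adapt this to your own platform or process manager):

```bash
export NOVITA_API_KEY="your-api-key"
```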
Make sure to replace your-api-key with the actual API key you got from Novita AI.
Step 2: Install the Required Packages
To integrate Novita AI with LangChain, you need to install the @langchain/community package, which includes the Novita AI integration.
Choose one of the following commands to install the necessary packages, depending on whether you use npm, yarn, or pnpm:
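A typical install, assuming the current @langchain/community package and its @langchain/core peer dependency (your exact dependency set may differ):

```bash
# npm
npm install @langchain/community @langchain/core

# yarn
yarn add @langchain/community @langchain/core

# pnpm
pnpm add @langchain/community @langchain/core
```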
Step 3: Instantiate the Novita AI Model
Once you’ve installed the necessary packages, you can instantiate the Novita AI model using the ChatNovitaAI class.
Here’s an example that demonstrates how to do that:
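A minimal sketch, assuming the ChatNovitaAI class exported from @langchain/community/chat_models/novita; the model ID and constructor options below are placeholders, so check the integration reference and Novita AI’s model catalog for the exact names:

```ts
import { ChatNovitaAI } from "@langchain/community/chat_models/novita";

// Reads the API key from the NOVITA_API_KEY environment variable set earlier.
const model = new ChatNovitaAI({
  model: "meta-llama/llama-3.1-8b-instruct", // placeholder model ID
  temperature: 0.7,
});
```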
Step 4: Invoke the Model for Chat Completion
Once the model is instantiated, you can use it to generate chat completions by invoking it with a message.
Here’s an example of how to send a message and get a response:
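A sketch of a chat completion call; it reuses the model instance from the previous step and assumes an ES-module setup where top-level await is available:

```ts
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

// `model` is the ChatNovitaAI instance created in Step 3.
const response = await model.invoke([
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("What is the capital of France?"),
]);

console.log(response.content);
```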
Step 5: Chain Model with Prompt Templates
LangChain allows you to create powerful workflows by chaining models together with prompt templates. This can be especially useful when you want to reuse the same format for multiple inputs.
Here’s an example where we chain the Novita AI model with a custom prompt template for translating between languages:
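A sketch of such a translation chain, using ChatPromptTemplate from @langchain/core; the placeholder names input_language, output_language, and text are illustrative:

```ts
import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a translator. Translate the user's text from {input_language} to {output_language}.",
  ],
  ["human", "{text}"],
]);

// Piping the prompt into the model yields a reusable chain.
const chain = prompt.pipe(model);

const result = await chain.invoke({
  input_language: "English",
  output_language: "French",
  text: "I love programming.",
});

console.log(result.content);
```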
Step 6: Customize the Workflow
You can modify the temperature, add more messages, or tweak other parameters depending on your use case. LangChain is highly flexible, allowing you to design complex interactions by chaining multiple prompts, adding conditional logic, or working with different models.
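For example, you might lower the temperature for more deterministic answers; the exact set of supported options depends on the integration, so treat this as a sketch:

```ts
// A more deterministic configuration of the same placeholder model.
const preciseModel = new ChatNovitaAI({
  model: "meta-llama/llama-3.1-8b-instruct",
  temperature: 0,
});
```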
Function Calling with Novita AI and LangChain
To implement function calling (or tool usage) with Novita AI’s LLM API, LangChain can serve as a convenient framework. In this example, we’ll create a simple math application that allows the model to perform addition and multiplication operations via function calls.
💡 While this guide uses LangChain for convenience, implementing function calling doesn’t require any specific framework. The key is designing the right prompts to make the model understand and correctly invoke functions. LangChain is used here simply to streamline the implementation.
Prerequisites
First, install the required packages:
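Assuming the JavaScript example that follows, a typical install includes the LangChain packages plus zod (for tool schemas) and dotenv (for loading the API key); adjust to your setup:

```bash
npm install @langchain/community @langchain/core zod dotenv
```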
Setting Up the Environment
Create a .env file in your project root and add your Novita AI API key:
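The file only needs the key itself (replace the placeholder value):

```
NOVITA_API_KEY=your-api-key
```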
Implementation Steps
Step 1: Define the Tools
First, let’s create two simple mathematical tools using LangChain’s tool helper (the equivalent of the @tool decorator in LangChain’s Python SDK):
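A sketch in TypeScript using the tool() helper from @langchain/core/tools together with zod schemas; the names addTool and multiplyTool are illustrative:

```ts
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Adds two numbers and returns the result as a string.
const addTool = tool(
  async ({ a, b }: { a: number; b: number }) => String(a + b),
  {
    name: "add",
    description: "Add two numbers together.",
    schema: z.object({
      a: z.number().describe("The first number"),
      b: z.number().describe("The second number"),
    }),
  }
);

// Multiplies two numbers and returns the result as a string.
const multiplyTool = tool(
  async ({ a, b }: { a: number; b: number }) => String(a * b),
  {
    name: "multiply",
    description: "Multiply two numbers together.",
    schema: z.object({
      a: z.number().describe("The first number"),
      b: z.number().describe("The second number"),
    }),
  }
);
```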
Step 2: Create the Tool Execution Function
Next, implement a function to execute the tools:
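A sketch of a small dispatcher that looks up the requested tool by name and runs it; executeToolCall and toolsByName are illustrative names, and the tools come from the previous step:

```ts
// addTool and multiplyTool are the tools defined in the previous step.
const toolsByName: Record<string, typeof addTool> = {
  add: addTool,
  multiply: multiplyTool,
};

// Executes a single tool call emitted by the model and returns its output.
async function executeToolCall(toolCall: {
  name: string;
  args: Record<string, unknown>;
}): Promise<string> {
  const selectedTool = toolsByName[toolCall.name];
  if (!selectedTool) {
    throw new Error(`Unknown tool requested by the model: ${toolCall.name}`);
  }
  // invoke() validates the arguments against the tool's zod schema.
  return String(await selectedTool.invoke(toolCall.args as { a: number; b: number }));
}
```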
Step 3: Set Up the LangChain Pipeline
Create a chain that uses Novita AI’s LLM to select and prepare tool calls:
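A sketch that binds the tools to the model with bindTools(), assuming both the ChatNovitaAI class and the chosen model support tool calling; the model ID is a placeholder:

```ts
import { ChatNovitaAI } from "@langchain/community/chat_models/novita";
import "dotenv/config"; // loads NOVITA_API_KEY from the .env file

// Placeholder model ID; pick a tool-calling-capable model from Novita AI's catalog.
const llm = new ChatNovitaAI({
  model: "meta-llama/llama-3.1-8b-instruct",
  temperature: 0,
});

// bindTools() attaches the tool schemas so the model can emit structured tool calls.
const llmWithTools = llm.bindTools([addTool, multiplyTool]);
```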
Step 4: Create the Main Processing Function
Implement the main function that processes mathematical queries:
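A sketch of the main function; processMathQuery is an illustrative name, and it reuses llmWithTools and executeToolCall from the previous steps:

```ts
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

// Sends a math question to the model, runs any tool calls it makes,
// and returns either the tool results or the model's direct answer.
async function processMathQuery(query: string): Promise<string> {
  const response = await llmWithTools.invoke([
    new SystemMessage(
      "You are a math assistant. Use the add and multiply tools to compute results."
    ),
    new HumanMessage(query),
  ]);

  // If the model answered directly without calling a tool, return its text.
  if (!response.tool_calls || response.tool_calls.length === 0) {
    return String(response.content);
  }

  // Otherwise, execute each requested tool call and report the results.
  const results: string[] = [];
  for (const toolCall of response.tool_calls) {
    const output = await executeToolCall(toolCall);
    results.push(`${toolCall.name}(${JSON.stringify(toolCall.args)}) = ${output}`);
  }
  return results.join("\n");
}
```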
Step 5: Usage Example
Here’s how to use the implementation:
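Putting it together (the query strings are just examples):

```ts
async function main() {
  console.log(await processMathQuery("What is 12 plus 30?"));
  console.log(await processMathQuery("What is 7 multiplied by 8?"));
}

main().catch(console.error);
```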