LangChain is a versatile Python library that simplifies the process of building applications with large language models (LLMs). One of its most powerful features is the ability to create custom tools, which allow you to extend the capabilities of your AI systems. In this blog post, we'll explore how to harness the power of custom tools and function calling in LangChain.
Custom tools in LangChain let you wrap your own Python functions, complete with typed input schemas, so that agents can call them as part of their reasoning.
Let's start by creating a simple custom tool in LangChain. We'll build a tool that converts temperatures between Celsius and Fahrenheit.
from typing import Type

from langchain.tools import BaseTool
from pydantic import BaseModel, Field


class TemperatureInput(BaseModel):
    temperature: float = Field(description="The temperature to convert")
    unit: str = Field(description="The unit of the input temperature (C or F)")


class TemperatureConverter(BaseTool):
    name: str = "temperature_converter"
    description: str = "Converts temperatures between Celsius and Fahrenheit"
    args_schema: Type[BaseModel] = TemperatureInput

    def _run(self, temperature: float, unit: str) -> str:
        if unit.upper() == "C":
            result = (temperature * 9 / 5) + 32
            return f"{temperature}°C is equal to {result:.2f}°F"
        elif unit.upper() == "F":
            result = (temperature - 32) * 5 / 9
            return f"{temperature}°F is equal to {result:.2f}°C"
        else:
            return "Invalid unit. Please use 'C' for Celsius or 'F' for Fahrenheit."

    async def _arun(self, temperature: float, unit: str) -> str:
        # Async execution is not implemented for this tool
        raise NotImplementedError("TemperatureConverter does not support async")
In this example, we've created a TemperatureConverter tool that inherits from BaseTool. We've defined the input schema using a Pydantic model and implemented the _run method to perform the conversion.
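Before wiring the tool into an agent, it can help to sanity-check it on its own. A minimal sketch, assuming the TemperatureConverter class above is already defined in the same session:

# Instantiate the tool and call it with a dict keyed by the args_schema fields
converter = TemperatureConverter()
print(converter.run({"temperature": 25.0, "unit": "C"}))  # 25.0°C is equal to 77.00°F
print(converter.run({"temperature": 98.6, "unit": "F"}))  # 98.6°F is equal to 37.00°C

Passing a dict keyed by the schema's field names lets Pydantic validate the input before _run is called.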
Now that we have our custom tool, let's see how to integrate it with a LangChain agent:
from langchain.agents import initialize_agent
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)

# Because TemperatureConverter is a BaseTool subclass, it can be passed
# to the agent directly, keeping its name, description, and args_schema.
tools = [TemperatureConverter()]

agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    verbose=True,
)

result = agent.run("Convert 25 degrees Celsius to Fahrenheit")
print(result)
This code snippet demonstrates how to create an agent that can use our custom temperature conversion tool.
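ReAct-style agents occasionally emit output the parser can't handle. One common mitigation, sketched here under the assumption that the tools and llm objects from the previous snippet are still in scope, is to let the executor recover from parsing errors instead of raising:

# handle_parsing_errors is forwarded to the underlying AgentExecutor
robust_agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    verbose=True,
    handle_parsing_errors=True,
)
print(robust_agent.run("Please convert 100 degrees Fahrenheit to Celsius"))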
Function calling is another powerful feature in LangChain that allows you to define specific functions that language models can use. This capability enables more structured and predictable outputs from your AI systems.
Let's create a simple function that calculates the area of a circle:
import math

from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate


def calculate_circle_area(radius: float) -> float:
    return math.pi * radius ** 2


llm = OpenAI(temperature=0)

# Both placeholders used in the template must be declared as input variables
prompt = PromptTemplate(
    input_variables=["radius", "area"],
    template="The area of a circle with radius {radius} is {area:.2f} square units.",
)

chain = LLMChain(llm=llm, prompt=prompt)

radius = 5
area = calculate_circle_area(radius)
result = chain.run(radius=radius, area=area)
print(result)
In this example, we've defined a calculate_circle_area function and used it in conjunction with an LLMChain to generate a natural language response about the area of a circle.
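If you want less boilerplate than a full BaseTool subclass, the same function can usually be exposed through LangChain's tool decorator, which infers the argument schema from the function signature and uses the docstring as the description. A sketch, assuming a LangChain version that ships the decorator:

import math

from langchain.tools import tool


@tool
def circle_area(radius: float) -> float:
    """Calculate the area of a circle from its radius."""
    return math.pi * radius ** 2


# The decorated function is now a tool that agents can call
print(circle_area.run({"radius": 5.0}))  # ~78.54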
The real power of LangChain comes from combining custom tools and function calling. Let's create a more complex example that uses both:
import math
from typing import Type

from langchain.agents import initialize_agent
from langchain.llms import OpenAI
from langchain.tools import BaseTool
from pydantic import BaseModel, Field


class CircleInput(BaseModel):
    radius: float = Field(description="The radius of the circle")


class CircleCalculator(BaseTool):
    name: str = "circle_calculator"
    description: str = "Calculates various properties of a circle given its radius"
    args_schema: Type[BaseModel] = CircleInput

    def _run(self, radius: float) -> str:
        area = math.pi * radius ** 2
        circumference = 2 * math.pi * radius
        return (
            f"For a circle with radius {radius}:\n"
            f"Area: {area:.2f}\n"
            f"Circumference: {circumference:.2f}"
        )

    async def _arun(self, radius: float) -> str:
        # Async execution is not implemented for this tool
        raise NotImplementedError("CircleCalculator does not support async")


llm = OpenAI(temperature=0)

# Pass the tool directly; its description tells the agent when to use it
tools = [CircleCalculator()]

agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    verbose=True,
)

result = agent.run("What are the area and circumference of a circle with radius 7?")
print(result)
This example combines a custom tool (CircleCalculator) with function calling capabilities, allowing the agent to perform the calculations and provide an informative response about the circle's properties.
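Because CircleCalculator declares a typed args_schema, the same tool can also be driven by a model's native function-calling support rather than the ReAct text protocol. A hedged sketch, assuming an OpenAI chat model is available and your LangChain version exposes AgentType.OPENAI_FUNCTIONS:

from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI

# The tool's name, description, and args_schema are sent to the model
# as a function definition, so arguments arrive already structured.
chat_llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
fc_agent = initialize_agent(
    [CircleCalculator()],
    chat_llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    verbose=True,
)
print(fc_agent.run("Give me the area and circumference of a circle with radius 3.5"))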
Custom tools and function calling in LangChain open up a world of possibilities for creating powerful and flexible AI applications. By combining these features, you can build sophisticated systems that leverage the strengths of language models while incorporating your own domain-specific knowledge and functionalities.