
Function calling

A growing number of chat models, such as OpenAI and Gemini models, expose a function-calling API that lets you describe functions and their arguments and have the model return a JSON object specifying a function to invoke and the inputs to pass it. Function calling is extremely useful for building tool-using chains and agents, and more generally for getting structured outputs from models.

LangChain comes with a number of utilities to make function-calling easy. Namely, it comes with:

  • simple syntax for binding functions to models
  • converters for formatting various types of objects to the expected function schemas
  • output parsers for extracting the function invocations from API responses
  • chains for getting structured outputs from a model, built on top of function calling

We’ll focus here on the first two points. For a detailed guide on output parsing, check out the OpenAI Tools output parsers; for the structured output chains, check out the Structured output guide.

Before getting started, make sure you have langchain-core installed.

%pip install -qU langchain-core langchain-openai
import getpass
import os

Binding functions​

A number of models implement helper methods that will take care of formatting and binding different function-like objects to the model. Let’s take a look at how we might take the following Pydantic function schema and get different models to invoke it:

from langchain_core.pydantic_v1 import BaseModel, Field


# Note that the docstrings here are crucial, as they will be passed along
# to the model along with the class name.
class Multiply(BaseModel):
    """Multiply two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")

Set up dependencies and API keys:

%pip install -qU langchain-openai
os.environ["OPENAI_API_KEY"] = getpass.getpass()

We can use the ChatOpenAI.bind_tools() method to handle converting Multiply to an OpenAI function and binding it to the model (i.e., passing it in each time the model is invoked).

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo-0125", temperature=0)
llm_with_tools = llm.bind_tools([Multiply])
llm_with_tools.invoke("what's 3 * 12")
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_Q8ZQ97Qrj5zalugSkYMGV1Uo', 'function': {'arguments': '{"a":3,"b":12}', 'name': 'Multiply'}, 'type': 'function'}]})

We can add an output parser to extract the tool calls from the generated message as JSON:

from langchain_core.output_parsers.openai_tools import JsonOutputToolsParser

tool_chain = llm_with_tools | JsonOutputToolsParser()
tool_chain.invoke("what's 3 * 12")
[{'type': 'Multiply', 'args': {'a': 3, 'b': 12}}]

Or back to the original Pydantic class:

from langchain_core.output_parsers.openai_tools import PydanticToolsParser

tool_chain = llm_with_tools | PydanticToolsParser(tools=[Multiply])
tool_chain.invoke("what's 3 * 12")
[Multiply(a=3, b=12)]

If we want to force the model to use a tool (and to use it only once), we can set the tool_choice argument:

llm_with_multiply = llm.bind_tools([Multiply], tool_choice="Multiply")
llm_with_multiply.invoke(
    "make up some numbers if you really want but I'm not forcing you"
)
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_f3DApOzb60iYjTfOhVFhDRMI', 'function': {'arguments': '{"a":5,"b":10}', 'name': 'Multiply'}, 'type': 'function'}]})

For more see the ChatOpenAI API reference.
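The parsed tool calls above are just data; nothing has actually computed 3 * 12 yet. As a minimal sketch (the `TOOLS` registry and `run_tool_calls` helper here are hypothetical, not LangChain API), parsed tool calls in the `[{"type": ..., "args": {...}}]` format produced by JsonOutputToolsParser could be dispatched to local Python functions like this:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers together."""
    return a * b


# Hypothetical registry mapping tool names to local implementations.
TOOLS = {"Multiply": multiply}


def run_tool_calls(tool_calls: list) -> list:
    """Dispatch each parsed tool call to the matching local function."""
    return [TOOLS[call["type"]](**call["args"]) for call in tool_calls]


print(run_tool_calls([{"type": "Multiply", "args": {"a": 3, "b": 12}}]))  # [36]
```

In a real chain you would typically append the results back to the conversation as tool messages rather than just printing them.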

Defining function schemas​

If you need to access function schemas directly, LangChain has a built-in converter that can turn Python functions, Pydantic classes, and LangChain Tools into the OpenAI function-calling JSON schema format:

Python function​

import json

from langchain_core.utils.function_calling import convert_to_openai_tool


def multiply(a: int, b: int) -> int:
    """Multiply two integers together.

    Args:
        a: First integer
        b: Second integer
    """
    return a * b


print(json.dumps(convert_to_openai_tool(multiply), indent=2))
{
  "type": "function",
  "function": {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
      "type": "object",
      "properties": {
        "a": {
          "type": "integer",
          "description": "First integer"
        },
        "b": {
          "type": "integer",
          "description": "Second integer"
        }
      },
      "required": [
        "a",
        "b"
      ]
    }
  }
}
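To make the shape of this schema concrete, here is a small, hand-rolled check (not part of LangChain; `check_args` is a hypothetical helper) that verifies a set of arguments against the `parameters` block of a tool schema like the one above:

```python
# The OpenAI-format tool schema shown above, reproduced as a plain dict.
multiply_tool = {
    "type": "function",
    "function": {
        "name": "multiply",
        "description": "Multiply two integers together.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer", "description": "First integer"},
                "b": {"type": "integer", "description": "Second integer"},
            },
            "required": ["a", "b"],
        },
    },
}


def check_args(tool: dict, args: dict) -> bool:
    """Minimal check that args satisfy the tool's parameter schema."""
    params = tool["function"]["parameters"]
    # All required parameters must be present.
    if any(r not in args for r in params["required"]):
        return False
    # Each supplied argument must have the declared JSON type.
    types = {"integer": int, "string": str, "number": float, "boolean": bool}
    return all(
        isinstance(args[k], types[v["type"]])
        for k, v in params["properties"].items()
        if k in args
    )


print(check_args(multiply_tool, {"a": 3, "b": 12}))  # True
print(check_args(multiply_tool, {"a": 3}))  # False
```

A production system would use a real JSON Schema validator; this sketch only shows how the `required` and `properties` fields relate to the arguments the model is expected to produce.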

Pydantic class​

from langchain_core.pydantic_v1 import BaseModel, Field


class multiply(BaseModel):
    """Multiply two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


print(json.dumps(convert_to_openai_tool(multiply), indent=2))
{
  "type": "function",
  "function": {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
      "type": "object",
      "properties": {
        "a": {
          "description": "First integer",
          "type": "integer"
        },
        "b": {
          "description": "Second integer",
          "type": "integer"
        }
      },
      "required": [
        "a",
        "b"
      ]
    }
  }
}

LangChain Tool​

from typing import Any, Type

from langchain_core.tools import BaseTool


class MultiplySchema(BaseModel):
    """Multiply tool schema."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


class Multiply(BaseTool):
    args_schema: Type[BaseModel] = MultiplySchema
    name: str = "multiply"
    description: str = "Multiply two integers together."

    def _run(self, a: int, b: int, **kwargs: Any) -> Any:
        return a * b


# Note: we're passing in a Multiply object not the class itself.
print(json.dumps(convert_to_openai_tool(Multiply()), indent=2))
{
  "type": "function",
  "function": {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
      "type": "object",
      "properties": {
        "a": {
          "description": "First integer",
          "type": "integer"
        },
        "b": {
          "description": "Second integer",
          "type": "integer"
        }
      },
      "required": [
        "a",
        "b"
      ]
    }
  }
}
