How to call tools using ToolNode

This guide covers how to use LangGraph's prebuilt ToolNode for tool calling.

ToolNode is a LangChain Runnable that takes graph state (with a list of messages) as input and outputs a state update with the result of tool calls. It is designed to work out-of-the-box with LangGraph's prebuilt ReAct agent, but it can also work with any StateGraph as long as its state has a messages key with an appropriate reducer (see MessagesState).
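
For reference, here is a minimal sketch of such a custom state. You only need this if you want keys beyond what MessagesState already provides (the user_name key below is a hypothetical example); MessagesState itself already defines messages with the add_messages reducer.

from typing import Annotated

from typing_extensions import TypedDict

from langchain_core.messages import BaseMessage
from langgraph.graph.message import add_messages


class CustomState(TypedDict):
    # ToolNode reads tool calls from, and appends ToolMessages to, this key;
    # the add_messages reducer merges each update into the existing list
    messages: Annotated[list[BaseMessage], add_messages]
    # hypothetical extra key, just to show that additional state is allowed
    user_name: str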

Setup

First, let's install the required packages and set our API keys.

%%capture --no-stderr
%pip install --quiet -U langgraph langchain_anthropic
import getpass
import os


def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")


_set_env("ANTHROPIC_API_KEY")

Set up LangSmith for LangGraph development

Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph; read more about how to get started here.
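
If you want the runs in this guide traced, one common setup (an optional assumption on our part, not something the rest of this guide requires) is to set the standard LangSmith environment variables before invoking the graph; the project name below is an arbitrary example.

# optional: enable LangSmith tracing (assumes you have a LangSmith API key)
_set_env("LANGCHAIN_API_KEY")
os.environ["LANGCHAIN_TRACING_V2"] = "true"
# optional: group traces under a project name of your choosing
os.environ["LANGCHAIN_PROJECT"] = "toolnode-how-to"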

Define tools

from langchain_core.messages import AIMessage
from langchain_core.tools import tool

from langgraph.prebuilt import ToolNode

API Reference: AIMessage | tool

@tool
def get_weather(location: str):
    """Call to get the current weather."""
    if location.lower() in ["sf", "san francisco"]:
        return "It's 60 degrees and foggy."
    else:
        return "It's 90 degrees and sunny."


@tool
def get_coolest_cities():
    """Get a list of coolest cities"""
    return "nyc, sf"

tools = [get_weather, get_coolest_cities]
tool_node = ToolNode(tools)

Manually call ToolNode

ToolNode operates on graph state with a list of messages. It expects the last message in the list to be an AIMessage with the tool_calls parameter.

Let's first look at how to invoke the tool node manually:

message_with_single_tool_call = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "get_weather",
            "args": {"location": "sf"},
            "id": "tool_call_id",
            "type": "tool_call",
        }
    ],
)

tool_node.invoke({"messages": [message_with_single_tool_call]})
{'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='tool_call_id')]}

Note that typically you don't need to create an AIMessage manually; it will be automatically generated by any LangChain chat model that supports tool calling.

You can also do parallel tool calling with ToolNode if you pass multiple tool calls to the AIMessage's tool_calls parameter:

message_with_multiple_tool_calls = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "get_coolest_cities",
            "args": {},
            "id": "tool_call_id_1",
            "type": "tool_call",
        },
        {
            "name": "get_weather",
            "args": {"location": "sf"},
            "id": "tool_call_id_2",
            "type": "tool_call",
        },
    ],
)

tool_node.invoke({"messages": [message_with_multiple_tool_calls]})
{'messages': [ToolMessage(content='nyc, sf', name='get_coolest_cities', tool_call_id='tool_call_id_1'),
  ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='tool_call_id_2')]}

Using with chat models

We'll be using a small chat model from Anthropic in our example. To use chat models with tool calling, we first need to make sure the model is aware of the available tools. We do this by calling the .bind_tools method on the ChatAnthropic model.

from typing import Literal

from langchain_anthropic import ChatAnthropic
from langgraph.graph import StateGraph, MessagesState
from langgraph.prebuilt import ToolNode


model_with_tools = ChatAnthropic(
    model="claude-3-haiku-20240307", temperature=0
).bind_tools(tools)
model_with_tools.invoke("what's the weather in sf?").tool_calls
[{'name': 'get_weather',
  'args': {'location': 'San Francisco'},
  'id': 'toolu_01Fwm7dg1mcJU43Fkx2pqgm8',
  'type': 'tool_call'}]

As you can see, the AI message generated by the chat model already has tool_calls populated, so we can just pass it directly to ToolNode:

tool_node.invoke({"messages": [model_with_tools.invoke("what's the weather in sf?")]})
{'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='toolu_01LFvAVT3xJMeZS6kbWwBGZK')]}

ReAct agent

Next, let's see how to use ToolNode within a LangGraph graph. Let's set up a graph implementation of a ReAct agent. This agent takes some query as input, then repeatedly calls tools until it has enough information to resolve the query. We'll be using ToolNode and the Anthropic model with the tools we just defined.

from typing import Literal

from langgraph.graph import StateGraph, MessagesState, START, END


def should_continue(state: MessagesState):
    messages = state["messages"]
    last_message = messages[-1]
    if last_message.tool_calls:
        return "tools"
    return END


def call_model(state: MessagesState):
    messages = state["messages"]
    response = model_with_tools.invoke(messages)
    return {"messages": [response]}


workflow = StateGraph(MessagesState)

# Define the two nodes we will cycle between
workflow.add_node("agent", call_model)
workflow.add_node("tools", tool_node)

workflow.add_edge(START, "agent")
workflow.add_conditional_edges("agent", should_continue, ["tools", END])
workflow.add_edge("tools", "agent")

app = workflow.compile()

from IPython.display import Image, display

try:
    display(Image(app.get_graph().draw_mermaid_png()))
except Exception:
    # This requires some extra dependencies and is optional
    pass

Let's try it out!

# example with a single tool call
for chunk in app.stream(
    {"messages": [("human", "what's the weather in sf?")]}, stream_mode="values"
):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's the weather in sf?
================================== Ai Message ==================================

[{'text': "Okay, let's check the weather in San Francisco:", 'type': 'text'}, {'id': 'toolu_01LdmBXYeccWKdPrhZSwFCDX', 'input': {'location': 'San Francisco'}, 'name': 'get_weather', 'type': 'tool_use'}]
Tool Calls:
  get_weather (toolu_01LdmBXYeccWKdPrhZSwFCDX)
 Call ID: toolu_01LdmBXYeccWKdPrhZSwFCDX
  Args:
    location: San Francisco
================================= Tool Message =================================
Name: get_weather

It's 60 degrees and foggy.
================================== Ai Message ==================================

The weather in San Francisco is currently 60 degrees with foggy conditions.

# example with multiple tool calls in succession

for chunk in app.stream(
    {"messages": [("human", "what's the weather in the coolest cities?")]},
    stream_mode="values",
):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's the weather in the coolest cities?
================================== Ai Message ==================================

[{'text': "Okay, let's find out the weather in the coolest cities:", 'type': 'text'}, {'id': 'toolu_01LFZUWTccyveBdaSAisMi95', 'input': {}, 'name': 'get_coolest_cities', 'type': 'tool_use'}]
Tool Calls:
  get_coolest_cities (toolu_01LFZUWTccyveBdaSAisMi95)
 Call ID: toolu_01LFZUWTccyveBdaSAisMi95
  Args:
================================= Tool Message =================================
Name: get_coolest_cities

nyc, sf
================================== Ai Message ==================================

[{'text': "Now let's get the weather for those cities:", 'type': 'text'}, {'id': 'toolu_01RHPQBhT1u6eDnPqqkGUpsV', 'input': {'location': 'nyc'}, 'name': 'get_weather', 'type': 'tool_use'}]
Tool Calls:
  get_weather (toolu_01RHPQBhT1u6eDnPqqkGUpsV)
 Call ID: toolu_01RHPQBhT1u6eDnPqqkGUpsV
  Args:
    location: nyc
================================= Tool Message =================================
Name: get_weather

It's 90 degrees and sunny.
================================== Ai Message ==================================

[{'id': 'toolu_01W5sFGF8PfgYzdY4CqT5c6e', 'input': {'location': 'sf'}, 'name': 'get_weather', 'type': 'tool_use'}]
Tool Calls:
  get_weather (toolu_01W5sFGF8PfgYzdY4CqT5c6e)
 Call ID: toolu_01W5sFGF8PfgYzdY4CqT5c6e
  Args:
    location: sf
================================= Tool Message =================================
Name: get_weather

It's 60 degrees and foggy.
================================== Ai Message ==================================

Based on the results, it looks like the weather in the coolest cities is:
- New York City: 90 degrees and sunny
- San Francisco: 60 degrees and foggy

So the weather in the coolest cities is a mix of warm and cool temperatures, with some sunny and some foggy conditions.

ToolNode can also handle errors during tool execution. You can enable/disable this by setting handle_tool_errors=True (it's enabled by default). See our guide on handling errors in ToolNode here.
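
As a rough sketch of what that looks like (the divide tool below is an illustrative assumption, and the exact error text ToolNode returns may differ), an exception raised inside a tool is surfaced back to the model as an error ToolMessage instead of crashing the graph:

@tool
def divide(a: float, b: float):
    """Divide a by b."""
    return a / b  # raises ZeroDivisionError when b == 0


# handle_tool_errors=True (the default) converts the exception into a
# ToolMessage describing the error, so the graph keeps running and the
# model can decide how to recover
error_handling_node = ToolNode([divide], handle_tool_errors=True)

error_handling_node.invoke(
    {
        "messages": [
            AIMessage(
                content="",
                tool_calls=[
                    {
                        "name": "divide",
                        "args": {"a": 1, "b": 0},
                        "id": "tool_call_id",
                        "type": "tool_call",
                    }
                ],
            )
        ]
    }
)
# expected: a ToolMessage whose content reports the ZeroDivisionError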
