
Migrating from LLMMathChain

LLMMathChain enabled the evaluation of mathematical expressions generated by a large language model. Instructions for generating the expression were formatted into the prompt, and the expression was parsed out of the string response before being evaluated with the numexpr library.
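As a rough illustration of that parse-and-evaluate step, here is a minimal sketch (simplified; the real chain also constructs the prompt and handles error cases, and llm_output below is a hypothetical model response):

import re

import numexpr

llm_output = "```text\n551368 / 82\n```"  # hypothetical LLM response
match = re.search(r"```text\n(.*?)\n```", llm_output, re.DOTALL)  # parse the expression out of the string
expression = match.group(1) if match else llm_output
print(numexpr.evaluate(expression.strip()))  # 6724.0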

This is more naturally accomplished via tool calling. We can equip a chat model with a simple calculator tool leveraging numexpr and construct a simple chain around it using LangGraph. Some advantages of this approach include:

  • We take advantage of the tool-calling capabilities of chat models that have been fine-tuned for this purpose;
  • We reduce parsing errors from extracting an expression out of a string LLM response (a short sketch of the resulting structured tool call follows this list);
  • Delegation of instructions to message roles (e.g., chat models can understand what a ToolMessage represents without additional prompting);
  • Support for streaming, both of individual tokens and chain steps.
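For example, here is a minimal sketch of the structured tool call a tool-calling model returns (it assumes OPENAI_API_KEY is already set; the calculator body is left empty here because the full implementation appears below):

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def calculator(expression: str) -> str:
    """Evaluate a single-line mathematical expression."""
    ...  # evaluation is shown in the full example below


llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([calculator])
msg = llm_with_tools.invoke("What is 551368 divided by 82?")
print(msg.tool_calls)  # e.g. [{'name': 'calculator', 'args': {'expression': '551368 / 82'}, ...}]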
%pip install --upgrade --quiet numexpr
import os
from getpass import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass()

Legacy

from langchain.chains import LLMMathChain
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")

chain = LLMMathChain.from_llm(llm)

chain.invoke("What is 551368 divided by 82?")
{'question': 'What is 551368 divided by 82?', 'answer': 'Answer: 6724.0'}

LangGraph

import math
from typing import Annotated, Sequence

import numexpr
from langchain_core.messages import BaseMessage
from langchain_core.runnables import RunnableConfig
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.graph import END, StateGraph
from langgraph.graph.message import add_messages
from langgraph.prebuilt.tool_node import ToolNode
from typing_extensions import TypedDict


@tool
def calculator(expression: str) -> str:
    """Calculate expression using Python's numexpr library.

    Expression should be a single line mathematical expression
    that solves the problem.

    Examples:
        "37593 * 67" for "37593 times 67"
        "37593**(1/5)" for "37593^(1/5)"
    """
    local_dict = {"pi": math.pi, "e": math.e}
    return str(
        numexpr.evaluate(
            expression.strip(),
            global_dict={},  # restrict access to globals
            local_dict=local_dict,  # add common mathematical functions
        )
    )


llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
tools = [calculator]
llm_with_tools = llm.bind_tools(tools, tool_choice="any")  # "any" requires the model to call at least one tool


class ChainState(TypedDict):
    """LangGraph state."""

    messages: Annotated[Sequence[BaseMessage], add_messages]


async def acall_chain(state: ChainState, config: RunnableConfig):
    last_message = state["messages"][-1]
    response = await llm_with_tools.ainvoke(state["messages"], config)
    return {"messages": [response]}


async def acall_model(state: ChainState, config: RunnableConfig):
    response = await llm.ainvoke(state["messages"], config)
    return {"messages": [response]}


graph_builder = StateGraph(ChainState)
graph_builder.add_node("call_tool", acall_chain)
graph_builder.add_node("execute_tool", ToolNode(tools))
graph_builder.add_node("call_model", acall_model)
graph_builder.set_entry_point("call_tool")
graph_builder.add_edge("call_tool", "execute_tool")
graph_builder.add_edge("execute_tool", "call_model")
graph_builder.add_edge("call_model", END)
chain = graph_builder.compile()
# Visualize chain:

from IPython.display import Image

Image(chain.get_graph().draw_mermaid_png())

# Stream chain steps:

example_query = "What is 551368 divided by 82"

events = chain.astream(
    {"messages": [("user", example_query)]},
    stream_mode="values",
)
async for event in events:
    event["messages"][-1].pretty_print()
================================ Human Message =================================

What is 551368 divided by 82
================================== Ai Message ==================================
Tool Calls:
  calculator (call_1ic3gjuII0Aq9vxlSYiwvjSb)
 Call ID: call_1ic3gjuII0Aq9vxlSYiwvjSb
  Args:
    expression: 551368 / 82
================================= Tool Message =================================
Name: calculator

6724.0
================================== Ai Message ==================================

551368 divided by 82 equals 6724.
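
Token-level streaming is also possible. As a rough sketch using the astream_events API from langchain-core (the event name below follows its v2 schema), we can print content chunks from the chat model as they are generated:

events = chain.astream_events(
    {"messages": [("user", example_query)]},
    version="v2",
)
async for event in events:
    if event["event"] == "on_chat_model_stream":  # chunks emitted by the chat model
        chunk = event["data"]["chunk"]
        if chunk.content:
            print(chunk.content, end="|")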

Next steps

See guides for building and working with tools here.

Check out the LangGraph documentation for detail on building with LangGraph.

