How to disable streaming for models that don't support it¶
Some chat models, including OpenAI's new O1 models (depending on when you're reading this), do not support streaming. This can cause problems when using the astream_events API, which calls the model in streaming mode and expects streaming to work.
In this guide, we'll show you how to disable streaming for models that don't support it, ensuring they are never called in streaming mode, even when invoked through the astream_events API.
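For context, here is a minimal sketch of what astream_events normally does with a model that does support streaming (the model name gpt-4o-mini is just an assumed stand-in): each incremental token surfaces as an on_chat_model_stream event.

from langchain_openai import ChatOpenAI

# Sketch with an assumed streaming-capable model: tokens arrive
# incrementally as "on_chat_model_stream" events.
streaming_llm = ChatOpenAI(model="gpt-4o-mini")

async for event in streaming_llm.astream_events("hi there", version="v2"):
    if event["event"] == "on_chat_model_stream":
        print(event["data"]["chunk"].content, end="", flush=True)

With a non-streaming model such as o1, that same streaming-mode call is what fails, which is the problem this guide addresses.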
from langchain_openai import ChatOpenAI
from langgraph.graph import MessagesState
from langgraph.graph import StateGraph, START, END
llm = ChatOpenAI(model="o1-preview", temperature=1)
graph_builder = StateGraph(MessagesState)
def chatbot(state: MessagesState):
    return {"messages": [llm.invoke(state["messages"])]}
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)
graph = graph_builder.compile()
API Reference: ChatOpenAI | StateGraph | START | END
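As a quick sanity check, note that a plain invoke never puts the model in streaming mode, so the compiled graph already works this way (a minimal sketch):

# Sanity-check sketch: .invoke() does not use streaming, so this succeeds
# even though o1-preview cannot stream.
result = graph.invoke(
    {"messages": [{"role": "user", "content": "how many r's are in strawberry?"}]}
)
print(result["messages"][-1].content)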
Without disabling streaming¶
Now that we've defined our graph, let's try calling astream_events without disabling streaming. This should throw an error, because the o1 model does not support streaming natively:
input = {"messages": {"role": "user", "content": "how many r's are in strawberry?"}}
try:
    async for event in graph.astream_events(input, version="v2"):
        if event["event"] == "on_chat_model_end":
            print(event["data"]["output"].content, end="", flush=True)
except Exception:
    print("Streaming not supported!")
Disabling streaming¶
Now, without making any changes to our graph, let's set the model's disable_streaming parameter to True, which resolves the issue:
llm = ChatOpenAI(model="o1-preview", temperature=1, disable_streaming=True)
graph_builder = StateGraph(MessagesState)
def chatbot(state: MessagesState):
    return {"messages": [llm.invoke(state["messages"])]}
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)
graph = graph_builder.compile()
Now, rerunning with the same input, we should see no errors:
input = {"messages": {"role": "user", "content": "how many r's are in strawberry?"}}
async for event in graph.astream_events(input, version="v2"):
    if event["event"] == "on_chat_model_end":
        print(event["data"]["output"].content, end="", flush=True)
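Beyond True and False, disable_streaming in langchain-core's BaseChatModel also accepts the string "tool_calling", which disables streaming only for invocations where tools are bound to the model. A short sketch on the same setup as above:

# Sketch: disable streaming only for tool-calling invocations; other
# calls still stream if the model supports it.
llm = ChatOpenAI(model="o1-preview", temperature=1, disable_streaming="tool_calling")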