
How to use BaseChatMessageHistory with LangGraph

Prerequisites

This guide assumes familiarity with chat message histories.

We recommend that new LangChain applications take advantage of the built-in LangGraph persistence to implement memory.

In some cases, users may need to keep using an existing persistence solution for chat message history.

Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.
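
For reference, the recommended built-in LangGraph persistence looks roughly like this. This is a minimal sketch, not part of this guide's main example; the placeholder node and the thread_id value are illustrative:

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import START, MessagesState, StateGraph


def placeholder_node(state: MessagesState):
    # A real application would call a chat model here.
    return {"messages": []}


builder = StateGraph(MessagesState)
builder.add_node("model", placeholder_node)
builder.add_edge(START, "model")

# The checkpointer persists graph state (including messages) per thread,
# so no manual history management is needed.
graph = builder.compile(checkpointer=MemorySaver())

# Each conversation is identified by a thread_id in the config.
config = {"configurable": {"thread_id": "conversation-1"}}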

Setup

%%capture --no-stderr
%pip install --upgrade --quiet langchain-anthropic langgraph
import os
from getpass import getpass

if "ANTHROPIC_API_KEY" not in os.environ:
os.environ["ANTHROPIC_API_KEY"] = getpass()

ChatMessageHistory

A message history needs to be parameterized by a conversation ID, or perhaps by a 2-tuple of (user ID, conversation ID).

Many of the LangChain chat message histories will have either a session_id or some namespace that allows keeping track of different conversations. Please refer to the specific implementation to check how it is parameterized.

The built-in InMemoryChatMessageHistory does not contain such a parameterization, so we'll create a dictionary to keep track of the message histories.

import uuid

from langchain_core.chat_history import InMemoryChatMessageHistory

chats_by_session_id = {}


def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:
    chat_history = chats_by_session_id.get(session_id)
    if chat_history is None:
        chat_history = InMemoryChatMessageHistory()
        chats_by_session_id[session_id] = chat_history
    return chat_history
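
If your application instead needs to key histories by user as well as by conversation, the same pattern extends to a (user ID, conversation ID) 2-tuple. A minimal sketch; the helper name below is illustrative, not part of LangChain:

chats_by_user_session: dict[tuple[str, str], InMemoryChatMessageHistory] = {}


def get_chat_history_for_user(
    user_id: str, conversation_id: str
) -> InMemoryChatMessageHistory:
    # Key the history by the (user ID, conversation ID) 2-tuple.
    key = (user_id, conversation_id)
    if key not in chats_by_user_session:
        chats_by_user_session[key] = InMemoryChatMessageHistory()
    return chats_by_user_session[key]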

Usage with LangGraph

Next, we'll set up a basic chatbot using LangGraph. If you're not familiar with LangGraph, you should check out the LangGraph Quickstart tutorial.

We'll create a LangGraph node for the chat model and manually manage the conversation history, taking into account the conversation ID passed as part of the RunnableConfig.

The conversation ID can be passed either as part of the RunnableConfig (as we do here) or as part of the graph state; a sketch of the state-based variant follows the first example below.

import uuid

from langchain_anthropic import ChatAnthropic
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_core.runnables import RunnableConfig
from langgraph.graph import START, MessagesState, StateGraph

# Define a new graph
builder = StateGraph(state_schema=MessagesState)

# Define a chat model
model = ChatAnthropic(model="claude-3-haiku-20240307")


# Define the function that calls the model
def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: {'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}


# Define the single node in the graph
builder.add_edge(START, "model")
builder.add_node("model", call_model)

graph = builder.compile()

# Here, we'll create a unique session ID to identify the conversation
session_id = uuid.uuid4()
config = {"configurable": {"session_id": session_id}}

input_message = HumanMessage(content="hi! I'm bob")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    event["messages"][-1].pretty_print()

# Here, let's confirm that the AI remembers our name!
input_message = HumanMessage(content="what was my name?")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    event["messages"][-1].pretty_print()
================================ Human Message =================================

hi! I'm bob
================================== Ai Message ==================================

Hello Bob! It's nice to meet you. I'm Claude, an AI assistant created by Anthropic. How are you doing today?
================================ Human Message =================================

what was my name?
================================== Ai Message ==================================

You introduced yourself as Bob when you said "hi! I'm bob".
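
As noted earlier, the session ID can instead travel in the graph state rather than in the config. A minimal sketch of that variant; the ChatState schema and node name are illustrative, and it reuses get_chat_history and model from above:

from typing import Annotated

from typing_extensions import TypedDict

from langgraph.graph.message import add_messages


class ChatState(TypedDict):
    messages: Annotated[list[BaseMessage], add_messages]
    session_id: str


def call_model_from_state(state: ChatState) -> dict:
    # Read the session ID from the graph state instead of the config.
    chat_history = get_chat_history(state["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}


# The graph would then be built with StateGraph(state_schema=ChatState)
# and invoked with {"messages": [...], "session_id": session_id}.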

This also supports streaming LLM content token by token if using langgraph >= 0.2.28.

from langchain_core.messages import AIMessageChunk

for msg, metadata in graph.stream(
    {"messages": input_message}, config, stream_mode="messages"
):
    if msg.content and not isinstance(msg, HumanMessage):
        print(msg.content, end="|", flush=True)
API Reference: AIMessageChunk
You| sai|d your| name was Bob.|

Using with RunnableWithMessageHistory

This how-to guide used the messages and add_messages interface of BaseChatMessageHistory directly.

Alternatively, you can use RunnableWithMessageHistory, as LCEL can be used inside any LangGraph node.

To do that, replace the following code:

def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: {'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}

with the corresponding instance of RunnableWithMessageHistory defined in your current application:

runnable = RunnableWithMessageHistory(...)  # From existing code


def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # RunnableWithMessageHistory takes care of reading the message history
    # and updating it with the new human message and ai response.
    ai_message = runnable.invoke(state["messages"], config)
    return {"messages": ai_message}
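
For reference, one way such a runnable might be constructed, reusing the get_chat_history factory and model from earlier in this guide; your existing application will have its own equivalent:

from langchain_core.runnables.history import RunnableWithMessageHistory

# By default, RunnableWithMessageHistory looks up the history via
# get_chat_history(session_id), using the session_id from the config.
runnable = RunnableWithMessageHistory(
    model,
    get_chat_history,
)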
