LangChain's ConversationSummaryMemory

Now let's look at a slightly more sophisticated memory type: ConversationSummaryMemory. This kind of memory builds up a summary of the conversation over time, which is useful for condensing information from a conversation as it grows. It summarizes the exchange as it happens and stores the current summary in memory; that summary can then be injected into a prompt or chain as the conversation history so far. This memory is most useful for longer conversations, where keeping the full message history verbatim in the prompt would consume too many tokens.

Let's start by exploring the basic functionality of this memory type.

Example code:

from langchain.memory import ConversationSummaryMemory, ChatMessageHistory
from langchain.llms import OpenAI

# The summary itself is generated by an LLM, so one must be supplied
memory = ConversationSummaryMemory(llm=OpenAI(temperature=0))
memory.save_context({"input": "hi"}, {"output": "whats up"})

memory.load_memory_variables({})

Output:

    {'history': '\nThe human greets the AI, to which the AI responds.'}
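
Under the hood, the summary above is produced by the llm you pass in, driven by a dedicated summarization prompt. If you want to see (or later override) the instructions the memory sends to the model, you can inspect that prompt. A minimal sketch, assuming the default prompt attribute on ConversationSummaryMemory:

# The memory exposes its summarization prompt as `memory.prompt`; printing
# the template shows how the existing summary and the new lines of
# conversation are combined into a single summarization request.
print(memory.prompt.template)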

We can also get the history as a list of messages (useful if you are using this memory with a chat model).

# return_messages=True returns the summary wrapped in a message object
# (a SystemMessage) instead of a plain string
memory = ConversationSummaryMemory(llm=OpenAI(temperature=0), return_messages=True)
memory.save_context({"input": "hi"}, {"output": "whats up"})

memory.load_memory_variables({})

Output:

    {'history': [SystemMessage(content='\nThe human greets the AI, to which the AI responds.', additional_kwargs={})]}

We can also use the predict_new_summary method directly.

messages = memory.chat_memory.messages
previous_summary = ""
memory.predict_new_summary(messages, previous_summary)

Output:

    '\nThe human greets the AI, to which the AI responds.'
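
Because predict_new_summary takes both the messages and an existing summary, it can also be used incrementally, folding a new exchange into an already-computed summary instead of re-summarizing the whole history. A minimal sketch continuing the snippet above (the extra conversation turn is hypothetical):

# Summarize what we have so far, then fold only the newest turn into it.
summary_so_far = memory.predict_new_summary(messages, previous_summary)
memory.save_context({"input": "can you remember this for later?"},
                    {"output": "sure, I will keep it in mind"})
latest_turn = memory.chat_memory.messages[-2:]  # just the new human/AI pair
memory.predict_new_summary(latest_turn, summary_so_far)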

Initializing with messages

If you have messages outside this class, you can easily initialize it from a ChatMessageHistory. The summary is computed during loading.

Example code:

history = ChatMessageHistory()
history.add_user_message("hi")
history.add_ai_message("hi there!")

# from_messages computes the summary from the existing history at load time
memory = ConversationSummaryMemory.from_messages(llm=OpenAI(temperature=0), chat_memory=history, return_messages=True)

memory.buffer

Output:

    '\nThe human greets the AI, to which the AI responds with a friendly greeting.'
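
If you already have a summary from an earlier session, you can skip recomputing it and seed the memory directly through the buffer argument. A minimal sketch (the summary string here is just illustrative):

# Initialize with a previously generated summary instead of recomputing it
memory = ConversationSummaryMemory(
    llm=OpenAI(temperature=0),
    buffer="The human greets the AI, and the AI responds with a friendly greeting.",
    chat_memory=history,
    return_messages=True
)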

Using in a chain

Let's walk through an example of using this memory in a chain, again setting verbose=True so we can see the prompt.

Example code:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain

llm = OpenAI(temperature=0)
conversation_with_summary = ConversationChain(
    llm=llm,
    # a separate LLM instance is used here just to generate the summaries
    memory=ConversationSummaryMemory(llm=OpenAI()),
    verbose=True
)
conversation_with_summary.predict(input="Hi, what's up?")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
    
    Current conversation:
    
    Human: Hi, what's up?
    AI:
    
    > Finished chain.

    " Hi there! I'm doing great. I'm currently helping a customer with a technical issue. How about you?"

Example code:

conversation_with_summary.predict(input="Tell me more about it!")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
    
    Current conversation:
    
    The human greeted the AI and asked how it was doing. The AI replied that it was doing great and was currently helping a customer with a technical issue.
    Human: Tell me more about it!
    AI:
    
    > Finished chain.

    " Sure! The customer is having trouble with their computer not connecting to the internet. I'm helping them troubleshoot the issue and figure out what the problem is. So far, we've tried resetting the router and checking the network settings, but the issue still persists. We're currently looking into other possible solutions."

Example code:

conversation_with_summary.predict(input="Very cool -- what is the scope of the project?")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
    
    Current conversation:
    
    The human greeted the AI and asked how it was doing. The AI replied that it was doing great and was currently helping a customer with a technical issue where their computer was not connecting to the internet. The AI was troubleshooting the issue and had already tried resetting the router and checking the network settings, but the issue still persisted and they were looking into other possible solutions.
    Human: Very cool -- what is the scope of the project?
    AI:
    
    > Finished chain.

    " The scope of the project is to troubleshoot the customer's computer issue and find a solution that will allow them to connect to the internet. We are currently exploring different possibilities and have already tried resetting the router and checking the network settings, but the issue still persists."

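After these turns, the running summary is kept on the memory object attached to the chain, so you can inspect it directly (the exact text will vary from run to run):

# The condensed conversation lives in the memory's buffer attribute
print(conversation_with_summary.memory.buffer)
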
That's it!
