LangChain's ConversationSummaryMemory

Now let's look at a slightly more complex memory type: ConversationSummaryMemory. This kind of memory builds up a summary of the conversation over time, which is useful for condensing the information exchanged as the conversation grows. ConversationSummaryMemory summarizes the conversation as it happens and stores the current summary in memory; that summary can then be injected into a prompt or chain to represent the conversation so far. It is most useful for longer conversations, where keeping the full message history verbatim in the prompt would consume too many tokens.

Let's start by exploring the basic functionality of this memory type.

Example code:

from langchain.memory import ConversationSummaryMemory, ChatMessageHistory
from langchain.llms import OpenAI

# The memory uses an LLM to maintain a running summary of the conversation.
memory = ConversationSummaryMemory(llm=OpenAI(temperature=0))
memory.save_context({"input": "hi"}, {"output": "whats up"})

memory.load_memory_variables({})

Output:

    {'history': '\nThe human greets the AI, to which the AI responds.'}
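
Not part of the original example, but worth noting: each additional save_context call asks the LLM to fold the new exchange into the running summary, so the stored history stays compact. A minimal sketch with made-up inputs:

memory.save_context(
    {"input": "I'm learning LangChain"},
    {"output": "Great, feel free to ask questions"},
)

# The summary is regenerated by the LLM, so the exact wording of the result
# will vary between runs and models.
memory.load_memory_variables({})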

We can also get the history as a list of messages (this is very useful if you are using the memory with a chat model).

memory = ConversationSummaryMemory(llm=OpenAI(temperature=0), return_messages=True)
memory.save_context({"input": "hi"}, {"output": "whats up"})

memory.load_memory_variables({})

Output:

    {'history': [SystemMessage(content='\nThe human greets the AI, to which the AI responds.', additional_kwargs={})]}
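
To show why the message form is convenient with chat models, here is a minimal sketch of my own (not from the original post) that feeds the summary messages into a chat prompt through MessagesPlaceholder; the prompt layout and variable names are assumptions:

from langchain.chat_models import ChatOpenAI
from langchain.prompts import (
    ChatPromptTemplate,
    MessagesPlaceholder,
    HumanMessagePromptTemplate,
)

# The summary comes back as a list of messages, so it slots straight into
# a chat prompt via a MessagesPlaceholder.
prompt = ChatPromptTemplate.from_messages([
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}"),
])

chat = ChatOpenAI(temperature=0)
history = memory.load_memory_variables({})["history"]
chat(prompt.format_messages(history=history, input="What have we talked about?"))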

We can also use the predict_new_summary method directly.

# Summarize the messages stored so far, starting from an empty previous summary.
messages = memory.chat_memory.messages
previous_summary = ""
memory.predict_new_summary(messages, previous_summary)

Output:

    '\nThe human greets the AI, to which the AI responds.'
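
predict_new_summary can also fold new messages into an existing summary, which is how the memory updates itself incrementally. A small sketch with hypothetical messages (the returned wording depends on the model):

from langchain.schema import HumanMessage, AIMessage

# Hypothetical follow-up messages, used only for illustration.
new_messages = [
    HumanMessage(content="Can you remember what we said?"),
    AIMessage(content="Sure, so far we have only exchanged greetings."),
]
previous_summary = "The human greets the AI, to which the AI responds."

# Returns an updated summary string that incorporates the new messages.
memory.predict_new_summary(new_messages, previous_summary)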

Initializing with messages

If you have messages outside this class, you can easily initialize it with ChatMessageHistory; the summary is computed during loading.

Example code:

history = ChatMessageHistory()
history.add_user_message("hi")
history.add_ai_message("hi there!")

# Build the memory from existing messages; the summary is computed here.
memory = ConversationSummaryMemory.from_messages(
    llm=OpenAI(temperature=0),
    chat_memory=history,
    return_messages=True,
)

memory.buffer

Output:

    '\nThe human greets the AI, to which the AI responds with a friendly greeting.'
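
If you already have a summary and want to skip recomputing it, the memory can also be seeded directly through its buffer field instead of going through from_messages; a sketch, assuming the summary string below:

memory = ConversationSummaryMemory(
    llm=OpenAI(temperature=0),
    # Pre-computed summary supplied up front, so no LLM call is needed here.
    buffer="The human greets the AI, to which the AI responds with a friendly greeting.",
    chat_memory=history,
    return_messages=True,
)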

Using in a chain

Let's walk through an example of using this memory in a chain, again setting verbose=True so that we can see the prompt.

Example code:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain

llm = OpenAI(temperature=0)
# The chain carries a ConversationSummaryMemory, so only a summary of past
# turns (not the full transcript) is injected into each prompt.
conversation_with_summary = ConversationChain(
    llm=llm,
    memory=ConversationSummaryMemory(llm=OpenAI()),
    verbose=True,
)
conversation_with_summary.predict(input="Hi, what's up?")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
    
    Current conversation:
    
    Human: Hi, what's up?
    AI:
    
    > Finished chain.

    " Hi there! I'm doing great. I'm currently helping a customer with a technical issue. How about you?"

Example code:

conversation_with_summary.predict(input="Tell me more about it!")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
    
    Current conversation:
    
    The human greeted the AI and asked how it was doing. The AI replied that it was doing great and was currently helping a customer with a technical issue.
    Human: Tell me more about it!
    AI:
    
    > Finished chain.

    " Sure! The customer is having trouble with their computer not connecting to the internet. I'm helping them troubleshoot the issue and figure out what the problem is. So far, we've tried resetting the router and checking the network settings, but the issue still persists. We're currently looking into other possible solutions."

Example code:

conversation_with_summary.predict(input="Very cool -- what is the scope of the project?")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
    
    Current conversation:
    
    The human greeted the AI and asked how it was doing. The AI replied that it was doing great and was currently helping a customer with a technical issue where their computer was not connecting to the internet. The AI was troubleshooting the issue and had already tried resetting the router and checking the network settings, but the issue still persisted and they were looking into other possible solutions.
    Human: Very cool -- what is the scope of the project?
    AI:
    
    > Finished chain.

    " The scope of the project is to troubleshoot the customer's computer issue and find a solution that will allow them to connect to the internet. We are currently exploring different possibilities and have already tried resetting the router and checking the network settings, but the issue still persists."

That's it!
