Natural Language Processing from Beginner to Application - LangChain: Memory [Custom Conversation Memory and Custom Memory Classes]

Customizing Conversation Memory

This section introduces several ways to customize conversation memory:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = OpenAI(temperature=0)
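
Before customizing anything, it helps to see what ConversationBufferMemory stores by default: a running transcript in which the two speakers are labeled "Human" and "AI". A minimal sketch (the example strings are made up):

# The default buffer simply accumulates the exchange as labeled text.
memory = ConversationBufferMemory()
memory.save_context({"input": "Hi there!"}, {"output": "Hello!"})
print(memory.load_memory_variables({}))
# -> {'history': 'Human: Hi there!\nAI: Hello!'}

Both labels can be overridden, as the following subsections show.
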
AI Prefix

The first approach is to change the AI prefix in the conversation summary. By default this is set to "AI", but you can set it to anything you want. Note that if you change the prefix, you should also change the prompt used in the chain to reflect the new name. The example below walks through the process.

# Here it is by default set to "AI"
conversation = ConversationChain(
    llm=llm, 
    verbose=True, 
    memory=ConversationBufferMemory()
)
conversation.predict(input="Hi there!")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi there!
AI:

> Finished ConversationChain chain.

Output:

" Hi there! It's nice to meet you. How can I help you today?"

Input:

conversation.predict(input="What's the weather?")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi there!
AI:  Hi there! It's nice to meet you. How can I help you today?
Human: What's the weather?
AI:

> Finished ConversationChain chain.

Output:

' The current weather is sunny and warm with a temperature of 75 degrees Fahrenheit. The forecast for the next few days is sunny with temperatures in the mid-70s.'

Input:

# Now we can override it and set it to "AI Assistant"
from langchain.prompts.prompt import PromptTemplate

template = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
{history}
Human: {input}
AI Assistant:"""
PROMPT = PromptTemplate(
    input_variables=["history", "input"], template=template
)
conversation = ConversationChain(
    prompt=PROMPT,
    llm=llm, 
    verbose=True, 
    memory=ConversationBufferMemory(ai_prefix="AI Assistant")
)
conversation.predict(input="Hi there!")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi there!
AI Assistant:

> Finished ConversationChain chain.
" Hi there! It's nice to meet you. How can I help you today?"
conversation.predict(input="What's the weather?")
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi there!
AI Assistant:  Hi there! It's nice to meet you. How can I help you today?
Human: What's the weather?
AI Assistant:

> Finished ConversationChain chain.

Output:

' The current weather is sunny and warm with a temperature of 75 degrees Fahrenheit. The forecast for the rest of the day is sunny with a high of 78 degrees and a low of 65 degrees.'
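
To confirm that the override is reflected in what the memory stores, you can read the buffer back directly from the chain. A minimal sketch (the expected shape is paraphrased, not verbatim model output):

# The stored transcript now uses the "AI Assistant" label instead of the default "AI".
print(conversation.memory.load_memory_variables({})["history"])
# Human: Hi there!
# AI Assistant: ...
# Human: What's the weather?
# AI Assistant: ...
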
Human Prefix

The second approach is to change the human prefix in the conversation summary. By default this is set to "Human", but we can set it to anything we want. As before, if we change the prefix we should also change the prompt used in the chain to reflect the new name. The example below walks through the process.

# Now we can override it and set it to "Friend"
from langchain.prompts.prompt import PromptTemplate

template = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
{history}
Friend: {input}
AI:"""
PROMPT = PromptTemplate(
    input_variables=["history", "input"], template=template
)
conversation = ConversationChain(
    prompt=PROMPT,
    llm=llm, 
    verbose=True, 
    memory=ConversationBufferMemory(human_prefix="Friend")
)
conversation.predict(input="Hi there!")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Friend: Hi there!
AI:

> Finished ConversationChain chain.
" Hi there! It's nice to meet you. How can I help you today?"
conversation.predict(input="What's the weather?")
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Friend: Hi there!
AI:  Hi there! It's nice to meet you. How can I help you today?
Friend: What's the weather?
AI:

> Finished ConversationChain chain.

Output:

' The weather right now is sunny and warm with a temperature of 75 degrees Fahrenheit. The forecast for the rest of the day is mostly sunny with a high of 82 degrees.'
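
The two overrides can also be combined; just keep the role labels in the prompt consistent with the prefixes configured on the memory. A minimal sketch with made-up strings:

# Customize both speaker labels on the same buffer.
memory = ConversationBufferMemory(human_prefix="Friend", ai_prefix="AI Assistant")
memory.save_context({"input": "Hi there!"}, {"output": "Hello, how can I help?"})
print(memory.load_memory_variables({})["history"])
# Friend: Hi there!
# AI Assistant: Hello, how can I help?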

Creating a Custom Memory Class

Although LangChain provides several predefined memory types, it is quite likely that you will want to add your own memory type tailored to your application. In this section we add a custom memory type to a ConversationChain. To add a custom memory class, we need to import the base memory class and subclass it.

from langchain import OpenAI, ConversationChain
from langchain.schema import BaseMemory
from pydantic import BaseModel
from typing import List, Dict, Any

In this example we write a custom memory class that uses spaCy to extract entities and stores information about them in a simple hash table. During the conversation we then look at the input text, extract any entities, and put whatever we know about them into the context. Note that this implementation is fairly simple and brittle and probably not that useful in production; its purpose is to show that we can add a custom memory implementation. To get started, we first need to install spaCy:

# !pip install spacy
# !python -m spacy download en_core_web_lg
import spacy
nlp = spacy.load('en_core_web_lg')
class SpacyEntityMemory(BaseMemory, BaseModel):
    """Memory class for storing information about entities."""

    # Define dictionary to store information about entities.
    entities: dict = {}
    # Define key to pass information about entities into prompt.
    memory_key: str = "entities"
        
    def clear(self):
        self.entities = {}

    @property
    def memory_variables(self) -> List[str]:
        """Define the variables we are providing to the prompt."""
        return [self.memory_key]

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        """Load the memory variables, in this case the entity key."""
        # Get the input text and run through spacy
        doc = nlp(inputs[list(inputs.keys())[0]])
        # Extract known information about entities, if they exist.
        entities = [self.entities[str(ent)] for ent in doc.ents if str(ent) in self.entities]
        # Return combined information about entities to put into context.
        return {self.memory_key: "\n".join(entities)}

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        """Save context from this conversation to buffer."""
        # Get the input text and run through spacy
        text = inputs[list(inputs.keys())[0]]
        doc = nlp(text)
        # For each entity that was mentioned, save this information to the dictionary.
        for ent in doc.ents:
            ent_str = str(ent)
            if ent_str in self.entities:
                self.entities[ent_str] += f"\n{text}"
            else:
                self.entities[ent_str] = text

We now define a prompt that takes in information about entities as well as the user's input:

from langchain.prompts.prompt import PromptTemplate

template = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know. You are provided with information about entities the Human mentions, if relevant.

Relevant entity information:
{entities}

Conversation:
Human: {input}
AI:"""
prompt = PromptTemplate(
    input_variables=["entities", "input"], template=template
)
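
To see where the memory's entities variable lands relative to the user's input, you can format the template by hand. A small sketch (both strings below are made up for illustration):

# Render the prompt with hypothetical values for its two input variables.
print(prompt.format(
    entities="Harrison likes machine learning",
    input="What does Harrison enjoy?",
))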

Now we put it all together:

llm = OpenAI(temperature=0)
conversation = ConversationChain(llm=llm, prompt=prompt, verbose=True, memory=SpacyEntityMemory())

In the first example, with no prior knowledge about Harrison, the "Relevant entity information" section is empty:

conversation.predict(input="Harrison likes machine learning")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know. You are provided with information about entities the Human mentions, if relevant.

Relevant entity information:


Conversation:
Human: Harrison likes machine learning
AI:

> Finished ConversationChain chain.

Output:

" That's great to hear! Machine learning is a fascinating field of study. It involves using algorithms to analyze data and make predictions. Have you ever studied machine learning, Harrison?"

Now, in the second example, we can see that it pulls in the stored information about Harrison.

conversation.predict(input="What do you think Harrison's favorite subject in college was?")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know. You are provided with information about entities the Human mentions, if relevant.

Relevant entity information:
Harrison likes machine learning

Conversation:
Human: What do you think Harrison's favorite subject in college was?
AI:

> Finished ConversationChain chain.

Output:

' From what I know about Harrison, I believe his favorite subject in college was machine learning. He has expressed a strong interest in the subject and has mentioned it often.'
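
Because the class keeps everything in a plain dict, you can also peek at what it has accumulated so far through the entities attribute defined above. A rough sketch (the exact contents depend on which strings spaCy tags as entities):

# Every input sentence in which spacy detected "Harrison" has been appended to his entry.
print(conversation.memory.entities)
# Roughly: {'Harrison': 'Harrison likes machine learning\n...'}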

Again, this implementation is fairly simple and error-prone and probably not of much use in a real production environment, but it demonstrates that we can add a custom memory implementation.

