TypeError Errors When Building a LangChain Application

While working through some of the examples on the official LangChain site, the code failed at runtime when run as-is. The example code is as follows:

python
from langchain.prompts import ChatPromptTemplate
from langchain_community.chat_models import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = StrOutputParser()

chain = prompt | model | output_parser

chain.invoke({"topic": "ice cream"})

Error 1: TypeError: Expected a Runnable, callable or dict.Instead got an unsupported type: <class 'langchain_core.output_parsers.string.StrOutputParser'>

The full traceback:

bash
Traceback (most recent call last):
  File "~/PycharmProjects/LangChain/main.py", line 14, in <module>
    chain = prompt | model | output_parser
  File "~/opt/anaconda3/envs/langchain/lib/python3.8/site-packages/langchain/schema/runnable/base.py", line 1165, in __or__
    last=coerce_to_runnable(other),
  File "~/opt/anaconda3/envs/langchain/lib/python3.8/site-packages/langchain/schema/runnable/base.py", line 2774, in coerce_to_runnable
    raise TypeError(
TypeError: Expected a Runnable, callable or dict.Instead got an unsupported type: <class 'langchain_core.output_parsers.string.StrOutputParser'>

This error occurs because the chain does not accept this output parser: the Runnable coercion in the installed langchain package (see langchain/schema/runnable/base.py in the traceback above) does not recognize the StrOutputParser imported from langchain_core as a Runnable. Instead of that StrOutputParser, we can implement our own parser on top of BaseOutputParser, like this:

python
class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Parse the output of an LLM call."""

        return text.strip()
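
As an alternative, if the installed langchain release still ships its own StrOutputParser under langchain.schema.output_parser, importing that version-matched class instead of the langchain_core one may also resolve the mismatch. A minimal sketch; the module path is an assumption about the installed version, so check that it exists before relying on it:

python
# Assumption: older langchain releases bundle their own StrOutputParser here,
# which plugs into the same Runnable pipeline used by `prompt | model | ...`.
from langchain.schema.output_parser import StrOutputParser

output_parser = StrOutputParser()  # drop-in replacement for the langchain_core import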

The updated code is:

python
import os
from langchain.prompts import ChatPromptTemplate
from langchain_community.chat_models import ChatOpenAI
from langchain.schema import BaseOutputParser

os.environ["OPENAI_API_KEY"] = "xxx"

class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Parse the output of an LLM call."""

        return text.strip()


prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = CommaSeparatedListOutputParser()

chain = prompt | model | output_parser

chain.invoke({"topic": "ice cream"})

Running this still fails, with a second error.

Error 2: TypeError: Got unknown type ('messages', [HumanMessage(content='tell me a short joke about ice cream')])

The cause here is importing ChatOpenAI from the wrong package: the ChatOpenAI from langchain_community.chat_models is built on the langchain_core message classes and does not understand the prompt value produced by the older langchain ChatPromptTemplate. Changing the original import from langchain_community.chat_models import ChatOpenAI to from langchain.chat_models import ChatOpenAI fixes it. With both of the above changes applied, the code runs correctly.
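
Both errors boil down to mixing classes from different LangChain packages, so before choosing import paths it can help to check which package versions are actually installed. A small sketch using only the standard library (Python 3.8+):

python
# Print the installed versions of the LangChain packages involved,
# to spot mismatches between langchain, langchain-core and langchain-community.
from importlib import metadata

for pkg in ("langchain", "langchain-core", "langchain-community"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "is not installed")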

The complete, working code:

python
import os
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langchain.schema import BaseOutputParser

os.environ["OPENAI_API_KEY"] = "xxx"


class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Parse the output of an LLM call."""

        return text.strip()


prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = CommaSeparatedListOutputParser()

chain = prompt | model | output_parser

res = chain.invoke({"topic": "ice cream"})
print(res)

Output:

bash
Why did the ice cream go to therapy?
Because it had too many toppings and couldn't keep its sprinkles together!
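
One side note: despite its name, the CommaSeparatedListOutputParser used above only returns the stripped text. If you want it to actually produce a list of comma-separated values, as the class name suggests, a minimal variant could look like the sketch below (the splitting logic is my own addition, not part of the code above):

python
from langchain.schema import BaseOutputParser


class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call into a list of comma-separated values."""

    def parse(self, text: str):
        # e.g. "a, b, c" -> ["a", "b", "c"]
        return [item.strip() for item in text.strip().split(",")]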