Usage of ChatPromptTemplate and AI Message

Usage of ChatPromptTemplate

Usage 1:

python
from langchain_core.prompts import ChatPromptTemplate

# Build a single-message template; {topic} is filled in when the prompt is formatted.
prompt = ChatPromptTemplate.from_template("tell me the weather of {topic}")
text = prompt.format(topic="shenzhen")
print(text)

This prints:

bash
Human: tell me the weather of shenzhen
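
The "Human:" prefix shows that the template renders as a chat message. If the actual message objects are needed rather than a flat string, format_messages can be used instead of format; a minimal sketch with the same template:

python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("tell me the weather of {topic}")

# format() returns one string ("Human: ..."), while format_messages()
# returns the underlying chat messages as a list.
messages = prompt.format_messages(topic="shenzhen")
print(messages)
# Roughly: [HumanMessage(content='tell me the weather of shenzhen')]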

Finally, using it together with an LLM:

python
import ChatGLM
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("who is {name}")
# text = prompt.format(name="Bill Gates")
# print(text)

# Compose prompt, model, and output parser into one runnable chain.
llm = ChatGLM.ChatGLM_LLM()
output_parser = StrOutputParser()
chain05 = prompt | llm | output_parser
print(chain05.invoke({"name": "Bill Gates"}))
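
The pipe syntax composes the three components into a single runnable sequence: the prompt formats the input dict into a prompt value, the LLM consumes that prompt value, and StrOutputParser turns the model output into a plain string. A sketch of the intermediate values, reusing prompt, llm, and output_parser from the example above:

python
# Each stage of prompt | llm | output_parser, invoked by hand.
prompt_value = prompt.invoke({"name": "Bill Gates"})  # a ChatPromptValue
print(prompt_value.to_messages())                     # [HumanMessage(content='who is Bill Gates')]

llm_output = llm.invoke(prompt_value)        # raw model output
answer = output_parser.invoke(llm_output)    # parsed into a plain string
print(answer)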

Usage 2:

python
import ChatGLM
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI bot. Your name is {name}."),
    ("human", "Hello, how are you doing?"),
    ("ai", "I'm doing well, thanks!"),
    ("human", "{user_input}"),
])

llm = ChatGLM.ChatGLM_LLM()
output_parser = StrOutputParser()
chain05 = prompt | llm | output_parser
print(chain05.invoke({"name": "Bob", "user_input": "What is your name"}))
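
Note that the ("system", ...), ("human", ...), and ("ai", ...) tuples are converted into SystemMessage, HumanMessage, and AIMessage objects when the template is formatted. A quick way to see this, reusing the prompt defined above:

python
# Inspect the messages the template produces; the ("ai", ...) entry becomes an AIMessage.
for message in prompt.format_messages(name="Bob", user_input="What is your name"):
    print(type(message).__name__, "->", message.content)
# Roughly:
# SystemMessage -> You are a helpful AI bot. Your name is Bob.
# HumanMessage -> Hello, how are you doing?
# AIMessage -> I'm doing well, thanks!
# HumanMessage -> What is your name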

Instead of building a chain, you can also format the prompt and call the LLM directly:

python
import ChatGLM
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

llm = ChatGLM.ChatGLM_LLM()

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI bot. Your name is {name}."),
    ("human", "Hello, how are you doing?"),
    ("ai", "I'm doing well, thanks!"),
    ("human", "{user_input}"),
])

# format_prompt takes keyword arguments (not a dict) and returns a ChatPromptValue,
# which can be passed straight to the LLM.
a = prompt.format_prompt(name="Bob", user_input="What is your name")
print(a)
print(llm.invoke(a))
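
Since the title also mentions AI Message: from_messages accepts explicit message objects and message prompt templates in place of (role, text) tuples. A minimal sketch of the same prompt built that way; the ChatGLM wrapper above would consume it in exactly the same manner:

python
from langchain_core.messages import AIMessage, HumanMessage
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

# Equivalent to the (role, text) tuples above, but with explicit message classes.
prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template("You are a helpful AI bot. Your name is {name}."),
    HumanMessage(content="Hello, how are you doing?"),
    AIMessage(content="I'm doing well, thanks!"),
    HumanMessagePromptTemplate.from_template("{user_input}"),
])

print(prompt.format_prompt(name="Bob", user_input="What is your name"))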

References: https://python.langchain.com/docs/modules/model_io/prompts/quick_start

https://python.langchain.com/docs/modules/model_io/prompts/composition
