Usage of ChatPromptTemplate and AI Message

Usage of ChatPromptTemplate

Usage 1:

python
from langchain_core.prompts import ChatPromptTemplate

# Build a single-message template and fill in the {topic} variable.
prompt = ChatPromptTemplate.from_template("tell me the weather of {topic}")
text = prompt.format(topic="shenzhen")  # format() returns the rendered string
print(text)

This prints:

bash
Human: tell me the weather of shenzhen
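
The same template can also be rendered into message objects instead of a string. A minimal sketch using the standard format_messages() method:

python
# format() returns the string shown above; format_messages() returns the
# underlying message objects, showing that from_template() builds a single
# Human message.
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("tell me the weather of {topic}")
print(prompt.format_messages(topic="shenzhen"))  # a list with one HumanMessage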

Finally, use it together with an LLM:

python
import ChatGLM  # local wrapper exposing the ChatGLM model as a LangChain LLM
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("who is {name}")
# Optionally inspect the rendered prompt first:
# print(prompt.format(name="Bill Gates"))

llm = ChatGLM.ChatGLM_LLM()
output_parser = StrOutputParser()

# Compose prompt -> model -> parser with the LCEL pipe operator.
chain05 = prompt | llm | output_parser
print(chain05.invoke({"name": "Bill Gates"}))
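
Since the composed chain is itself a Runnable, it can also handle several inputs in one call. A minimal sketch reusing chain05 and the ChatGLM wrapper from the block above (the second name is just an illustrative input):

python
# batch() is available on any Runnable; results come back in input order.
answers = chain05.batch([{"name": "Bill Gates"}, {"name": "Elon Musk"}])
for answer in answers:
    print(answer)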

Usage 2:

python
import ChatGLM  # local wrapper exposing the ChatGLM model as a LangChain LLM
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

# A multi-message template: system, human, and ai messages, with {name} and
# {user_input} filled in at invoke time.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI bot. Your name is {name}."),
    ("human", "Hello, how are you doing?"),
    ("ai", "I'm doing well, thanks!"),
    ("human", "{user_input}"),
])

llm = ChatGLM.ChatGLM_LLM()
output_parser = StrOutputParser()
chain05 = prompt | llm | output_parser
print(chain05.invoke({"name": "Bob", "user_input": "What is your name"}))
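
The ("human", ...) and ("ai", ...) tuples correspond to HumanMessage and AIMessage objects, which can also be passed to from_messages directly. A minimal sketch (literal message objects are kept as-is, while the templated parts stay as tuples):

python
from langchain_core.messages import AIMessage, HumanMessage
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI bot. Your name is {name}."),
    HumanMessage(content="Hello, how are you doing?"),
    AIMessage(content="I'm doing well, thanks!"),
    ("human", "{user_input}"),
])

print(prompt.format_messages(name="Bob", user_input="What is your name"))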

It can also be written like this:

python
import ChatGLM  # local wrapper exposing the ChatGLM model as a LangChain LLM
from langchain_core.prompts import ChatPromptTemplate

llm = ChatGLM.ChatGLM_LLM()

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI bot. Your name is {name}."),
    ("human", "Hello, how are you doing?"),
    ("ai", "I'm doing well, thanks!"),
    ("human", "{user_input}"),
])

# format_prompt() takes the variables as keyword arguments (not a dict) and
# returns a ChatPromptValue that can be passed to the model directly.
a = prompt.format_prompt(name="Bob", user_input="What is your name")
print(a)
print(llm.invoke(a))
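
format_prompt() returns a ChatPromptValue rather than a plain string. A minimal sketch, reusing the prompt defined above, of how that value can be inspected before it reaches the model:

python
# A ChatPromptValue can be rendered as one string or as a list of messages;
# the message list is what a chat model ultimately receives.
value = prompt.format_prompt(name="Bob", user_input="What is your name")
print(value.to_string())    # rendered as "System: ... Human: ..." text
print(value.to_messages())  # rendered as a list of message objects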

References: https://python.langchain.com/docs/modules/model_io/prompts/quick_start

https://python.langchain.com/docs/modules/model_io/prompts/composition
