Formatting Outputs for ChatPrompt Templates (Part 2)

https://python.langchain.com.cn/docs/modules/model_io/prompts/prompt_templates/format_output

This guide explains how to use the format methods of a ChatPromptTemplate (in LangChain) to get outputs in three useful formats. All examples use the same core task: creating a prompt that translates English to French. Code and outputs are kept exactly as in the original source, with no changes.

Key Background First

Before diving into formats, remember: a chat prompt template typically combines a SystemMessage (which tells the AI its role) with a HumanMessage (the user's input). For our examples, chat_prompt is set up to translate text from an input_language to an output_language (we'll use English → French).
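
The original source does not show this setup itself, so here is a minimal sketch of how such a chat_prompt could be built, assuming the classic ChatPromptTemplate classes from langchain.prompts.chat (the exact template wording is an assumption, chosen to match the outputs shown below):

# Minimal setup sketch (assumed; not shown in the original source).
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

system_template = (
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
human_template = "{text}"

chat_prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template(system_template),
    HumanMessagePromptTemplate.from_template(human_template),
])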

1. Output as a String

The simplest format: a plain text string that combines the system message and human message. There are two equivalent ways to get this.

Method 1: Use chat_prompt.format()

This directly returns the prompt as a string.

Code (From Original Source)
# Assume `chat_prompt` is already set up for translation (English → French)
output = chat_prompt.format(
    input_language="English", 
    output_language="French", 
    text="I love programming."
)
print(output)
Output (From Original Source)
System: You are a helpful assistant that translates English to French.
Human: I love programming.

Method 2: Use chat_prompt.format_prompt().to_string()

This is a two-step way to get the same string. First, format_prompt() creates a ChatPromptValue (see Section 2), then to_string() converts it to text.

Code (From Original Source)
output_2 = chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
).to_string()

# Check if both outputs are identical (they will be!)
assert output == output_2  # No error means they match

What This Means

Both methods give you a readable text string. Use this if you want to quickly check or share the prompt content.

2. Output as a ChatPromptValue

ChatPromptValue is a special LangChain object that stores the full prompt (with messages). It's not just text: it keeps track of the message types (system vs. human).

Code (From Original Source)

chat_prompt_value = chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
)
print(chat_prompt_value)
Output (From Original Source)
ChatPromptValue(messages=[
    SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}), 
    HumanMessage(content='I love programming.', additional_kwargs={})
])

What This Means

  • ChatPromptValue has a messages attribute that holds a list of message objects (here: SystemMessage and HumanMessage).
  • Use this if you need to work with the prompt as a structured object (not just text) in LangChain workflows; a short sketch of inspecting and converting it follows below.
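
As a small illustrative sketch (not from the original source), reusing the chat_prompt_value created above:

# Illustrative sketch: inspect the structured ChatPromptValue from above.
# `.messages` holds the actual message objects, in order.
for message in chat_prompt_value.messages:
    print(type(message).__name__, "->", message.content)

# The same object can still be converted on demand:
same_string = chat_prompt_value.to_string()      # the string from Section 1
same_messages = chat_prompt_value.to_messages()  # the list shown in Section 3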

3. Output as a List of Message Objects

You can convert the ChatPromptValue into a list of SystemMessage and HumanMessage objects. This list is ready to pass directly to Chat models (e.g., ChatOpenAI), since models accept message objects as input.

Code (From Original Source)

message_list = chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
).to_messages()
print(message_list)
Output (From Original Source)
[
    SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}), 
    HumanMessage(content='I love programming.', additional_kwargs={})
]

What This Means

  • The list contains actual LangChain message objects (not just text).
  • This is the most useful format for running the prompt with a Chat model: you can pass message_list straight to the model, as shown in the sketch below.
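
As a rough sketch of that last step (assuming ChatOpenAI from langchain.chat_models and an OpenAI API key in the environment; this is not part of the original source):

# Hedged sketch: requires the `langchain` and `openai` packages and an
# OPENAI_API_KEY environment variable; parameters are illustrative.
from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(temperature=0)

message_list = chat_prompt.format_prompt(
    input_language="English",
    output_language="French",
    text="I love programming.",
).to_messages()

# Chat models accept a list of message objects and return an AIMessage.
response = chat(message_list)
print(response.content)  # a French translation of the input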

Quick Summary of All 3 Formats

  • String: get it with chat_prompt.format(...) or format_prompt(...).to_string(); useful for quick checks or sharing the prompt text.
  • ChatPromptValue: get it with chat_prompt.format_prompt(...); useful for working with structured prompt objects.
  • List of Message Objects: get it with format_prompt(...).to_messages(); useful for passing input directly to a Chat model.

All code, outputs, and logic match the original source, with no extra changes or additions.
