Formatting Outputs for ChatPrompt Templates (Part 2)

https://python.langchain.com.cn/docs/modules/model_io/prompts/prompt_templates/format_output

This guide explains how to use the format methods of a ChatPromptTemplate (in LangChain) to get outputs in three useful formats. All examples share the same core task: creating a prompt that translates English to French. Code and outputs are kept exactly as in the original source.

Key Background First

Before diving into formats, remember: A ChatPrompt typically includes a SystemMessage (tells the AI its role) and a HumanMessage (the user's input). For our examples, the ChatPrompt is set up to translate text from an input_language to an output_language (we'll use English → French).

1. Output as a String

The simplest format: a plain text string that combines the system message and human message. There are two equivalent ways to get this.

Method 1: Use chat_prompt.format()

This directly returns the prompt as a string.

Code (From Original Source)
# Assume `chat_prompt` is already set up for translation (English → French)
output = chat_prompt.format(
    input_language="English", 
    output_language="French", 
    text="I love programming."
)
print(output)
Output (From Original Source)
System: You are a helpful assistant that translates English to French.
Human: I love programming.

Method 2: Use chat_prompt.format_prompt().to_string()

This is a two-step way to get the same string. First, format_prompt() creates a ChatPromptValue (see Section 2), then to_string() converts it to text.

Code (From Original Source)
output_2 = chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
).to_string()

# Check if both outputs are identical (they will be!)
assert output == output_2  # No error means they match

What This Means

Both methods give you a readable text string. Use this if you want to quickly check or share the prompt content.
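Conceptually, the string output is just the two templates filled in and joined with role labels. A plain-Python toy illustration for intuition (this is not LangChain code or its internals):

```python
# Toy illustration of what the string format contains (not LangChain internals).
system_template = (
    "You are a helpful assistant that translates "
    "{input_language} to {output_language}."
)
human_template = "{text}"

def format_as_string(input_language: str, output_language: str, text: str) -> str:
    """Fill in both templates and join them with role labels."""
    system = system_template.format(
        input_language=input_language, output_language=output_language
    )
    human = human_template.format(text=text)
    return f"System: {system}\nHuman: {human}"

print(format_as_string("English", "French", "I love programming."))
# System: You are a helpful assistant that translates English to French.
# Human: I love programming.
```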

2. Output as a ChatPromptValue

ChatPromptValue is a special LangChain object that stores the full prompt as structured messages. It's not just text: it keeps track of each message's type (system vs. human).

Code (From Original Source)

chat_prompt_value = chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
)
print(chat_prompt_value)
Output (From Original Source)
ChatPromptValue(messages=[
    SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}), 
    HumanMessage(content='I love programming.', additional_kwargs={})
])

What This Means

  • ChatPromptValue has a messages attribute that holds a list of message objects (here: SystemMessage and HumanMessage).
  • Use this if you need to work with the prompt as a structured object (not just text) in LangChain workflows.
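To build intuition for how one object backs all three formats, here is a heavily simplified mock of the ChatPromptValue interface (plain Python only; this is a toy stand-in, not LangChain's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class SystemMessage:
    content: str

@dataclass
class HumanMessage:
    content: str

@dataclass
class MockChatPromptValue:
    """Simplified stand-in for LangChain's ChatPromptValue."""
    messages: list

    def to_string(self) -> str:
        # Render each message with a role label, as in Section 1.
        def role(m):
            return "System" if isinstance(m, SystemMessage) else "Human"
        return "\n".join(f"{role(m)}: {m.content}" for m in self.messages)

    def to_messages(self) -> list:
        # The structured form: just the underlying message objects.
        return self.messages

value = MockChatPromptValue(messages=[
    SystemMessage("You are a helpful assistant that translates English to French."),
    HumanMessage("I love programming."),
])
print(value.to_string())
```

The real ChatPromptValue works the same way in spirit: one structured object, with to_string() and to_messages() as two views of it.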

3. Output as a List of Message Objects

You can convert the ChatPromptValue into a list of SystemMessage and HumanMessage objects. This list is ready to pass directly to Chat models (e.g., ChatOpenAI), since models accept message objects as input.

Code (From Original Source)

message_list = chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
).to_messages()
print(message_list)
Output (From Original Source)
[
    SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}), 
    HumanMessage(content='I love programming.', additional_kwargs={})
]

What This Means

  • The list contains actual LangChain message objects (not just text).
  • This is the most useful format for running the prompt with a Chat model: you can pass message_list directly to the model, for example by calling the model on the list or via message-accepting methods such as predict_messages or generate.

Quick Summary of All 3 Formats

| Format Type | How to Get It | Use Case |
| --- | --- | --- |
| String | chat_prompt.format(...) or format_prompt().to_string() | Quick checks / sharing prompt text |
| ChatPromptValue | chat_prompt.format_prompt(...) | Working with structured prompt objects |
| List of message objects | format_prompt().to_messages() | Passing input directly to a Chat model |

All code, outputs, and logic match the original source, with no extra changes or additions.
