Formatting Outputs for ChatPromptTemplate (Part 1)

https://python.langchain.com.cn/docs/modules/model_io/prompts/prompt_templates/format_output

The chat_prompt variable in LangChain is built by combining message templates (system messages, human messages, etc.) into a structured ChatPromptTemplate. Let's break down how it's constructed, using the exact example from the original source (translating English to French).

Step 1: Import Required Tools

First, import the necessary classes from LangChain to create chat prompts:

python
from langchain.prompts.chat import (
    ChatPromptTemplate,          # To combine message templates
    SystemMessagePromptTemplate, # For system messages (AI's role)
    HumanMessagePromptTemplate   # For human/user messages (input)
)

Step 2: Define Message Templates

A chat_prompt typically includes two key parts:

  • A system message: Tells the AI its role/instructions.
  • A human message: The user's input (with placeholders for dynamic content).

Create the System Message Template

This defines the AI's task (e.g., "translate English to French"):

python
# Template string for the system message
system_template = "You are a helpful assistant that translates {input_language} to {output_language}."

# Convert the string to a SystemMessagePromptTemplate
system_message_prompt = SystemMessagePromptTemplate.from_template(system_template)
  • {input_language} and {output_language} are placeholders (we'll fill them later).
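
To see what a single message template produces, here is a minimal sketch (reusing the values from the translation example): calling format() on the message template should return a SystemMessage with the placeholders filled in.

python
# Illustrative sketch: fill the system template's placeholders on its own
formatted_system = system_message_prompt.format(
    input_language="English", output_language="French"
)
print(formatted_system.content)
# Expected (roughly): You are a helpful assistant that translates English to French.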

Create the Human Message Template

This defines the user's input (the text to translate):

python
# Template string for the human message
human_template = "{text}"  # {text} is a placeholder for the user's text

# Convert the string to a HumanMessagePromptTemplate
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
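
The human template works the same way; a quick sketch, reusing the sample text from the source:

python
# Illustrative sketch: the human template simply wraps the raw user text
formatted_human = human_message_prompt.format(text="I love programming.")
print(formatted_human.content)  # Expected (roughly): I love programming.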

Step 3: Combine Templates into chat_prompt

Use ChatPromptTemplate.from_messages() to merge the system and human message templates into a single chat_prompt:

python
# Combine the two message templates into a ChatPromptTemplate
chat_prompt = ChatPromptTemplate.from_messages([
    system_message_prompt,  # First: system instructions
    human_message_prompt    # Second: user input
])
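
As a quick sanity check (a sketch, not part of the original example), the combined template should expose the placeholders from both message templates as its input variables:

python
# Illustrative sketch: the combined prompt collects placeholders from both templates
print(chat_prompt.input_variables)
# Expected (roughly): ['input_language', 'output_language', 'text']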

Final Result: What chat_prompt Contains

The chat_prompt variable now holds a structured prompt that:

  1. Includes the system's role (translation task).
  2. Includes a placeholder for the user's text.
  3. Can be filled with actual values (e.g., input_language="English", text="I love programming") later using .format() or .format_prompt() (see the sketch at the end of this post).

This structure follows the original source exactly, with no changes to the code or logic. The chat_prompt is simply a container that combines message templates to guide the AI's behavior.
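
To close the loop, here is a minimal sketch of filling the placeholders with the English-to-French example. format_prompt() returns a prompt value that can be rendered either as a list of chat messages (for chat models) or as a single string:

python
# Illustrative sketch: fill the placeholders with concrete values
messages = chat_prompt.format_prompt(
    input_language="English",
    output_language="French",
    text="I love programming.",
).to_messages()

print(messages)
# Expected (roughly):
# [SystemMessage(content='You are a helpful assistant that translates English to French.'),
#  HumanMessage(content='I love programming.')]

# Alternatively, format() renders the same prompt as a single string
text_prompt = chat_prompt.format(
    input_language="English",
    output_language="French",
    text="I love programming.",
)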
