Formatting Outputs for ChatPrompt Templates (one)

https://python.langchain.com.cn/docs/modules/model_io/prompts/prompt_templates/format_output

The chat_prompt variable in LangChain is built by combining message templates (system messages, human messages, etc.) into a structured ChatPromptTemplate. Let's break down how it's constructed, using the exact example from the original source (translating English to French).

Step 1: Import Required Tools

First, import the necessary classes from LangChain to create chat prompts:

python
from langchain.prompts.chat import (
    ChatPromptTemplate,          # To combine message templates
    SystemMessagePromptTemplate, # For system messages (AI's role)
    HumanMessagePromptTemplate   # For human/user messages (input)
)

Step 2: Define Message Templates

A chat_prompt typically includes two key parts:

  • A system message: Tells the AI its role/instructions.
  • A human message: The user's input (with placeholders for dynamic content).

Create the System Message Template

This defines the AI's task (e.g., "translate English to French"):

python
# Template string for the system message
system_template = "You are a helpful assistant that translates {input_language} to {output_language}."

# Convert the string to a SystemMessagePromptTemplate
system_message_prompt = SystemMessagePromptTemplate.from_template(system_template)
  • {input_language} and {output_language} are placeholders (we'll fill them later).

Create the Human Message Template

This defines the user's input (the text to translate):

python
# Template string for the human message
human_template = "{text}"  # {text} is a placeholder for the user's text

# Convert the string to a HumanMessagePromptTemplate
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
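
Before combining them, either template can be formatted on its own to see what it produces. The snippet below is a minimal sketch that continues from the variables defined above; the example values ("English", "French", "I love programming.") are only illustrative:

python
# Formatting a single message template returns one message object
system_message = system_message_prompt.format(
    input_language="English", output_language="French"
)
# -> SystemMessage(content="You are a helpful assistant that translates English to French.")

human_message = human_message_prompt.format(text="I love programming.")
# -> HumanMessage(content="I love programming.")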

Step 3: Combine Templates into chat_prompt

Use ChatPromptTemplate.from_messages() to merge the system and human message templates into a single chat_prompt:

python
# Combine the two message templates into a ChatPromptTemplate
chat_prompt = ChatPromptTemplate.from_messages([
    system_message_prompt,  # First: system instructions
    human_message_prompt    # Second: user input
])
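
As a quick check (a hedged sketch continuing from the code above), you can inspect which placeholders the combined template expects; the exact ordering of the list may vary between LangChain versions:

python
# The combined template tracks every placeholder from its message templates
print(chat_prompt.input_variables)
# e.g. ['input_language', 'output_language', 'text']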

Final Result: What chat_prompt Contains

The chat_prompt variable now holds a structured prompt that:

  1. Includes the system's role (translation task).
  2. Includes a placeholder for the user's text.
  3. Can be filled with actual values (e.g., input_language="English", text="I love programming") later using .format() or .format_prompt(), as shown in the sketch below.

This structure matches the original source, with no changes to the code or logic. The chat_prompt is simply a container that combines message templates to guide the AI's behavior.
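
To make the formatting step concrete, here is a minimal sketch of filling in the placeholders. The values (English, French, "I love programming.") follow the translation example above; the printed output shown in the comments is indicative and may differ slightly between LangChain versions:

python
# Fill the placeholders and format the prompt into a list of chat messages
messages = chat_prompt.format_prompt(
    input_language="English",
    output_language="French",
    text="I love programming."
).to_messages()

print(messages)
# [SystemMessage(content='You are a helpful assistant that translates English to French.'),
#  HumanMessage(content='I love programming.')]

# .format() renders the same prompt as a single string instead of message objects
print(chat_prompt.format(
    input_language="English",
    output_language="French",
    text="I love programming."
))

The list returned by .to_messages() is what you would typically pass on to a chat model.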
