https://python.langchain.com.cn/docs/modules/model_io/prompts/prompt_templates/format_output
The chat_prompt variable in LangChain is built by combining message templates (system messages, human messages, etc.) into a structured ChatPromptTemplate. Let's break down how it's constructed, using the exact example from the original source (translating English to French).
Step 1: Import Required Tools
First, import the necessary classes from LangChain to create chat prompts:
python
from langchain.prompts.chat import (
    ChatPromptTemplate,           # To combine message templates
    SystemMessagePromptTemplate,  # For system messages (AI's role)
    HumanMessagePromptTemplate,   # For human/user messages (input)
)
Step 2: Define Message Templates
A chat_prompt typically includes two key parts:
- A system message: Tells the AI its role/instructions.
- A human message: The user's input (with placeholders for dynamic content).
Create the System Message Template
This defines the AI's task (e.g., "translate English to French"):
python
# Template string for the system message
system_template = "You are a helpful assistant that translates {input_language} to {output_language}."
# Convert the string to a SystemMessagePromptTemplate
system_message_prompt = SystemMessagePromptTemplate.from_template(system_template)
{input_language} and {output_language} are placeholders (we'll fill them later).
Create the Human Message Template
This defines the user's input (the text to translate):
python
# Template string for the human message
human_template = "{text}" # {text} is a placeholder for the user's text
# Convert the string to a HumanMessagePromptTemplate
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
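To sanity-check these templates on their own (a minimal sketch, assuming the standard LangChain message-template API), each one can be formatted individually; .format() fills the placeholders and returns a concrete message object:
python
# Format the system template alone -> returns a SystemMessage
system_message = system_message_prompt.format(
    input_language="English", output_language="French"
)
print(system_message.content)
# "You are a helpful assistant that translates English to French."

# Format the human template alone -> returns a HumanMessage
human_message = human_message_prompt.format(text="I love programming.")
print(human_message.content)
# "I love programming."
This check isn't required to build chat_prompt; it just shows what each template produces once its placeholders are filled.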
Step 3: Combine Templates into chat_prompt
Use ChatPromptTemplate.from_messages() to merge the system and human message templates into a single chat_prompt:
python
# Combine the two message templates into a ChatPromptTemplate
chat_prompt = ChatPromptTemplate.from_messages([
    system_message_prompt,  # First: system instructions
    human_message_prompt,   # Second: user input
])
Final Result: What chat_prompt Contains
The chat_prompt variable now holds a structured prompt that:
- Includes the system's role (translation task).
- Includes a placeholder for the user's text.
- Can be filled with actual values (e.g., input_language="English", text="I love programming") later using .format() or .format_prompt(), as sketched below.
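Here is a minimal sketch of that last step, using the same example values as the source; format_prompt() fills the placeholders and to_messages() converts the result into the list of messages a chat model consumes:
python
# Fill the placeholders and convert to chat messages
messages = chat_prompt.format_prompt(
    input_language="English",
    output_language="French",
    text="I love programming."
).to_messages()

print(messages)
# [SystemMessage(content='You are a helpful assistant that translates English to French.'),
#  HumanMessage(content='I love programming.')]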
This exact structure matches the original source, with no changes to the code or logic. The chat_prompt is simply a container that combines message templates to guide the AI's behavior.