https://python.langchain.com.cn/docs/modules/model_io/output_parsers/comma_separated
Comma-Separated List Output Parser in LangChain
This content is based on LangChain's official documentation (langchain.com.cn) and explains the `CommaSeparatedListOutputParser`, a tool that converts LLM outputs into comma-separated lists, in simplified terms. It preserves the original source code, retains all knowledge points, and avoids arbitrary additions or modifications.
1. What is CommaSeparatedListOutputParser?
This output parser converts an unstructured LLM response into a clean Python list of items.
- Use case: When you need the LLM to return a list of items (e.g., ice cream flavors, book titles) and want to directly use the result as a Python list (no manual string splitting).
- Key feature: It provides built-in `format_instructions` that guide the LLM to output comma-separated items, ensuring the parser can parse the result correctly.
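Under the hood, the parsing step is simple: strip the model's text and split it on commas. The plain-Python sketch below mimics that behavior without requiring LangChain (the function name `parse_csv_list` is ours, not part of the library):

```python
def parse_csv_list(text: str) -> list[str]:
    # Mimic the parser: strip the response, split on commas,
    # and trim whitespace around each item.
    return [item.strip() for item in text.strip().split(",")]

print(parse_csv_list("Vanilla, Chocolate, Strawberry"))
# ['Vanilla', 'Chocolate', 'Strawberry']
```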
2. Step 1: Import Required Modules
The code below imports all necessary classes---exactly as in the original documentation:
```python
from langchain.output_parsers import CommaSeparatedListOutputParser
from langchain.prompts import PromptTemplate
from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI  # Included as in the original import (not used in this example)
```
3. Step 2: Initialize the Output Parser
Create an instance of CommaSeparatedListOutputParser and get its format instructions (guidelines for the LLM to follow):
```python
output_parser = CommaSeparatedListOutputParser()
format_instructions = output_parser.get_format_instructions()  # Tells the LLM to output comma-separated items
```
Note: The `format_instructions` automatically generated by the parser typically read: "Your response should be a list of comma-separated values. Do not include any additional text."
4. Step 3: Create a Prompt Template
Define a prompt template that includes the LLM task and the format instructions. This ensures the LLM outputs items in a comma-separated format:
```python
prompt = PromptTemplate(
    template="List five {subject}.\n{format_instructions}",
    input_variables=["subject"],  # Dynamic input (e.g., "ice cream flavors")
    partial_variables={"format_instructions": format_instructions},  # Fixed format guidelines
)
```
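To see what the LLM actually receives, the template substitution can be sketched with plain `str.format` (the instruction text below is the typical wording quoted above, hard-coded here as an assumption):

```python
# Assumed wording of the parser's format instructions (see note above).
format_instructions = (
    "Your response should be a list of comma-separated values. "
    "Do not include any additional text."
)
template = "List five {subject}.\n{format_instructions}"

# Equivalent of prompt.format(subject="ice cream flavors")
final_prompt = template.format(
    subject="ice cream flavors",
    format_instructions=format_instructions,
)
print(final_prompt)
```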
5. Step 4: Initialize the LLM and Generate Output
Use OpenAI (with temperature=0 for consistent results) to generate a response based on the formatted prompt:
```python
model = OpenAI(temperature=0)
_input = prompt.format(subject="ice cream flavors")  # Fill in the dynamic "subject"
output = model(_input)  # LLM returns comma-separated items as a single string
```
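If no OpenAI API key is at hand, the model call can be simulated with a stub that returns a canned comma-separated reply (the `fake_model` function and its reply are our stand-ins, not part of LangChain):

```python
def fake_model(prompt: str) -> str:
    # Stand-in for OpenAI(temperature=0); a real call needs an API key.
    return ("Vanilla, Chocolate, Strawberry, "
            "Mint Chocolate Chip, Cookies and Cream")

_input = ("List five ice cream flavors.\n"
          "Your response should be a list of comma-separated values.")
output = fake_model(_input)
print(output)
```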
6. Step 5: Parse the LLM Output into a Python List
Use the output parser to convert the LLM's string output into a structured Python list. The original code and output are preserved exactly:
Code:
```python
output_parser.parse(output)
```
Output (exact as original):
```python
['Vanilla',
 'Chocolate',
 'Strawberry',
 'Mint Chocolate Chip',
 'Cookies and Cream']
```
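Because an LLM may not always follow the format instructions exactly, a quick sanity check on the parsed list is cheap insurance. The checks below are our addition, not part of the original example:

```python
flavors = ['Vanilla', 'Chocolate', 'Strawberry',
           'Mint Chocolate Chip', 'Cookies and Cream']

# Basic validation of the parsed result before using it downstream.
if len(flavors) != 5:
    raise ValueError(f"expected 5 items, got {len(flavors)}")
if not all(flavors):
    raise ValueError("parsed list contains an empty item")
print("parsed list passed basic checks")
```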
Key Takeaways
- `CommaSeparatedListOutputParser` simplifies converting LLM text outputs into usable Python lists.
- `get_format_instructions()` ensures the LLM follows the correct output format (comma-separated items).
- The prompt template combines the task (e.g., "List five ice cream flavors") and the format guidelines for reliability.
- Works with both LLMs (e.g., `OpenAI`) and chat models (e.g., `ChatOpenAI`); the core logic remains the same.