1.1 RunnableLambda
RunnableLambda wraps a Python callable as a Runnable, so the function can be used in both synchronous and asynchronous contexts.
Example:
```python
from langchain_core.runnables import RunnableLambda

# the plain dict is automatically coerced into a RunnableParallel when piped into a Runnable
chain = {
    "text1": lambda x: x + " world",
    "text2": lambda x: x + ", how are you",
} | RunnableLambda(lambda x: len(x["text1"]) + len(x["text2"]))
result = chain.invoke("hello")
print(result)  # 29
```
RunnableLambda can also be applied as a decorator:
```python
from langchain_core.runnables import RunnableLambda

@RunnableLambda
def total_len(x):
    return len(x["text1"]) + len(x["text2"])

chain = {
    "text1": lambda x: x + " world",
    "text2": lambda x: x + ", how are you",
} | total_len
result = chain.invoke("hello")
print(result)  # 29
```
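RunnableLambda also implements the asynchronous side of the Runnable interface, so an async function can be wrapped and called with ainvoke. A minimal sketch, assuming a made-up coroutine add_suffix:

```python
import asyncio

from langchain_core.runnables import RunnableLambda

# hypothetical coroutine for illustration; RunnableLambda also accepts async callables
async def add_suffix(x: str) -> str:
    return x + " world"

chain = RunnableLambda(add_suffix)
result = asyncio.run(chain.ainvoke("hello"))
print(result)  # hello world
```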
1.2 RunnablePassthrough
RunnablePassthrough takes its input and returns it unchanged. It is the "no-op node" of LangChain's LCEL: it passes input through a pipeline or preserves context, and it can also add keys to the output.
Example: preserving the original input
```python
from langchain_core.runnables import RunnablePassthrough, RunnableParallel

chain = RunnableParallel(
    original=RunnablePassthrough(),  # keep the original input
    word_count=lambda x: len(x),
)
result = chain.invoke("hello world")
print(result)  # {'original': 'hello world', 'word_count': 11}
```
Example: adding a key to the output with assign()
```python
from langchain_core.runnables import RunnablePassthrough

chain = {
    "text1": lambda x: x + " world",
    "text2": lambda x: x + ", how are you",
} | RunnablePassthrough.assign(word_count=lambda x: len(x["text1"] + x["text2"]))
result = chain.invoke("hello")
print(result)
# {'text1': 'hello world', 'text2': 'hello, how are you', 'word_count': 29}
```
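A common use of RunnablePassthrough is to keep the original user input while other branches compute derived values, for example when filling a prompt. A minimal sketch, assuming a made-up fake_retriever function standing in for a real retriever:

```python
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnablePassthrough

def fake_retriever(question: str) -> str:
    # toy stand-in for a real retriever
    return "LCEL composes Runnables with the | operator"

prompt = PromptTemplate.from_template(
    "Answer using the context.\nContext: {context}\nQuestion: {question}"
)
chain = {
    "context": fake_retriever,
    "question": RunnablePassthrough(),  # forward the original question unchanged
} | prompt
print(chain.invoke("What is LCEL?").to_string())
```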
1.3 RunnableBranch
RunnableBranch is initialized with a list of (condition, Runnable) pairs plus a default branch. On invocation it evaluates the conditions in order, picks the first one that returns True, and runs the corresponding Runnable on the input. If no condition is True, the default branch is run on the input.
Example:
```python
from langchain_core.runnables import RunnableBranch

branch = RunnableBranch(
    (lambda x: isinstance(x, str), lambda x: x.upper()),
    (lambda x: isinstance(x, int), lambda x: x + 1),
    (lambda x: isinstance(x, float), lambda x: x * 2),
    lambda x: "goodbye",  # default branch
)
result = branch.invoke("hello")
print(result)  # HELLO
result = branch.invoke(None)
print(result)  # goodbye
```
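RunnableBranch is itself a Runnable, so standard methods such as batch also work on it; running batch over mixed inputs makes the dispatch order easy to see (a small illustration reusing the branch defined above):

```python
# each element goes to the first branch whose condition matches it
print(branch.batch(["hi", 1, 2.0, None]))  # ['HI', 2, 4.0, 'goodbye']
```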
1.4 RunnableWithFallbacks
RunnableWithFallbacks lets a Runnable fall back to one or more other Runnables when it fails. The simplest way to create one is to call the with_fallbacks method directly on a Runnable.
Example:
```python
import os

from langchain.chat_models import init_chat_model
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

llm = init_chat_model(
    model="openai/gpt-oss-20b:free",
    model_provider="openai",
    base_url="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)
chain = PromptTemplate.from_template("hello") | llm
chain_with_fallback = chain.with_fallbacks([RunnableLambda(lambda x: "sorry")])
# the prompt template has no variables to fill, so invoking it with a plain string
# raises an error and the chain falls back to the lambda
result = chain_with_fallback.invoke("1")
print(result)  # sorry
```
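The same mechanism works without an LLM at all. A minimal self-contained sketch, assuming the made-up functions flaky and backup:

```python
from langchain_core.runnables import RunnableLambda

def flaky(x: str) -> str:
    # made-up primary step that always fails
    raise ValueError("primary step failed")

def backup(x: str) -> str:
    # made-up fallback step; it receives the same input as the failed step
    return f"fallback handled: {x}"

chain = RunnableLambda(flaky).with_fallbacks([RunnableLambda(backup)])
print(chain.invoke("hello"))  # fallback handled: hello
```

with_fallbacks also accepts an exceptions_to_handle tuple if only specific exception types should trigger the fallback.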