Learn to Connect OpenClaw to Feishu in One Article
1. Background
OpenClaw is a "digital agent" that runs on your own device: it can not only do research and brainstorming through chat, but also take over the computer and actually carry out tasks. Its core value lies in combining a large language model's understanding with the local system's ability to execute, enabling automated operation. Enough preamble; let's get straight to hatching the lobster.
2. Hatching
The lobster can be installed on any Linux, macOS, or Windows machine, and the steps are roughly the same everywhere. The walkthrough below uses Linux Ubuntu 24.04 on a box with [>= 2 CPU cores, >= 4 GB RAM, >= 40 GB disk].
Because OpenClaw is a terminal application built on Node, you must set up a Node environment first (node >= 22.12.0). Here we work as a non-root user and install Node through the nvm version manager.
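Before hatching anything, it may be worth confirming the box actually meets those minimums. A quick sketch using standard Linux tools (nproc, free, GNU df), with the thresholds taken from the spec above:

```shell
# Compare the machine against the stated minimums: 2 cores, 4 GB RAM, 40 GB disk.
cores=$(nproc)
mem_mb=$(free -m | awk '/^Mem:/{print $2}')
disk_gb=$(df -BG --output=avail / | tail -1 | tr -dc '0-9')

[ "$cores" -ge 2 ]     && echo "CPU ok ($cores cores)"   || echo "CPU below spec"
[ "$mem_mb" -ge 4096 ] && echo "RAM ok (${mem_mb} MB)"   || echo "RAM below spec"
[ "$disk_gb" -ge 40 ]  && echo "Disk ok (${disk_gb} GB)" || echo "Disk below spec"
```

The `--output=avail` flag is GNU-coreutils-specific; on macOS/BSD, read the "Avail" column of plain `df -g /` instead.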
2-1. Speed up git
Shell
# Method 1: set a mirror URL
git config --global url."https://gitclone.com/".insteadOf https://
# To undo the mirror setting:
# git config --global --unset url."https://gitclone.com/".insteadOf
2-2. Install nvm
Run the installer:
shell
git clone https://github.com/nvm-sh/nvm.git
cd nvm
./install.sh
Configure environment variables by appending the following to ~/.bashrc:
properties
NVM_NODEJS_ORG_MIRROR=https://npmmirror.com/mirrors/node
NVM_NPM_MIRROR=https://npmmirror.com/mirrors/npm
Apply the changes:
shell
source ~/.bashrc
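After sourcing, you can sanity-check that the two mirror variables are visible in the current shell (values as set above):

```shell
# Both lines should print the npmmirror URLs appended to ~/.bashrc.
echo "NVM_NODEJS_ORG_MIRROR=${NVM_NODEJS_ORG_MIRROR:-unset}"
echo "NVM_NPM_MIRROR=${NVM_NPM_MIRROR:-unset}"
```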
2-3. Install Node
shell
nvm install 22
nvm use 22
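Since OpenClaw requires node >= 22.12.0, it can be worth double-checking that the active version meets that floor. A sketch using `sort -V` for the version comparison (assumes `node -v` prints vX.Y.Z):

```shell
# OpenClaw needs node >= 22.12.0; compare the active version against that floor.
min="22.12.0"
cur="$(node -v | tr -d v)"   # e.g. "22.22.1"
# sort -V orders version strings; if min sorts first (or equal), cur satisfies the floor.
if [ "$(printf '%s\n' "$min" "$cur" | sort -V | head -1)" = "$min" ]; then
  echo "node $cur meets the minimum"
else
  echo "node $cur is older than $min; run: nvm install 22" >&2
fi
```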
2-4. Install OpenClaw
shell
npm --registry=https://registry.npmmirror.com install openclaw -g
Once the command finishes, the gateway is already up and running. This step takes a while, usually 5-30 minutes, and llama.cpp needs at least 4 GB of memory during it; if the machine doesn't have enough RAM, configure swap space.
shell
$ openclaw -V
OpenClaw 2026.3.8 (3caab92)
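If the machine falls short of the 4 GB mentioned above, the swap route can be sketched like this. The script only prints the commands rather than running them, since fallocate/mkswap/swapon need root; /swapfile is an assumed path:

```shell
# Compute how much swap would cover the ~4 GB the install needs.
mem_mb=$(free -m | awk '/^Mem:/{print $2}')
need_mb=4096
if [ "$mem_mb" -ge "$need_mb" ]; then
  echo "RAM is sufficient (${mem_mb} MB); no swap needed"
else
  swap_mb=$((need_mb - mem_mb))
  echo "Short by ${swap_mb} MB; as root, run:"
  echo "  fallocate -l ${swap_mb}M /swapfile"
  echo "  chmod 600 /swapfile"
  echo "  mkswap /swapfile"
  echo "  swapon /swapfile"
fi
```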
2-5. Configure a model
The model providers usable in mainland China are listed below (list reposted from the linked source). First pick one platform and obtain an API key.
① Generic command template
(1/4) Connect a model via one of the two protocols
On the chosen provider's site, buy a plan or usage pack, obtain your API_KEY, and put it into the "apiKey" field of the command below.
OpenAI protocol
shell
openclaw config set 'models.providers.{provider_name}' --json '{
"baseUrl": "https://{baseurl}",
"apiKey": "${YOUR_API_KEY}",
"api": "openai-completions",
"models": [
{ "id": "{model_id}", "name": "{model_name}" }
]
}'
Anthropic protocol
shell
openclaw config set 'models.providers.{provider_name}' --json '{
"baseUrl": "https://{baseurl}",
"apiKey": "${YOUR_API_KEY}",
"api": "anthropic-messages",
"models": [
{ "id": "{model_id}", "name": "{model_name}" },
{ "id": "{model2_id}", "name": "{model2_name}" }
]
}'
(2/4) Set models.mode to merge
shell
openclaw config set models.mode merge
(3/4) Set the default model
shell
openclaw models set {provider_name}/{model_id}
(4/4) Restart the gateway
shell
openclaw gateway restart
② DeepSeek
shell
openclaw config set 'models.providers.deepseek' --json '{
"baseUrl": "https://api.deepseek.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "deepseek-chat", "name": "DeepSeek Chat" },
{ "id": "deepseek-reasoner", "name": "DeepSeek Reasoner" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set deepseek/deepseek-chat
shell
openclaw gateway restart
③ Tencent Cloud
shell
openclaw config set 'models.providers.qcloudlkeap' --json '{
"baseUrl": "https://api.lkeap.cloud.tencent.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "deepseek-r1", "name": "DeepSeek R1" },
{ "id": "deepseek-r1-0528", "name": "DeepSeek R1 0528" },
{ "id": "deepseek-v3", "name": "DeepSeek V3" },
{ "id": "deepseek-v3-0324", "name": "DeepSeek V3 0324" },
{ "id": "deepseek-v3.1", "name": "DeepSeek V3.1" },
{ "id": "deepseek-v3.1-terminus", "name": "DeepSeek V3.1 Terminus" },
{ "id": "deepseek-v3.2", "name": "DeepSeek V3.2" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set qcloudlkeap/deepseek-v3.2
shell
openclaw gateway restart
④ Tencent Hunyuan
shell
openclaw config set 'models.providers.hunyuan' --json '{
"baseUrl": "https://api.hunyuan.cloud.tencent.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "hunyuan-2.0-thinking-20251109", "name": "混元 2.0 Thinking" },
{ "id": "hunyuan-2.0-instruct-20251111", "name": "混元 2.0 Instruct" },
{ "id": "hunyuan-t1-latest", "name": "混元 T1 Latest" },
{ "id": "hunyuan-t1-vision", "name": "混元 T1 Vision" },
{ "id": "hunyuan-turbos-latest", "name": "混元 TurboS Latest" },
{ "id": "hunyuan-turbo-latest", "name": "混元 Turbo Latest" },
{ "id": "hunyuan-turbo", "name": "混元 Turbo" },
{ "id": "hunyuan-turbo-vision", "name": "混元 Turbo Vision" },
{ "id": "hunyuan-pro", "name": "混元 Pro" },
{ "id": "hunyuan-large", "name": "混元 Large" },
{ "id": "hunyuan-large-longcontext", "name": "混元 Large 长上下文" },
{ "id": "hunyuan-large-vision", "name": "混元 Large Vision" },
{ "id": "hunyuan-standard", "name": "混元 Standard" },
{ "id": "hunyuan-standard-256k", "name": "混元 Standard 256K" },
{ "id": "hunyuan-lite", "name": "混元 Lite" },
{ "id": "hunyuan-vision", "name": "混元 Vision" },
{ "id": "hunyuan-code", "name": "混元 Code" },
{ "id": "hunyuan-role-latest", "name": "混元 Role" },
{ "id": "hunyuan-functioncall", "name": "混元 FunctionCall" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set hunyuan/hunyuan-2.0-instruct-20251111
shell
openclaw gateway restart
⑤ MiniMax
shell
openclaw config set 'models.providers.minimax' --json '{
"baseUrl": "https://api.minimaxi.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "MiniMax-M2.5", "name": "MiniMax M2.5" },
{ "id": "MiniMax-M2.5-highspeed", "name": "MiniMax M2.5 极速版" },
{ "id": "MiniMax-M2.1", "name": "MiniMax M2.1" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set minimax/MiniMax-M2.5
shell
openclaw gateway restart
⑥ Tongyi Qianwen (Qwen)
shell
openclaw config set 'models.providers.qwen' --json '{
"baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "qwen-max", "name": "Qwen Max" },
{ "id": "qwen-plus", "name": "Qwen Plus" },
{ "id": "qwen-turbo", "name": "Qwen Turbo" },
{ "id": "qwen3-coder-plus", "name": "Qwen3 Coder Plus" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set qwen/qwen-max
shell
openclaw gateway restart
⑦ Zhipu AI
shell
openclaw config set 'models.providers.zhipu' --json '{
"baseUrl": "https://open.bigmodel.cn/api/paas/v4",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "glm-4-plus", "name": "GLM-4 Plus" },
{ "id": "glm-4-0520", "name": "GLM-4 0520" },
{ "id": "glm-4-air", "name": "GLM-4 Air" },
{ "id": "glm-4-airx", "name": "GLM-4 AirX" },
{ "id": "glm-4-long", "name": "GLM-4 Long" },
{ "id": "glm-4-flash", "name": "GLM-4 Flash" },
{ "id": "glm-4-flashx", "name": "GLM-4 FlashX" },
{ "id": "glm-4v-plus", "name": "GLM-4V Plus" },
{ "id": "glm-4v", "name": "GLM-4V" },
{ "id": "codegeex-4", "name": "CodeGeeX 4" },
{ "id": "charglm-3", "name": "CharGLM 3" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set zhipu/glm-4-plus
shell
openclaw gateway restart
⑧ Moonshot
shell
openclaw config set 'models.providers.moonshot' --json '{
"baseUrl": "https://api.moonshot.cn/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "moonshot-v1-8k", "name": "Moonshot V1 8K" },
{ "id": "moonshot-v1-32k", "name": "Moonshot V1 32K" },
{ "id": "moonshot-v1-128k", "name": "Moonshot V1 128K" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set moonshot/moonshot-v1-128k
shell
openclaw gateway restart
⑨ Baidu ERNIE
shell
openclaw config set 'models.providers.baidu' --json '{
"baseUrl": "https://qianfan.baidubce.com/v2",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "ernie-4.0-8k", "name": "ERNIE 4.0 8K" },
{ "id": "ernie-4.0-turbo-8k", "name": "ERNIE 4.0 Turbo 8K" },
{ "id": "ernie-3.5-8k", "name": "ERNIE 3.5 8K" },
{ "id": "ernie-speed-128k", "name": "ERNIE Speed 128K" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set baidu/ernie-4.0-8k
shell
openclaw gateway restart
⑩ ByteDance Doubao
shell
openclaw config set 'models.providers.doubao' --json '{
"baseUrl": "https://ark.cn-beijing.volces.com/api/v3",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "doubao-pro-32k", "name": "豆包 Pro 32K" },
{ "id": "doubao-pro-128k", "name": "豆包 Pro 128K" },
{ "id": "doubao-lite-32k", "name": "豆包 Lite 32K" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set doubao/doubao-pro-32k
shell
openclaw gateway restart
⑪ iFlytek Spark
shell
openclaw config set 'models.providers.spark' --json '{
"baseUrl": "https://spark-api-open.xf-yun.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "spark-4.0-ultra", "name": "星火 4.0 Ultra" },
{ "id": "spark-max", "name": "星火 Max" },
{ "id": "spark-pro", "name": "星火 Pro" },
{ "id": "spark-lite", "name": "星火 Lite" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set spark/spark-4.0-ultra
shell
openclaw gateway restart
⑫ Baichuan AI
shell
openclaw config set 'models.providers.baichuan' --json '{
"baseUrl": "https://api.baichuan-ai.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "Baichuan4", "name": "百川4" },
{ "id": "Baichuan3-Turbo", "name": "百川3 Turbo" },
{ "id": "Baichuan3-Turbo-128k", "name": "百川3 Turbo 128K" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set baichuan/Baichuan4
shell
openclaw gateway restart
⑬ StepFun
shell
openclaw config set 'models.providers.stepfun' --json '{
"baseUrl": "https://api.stepfun.com/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "step-2-16k", "name": "Step 2 16K" },
{ "id": "step-1-128k", "name": "Step 1 128K" },
{ "id": "step-1-32k", "name": "Step 1 32K" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set stepfun/step-2-16k
shell
openclaw gateway restart
⑭ SiliconFlow
shell
openclaw config set 'models.providers.siliconflow' --json '{
"baseUrl": "https://api.siliconflow.cn/v1",
"apiKey": "",
"api": "openai-completions",
"models": [
{ "id": "deepseek-ai/DeepSeek-V3", "name": "DeepSeek V3" },
{ "id": "Qwen/Qwen2.5-72B-Instruct", "name": "Qwen2.5 72B" },
{ "id": "meta-llama/Llama-3.3-70B-Instruct", "name": "Llama 3.3 70B" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set siliconflow/deepseek-ai/DeepSeek-V3
shell
openclaw gateway restart
⑮ Alibaba Bailian
shell
openclaw config set 'models.providers.bailian' --json '{
"baseUrl": "https://coding.dashscope.aliyuncs.com/apps/anthropic",
"apiKey": "${YOUR_API_KEY}",
"api": "anthropic-messages",
"models": [
{ "id": "qwen3.5-plus", "name": "千问3.5+(支持图片理解)" },
{ "id": "kimi-k2.5", "name": "Kimi2.5(支持图片理解)" },
{ "id": "glm-5", "name": "GLM5" },
{ "id": "MiniMax-M2.5", "name": "MiniMax2.5" }
]
}'
shell
openclaw config set models.mode merge
shell
openclaw models set bailian/qwen3.5-plus
shell
openclaw gateway restart
2-6. Test that the model works
Open a terminal with Ctrl+Alt+T.
shell
# Enter the following command
openclaw tui
# Then type a question in the input box, e.g. "What model are you?". If the reply comes back without errors and the model name matches expectations, the model is successfully connected.
Then press Ctrl+C to exit the interactive session.
2-7. Configure the Feishu bot
Log in to the Feishu Open Platform (https://open.feishu.cn/app), click 企业自建应用 (custom enterprise app), fill in the app name, description, and other details, then click 创建 (Create).
On the 凭据与基础信息 (Credentials & Basic Info) page, note the App ID and App Secret.
On the 权限管理 (Permissions) page, click 批量导入/导出权限 (batch import/export permissions) and paste in the JSON below to grant all the scopes in one step.
json
{
  "scopes": {
    "tenant": [
      "im:chat.members:read",
      "contact:contact.base:readonly",
      "contact:user.base:readonly",
      "corehr:file:download",
      "docs:document.media:download",
      "docx:document",
      "docx:document.block:convert",
      "docx:document:create",
      "docx:document:readonly",
      "docx:document:write_only",
      "drive:drive",
      "drive:drive.metadata:readonly",
      "drive:drive.search:readonly",
      "drive:file",
      "drive:file.like:readonly",
      "drive:file.meta.sec_label.read_only",
      "drive:file:download",
      "drive:file:readonly",
      "drive:file:upload",
      "drive:file:view_record:readonly",
      "im:chat",
      "im:chat.access_event.bot_p2p_chat:read",
      "im:chat:read",
      "im:chat:update",
      "im:message",
      "im:message.group_at_msg:readonly",
      "im:message.p2p_msg:readonly",
      "im:message:send_as_bot",
      "im:resource",
      "space:document:delete",
      "space:document:move",
      "space:document:retrieve",
      "space:folder:create",
      "speech_to_text:speech"
    ],
    "user": []
  }
}
That wraps up this stage of the Feishu bot configuration.
2-8. Install the Feishu plugin
The feishu plugin that ships with openclaw by default triggers a duplicate-plugin warning, like this:
shell
$ openclaw gateway status
Config warnings:\n- plugins.entries.feishu: plugin feishu: duplicate plugin id detected; later plugin may be overridden (/home/eddie/.openclaw/extensions/feishu/index.ts)
Config warnings:\n- plugins.entries.feishu: plugin feishu: duplicate plugin id detected; later plugin may be overridden (/home/eddie/.openclaw/extensions/feishu/index.ts)
🦞 OpenClaw 2026.3.8 (3caab92) --- Greetings, Professor Falken
...
If you don't want to see this warning, clean out either one of the two locations below; if you plan to switch to a different version, clean out both:
shell
# <node>
$ which node
/home/eddie/.nvm/versions/node/v22.22.1/bin/node
$ cd /home/eddie/.nvm/versions/node/v22.22.1/
$ cd lib/node_modules/openclaw/extensions/
$ rm -rf feishu/
# <env>
$ cd ~/.openclaw/extensions/
$ rm -rf feishu/
I cleaned out both locations and then reinstalled the plugin. A plugin normally installs under env, but whether it lives under node or env, the OpenClaw-Feishu integration works either way. For the latest plugin installation guide, see https://github.com/m1heng/clawdbot-feishu.
shell
# With the git mirror configured earlier, the install is quick, about 5 minutes
openclaw plugins install @m1heng-clawd/feishu
# Restart the gateway
openclaw gateway restart
2-9. Configure the channel
shell
# Add the Feishu channel
openclaw channels add
Use the following options:
shell
◆ Configure chat channels now?
│ ● Yes / ○ No
◆ Select a channel
│ ○ Telegram (Bot API)
│ ○ XXX
│ ● Feishu/Lark (飞书) (needs app creds)
│ ...
◆ Feishu account
│ ● default (primary)
│ ○ Add a new account
◆ Enter Feishu App ID
│ cli_a92xxxxxxxxxxxxx # from the 凭据与基础信息 (Credentials & Basic Info) page
◆ Enter Feishu App Secret
│ kOxxxxxxxxxxxxxxxxxxxxxxxxxxM3C█ # from the 凭据与基础信息 (Credentials & Basic Info) page
◆ Which Feishu domain?
│ ● Feishu (feishu.cn) - China
│ ○ Lark (larksuite.com) - International
◆ Group chat policy
│ ○ Allowlist - only respond in specific groups
│ ● Open - respond in all groups (requires mention)
│ ○ Disabled - don't respond in groups
◆ Select a channel
│ ...
│ ○ Slack (Socket Mode)
│ ● Finished (Done)
◆ Configure DM access policies now? (default: pairing)
│ Yes
◆ Feishu DM policy
│ ○ Pairing (recommended)
│ ○ Allowlist (specific users only)
│ ● Open (public inbound DMs)
│ ○ Disabled (ignore DMs)
◆ Add display names for these accounts? (optional)
│ ○ Yes / ● No
◆ Bind configured channel accounts to agents now?
│ ● Yes / ○ No
◆ Route feishu account "default" to agent
│ ● main (default)
│
└ Channels updated.
At this point, the Feishu section of ~/.openclaw/openclaw.json looks like this:
json
{
...
"channels": {
"feishu": {
"enabled": true,
"appId": "cli_a92xxxxxxxxxxxxx",
"appSecret": "kOxxxxxxxxxxxxxxxxxxxxxxxxxxM3C",
"domain": "feishu",
"groupPolicy": "open",
"dmPolicy": "open",
"allowFrom": ["*"]
}
},
...
}
Apply the configuration:
shell
# Restart the gateway
openclaw gateway restart
2-10. Finish configuring the Feishu bot
Log in to the Feishu Open Platform (https://open.feishu.cn/app) and find the app you created earlier.
On the 事件与回调 (Events & Callbacks) page, go to 事件配置 > 订阅方式 and choose 长连接 (long connection) as the subscription method.
On the same page, go to 事件配置 > 已添加事件 and add the following four events:
txt
Bot added to a group        im.chat.member.bot.added_v1
Bot removed from a group    im.chat.member.bot.deleted_v1
Message read                im.message.message_read_v1
Message received            im.message.receive_v1
On the 版本管理与发布 (Version Management & Release) page, click 创建版本 (Create Version), fill in the following, then go through 保存 (Save) -> 审核 (Review) -> 发布 (Release).
txt
App version      1.0.0
Release notes    v1
Availability     (choose who can use the app)
The bot itself is now fully configured.
Next, open the Feishu app, go to More > Workplace > Favorites > Add, add the bot you created to your workplace, then click its icon in the workplace to open a conversation and check that the model replies normally.
Because the device running the gateway must pair with the Feishu platform on first contact, the bot will reply with a pairing code. Approve it as follows, after which conversations with the model should work normally.
shell
# List pairing requests from the Feishu channel
openclaw pairing list feishu
# Approve a pairing
openclaw pairing approve feishu <CODE>
# Reject a pairing
openclaw pairing reject feishu <CODE>
2-11. Multi-agent Feishu integration
The point of enabling this mode is specialization for efficiency, plus making sensible, full use of each model. For example, a model that writes summaries doesn't need to call an image-understanding model: not that it couldn't, it's just too expensive.
Following section 2-7, create three more Feishu bot apps. Note that the steps from 2-10 can only be done after the gateway has initiated its WebSocket connection to the Feishu platform.
① Create the roles
shell
# List the default agents
openclaw agents list
# 1. Team lead
openclaw agents add leader \
--model deepseek/deepseek-chat \
--workspace ~/.openclaw/workspace-leader
openclaw agents set-identity --agent leader --name "项目经理" --emoji "👔"
# 2. Developer
openclaw agents add developer \
--model minimax/MiniMax-M2.5 \
--workspace ~/.openclaw/workspace-developer
openclaw agents set-identity --agent developer --name "全栈开发" --emoji "🔧"
# 3. Test & maintenance
openclaw agents add maintainer \
--model bailian/qwen3.5-plus \
--workspace ~/.openclaw/workspace-maintainer
openclaw agents set-identity --agent maintainer --name "测试交付" --emoji "🛡️"
# Verify the agents were created (success = 4 agents listed)
openclaw agents list
# openclaw agents delete leader
② Define each role's capabilities
For each agent, three core files in its workspace, SOUL.md, AGENTS.md, and USER.md, determine its behavior and expertise.
For a remote host, you can connect with Visual Studio Code's "Remote - SSH" (microsoft.com) extension over SSH to edit these files quickly.
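If you prefer to scaffold the three workspaces from the shell first, here is a sketch; WS_ROOT defaults to ~/.openclaw, and the file names match the three core files above, created empty for you to fill in:

```shell
# WS_ROOT is overridable for testing; defaults to the real OpenClaw home.
WS_ROOT="${WS_ROOT:-$HOME/.openclaw}"
for role in leader developer maintainer; do
  dir="$WS_ROOT/workspace-$role"
  mkdir -p "$dir"
  # Create each core file empty if it doesn't already exist.
  for f in SOUL.md AGENTS.md USER.md; do
    [ -f "$dir/$f" ] || : > "$dir/$f"
  done
done
ls "$WS_ROOT"
```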
Project manager:
~/.openclaw/workspace-leader/SOUL.md
markdown
# SOUL.md: Project Helmsman
## Identity
You are the AI team's "project steward". Your core job is "take requirements, break down tasks, track progress". You do not write code or run tests yourself; you focus on pooling resources, coordinating people, and keeping the project ship from capsizing.
## Core abilities
Requirement deep-dives: dissect user needs precisely, surface hidden pain points, and turn vague ideas into an actionable feature list;
Task breakdown: decompose complex projects into atomic tasks, schedule them by priority, and give every task a clear owner (dev/test/design);
Progress tracking: monitor project status in real time with daily stand-ups, Gantt charts, and burndown charts; spot risks and intervene early;
Resource scheduling: shift people dynamically based on urgency and workload, calling in external expert agents when necessary;
Stakeholder communication: report progress upward, relay goals downward, manage user expectations outward, and keep information transparent and in sync.
## Ground rules
Never do the developer's coding or the tester's button-clicking for them; be a good "human-shaped project management tool";
Be sensitive to requirement changes without overreacting; every change must have its impact assessed and the affected parties notified;
Sync project health with the user at least once a week, proactively collect feedback, and keep improving the delivery process.
## Diplomacy
You may use the built-in session_send method to communicate with the models on the a2a "allow" list and find a suitable model to assist.
Developer:
~/.openclaw/workspace-developer/SOUL.md
markdown
# SOUL.md: Code Bricklayer
## Identity
You are the AI team's "technical implementer". Your core job is "take requirements, write code, deliver results": turning design blueprints into running systems, focusing on hard technical problems, and letting the code do the talking.
## Core abilities
Technology selection: evaluate stacks against project needs and choose the most suitable language, framework, and tools, balancing performance and development speed;
Feature implementation: split requirements into modules, write clear, maintainable, standards-compliant code, and self-test to make sure the basics work;
Problem diagnosis: locate the root cause of a bug quickly using debuggers, log analysis, and search engines, then fix it precisely;
Performance optimization: tune execution speed and resource usage so the system stays smooth under heavy load;
Documentation: write the necessary technical docs, API notes, and deployment manuals to pave the way for maintenance.
## Ground rules
Absolutely no spaghetti code; keep things modular, loosely coupled, and readable;
When you hit a technical wall or a requirement is unclear, @ the project manager immediately; never grind away alone in silence;
Run a smoke test before every delivery to make sure the main flow works; no handing off and walking away.
## Diplomacy
You may use the built-in session_send method to communicate with the models on the a2a "allow" list and find a suitable model to assist.
Test & maintenance:
~/.openclaw/workspace-maintainer/SOUL.md
markdown
# SOUL.md: Quality Gatekeeper (test & maintenance agent)
## Identity
You are the AI team's "last line of defense". Your core job is "find faults, verify deliveries, guard production": digging out latent defects, safeguarding system quality, and charging in whenever production misbehaves.
## Core abilities
Test design: design comprehensive test cases from the requirements, covering functionality, boundaries, exceptions, performance, security, and more;
Test automation: build and maintain automated test scripts (unit/integration/E2E) to speed up regression and serve as a quality gate;
Defect management: file precise bug reports with clear reproduction steps and expected vs. actual results, then drive fixes and verify them;
Monitoring and alerting: configure system monitoring (logs, performance, business metrics) to catch production anomalies early and respond fast;
Root-cause analysis: run postmortems on production incidents, find the true root cause, and put preventive measures in place so the same problem never recurs.
## Ground rules
Never let a suspicious spot slide, but distinguish "must fix" from "can defer" defects to avoid over-testing;
Respond to production issues within 15 minutes; stop the bleeding (restore service) first, then diagnose;
After each release, proactively collect user feedback and system logs, and keep refining the test cases to tighten the safety net.
## Diplomacy
You may use the built-in session_send method to communicate with the models on the a2a "allow" list and find a suitable model to assist.
③ Wire up the apps
Append a bindings section to ~/.openclaw/openclaw.json
- Move the appId and appSecret under channels > feishu into the accounts map; everything else stays unchanged
json
{
"models": ...
"channels": {
"feishu": {
"enabled": true,
"accounts": {
"default": {
"appId": "cli_xxxxxxxxxxxxxx",
"appSecret": "OTxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
},
"leader": {
"appId": "cli_xxxxxxxxxxxxxx",
"appSecret": "OTxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
},
"developer": {
"appId": "cli_xxxxxxxxxxxxxx",
"appSecret": "OTxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
},
"maintainer": {
"appId": "cli_xxxxxxxxxxxxxx",
"appSecret": "OTxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
}
},
"domain": "feishu",
"groupPolicy": "open",
"dmPolicy": "open",
"allowFrom": ["*"]
}
},
"plugins": ...
...
"bindings": [
{
"agentId": "leader",
"match": {
"channel": "feishu",
"accountId": "leader"
}
},
{
"agentId": "developer",
"match": {
"channel": "feishu",
"accountId": "developer"
}
},
{
"agentId": "maintainer",
"match": {
"channel": "feishu",
"accountId": "maintainer"
}
}
],
"tools": {
"agentToAgent": {
"enabled": true,
"allow": ["main", "leader", "developer", "maintainer"]
},
"sessions": {
"visibility": "all"
}
}
}
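Since this file is edited by hand, a syntax slip is easy to make, and validating it before restarting the gateway avoids a confusing failure. A sketch using Python's stdlib JSON parser (jq would work equally well if installed):

```shell
# Exits non-zero and prints a parse error if the JSON is malformed.
python3 -m json.tool ~/.openclaw/openclaw.json > /dev/null \
  && echo "openclaw.json is valid JSON" \
  || echo "openclaw.json has a syntax error" >&2
```

Note that the JSON fragments above use `...` placeholders; the real file must have them replaced with actual content before it will validate.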
Restart the gateway to load the new configuration:
shell
openclaw gateway restart
Then go back to the Feishu admin console and, following section 2-10, set up the long connection for each new app; after that, DM each bot in turn to make sure it is properly connected.
Next, pull all the bots into one group chat and @-mention each of them in turn with the following message:
txt
(me)> @是个人物 Get this Feishu group's ID; from now on, post every task you execute to this group so the humans can watch. Use the built-in session_send tool to ask every agent you can discover which model it runs, then report the summary back to me. Write both of these instructions into your MEMORY.md.
(me)> @AI Leader Get this Feishu group's ID; from now on, post every task you execute to this group so the humans can watch. Use the built-in session_send tool to ask every agent you can discover which model it runs, then report the summary back to me. Write both of these instructions into your MEMORY.md.
(me)> @AI Developer Get this Feishu group's ID; from now on, post every task you execute to this group so the humans can watch. Use the built-in session_send tool to ask every agent you can discover which model it runs, then report the summary back to me. Write both of these instructions into your MEMORY.md.
(me)> @AI Maintainer Get this Feishu group's ID; from now on, post every task you execute to this group so the humans can watch. Use the built-in session_send tool to ask every agent you can discover which model it runs, then report the summary back to me. Write both of these instructions into your MEMORY.md.
The AI team's org chart:
properties
是个人物 (main) --- top-level owner
↓ direct report
Leader --- project steward
↓ direct reports
Developer (code bricklayer) + Maintainer (quality gatekeeper)
3. Miscellaneous
1. ClawHub mirror acceleration
shell
openclaw config set clawhub.mirror "https://mirror.aliyun.com/clawhub/"
2. Download skills
shell
clawhub search "browser"
clawhub install browser-automation
3. Uninstall the lobster
shell
npm uninstall -g openclaw
rm -rf ~/.openclaw/