Using a local llama.cpp server with OpenClaw

llama.cpp's built-in server exposes an OpenAI-compatible API, so it can naturally serve as a backend for OpenClaw.
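
For reference, a launch along these lines exposes the server's OpenAI-compatible endpoints on the address used in the config below (a minimal sketch; the GGUF filename, context size, and the --jinja flag are assumptions about my setup, adjust them to yours):

# Start llama-server so it is reachable at http://192.168.1.182:8087/v1
#   --host 0.0.0.0   bind to all interfaces so other machines on the LAN can connect
#   --port 8087      matches the baseUrl in the OpenClaw config below
#   -c 32768         context size to allocate (assumed value)
#   --jinja          use the model's built-in chat template (helpful for Qwen3)
llama-server -m ./Qwen3-8B-Q6_K.gguf --host 0.0.0.0 --port 8087 -c 32768 --jinja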

Adding the custom provider works the same way as before; see the earlier post on adding a custom provider to OpenClaw.

No matter how many times I tweaked it, I could not get the model status to come out right. The config currently looks like this:

{
  "meta": {
    "lastTouchedVersion": "2026.2.3-1",
    "lastTouchedAt": "2026-02-05T12:16:30.399Z"
  },
  "wizard": {
    "lastRunAt": "2026-01-30T12:20:58.674Z",
    "lastRunVersion": "2026.1.29",
    "lastRunCommand": "onboard",
    "lastRunMode": "local"
  },
  "models": {
    "mode": "merge",
    "providers": {
      "llamacpp": {
        "baseUrl": "http://192.168.1.182:8087/v1",
        "apiKey": "no need key",
        "api": "openai-completions",
        "models": [
          {
            "id": "Qwen3-8B-Q6_K",
            "name": "Qwen3",
            "api": "openai-completions",
            "reasoning": true,
            "input": [
              "text"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 262144,
            "maxTokens": 32000
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "llamacpp/Qwen3-8B-Q6_K"
      },
      "models": {
        "llamacpp/Qwen3-8B-Q6_K": {
          "alias": "Qwen3"
        }
      },
      "maxConcurrent": 4,
      "subagents": {
        "maxConcurrent": 8
      }
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "loopback",
    "auth": {
      "mode": "token",
      "token": "a08c51975f90e3afa566f4af1de977a70b6e9630909cc8c0",
      "password": "a08c51975f90e3afa566f4af1de977a70b6e9630909cc8c0"
    },
    "tailscale": {
      "mode": "off",
      "resetOnExit": false
    }
  },
  "skills": {
    "install": {
      "nodeManager": "npm"
    }
  }
}
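
Before touching the agent files, it is worth confirming that the server is actually reachable from the OpenClaw machine and that the model id matches exactly what the server reports (a quick sketch with curl; llama-server exposes the usual OpenAI-style model list plus a health endpoint):

# The "id" returned here should match the model id in the provider config
# ("Qwen3-8B-Q6_K"); a mismatch is one likely reason the model status stays wrong.
curl http://192.168.1.182:8087/v1/models

# Basic liveness check of the llama.cpp server.
curl http://192.168.1.182:8087/health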

Note that C:\Users\yusp7\.openclaw\agents\main\agent\models.json has to stay consistent with the providers defined under models → providers in the config above, and it must not contain duplicate provider names:

{
  "providers": {
    "llamacpp": {
      "baseUrl": "http://192.168.1.182:8087/v1",
      "apiKey": "no need key",
      "api": "openai-completions",
      "models": [
        {
          "id": "Qwen3-8B-Q6_K",
          "name": "Qwen3",
          "api": "openai-completions",
          "reasoning": true,
          "input": [
            "text"
          ],
          "cost": {
            "input": 0,
            "output": 0,
            "cacheRead": 0,
            "cacheWrite": 0
          },
          "contextWindow": 262144,
          "maxTokens": 32000
        }
      ]
    }
  }
}
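
A quick way to check the "no duplicate provider names" rule is to list the provider keys of both files side by side (a sketch assuming jq and a Unix-style shell; the main config path and filename here are assumptions, point them at your actual files):

# Provider names defined in the main config (models -> providers)...
jq -r '.models.providers | keys[]' ~/.openclaw/openclaw.json
# ...and in the agent-level models.json. The two lists should line up, and a
# provider such as "llamacpp" should not be defined twice with different content.
jq -r '.providers | keys[]' ~/.openclaw/agents/main/agent/models.json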

But even so, why are the chat responses that come back still wrong?
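
A useful way to narrow this down is to send a chat request straight to llama-server and compare the raw reply with what OpenClaw shows (a sketch with curl; the prompt and max_tokens are arbitrary):

# Direct request to the OpenAI-compatible chat endpoint, bypassing OpenClaw.
curl http://192.168.1.182:8087/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "Qwen3-8B-Q6_K",
        "messages": [{"role": "user", "content": "hello"}],
        "max_tokens": 128
      }'
# If this reply looks sane, the problem is likely on the OpenClaw side (for
# example, how Qwen3's "thinking" output is parsed); if it is already garbled,
# check the chat template settings (e.g. --jinja) on the llama-server side.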
