LocalAI Deployment (mainly for starting it on a Mac M2)

Introduction

LocalAI is a free, open-source alternative to OpenAI. It acts as a drop-in replacement REST API that is compatible with the OpenAI API specification for local inference. It does not require a GPU, ships with a wide range of integrations, and lets you run LLMs, generate images, audio, and more on consumer-grade hardware, locally or on-premises, with support for multiple model families.
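
Because the API surface matches OpenAI's, most existing OpenAI clients and tools only need their base URL pointed at the local endpoint. A minimal sketch (which environment variable the client reads depends on the SDK; OPENAI_API_BASE and OPENAI_BASE_URL are common conventions, and the key is just a placeholder since a default LocalAI setup does not validate it):

bash
# point an OpenAI-compatible client at LocalAI instead of api.openai.com
export OPENAI_API_BASE=http://localhost:8080/v1   # variable name depends on the client (assumption)
export OPENAI_API_KEY=sk-local-placeholder        # placeholder; not validated by a default setup
curl "$OPENAI_API_BASE/models"                    # same route shape as the official API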

Startup options

1. Linux AMD64: Helm deployment (Kubernetes)

bash
helm repo add go-skynet https://go-skynet.github.io/helm-charts/
helm search repo go-skynet
helm pull go-skynet/local-ai
tar -xvf local-ai-3.1.0.tgz && cd local-ai
vim values.yaml   # uncomment the settings you need
helm install --create-namespace local-ai . -n local-ai -f values.yaml
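
Once the release is installed, you can check that the pod came up and reach the API without an ingress via kubectl port-forward. The service name and port below are assumptions based on the chart's defaults, so verify them first:

bash
kubectl get pods -n local-ai                                # wait until the pod is Running/Ready
kubectl get svc -n local-ai                                 # confirm the actual service name and port
kubectl port-forward -n local-ai svc/local-ai 8080:8080     # name/port assumed, adjust to the output above
# the curl examples in the Usage section below then work against localhost:8080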

2. Mac M2: manual build and startup

bash
# install build dependencies
brew install abseil cmake go grpc protobuf wget

# clone the repo
git clone https://github.com/go-skynet/LocalAI.git

cd LocalAI

# build the binary
make build
# make BUILD_TYPE=metal build
## Set `gpu_layers: 1` to your YAML model config file and `f16: true`
## Note: only models quantized with q4_0 are supported!

# Download gpt4all-j to models/
wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j

# Use a template from the examples
cp -rf prompt-templates/ggml-gpt4all-j.tmpl models/

# Run LocalAI
./local-ai --models-path=./models/ --debug=true
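
If you build with BUILD_TYPE=metal, the notes above say to set gpu_layers: 1 and f16: true in the model's YAML config. Below is a minimal sketch of such a config, using only the fields mentioned above plus the usual name/parameters/template entries from LocalAI's model config format; the file name and exact schema are illustrative, so check the current docs:

bash
# write a model config next to the downloaded weights (file name is arbitrary)
cat > models/gpt4all-j.yaml <<'EOF'
name: ggml-gpt4all-j        # model name exposed through the API
parameters:
  model: ggml-gpt4all-j     # weight file under --models-path
f16: true                   # per the Metal note above
gpu_layers: 1               # offload layers to the GPU (Metal builds, q4_0 models only)
template:
  chat: ggml-gpt4all-j      # prompt template copied from prompt-templates/
EOF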

Usage

bash
# Now API is accessible at localhost:8080
curl http://localhost:8080/v1/models
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
     "model": "ggml-gpt4all-j",
     "messages": [{"role": "user", "content": "How are you?"}],
     "temperature": 0.9
   }'
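
The compatible endpoint also accepts the OpenAI "stream": true parameter, so tokens arrive as server-sent events instead of one final JSON body. A hedged sketch (-N disables curl's output buffering):

bash
curl -N http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
     "model": "ggml-gpt4all-j",
     "messages": [{"role": "user", "content": "Tell me a short story"}],
     "stream": true
   }'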

Official build and startup documentation

FAQ

Q1: the build fails with sources/go-llama/llama.go:372:13: undefined: min
bash
binding.cpp:333:67: warning: format specifies type 'size_t' (aka 'unsigned long') but the argument has type 'int' [-Wformat]
binding.cpp:809:5: warning: deleting pointer to incomplete type 'llama_model' may cause undefined behavior [-Wdelete-incomplete]
sources/go-llama/llama.cpp/llama.h:60:12: note: forward declaration of 'llama_model'
# github.com/go-skynet/go-llama.cpp
sources/go-llama/llama.go:372:13: undefined: min
note: module requires Go 1.21
make: *** [backend-assets/grpc/llama] Error 1

Go 1.21 is required: the built-in min used here was only added to the language in Go 1.21, which is also what the "module requires Go 1.21" line in the log points at.

bash
# gvm needs mercurial
brew install mercurial
# install gvm
bash < <(curl -s -S -L https://raw.githubusercontent.com/moovweb/gvm/master/binscripts/gvm-installer)
# load gvm into the current shell
source ~/.gvm/scripts/gvm
# install and switch to Go 1.21.7
gvm install go1.21.7
gvm use go1.21.7
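
After switching, confirm the active toolchain before re-running make:

bash
go version   # should now report go1.21.x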

Q2: CMake error: the link interface of protobuf::libprotobuf contains absl::absl_check, but the target was not found

bash
CMake Error at /opt/homebrew/lib/cmake/protobuf/protobuf-targets.cmake:71 (set_target_properties):
  The link interface of target "protobuf::libprotobuf" contains:

    absl::absl_check

  but the target was not found.  Possible reasons include:

    * There is a typo in the target name.
    * A find_package call is missing for an IMPORTED target.
    * An ALIAS target is missing.

Call Stack (most recent call first):
  /opt/homebrew/lib/cmake/protobuf/protobuf-config.cmake:16 (include)
  examples/grpc-server/CMakeLists.txt:34 (find_package)

You need to update the protobuf and abseil versions (the installed copies are out of sync with each other).

bash
# remove the Homebrew copies, then install matching versions via MacPorts
brew uninstall protobuf abseil
sudo port install re2 grpc abseil
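
If you would rather stay entirely on Homebrew instead of mixing in MacPorts, reinstalling the packages together (so protobuf and abseil come from the same, matching bottle set) may also resolve the mismatch; this is an untested alternative sketch:

bash
brew update
brew reinstall abseil protobuf grpc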