1. Installing the Python Environment
First, install uv to manage virtual environments:

```shell
pip install uv
```
Python version management

uv installs and manages Python interpreters itself:
- uv python install: install a Python version
- uv python list: list the available Python versions
- uv python find: find an installed Python version
- uv python pin: pin the current project to a specific Python version
- uv python uninstall: uninstall a Python version
See which versions are available:

```shell
uv python list
```
Versions that are already installed show a concrete path instead of `<download available>`:

```shell
cpython-3.15.0a2-windows-x86_64-none <download available>
cpython-3.15.0a2+freethreaded-windows-x86_64-none <download available>
cpython-3.14.0-windows-x86_64-none <download available>
cpython-3.14.0+freethreaded-windows-x86_64-none <download available>
cpython-3.13.9-windows-x86_64-none <download available>
cpython-3.13.9+freethreaded-windows-x86_64-none <download available>
cpython-3.12.12-windows-x86_64-none <download available>
cpython-3.11.14-windows-x86_64-none <download available>
cpython-3.10.19-windows-x86_64-none <download available>
cpython-3.10.2-windows-x86_64-none AppData\Local\Programs\Python\Python310\python.exe
cpython-3.9.25-windows-x86_64-none <download available>
cpython-3.8.20-windows-x86_64-none <download available>
pypy-3.11.13-windows-x86_64-none <download available>
pypy-3.10.16-windows-x86_64-none <download available>
pypy-3.9.19-windows-x86_64-none <download available>
pypy-3.8.16-windows-x86_64-none <download available>
graalpy-3.12.0-windows-x86_64-none <download available>
graalpy-3.11.0-windows-x86_64-none <download available>
graalpy-3.10.0-windows-x86_64-none <download available>
```
Install a 3.12 release:

```shell
uv python install 3.12.12
```
Create a virtual environment:

```shell
mkdir agent && cd agent
uv venv --python 3.12.12
Using CPython 3.12.12
Creating virtual environment at: .venv
Activate with: .venv\Scripts\activate
```
Activate the environment.

PowerShell:

```shell
# temporarily relax the execution policy in PowerShell
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
.venv\Scripts\activate.ps1
```

cmd:

```shell
.venv\Scripts\activate.bat
(agent) D:\agent>
```
Install the dependencies:

```shell
uv pip install openai
```
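The openai client will later be pointed at a local OpenAI-compatible endpoint (llama.cpp's server exposes one, see section 3). As a sketch of the request shape, here is the same call made with only the standard library; the URL, port, and model name are placeholders, not values from any running server:

```python
import json
from urllib import request, error

# An OpenAI-compatible chat-completions payload; the model name is a
# placeholder for whatever the local server actually serves.
payload = {
    "model": "qwen3-0.6b",
    "messages": [{"role": "user", "content": "Hello!"}],
}

def chat(url="http://127.0.0.1:8080/v1/chat/completions"):
    """POST the payload to a local OpenAI-compatible server, if one is up."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except (error.URLError, OSError):
        return None  # no server running yet: nothing to show

print(sorted(payload))  # → ['messages', 'model']
```

The same payload is what the openai package builds for you under the hood, which is why any server speaking this protocol is interchangeable.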
2. Downloading the Model
No VPN here, so it's mirrors or nothing: use whatever works.
2.1 The HF mirror
Install the dependency:

```shell
uv pip install -U huggingface_hub
```
Point the client at the mirror:

```shell
export HF_ENDPOINT=https://hf-mirror.com # Linux
$env:HF_ENDPOINT = "https://hf-mirror.com" # PowerShell
```
In the end it still failed to download. Hahaha, I was losing my mind:

```shell
huggingface-cli download Qwen/Qwen3-0.6B
```
Manual download
Clicking through the files one by one.
Scripted download:

```python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os

# Switch to the domestic HF mirror; must be set before importing huggingface_hub.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from huggingface_hub import snapshot_download

model_name = "Qwen/Qwen3-0.6B"

# Loop so a dropped connection just triggers another attempt.
while True:
    try:
        snapshot_download(
            repo_id=model_name,
            local_dir_use_symlinks=True,  # files under local_dir are links into the cache
            local_dir=model_name,
            resume_download=True,  # pick up partially downloaded files
        )
        break
    except Exception:
        pass  # network hiccup: retry
```
2.2 The modelscope mirror
Install the dependency:

```shell
uv pip install modelscope
```

Download the model:

```shell
modelscope download --model Qwen/Qwen3-0.6B
```
Ha, the download succeeded.
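modelscope also exposes a Python API, mirroring the huggingface_hub script above. A hedged sketch, assuming the installed modelscope package exports `snapshot_download` with a `local_dir` parameter (check your version's docs); the import is deferred into the function so the sketch loads even where modelscope is absent:

```python
def download_model(repo_id="Qwen/Qwen3-0.6B", local_dir=None):
    """Fetch a model snapshot via modelscope's Python API.

    Assumes modelscope exports snapshot_download; returns the local
    directory containing the downloaded snapshot.
    """
    # Deferred import: only needed when the download actually runs.
    from modelscope import snapshot_download
    return snapshot_download(repo_id, local_dir=local_dir)

# e.g. path = download_model()  # downloads Qwen/Qwen3-0.6B, returns its path
```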
3. Building the LLM Service
3.1 Dependencies
CMake
git-for-windows
https://registry.npmmirror.com/binary.html?path=git-for-windows/
GCC
Either Cygwin or MinGW will do:
cygwin
mingw
llama.cpp
Download:

```shell
git clone https://github.com/ggml-org/llama.cpp.git
# git clone https://gitclone.com/github.com/ggerganov/llama.cpp
```
Build:

```shell
cmake -B build -G "MinGW Makefiles" -DLLAMA_CURL=OFF -DGGML_CUDA=OFF
cmake --build build -j32
```
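After the build, the binaries land under build/bin. To actually serve the Qwen model downloaded in step 2, the usual flow is: convert the HF checkpoint to GGUF with llama.cpp's bundled convert_hf_to_gguf.py, then launch llama-server on it. A sketch of both steps as subprocess commands; the directory layout, output filename, and port are assumptions about your setup, and nothing below is executed when the script loads:

```python
from pathlib import Path
import subprocess

# Paths are assumptions: adjust to where the repo and model actually live.
LLAMA_DIR = Path("llama.cpp")
MODEL_DIR = Path("Qwen/Qwen3-0.6B")  # the modelscope download from step 2
GGUF = Path("qwen3-0.6b.gguf")

# Step 1: convert the HF checkpoint to GGUF (the script ships with llama.cpp).
convert_cmd = [
    "python", str(LLAMA_DIR / "convert_hf_to_gguf.py"),
    str(MODEL_DIR), "--outfile", str(GGUF),
]

# Step 2: serve it over an OpenAI-compatible HTTP API.
serve_cmd = [
    str(LLAMA_DIR / "build" / "bin" / "llama-server"),
    "-m", str(GGUF), "--port", "8080",
]

def run(cmd):
    """Run one step, surfacing a nonzero exit code as an exception."""
    subprocess.run(cmd, check=True)

# To execute for real: run(convert_cmd), then run(serve_cmd).
```

Once llama-server is up, the openai client installed in step 1 can talk to http://127.0.0.1:8080/v1.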