LLaMA-Factory || AutoDL Platform || Launching the Web UI

The error is as follows:

```bash
root@autodl-container-d83e478b47-3def8c49:~/LLaMA-Factory# llamafactory-cli webui
* Running on local URL:  http://0.0.0.0:7860

Could not create share link. Missing file: /root/miniconda3/lib/python3.10/site-packages/gradio/frpc_linux_amd64_v0.3.

Please check your internet connection. This can happen if your antivirus software blocks the download of this file. You can install manually by following these steps:

1. Download this file: https://cdn-media.huggingface.co/frpc-gradio-0.3/frpc_linux_amd64
2. Rename the downloaded file to: frpc_linux_amd64_v0.3
3. Move the file to this location: /root/miniconda3/lib/python3.10/site-packages/gradio
```
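
If the container can reach the Hugging Face CDN (it often cannot, which is why this error appears in the first place), the three manual steps from the error message can be scripted in one go. The download URL and target path below are taken verbatim from the error output; this sketch assumes `wget` is available in the image. Otherwise, download the file on a machine with access, upload it to the instance, and follow the steps below.

```bash
# Download the frpc binary named in the error message and rename it in one step
wget -O frpc_linux_amd64_v0.3 https://cdn-media.huggingface.co/frpc-gradio-0.3/frpc_linux_amd64

# Move it into the gradio package directory that the error message points to
mv frpc_linux_amd64_v0.3 /root/miniconda3/lib/python3.10/site-packages/gradio/
```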

Solution: with the renamed frpc_linux_amd64_v0.3 file on the instance, copy it into the gradio package directory and change into that directory:

```bash
cp frpc_linux_amd64_v0.3 /root/miniconda3/lib/python3.10/site-packages/gradio
cd /root/miniconda3/lib/python3.10/site-packages/gradio
```

Make the file executable:

```bash
chmod +x frpc_linux_amd64_v0.3
```
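
An optional sanity check before relaunching, just to confirm the binary sits in the expected location with the execute bit set:

```bash
# Should list the file with executable permissions, e.g. -rwxr-xr-x
ls -l /root/miniconda3/lib/python3.10/site-packages/gradio/frpc_linux_amd64_v0.3
```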

Run the command again:

```bash
llamafactory-cli webui
```

Alternatively, launch the chat web UI directly with a specific model:

```bash
CUDA_VISIBLE_DEVICES=0 llamafactory-cli webchat \
  --model_name_or_path /root/autodl-fs/Qwen2.5-1.5B-Instruct \
  --template qwen
```
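
The share link can also be requested explicitly through Gradio's standard environment variables; whether the installed LLaMA-Factory version picks them up depends on the version, so treat this as a sketch rather than a guaranteed interface:

```bash
# Assumption: the installed LLaMA-Factory/Gradio honors these standard Gradio env vars
GRADIO_SERVER_PORT=7860 GRADIO_SHARE=1 llamafactory-cli webui
```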

Result:

```bash
[INFO|2025-03-02 23:15:05] llamafactory.model.model_utils.attention:157 >> Using torch SDPA for faster training and inference.
[INFO|2025-03-02 23:15:05] llamafactory.model.loader:157 >> all params: 1,543,714,304
* Running on local URL:  http://0.0.0.0:7860
* Running on public URL: https://35d22b023607f1702a.gradio.live

This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces)
```

The gradio.live public URL shown above is the accessible dynamic link.
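
To verify from a local machine that the tunnel is up, you can probe the link; note that the URL changes on every launch and expires after 72 hours, so substitute the one printed in your own output:

```bash
# Expect an HTTP 200 (or a redirect) if the share tunnel is working; replace with your own URL
curl -I https://35d22b023607f1702a.gradio.live
```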
