Running large-model inference with vLLM fails on an old GPU and old CPU (Intel(R) Xeon(R) CPU E5-2643 v2)

Conclusion first: the GPU is too old to install CUDA 12.6, and the CPU is too old to support AVX2, so the attempt failed.
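The AVX2 question can be answered up front, before installing anything, by reading the CPU's feature flags. A minimal Linux-only sketch (the helper name is made up here):

```python
# Minimal sketch: does the CPU advertise a given feature flag?
# Linux-only; parses the "flags" line of /proc/cpuinfo.
def has_cpu_flag(flag, cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                return flag in line.split(":", 1)[1].split()
    return False

# Usage: has_cpu_flag("avx2") -> False on the E5-2643 v2 (Ivy Bridge),
# which has AVX but stops short of AVX2.
```

The same answer comes from `grep avx2 /proc/cpuinfo` in the shell.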

First, install vLLM:
pip install vllm

This pulls in torch as a dependency.

The old GPU's driver only supports CUDA 11.6, so try upgrading it:
sudo apt install nvidia-driver-580-server
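Whether a given torch CUDA wheel can work at all is gated by the driver's supported CUDA version (the `CUDA Version` field in `nvidia-smi` output). A conservative sketch of that comparison, with hypothetical helper names:

```python
# Hypothetical helpers: compare the driver's supported CUDA version
# against a torch wheel tag such as "cu116" or "cu118".
def cuda_tag_to_version(tag):
    digits = tag.removeprefix("cu")             # "cu118" -> "118"
    return (int(digits[:-1]), int(digits[-1]))  # -> (11, 8)

def driver_supports(driver_cuda, wheel_tag):
    driver = tuple(int(x) for x in driver_cuda.split("."))
    return driver >= cuda_tag_to_version(wheel_tag)

print(driver_supports("11.6", "cu116"))  # True
print(driver_supports("11.6", "cu118"))  # False under this conservative rule
```

(CUDA 11.x has some minor-version driver compatibility, so this rule is stricter than reality, but it explains why cu116 wheels are the safe ceiling here.)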

The upgrade didn't take.

Try another route to upgrade the driver: first, list what's available

ubuntu-drivers devices

But no drivers were listed; it only printed: ERROR:root:aplay command not found

If the ubuntu-drivers tool itself is missing, install it:

sudo apt install ubuntu-drivers-common

Query the available drivers again:
ubuntu-drivers devices

Still nothing useful:
ubuntu-drivers devices
ERROR:root:aplay command not found

Install the torch build for CUDA 11.6:
pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116

It errored: no such version exists.

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116

Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple, https://download.pytorch.org/whl/cu116

ERROR: Could not find a version that satisfies the requirement torch==1.13.1+cu116 (from versions: 2.2.0, 2.2.1, 2.2.2, 2.3.0, 2.3.1, 2.4.0, 2.4.1, 2.5.0, 2.5.1, 2.6.0, 2.7.0, 2.7.1, 2.8.0, 2.9.0)

ERROR: No matching distribution found for torch==1.13.1+cu116

Try the CUDA 11.8 build instead:
pip install torch==2.0.0 torchvision==0.15.1 torchaudio==2.0.1 --index-url https://download.pytorch.org/whl/cu118

Same kind of error:

pip install torch==2.0.0 torchvision==0.15.1 torchaudio==2.0.1 --index-url https://download.pytorch.org/whl/cu118

Looking in indexes: https://download.pytorch.org/whl/cu118

ERROR: Could not find a version that satisfies the requirement torch==2.0.0 (from versions: 2.2.0+cu118, 2.2.1+cu118, 2.2.2+cu118, 2.3.0+cu118, 2.3.1+cu118, 2.4.0+cu118, 2.4.1+cu118, 2.5.0+cu118, 2.5.1+cu118, 2.6.0+cu118, 2.7.0+cu118, 2.7.1+cu118)

ERROR: No matching distribution found for torch==2.0.0

So only these builds exist on the cu118 index: 2.2.0+cu118, 2.2.1+cu118, 2.2.2+cu118, 2.3.0+cu118, 2.3.1+cu118, 2.4.0+cu118, 2.4.1+cu118, 2.5.0+cu118, 2.5.1+cu118, 2.6.0+cu118, 2.7.0+cu118, 2.7.1+cu118
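Given a list like that from pip's error message, picking out a particular build (say, the oldest, which has the best chance of tolerating an old toolchain) is a small exercise in numeric version sorting; `oldest_version` is a made-up helper:

```python
# Pick the lowest version from pip's "from versions:" list by
# comparing the numeric part before the "+cu118" local tag.
def oldest_version(versions):
    def key(v):
        base = v.split("+", 1)[0]
        return tuple(int(x) for x in base.split("."))
    return min(versions, key=key)

available = ["2.2.0+cu118", "2.5.1+cu118", "2.7.1+cu118", "2.3.0+cu118"]
print(oldest_version(available))  # 2.2.0+cu118
```

Sorting on integer tuples rather than raw strings matters: lexicographically, "2.10.0" would sort before "2.9.0".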

Try installing the 2.7.x build:

pip install torch==2.7.0 torchvision==0.22 torchaudio==2.7.1 --index-url https://download.pytorch.org/whl/cu118

After repeated tweaking, the final command was:
pip install torch==2.7.0 torchvision==0.22 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cu118

Still the same error.

Try upgrading CUDA and cuDNN:

sudo apt upgrade nvidia-cuda-dev nvidia-cudnn

Which hit this error:

sudo apt upgrade nvidia-cuda-dev nvidia-cudnn

Reading package lists... Done

Building dependency tree... Done

Reading state information... Done

You might want to run 'apt --fix-broken install' to correct these.

The following packages have unmet dependencies:

intel-oneapi-runtime-dpcpp-cpp : Depends: intel-oneapi-runtime-compilers (>= 2025.3.0-639) but 2025.0.4-1519 is installed

intel-oneapi-runtime-opencl : Depends: intel-oneapi-runtime-compilers (>= 2025.3.0-639) but 2025.0.4-1519 is installed

E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).

As the message suggests, run:
sudo apt --fix-broken install

Rebuild the vLLM CPU backend from source

Clone the source:
git clone https://gitcode.com/GitHub_Trending/vl/vllm vllm_source

Install the build dependencies:
pip install --upgrade pip
pip install -v -r requirements/cpu-build.txt --extra-index-url https://download.pytorch.org/whl/cpu
pip install -v -r requirements/cpu.txt --extra-index-url https://download.pytorch.org/whl/cpu

Build the backend:
VLLM_TARGET_DEVICE=cpu python setup.py install

Which errored out:
running build_ext
-- The CXX compiler identification is GNU 13.3.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build type: RelWithDebInfo
-- Target device: cpu
-- Found Python: /home/skywalk/py312/bin/python (found version "3.12.3") found components: Interpreter Development.Module Development.SABIModule
-- Found python matching: /home/skywalk/py312/bin/python.
CMake Warning at /home/skywalk/py312/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:22 (message):
  static library kineto_LIBRARY-NOTFOUND not found.
Call Stack (most recent call first):
  /home/skywalk/py312/lib/python3.12/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:125 (append_torchlib_if_found)
  CMakeLists.txt:84 (find_package)


-- Found Torch: /home/skywalk/py312/lib/python3.12/site-packages/torch/lib/libtorch.so
CMake Error at cmake/cpu_extension.cmake:188 (message):
  vLLM CPU backend requires AVX512, AVX2, Power9+ ISA, S390X ISA, ARMv8 or
  RISC-V support.
Call Stack (most recent call first):
  CMakeLists.txt:104 (include)


-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
  File "/home/skywalk/github/vllm_source/setup.py", line 706, in <module>
    setup(
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/__init__.py", line 117, in setup
    return distutils.core.setup(**attrs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 186, in setup
    return run_commands(dist)
           ^^^^^^^^^^^^^^^^^^
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 202, in run_commands
    dist.run_commands()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1002, in run_commands
    self.run_command(cmd)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/install.py", line 109, in run
    self.do_egg_install()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/install.py", line 167, in do_egg_install
    self.run_command('bdist_egg')
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 357, in run_command
    self.distribution.run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/bdist_egg.py", line 177, in run
    cmd = self.call_command('install_lib', warn_dir=False)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/bdist_egg.py", line 163, in call_command
    self.run_command(cmdname)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 357, in run_command
    self.distribution.run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/install_lib.py", line 19, in run
    self.build()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/command/install_lib.py", line 113, in build
    self.run_command('build_ext')
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 357, in run_command
    self.distribution.run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/dist.py", line 1104, in run_command
    super().run_command(command)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 1021, in run_command
    cmd_obj.run()
  File "/home/skywalk/github/vllm_source/setup.py", line 282, in run
    super().run()
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/command/build_ext.py", line 99, in run
    _build_ext.run(self)
  File "/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_distutils/command/build_ext.py", line 368, in run
    self.build_extensions()
  File "/home/skywalk/github/vllm_source/setup.py", line 239, in build_extensions
    self.configure(ext)
  File "/home/skywalk/github/vllm_source/setup.py", line 216, in configure
    subprocess.check_call(
  File "/usr/lib/python3.12/subprocess.py", line 413, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '/home/skywalk/github/vllm_source', '-G', 'Ninja', '-DCMAKE_BUILD_TYPE=RelWithDebInfo', '-DVLLM_TARGET_DEVICE=cpu', '-DVLLM_PYTHON_EXECUTABLE=/home/skywalk/py312/bin/python', '-DVLLM_PYTHON_PATH=/home/skywalk/github/vllm_source:/usr/lib/python312.zip:/usr/lib/python3.12:/usr/lib/python3.12/lib-dynload:/home/skywalk/py312/lib/python3.12/site-packages:/home/skywalk/github/exo:/home/skywalk/py312/lib/python3.12/site-packages/setuptools/_vendor', '-DFETCHCONTENT_BASE_DIR=/home/skywalk/github/vllm_source/.deps', '-DCMAKE_JOB_POOL_COMPILE:STRING=compile', '-DCMAKE_JOB_POOLS:STRING=compile=24']' returned non-zero exit status 1.

That's the end of the road: the CPU is the blocker. The Intel(R) Xeon(R) CPU E5-2643 v2 @ 3.50GHz is simply too old (no AVX2).
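The cmake failure could have been predicted before the build: cmake/cpu_extension.cmake rejects any machine lacking the listed ISAs. A rough Python approximation of that gate (the flag and architecture mapping here is an assumption for illustration, not vLLM's actual code):

```python
import platform

# x86 feature flags assumed to satisfy vLLM's CPU-backend check
X86_OK_FLAGS = {"avx2", "avx512f"}
# non-x86 machine types the check accepts (assumed mapping)
OK_MACHINES = {"aarch64", "ppc64le", "s390x", "riscv64"}

def vllm_cpu_backend_possible(flags, machine=None):
    machine = machine or platform.machine()
    if machine in OK_MACHINES:
        return True
    return bool(X86_OK_FLAGS & set(flags))

# The E5-2643 v2 reports avx but not avx2, so the build was doomed:
print(vllm_cpu_backend_possible(["sse2", "avx"], machine="x86_64"))  # False
```

Running a check like this (or simply `grep avx2 /proc/cpuinfo`) before cloning and compiling would have saved the whole build attempt.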

Use the prebuilt CPU version instead:
pip install vllm --no-build-isolation

No luck; this doesn't work either:

vllm -V

W1030 18:34:21.393673118 OperatorEntry.cpp:218] Warning: Warning only once for all operators, other operators may also be overridden. Overriding a previously registered kernel for the same operator and the same dispatch key operator: aten::_addmm_activation(Tensor self, Tensor mat1, Tensor mat2, *, Scalar beta=1, Scalar alpha=1, bool use_gelu=False) -> Tensor registered at /pytorch/build/aten/src/ATen/RegisterSchema.cpp:6 dispatch key: AutocastCPU previous kernel: registered at /pytorch/aten/src/ATen/autocast_mode.cpp:327 new kernel: registered at /opt/workspace/ipex-cpu-dev/csrc/cpu/autocast/autocast_mode.cpp:112 (function operator()) ERROR! Intel® Extension for PyTorch* only works on machines with instruction sets equal or newer than AVX2, which are not detected on the current machine.

Giving up on this machine!
