Deploy the InternLM-Chat-7B model with LMDeploy in local chat mode and have it generate a 300-word short story
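A rough sketch of the local-chat workflow (assuming lmdeploy 0.1.0 and that the InternLM-Chat-7B weights are already on disk; the model path below is a placeholder to adjust for your environment):

# Convert the HuggingFace weights to the TurboMind format; this creates ./workspace
lmdeploy convert internlm-chat-7b /path/to/internlm-chat-7b

# Start an interactive chat session in the terminal, then prompt the model
# for a 300-word short story
lmdeploy chat turbomind ./workspace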
2. API-based deployment
Run:
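As a sketch (again assuming lmdeploy 0.1.0, with ./workspace produced by the convert step above; host and port are example values), the RESTful API server can be started with:

# Serve the converted model over an HTTP API
lmdeploy serve api_server ./workspace --server_name 0.0.0.0 --server_port 23333 --tp 1

From another terminal, the bundled client can then be pointed at that endpoint for a quick test:

lmdeploy serve api_client http://localhost:23333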
Result:
GPU memory usage:
When installing lmdeploy from source with the command below, an error was reported:
pip install 'lmdeploy[all]==v0.1.0'
Building wheels for collected packages: flash-attn
  Building wheel for flash-attn (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [9 lines of output]
      fatal: not a git repository (or any of the parent directories): .git

      torch.__version__ = 2.0.1

      running bdist_wheel
      Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu118torch2.0cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
      error: <urlopen error Tunnel connection failed: 503 Service Unavailable>
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects

The root cause is that flash-attn's setup.py tries to download a prebuilt wheel from GitHub releases and the connection fails (503 Service Unavailable). The workaround is to fetch the wheel manually:
(1) Download the matching prebuilt wheel from https://github.com/Dao-AILab/flash-attention/releases/ — pick the file whose name matches your Python version (cp310), CUDA version, torch version, and C++ ABI flag.
(2) Install it with pip:
pip install flash_attn-2.3.5+cu117torch2.0cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
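After installing the wheel, a quick sanity check that the module imports and reports its version:

python -c "import flash_attn; print(flash_attn.__version__)"

Re-running the original pip install 'lmdeploy[all]==v0.1.0' should then skip the flash-attn build step, since the dependency is already satisfied.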