First, make sure you have CUDA 12.1, torch 2.1.0 or later, and Python 3.10 or later.
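A minimal sketch of checking these requirements before installing anything. The `meets_minimum` helper is hypothetical (not part of any library mentioned here); it compares dotted version strings such as `torch.__version__` (e.g. `'2.1.0+cu121'`) against a minimum version tuple:

```python
import sys

def meets_minimum(version, minimum):
    """Return True if a dotted version string is >= the minimum tuple.

    Local build suffixes after '+' (e.g. '+cu121') are ignored.
    """
    parts = tuple(int(p) for p in version.split("+")[0].split(".")[:len(minimum)])
    return parts >= minimum

# Python 3.10+ is required.
print(sys.version_info >= (3, 10))

# torch 2.1.0+ and CUDA 12.1 requirements, shown on example strings:
print(meets_minimum("2.1.0+cu121", (2, 1, 0)))  # torch version check
print(meets_minimum("12.1", (12, 1)))           # CUDA version check
```

In practice you would pass `torch.__version__` and `torch.version.cuda` instead of the literal strings.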
Download the latest wheel from https://github.com/jllllll/bitsandbytes-windows-webui/releases/tag/wheels and install bitsandbytes from that whl.
Download the latest wheel from https://github.com/bdashore3/flash-attention/releases and install flash-attn from that whl.
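When picking a wheel from those release pages, the filename's CPython tag (e.g. `cp310`) and CUDA tag (e.g. `cu121`) must match your environment. A small sketch of that check, with `wheel_matches` and the example filenames being illustrative assumptions rather than real release assets:

```python
import re

def wheel_matches(filename, py_tag, cuda_tag):
    """Check a wheel filename for a matching CPython tag (e.g. 'cp310')
    and CUDA build tag (e.g. 'cu121').

    Wheels with no CUDA tag in the name (pure-Python builds) are
    accepted on the Python tag alone.
    """
    tags = re.findall(r"cp\d+|cu\d+", filename)
    has_cuda_tag = any(t.startswith("cu") for t in tags)
    return py_tag in tags and (cuda_tag in tags or not has_cuda_tag)

# Hypothetical filenames, for illustration only:
print(wheel_matches("flash_attn-2.4.2+cu121torch2.1-cp310-cp310-win_amd64.whl",
                    "cp310", "cu121"))  # matching Python and CUDA tags
print(wheel_matches("flash_attn-2.4.2+cu118torch2.1-cp310-cp310-win_amd64.whl",
                    "cp310", "cu121"))  # CUDA tag mismatch
```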
If you see the error
CUDA SETUP: Problem: The main issue seems to be that the main CUDA library was not detected.
it usually means the wrong bitsandbytes version was installed, or the CUDA version does not match. Make sure CUDA 12.1 and the corresponding torch build are installed, then install bitsandbytes from the wheel; that avoids this problem.
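One quick way to spot the mismatch described above is to look at the local build suffix in `torch.__version__`: a CUDA 12.1 build carries `+cu121`, while a CPU-only build carries `+cpu` or no suffix at all. A sketch of that check, where `cuda_build_tag` is a hypothetical helper operating on the version string:

```python
import re

def cuda_build_tag(torch_version):
    """Extract the CUDA build tag (e.g. 'cu121') from a torch version
    string like '2.1.0+cu121'; return None for CPU-only builds."""
    local = torch_version.partition("+")[2]
    return local if re.fullmatch(r"cu\d+", local) else None

# A CPU-only torch cannot back a CUDA bitsandbytes build, which is one
# common cause of the "main CUDA library was not detected" message.
print(cuda_build_tag("2.1.0+cu121"))  # CUDA 12.1 build
print(cuda_build_tag("2.1.0+cpu"))    # CPU-only build
```

In practice you would call `cuda_build_tag(torch.__version__)` and confirm it returns `'cu121'` before installing the bitsandbytes wheel.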