
Large Language Model (LLM) Phoenix (凤凰): error during training (installing flash_attn fails with "No module named 'torch'")


Installing flash_attn fails with the following error:

pip install flash_attn
Collecting flash_attn
  Using cached flash_attn-1.0.8.tar.gz (2.0 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [15 lines of output]
      Traceback (most recent call last):
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/aaa/anaconda3/envs/fenghuang/lib
