
Fix for ModuleNotFoundError: No module named 'transformers_modules.baichuan-7B'


Code

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./baichuan-7B", trust_remote_code=True)

Error

  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers_modules.baichuan-7B'

After a lot of searching, one suggested fix was to downgrade transformers to 4.26.1, but my code uses BitsAndBytesConfig for quantization, which requires transformers 4.31 or newer. I finally found another fix: replace the - in the model path with _ and the error goes away! The reason is that with trust_remote_code=True the model's custom code is imported as a dynamic module named after the model directory (transformers_modules.baichuan-7B here), and a hyphen is not a valid character in a Python module name, so the import fails.
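Since the load path will now be ./baichuan_7B, the local model folder has to be renamed to match. A minimal sketch of the rename (the directory names are just the ones from this post; adjust to your own layout):

from pathlib import Path

# Rename the local model folder so its name is a valid Python module name
src = Path("./baichuan-7B")
dst = Path("./baichuan_7B")
if src.exists() and not dst.exists():
    src.rename(dst)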

After the change

tokenizer = AutoTokenizer.from_pretrained("./baichuan_7B", trust_remote_code=True)

Now it runs!
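For reference, here is a rough sketch of how the tokenizer and a 4-bit quantized model can be loaded together on transformers >= 4.31; the specific BitsAndBytesConfig settings (nf4 quantization, float16 compute dtype) are illustrative choices, not something taken from the original setup:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Example 4-bit quantization config (values are illustrative)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained("./baichuan_7B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "./baichuan_7B",
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)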
