Error:
Traceback (most recent call last):
  File "/home/bingxing2/ailab/group/ai4agr/wzf/LLM/models/ChatGLM-Finetuning/train.py", line 22, in <module>
    from peft import LoraConfig, get_peft_model
  File "/home/bingxing2/ailab/scxlab0069/.conda/envs/llm_test/lib/python3.9/site-packages/peft/__init__.py", line 22, in <module>
    from .auto import (
  File "/home/bingxing2/ailab/scxlab0069/.conda/envs/llm_test/lib/python3.9/site-packages/peft/auto.py", line 31, in <module>
    from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING
  File "/home/bingxing2/ailab/scxlab0069/.conda/envs/llm_test/lib/python3.9/site-packages/peft/mapping.py", line 23, in <module>
    from .peft_model import (
  File "/home/bingxing2/ailab/scxlab0069/.conda/envs/llm_test/lib/python3.9/site-packages/peft/peft_model.py", line 32, in <module>
    from transformers import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1055, in _handle_fromlist
  File "/home/bingxing2/ailab/scxlab0069/.conda/envs/llm_test/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1174, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/bingxing2/ailab/scxlab0069/.conda/envs/llm_test/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1186, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
CUDA Setup failed despite GPU being available. Please run the following command to get more information:
python -m bitsandbytes
Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues
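The error message's own advice boils down to: run the bitsandbytes diagnostic, find where the CUDA runtime libraries actually live on the machine, and make that directory visible via LD_LIBRARY_PATH before rerunning training. A minimal sketch of those steps, assuming a typical CUDA install under /usr/local/cuda (adjust the path to whatever `find` reports on your cluster):

```shell
# 1. Diagnose: see which CUDA libraries bitsandbytes can (or cannot) find.
#    python -m bitsandbytes
# 2. Locate the CUDA runtime shared library (the path below is an example;
#    on an HPC cluster it may instead live under a module tree).
#    find /usr/local -name 'libcudart.so*' 2>/dev/null
# 3. Prepend the directory that contains libcudart.so to LD_LIBRARY_PATH,
#    then rerun train.py in the same shell.
export LD_LIBRARY_PATH="/usr/local/cuda/lib64:${LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

On clusters that use environment modules, `module load cuda` often sets LD_LIBRARY_PATH correctly and is preferable to hand-editing it; another common cause is a bitsandbytes wheel built against a different CUDA major version than the one installed, in which case reinstalling a matching bitsandbytes build resolves it.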