Traceback (most recent call last):
  File "C:\Users\Admin\ChatGLM-Efficient-Tuning\src\train_bash.py", line 21, in <module>
    main()
  File "C:\Users\Admin\ChatGLM-Efficient-Tuning\src\train_bash.py", line 5, in main
    model_args, data_args, training_args, finetuning_args, general_args = get_train_args()
  File "C:\Users\Admin\ChatGLM-Efficient-Tuning\src\glmtuner\tuner\core\parser.py", line 35, in get_train_args
    model_args, data_args, training_args, finetuning_args, general_args = parser.parse_args_into_dataclasses()
  File "E:\Aconda\envs\py310_chat\lib\site-packages\transformers\hf_argparser.py", line 338, in parse_args_into_dataclasses
    obj = dtype(**inputs)
  File "<string>", line 119, in __init__
  File "E:\Aconda\envs\py310_chat\lib\site-packages\transformers\training_args.py", line 1405, in __post_init__
    raise ValueError(
ValueError: FP16 Mixed precision training with AMP or APEX (`--fp16`) and FP16 half precision evaluation (`--fp16_full_eval`) can only be used on CUDA or NPU devices.
The machine was fine and CUDA itself was fine. It turned out that the PyTorch I had lazily installed via plain pip was the CPU-only build. A rookie mistake.
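A quick way to confirm whether the installed wheel is a CUDA build before training. This is a sketch: the `cu118` index tag below is just an example, and you should pick the tag matching your local CUDA driver from the official PyTorch install selector.

```shell
# Prints the CUDA version torch was built with ("None" means a CPU-only wheel)
# and whether a GPU is actually visible
python -c "import torch; print(torch.version.cuda, torch.cuda.is_available())"

# If it printed None/False: remove the CPU wheel and install a CUDA build.
# cu118 is an example tag -- choose the one that matches your driver.
pip uninstall -y torch
pip install torch --index-url https://download.pytorch.org/whl/cu118
```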
After fixing that, the next error was:
08/30/2023 15:59:41 - WARNING - glmtuner.tuner.core.parser - We recommend enable fp16 mixed precision training for ChatGLM-6B.
08/30/2023 15:59:41 - WARNING - glmtuner.tuner.core.parser - `ddp_find_unused_parameters` needs to be set as False in DDP training.
Traceback (most recent call last):
  File "C:\Users\Admin\ChatGLM-Efficient-Tuning\src\train_bash.py", line 21, in <module>
    main()
  File "C:\Users\Admin\ChatGLM-Efficient-Tuning\src\train_bash.py", line 5, in main
    model_args, data_args, training_args, finetuning_args, general_args = get_train_args()
  File "C:\Users\Admin\ChatGLM-Efficient-Tuning\src\glmtuner\tuner\core\parser.py", line 78, in get_train_args
    training_args.ddp_find_unused_parameters = False
  File "E:\Aconda\envs\py310_chat\lib\site-packages\transformers\training_args.py", line 1712, in __setattr__
    raise FrozenInstanceError(f"cannot assign to field {name}")
dataclasses.FrozenInstanceError: cannot assign to field ddp_find_unused_parameters
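The `FrozenInstanceError` here is the same failure mode you get when assigning to a field of a frozen dataclass after construction. A minimal reproduction with a plain dataclass (this is an analogy, not the actual `TrainingArguments` class, which guards assignment in its own `__setattr__`):

```python
from dataclasses import dataclass, FrozenInstanceError

# Toy stand-in for an argument container that forbids mutation after init
@dataclass(frozen=True)
class Args:
    ddp_find_unused_parameters: bool = True

args = Args()
try:
    # Same kind of post-init assignment that parser.py attempts
    args.ddp_find_unused_parameters = False
except FrozenInstanceError as e:
    print("cannot assign:", e)
```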
I was stuck on this for many hours, tried everything, and nearly gave up in frustration. It turned out the transformers version was simply too new; downgrading fixed it: 4.32.1 --> 4.30.1.
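The downgrade is just a pinned install. Commands below are a sketch; 4.30.1 is the version that worked for this repo in this post, not a universal recommendation.

```shell
# Pin transformers to the version the tuning scripts were written against
pip install "transformers==4.30.1"

# Verify the active version in the environment
python -c "import transformers; print(transformers.__version__)"
```

When a project's README or requirements file names a tested transformers version, pinning to it is usually safer than tracking the latest release, since `TrainingArguments` internals change between minor versions.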