
[Bug] Version incompatibility when fine-tuning Whisper with Transformers

The error


ImportError                               Traceback (most recent call last)
<ipython-input-20-6958d7eed552> in ()
      from transformers import Seq2SeqTrainingArguments
      training_args = Seq2SeqTrainingArguments(
          output_dir="./whisper-small-hi",  # change to a repo name of your choice
          per_device_train_batch_size=16,

/usr/local/lib/python3.10/dist-packages/transformers/training_args.py in _setup_devices(self)
        if not is_sagemaker_mp_enabled():
            if not is_accelerate_available():
                raise ImportError(
                    f"Using the `Trainer` with `PyTorch` requires `accelerate>={ACCELERATE_MIN_VERSION}`: "
                    "Please run `pip install transformers[torch]` or `pip install accelerate -U`"

ImportError: Using the `Trainer` with `PyTorch` requires `accelerate>=0.21.0`: Please run `pip install transformers[torch]` or `pip install accelerate -U`

Cause analysis

It looks like the accelerate dependency is not being picked up, or its version does not match what transformers expects.

I first ran the two commands suggested by the error message:

pip install transformers[torch]

pip install accelerate -U

but neither of them helped.
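Before trying anything else, it helps to check which versions the running kernel actually sees, because a pip install inside a notebook does not take effect until the runtime is restarted. A minimal check using only the standard library (nothing here is specific to the Whisper notebook):

import importlib.metadata as md

# Versions visible to the *current* Python process; a fresh pip install
# will not show up here until the kernel/runtime has been restarted.
for pkg in ("transformers", "accelerate", "torch"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")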

Let's look at what each of the packages mentioned actually does:

transformers

accelerate

PyTorch

The dependency analysis boils down to this:

accelerate manages distributed and mixed-precision training for transformers models, and both of them rely on PyTorch to perform the underlying operations.
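To make the relationship concrete, this is roughly how accelerate sits on top of plain PyTorch; the transformers Trainer does something similar internally. A minimal sketch with a placeholder model and data, not code from the Whisper notebook:

import torch
from accelerate import Accelerator

accelerator = Accelerator()  # picks the device and handles mixed precision
model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataset = torch.utils.data.TensorDataset(torch.randn(32, 10), torch.randint(0, 2, (32,)))
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)

# accelerate wraps the PyTorch objects so the same loop runs on CPU, one GPU, or many
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

for x, y in dataloader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    accelerator.backward(loss)  # replaces loss.backward()
    optimizer.step()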

OK, let's try things step by step.

Attempting a fix

I made several attempts, checking the installed versions each time:

import transformers
import accelerate
import torch
accelerate.__version__, transformers.__version__, torch.__version__

('0.30.1', '4.42.0.dev0', '2.3.0+cu121')
('0.30.1', '4.41.2', '2.3.0+cu121')
('0.21.0', '4.41.2', '2.3.0+cu121')

After several tries, only the last combination of version numbers got past the ImportError.

A new problem

The next step then failed with a new error:

from transformers import Seq2SeqTrainer

trainer = Seq2SeqTrainer(
    args=training_args,
    model=model,
    train_dataset=common_voice["train"],
    eval_dataset=common_voice["test"],
    data_collator=data_collator,
    compute_metrics=compute_metrics,
    tokenizer=processor.feature_extractor,
)
TypeError                                 Traceback (most recent call last)
<ipython-input-26-ec450b16962e> in <cell line: 3>()
      1 from transformers import Seq2SeqTrainer
      2 
----> 3 trainer = Seq2SeqTrainer(
      4     args=training_args,
      5     model=model,
/usr/local/lib/python3.10/dist-packages/transformers/trainer.py in create_accelerator_and_postprocess(self)
   4533 
   4534         # create accelerator object
-> 4535         self.accelerator = Accelerator(**args)
   4536         # some Trainer classes need to use `gather` instead of `gather_for_metrics`, thus we store a flag
   4537         self.gather_function = self.accelerator.gather_for_metrics

TypeError: Accelerator.__init__() got an unexpected keyword argument 'use_seedable_sampler'
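Reading the traceback, the Trainer in this transformers build constructs the Accelerator with a use_seedable_sampler argument, and accelerate 0.21.0 is too old to know that keyword, so accelerate has to be upgraded rather than pinned at the minimum. A quick way to confirm which side is out of date, using only the standard inspect module:

import inspect
from accelerate import Accelerator

# False means the installed accelerate predates the keyword that the
# current transformers Trainer passes, i.e. accelerate needs an upgrade.
params = inspect.signature(Accelerator.__init__).parameters
print("use_seedable_sampler" in params)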

The versions that finally worked

I tried installing the following dependencies instead:

!pip install torch==2.2.0
!pip install accelerate==0.27.2

('0.27.2', '4.41.2', '2.3.0+cu121')

It finally worked!

Note: every time you try a new set of versions, you have to restart the whole runtime before the new versions take effect. It is best to pin the versions right at the top of the notebook, where the dependencies are first installed.
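For example, a pinned install cell at the very top of the notebook, using the combination that ended up working here (treat the exact pins as an assumption for your own environment and CUDA build):

# versions taken from the working run above; adjust to your environment
!pip install -q transformers==4.41.2 accelerate==0.27.2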

Sigh, this stuff really is hard. Versions should not be such a big deal, but a beginner manages to run into every problem there is!
