An ERROR I hit today while using GPT2. Looking at the source code, the problem is here:
# pytorch_transformers.tokenization_utils

class PreTrainedTokenizer(object):
    ......

    @property
    def pad_token(self):
        """ Padding token (string). Log an error if used while not having been set. """
        if self._pad_token is None:
            logger.error("Using pad_token, but it is not set yet.")
        return self._pad_token
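As a minimal reproduction (a sketch assuming the standard gpt2 checkpoint; the import path here is transformers, but pytorch_transformers behaves the same way):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

# GPT2's vocabulary ships eos/bos/unk tokens but no pad token, so
# _pad_token is None: the property logs the error and returns None.
print(tokenizer.pad_token)
# ERROR - transformers.tokenization_utils - Using pad_token, but it is not set yet.
# None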
There is a transformers issue about exactly this problem: #2648.
The reporter hit it in run_lm_finetuning.py, and julien-c changed the relevant code and pushed it in 6b4c3ee.
In theory the problem should not appear anymore, but I ran into it using AutoModelWithLMHead. Take a look at the GPT2Config:
{ "_num_labels": 2, "activation_function": "gelu_new", "architectures": [ "GPT2LMHeadModel" ], "attn_pdrop": 0.1, "bos_token_id": 50256, "decoder_start_token_id": null, "do_sample": false, "early_stopping": false, "embd_pdrop": 0.1, "eos_token_id": 50256, "finetuning_task": null, "id2label": { "0": "LABEL_0", "1": "LABEL_1" }, "initializer_range": 0.02, "is_decoder": false, "is_encoder_decoder": false, "label2id": { "LABEL_0": 0, "LABEL_1": 1 }, "layer_norm_epsilon": 1e-05, "length_penalty": 1.0, "max_length": 20, "min_length": 0, "model_type": "gpt2", "n_ctx": 1024, "n_embd": 768, "n_head": 12, "n_layer": 12, "n_positions": 1024, "no_repeat_ngram_size": 0, "num_beams": 1, "num_return_sequences": 1, "output_attentions": false, "output_hidden_states": false, "output_past": true, "pad_token_id": null, "prefix": null, "pruned_heads": {}, "repetition_penalty": 1.0, "resid_pdrop": 0.1, "summary_activation": null, "summary_first_dropout": 0.1, "summary_proj_to_labels": true, "summary_type": "cls_index", "summary_use_proj": true, "task_specific_params": { "text-generation": { "do_sample": true, "max_length": 50 } }, "temperature": 1.0, "top_k": 50, "top_p": 1.0, "torchscript": false, "use_bfloat16": false, "vocab_size": 50257 }
The problem should be here: "pad_token_id": null,
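You can confirm this directly from the config object (a quick check, again assuming the gpt2 checkpoint):

from transformers import GPT2Config

config = GPT2Config.from_pretrained("gpt2")

# Matches the JSON above: GPT2 ships without a pad token id,
# while bos/eos both map to 50256 (<|endoftext|>).
print(config.pad_token_id)  # None
print(config.eos_token_id)  # 50256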
The fix is to add one line after initializing the tokenizer; pad_token can be set to eos_token or any other token you like:
from transformers import AutoModelWithLMHead, AutoTokenizer

model_name = "gpt2"  # or any other GPT2-style checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Without the next line: ERROR - transformers.tokenization_utils - Using pad_token, but it is not set yet.
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelWithLMHead.from_pretrained(model_name)
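With pad_token set, batch encoding with padding no longer triggers the error. A sketch, assuming a transformers version new enough to call the tokenizer directly with padding=True (older versions use batch_encode_plus instead):

# Sequences of different lengths can now be padded into one batch.
batch = tokenizer(["short text", "a somewhat longer text"],
                  padding=True, return_tensors="pt")
print(batch["input_ids"].shape)

# Optionally keep the model config consistent with the tokenizer,
# e.g. so generate() knows which id is padding.
model.config.pad_token_id = tokenizer.pad_token_id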
Thanks: