References:
https://github.com/zhusleep/pytorch_chinese_lm_pretrain
https://github.com/bojone/bert4keras/tree/master/pretraining
https://github.com/xv44586/toolkit4nlp/tree/master/pretraining
1. Clone the repo (it includes sample data): https://github.com/zhusleep/pytorch_chinese_lm_pretrain
2. Install the transformers and pytorch dependencies
3. Run the script to continue pretraining:
python run_language_model_bert.py --output_dir=output --model_type=bert --model_name_or_path=bert-base-chinese --do_train --train_data_file=train.txt --do_eval --eval_data_file=eval.txt --mlm --per_device_train_batch_size=4
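The `--mlm` flag above turns on masked-language-model training: a fraction of input tokens is masked and the model is trained to predict the originals. A minimal sketch of BERT-style masking, assuming the standard 15% selection rate and 80/10/10 replacement rule (this is an illustration of the idea, not the actual code inside `run_language_model_bert.py`; the `[MASK]` id 103 and vocab size 21128 match `bert-base-chinese`):

```python
import random

def mask_tokens(token_ids, mask_id, vocab_size, mlm_prob=0.15, seed=0):
    """BERT-style MLM masking.

    Each position is selected with probability mlm_prob. Of the selected
    positions: 80% become [MASK], 10% become a random token, 10% stay
    unchanged. Labels are the original token at selected positions and
    -100 elsewhere (the value the cross-entropy loss ignores).
    """
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(inputs)
    for i, tok in enumerate(token_ids):
        if rng.random() < mlm_prob:
            labels[i] = tok  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_id                     # 80%: [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)   # 10%: random token
            # remaining 10%: keep the original token
    return inputs, labels

# Example with dummy token ids; 103 = [MASK], 21128 = bert-base-chinese vocab size
ids = list(range(100, 130))
masked, labels = mask_tokens(ids, mask_id=103, vocab_size=21128)
```

In the real script this logic is applied per batch on the fly, so the model sees a different masking of the same corpus every epoch.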