
Fine-Tuning the T5 Transformer for Any Summarization Task

Introduction

I am amazed by the power of the T5 transformer model! T5, which stands for Text-to-Text Transfer Transformer, makes it easy to fine-tune a transformer model on any text-to-text task. Any NLP task, even a classification task, can be framed as an input-text to output-text problem.

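The text-to-text framing can be sketched in a few lines: every task gets a task prefix on the input, and the label (even a class label) is expressed as a plain string target. The helper and example records below are illustrative, not part of any library; the prefixes mirror the style T5 uses.

```python
# Sketch: any NLP task can be cast as text-to-text by prefixing the input
# with a task name and expressing the target (even a class label) as text.

def to_text_to_text(task: str, source: str, target: str) -> dict:
    """Frame an arbitrary (input, label) pair as a text-to-text example."""
    return {"input_text": f"{task}: {source}", "target_text": target}

# A summarization example: the target is the reference summary.
summ = to_text_to_text(
    "summarize",
    "The quick brown fox jumped over the lazy dog.",
    "A fox jumped over a dog.",
)

# A sentiment-classification example: the class label itself becomes text.
clf = to_text_to_text("sst2 sentence", "this movie was great", "positive")

print(summ["input_text"])   # summarize: The quick brown fox jumped over the lazy dog.
print(clf["target_text"])   # positive
```

Because both tasks now have the same (input string, target string) shape, a single seq2seq model and loss can handle either one.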

In this blog, I show how you can fine-tune this model on any data set you have. In particular, I demo how this can be done on summarization data sets. I have personally tested this on the CNN/Daily Mail and WikiHow data sets. The code is publicly available on my Github here.

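The core of the fine-tuning loop is small: tokenize the article as the input, tokenize the reference summary as the labels, and let the model compute the seq2seq cross-entropy loss. Below is a minimal single-step sketch, assuming the `transformers`, `torch`, and `sentencepiece` packages are installed; the article/summary pair is made up, and a real run would iterate over a DataLoader of such pairs.

```python
# Hedged sketch: one gradient step of fine-tuning t5-small on a single
# (article, summary) pair. A real run loops over batches for several epochs.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
model.train()

article = ("summarize: Make sure all cables are disconnected from the back "
           "of the console before opening it up.")
summary = "Disconnect the cables first."

inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(summary, return_tensors="pt", truncation=True, max_length=150).input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
optimizer.zero_grad()

# T5 computes the cross-entropy loss internally when labels are supplied.
outputs = model(input_ids=inputs.input_ids,
                attention_mask=inputs.attention_mask,
                labels=labels)
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))
```

The same code works unchanged for CNN/Daily Mail or WikiHow: only the (article, summary) pairs fed in differ, which is exactly the point of the text-to-text framing.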

T5-small trained on WikiHow writes amazing summaries. See the snippet below of the actual text, actual summary, and predicted summary. This model is also available on the HuggingFace Transformers model hub here. The link provides a convenient way to test the model on input texts, as well as a JSON endpoint.

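Generating a predicted summary from a hub-hosted checkpoint takes only a few lines. The sketch below uses the base `t5-small` as a stand-in model id; to reproduce the WikiHow summaries, swap in the fine-tuned checkpoint id from the model hub link above. The decoding parameters (beam count, length limits) are illustrative choices, not the blog's exact settings.

```python
# Hedged inference sketch: summarize a text with a T5 checkpoint from the
# HuggingFace model hub. "t5-small" is a stand-in; substitute the
# fine-tuned WikiHow checkpoint id to get WikiHow-style summaries.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
model.eval()

text = ("Make sure you've got all the cables disconnected from the back of "
        "your console, especially the power cord.")
inputs = tokenizer("summarize: " + text, return_tensors="pt",
                   truncation=True, max_length=512)

with torch.no_grad():
    ids = model.generate(inputs.input_ids,
                         attention_mask=inputs.attention_mask,
                         max_length=150,
                         num_beams=2,
                         early_stopping=True)
summary = tokenizer.decode(ids[0], skip_special_tokens=True)
print(summary)
```

The hosted inference widget on the model hub page runs essentially this pipeline behind its JSON endpoint, which is why pasting text into the link above gives the same kind of output.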

WikiHow Text: Make sure you've got all the cables disconnected from the back of your console, especially the power cord., You'll need the straight end to be about 2-3 inches long. You will need a large size paper clip for this method because it will need to go in abo