
Topic-Based Paper Roundup 4: Long Document Summarization (continuously updated...)


诸神缄默不语 - Personal CSDN Blog Post Index

(This post was originally a collection on single-document summarization, but I felt such a broad title served no purpose, so I repurposed it for long document summarization.
If any of my earlier posts still have links or titles that were not updated accordingly, please contact me so I can fix them.)
I have covered related material in several earlier posts; I will gradually decouple that content and migrate it here.

Last updated: 2023.5.9
First published: 2023.5.9

1. Extractive

  1. (2019) Exploiting Discourse-Level Segmentation for Extractive Summarization: discourse-level segmentation + an adapted contextual representation model (RNN or BERT); see the sketch after this list
  2. (2020) Discourse-Aware Neural Extractive Text Summarization
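
To make the shared pattern concrete, here is a minimal extractive sketch (my own illustration, not any of the above papers' actual architectures): segment the document, embed each segment with a contextual encoder, and select the top-scoring segments. The `encode` function is a hypothetical stand-in for an RNN or BERT encoder.

```python
# Minimal extractive sketch (my own illustration, NOT the models from the
# papers above): embed each discourse segment, score it against the whole
# document, and keep the top-k segments in original order.
from typing import Callable, List
import numpy as np

def extractive_summary(
    segments: List[str],
    encode: Callable[[str], np.ndarray],  # hypothetical encoder, e.g. an RNN/BERT wrapper
    k: int = 3,
) -> List[str]:
    doc_vec = encode(" ".join(segments))  # document-level representation
    scores = []
    for seg in segments:
        v = encode(seg)                   # segment-level representation
        # Cosine similarity between the segment and the whole document.
        scores.append(float(v @ doc_vec) /
                      (np.linalg.norm(v) * np.linalg.norm(doc_vec) + 1e-8))
    top = sorted(range(len(segments)), key=lambda i: scores[i], reverse=True)[:k]
    return [segments[i] for i in sorted(top)]  # preserve document order
```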

2. Extractive (haven't looked into the specifics yet)

  1. (2021) Globalizing BERT-based Transformer Architectures for Long Document Summarization
  2. (2021) Sliding Selector Network with Dynamic Memory for Extractive Summarization of Long Documents

3. Abstractive - divide and conquer (see the sketch after this list)

  1. (2018) Deep Communicating Agents for Abstractive Summarization: reinforcement learning; separate agents each process one subsection and exchange information with one another
  2. (2020) A Divide-and-Conquer Approach to the Summarization of Long Documents
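
The divide-and-conquer idea these papers share: split the long document into chunks that fit the model's input limit, summarize each chunk independently, then combine the partial summaries. A minimal sketch, assuming a hypothetical `summarize_chunk` function:

```python
# Divide-and-conquer sketch (my own illustration; summarize_chunk is a
# hypothetical stand-in for any fixed-input-length seq2seq summarizer).
from typing import Callable, List

def split_into_chunks(tokens: List[str], max_len: int) -> List[List[str]]:
    """Split a token sequence into consecutive chunks of at most max_len tokens."""
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]

def divide_and_conquer_summary(
    tokens: List[str],
    summarize_chunk: Callable[[str], str],  # hypothetical short-input summarizer
    max_len: int = 512,
) -> str:
    # Summarize each chunk independently ("divide"), then combine ("conquer").
    partial = [summarize_chunk(" ".join(c)) for c in split_into_chunks(tokens, max_len)]
    combined = " ".join(partial)
    # If the concatenated partial summaries are still too long, summarize again.
    if len(combined.split()) > max_len:
        return summarize_chunk(combined)
    return combined
```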

4. Extract-then-abstract (see the sketch after this list)

  1. (2021) Long Document Summarization in a Low Resource Setting using Pretrained Language Models
  2. (2021) Long-Span Summarization via Local Attention and Content Selection
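
The shared pipeline here is a two-stage composition: an extractive content selector first trims the long input down to the abstractive model's budget, then an abstractive model rewrites the selection. A minimal sketch with a simple frequency-based selector (my own illustration; both components are hypothetical stand-ins, not the selectors used in the papers above):

```python
# Extract-then-abstract pipeline sketch (my own illustration; the frequency
# scoring and the `abstract` model wrapper are hypothetical stand-ins).
from collections import Counter
from typing import Callable, List

def select_salient(sentences: List[str], budget: int) -> List[str]:
    """Greedy content selection: rank sentences by document-level word frequency."""
    freq = Counter(w.lower() for s in sentences for w in s.split())
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: sum(freq[w.lower()] for w in sentences[i].split()),
        reverse=True,
    )
    chosen, used = [], 0
    for i in ranked:
        n = len(sentences[i].split())
        if used + n <= budget:  # stop adding once the token budget is full
            chosen.append(i)
            used += n
    return [sentences[i] for i in sorted(chosen)]  # restore document order

def extract_then_abstract(
    sentences: List[str],
    abstract: Callable[[str], str],  # hypothetical abstractive summarizer
    budget: int = 512,
) -> str:
    # Stage 1: extract salient content within the abstractive model's input budget.
    # Stage 2: rewrite the selected content abstractively.
    return abstract(" ".join(select_salient(sentences, budget)))
```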

5. Abstractive (haven't looked into the specifics yet)

  1. (2021) Hierarchical Learning for Generation with Long Source Sequences
  2. (2021) Efficient Attentions for Long Document Summarization
  3. (2022) Long Document Summarization with Top-Down and Bottom-Up Representation Inference