Abstractive Text Summarization - Astrophysics Data System

Summarized by Plex Scholar
Last Updated: 02 July 2022

Leveraging Locality in Abstractive Text Summarization

Despite the success of neural attention models on natural language generation tasks, the quadratic memory complexity of the self-attention module with respect to input length limits their use in long text summarization. Rather than designing more sophisticated attention modules, we explore whether models with a restricted context can achieve competitive results compared with memory-efficient attention models that maintain a global context by treating the input as a single sequence.
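To make the memory trade-off concrete, here is a minimal sketch (not the paper's implementation; the window size and function names are illustrative) contrasting full self-attention, whose score matrix grows quadratically with sequence length, against a restricted-context variant whose memory grows only linearly:

```python
import numpy as np

def full_attention_scores(q, k):
    # Full self-attention compares every query with every key,
    # producing an (n x n) score matrix: memory is quadratic in n.
    return q @ k.T  # shape (n, n)

def local_attention_scores(q, k, window=4):
    # Restricted-context (windowed) attention: each position only
    # attends to keys inside a fixed window, so the score matrix
    # is (n x window) -- linear in n. Window size is illustrative.
    n, d = q.shape
    scores = np.full((n, window), -np.inf)
    for i in range(n):
        start = max(0, i - window // 2)
        end = min(n, start + window)
        scores[i, : end - start] = q[i] @ k[start:end].T
    return scores  # shape (n, window)

n, d = 16, 8
rng = np.random.default_rng(0)
q, k = rng.normal(size=(n, d)), rng.normal(size=(n, d))
print(full_attention_scores(q, k).shape)   # (16, 16): quadratic in n
print(local_attention_scores(q, k).shape)  # (16, 4): linear in n
```

The paper's question is whether a model limited to such a local context can match models that keep the full (n x n) global view.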

Source link: https://ui.adsabs.harvard.edu/abs/2022arXiv220512476L/abstract

* Please keep in mind that all text is summarized by machine; we bear no responsibility for its accuracy, and you should always check the original source before taking any action.