Incorporating word attention with convolutional neural networks for abstractive summarization
Authors | Chengzhe Yuan, Zhifeng Bao, Mark Sanderson, Yong Tang * |
Journal | World Wide Web (CCF B, CAS Tier 3) |
Status | 23(1), pages 267–287 (2020) |
Published | August 2019 |
Abstract |
Neural sequence-to-sequence (seq2seq) models have been widely used in abstractive summarization tasks. One of the challenges of this task is that redundant content in the input document often confuses the models and leads to poor performance. An efficient way to solve this problem is to select salient information from the input document. In this paper, we propose an approach that incorporates word attention with multilayer convolutional neural networks (CNNs) to extend a standard seq2seq model for abstractive summarization. First, by concentrating on a subset of source words while encoding an input sentence, word attention is able to extract informative keywords from the input, which gives us the ability to interpret generated summaries. Second, these keywords are further distilled by multilayer CNNs to capture the coarse-grained contextual features of the input sentence. Thus, the combined word attention and multilayer CNN modules provide a better-learned representation of the input document, which helps the model generate interpretable, coherent and informative summaries in an abstractive summarization task. We evaluate the effectiveness of our model on the English Gigaword and DUC2004 datasets and the Chinese summarization dataset LCSTS. Experimental results show the effectiveness of our approach. |
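The two components described in the abstract can be illustrated with a minimal NumPy sketch: word attention re-weights source word embeddings by softmax-normalized salience scores, and stacked 1-D convolutions then distill coarse-grained contextual features. All shapes, the dot-product scoring function, and the ReLU nonlinearity here are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(embeddings, query):
    # score each source word against a query vector (dot product),
    # then re-weight each word embedding by its attention weight
    weights = softmax(embeddings @ query)      # (T,)
    return embeddings * weights[:, None], weights

def conv1d_relu(x, kernel):
    # valid 1-D convolution along the time axis followed by ReLU;
    # kernel has shape (width, in_dim, out_dim)
    T, _ = x.shape
    k, _, out_dim = kernel.shape
    result = np.zeros((T - k + 1, out_dim))
    for t in range(T - k + 1):
        window = x[t:t + k]                    # (k, in_dim)
        result[t] = np.tensordot(window, kernel, axes=([0, 1], [0, 1]))
    return np.maximum(result, 0.0)

rng = np.random.default_rng(0)
emb = rng.normal(size=(6, 8))                  # 6 source words, embedding dim 8
query = rng.normal(size=8)                     # hypothetical salience query
weighted, w = word_attention(emb, query)       # attention weights sum to 1
k1 = rng.normal(size=(3, 8, 8)) * 0.1          # two stacked conv layers
k2 = rng.normal(size=(3, 8, 8)) * 0.1
feats = conv1d_relu(conv1d_relu(weighted, k1), k2)   # (2, 8) contextual features
```

The attention weights `w` indicate which source words the model treats as keywords, which is what gives the approach its interpretability; the convolution stack shrinks the sequence while widening each feature's receptive field.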
Keywords | Abstractive summarization · Word attention · Convolutional neural networks · Sequence-to-sequence model |
Link | https://doi.org/10.1007/s11280-019-00709-6 |
Funding | National Natural Science Foundation of China (61772211); Guangdong Province Special Project for Applied Science and Technology R&D (2016B010124008); Guangzhou Major Project of Industry-University-Research Collaborative Innovation (201704020203) |