BERTSUM, a simple variant of BERT, achieves state-of-the-art performance on extractive summarization over the CNN/Dailymail dataset, improving ROUGE-L by 1.65 over the previous best system.
BERT, a pre-trained Transformer model, has achieved ground-breaking performance on multiple NLP tasks. In this paper, we describe BERTSUM, a simple variant of BERT, for extractive summarization. Our system is the state of the art on the CNN/Dailymail dataset, outperforming the previous best-performing system by 1.65 on ROUGE-L. The code to reproduce our results is available at https://github.com/nlpyang/BertSum
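For readers unfamiliar with the metric cited above, ROUGE-L scores a candidate summary against a reference by the length of their longest common subsequence (LCS) of tokens. The following is a minimal illustrative sketch of an LCS-based ROUGE-L F1, not the official scorer used in the paper; the function names and the whitespace tokenization are simplifications chosen here.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of token lists a and b."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, tok_a in enumerate(a, 1):
        for j, tok_b in enumerate(b, 1):
            if tok_a == tok_b:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]


def rouge_l_f1(candidate, reference):
    """ROUGE-L F1: harmonic mean of LCS-based precision and recall.

    Tokenization is naive whitespace splitting for illustration only.
    """
    cand, ref = candidate.split(), reference.split()
    lcs = lcs_length(cand, ref)
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)
    recall = lcs / len(ref)
    return 2 * precision * recall / (precision + recall)


# LCS is "the cat on the mat" (5 tokens), so P = R = 5/6.
print(round(rouge_l_f1("the cat sat on the mat",
                       "the cat lay on the mat"), 3))
```

A gain of 1.65 on this metric means the LCS overlap with reference summaries improved by 1.65 percentage points on average across the test set.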