
Text Summarization Models

@daden-ms released this on 30 Mar 15:09 · commit 21a6e09

Text Summarization

In this release, we support both abstractive and extractive text summarization.

New Model: UniLM

UniLM is a state-of-the-art model developed by Microsoft Research Asia (MSRA). The model is pre-trained on a large unlabeled natural-language corpus (English Wikipedia and BookCorpus) and can be fine-tuned on different types of labeled data for various NLP tasks such as text classification and abstractive summarization.

Supported Models

  • unilm-large-cased
  • unilm-base-cased
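
As a quick illustration, the sketch below shows how one of these checkpoints might be fine-tuned and used for abstractive summarization with this repository's sequence-to-sequence utilities. The module path, class names (S2SAbsSumProcessor, S2SAbstractiveSummarizer), and method signatures are assumptions drawn from the example notebooks; treat this as a sketch under those assumptions rather than the definitive API.

```python
# Minimal sketch of fine-tuning UniLM for abstractive summarization.
# The utils_nlp classes and their signatures below are assumptions based
# on this repository's example notebooks; consult the examples/ folder
# for the exact, up-to-date API.
from utils_nlp.models.transformers.abstractive_summarization_seq2seq import (
    S2SAbsSumProcessor,
    S2SAbstractiveSummarizer,
)

# Toy data: each example pairs a source document with a reference summary.
train_data = [
    {"src": "The quick brown fox jumps over the lazy dog near the river bank.",
     "tgt": "A fox jumps over a dog."},
]

processor = S2SAbsSumProcessor(model_name="unilm-base-cased")
train_dataset = processor.s2s_dataset_from_json_or_file(train_data, train_mode=True)

summarizer = S2SAbstractiveSummarizer(model_name="unilm-base-cased")
summarizer.fit(train_dataset, per_gpu_batch_size=4, max_steps=1000)

# Generate a summary for an unseen document.
test_dataset = processor.s2s_dataset_from_json_or_file(
    [{"src": "Another document that we would like to summarize."}], train_mode=False
)
print(summarizer.predict(test_dataset))
```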

For more information about UniLM, please refer to the paper Unified Language Model Pre-training for Natural Language Understanding and Generation and the official repository at https://github.com/microsoft/unilm.

Thanks to the UniLM team (Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, and Hsiao-Wuen Hon) for their great work and for supporting this integration.

New Model: BERTSum

BERTSum is an encoder architecture designed for text summarization. It can be used together with different decoders to support both extractive and abstractive summarization.

Supported Models

  • bert-base-uncased (extractive and abstractive)
  • distilbert-base-uncased (extractive)
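
For illustration, here is a minimal sketch of extractive fine-tuning and prediction with the repository's BERTSum utilities. The class names (ExtSumProcessor, ExtractiveSummarizer) and method signatures are assumptions based on the example notebooks, not a guaranteed API.

```python
# Minimal sketch of extractive summarization with BERTSum; class names and
# signatures are assumptions based on this repository's example notebooks.
from utils_nlp.models.transformers.extractive_summarization import (
    ExtSumProcessor,
    ExtractiveSummarizer,
)

# Documents are lists of pre-split sentences; reference summaries supply
# the sentence-selection labels during preprocessing.
train_docs = [["Sentence one of the article.", "Sentence two.", "Sentence three."]]
train_summaries = [["Sentence one of the article."]]

processor = ExtSumProcessor(model_name="distilbert-base-uncased")
train_dataset = processor.preprocess(train_docs, train_summaries)

summarizer = ExtractiveSummarizer(processor, model_name="distilbert-base-uncased")
summarizer.fit(train_dataset, num_gpus=1, batch_size=16)

# Select the top-scoring sentences of an unseen document as its summary.
test_dataset = processor.preprocess([["An unseen article.", "With two sentences."]])
print(summarizer.predict(test_dataset, num_gpus=1))
```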

Thanks to the original BERTSum authors, Yang Liu and Mirella Lapata (Text Summarization with Pretrained Encoders), for their great contribution.

All model implementations support distributed training and multi-GPU inference. For abstractive summarization, we also support mixed-precision training and inference.
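
Mixed-precision training runs most of the forward and backward pass in fp16 while scaling the loss so that small gradients do not underflow. The generic PyTorch sketch below illustrates that pattern with torch.cuda.amp; it is not this repository's internal implementation, which may rely on a different mechanism (such as NVIDIA Apex) behind a configuration option.

```python
# Generic PyTorch mixed-precision training step using torch.cuda.amp.
# Illustrative only; the summarizers in this repository wrap the
# equivalent logic internally.
import torch
from torch.cuda.amp import GradScaler, autocast

model = torch.nn.Linear(768, 2).cuda()          # stand-in for a summarization model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
scaler = GradScaler()

inputs = torch.randn(8, 768, device="cuda")     # dummy batch
labels = torch.randint(0, 2, (8,), device="cuda")

optimizer.zero_grad()
with autocast():                                # forward pass in an fp16/fp32 mix
    loss = torch.nn.functional.cross_entropy(model(inputs), labels)
scaler.scale(loss).backward()                   # scale the loss to avoid fp16 underflow
scaler.step(optimizer)                          # unscale gradients, then take the step
scaler.update()                                 # adjust the scale factor for the next step
```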