Fine-tuned weights (converted from the [original fairseq version repo](https://github.com/microsoft/ProphetNet)) for [ProphetNet](https://arxiv.org/abs/2001.04063) on the CNN/DailyMail summarization task.
ProphetNet is a new pre-trained language model for sequence-to-sequence learning with a novel self-supervised objective called future n-gram prediction.
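
Below is a minimal usage sketch with the Hugging Face Transformers ProphetNet classes. The checkpoint identifier, the example article, and the generation settings are assumptions for illustration; the comment at the end shows the expected output reported for the original example.

```python
from transformers import ProphetNetForConditionalGeneration, ProphetNetTokenizer

# Checkpoint identifier assumed here; substitute the name under which this converted checkpoint is published.
checkpoint = "microsoft/prophetnet-large-uncased-cnndm"
model = ProphetNetForConditionalGeneration.from_pretrained(checkpoint)
tokenizer = ProphetNetTokenizer.from_pretrained(checkpoint)

# Illustrative input article (lowercased, since the checkpoint is uncased).
ARTICLE_TO_SUMMARIZE = (
    "USTC was founded in Beijing by the Chinese Academy of Sciences (CAS) in September 1958. "
    "USTC's founding mission was to develop a high-level science and technology workforce, "
    "as deemed critical for the development of China's economy, defense, and science and "
    "technology education. The establishment was hailed as 'A Major Event in the History of "
    "Chinese Education and Science'."
).lower()
inputs = tokenizer([ARTICLE_TO_SUMMARIZE], max_length=512, truncation=True, return_tensors="pt")

# Summarize with beam search and decode the result back to text.
summary_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=100, early_stopping=True)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])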
# should give: 'ustc was founded in beijing by the chinese academy of sciences in 1958. [X_SEP] ustc\'s mission was to develop a high-level science and technology workforce. [X_SEP] the establishment was hailed as " a major event in the history of chinese education and science "'
```
Here, [X_SEP] is used as a special token to separate sentences.
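
Since [X_SEP] survives decoding (as in the expected output above), the summary can be split into individual sentences with ordinary string handling. A small sketch continuing from the example above; `summary_ids` and `tokenizer` are the variables defined there.

```python
# Split the decoded summary into sentences on the [X_SEP] marker.
summary = tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0]
sentences = [s.strip() for s in summary.split("[X_SEP]") if s.strip()]
print(sentences)
```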