| # | Model | Author | Paper | Code | BLEU | Verified |
|---|---|---|---|---|---|---|
| 1 | UniLM v2<br>*UNILMv2: Pseudo-Masked Language Models for Unified Language Model Pre-Training* | Bao et al. (2020), Microsoft | Paper | Code | 24.43 | |
| 2 | UniLM<br>*Unified Language Model Pre-training for Natural Language Understanding and Generation* | Dong et al. (2019), Microsoft | Paper | Code | 22.78 | |
| 3 | CopyBERT<br>*CopyBERT: A Unified Approach to Question Generation with Self-Attention* | Varanasi et al. (2020), DFKI | Paper | | 22.71 | |
| 4 | BART<br>*Numbers not reported in paper* | Lewis et al. (2020) | Paper | | 21.53 | |
| 5 | QPP&QAP<br>*Addressing Semantic Drift in Question Generation for Semi-Supervised Question Answering* | Zhang and Bansal (2019), UNC Chapel Hill | Paper | Code | 18.37 | |
| 6 | G2S +BERT+RL<br>*Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation* | Chen et al. (2019), RPI/IBM | Paper | | 17.94 | |
| 7 | TTPC<br>*Improving Question Generation With to the Point Context* | Li et al. (2019), CUHK | Paper | | 16.27 | |
| 8 | ASs2s<br>*Improving Neural Question Generation using Answer Separation* | Kim et al. (2018), SNU | Paper | | 16.20 | |
| 9 | Seq2Seq +copy +attn +coref<br>*Harvesting Paragraph-level Question-Answer Pairs from Wikipedia* | Du et al. (2018), Cornell | Paper | | 15.16 | |
| 10 | M2S+cp<br>*Leveraging Context Information for Natural Question Generation* | Song et al. (2018), University of Rochester/IBM | Paper | | 13.98 | |
| 11 | Seq2Seq +copy +attn<br>*An RNN Seq2Seq architecture with attention and a copy mechanism* | Tom Hosking (2019), UCL | Paper | Code | 13.50 | |
| 12 | Seq2Seq<br>*Learning to Ask: Neural Question Generation for Reading Comprehension* | Du et al. (2017), Cornell | Paper | | 12.28 | |
| 13 | PCFG-Trans<br>*Question Generation via Overgenerating Transformations and Ranking* | Heilman and Smith (2011), CMU | Paper | Code | 9.12 | |
| 14 | Baseline - sentence<br>*Take the sentence that the answer span falls within as the question (sketched below)* | - (2019) | | Code | 4.93 | |
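
For reference, here is a minimal sketch of the rank-14 sentence baseline together with BLEU scoring. It assumes SQuAD-style fields (`context`, `answer_start`, `question`), a naive regex sentence splitter, and NLTK's default BLEU-4 with smoothing; the leaderboard's exact evaluation scripts and tokenization are not specified here, so scores from this sketch will not match the table exactly.

```python
# Sketch of the "Baseline - sentence" entry: predict, as the "question",
# the sentence of the context that contains the answer span, then score
# predictions against reference questions with corpus-level BLEU.
import re
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction


def sentence_baseline(context: str, answer_start: int) -> str:
    """Return the sentence of `context` covering character offset
    `answer_start`, using a naive period/question/exclamation split."""
    pos = 0
    for sentence in re.split(r"(?<=[.!?])\s+", context):
        end = pos + len(sentence)
        if pos <= answer_start < end:
            return sentence
        pos = end + 1  # approximate: assumes one whitespace char was consumed
    return context  # fall back to the whole paragraph


# Toy SQuAD-style example; a real run would iterate over the dev set.
examples = [
    {
        "context": "Paris is the capital of France. It hosted the 1900 Olympics.",
        "answer_start": 0,  # the answer span "Paris" starts at offset 0
        "question": "What is the capital of France?",
    },
]

references = [[ex["question"].lower().split()] for ex in examples]
hypotheses = [
    sentence_baseline(ex["context"], ex["answer_start"]).lower().split()
    for ex in examples
]

# Smoothing avoids zero scores when a higher-order n-gram never matches.
bleu = corpus_bleu(references, hypotheses,
                   smoothing_function=SmoothingFunction().method1)
print(f"BLEU: {100 * bleu:.2f}")
```
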
Is your model missing? Submit it!