Leaderboard for Du et al 2017 split of SQuAD

| # | Model | Paper | Authors | Affiliation | Links | BLEU | Verified |
|---|-------|-------|---------|-------------|-------|------|----------|
| 1 | UniLM | Unified Language Model Pre-training for Natural Language Understanding and Generation | Dong et al. (2019) | Microsoft | Paper, Code | 22.78 | |
| 2 | QPP&QAP | Addressing Semantic Drift in Question Generation for Semi-Supervised Question Answering | Zhang and Bansal (2019) | UNC Chapel Hill | Paper, Code | 18.37 | |
| 3 | G2S+BERT+RL | Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation | Chen et al. (2019) | RPI/IBM | Paper | 17.94 | |
| 4 | TTPC | Improving Question Generation With to the Point Context | Li et al. (2019) | CUHK | Paper | 16.27 | |
| 5 | ASs2s | Improving Neural Question Generation using Answer Separation | Kim et al. (2018) | SNU | Paper | 16.20 | |
| 6 | Seq2Seq+copy+attn+coref | Harvesting Paragraph-level Question-Answer Pairs from Wikipedia | Du et al. (2018) | Cornell | Paper | 15.16 | |
| 7 | M2S+cp | Leveraging Context Information for Natural Question Generation | Song et al. (2018) | University of Rochester/IBM | Paper | 13.98 | |
| 8 | Seq2Seq+copy+attn | An RNN Seq2Seq architecture with attention and copy mechanism | Tom Hosking (2019) | UCL | Paper, Code | 13.50 | |
| 9 | Seq2Seq | Learning to Ask: Neural Question Generation for Reading Comprehension | Du et al. (2017) | Cornell | Paper | 12.28 | |
| 10 | PCFG-Trans | Question Generation via Overgenerating Transformations and Ranking | Heilman and Smith (2011) | CMU | Paper, Code | 9.12 | |
| 11 | Baseline - sentence | Take the sentence that the answer span falls within as the question | - (2019) | - | Code | 4.93 | |
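The sentence baseline (entry 11) can be sketched in a few lines: given a passage and the character offset of the answer span, it simply returns the sentence containing that offset as the "question". The naive period-based sentence splitter below is an assumption for illustration; the actual baseline implementation may segment sentences differently.

```python
def sentence_baseline(passage, answer_start):
    """Return the sentence of `passage` covering character offset
    `answer_start` (the start of the answer span).

    Sketch of leaderboard entry 11; splitting on ". " is a simplifying
    assumption, not the reference implementation.
    """
    pos = 0
    for sentence in passage.split(". "):
        end = pos + len(sentence) + 2  # account for the ". " delimiter
        if answer_start < end:
            return sentence
        pos = end
    return passage  # fallback: offset past the last delimiter

passage = "The war ended in 1945. Many cities were rebuilt afterwards."
print(sentence_baseline(passage, passage.index("1945")))  # prints the first sentence
```

That this trivial strategy still scores 4.93 BLEU shows how much lexical overlap a question shares with its source sentence.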
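The systems above are ranked by BLEU against the reference questions. As a rough illustration of the metric, here is a minimal corpus-level BLEU-4 sketch with clipped n-gram precisions and a brevity penalty; it assumes one reference per candidate and omits smoothing and tokenization details, whereas leaderboards typically use reference scripts such as multi-bleu.perl.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(candidates, references, max_n=4):
    """Simplified corpus-level BLEU-4: one reference per candidate,
    no smoothing. A sketch of the metric, not the reference scorer."""
    log_prec_sum = 0.0
    cand_len = sum(len(c) for c in candidates)
    ref_len = sum(len(r) for r in references)
    for n in range(1, max_n + 1):
        clipped = total = 0
        for cand, ref in zip(candidates, references):
            cand_counts, ref_counts = ngrams(cand, n), ngrams(ref, n)
            # clip each candidate n-gram count by its count in the reference
            clipped += sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
            total += sum(cand_counts.values())
        if clipped == 0:
            return 0.0  # no smoothing in this sketch
        log_prec_sum += (1.0 / max_n) * math.log(clipped / total)
    # brevity penalty for candidates shorter than the references
    bp = 1.0 if cand_len > ref_len else math.exp(1 - ref_len / cand_len)
    return bp * math.exp(log_prec_sum)

cand = ["what year did the war end".split()]
ref = ["what year did the war end".split()]
print(round(corpus_bleu(cand, ref), 2))  # prints 1.0 for identical strings
```

Scores in the table are conventionally reported as BLEU x 100, so 22.78 corresponds to a value of 0.2278 from this function.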

Is your model missing? Submit it!