This is the RAG-Sequence model of the paper [Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks](https://arxiv.org/pdf/2005.11401.pdf)
by Patrick Lewis, Ethan Perez, Aleksandra Piktus et al.
## Usage:
The model is an *uncased* model, which means that capital letters are simply converted to lower-case letters.
The model consists of a *question_encoder*, a *retriever* and a *generator*. The retriever extracts relevant passages from the *wiki_dpr* `train` dataset, which is linked above.
The question_encoder and generator are based on `facebook/dpr-question_encoder-single-nq-base` and `facebook/bart-large`, which were jointly finetuned
on the *wiki_dpr* QA dataset in an end-to-end fashion.
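A minimal usage sketch with the 🤗 Transformers library is shown below. The model identifier `facebook/rag-sequence-nq` and the example question are assumptions for illustration; `use_dummy_dataset=True` loads a small dummy retrieval index instead of the full *wiki_dpr* index, so answers will differ from a full-index setup.

```python
from transformers import RagTokenizer, RagRetriever, RagSequenceForGeneration

# Load the joint tokenizer (question encoder + generator tokenizers).
tokenizer = RagTokenizer.from_pretrained("facebook/rag-sequence-nq")

# use_dummy_dataset=True downloads a tiny dummy index instead of the
# full wiki_dpr index (which is very large).
retriever = RagRetriever.from_pretrained(
    "facebook/rag-sequence-nq", index_name="exact", use_dummy_dataset=True
)

model = RagSequenceForGeneration.from_pretrained(
    "facebook/rag-sequence-nq", retriever=retriever
)

# The model is uncased; the tokenizer handles lower-casing internally.
input_dict = tokenizer("who holds the record in 100m freestyle", return_tensors="pt")
generated = model.generate(input_ids=input_dict["input_ids"])
answer = tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
print(answer)
```

Because generation marginalizes over the retrieved passages per output sequence (the RAG-Sequence formulation), the quality of the answer depends directly on which index the retriever queries.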