# Checkpoints
We provide links for you to download our checkpoints, including models pretrained and finetuned on different tasks. If you would like to use OFA with Transformers, please download the checkpoints from [https://huggingface.co/OFA-Sys](https://huggingface.co/OFA-Sys) and check the code in the `feature/add_transformers` branch.
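As a rough sketch, loading one of these checkpoints through the `feature/add_transformers` branch looks like the following. Note this assumes that branch is installed in place of stock `transformers` (which does not ship `OFATokenizer`/`OFAModel`), and `ckpt_dir` is a placeholder for a local directory or Hugging Face model ID you downloaded from the links above.

```python
def load_ofa(ckpt_dir: str):
    """Load an OFA tokenizer/model pair from a checkpoint directory.

    Assumes the `feature/add_transformers` branch of OFA is installed;
    OFATokenizer and OFAModel are provided by that branch, not by the
    stock Hugging Face `transformers` package.
    """
    from transformers import OFATokenizer, OFAModel

    tokenizer = OFATokenizer.from_pretrained(ckpt_dir)
    # use_cache=False mirrors the branch's example usage for generation
    model = OFAModel.from_pretrained(ckpt_dir, use_cache=False)
    return tokenizer, model
```

Calling `load_ofa("OFA-Sys/ofa-base")` (or a local checkpoint path) then gives you a tokenizer and model ready for the branch's generation examples.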
## Pretraining
* Pre-trained checkpoint (OFA-Huge) (~930M parameters)
* Pre-trained checkpoint (OFA-Large) (~470M parameters)
* Pre-trained checkpoint (OFA-Base) (~180M parameters)
* Pre-trained checkpoint (OFA-Medium) (~93M parameters)
* Pre-trained checkpoint (OFA-Tiny) (~33M parameters)
## Finetuning (OFA-Huge)
* Finetuned checkpoint for Caption on COCO
* Finetuned checkpoint for VQAv2
## Finetuning (OFA-Large)
* Finetuned checkpoint for Caption on COCO
* Finetuned checkpoint for Caption on COCO during Stage 1 finetuning
* Finetuned checkpoint for RefCOCO
* Finetuned checkpoint for RefCOCO+
* Finetuned checkpoint for RefCOCOg
* Finetuned checkpoint for VQAv2
* Finetuned checkpoint for SNLI-VE
* Finetuned checkpoint for Text-to-Image Generation on COCO, plus the CLIP checkpoint and VQGAN checkpoint
* Finetuned checkpoint for ImageNet-1K
* Finetuned checkpoint for Gigaword
## Finetuning (OFA-Base)
* Finetuned base checkpoint for Caption on COCO
* Finetuned base checkpoint for RefCOCO
* Finetuned base checkpoint for RefCOCO+
* Finetuned base checkpoint for RefCOCOg
* Finetuned base checkpoint for VQAv2
* Finetuned base checkpoint for SNLI-VE
* Finetuned base checkpoint for Text-to-Image Generation on COCO
## Pretrained Language Models
To follow our multimodal pretraining, we suggest using pretrained language models for initialization. Note that for the base-size and large-size models we directly use BART-base and BART-large, while for the other sizes we pretrained the tiny-size, medium-size, and huge-size OFA-based language models ourselves.
* Tiny-size encoder-decoder language model (OFA)
* Medium-size encoder-decoder language model (OFA)
* Base-size encoder-decoder language model (BART)
* Large-size encoder-decoder language model (BART)
* Huge-size encoder-decoder language model (OFA)