1. 10 Dec, 2019 4 commits
    • share pretrained embeddings · ba089c78
      Rémi Louf authored
    • Add beam search · 9660ba1c
      Rémi Louf authored
    • load the pretrained weights for encoder-decoder · 1c71ecc8
      Rémi Louf authored
      We currently save the pretrained_weights of the encoder and decoder in
      two separate directories `encoder` and `decoder`. However, for the
      `from_pretrained` function to operate with automodels we need to
      specify the type of model in the path to the weights.
      
      The path to the encoder/decoder weights is handled by the
      `PreTrainedEncoderDecoder` class in the `save_pretrained` function. Since
      there is no easy way to infer the type of model that was initialized for
      the encoder and decoder we add a parameter `model_type` to the function.
      This is not an ideal solution as it is error prone, and the model type
      should be carried by the Model classes somehow.
      
      This is a temporary fix that should be changed before merging.
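      A minimal sketch of the layout this message describes, using a
      hypothetical class and directory names; the commit's actual
      implementation may differ:

      ```python
      import os


      class EncoderDecoderSketch:
          """Toy stand-in for `PreTrainedEncoderDecoder` (illustration only)."""

          def __init__(self, encoder, decoder):
              self.encoder = encoder
              self.decoder = decoder

          def save_pretrained(self, save_directory, model_type):
              # `model_type` (e.g. "bert") is the temporary, error-prone
              # parameter the message mentions: it is written into the
              # sub-directory names so that the auto classes can later pick
              # the right model class when loading with `from_pretrained`.
              encoder_dir = os.path.join(save_directory, f"{model_type}_encoder")
              decoder_dir = os.path.join(save_directory, f"{model_type}_decoder")
              os.makedirs(encoder_dir, exist_ok=True)
              os.makedirs(decoder_dir, exist_ok=True)
              self.encoder.save_pretrained(encoder_dir)
              self.decoder.save_pretrained(decoder_dir)
      ```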
    • update function to add special tokens · 07f4cd73
      Rémi Louf authored
      Since I started my PR, the `add_special_token_single_sequence` function
      has been deprecated in favor of another; I replaced it with the new one.
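      The replacement is not named here; the following is a hedged sketch of
      the new-style call, assuming the successor is the tokenizer's
      `build_inputs_with_special_tokens` method:

      ```python
      from transformers import BertTokenizer

      tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
      token_ids = tokenizer.convert_tokens_to_ids(tokenizer.tokenize("Hello world"))

      # Deprecated: tokenizer.add_special_token_single_sequence(token_ids)
      # New style: wrap the ids with the model's special tokens
      # ([CLS] ... [SEP] for BERT).
      input_ids = tokenizer.build_inputs_with_special_tokens(token_ids)
      ```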
  2. 09 Dec, 2019 7 commits
  3. 07 Dec, 2019 1 commit
  4. 06 Dec, 2019 7 commits
  5. 05 Dec, 2019 19 commits
  6. 04 Dec, 2019 2 commits