1. 10 Dec, 2019 7 commits
    • default output dir to documents dir · 3a9a9f78
      Rémi Louf authored
    • update the docs · 693606a7
      Rémi Louf authored
    • give transformers API to BertAbs · 2403a665
      Rémi Louf authored
    • share pretrained embeddings · ba089c78
      Rémi Louf authored
    • Add beam search · 9660ba1c
      Rémi Louf authored
    • load the pretrained weights for encoder-decoder · 1c71ecc8
      Rémi Louf authored
      We currently save the pretrained weights of the encoder and decoder in
      two separate directories, `encoder` and `decoder`. However, for the
      `from_pretrained` function to work with automodels we need to specify
      the type of model in the path to the weights.

      The path to the encoder/decoder weights is handled by the
      `PreTrainedEncoderDecoder` class in the `save_pretrained` function.
      Since there is no easy way to infer the type of model that was
      initialized for the encoder and decoder, we add a `model_type`
      parameter to the function. This is not an ideal solution as it is
      error-prone; the model type should be carried by the model classes
      somehow.
      
      This is a temporary fix that should be changed before merging.
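      A minimal sketch of the workaround described in this message; the
      attribute layout and the directory-naming scheme are illustrative
      assumptions, not the actual implementation:

          import os

          class PreTrainedEncoderDecoder:
              def __init__(self, encoder, decoder):
                  self.encoder = encoder
                  self.decoder = decoder

              def save_pretrained(self, save_directory, model_type):
                  # Embed `model_type` (e.g. "bert") in the directory names
                  # so that automodels can later infer the architecture from
                  # the path to the weights. Error-prone: nothing checks that
                  # `model_type` matches the classes that were initialized.
                  encoder_dir = os.path.join(save_directory, model_type + "_encoder")
                  decoder_dir = os.path.join(save_directory, model_type + "_decoder")
                  os.makedirs(encoder_dir, exist_ok=True)
                  os.makedirs(decoder_dir, exist_ok=True)
                  self.encoder.save_pretrained(encoder_dir)
                  self.decoder.save_pretrained(decoder_dir)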
    • update function to add special tokens · 07f4cd73
      Rémi Louf authored
      Since I started my PR, the `add_special_token_single_sequence` function
      has been deprecated in favor of another; I replaced it with the new
      function.
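      The message does not name the replacement; as an assumption, the sketch
      below uses `build_inputs_with_special_tokens`, the tokenizer method
      that took over this role in later transformers releases:

          from transformers import BertTokenizer

          tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
          ids = tokenizer.convert_tokens_to_ids(tokenizer.tokenize("Hello world"))

          # Deprecated call this commit removes:
          #   tokenizer.add_special_token_single_sequence(ids)

          # Assumed replacement: wraps a single sequence with the model's
          # special tokens, e.g. [CLS] ... [SEP] for BERT.
          ids_with_special = tokenizer.build_inputs_with_special_tokens(ids)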
2. 09 Dec, 2019 6 commits
3. 05 Dec, 2019 5 commits
4. 04 Dec, 2019 2 commits
5. 03 Dec, 2019 20 commits