1. 05 Jan, 2021 5 commits
    • Stas Bekman's avatar
      [trainer] --model_parallel hasn't been implemented for most models (#9347) · 748006c0
      Stas Bekman authored
      * --model_parallel hasn't been implemented for most models
      
      * make the help clear as well
      
      * implement is_parallelizable; use it
      
      * oops
      
      * remove property
      748006c0
    • Julien Plu's avatar
      Use stable functions (#9369) · 4225740a
      Julien Plu authored
      4225740a
    • Stas Bekman's avatar
      [logging] autoflush (#9385) · 4aa8f6ad
      Stas Bekman authored
      This PR proposes to:
      
      * auto-flush `transformers` logging 
      
When logging is used to trace signals from different parts of the code, and that output may be interleaved with print-debug messages, auto-flushing helps keep all the logging events synchronized.
      
      I don't think this change will introduce any performance impacts.
      
If it helps someone, here is the code I used to sync `transformers` logging with various other debug prints.

I was porting bart to MP and needed to verify that the device switching happens correctly, so I added a bunch of `logger.info` calls inside `modeling_bart.py` and also had some other helpers that `print` debug messages and weren't logger-based:
      
      ```
      
      # auto flush std streams
      from sys import stdout, stderr
def stdout_write_flush(args, w=stdout.write): w(args); stdout.flush()
      def stderr_write_flush(args, w=stderr.write): w(args); stderr.flush()
      stdout.write = stdout_write_flush
      stderr.write = stderr_write_flush
      
      from transformers import BartTokenizer, BartForConditionalGeneration, BartConfig
      
      import logging
      import transformers.utils.logging
      import transformers.models.bart.modeling_bart
      
      # I wanted a shorter simpler format
      handlers = transformers.utils.logging._get_library_root_logger().handlers
      for handler in handlers:
          formatter = logging.Formatter("[%(funcName)s] %(message)s")
          handler.setFormatter(formatter)
      
      transformers.models.bart.modeling_bart.logger.setLevel(transformers.logging.INFO)
      ```
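For comparison, a less invasive way to get the same auto-flush effect for plain `print` debugging (my own sketch, not part of this PR) is to make the standard streams line-buffered via `TextIOWrapper.reconfigure`, available since Python 3.7:

```
import sys

# Sketch: instead of monkey-patching write(), ask the interpreter to
# flush stdout/stderr on every newline (TextIOWrapper only, Python 3.7+).
sys.stdout.reconfigure(line_buffering=True)
sys.stderr.reconfigure(line_buffering=True)

print("this line is flushed immediately")
```

This keeps `print` output interleaved correctly with logging output without replacing the streams' `write` methods.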
      
      @LysandreJik, @sgugger, @patrickvonplaten
      4aa8f6ad
    • Julien Plu's avatar
      Fix TF Longformer (#9348) · 83eec97e
      Julien Plu authored
      * Fix longformer
      
      * Apply style
      
      * Remove serving content
      
      * Forgot a condition
      
      * Apply style
      
      * Address Patrick's comments
      
      * Fix dtype
      83eec97e
    • Boris Dayma's avatar
      feat(wandb): save model as artifact (#8119) · 30fa0b78
      Boris Dayma authored
      * feat(wandb): log artifacts
      
      * fix: typo
      
      * feat(wandb): ensure name is allowed
      
      * feat(wandb): log artifact
      
      * feat(wandb): saving logic
      
      * style: improve formatting
      
      * fix: unrelated typo
      
* feat: use a fake trainer
      
* fix: simplify
      
      * feat(wandb): log model files as artifact
      
      * style: fix style
      
      * docs(wandb): correct description
      
* feat: unpack model + allow env truthy values
      
      * feat: TrainerCallback can access tokenizer
      
* style: fix style
      
      * feat(wandb): log more interesting metadata
      
      * feat: unpack tokenizer
      
      * feat(wandb): metadata with load_best_model_at_end
      
      * feat(wandb): more robust metadata
      
      * style(wandb): fix formatting
      30fa0b78
  2. 04 Jan, 2021 13 commits
  3. 03 Jan, 2021 1 commit
  4. 02 Jan, 2021 3 commits
  5. 30 Dec, 2020 1 commit
  6. 29 Dec, 2020 3 commits
  7. 28 Dec, 2020 3 commits
  8. 27 Dec, 2020 1 commit
  9. 25 Dec, 2020 2 commits
  10. 24 Dec, 2020 8 commits