Improve bert-japanese tokenizer handling (#8659) · 0cc5ab13
Julien Chaumond authored
    
    
    * Make ci fail
    
    * Try to make tests actually run?
    
    * CI finally failing?
    
    * Fix CI
    
    * Revert "Fix CI"
    
    This reverts commit ca7923be7334d4e571b023478ebdd6b33dfd0ebb.
    
    * Ooops wrong one
    
    * one more try
    
    * Ok ok let's move this elsewhere
    
    * Alternative to globals() (#8667)
    
    * Alternative to globals()
    
    * Error is raised later so return None
    
    * Sentencepiece not installed makes some tokenizers None
    
    * Apply Lysandre's wisdom
    
    * Slightly clearer comment?
    
    cc @sgugger
    Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
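
    The "Alternative to globals()" change described above can be sketched as follows. This is an illustrative pattern, not the actual transformers internals: instead of looking a class up with `globals()[class_name]` (which raises `KeyError`), the class is resolved dynamically and `None` is returned when an optional backend such as sentencepiece is not installed, so a clearer error can be raised later. The function name and module layout here are assumptions.

    ```python
    # Hypothetical sketch of the commit's approach: resolve a tokenizer
    # class by name, returning None instead of raising when the class
    # (or its optional dependency, e.g. sentencepiece) is unavailable.
    import importlib


    def resolve_tokenizer_class(module_name: str, class_name: str):
        """Return the named class, or None if it cannot be resolved."""
        try:
            module = importlib.import_module(module_name)
        except ImportError:
            # Optional dependency missing: defer the error to the caller,
            # which can raise a more informative message later.
            return None
        # getattr with a default avoids the KeyError that
        # globals()[class_name] would raise for a missing name.
        return getattr(module, class_name, None)
    ```

    Returning `None` rather than raising immediately matches the "Error is raised later so return None" step in the commit message: the lookup stays cheap and side-effect free, and only code that actually needs the tokenizer reports the failure.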
Changed file: test_tokenization_bert_japanese.py (11.1 KB)