- 29 May, 2020 (2 commits)
  - Hongkun Yu: Proposes the full functionality of the MultiHeadAttention layer. This change first goes to the Model Garden NLP library. PiperOrigin-RevId: 313847485
  - Chen Chen: PiperOrigin-RevId: 313812017
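For context, a MultiHeadAttention layer computes scaled dot-product attention independently in several projected subspaces and recombines the results. The NumPy sketch below is a hypothetical illustration of that computation (random, untrained weights; helper names like `split_heads` are my own), not the Model Garden implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, key, value, num_heads, rng):
    """Scaled dot-product attention over num_heads projected subspaces.

    query is (batch, T_q, d_model); key/value are (batch, T_kv, d_model).
    Projection weights are random stand-ins here; a real layer learns them.
    """
    batch, t_q, d_model = query.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    def split_heads(x, w):
        # project, then reshape to (batch, heads, time, d_head)
        y = x @ w
        return y.reshape(x.shape[0], x.shape[1], num_heads, d_head).transpose(0, 2, 1, 3)

    w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(4))
    q, k, v = split_heads(query, w_q), split_heads(key, w_k), split_heads(value, w_v)

    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d_head)  # (batch, heads, T_q, T_kv)
    weights = softmax(scores, axis=-1)                      # rows sum to 1
    context = weights @ v                                   # (batch, heads, T_q, d_head)
    context = context.transpose(0, 2, 1, 3).reshape(batch, t_q, d_model)
    return context @ w_o, weights

rng = np.random.default_rng(0)
out, attn = multi_head_attention(rng.standard_normal((2, 5, 16)),
                                 rng.standard_normal((2, 7, 16)),
                                 rng.standard_normal((2, 7, 16)),
                                 num_heads=4, rng=rng)
```

The output keeps the query's shape, while the attention weights expose one (T_q, T_kv) map per head.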
- 28 May, 2020 (1 commit)
  - Reed Wanderman-Milne: Float32 is used if the model uses mixed precision with bfloat16; float16 activations are unchanged. The motivation is that BERT with the LAMB optimizer and a gelu activation has an unstable loss when gelu runs in bfloat16. Unfortunately, it is not easy to check whether the LAMB optimizer and gelu are in use, and there may be other cases that work better with float32 activations instead of bfloat16 activations, so the activation is always done in float32 instead of bfloat16. PiperOrigin-RevId: 313618322
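The numerical motivation above can be sketched without TensorFlow: bfloat16 keeps only 8 significand bits, so running gelu through a simulated bfloat16 round-trip loses visibly more precision than keeping the activation in float32. Both helpers below (`to_bfloat16`, the tanh-form gelu) are illustrative stand-ins, not the Model Garden code:

```python
import math
import struct

def to_bfloat16(x):
    """Simulate bfloat16 by truncating a float32 to its top 16 bits.

    Python has no native bfloat16; zeroing the low 16 mantissa bits is a
    rough (round-toward-zero) stand-in, close enough to show the gap.
    """
    bits = struct.unpack("<I", struct.pack("<f", x))[0] & 0xFFFF0000
    return struct.unpack("<f", struct.pack("<I", bits))[0]

def gelu(x):
    # tanh approximation of gelu, as used in BERT
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

xs = [i / 100.0 - 3.0 for i in range(601)]
# largest gap between gelu in full precision and gelu through bfloat16
max_err = max(abs(gelu(x) - to_bfloat16(gelu(to_bfloat16(x)))) for x in xs)
```

With only 8 significand bits, the round-trip error lands around 1e-2 over this range, which is the scale of per-step noise the commit is avoiding by keeping the activation in float32.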
- 19 May, 2020 (1 commit)
  - Chen Chen: PiperOrigin-RevId: 312366167
- 18 May, 2020 (1 commit)
  - Chen Chen: PiperOrigin-RevId: 312116965
- 17 May, 2020 (1 commit)
  - A. Unique TensorFlower: Update nlp.modeling.layers.ReZeroTransformer to have the same interface as nlp.modeling.layers.Transformer. PiperOrigin-RevId: 311937563
- 12 May, 2020 (3 commits)
  - A. Unique TensorFlower: PiperOrigin-RevId: 311196489
  - Hongkun Yu: PiperOrigin-RevId: 311165658
  - Chen Chen: PiperOrigin-RevId: 311072125
- 10 May, 2020 (1 commit)
  - Hongkun Yu: PiperOrigin-RevId: 310767440
- 05 May, 2020 (1 commit)
  - Hongkun Yu: PiperOrigin-RevId: 310032518
- 21 Apr, 2020 (1 commit)
  - Hongkun Yu: PiperOrigin-RevId: 307689094
- 20 Apr, 2020 (1 commit)
  - A. Unique TensorFlower: PiperOrigin-RevId: 307425903
- 19 Apr, 2020 (1 commit)
  - Le Hou: PiperOrigin-RevId: 307297217
- 17 Apr, 2020 (4 commits)
- 15 Apr, 2020 (1 commit)
  - Hongkun Yu: PiperOrigin-RevId: 306748161
- 13 Apr, 2020 (1 commit)
  - Chen Chen: PiperOrigin-RevId: 306182576
- 08 Apr, 2020 (1 commit)
  - Yichao 'Peak' Ji
- 01 Apr, 2020 (1 commit)
  - George Karpenkov: PiperOrigin-RevId: 304222530
- 31 Mar, 2020 (2 commits)
  - Yichao 'Peak' Ji
  - Yichao 'Peak' Ji
- 27 Mar, 2020 (1 commit)
  - A. Unique TensorFlower: PiperOrigin-RevId: 303407939
- 12 Mar, 2020 (1 commit)
  - Hongkun Yu: PiperOrigin-RevId: 300477605
- 09 Mar, 2020 (1 commit)
  - Hongkun Yu: PiperOrigin-RevId: 299901483
- 05 Mar, 2020 (2 commits)
  - Hongkun Yu: PiperOrigin-RevId: 299169021
  - Jing Li: PiperOrigin-RevId: 299126470
- 03 Mar, 2020 (2 commits)
  - Hongkun Yu: PiperOrigin-RevId: 298692558
  - George Karpenkov: Removed with plans to re-add later once the feature stabilizes more. PiperOrigin-RevId: 298486867
- 26 Feb, 2020 (3 commits)
  - George Karpenkov: PiperOrigin-RevId: 297412889
  - George Karpenkov: PiperOrigin-RevId: 297366405
  - George Karpenkov: PiperOrigin-RevId: 297366158
- 25 Feb, 2020 (1 commit)
  - George Karpenkov: Application in graph mode still leads to some crashes. PiperOrigin-RevId: 297144398
- 21 Feb, 2020 (1 commit)
  - George Karpenkov: To debug a tf.function, this API can be used: https://www.tensorflow.org/api_docs/python/tf/config/experimental_run_functions_eagerly PiperOrigin-RevId: 296458870
- 08 Feb, 2020 (1 commit)
  - Zongwei Zhou: PiperOrigin-RevId: 293958491
- 21 Jan, 2020 (1 commit)
  - Hongkun Yu: Keras: "manual" shape inference is only required if the layer is dynamic (otherwise TF's static shape inference capabilities are used). PiperOrigin-RevId: 290821518
- 08 Jan, 2020 (1 commit)
  - A. Unique TensorFlower: PiperOrigin-RevId: 288774884
- 19 Dec, 2019 (1 commit)
  - A. Unique TensorFlower: PiperOrigin-RevId: 286477560