- 10 Oct, 2019 5 commits
-
Yeqing Li authored
PiperOrigin-RevId: 274023277
-
Hongkun Yu authored
PiperOrigin-RevId: 274015143
-
Yeqing Li authored
PiperOrigin-RevId: 274010788
-
Hongkun Yu authored
PiperOrigin-RevId: 273966871
-
Hongkun Yu authored
PiperOrigin-RevId: 273861263
-
- 09 Oct, 2019 3 commits
-
Reed Wanderman-Milne authored
Instead of needing to ensure variables are float32, cast inputs to float32, etc., dtype="float32" is now passed to the layer constructor, which handles all of that logic automatically. The only difference is that the output of LayerNorm is now float32 instead of float16, so an extra cast is needed elsewhere. PiperOrigin-RevId: 273833286
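For context, a minimal sketch of the pattern this change describes, using the public Keras API; the shapes and the use of tf.keras.layers.LayerNormalization are illustrative assumptions, not the project's actual modeling code:

```python
import tensorflow as tf

# Hypothetical activations arriving from earlier float16 computation.
hidden = tf.random.normal([8, 128, 768], dtype=tf.float16)

# Passing dtype="float32" to the constructor makes the layer create float32
# variables and cast its float16 inputs to float32 automatically, so the
# surrounding code no longer has to do that bookkeeping by hand.
layer_norm = tf.keras.layers.LayerNormalization(axis=-1, dtype="float32")
output = layer_norm(hidden)          # output dtype is float32, not float16

# Hence the one extra cast now needed elsewhere in the model.
output = tf.cast(output, tf.float16)
```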
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 273795511
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 273653001
-
- 08 Oct, 2019 2 commits
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 273562498
-
George Karpenkov authored
PiperOrigin-RevId: 273527676
-
- 07 Oct, 2019 4 commits
-
A. Unique TensorFlower authored
BERT fp16 perf improvements: do the matmul in the intermediate layer in fp16, and also remove the explicit casting to fp32 for LayerNorm. PiperOrigin-RevId: 273379063
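A rough sketch of the intermediate-matmul part of this change; the layer size and activation are assumptions, not the repository's actual BERT modeling code:

```python
import tensorflow as tf

hidden = tf.random.normal([8, 128, 768], dtype=tf.float16)  # fp16 activations

# Previously the activations were cast up before the feed-forward matmul,
# e.g. x = tf.cast(hidden, tf.float32); with the dense layer built in
# float16, the matmul itself runs in fp16.
intermediate = tf.keras.layers.Dense(3072, activation="relu", dtype="float16")
x = intermediate(hidden)  # matmul executes in float16
```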
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 273371605
-
Jing Li authored
PiperOrigin-RevId: 273358759
-
Jing Li authored
PiperOrigin-RevId: 273233857
-
- 05 Oct, 2019 1 commit
-
Jing Li authored
PiperOrigin-RevId: 273066504
-
- 04 Oct, 2019 3 commits
-
Hongkun Yu authored
PiperOrigin-RevId: 272934570
-
Jing Li authored
PiperOrigin-RevId: 272915002
-
Hongkun Yu authored
PiperOrigin-RevId: 272777104
-
- 03 Oct, 2019 2 commits
-
Hongkun Yu authored
As the model code is subject to a major change, we do not release the hub module at this moment. PiperOrigin-RevId: 272688279
-
Sergey Mironov authored
-
- 01 Oct, 2019 2 commits
-
Hongkun Yu authored
PiperOrigin-RevId: 272280612
-
David Chen authored
PiperOrigin-RevId: 272121528
-
- 30 Sep, 2019 3 commits
-
George Karpenkov authored
PiperOrigin-RevId: 272077584
-
Hongkun Yu authored
PiperOrigin-RevId: 272043067
-
Hongkun Yu authored
Adds a swish activation without customized gradients. PiperOrigin-RevId: 272029817
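For reference, swish(x) = x * sigmoid(x); without a customized gradient, autodiff simply differentiates through the composition. A generic sketch (the function name here is illustrative, not necessarily the repository's):

```python
import tensorflow as tf

def simple_swish(features):
  """Swish activation, x * sigmoid(x), relying on automatic differentiation.

  A customized-gradient variant would wrap this in @tf.custom_gradient to
  recompute sigmoid in the backward pass and save memory; this plain version
  trades that saving for simplicity.
  """
  features = tf.convert_to_tensor(features)
  return features * tf.nn.sigmoid(features)

print(simple_swish(tf.constant([-1.0, 0.0, 1.0])).numpy())  # ~[-0.269 0. 0.731]
```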
-
- 29 Sep, 2019 1 commit
-
Hongkun Yu authored
PiperOrigin-RevId: 271873759
-
- 27 Sep, 2019 2 commits
-
Hongkun Yu authored
Support variable reshape to make TF1 checkpoints compatible with a BERT that does not reshape variables in einsum layers. PiperOrigin-RevId: 271613961
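The compatibility gap here: TF1 BERT checkpoints store attention projection kernels as 2-D matrices, while an einsum-based dense layer can keep the same weights as a 3-D variable, so the checkpoint value must be reshaped when it is loaded. A hedged sketch of that idea (names and shapes are assumptions for illustration):

```python
import numpy as np
import tensorflow as tf

num_heads, head_size = 12, 64
hidden_size = num_heads * head_size  # 768

# TF1-style checkpoint value: query projection kernel stored as 2-D.
tf1_kernel = np.random.randn(hidden_size, hidden_size).astype(np.float32)

# An einsum-based layer keeps the same weights as a 3-D variable, so the
# checkpoint value is reshaped before being assigned.
einsum_kernel = tf.Variable(tf.zeros([hidden_size, num_heads, head_size]))
einsum_kernel.assign(tf1_kernel.reshape(hidden_size, num_heads, head_size))
```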
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 271611082
-
- 26 Sep, 2019 1 commit
-
David Chen authored
PiperOrigin-RevId: 271422100
-
- 24 Sep, 2019 2 commits
-
Bruce Fontaine authored
PiperOrigin-RevId: 270926016
-
Hongkun Yu authored
PiperOrigin-RevId: 270817869
-
- 23 Sep, 2019 2 commits
-
Hongkun Yu authored
PiperOrigin-RevId: 270749832
-
Lucky Srivastava authored
-
- 20 Sep, 2019 6 commits
-
Hongkun Yu authored
PiperOrigin-RevId: 270346317
-
Hongkun Yu authored
-
Hongkun Yu authored
Testing
-
Hongkun Yu authored
PiperOrigin-RevId: 270313909
-
Guillermo Rodríguez Cano authored
-
A. Unique TensorFlower authored
Additional fix for the ResNet CTL benchmark. As copybara needs exact spacing in the file, adding back the missing space. PiperOrigin-RevId: 270167733
-
- 19 Sep, 2019 1 commit
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 270122397
-