- 23 Feb, 2021 1 commit

A. Unique TensorFlower authored
Convert Dimension to int to avoid "TypeError: unsupported operand type(s) for /: float and Dimension".
PiperOrigin-RevId: 358988747

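A minimal sketch of the fix above, assuming a TF1-style `Dimension` object (e.g. from a static shape in graph mode); the variable names are illustrative:

```python
import tensorflow as tf

dim = tf.compat.v1.Dimension(512)
# 1.0 / dim raises:
#   TypeError: unsupported operand type(s) for /: float and Dimension
scale = 1.0 / int(dim)  # casting to int first avoids the error
```
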
- 05 Jan, 2021 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 350170448

- 21 Dec, 2020 1 commit

Samuel Marks authored

- 22 Sep, 2020 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 332992539

- 14 Sep, 2020 1 commit

Zhenyu Tan authored
PiperOrigin-RevId: 331620370

- 28 Aug, 2020 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 328872353

- 27 Aug, 2020 1 commit

Zhenyu Tan authored
PiperOrigin-RevId: 328674302

- 19 Aug, 2020 2 commits

Hongkun Yu authored
Polish Seq2SeqTransformer: (1) consolidate args; (2) add tests for distribution strategy and the decoding path; (3) fix bugs.
PiperOrigin-RevId: 327455733

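The distribution-strategy testing mentioned in (2) usually follows a standard TF pattern; since the Seq2SeqTransformer constructor args are not shown in this log, a stand-in Keras model is used in this hedged sketch:

```python
import tensorflow as tf

# Hedged sketch of a distribution-strategy test pattern. The actual
# Seq2SeqTransformer signature is not shown in this log, so a stand-in
# Keras model runs under tf.distribute.MirroredStrategy instead.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(4)])
    model.compile(optimizer="adam", loss="mse")

x = tf.random.normal([8, 4])
y = tf.random.normal([8, 4])
model.fit(x, y, epochs=1, verbose=0)  # runs replicated under the strategy
```
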
Hongkun Yu authored
PiperOrigin-RevId: 327363070

- 13 Aug, 2020 1 commit

Zhenyu Tan authored
PiperOrigin-RevId: 326496940

- 07 Aug, 2020 3 commits

xinliupitt authored

xinliupitt authored

xinliupitt authored

- 03 Aug, 2020 1 commit

xinliupitt authored

- 30 Jul, 2020 3 commits

xinliupitt authored

xinliupitt authored

Hongkun Yu authored
PiperOrigin-RevId: 323927843

- 29 Jul, 2020 2 commits

xinliupitt authored

xinliupitt authored

- 21 Jul, 2020 1 commit

Hongkun Yu authored
(1) Make call() consume kwargs and implement _build_from_signature (layers are added inside init_scope). (2) Make build/call_attention public.
PiperOrigin-RevId: 322454619

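The `_build_from_signature` mechanism above defers weight creation until the first call; a minimal sketch, assuming the layer that later shipped as `tf.keras.layers.MultiHeadAttention`:

```python
import tensorflow as tf

# The layer has no weights until first called; it then builds its
# projection layers from the query/value shapes it receives.
mha = tf.keras.layers.MultiHeadAttention(num_heads=8, key_dim=64)
assert not mha.weights  # nothing built yet

query = tf.random.normal([2, 16, 128])
value = tf.random.normal([2, 32, 128])
out = mha(query, value)  # builds on first use, inside the layer's scope
print(out.shape)  # (2, 16, 128)
```
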
- 08 Jul, 2020 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 320240466

- 22 Jun, 2020 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 317596394

- 19 Jun, 2020 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 317330705

- 29 May, 2020 1 commit

Hongkun Yu authored
Proposes the full functionality of the MultiHeadAttention layer. This change first goes to the model garden NLP library.
PiperOrigin-RevId: 313847485

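A brief usage sketch of the layer's public surface, again assuming the API that later shipped as `tf.keras.layers.MultiHeadAttention`:

```python
import tensorflow as tf

mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=32)
query = tf.random.normal([1, 4, 16])
value = tf.random.normal([1, 6, 16])

# return_attention_scores exposes the per-head attention distribution
output, scores = mha(query, value, return_attention_scores=True)
print(output.shape)  # (1, 4, 16)
print(scores.shape)  # (1, 2, 4, 6): [batch, heads, query_len, key_len]
```
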
- 28 May, 2020 1 commit

Reed Wanderman-Milne authored
Float32 is used if the model uses mixed precision with bfloat16; float16 activations are unchanged. The motivation is that BERT with the LAMB optimizer and a gelu activation has an unstable loss when gelu runs in bfloat16. Unfortunately, it is not easy to check whether the LAMB optimizer and gelu are used, and there may be other cases that work better with float32 activations than with bfloat16 activations, so we always do the activation in float32 instead of bfloat16.
PiperOrigin-RevId: 313618322

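A hedged sketch of the idea: upcast to float32 for the activation when the input is bfloat16, then cast back. The function name is illustrative, and `tf.nn.gelu` (available in recent TF releases) stands in for the model garden's activation:

```python
import tensorflow as tf

def float32_gelu(x):
    """Illustrative helper: compute gelu in float32 for bfloat16 inputs."""
    if x.dtype == tf.bfloat16:
        # Per the commit above, bfloat16 gelu destabilized BERT + LAMB,
        # so upcast, apply the activation, and cast back.
        return tf.cast(tf.nn.gelu(tf.cast(x, tf.float32)), tf.bfloat16)
    return tf.nn.gelu(x)  # float16 / float32 paths are unchanged
```
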
- 12 May, 2020 2 commits

Hongkun Yu authored
PiperOrigin-RevId: 311165658

Chen Chen authored
PiperOrigin-RevId: 311072125

- 10 May, 2020 1 commit

Hongkun Yu authored
PiperOrigin-RevId: 310767440