- 05 Oct, 2021 1 commit
  - Jialu Liu authored
    PiperOrigin-RevId: 401000583
- 04 Oct, 2021 1 commit
  - Yuexin Wu authored
    PiperOrigin-RevId: 400837715
- 24 Sep, 2021 2 commits
  - Rebecca Chen authored
    PiperOrigin-RevId: 398612564
  - Rebecca Chen authored
    PiperOrigin-RevId: 398612420
- 23 Sep, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 398593113
- 16 Sep, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 397106309
- 10 Sep, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 396035361
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 395996268
- 27 Aug, 2021 2 commits
- 25 Aug, 2021 3 commits
  - Hongkun Yu authored
    PiperOrigin-RevId: 392968271
  - Yuexin Wu authored
    The current Funnel-Transformer contains only the encoder part and uses 1D-MaxPool for query pooling.
    PiperOrigin-RevId: 392961143
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 392901555
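The Funnel-Transformer commit above mentions 1D-MaxPool query pooling, which shortens the query sequence before attention. A minimal NumPy sketch of that pooling step, for illustration only; the function name `max_pool_queries` is hypothetical and not from the repository:

```python
import numpy as np

def max_pool_queries(x, pool_size=2):
    """1D max-pool along the sequence axis, as in Funnel-Transformer-style
    query pooling (illustrative sketch, not the repo's implementation)."""
    batch, seq_len, hidden = x.shape
    # Drop trailing positions that do not fill a complete pooling window.
    trimmed = seq_len - seq_len % pool_size
    x = x[:, :trimmed, :].reshape(batch, trimmed // pool_size, pool_size, hidden)
    # Take the elementwise max over each window of pool_size positions.
    return x.max(axis=2)

queries = np.arange(24, dtype=float).reshape(1, 6, 4)
pooled = max_pool_queries(queries)
print(pooled.shape)  # (1, 3, 4)
```

Pooling only the queries (not keys/values) halves the attention output length per block while each query still attends over the full unpooled sequence.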
- 18 Aug, 2021 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 391455959
- 13 Aug, 2021 1 commit
  - Frederick Liu authored
    [transformer] Add a flag to return intermediate outputs. This is needed to add an auxiliary loss, which has been shown to be helpful in "Character-Level Language Modeling with Deeper Self-Attention" and "End-to-End Object Detection with Transformers".
    PiperOrigin-RevId: 390505073
- 09 Aug, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 389568005
- 01 Aug, 2021 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 388118554
- 29 Jul, 2021 2 commits
- 28 Jul, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 387447417
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 387434804
- 24 Jul, 2021 1 commit
  - Tianqi Liu authored
    PiperOrigin-RevId: 386580562
- 19 Jul, 2021 1 commit
  - Hongkun Yu authored
    Both unit tests and real use cases are passing None to it.
    PiperOrigin-RevId: 385468558
- 13 Jul, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 384358378
- 10 Jul, 2021 1 commit
  - Frederick Liu authored
    [efficient] Fix is_short_seq order so that we can also apply feature transform and apply softmax afterwards.
    PiperOrigin-RevId: 383967806