- 27 Aug, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 393343072
- 25 Aug, 2021 4 commits
  - Hongkun Yu authored
    PiperOrigin-RevId: 392968271
  - Yuexin Wu authored
    The current Funnel-Transformer contains only the encoder part and uses 1D-MaxPool for query pooling.
    PiperOrigin-RevId: 392961143
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 392931060
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 392901555
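The query pooling mentioned in the Yuexin Wu commit above shrinks the query sequence with a strided 1D max-pool. A minimal pure-Python sketch of that operation (the function name and list-of-lists layout are illustrative, not the Model Garden API):

```python
def max_pool_1d(seq, pool_size=2, stride=2):
    """Strided 1D max-pool over a sequence of feature vectors.

    `seq` is a list of equal-length feature lists. Pooling runs along the
    sequence axis, taking a per-dimension max within each window; with
    pool_size == stride == 2 the query length is halved, which is the
    length reduction Funnel-style query pooling relies on.
    """
    pooled = []
    for start in range(0, len(seq) - pool_size + 1, stride):
        window = seq[start:start + pool_size]
        pooled.append([max(dim) for dim in zip(*window)])
    return pooled
```

For example, `max_pool_1d([[1, 0], [3, 2], [2, 5], [0, 1]])` reduces a length-4 sequence to length 2, keeping the per-dimension maxima of each window.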
- 19 Aug, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 391807562
- 18 Aug, 2021 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 391455959
- 13 Aug, 2021 1 commit
  - Frederick Liu authored
    [transformer] Add a flag to return intermediate outputs. This is needed to add an auxiliary loss, which has been shown to be helpful in "Character-Level Language Modeling with Deeper Self-Attention" and "End-to-End Object Detection with Transformers".
    PiperOrigin-RevId: 390505073
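The flag described in the Frederick Liu commit above exposes per-layer activations so an auxiliary loss can be attached at every layer. A hypothetical sketch of the control flow (the flag name `return_all_outputs` and the callable-layer interface are assumptions for illustration, not the actual model code):

```python
def run_layers(layers, x, return_all_outputs=False):
    """Apply a stack of layers, optionally keeping every layer's output.

    With return_all_outputs=True the caller receives one output per
    layer, which is what a per-layer auxiliary loss needs; otherwise
    only the final output is returned.
    """
    intermediates = []
    for layer in layers:
        x = layer(x)
        intermediates.append(x)
    return intermediates if return_all_outputs else x
```

A trainer could then sum a loss over each entry of the returned list rather than over the final output alone.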
- 11 Aug, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 390235315
  - Sagun Bajra authored
    PiperOrigin-RevId: 390202261
- 09 Aug, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 389748939
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 389568005
- 04 Aug, 2021 2 commits
  - Frederick Liu authored
    PiperOrigin-RevId: 388614364
  - Le Hou authored
    PiperOrigin-RevId: 388586684
- 03 Aug, 2021 2 commits
  - Le Hou authored
    PiperOrigin-RevId: 388575593
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 388356184
- 01 Aug, 2021 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 388118554
- 29 Jul, 2021 2 commits
- 28 Jul, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 387447417
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 387434804
- 24 Jul, 2021 2 commits
  - Hongkun Yu authored
    PiperOrigin-RevId: 386654855
  - Tianqi Liu authored
    PiperOrigin-RevId: 386580562
- 19 Jul, 2021 1 commit
  - Hongkun Yu authored
    Both unit tests and real use cases are passing None to it.
    PiperOrigin-RevId: 385468558
- 13 Jul, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 384358378
- 10 Jul, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 384018258
  - Frederick Liu authored
    [efficient] Fix the is_short_seq ordering so that we can also apply the feature transform and apply softmax afterwards.
    PiperOrigin-RevId: 383967806
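The is_short_seq fix in the Frederick Liu commit above concerns ordering in kernel-based attention: the optional feature transform should run first, with softmax applied afterwards on the short-sequence path. A loose, hypothetical illustration of that ordering (the names and plain-list layout are invented for this sketch and are not the actual layer code):

```python
import math

def scores_to_weights(scores, is_short_seq, feature_transform=None):
    """Apply an optional feature transform first, then softmax.

    Checking `feature_transform` before branching on `is_short_seq`
    mirrors the ordering fix: the transform is applied on either path,
    and an exact softmax follows it only for short sequences; long
    sequences keep the transformed scores for kernelized attention.
    """
    if feature_transform is not None:
        scores = [feature_transform(s) for s in scores]
    if not is_short_seq:
        return scores  # long-sequence path: no softmax here
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```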
- 08 Jul, 2021 1 commit
  - Philip Pham authored
    PiperOrigin-RevId: 383660517
- 04 Jul, 2021 1 commit
  - Frederick Liu authored
    PiperOrigin-RevId: 382960239
- 03 Jul, 2021 1 commit
  - Frederick Liu authored
    PiperOrigin-RevId: 382846192
- 02 Jul, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 382655843
- 25 Jun, 2021 3 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 381540013
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 381396116
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 381367878
- 24 Jun, 2021 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 381152422
- 23 Jun, 2021 1 commit
  - Reed Wanderman-Milne authored
    In nlp/train.py and vision/beta/train.py, certain flags are marked as required. Additionally, in certain functions, error messages are improved when a necessary flag is not specified; this is a fallback in case a file calling define_flags() does not mark the necessary flags as required. Previously, if any of these flags was not specified, the program would crash with a cryptic error message, making it hard to tell what went wrong. In a subsequent change, I will mark flags as required in more files that call define_flags().
    PiperOrigin-RevId: 381066985
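The fallback check in the Reed Wanderman-Milne commit above replaces a cryptic crash with an error that names the missing flag. A minimal analogue using a plain dict in place of parsed absl flags (the function name and dict layout are assumptions for illustration, not the train.py code):

```python
def require_flags(flag_values, required):
    """Raise a descriptive error for any required flag left unset.

    `flag_values` stands in for parsed flags; the point is only that
    the error message names the offending flags up front instead of
    letting the program fail later with an unrelated traceback.
    """
    missing = [name for name in required if flag_values.get(name) is None]
    if missing:
        raise ValueError(
            'Flag(s) %s must be specified.'
            % ', '.join('--' + m for m in missing))
    return flag_values
```

Calling this right after flag parsing keeps the failure close to its cause, which is the readability improvement the commit describes.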
- 19 Jun, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 380296477
- 18 Jun, 2021 2 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 380190962