- 21 Jul, 2020 2 commits
  - Hongkun Yu authored
    (1) call() consumes kwargs and implements _build_from_signature (layers are added inside init_scope). (2) Make build/call_attention public. PiperOrigin-RevId: 322454619
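The deferred-build pattern this commit describes, where a layer creates its weights from the first call's concrete input shapes rather than in the constructor, can be sketched in plain Python. The class and attribute names below are illustrative only, not the model-garden API.

```python
# Minimal sketch of a "build from call signature" layer: weights are
# created lazily on the first __call__, and extra kwargs are consumed
# rather than rejected. Names here (SimpleDense, built) are hypothetical.

class SimpleDense:
    def __init__(self, units):
        self.units = units
        self.built = False
        self.kernel = None

    def _build_from_signature(self, inputs):
        # Infer the weight shape from the first call's input width.
        in_dim = len(inputs[0])
        self.kernel = [[0.1] * self.units for _ in range(in_dim)]
        self.built = True

    def __call__(self, inputs, **kwargs):
        if not self.built:
            self._build_from_signature(inputs)
        # kwargs (e.g. training=...) are accepted and consumed here.
        return [[sum(x * w for x, w in zip(row, col))
                 for col in zip(*self.kernel)]
                for row in inputs]

layer = SimpleDense(units=2)
out = layer([[1.0, 2.0, 3.0]], training=False)  # builds on first call
```

The benefit is that the layer can be constructed before input shapes are known, which is what allows sublayers to be added inside an init scope and built later.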
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 322428923
- 20 Jul, 2020 1 commit
  - Allen Wang authored
    PiperOrigin-RevId: 322216928
- 08 Jul, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 320240466
- 25 Jun, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 318208409
- 22 Jun, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 317596394
- 19 Jun, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 317330705
- 16 Jun, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 316593329
- 12 Jun, 2020 2 commits
- 10 Jun, 2020 3 commits
  - Chen Chen authored
    PiperOrigin-RevId: 315767729
  - Hongkun Yu authored
    PiperOrigin-RevId: 315738983
  - Hongkun Yu authored
    PiperOrigin-RevId: 315630320
- 09 Jun, 2020 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 315584374
- 08 Jun, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 315214450
- 03 Jun, 2020 4 commits
  - Hongkun Yu authored
    This reverts commit 4bb13e61.
  - Hongkun Yu authored
    This reverts commit c3c2386c.
  - xinliupitt authored
    * root dir * zone updated * print mask * preview emb * tf print * input only * emb * tf print * emb after mask * masked_softmax print * print scores * multi folder * first pos emb * check input shape * add test temp * import math * two classes * prints * all get_pos replace * make time scale private * pos emb comments * print input * embedding_inputs * tf shape * dimension list * tf_util * print tf_util * concise * transformer pos change to layer * keep length var * length as input * None as input * print time signal * remove print * test input shape * double check shape * more test * shape check * print 97 info * print 97 info new * test if same * assert same * remove assert * tf print same * tf print diff * output example * formal test * formal test length * raise ValueError * test ValueError * double check * comments * remove prints * rename relative * delete naive test * delete docs in xinliu branch * code reformat * import order * indentation fix * more files * adjust char number * disable not callable * comment to length * error of length unequal to input_shape * remove docs * remove prints * apply revised 3 files * rm prints
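The squashed commits above iterate on a position-embedding layer built from a sinusoidal "time signal". A minimal NumPy sketch of such a signal follows; the function name, argument names, and default timescales are illustrative, not the model-garden API.

```python
# Sinusoidal position "time signal": each position is encoded by sines
# and cosines over a geometric range of timescales, so positions of any
# sequence length can be encoded without learned weights.
import numpy as np

def position_timing_signal(length, hidden_size,
                           min_timescale=1.0, max_timescale=1.0e4):
    """Returns a [length, hidden_size] sinusoidal position encoding."""
    positions = np.arange(length, dtype=np.float32)
    num_timescales = hidden_size // 2
    log_increment = (np.log(max_timescale / min_timescale)
                     / max(num_timescales - 1, 1))
    inv_timescales = min_timescale * np.exp(
        -np.arange(num_timescales, dtype=np.float32) * log_increment)
    scaled = positions[:, None] * inv_timescales[None, :]
    # First half of the feature axis is sin, second half is cos.
    return np.concatenate([np.sin(scaled), np.cos(scaled)], axis=1)

signal = position_timing_signal(length=5, hidden_size=8)
```

The commits' "error of length unequal to input_shape" items suggest the layer validates that an explicitly passed length matches the input's sequence dimension; here that check is left out for brevity.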
  - Tianqi Liu authored
    PiperOrigin-RevId: 314451720
- 02 Jun, 2020 2 commits
  - xinliupitt authored
    * root dir * zone updated * print mask * preview emb * tf print * input only * emb * tf print * emb after mask * masked_softmax print * print scores * multi folder * first pos emb * check input shape * add test temp * import math * two classes * prints * all get_pos replace * make time scale private * pos emb comments * print input * embedding_inputs * tf shape * dimension list * tf_util * print tf_util * concise * transformer pos change to layer * keep length var * length as input * None as input * print time signal * remove print * test input shape * double check shape * more test * shape check * print 97 info * print 97 info new * test if same * assert same * remove assert * tf print same * tf print diff * output example * formal test * formal test length * raise ValueError * test ValueError * double check * comments * remove prints * rename relative * delete naive test * delete docs in xinliu branch * code reformat * import order * indentation fix * more files * adjust char number * disable not callable * comment to length * error of length unequal to input_shape
  - Chen Chen authored
    PiperOrigin-RevId: 314373769
- 30 May, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 313906815
- 29 May, 2020 2 commits
  - Hongkun Yu authored
    Proposes the full functionality of the MultiHeadAttention layer. This change first goes to the model garden NLP library. PiperOrigin-RevId: 313847485
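At the core of a MultiHeadAttention layer is scaled dot-product attention; the sketch below shows a single head with no masking, as a NumPy illustration rather than the model-garden implementation.

```python
# Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """q: [Tq, d], k: [Tk, d], v: [Tk, dv] -> [Tq, dv]."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                   # [Tq, Tk]
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v

q = np.ones((2, 4), dtype=np.float32)
k = np.ones((3, 4), dtype=np.float32)
v = np.arange(6, dtype=np.float32).reshape(3, 2)
out = scaled_dot_product_attention(q, k, v)
# With identical keys, attention is uniform and each output row is the
# mean of the value rows.
```

A multi-head layer additionally projects queries, keys, and values per head, runs this computation per head, and concatenates and projects the results.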
  - Chen Chen authored
    PiperOrigin-RevId: 313812017
- 28 May, 2020 1 commit
  - Reed Wanderman-Milne authored
    Float32 is used if the model uses mixed precision with bfloat16; float16 activations are unchanged. The motivation is that BERT with the LAMB optimizer and a gelu activation has an unstable loss when gelu runs in bfloat16. Unfortunately, it is not easy to check whether the LAMB optimizer and gelu are used, and there may be other cases that work better with float32 activations than with bfloat16 activations, so the activation is always done in float32 instead of bfloat16. PiperOrigin-RevId: 313618322
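The pattern described above, upcasting to float32 for a numerically sensitive activation and casting back to the compute dtype, can be sketched as follows. This uses NumPy float16 in place of bfloat16 (which NumPy lacks natively) and the common tanh approximation of gelu; it is illustrative only, not the model-garden code.

```python
# Run the activation in float32 even under a reduced-precision policy,
# then cast the result back to the layer's compute dtype.
import numpy as np

def gelu_in_float32(x):
    """Casts to float32, applies tanh-approximate gelu, casts back."""
    compute_dtype = x.dtype
    x32 = x.astype(np.float32)
    y = 0.5 * x32 * (1.0 + np.tanh(
        np.sqrt(2.0 / np.pi) * (x32 + 0.044715 * x32 ** 3)))
    return y.astype(compute_dtype)

x = np.array([-1.0, 0.0, 1.0], dtype=np.float16)
y = gelu_in_float32(x)  # same dtype in and out, float32 math inside
```

The extra casts cost a little bandwidth but keep the loss stable in the bfloat16 case the commit describes.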
- 19 May, 2020 1 commit
  - Chen Chen authored
    PiperOrigin-RevId: 312366167
- 18 May, 2020 1 commit
  - Chen Chen authored
    PiperOrigin-RevId: 312116965
- 17 May, 2020 1 commit
  - A. Unique TensorFlower authored
    Update nlp.modeling.layers.ReZeroTransformer to have the same interface as nlp.modeling.layers.Transformer. PiperOrigin-RevId: 311937563
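The distinguishing feature of a ReZero transformer block is its residual connection: each sublayer's output is gated by a learnable scalar alpha initialized to zero, so the block starts out as the identity and normalization layers can be dropped. A minimal sketch, with hypothetical names and NumPy in place of the real layer machinery:

```python
# ReZero residual: y = x + alpha * f(x), alpha learnable, init 0.
import numpy as np

class ReZeroBlock:
    def __init__(self, sublayer):
        self.sublayer = sublayer
        self.alpha = 0.0  # learnable scalar, initialized to zero

    def __call__(self, x):
        return x + self.alpha * self.sublayer(x)

block = ReZeroBlock(sublayer=lambda x: x * 10.0 + 1.0)
x = np.array([1.0, 2.0], dtype=np.float32)
out = block(x)           # identity at initialization
block.alpha = 0.5
out_trained = block(x)   # sublayer contributes once alpha moves off zero
```

Because only the residual changes, such a block can plausibly expose the same constructor arguments and call signature as a standard Transformer layer, which is the interface alignment this commit describes.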
- 12 May, 2020 3 commits
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 311196489
  - Hongkun Yu authored
    PiperOrigin-RevId: 311165658
  - Chen Chen authored
    PiperOrigin-RevId: 311072125
- 10 May, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 310767440
- 05 May, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 310032518
- 21 Apr, 2020 1 commit
  - Hongkun Yu authored
    PiperOrigin-RevId: 307689094
- 20 Apr, 2020 1 commit
  - A. Unique TensorFlower authored
    PiperOrigin-RevId: 307425903
- 19 Apr, 2020 1 commit
  - Le Hou authored
    PiperOrigin-RevId: 307297217
- 17 Apr, 2020 4 commits