Commit 85b50c88, authored Apr 17, 2020 by Chen Chen, committed by A. Unique TensorFlower on Apr 17, 2020
Add TalkingHeadsAttention to README.
PiperOrigin-RevId: 307082084
parent f499e880
Showing 1 changed file with 3 additions and 0 deletions.
official/nlp/modeling/layers/README.md
@@ -14,6 +14,9 @@ If `from_tensor` and `to_tensor` are the same, then this is self-attention.
 *   [CachedAttention](attention.py) implements an attention layer with cache used
     for auto-regressive decoding.
+*   [TalkingHeadsAttention](talking_heads_attention.py) implements the talking
+    heads attention, as described in
+    ["Talking-Heads Attention"](https://arxiv.org/abs/2003.02436).
 *   [Transformer](transformer.py) implements an optionally masked transformer as
     described in ["Attention Is All You Need"](https://arxiv.org/abs/1706.03762).
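The TalkingHeadsAttention entry added above refers to the technique from ["Talking-Heads Attention"](https://arxiv.org/abs/2003.02436): instead of computing each head's attention weights independently, the logits are mixed across heads with a learned projection before the softmax, and the resulting weights are mixed again after it. The sketch below illustrates only that core step with plain TensorFlow einsum ops; the tensor layout, the function name `talking_heads_scores`, and the two projection arguments are illustrative assumptions, not the signature of the layer in talking_heads_attention.py.

```python
import tensorflow as tf


def talking_heads_scores(query, key, pre_softmax_proj, post_softmax_proj):
    """Attention weights with talking-heads mixing across heads (sketch).

    query: [batch, heads, q_len, head_dim]
    key:   [batch, heads, k_len, head_dim]
    pre_softmax_proj / post_softmax_proj: [heads, heads] learned matrices.
    Returns attention weights of shape [batch, heads, q_len, k_len].
    """
    head_dim = tf.cast(tf.shape(query)[-1], query.dtype)
    # Standard scaled dot-product logits, one attention map per head.
    logits = tf.einsum('bhqd,bhkd->bhqk', query, key) / tf.sqrt(head_dim)
    # Talking heads, step 1: linearly mix the logits across the head axis.
    logits = tf.einsum('bhqk,hH->bHqk', logits, pre_softmax_proj)
    probs = tf.nn.softmax(logits, axis=-1)
    # Talking heads, step 2: mix the attention weights across heads again.
    return tf.einsum('bhqk,hH->bHqk', probs, post_softmax_proj)


# Illustrative shapes: batch=2, heads=4, q_len=8, k_len=10, head_dim=16.
q = tf.random.normal([2, 4, 8, 16])
k = tf.random.normal([2, 4, 10, 16])
pre = tf.random.normal([4, 4])
post = tf.random.normal([4, 4])
print(talking_heads_scores(q, k, pre, post).shape)  # (2, 4, 8, 10)
```

Setting both mixing matrices to the identity recovers standard multi-head attention, which is why the paper presents talking heads as a drop-in generalization of the usual attention layer.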