ModelZoo / ResNet50_tensorflow / Commits / 0490e860

Commit 0490e860, authored Aug 08, 2020 by xinliupitt

attention initializer

parent c60aa809

1 changed file with 13 additions and 8 deletions:

official/modeling/activations/attention_initializer.py (+13 −8)
official/modeling/activations/attention_initializer.py

@@ -12,21 +12,26 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 # ==============================================================================
-"""Gaussian error linear unit."""
+"""Attention Layer Initializer."""
 from __future__ import absolute_import
 from __future__ import division
 from __future__ import print_function
+import math
 import tensorflow as tf
 
-@tf.keras.utils.register_keras_serializable(package='Text')
-def gelu(x):
-  """Gaussian Error Linear Unit.
+def attention_initializer(hidden_size):
+  """Weight Initializer of Attention Layer in Seq2Seq Transformer.
 
-  This is a smoother version of the RELU.
-  Original paper: https://arxiv.org/abs/1606.08415
   Args:
-    x: float Tensor to perform activation.
+    hidden_size: hidden size of input tensor
 
   Returns:
-    `x` with the GELU activation applied.
+    Initialized weights based on hidden size
   """
-  return tf.keras.activations.gelu(x, approximate=True)
+  limit = math.sqrt(6.0 / (hidden_size + hidden_size))
+  return tf.keras.initializers.RandomUniform(minval=-limit, maxval=limit)