Commit 581b8d15 authored by liangjing

version 1
# preparation
Follow the PR below to build fmhalib.so:
https://github.com/sneaxiy/apex/pull/1
# build the fmha op
python setup.py install
# unittest (needs improvement)
python test_fmha.py
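For checking the fused kernel's output, a plain NumPy reference of scaled dot-product attention can be handy. This is only a sketch under the assumption that fmhalib implements the standard formulation; the function name and shapes here are illustrative, not part of the library's API.

```python
import numpy as np

def ref_attention(q, k, v):
    """Reference scaled dot-product attention.
    q, k, v: [batch, heads, seq_len, head_dim] arrays."""
    d = q.shape[-1]
    # QK^T scaled by sqrt(head_dim)
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d)
    # softmax over the last axis, shifted for numerical stability
    scores -= scores.max(axis=-1, keepdims=True)
    probs = np.exp(scores)
    probs /= probs.sum(axis=-1, keepdims=True)
    return probs @ v  # [batch, heads, seq_len, head_dim]

# tiny self-attention example just to sanity-check shapes
q = np.random.randn(1, 2, 4, 8).astype(np.float32)
out = ref_attention(q, q, q)
print(out.shape)  # (1, 2, 4, 8)
```

A unittest could compare this reference (in float32) against the fused op's float16 output with a loose tolerance.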
# Function
Supports gemm_nn/nt + bias with float16, float32 and float64 data types.
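The two layouts differ only in whether the second operand is transposed (gemm_nn: neither transposed; gemm_nt: second operand transposed). A minimal NumPy reference of the fused matmul + bias, with hypothetical function names, for all three supported data types:

```python
import numpy as np

def gemm_nn_bias(x, w, bias):
    # x: [m, k], w: [k, n], bias: [n] -> [m, n]
    return x @ w + bias

def gemm_nt_bias(x, w, bias):
    # x: [m, k], w: [n, k] (stored transposed), bias: [n] -> [m, n]
    return x @ w.T + bias

# the two layouts agree when given the same logical weight matrix
for dtype in (np.float16, np.float32, np.float64):
    x = np.ones((4, 3), dtype=dtype)
    w = np.ones((3, 5), dtype=dtype)
    b = np.zeros(5, dtype=dtype)
    assert np.allclose(gemm_nn_bias(x, w, b), gemm_nt_bias(x, w.T, b))
```

The fused op's correctness test can compare against exactly this kind of reference, with a wider tolerance for float16.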
# Correctness test
```
python test_fused_dense_op.py
```
# performance result
```
python test_fused_dense_perf.py
```
# function
This fused op implements the following logic:
```
layer_norm(residual + dropout(input))
```
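A minimal NumPy sketch of the same computation, usable as a reference implementation (exact comparison is only meaningful when dropout_rate is 0, matching the current unittest; the learnable layer-norm scale/shift are omitted for brevity):

```python
import numpy as np

def dropout(x, rate, training=True):
    if not training or rate == 0.0:
        return x
    # inverted dropout: scale kept activations by 1/(1-rate)
    mask = (np.random.rand(*x.shape) >= rate).astype(x.dtype)
    return x * mask / (1.0 - rate)

def layer_norm(x, eps=1e-5):
    # normalize over the last (feature) axis
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def fused_dropout_residual_ln(inp, residual, rate):
    return layer_norm(residual + dropout(inp, rate))

x = np.random.randn(2, 8).astype(np.float32)
res = np.random.randn(2, 8).astype(np.float32)
out = fused_dropout_residual_ln(x, res, rate=0.0)
print(out.shape)  # (2, 8)
```

With rate=0 the dropout is the identity, so this reduces to layer_norm(residual + input), which is what the accuracy test below can check against.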
# Accuracy test
python test_fused_dropout_op.py
# Perf test
python test_fused_dropout_perf.py
# TODO
1. The unittest only checks accuracy when dropout_rate is 0.
2. How do we set is_test to true for dropout in the eval phase?