[PyTorch] Lower atol/rtol for F16 attention tests (#1157)
* reduce atol/rtol for F16 tests

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>

* relax the tols for Ampere

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>

---------

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>
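The pattern described here (tightening FP16 tolerances overall while keeping a looser set on Ampere) can be sketched as a dtype- and architecture-aware tolerance helper for the attention tests. The code below is a minimal illustration, assuming the tests compare a fused-attention output against a reference with `torch.testing.assert_close`; the `get_tols` helper and the specific tolerance values are assumptions for illustration, not the PR's actual numbers.

```python
# Illustrative sketch only: helper name and tolerance values are assumptions,
# not the PR's actual code. It shows the general pattern of picking atol/rtol
# per dtype, with a separate (looser) set on Ampere (SM 8.x).
import torch


def get_tols(dtype: torch.dtype) -> tuple[float, float]:
    """Return (atol, rtol) for comparing attention outputs (illustrative values)."""
    sm = torch.cuda.get_device_capability() if torch.cuda.is_available() else (0, 0)
    if dtype == torch.float16:
        # FP16 needs looser tolerances than FP32; relax further on Ampere.
        return (2.5e-2, 2.5e-2) if sm[0] == 8 else (1e-2, 1e-2)
    return (1e-5, 1.3e-6)  # default tolerances for float32


def check_attention_close(out: torch.Tensor, ref: torch.Tensor) -> None:
    # Compare fused-attention output against the reference implementation.
    atol, rtol = get_tols(out.dtype)
    torch.testing.assert_close(out, ref, atol=atol, rtol=rtol)
```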