"bw100.json" did not exist on "9afddf86a592c7d6bc6f15bf61a2ca9f679e1afa"
Unverified commit d155eaac authored by Przemyslaw Tredak, committed by GitHub

Respect pyTorch determinism flag (#582)


Signed-off-by: Przemek Tredak <ptredak@nvidia.com>
Co-authored-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>
parent b90b638d
@@ -2006,7 +2006,8 @@ class DotProductAttention(torch.nn.Module):
         norm_factor = math.sqrt(self.hidden_size_per_attention_head)
         self.device_compute_capability = get_device_compute_capability()
-        self.deterministic = not bool(int(os.getenv("NVTE_ALLOW_NONDETERMINISTIC_ALGO", "1")))
+        self.deterministic = not bool(int(os.getenv("NVTE_ALLOW_NONDETERMINISTIC_ALGO", "1"))) \
+            or torch.are_deterministic_algorithms_enabled()
         self.use_flash_attention = (
             int(os.getenv("NVTE_FLASH_ATTN", "1"))
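The change can be summarized as: deterministic execution is now requested either by setting `NVTE_ALLOW_NONDETERMINISTIC_ALGO=0` or by calling `torch.use_deterministic_algorithms(True)`. A minimal sketch of the new boolean logic, using a hypothetical helper `compute_deterministic` with a stand-in boolean for `torch.are_deterministic_algorithms_enabled()` so it runs without PyTorch installed:

```python
def compute_deterministic(env_value: str, torch_flag_enabled: bool) -> bool:
    """Mirror the patched logic from DotProductAttention.__init__.

    env_value stands in for os.getenv("NVTE_ALLOW_NONDETERMINISTIC_ALGO", "1");
    torch_flag_enabled stands in for torch.are_deterministic_algorithms_enabled().
    """
    allow_nondeterministic = bool(int(env_value))
    # Deterministic if the env var forbids nondeterminism OR the
    # global PyTorch determinism flag is set.
    return (not allow_nondeterministic) or torch_flag_enabled

# Default env value with the PyTorch flag off: nondeterministic kernels allowed.
print(compute_deterministic("1", False))  # False
# After this patch, the PyTorch flag alone forces deterministic behavior.
print(compute_deterministic("1", True))   # True
# The env var override continues to work on its own, as before.
print(compute_deterministic("0", False))  # True
```

Before this patch, only the environment variable was consulted, so `torch.use_deterministic_algorithms(True)` was silently ignored by this module.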