Unverified Commit 87816e4e authored by Ikko Eltociear Ashimine, committed by GitHub

Fix typo in test_optim.py

paramters -> parameters
parent e229fbce
@@ -169,7 +169,7 @@ def test_optimizer32bit(dim1, dim2, gtype, optim_name):
     if gtype != torch.float32:
         # the adam buffers should also be close because they are 32-bit
-        # but the paramters can diverge because they are 16-bit
+        # but the parameters can diverge because they are 16-bit
         # the difference grow larger and larger with each update
         # --> copy the state to keep weights close
         p1.data = p1.data.to(p2.dtype).float()
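The comment in the diff explains why the test resyncs the weights: 32-bit optimizer buffers stay close between the two runs, but 16-bit parameters accumulate rounding error with every update, so the test copies the state to keep them comparable. A minimal sketch of that effect (not part of the commit; NumPy stands in for torch tensors here):

```python
import numpy as np

# Illustration only: repeated small updates drift in 16-bit precision
# while a 32-bit copy stays close, motivating the state-copy workaround.
p32 = np.zeros(256, dtype=np.float32)   # reference 32-bit parameters
p16 = np.zeros(256, dtype=np.float16)   # 16-bit parameters

update = np.full(256, 1e-3, dtype=np.float32)
for _ in range(1000):
    p32 += update                       # accumulated in full precision
    p16 += update.astype(np.float16)    # each add is rounded to fp16

# the gap grows with each update
drift = float(np.abs(p16.astype(np.float32) - p32).max())

# the workaround mirrored from the test: resync the 16-bit copy
p16 = p32.astype(np.float16)
```

The resync plays the same role as `p1.data = p1.data.to(p2.dtype).float()` in the test: it discards the accumulated 16-bit drift so subsequent comparisons check only one step of divergence at a time.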