Commit aa08088e authored by Wilbur Wu, committed by Soumith Chintala

Fix reading 1GB chunks when calculating md5 (#259)

According to the comment, the chunk size should be 1MB, which is 1024 * 1024 bytes.
parent 8648132c
@@ -10,7 +10,7 @@ def check_integrity(fpath, md5):
     md5o = hashlib.md5()
     with open(fpath, 'rb') as f:
         # read in 1MB chunks
-        for chunk in iter(lambda: f.read(1024 * 1024 * 1024), b''):
+        for chunk in iter(lambda: f.read(1024 * 1024), b''):
             md5o.update(chunk)
     md5c = md5o.hexdigest()
     if md5c != md5:
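
For context, a minimal standalone sketch of the fixed chunked-MD5 logic; md5_of_file is a hypothetical helper name for illustration, not part of torchvision:

import hashlib

def md5_of_file(fpath):
    # Hash the file in 1MB (1024 * 1024 byte) chunks so large
    # dataset downloads never need to be held in memory at once.
    md5o = hashlib.md5()
    with open(fpath, 'rb') as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b''):
            md5o.update(chunk)
    return md5o.hexdigest()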