Unverified commit 1a3deae8, authored by Stas Bekman, committed by GitHub

[trainer] release tmp memory in checkpoint load (#12718)



* [trainer] release tmp memory in checkpoint load

* Update src/transformers/trainer.py
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
parent a18a17d2
```diff
@@ -1076,6 +1076,9 @@ class Trainer:
             # If the model is on the GPU, it still works!
             self._load_state_dict_in_model(state_dict)
 
+            # release memory
+            del state_dict
+
             # If model was re-initialized, put it on the right device and update self.model_wrapped
             if model_reloaded:
                 if self.place_model_on_device:
```
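The point of the change above is that `torch.load` materializes the checkpoint as a temporary `state_dict`; once its contents have been copied into the model, keeping the name alive holds a second copy of every weight until the variable goes out of scope. Deleting the reference lets the allocator reclaim that memory immediately. A minimal, torch-free sketch of the same pattern (the `model` dict and `load_state_dict_in_model` helper here are hypothetical stand-ins, not the actual `Trainer` internals):

```python
import gc

def load_state_dict_in_model(model, state_dict):
    # hypothetical stand-in for Trainer._load_state_dict_in_model:
    # copies the checkpoint's entries into the model's own storage
    model.update(state_dict)

model = {}

# stand-in for torch.load(checkpoint_path): a large temporary mapping
state_dict = {f"layer.{i}.weight": [0.0] * 1000 for i in range(100)}

load_state_dict_in_model(model, state_dict)

# release memory: drop the last reference to the temporary copy so it
# can be reclaimed instead of coexisting with the model's own copy
del state_dict
gc.collect()
```

After `del`, only the model's copy of the weights remains; in the real `Trainer`, this roughly halves peak CPU memory during checkpoint resume.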