Unverified commit 9a6f2924 authored by 彭齐荣, committed by GitHub

[Bugfix] Fix failure of create_shared_mem_array in DDP spawn training #4110 (#4111)



* Fix failure of create_shared_mem_array in DDP spawn training #4110

* [Bugfix] Fix failure of create_shared_mem_array in DDP spawn training #4110

  Replace the globally seeded random generator with a local random_ = random.Random() instance.

* Update pytorch.py
Co-authored-by: Quan (Andy) Gan <coin2028@hotmail.com>
parent 7936e2ed
@@ -69,7 +69,10 @@ def call_once_and_share(func, shape, dtype, rank=0):
     # Process with the given rank creates and populates the shared memory array.
     if current_rank == rank:
-        id_ = random.getrandbits(32)
+        # PyTorch Lightning 1.6+ seems to set the random seed during process spawning
+        # to the same seed value.
+        random_ = random.Random()
+        id_ = random_.getrandbits(32)
         name = _get_shared_mem_name(id_)
         result = create_shared_mem_array(name, shape, dtype)
         result[:] = func()
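
The failure mode can be reproduced outside DGL. Below is a minimal standalone sketch (standard library only, not part of the patched code): it simulates PyTorch Lightning 1.6+ seeding every spawned worker with the same value, which makes the globally seeded random.getrandbits(32) return the same id in every process and thus the same shared-memory name, while a freshly constructed random.Random() is seeded from OS entropy and yields distinct ids per process. The _worker helper and the seed value 42 are assumptions chosen only for this demonstration.

# Minimal standalone repro of the collision (stdlib only; not DGL code).
import multiprocessing as mp
import random


def _worker(seed, use_fix, queue):
    # Simulate PyTorch Lightning 1.6+ giving every spawned process the same seed.
    random.seed(seed)
    if use_fix:
        # Fixed path: a fresh Random() is seeded from OS entropy, independent of
        # the globally seeded generator, so each process draws a different id.
        id_ = random.Random().getrandbits(32)
    else:
        # Buggy path: the globally seeded generator yields the same bits in every
        # process, so the derived shared-memory name collides across ranks.
        id_ = random.getrandbits(32)
    queue.put(id_)


if __name__ == "__main__":
    ctx = mp.get_context("spawn")
    for use_fix in (False, True):
        queue = ctx.Queue()
        procs = [ctx.Process(target=_worker, args=(42, use_fix, queue)) for _ in range(2)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        ids = {queue.get() for _ in range(2)}
        label = "with fix" if use_fix else "without fix"
        print(label, "->", "collision" if len(ids) == 1 else "distinct ids")

Running this prints a collision for the unfixed path and distinct ids for the fixed path, which is why the patch swaps the module-level generator for a per-call random.Random() instance.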