Fix distributed seeding behavior
This adds `workers=True` to Lightning's `seed_everything` call, which derives a unique random seed for each dataloader worker from the base seed, the worker ID, and the process's global rank. Previously, workers with the same worker ID on different GPUs could end up sharing the same random state in distributed training. Note that this breaks reproducibility between runs made before and after this change. Also removes the `seed` and `suppress_output` modules, which are no longer used in OpenFold.
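For reference, a minimal sketch of the changed call (the seed value and surrounding code are illustrative, not the actual OpenFold entry point):

```python
import pytorch_lightning as pl

# workers=True sets PL_SEED_WORKERS=1, so Lightning installs a
# worker_init_fn that derives a distinct seed for every dataloader
# worker from the base seed, the global rank, and the worker ID.
pl.seed_everything(42, workers=True)
```

Without `workers=True`, Lightning only seeds the main process per rank, so two workers with the same worker ID on different ranks can draw identical random augmentations.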