"...git@developer.sourcefind.cn:modelzoo/alphafold2_jax.git" did not exist on "eb93322ba9e65542721fec157cfce6e2b74e0936"
  1. 28 Nov, 2023 1 commit
  2. 01 Dec, 2021 1 commit
  3. 19 Nov, 2021 1 commit
  4. 25 Jan, 2021 1 commit
    • fix bugs in syncbn (#46) · 3f49dbf0
      Jeff Daily authored
      - incorrect use of __shfl_down
      - fix warp size assumptions
      - update unit tests to exit on failure
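
The fix above is about the warp-level reduction inside the syncbn CUDA kernel. Purely as an illustration (this is not the kernel, and the helper below is hypothetical), the sketch models a __shfl_down-style tree reduction and shows why hard-coding a 32-lane warp breaks on hardware whose warps are 64 lanes wide:

```python
# Illustrative model (not the kernel itself) of the __shfl_down tree reduction
# a syncbn kernel relies on. Out-of-range source lanes are simply skipped here;
# only lane 0's result is meaningful, as in the real reduction.
def warp_reduce_sum(lane_values, warp_size):
    vals = list(lane_values)
    offset = warp_size // 2
    while offset > 0:
        for lane in range(warp_size):
            src = lane + offset
            if src < warp_size:        # guard instead of relying on lane clamping
                vals[lane] += vals[src]
        offset //= 2
    return vals[0]                     # lane 0 holds the warp-wide sum

# Hard-coding warp_size=32 would leave half of a 64-lane warp out of the sum.
lanes = list(range(64))
assert warp_reduce_sum(lanes, warp_size=64) == sum(lanes)
assert warp_reduce_sum(lanes, warp_size=32) == sum(lanes[:32])
```
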
  5. 31 Jul, 2020 1 commit
  6. 10 Jul, 2020 1 commit
  7. 06 Jul, 2020 1 commit
    • [sync BN] (#792) · 1ff54b8f
      jjsjann123 authored
      * [sync BN]
      
      support a non-uniform batch size across the process group.
      
      TODO: test should be added once cleaned up.
      
      * updating unit tests
      
      * new unit tests for different inputs
      
      * cleaning
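
Supporting a non-uniform batch size changes how the global statistics must be formed: each rank has to contribute its sums together with its element count, rather than having per-rank means averaged. A minimal sketch of that bookkeeping, in plain NumPy and not apex's implementation, is:

```python
# Sketch of the bookkeeping needed when per-process batch sizes differ:
# global statistics come from per-rank sums and element counts, not from
# averaging per-rank means.
import numpy as np

def global_mean_var(local_batches):
    """local_batches: one array per process, possibly different sizes."""
    counts = np.array([b.size for b in local_batches], dtype=np.float64)
    sums = np.array([b.sum() for b in local_batches])
    sq_sums = np.array([(b ** 2).sum() for b in local_batches])
    n = counts.sum()                      # total element count across ranks
    mean = sums.sum() / n                 # count-weighted, handles uneven batches
    var = sq_sums.sum() / n - mean ** 2   # biased variance, as batch norm uses
    return mean, var

ranks = [np.random.randn(8), np.random.randn(3)]   # non-uniform batch sizes
mean, var = global_mean_var(ranks)
allv = np.concatenate(ranks)
assert np.allclose([mean, var], [allv.mean(), allv.var()])
```
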
  8. 03 Jun, 2020 1 commit
  9. 06 Nov, 2019 1 commit
  10. 26 Jul, 2019 1 commit
    • [sbn update] (#384) · 896ecdd6
      jjsjann123 authored
      fixing empty return from the Python implementation
        adding a proper test to verify functional correctness of the Python implementation
  11. 12 Jul, 2019 1 commit
    • [sbn update] (#384) · 574fe244
      jjsjann123 authored
      fixing empty return from the Python implementation
        adding a proper test to verify functional correctness of the Python implementation
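
Both #384 entries above concern the pure-Python implementation of synchronized batch norm (the fallback used when the optimized extension is unavailable). For context, a rough usage sketch of apex's sync BN under a distributed launcher follows; the apex.parallel names are the library's, but treat the details as illustrative rather than a record of this revision's exact API:

```python
# Rough usage sketch for apex's synchronized batch norm; intended to run
# under a distributed launcher (one process per GPU). Illustrative only.
import torch
import apex.parallel

torch.distributed.init_process_group(backend="nccl")
torch.cuda.set_device(torch.distributed.get_rank() % torch.cuda.device_count())

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, 3, padding=1),
    torch.nn.BatchNorm2d(8),
    torch.nn.ReLU(),
).cuda()

# Replace every BatchNorm*d with apex.parallel.SyncBatchNorm so statistics
# are reduced across the whole process group during training.
model = apex.parallel.convert_syncbn_model(model)

x = torch.randn(4, 3, 16, 16, device="cuda")   # per-rank batch; sizes may differ
out = model(x)
out.sum().backward()
```
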
  12. 01 May, 2019 1 commit
  13. 04 Apr, 2019 1 commit
    • WIP: Handle arbitrary combinations of optimizers/models/losses (#232) · 3f87614f
      mcarilli authored
      * Refactor to allow more flexible treatment of multiple optimizers/models/losses
      
      * Adding _process_optimizers.py
      
      * Created L0 tests (now passing).
      
      * fix: minor print typo (#234)
      
      * make L1 results easier to read
      
      * L0 multiple model/optimizer/loss test fleshed out
      
      * Adding test that master params remain synced across distributed processes
      
      * Docstring updates
      
      * Docstring updates
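
This refactor is what lets amp accept lists of models and optimizers and keep a separate loss scale per loss. A minimal usage sketch, assuming the amp.initialize/scale_loss interface this work fed into (the num_losses and loss_id arguments in particular), is:

```python
# Minimal sketch of driving two models / optimizers / losses through apex's
# amp API, the kind of combination this refactor targets. Illustrative only;
# consult the apex documentation for the exact signatures.
import torch
from apex import amp

model_a = torch.nn.Linear(16, 4).cuda()
model_b = torch.nn.Linear(16, 4).cuda()
opt_a = torch.optim.SGD(model_a.parameters(), lr=1e-3)
opt_b = torch.optim.SGD(model_b.parameters(), lr=1e-3)

# Lists of models/optimizers go in, lists come back out.
(model_a, model_b), (opt_a, opt_b) = amp.initialize(
    [model_a, model_b], [opt_a, opt_b], opt_level="O1", num_losses=2)

x = torch.randn(8, 16, device="cuda")
loss_a = model_a(x).sum()
loss_b = model_b(x).sum()

# Each backward pass is scaled per loss; loss_id keeps the loss scalers separate.
with amp.scale_loss(loss_a, opt_a, loss_id=0) as scaled:
    scaled.backward()
with amp.scale_loss(loss_b, opt_b, loss_id=1) as scaled:
    scaled.backward()
opt_a.step()
opt_b.step()
```
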
  14. 12 Mar, 2019 1 commit
  15. 26 Feb, 2019 1 commit
  16. 03 Feb, 2019 1 commit
  17. 01 Nov, 2018 1 commit
  18. 29 Oct, 2018 1 commit
    • Merging in fused adam optimizer, additional DDP features tested in 18.10 (#60) · e0bc5d62
      mcarilli authored
      * test passes
      
      * notes
      
      * Using C++-side flatten and unflatten functions
      
      * Adding csrc
      
      * Persistent synchronization event so it doesn't need to be created and destroyed each time
      
      * Interop with parameter flattening in SSD
      
      * Added deterministic option to imagenet main.py
      
      * Adding options to split gradient averaging and allreduce in pure fp32
      
      * Fixing allreduce_maybe_retain call
      
      * Fixing allreduce_fallback
      
      * Also sync active_i_buckets from rank 0
      
      * Making retain_allreduce_buffers compatible with/orthogonal to delay_allreduce=True|False
      
      * Correcting syntax error, now all seems to work with SSD
      
      * Optional cpp extension build
      
      * Add mixed precision adam optimizer (#59)
      
      * Add FusedAdam optimizer to Apex that places all the math into a CUDA kernel.
      
      * Added fixes to fused_adam to get it to work with a network.
      
      * wip work on python interface for adam with options
      
      * fix dispatch for halves, add Python options to handle optional half gradients and params
      
      * cleanup, get rid of grid-stride loop
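
The pieces merged here are the FusedAdam optimizer and the apex DistributedDataParallel options named in the messages above (delay_allreduce, retain_allreduce_buffers). A usage sketch intended for a distributed launcher, with keyword names taken from those messages and treated as illustrative of that era's API:

```python
# Usage sketch for the pieces merged here: FusedAdam plus apex's
# DistributedDataParallel with the options mentioned above. Intended to run
# under a distributed launcher; illustrative only.
import torch
from apex.optimizers import FusedAdam
from apex.parallel import DistributedDataParallel as DDP

torch.distributed.init_process_group(backend="nccl")
torch.cuda.set_device(torch.distributed.get_rank() % torch.cuda.device_count())

model = torch.nn.Linear(1024, 1024).cuda()

# delay_allreduce waits until the whole backward pass has finished before
# reducing gradients; retain_allreduce_buffers keeps the flattened allreduce
# buffers, which the commit above makes compatible with either setting.
model = DDP(model, delay_allreduce=True, retain_allreduce_buffers=False)

# FusedAdam performs the whole Adam update inside a single fused CUDA kernel.
optimizer = FusedAdam(model.parameters(), lr=1e-4)

x = torch.randn(32, 1024, device="cuda")
loss = model(x).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```
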
  19. 29 Sep, 2018 2 commits
  20. 14 May, 2018 1 commit
  21. 07 May, 2018 1 commit