1. 22 Apr, 2019 1 commit
  2. 18 Apr, 2019 4 commits
  3. 16 Apr, 2019 1 commit
  4. 11 Apr, 2019 1 commit
      prelu belongs in FP16_CASTS (#257) · 4dc711bc
      henrymai authored
      The main use of these functions (e.g. `torch.conv*`, `torch.prelu`) is via their `torch.nn`
      wrapper layers.
      
      The `torch.nn` layers are what hold the weights and call into these lower-level
      functions, passing the weights as a parameter in their `forward()` method.
      
      The `torch.conv*` functions are already in the `FP16_CASTS` list due to amp's philosophy of
      casting the parameters rather than the model/layer weights.
      
      Conceptually, `torch.prelu` is the same as the `torch.conv*` case: its weight parameter
      is passed in from its wrapper layer, `torch.nn.PReLU`.
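The cast-the-arguments philosophy described above can be sketched without torch. This is a toy illustration only, not apex's real code: `Tensor`, `FP16_CASTS`, and `fp16_cast` are hypothetical names standing in for the actual machinery.

```python
# Toy sketch (pure Python, no torch) of amp's approach: weights stay FP32 on
# the module, and functions on the cast list get their arguments cast instead.

class Tensor:
    """Stand-in for torch.Tensor, tracking only a dtype string."""
    def __init__(self, dtype):
        self.dtype = dtype
    def half(self):
        return Tensor("float16")

FP16_CASTS = ["conv2d", "prelu"]  # low-level functions whose args get cast

def fp16_cast(fn):
    """Wrap fn so float32 Tensor arguments are cast to float16 on the way in."""
    def wrapper(*args):
        args = [a.half() if isinstance(a, Tensor) and a.dtype == "float32" else a
                for a in args]
        return fn(*args)
    return wrapper

def prelu(input, weight):
    # The nn.PReLU layer keeps its weight in FP32 and passes it in here;
    # after wrapping, both arguments arrive as float16.
    return (input.dtype, weight.dtype)

prelu = fp16_cast(prelu)  # prelu is in FP16_CASTS, so it gets wrapped
out = prelu(Tensor("float16"), Tensor("float32"))  # -> ('float16', 'float16')
```

The point of the sketch: the wrapper sits at the function boundary, so the FP32 weight owned by the layer is cast per call rather than mutated in place.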
  5. 10 Apr, 2019 5 commits
  6. 09 Apr, 2019 1 commit
  7. 08 Apr, 2019 1 commit
  8. 05 Apr, 2019 3 commits
  9. 04 Apr, 2019 3 commits
  10. 03 Apr, 2019 1 commit
  11. 01 Apr, 2019 1 commit
  12. 31 Mar, 2019 1 commit
  13. 27 Mar, 2019 2 commits
  14. 26 Mar, 2019 2 commits
  15. 23 Mar, 2019 1 commit
  16. 22 Mar, 2019 4 commits
      [SyncBatchNorm] (#206) · 0a991543
      jjsjann123 authored
      Support 2-dimensional input, resolving issue #194
      
      Implementation:
        for 2-D input, switch the channel_last flag to true for a better memory
      access pattern in the kernel.
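For 2-D input of shape (N, C), the channel axis is already the last, fastest-varying axis, so a channel-last reduction reads memory contiguously. A minimal pure-Python sketch of the per-channel statistics batch norm computes (illustrative only, not the CUDA kernel):

```python
# Per-channel mean/variance for 2-D input x of shape (N, C), where the
# channel axis is the last axis -- the "channel_last" view of the data.
def channel_stats_2d(x):
    """x: list of N rows, each a list of C floats -> per-channel (mean, var)."""
    n, c = len(x), len(x[0])
    means = [sum(row[j] for row in x) / n for j in range(c)]
    vars_ = [sum((row[j] - means[j]) ** 2 for row in x) / n for j in range(c)]
    return means, vars_

means, vars_ = channel_stats_2d([[1.0, 2.0], [3.0, 4.0]])
# means == [2.0, 3.0]; vars_ == [1.0, 1.0]
```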
      Add prelu to list of torch overrides (#217) · 570fde70
      henrymai authored
      * Add prelu to list of torch overrides
      
      This is to fix the following error:
      
        File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in __call__
          result = self.forward(*input, **kwargs)
        File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/container.py", line 92, in forward
          input = module(input)
        File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in __call__
          result = self.forward(*input, **kwargs)
        File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/activation.py", line 722, in forward
          return F.prelu(input, self.weight)
        File "/opt/conda/lib/python3.6/site-packages/torch/nn/functional.py", line 1040, in prelu
          return torch.prelu(input, weight)
      RuntimeError: expected scalar type Half but found Float
      
      * Update torch_overrides.py
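The traceback above arises because the input reaching `torch.prelu` has been cast to Half while `nn.PReLU`'s weight is still Float, and the kernel rejects the mix. A minimal mock of the failure and the effect of the override (no torch; `Tensor` and `strict_prelu` are hypothetical stand-ins):

```python
# Sketch of the bug: a strict kernel that, like torch.prelu, refuses
# mixed-precision arguments.

class Tensor:
    def __init__(self, dtype):
        self.dtype = dtype
    def half(self):
        return Tensor("float16")

def strict_prelu(input, weight):
    if input.dtype != weight.dtype:
        raise RuntimeError("expected scalar type Half but found Float")
    return Tensor(input.dtype)

# Before the fix: Half input meets the layer's Float weight -> RuntimeError.
try:
    strict_prelu(Tensor("float16"), Tensor("float32"))
except RuntimeError as e:
    print(e)

# After the fix: with prelu on the overrides list, amp casts the weight
# before the call, so the dtypes match.
out = strict_prelu(Tensor("float16"), Tensor("float32").half())
```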
      Fix 'local variable 'optimizers_was_list' referenced before assignment' when amp.initialize() is called with optimizers=None (#218) · ba429e51
      enricoschroeder authored
      Check cuda version (#216) · 5b8faa29
      mcarilli authored
      * Adding Torch + bare-metal nvcc version check and container build tests
      
      * Putting a canary in the coalmine
      
      * canary proved elusive
      
      * Trying direct setup.py install
      
      * this should work
      
      * Removing canary
      
      * hopefully this works
  17. 21 Mar, 2019 3 commits
  18. 20 Mar, 2019 3 commits
  19. 19 Mar, 2019 2 commits