1. 06 Sep, 2023 1 commit
  2. 01 Sep, 2023 1 commit
  3. 29 Aug, 2023 1 commit
    • Fix dyn pooling (#1768) · 7b8a28f5
      Brian Pickrell authored
      Adds support for dynamic input shapes in the pooling operator along with auto-padding. With this combination, the padding (and therefore the output shape) cannot be computed until runtime.
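A minimal sketch of the idea, not MIGraphX's actual implementation: once the dynamic input length is known at runtime, "same"-style auto-padding (SAME_UPPER semantics assumed here) and the output length can be computed per spatial dimension. The struct and function names are hypothetical.

```cpp
// Sketch: runtime computation of "same" auto-padding for one spatial
// dimension of a pooling op with a dynamic input shape.
#include <algorithm>
#include <cstdint>
#include <iostream>

struct dim_padding
{
    int64_t pad_begin;
    int64_t pad_end;
    int64_t out_len;
};

// With SAME_UPPER-style auto-padding, the output length is
// ceil(in / stride); the total padding is whatever makes that output
// length reachable with the given kernel and stride.
dim_padding compute_same_padding(int64_t in_len, int64_t kernel, int64_t stride)
{
    int64_t out_len   = (in_len + stride - 1) / stride; // ceiling division
    int64_t total_pad = std::max<int64_t>(0, (out_len - 1) * stride + kernel - in_len);
    int64_t pad_begin = total_pad / 2;
    int64_t pad_end   = total_pad - pad_begin; // extra padding goes at the end
    return {pad_begin, pad_end, out_len};
}

int main()
{
    // For a dynamic shape, in_len is only known at runtime.
    for(int64_t in_len : {7, 8, 15})
    {
        auto p = compute_same_padding(in_len, /*kernel=*/3, /*stride=*/2);
        std::cout << "in=" << in_len << " pads=(" << p.pad_begin << ","
                  << p.pad_end << ") out=" << p.out_len << "\n";
    }
}
```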
  4. 22 Aug, 2023 2 commits
  5. 21 Aug, 2023 1 commit
  6. 18 Aug, 2023 2 commits
  7. 13 Aug, 2023 1 commit
  8. 11 Aug, 2023 1 commit
  9. 10 Aug, 2023 1 commit
  10. 08 Aug, 2023 3 commits
  11. 06 Aug, 2023 2 commits
  12. 03 Aug, 2023 1 commit
  13. 31 Jul, 2023 2 commits
    • Lw/fix half shape (#2000) · e4dc75ea
      Lakhinder Walia authored
      * Use the shape of the instruction (instead of a default) in add_return()
      
      * Instruction validation fix: do not use a default shape value for comparison
      
      * Fix instruction::replace() to recompute the shape for "@return"
      
      * Handle the case of a missing shape in an instruction-related test
      
      * Use compute_shape() to get op shapes; add a test case for tuple_type
      
      * Add test case shape_test/return_shape_tuple
      
      * Add a test for @return to check the half type
      
      * Move @return unit tests around; address review comments
      
      * Broken comparison fix: comparison to a (default) shape of tuple_type
      
      * Test cases: (add) return_shape_empty and (modify) return_shape_tuple
      
      * Modify the assert() statement
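A minimal sketch of the fix's core idea, using hypothetical types rather than MIGraphX's real API: the "@return" shape is derived from the shapes of the returned values instead of a default shape, producing a tuple shape when there are multiple outputs, so a half-typed output stays half-typed.

```cpp
// Sketch: compute the shape of an "@return" from its inputs instead of
// defaulting it. Types and names here are illustrative only.
#include <cassert>
#include <string>
#include <vector>

struct shape
{
    std::string type = "float_type"; // e.g. "half_type", "tuple_type"
    std::vector<shape> sub_shapes{}; // non-empty only for tuple_type
};

shape compute_return_shape(const std::vector<shape>& outputs)
{
    assert(not outputs.empty());
    if(outputs.size() == 1)
        return outputs.front(); // single output: propagate its shape directly
    // Multiple outputs: wrap them in a tuple shape, never a default shape.
    return shape{"tuple_type", outputs};
}

int main()
{
    shape half{"half_type"};
    // A half-typed return must keep half_type, not a default float shape.
    assert(compute_return_shape({half}).type == "half_type");
    // Two outputs produce a tuple whose sub-shapes match the outputs.
    auto t = compute_return_shape({half, shape{"float_type"}});
    assert(t.type == "tuple_type" and t.sub_shapes.size() == 2);
}
```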
  14. 30 Jul, 2023 2 commits
  15. 28 Jul, 2023 2 commits
  16. 27 Jul, 2023 1 commit
  17. 26 Jul, 2023 1 commit
  18. 25 Jul, 2023 1 commit
  19. 23 Jul, 2023 1 commit
  20. 22 Jul, 2023 2 commits
  21. 21 Jul, 2023 2 commits
    • Add back clamping and add tests (#1969) · 6957243c
      Umang Yadav authored
      Fixes #1957
      
      Clamping was removed in #1853.
      
      It turns out clamping is necessary to handle overflow/underflow during downcasting: without clamping, a value that overflowed was returned as infinity.
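A minimal sketch of the idea, not the actual MIGraphX code: clamp to the finite fp16 range (±65504) before downcasting, so finite float values outside that range saturate instead of becoming infinity.

```cpp
// Sketch: saturate a float to the representable fp16 range before a
// float -> half downcast.
#include <algorithm>
#include <cassert>

float clamp_to_fp16_range(float x)
{
    constexpr float fp16_max = 65504.0f; // largest finite fp16 value
    return std::clamp(x, -fp16_max, fp16_max);
}

int main()
{
    // 70000 is finite in float but overflows fp16; clamping saturates it.
    assert(clamp_to_fp16_range(70000.0f) == 65504.0f);
    assert(clamp_to_fp16_range(-70000.0f) == -65504.0f);
    assert(clamp_to_fp16_range(1.5f) == 1.5f); // in-range values pass through
}
```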
    • Use `optimize_module` pass for the quantization to fp16 (#1974) · 6f1f4b59
      Umang Yadav authored
      Fixes #1746
      
      BatchNorm has only x as a runtime input in its inference equation y = gamma * (x - mean) / sqrt(var + eps) + beta. All the other parameters are compile-time constants, so the related operations can be const-folded before quantizing to fp16 to preserve precision.
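A minimal sketch of the const-folding opportunity, using a hypothetical helper rather than the `optimize_module` pass itself: with gamma, beta, mean, var, and eps all known at compile time, batchnorm reduces to y = a * x + b, where a and b can be precomputed in fp32 before the graph is quantized to fp16.

```cpp
// Sketch: fold batchnorm's compile-time constants into a scale and bias.
#include <cmath>
#include <iostream>

struct folded_bn
{
    float a; // gamma / sqrt(var + eps)
    float b; // beta - a * mean
};

folded_bn fold_batchnorm(float gamma, float beta, float mean, float var, float eps)
{
    float a = gamma / std::sqrt(var + eps);
    return {a, beta - a * mean};
}

int main()
{
    auto bn = fold_batchnorm(/*gamma=*/1.0f, /*beta=*/0.5f, /*mean=*/0.1f,
                             /*var=*/0.04f, /*eps=*/1e-5f);
    // At runtime only x remains: one multiply-add per element.
    float x = 2.0f;
    std::cout << "y = " << bn.a * x + bn.b << "\n";
}
```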
  22. 19 Jul, 2023 2 commits
  23. 16 Jul, 2023 1 commit
  24. 13 Jul, 2023 1 commit
    • Update deconvolution -> convolution_backwards and Dynamic Shape Support (#1801) · 4edf1195
      Charlie Lin authored
      * Renames deconvolution -> convolution_backwards to be more consistent with the literature. Note: this is not the cross-correlation operator (which is the adjoint of convolution); it is technically a standard convolution operator combined with an upsampling operator rather than a downsampling one (see the output-shape sketch below).
      
      * Adds unit tests for the padding, strides, dilations, and other op attributes.
      
      * Throws on the auto_pad attribute since it has not been implemented; previously the attribute was read and set but then never used.
      
      * Extends the operator for dynamic shapes. Does not support asymmetric padding (padding_L != padding_R) or the output_shape attribute together with dynamic shapes.
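A minimal sketch of the output-shape rule, assuming the standard backwards/transposed convolution formula (not necessarily MIGraphX's exact code) and symmetric padding:

```cpp
// Sketch: output length per spatial dimension for convolution_backwards,
// which inverts the forward convolution's downsampling.
#include <cstdint>
#include <iostream>

int64_t conv_backwards_out_len(
    int64_t in_len, int64_t kernel, int64_t stride, int64_t pad, int64_t dilation)
{
    // Forward conv: out = (in + 2*pad - dilation*(kernel - 1) - 1) / stride + 1.
    // Backwards inverts that relation:
    return (in_len - 1) * stride - 2 * pad + dilation * (kernel - 1) + 1;
}

int main()
{
    // A length-4 input with kernel 3, stride 2, pad 1, dilation 1 maps back
    // to length 7, the forward convolution's original input length.
    std::cout << conv_backwards_out_len(4, 3, 2, 1, 1) << "\n"; // prints 7
}
```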
  25. 10 Jul, 2023 3 commits
  26. 08 Jul, 2023 2 commits