1. 18 Jul, 2017 3 commits
    • Enable inference with dynamic batch size in Faster RCNN. · 5d5fb7cc
      Derek Chow authored
      * Adds a util function to compute a mix of dynamic and static shapes
        preferring static when available.
      * Uses batch_multiclass_non_max_suppression function in postprocess_rpn
        instead of looping over static batch shape and performing
        multiclass_non_max_suppression.
      * Adds a new helper function _unpad_proposals_and_sample_boxclassifier_batch
        to sample from a batch of tensors possibly containing paddings.
      * Tests batch inference with various configurations of static shape via
        unittests.
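The mixed static/dynamic shape utility described in the first bullet can be sketched in plain Python. This is a minimal illustration of the idea only, not the repository's actual implementation, and the function name and signature here are assumptions: dimensions known statically at graph-construction time are kept, and unknown dimensions (`None`) fall back to the runtime value.

```python
def combined_shape(static_shape, dynamic_shape):
    """Prefer statically known dimensions; fall back to the
    runtime (dynamic) value where the static dim is None.

    Hypothetical helper for illustration -- in TensorFlow the static
    shape would come from tensor.shape.as_list() and the dynamic one
    from tf.shape(tensor).
    """
    return [s if s is not None else d
            for s, d in zip(static_shape, dynamic_shape)]

# Batch size unknown until runtime; anchor count and box dims static:
print(combined_shape([None, 300, 4], [8, 300, 4]))  # [8, 300, 4]
```

Preferring static dimensions keeps downstream shape inference intact wherever possible, while the dynamic fallback is what allows the batch dimension to vary at inference time.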
    • Changes to Batch Non-Max Suppression to enable batch inference. · 4d641f7f
      Derek Chow authored
      A few changes to prepare for batch inference:
      
      * Modify the return type of batch non max suppression to be a tuple of
        tensors so it can be reused for both stages of Faster RCNN without any
        confusion in the semantics implied by the keys used to represent the
        tensors.
      * Allow dynamic number of anchors (boxes) in addition to dynamic batch size.
      * Remove a redundant dynamic batch size test.
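Returning a fixed-size tuple of padded tensors plus a count of valid entries is the usual convention that makes per-image NMS results stackable across a batch of varying detection counts. A NumPy sketch of that padding convention follows; the function name and tuple layout are illustrative assumptions, not the actual `batch_multiclass_non_max_suppression` code.

```python
import numpy as np

def pad_detections(boxes, scores, max_detections):
    """Pad one image's post-NMS results to a fixed size and return a
    plain tuple (boxes, scores, num_valid) so results from every image
    in a batch can be stacked regardless of true detection count.

    Hypothetical helper for illustration only.
    """
    num_valid = boxes.shape[0]
    pad = max_detections - num_valid
    padded_boxes = np.pad(boxes, ((0, pad), (0, 0)))  # zero-pad rows
    padded_scores = np.pad(scores, (0, pad))
    return padded_boxes, padded_scores, num_valid

boxes = np.array([[0.1, 0.1, 0.5, 0.5], [0.2, 0.2, 0.6, 0.6]])
scores = np.array([0.9, 0.8])
b, s, n = pad_detections(boxes, scores, max_detections=4)
print(b.shape, s.shape, n)  # (4, 4) (4,) 2
```

A tuple return also avoids the ambiguity of a dict keyed by field names, since the same positions can mean RPN proposals in the first stage and class detections in the second.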
2. 06 Jul, 2017 1 commit
3. 15 Jun, 2017 1 commit