1. 25 Sep, 2018 1 commit
    • Update slim and fix minor issue in object detection (#5354) · f505cecd
      pkulzc authored
      * Merged commit includes the following changes:
      213899768  by Sergio Guadarrama:
      
          Fixes #3819.
      
      --
      213493831  by Sergio Guadarrama:
      
          Internal change
      
      212057654  by Sergio Guadarrama:
      
          Internal change
      
      210747685  by Sergio Guadarrama:
      
          For FPN, when use_depthwise is set to true, use a slightly modified MobileNet v1 config.
      
      --
      210128931  by Sergio Guadarrama:
      
          Allow user-defined current_step in NASNet.
      
      --
      209092664  by Sergio Guadarrama:
      
          Add quantized fine-tuning / training / eval and export to slim image classifier binaries.
      
      --
      207651347  by Sergio Guadarrama:
      
          Update mobilenet v1 docs to include revised tflite models.
      
      --
      207165245  by Sergio Guadarrama:
      
          Internal change
      
      207095064  by Sergio Guadarrama:
      
          Internal change
      
      PiperOrigin-RevId: 213899768
      
      * Update model_lib.py to fix eval_spec name issue.
      f505cecd
  2. 26 Jul, 2018 1 commit
    • Internal changes including PNASNet-5 mobile (#4895) · 696b69a4
      Chenxi Liu authored
      * PiperOrigin-RevId: 201234832
      
      * PiperOrigin-RevId: 202507333
      
      * PiperOrigin-RevId: 204320344
      
      * Add PNASNet-5 mobile network model and cell structure.
      
      PiperOrigin-RevId: 204735410
      
      * Add option to customize individual projection layer activation.
      
      PiperOrigin-RevId: 204776951
      696b69a4
  3. 19 Jun, 2018 1 commit
    • 1. Splits train_image_classifier into library and binary rule, to simplify reuse. (#4552) · 5bb9e6f3
      Mark Sandler authored
      2. Adds a flag that prevents imagenet.py from downloading label_to_names from GitHub and/or dumping it into the training directory (which might be read-only).
      3. Adds comments about how decay steps are computed, since they are computed differently with clones vs. sync replicas.
      4. Updates mobilenet.md to describe the training process using train_image_classifier.
      5. Adds a citation for the TF-Slim model library.
      
      PiperOrigin-RevId: 191955231
      
      PiperOrigin-RevId: 193254125
      
      PiperOrigin-RevId: 193371562
      
      PiperOrigin-RevId: 194085628
      
      PiperOrigin-RevId: 194857067
      
      PiperOrigin-RevId: 196125653
      
      PiperOrigin-RevId: 196589070
      
      PiperOrigin-RevId: 199522873
      
      PiperOrigin-RevId: 200351305
      5bb9e6f3
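      Point 3 of the commit above concerns the learning-rate decay schedule. A hedged sketch of that computation (the function name and flag-like parameters here are illustrative, not copied from train_image_classifier; the key idea is that sync-replica training aggregates gradients across workers per global step, so the schedule is compressed):

      ```python
      def compute_decay_steps(num_samples_per_epoch, batch_size,
                              num_epochs_per_decay, sync_replicas=False,
                              replicas_to_aggregate=1):
          # Steps per decay period when each clone advances the global step
          # once per batch.
          decay_steps = int(num_samples_per_epoch * num_epochs_per_decay
                            / batch_size)
          if sync_replicas:
              # With synchronous replicas, one global step aggregates
              # gradients from several workers, so fewer global steps
              # cover the same number of examples.
              decay_steps //= replicas_to_aggregate
          return decay_steps

      # Example: ImageNet-sized epoch, batch 32, decay every 2 epochs.
      steps = compute_decay_steps(1281167, 32, 2.0)
      ```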
  4. 13 Mar, 2018 1 commit
    • Pulling internal changes to github (#3583) · 376dc8dd
      Mark Sandler authored
      * Internal change.
      
      PiperOrigin-RevId: 187042423
      
      * Internal change.
      
      PiperOrigin-RevId: 187072380
      
      * Opensource float and eight-bit fixed-point mobilenet_v1 training and eval scripts.
      
      PiperOrigin-RevId: 187106140
      
      * Initial check-in for Mobilenet V2
      
      PiperOrigin-RevId: 187213595
      
      * Allow configuring batch normalization decay and epsilon in MobileNet v1
      
      PiperOrigin-RevId: 187425294
      
      * Allow overriding NASNet model HParams.
      
      This is a change to the API that will allow users to pass in their own configs
      to the building functions, which should make these APIs much more customizable
      for end-user cases.
      
      This change removes the use_aux_head argument from the model construction
      functions, which is no longer necessary given that the use_aux_head option is
      configurable in the model config. For example, for the mobile ImageNet model,
      the auxiliary head can be disabled using:
      
      config = nasnet.mobile_imagenet_config()
      config.set_hparam('use...
      376dc8dd
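      The truncated example in the commit message above shows the new HParams-override pattern. A minimal stand-in sketch of that pattern (the HParams class and default values here are illustrative placeholders, not the real slim implementation):

      ```python
      class HParams:
          """Toy stand-in for a hyperparameter container with set_hparam."""

          def __init__(self, **kwargs):
              self._params = dict(kwargs)

          def set_hparam(self, name, value):
              # Only existing hyperparameters may be overridden.
              if name not in self._params:
                  raise ValueError("Unknown hparam: %s" % name)
              self._params[name] = value

          def get(self, name):
              return self._params[name]


      def mobile_imagenet_config():
          # Assumed defaults for illustration only.
          return HParams(use_aux_head=1, num_cells=12, drop_path_keep_prob=1.0)


      # Disable the auxiliary head by overriding the config, as described above.
      config = mobile_imagenet_config()
      config.set_hparam('use_aux_head', 0)
      ```

      The point of the API change is that model-construction functions take the whole config object, so options like use_aux_head no longer need their own function arguments.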
  5. 27 Feb, 2018 1 commit
    • Internal changes for slim (#3448) · 629adffa
      pkulzc authored
      * Merged commit includes the following changes:
      186565198  by Sergio Guadarrama:
      
          Applied random_hsv_in_yiq in inception_preprocessing.
      
      --
      186501039  by Sergio Guadarrama:
      
          Applied random_hsv_in_yiq in inception_preprocessing.
      
      --
      186013907  by Sergio Guadarrama:
      
          Internal change
      
      185715309  by Sergio Guadarrama:
      
          Obviates the need for prepadding on mobilenet v1 and v2 for fully convolutional models.
      
      --
      184266252  by Sergio Guadarrama:
      
          Give build_nasnet_*() functions an optional flag use_aux_head,
          and add an internal-only arg scope to NasNetA*Cell._apply_drop_path().
      
      --
      183865228  by Sergio Guadarrama:
      
          Internal change
      
      179580924  by Sergio Guadarrama:
      
          Internal change
      
      177320302  by Sergio Guadarrama:
      
          Internal change
      
      177130184  by Sergio Guadarrama:
      
          Make slim nets tests faster by using smaller examples of oversized inputs.
      
      --
      176965289  by Sergio Guadarrama:
      
          Internal change
      
      176585260  by Sergio Guadarrama:
      
          Internal change
      
      176534973  by Sergio Guadarrama:
      
          Internal change
      
      175526881  by Sergio Guadarrama:
      
          Internal change
      
      174967704  by Sergio Guadarrama:
      
          Treat num_classes=0 same as None in a few slim nets overlooked by the recent
          change.
      
      --
      174443227  by Sergio Guadarrama:
      
          Internal change
      
      174281864  by Sergio Guadarrama:
      
          Internal change
      
      174249903  by Sergio Guadarrama:
      
          Fix nasnet image classification and object detection by moving the option to turn batch norm training on or off into its own arg_scope, used only by detection.
      
      --
      173954505  by Sergio Guadarrama:
      
          Merge pull request #2651 from sguada/tmp1
      
          Fixes imports
      
          Closes #2636
      
          ORIGINAL_AUTHOR=Jon Shlens <shlens@users.noreply.github.com>
          COPYBARA_INTEGRATE_REVIEW=https://github.com/tensorflow/models/pull/2636 from tensorflow:sguada-patch-1 19ff570f52df5ab655c00fb439129b201c5f2dce
      
      --
      173928094  by Sergio Guadarrama:
      
          Remove pending imports
      
      --
      
      PiperOrigin-RevId: 186565198
      
      * Remove internal links.
      629adffa
  6. 20 Nov, 2017 1 commit
  7. 15 Nov, 2017 1 commit
  8. 30 Oct, 2017 3 commits
  9. 29 Oct, 2017 1 commit
  10. 28 Oct, 2017 1 commit