".github/git@developer.sourcefind.cn:ox696c/ktransformers.git" did not exist on "ca1dc1e7d16958893aa4ef3e005ad419e55a4b71"
Add XLA to transformer (#7048)
* Set default steps to 300K.
* Log flags to perfzero.
* Add XLA support to transformer
- Moved config logic to keras_utils
- Added an enable_xla flag to the _performance flags (a sketch of this wiring is shown after the list below)
- Did not refactor the enable_xla flag out of Keras ResNet, because that path still relies on reading FLAGS inside the Estimator-based Keras code; that refactor is left for another time.
* Fix g3 lint complaint.
* Refactor set config into keras_utils.
* Move flags out of main.
* Pipe enable_xla through to the task (see the usage sketch below).
* Update official/transformer/v2/misc.py
Co-Authored-By: Reed <reedwm@google.com>
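
For readers unfamiliar with the change, here is a minimal sketch of how an enable_xla flag and a keras_utils config helper might fit together. The flag and function names below are illustrative assumptions, not necessarily the exact code in this PR:

```python
# Sketch only: illustrative names, not the exact code from this change.
from absl import flags
import tensorflow as tf

# A flag like this would live with the other _performance flags.
flags.DEFINE_bool(
    "enable_xla", default=False,
    help="Whether to enable XLA JIT compilation for the model.")


def set_session_config(enable_xla=False):
  """Applies process-wide configuration such as XLA (assumed helper)."""
  if enable_xla:
    # Turn on XLA JIT auto-clustering for supported TensorFlow ops.
    tf.config.optimizer.set_jit(True)
```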
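
And a usage sketch of piping the flag through from the task's main; again the structure is assumed for illustration and set_session_config is the helper sketched above:

```python
# Illustrative only: how the transformer task might consume the flag.
from absl import app
from absl import flags

FLAGS = flags.FLAGS


def main(_):
  # Apply XLA configuration before building and training the Keras model.
  # set_session_config is the assumed keras_utils helper from the sketch above.
  set_session_config(enable_xla=FLAGS.enable_xla)
  # ... build the transformer model and run training ...


if __name__ == "__main__":
  app.run(main)
```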