- 10 Mar, 2021 2 commits
-
-
Frederick Liu authored
PiperOrigin-RevId: 362075728
-
Frederick Liu authored
PiperOrigin-RevId: 362075728
-
- 12 Aug, 2020 2 commits
-
-
Hongkun Yu authored
PiperOrigin-RevId: 326286926
-
Hongkun Yu authored
PiperOrigin-RevId: 326286926
-
- 09 May, 2020 1 commit
-
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 310658964
-
- 25 Apr, 2020 1 commit
-
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 308369201
-
- 27 Sep, 2019 1 commit
-
-
A. Unique TensorFlower authored
PiperOrigin-RevId: 271611082
-
- 17 Sep, 2019 1 commit
-
-
Hongkun Yu authored
Move movielens to recommendation
PiperOrigin-RevId: 269680664
-
- 20 Apr, 2019 1 commit
-
-
Shining Sun authored
* Remove contrib imports, or move them inline
* Use exposed API for FixedLenFeature
* Replace tf.logging with absl logging
* Change GFile to v2 APIs
* Replace tf.logging with absl logging in movielens
* Fix an import bug
* Change gfile to v2 APIs in code
* Swap to Keras optimizer v2
* Bug fix for optimizer
* Change tf.log to tf.keras.backend.log
* Change the loss function to a Keras loss
* Convert another loss to a Keras loss
* Resolve comments and fix lint
* Add a docstring
* Fix existing tests and add new tests for DS
* Add tests for multi-replica
* Fix lint
* Resolve comments
* Make Estimator run in TF 2.0
* Use compat v1 loss
* Fix lint issue
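The substitutions listed above are small, mechanical API swaps. As a hedged illustration only (the `log_loss` helper, tensors, and file path below are made up, not code from this commit), they look roughly like this in TF 2 style:

```python
# Minimal sketch of the TF1 -> TF2 substitutions named above; names are illustrative.
from absl import logging  # replaces tf.logging
import tensorflow as tf


def log_loss(labels, predictions):
    # Hand-rolled losses move to tf.keras.losses; a raw log, if still needed,
    # becomes tf.keras.backend.log instead of tf.log.
    loss = tf.keras.losses.BinaryCrossentropy()(labels, predictions)
    logging.info("loss: %s", float(loss))  # absl logging instead of tf.logging.info
    return loss


# GFile moves to the v2 API: tf.gfile.GFile -> tf.io.gfile.GFile.
with tf.io.gfile.GFile("/tmp/example.txt", "w") as f:
    f.write("hello\n")
```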
-
- 01 Nov, 2018 1 commit
-
-
Taylor Robie authored
-
- 17 Sep, 2018 1 commit
-
-
Tayo Oguntebi authored
-
- 30 Jul, 2018 1 commit
-
-
Taylor Robie authored
* intermediate commit
* ncf now working
* reorder pipeline
* allow batched decode for file-backed dataset
* fix bug
* more tweaks
* parallelize false negative generation
* shared pool hack
* workers ignore SIGINT
* intermediate commit
* simplify buffer-backed dataset creation to fixed-length record approach only (more cleanup needed)
* more tweaks
* simplify pipeline
* fix misplaced cleanup() calls (validation works!)
* more tweaks
* sixify memoryview usage
* more sixification
* fix bug
* add future imports
* break up training input pipeline
* more pipeline tuning
* first pass at moving negative generation to async
* refactor async pipeline to use files instead of IPC
* refactor async pipeline
* move expansion and concatenation from reduce worker to generation workers
* abandon complete async due to interactions with the TensorFlow threadpool
* cleanup
* remove performance_comparison.py
* experiment with rough generator + interleave pipeline
* yet more pipeline tuning
* update on-the-fly pipeline
* refactor preprocessing, and move train generation behind a gRPC server
* fix leftover call
* intermediate commit
* intermediate commit
* fix index error in data pipeline, and add logging to train data server
* make sharding more robust to imbalance
* correctly sample with replacement
* file buffers are no longer needed for this branch
* tweak sampling methods
* add README for data pipeline
* fix eval sampling, and vectorize eval metrics
* add spillover and static training batch sizes
* clean up cruft from earlier iterations
* rough delint
* delint 2/n
* add type annotations
* update run script
* make run.sh a bit nicer
* change embedding initializer to match reference
* rough pass at pure Estimator model_fn
* impose static shape hack (revisit later)
* refinements
* fix dir error in run.sh
* add documentation
* add more docs and fix an assert
* old data test is no longer valid; keeping it around as reference for the new one
* rough draft of data pipeline validation script
* don't rely on shuffle default
* tweaks and documentation
* add separate eval batch size for performance
* initial commit
* terrible hacking
* mini hacks
* missed a bug
* messing about trying to get TPU running
* TFRecords-based TPU attempt
* bug fixes
* don't log remotely
* more bug fixes
* TPU tweaks and bug fixes
* more tweaks
* more adjustments
* rework model definition
* tweak data pipeline
* refactor async TFRecords generation
* temp commit to run.sh
* update log behavior
* fix logging bug
* add check for subprocess start to avoid cryptic hangs
* unify deserialize and make it TPU compliant
* delint
* remove gRPC pipeline code
* fix logging bug
* delint and remove old test files
* add unit tests for NCF pipeline
* delint
* clean up run.sh, and add run_tpu.sh
* forgot the most important line
* fix run.sh bugs
* yet more bash debugging
* small tweak to add Keras summaries to model_fn
* clean up sixification issues
* address PR comments
* delinting is never over
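One concrete detail in the list above, sampling training negatives with replacement, can be illustrated with a short, hedged sketch. This is not the pipeline's actual code; the function name, arguments, and rejection strategy are assumptions:

```python
# Illustrative only: draw NCF-style training negatives with replacement,
# rejecting items the user has already interacted with.
import numpy as np


def sample_negatives(positive_items, num_items, num_negatives, rng=None):
    """Returns `num_negatives` item ids sampled with replacement, none positive."""
    rng = rng or np.random.default_rng()
    positives = set(positive_items)
    negatives = []
    while len(negatives) < num_negatives:
        # Oversample uniformly, then filter out positives; duplicates are allowed.
        candidates = rng.integers(0, num_items, size=num_negatives)
        negatives.extend(int(c) for c in candidates if int(c) not in positives)
    return negatives[:num_negatives]


print(sample_negatives(positive_items=[3, 7, 42], num_items=1000, num_negatives=4))
```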
-
- 25 Jun, 2018 1 commit
-
-
Qianli Scott Zhu authored
-
- 20 Jun, 2018 1 commit
-
-
Taylor Robie authored
* begin branch
* finish download script
* rename download to dataset
* intermediate commit
* intermediate commit
* misc tweaks
* intermediate commit
* intermediate commit
* intermediate commit
* delint and update census test
* add movie tests
* delint
* fix py2 issue
* address PR comments
* intermediate commit
* intermediate commit
* intermediate commit
* finish wide-deep transition to vanilla movielens
* delint
* intermediate commit
* intermediate commit
* intermediate commit
* intermediate commit
* fix import
* add default ncf csv construction
* change default on download_if_missing
* shard and vectorize example serialization
* fix import
* update ncf data unit tests
* delint
* delint
* more delinting
* fix wide-deep movielens serialization
* address PR comments
* add file_io tests
* investigate wide-deep test failure
* remove hard-coded path and properly use flags
* address file_io test PR comments
* missed a hash_bucket_size
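The last item refers to a missed hash_bucket_size argument. As a hedged sketch only (the column keys, bucket sizes, and dimension below are made up, not the model's actual configuration), a hash-bucket feature column in the tf.feature_column API of that era looks like this:

```python
# Illustrative sketch of hash-bucket feature columns; keys and sizes are assumptions.
import tensorflow as tf

# Every categorical_column_with_hash_bucket call needs an explicit hash_bucket_size.
genres = tf.feature_column.categorical_column_with_hash_bucket(
    key="genres", hash_bucket_size=1000)
user_id = tf.feature_column.categorical_column_with_hash_bucket(
    key="user_id", hash_bucket_size=10000)

# Wide columns feed the linear part of a wide-deep model; deep columns are embedded.
wide_columns = [genres, user_id]
deep_columns = [tf.feature_column.embedding_column(user_id, dimension=16)]
```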
-