Backpropagate global and attention layers together (#9335)
* Merged commit includes the following changes:
326369548 by Andre Araujo:
Fix import issues.
--
326159826 by Andre Araujo:
Changed the implementation of the cosine weights from a Keras layer to a tf.Variable, in order to manually control L2 normalization.
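The idea behind cosine weights can be sketched framework-agnostically: L2-normalize both the features and the raw class-weight variable on every forward pass, then scale the resulting cosine similarities. This is a minimal NumPy sketch, not the DELF implementation; the function name `cosine_logits` and the `scale` value are illustrative assumptions.

```python
import numpy as np

def cosine_logits(features, weights, scale=30.0):
    """Cosine-similarity logits.

    Keeping `weights` as a raw variable (rather than inside a Keras
    layer) lets the caller apply L2 normalization explicitly on every
    forward pass, which is the motivation described in the commit above.
    """
    # L2-normalize features along the embedding dimension.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    # L2-normalize each class-weight column.
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    # Each logit is scale * cos(angle between feature and class weight).
    return scale * f @ w

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))   # 4 examples, 8-dim embeddings
w = rng.normal(size=(8, 3))       # 3 classes
logits = cosine_logits(feats, w)
assert logits.shape == (4, 3)
# Cosine similarity lies in [-1, 1], so logits lie in [-scale, scale].
assert np.all(np.abs(logits) <= 30.0 + 1e-6)
```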
--
326139082 by Andre Araujo:
Support local feature matching using ratio test.
To make it easy to choose which matching type to use, we rename a flag/argument and modify all related files to avoid breakage.
Also include a small change when computing nearest neighbors for geometric matching, parallelizing the computation (argument "n_jobs=-1"), which saves a little execution time.
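The ratio-test matching mentioned above can be illustrated with a small NumPy sketch: a candidate match is kept only when its nearest neighbor is sufficiently closer than the second-nearest one (Lowe's ratio test). This brute-force version is only illustrative; the function name `ratio_test_matches` and the default `ratio` are assumptions, and the actual code delegates nearest-neighbor search to a library call (with "n_jobs=-1" for parallelism) rather than computing the full distance matrix.

```python
import numpy as np

def ratio_test_matches(desc_a, desc_b, ratio=0.8):
    """Match rows of desc_a to rows of desc_b with Lowe's ratio test.

    Returns a list of (index_in_a, index_in_b) pairs that pass the test.
    Requires desc_b to contain at least two descriptors.
    """
    # Pairwise Euclidean distances, shape (num_a, num_b).
    dists = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    matches = []
    for i, row in enumerate(dists):
        nn1, nn2 = np.argsort(row)[:2]
        # Keep the match only if the best neighbor clearly beats the
        # second-best one.
        if row[nn1] < ratio * row[nn2]:
            matches.append((i, nn1))
    return matches

a = np.array([[0.0, 0.0], [10.0, 10.0]])
b = np.array([[0.1, 0.0], [5.0, 5.0], [10.0, 10.1]])
print(ratio_test_matches(a, b))  # [(0, 0), (1, 2)]
```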
--
326119848 by Andre Araujo:
Option to measure DELG latency taking binarization into account.
--
324316608 by Andre Araujo:
DELG global features training.
--
323693131 by Andre Araujo:
PY3 conversion for delf public lib.
--
321046157 by Andre Araujo:
Purely Google refactor
--
PiperOrigin-RevId: 326369548
* Added export of delg_model module.
* README update to explain training the DELG global features head
* Added guidelines for DELF hyperparameter values
* Fixed typo
* Added a mention of the remaining training flags.
* Merged commit includes the following changes:
334723489 by Andre Araujo:
Backpropagate global and attention layers together.
--
334228310 by Andre Araujo:
Enable scaling of local feature locations to the resized resolution.
--
PiperOrigin-RevId: 334723489
Co-authored-by: Andre Araujo <andrearaujo@google.com>