- 10 Nov, 2021 1 commit
Frederick Liu authored
[reuse] Add layers used in [Leveraging redundancy in attention with Reuse Transformers](https://arxiv.org/abs/2110.06821). PiperOrigin-RevId: 408969659
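The paper behind this commit proposes reusing attention probabilities computed in an earlier layer instead of recomputing them in later layers. As a rough illustration of that idea (not the actual layers added by this commit; `attention` and `reused_probs` are hypothetical names), a single-head sketch in NumPy might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, reused_probs=None):
    """Scaled dot-product attention.

    If `reused_probs` is given, skip the Q·K^T computation and reuse
    attention probabilities from an earlier layer (the Reuse
    Transformers idea, sketched here with hypothetical names).
    """
    if reused_probs is None:
        scores = q @ k.T / np.sqrt(q.shape[-1])
        probs = softmax(scores)
    else:
        probs = reused_probs  # reuse instead of recomputing
    return probs @ v, probs

# Toy example: "layer 1" computes attention, "layer 2" reuses it.
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
out1, probs = attention(q, k, v)
out2, _ = attention(q, k, v, reused_probs=probs)
```

In the paper, reuse is applied per head based on how redundant the attention maps are across layers; this sketch only shows the basic mechanism of passing cached probabilities forward.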