Frederick Liu authored commit 892dac23:

[reuse] Add layers used in [Leveraging redundancy in attention with Reuse Transformers](https://arxiv.org/abs/2110.06821). PiperOrigin-RevId: 408969659
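The linked paper's core observation is that attention maps are often highly similar across adjacent layers, so a later layer can reuse an earlier layer's attention probabilities instead of recomputing them from queries and keys. A minimal NumPy sketch of that idea, not the actual layers added in this commit (all function and variable names here are hypothetical):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_probs(x, wq, wk):
    # Standard scaled dot-product attention probabilities.
    q, k = x @ wq, x @ wk
    return softmax(q @ k.T / np.sqrt(q.shape[-1]))

def attention_layer(x, wq, wk, wv, reuse_probs=None):
    # A "reuse" layer: when probabilities from an earlier layer are
    # supplied, skip the query/key projections and score computation.
    if reuse_probs is not None:
        probs = reuse_probs
    else:
        probs = attention_probs(x, wq, wk)
    return probs @ (x @ wv), probs

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                      # 4 tokens, dim 8
w = [rng.normal(size=(8, 8)) for _ in range(3)]  # wq, wk, wv

out1, probs1 = attention_layer(x, *w)                     # layer 1 computes attention
out2, _ = attention_layer(out1, *w, reuse_probs=probs1)   # layer 2 reuses it
```

Reusing `probs1` saves the query/key projections and the score matmul in the second layer, which is where the compute savings in the paper come from.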