[refactor] Making the xformers mem-efficient attention activation recursive (#1493)
* Moving the mem-efficient attention activation to the top + recursive
* black, too bad there's no pre-commit?
Co-authored-by: Benjamin Lefaudeux <benjamin@photoroom.com>
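A recursive activation along these lines can be sketched as follows. This is a minimal illustration, not the actual xformers implementation: the `Module`/`Attention` classes and the `use_mem_efficient` flag are hypothetical stand-ins for a framework module tree, showing how a top-level call can flip a setting on every nested attention submodule.

```python
class Module:
    """Hypothetical stand-in for a framework module with child submodules."""

    def __init__(self, *children):
        self._children = list(children)

    def children(self):
        return self._children


class Attention(Module):
    """Hypothetical attention block carrying a mem-efficient toggle."""

    def __init__(self, *children):
        super().__init__(*children)
        self.use_mem_efficient = False


def enable_mem_efficient(module):
    """Recursively enable mem-efficient attention on every submodule."""
    if isinstance(module, Attention):
        module.use_mem_efficient = True
    for child in module.children():
        enable_mem_efficient(child)


# A single call at the top of the model reaches arbitrarily nested blocks.
model = Module(Attention(), Module(Attention(Attention())))
enable_mem_efficient(model)
```

The point of making the activation recursive is that callers no longer need to know where attention blocks live inside the model hierarchy; one call on the root handles them all.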