[refactor] Making the xformers mem-efficient attention activation recursive (#1493)
* Moving the mem-efficient attention activation to the top + recursive
* black, too bad there's no pre-commit ?
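The change described above, applying the mem-efficient attention switch recursively rather than only at the top-level module, can be sketched roughly as follows. This is an illustrative sketch, not the actual xformers code: the `Module` class, the `use_mem_efficient_attention` flag, and the `enable_mem_efficient_attention` helper are all hypothetical names standing in for the real framework types.

```python
# Hedged sketch of recursive activation (names are illustrative, not xformers API).

class Module:
    """Minimal stand-in for a framework module that can contain submodules."""
    def __init__(self, *children):
        self.children = list(children)
        self.use_mem_efficient_attention = False

def enable_mem_efficient_attention(module):
    """Flip the flag on this module, then recurse into every submodule,
    so nested attention blocks are covered too."""
    module.use_mem_efficient_attention = True
    for child in module.children:
        enable_mem_efficient_attention(child)

# Enabling at the root now reaches deeply nested blocks as well.
leaf = Module()
root = Module(Module(leaf))
enable_mem_efficient_attention(root)
print(leaf.use_mem_efficient_attention)  # True
```

The point of recursion here is that a flag set only on the top-level wrapper would leave inner attention layers untouched; walking the module tree guarantees consistent behavior throughout.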
Co-authored-by: Benjamin Lefaudeux <benjamin@photoroom.com>