"...git@developer.sourcefind.cn:chenpangpang/diffusers.git" did not exist on "a5edb981a72ff9d516f3bea4e18b448ed0178751"
[refactor] Making the xformers mem-efficient attention activation recursive (#1493)
* Move the mem-efficient attention activation to the top and make it recursive (see the sketch below)
* Ran black; too bad there's no pre-commit hook?
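For context, a minimal sketch of what "recursive activation from the top" means here: walk the module tree once from the root and flip the xformers flag on every submodule that exposes the toggle, instead of enabling it block by block. This assumes standard `torch.nn` modules and a `set_use_memory_efficient_attention_xformers` toggle on attention blocks; the helper name and signature below are illustrative, not the library's exact API.

```python
import torch.nn as nn


def enable_xformers_recursively(root: nn.Module, enabled: bool = True) -> None:
    """Hypothetical helper: toggle xformers mem-efficient attention on a whole module tree."""

    def _recurse(module: nn.Module) -> None:
        # Toggle any submodule that knows about xformers attention.
        if hasattr(module, "set_use_memory_efficient_attention_xformers"):
            module.set_use_memory_efficient_attention_xformers(enabled)
        # Descend into children so nested attention blocks are covered too.
        for child in module.children():
            _recurse(child)

    _recurse(root)
```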
Co-authored-by: Benjamin Lefaudeux <benjamin@photoroom.com>