Removed the redundant SiLUActivation class. (#27136)
* Removed the redundant SiLUActivation class and use nn.functional.silu directly (see the sketch below).
* Replaced the mistakenly introduced torch.functional.silu with nn.SiLU.
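
A minimal sketch of the replacement described above, assuming the removed SiLUActivation class was a thin wrapper around the functional SiLU implementation (the class name comes from the commit; the wrapper body shown here is an assumption):

```python
import torch
from torch import nn

# Before (assumed shape of the redundant wrapper): a module that only
# forwarded its input to the functional SiLU implementation.
class SiLUActivation(nn.Module):
    def forward(self, input: torch.Tensor) -> torch.Tensor:
        return nn.functional.silu(input)

# After: use PyTorch's built-in module directly; nn.SiLU is equivalent.
act = nn.SiLU()

x = torch.randn(2, 3)
assert torch.allclose(act(x), nn.functional.silu(x))
```

Note that torch.functional.silu is not a valid PyTorch attribute; the functional form lives at torch.nn.functional.silu, and nn.SiLU is its module counterpart.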