Fix HuBERT xlarge configuration and test (#1811)
1. Fix the HuBERT xlarge model config.
2. In the 48 transformer layers of the HuBERT xlarge model, a very small number of elements deviate from the equivalent fairseq model by more than the default atol of 1e-5. This commit relaxes the tolerance to 3e-5 for that specific test.
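As a minimal sketch of the tolerance change (the values and helper below are hypothetical, not taken from the actual test), an element-wise comparison against the fairseq outputs behaves like this at the two tolerances:

```python
import math

# Hypothetical illustration: compare two output vectors element-wise
# with an absolute tolerance, as the parity test does against fairseq.
def all_close(a, b, atol):
    return all(
        math.isclose(x, y, abs_tol=atol, rel_tol=0.0)
        for x, y in zip(a, b)
    )

ours = [0.100000, 0.200000, 0.300020]      # one element off by ~2e-5
fairseq = [0.100000, 0.200000, 0.300000]

assert not all_close(ours, fairseq, atol=1e-5)  # default atol: fails
assert all_close(ours, fairseq, atol=3e-5)      # relaxed atol: passes
```

A deviation of ~2e-5 exceeds the default atol of 1e-5 but falls within the relaxed 3e-5, which is why only this one test needed the looser tolerance.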