    Fix HuBERT xlarge configuration and test (#1811) · 13b2349a
    moto authored
    1. Fix the HuBERT xlarge model config.
    2. In the 48 transformer layers of the HuBERT xlarge model, a very small number of output elements deviate from the equivalent fairseq model by more than the default atol of 1e-5. This commit relaxes the tolerance to 3e-5 for that specific test (see the sketch below).
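
    As a rough illustration only (not the repository's actual test code), the comparison amounts to a PyTorch element-wise closeness check with the relaxed tolerance. The tensor names, shapes, and perturbation below are hypothetical stand-ins:

        import torch

        # Hypothetical stand-ins for the outputs of the torchaudio and
        # fairseq HuBERT xlarge models on the same input; shapes are
        # illustrative (embed dim 1280 matches the xlarge architecture).
        output_torchaudio = torch.randn(1, 49, 1280)
        output_fairseq = output_torchaudio + 2e-5 * torch.rand_like(output_torchaudio)

        # With the default atol of 1e-5, some elements would fail the check;
        # relaxing atol to 3e-5 tolerates the small numerical drift that
        # accumulates across the 48 transformer layers.
        torch.testing.assert_close(
            output_torchaudio,
            output_fairseq,
            atol=3e-5,
            rtol=1.3e-6,  # rtol must be given explicitly whenever atol is set
        )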
model.py 15.2 KB