    Prevent excessive parallelism in PyTorch. · 80caf79d
    Aymeric Augustin authored
    We're already running as many processes in parallel as we have CPU cores.
    Furthermore, the number of cores may be incorrectly detected as 36
    (we've seen this with pytest-xdist), which compounds the problem.
    
    PyTorch performance craters without this.
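    
    The actual change lives in config.yml, presumably as CI environment
    settings; the following is only a minimal Python sketch of the same idea,
    assuming the standard torch.set_num_threads API and the usual
    OMP_NUM_THREADS / MKL_NUM_THREADS environment variables:
    
        import os
    
        import torch
    
        # With N worker processes and N intra-op threads each, PyTorch would
        # schedule N * N threads; capping each process at one thread avoids
        # the oversubscription described above.
        torch.set_num_threads(1)
    
        # A CI config would typically express the same cap via environment
        # variables set before the interpreter starts, e.g.:
        #   OMP_NUM_THREADS=1
        #   MKL_NUM_THREADS=1
        print(f"CPU cores reported: {os.cpu_count()}, "
              f"PyTorch intra-op threads: {torch.get_num_threads()}")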
config.yml 3.55 KB