Commit e1a1721b authored by Denis Savenkov, committed by Facebook GitHub Bot

Fixes D2Go GitHub CI missing apex dependency error from D43920637

Summary:
Pull Request resolved: https://github.com/facebookresearch/d2go/pull/506

Apparently D43920637 broke CI due to a missing dependency. Internally everything worked, presumably because the dependency was already included somewhere else.

Installing apex seems to be involved, and I didn't see a clean option for it with setuptools. For now, just move the apex-dependent optimizer code to a project-internal directory.
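
For illustration only (this is not what the diff does): a common alternative for an optional dependency like apex is to guard the import and register the apex-backed optimizer only when the package is actually installed. The sketch below reuses the registry, helpers, and config keys already present in this file (D2GO_OPTIM_MAPPER_REGISTRY, get_optimizer_param_groups, maybe_add_gradient_clipping, cfg.SOLVER.*); the guard itself is hypothetical.

import torch

try:
    import apex  # optional dependency; not installed in the GitHub CI environment
    HAS_APEX = True
except ImportError:
    HAS_APEX = False

if HAS_APEX:

    @D2GO_OPTIM_MAPPER_REGISTRY.register()
    def lamb(cfg, model: torch.nn.Module) -> torch.optim.Optimizer:
        # Same body as the registration removed below; only defined when apex exists.
        params = get_optimizer_param_groups(model, cfg)
        assert cfg.SOLVER.FUSED, "Only fused version of LAMB optimizer is supported"
        return maybe_add_gradient_clipping(cfg, apex.optimizers.FusedLAMB)(
            params=params,
            lr=cfg.SOLVER.BASE_LR,
            betas=cfg.SOLVER.BETAS,
            eps=cfg.SOLVER.EPS,
        )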

Reviewed By: ertrue, wat3rBro

Differential Revision: D44154348

fbshipit-source-id: 676597a82e052f87487849896ae79d48ebe3e61d
parent 847c6025
@@ -4,8 +4,6 @@ import itertools
 import logging
 from typing import Any, Dict, List, Optional, Union
-import apex
 import torch
 # FIXME: optimizer should not depend on quantization (or vice versa)
@@ -318,24 +316,6 @@ def adamw_mt(cfg, model: torch.nn.Module) -> torch.optim.Optimizer:
     )
-@D2GO_OPTIM_MAPPER_REGISTRY.register()
-def lamb(cfg, model: torch.nn.Module) -> torch.optim.Optimizer:
-    """
-    LAMB optimizer has been proposed in `Large Batch Optimization for Deep Learning:
-    Training BERT in 76 minutes` (https://arxiv.org/abs/1904.00962). It helped scale
-    LLM training to batch sizes of 32K samples.
-    """
-    params = get_optimizer_param_groups(model, cfg)
-    assert cfg.SOLVER.FUSED, "Only fused version of LAMB optimizer is supported"
-    return maybe_add_gradient_clipping(cfg, apex.optimizers.FusedLAMB)(
-        params=params,
-        lr=cfg.SOLVER.BASE_LR,
-        betas=cfg.SOLVER.BETAS,
-        eps=cfg.SOLVER.EPS,
-    )
 def build_optimizer_mapper(cfg, model):
     name = cfg.SOLVER.OPTIMIZER
     optimizer = D2GO_OPTIM_MAPPER_REGISTRY.get(name.lower())(cfg, model)
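
For context, build_optimizer_mapper dispatches purely by name through the registry, so removing the lamb registration only affects configs that request it. A minimal usage sketch; the config plumbing shown here is an assumption, not verified against D2Go.

# Hypothetical usage: select a registered mapper by name via cfg.SOLVER.OPTIMIZER.
cfg.SOLVER.OPTIMIZER = "adamw_mt"  # "adamw_mt" appears in the hunk header above
optimizer = build_optimizer_mapper(cfg, model)
# Internally this resolves D2GO_OPTIM_MAPPER_REGISTRY.get("adamw_mt")(cfg, model).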