Unverified commit f5c425c8 authored by Frank Lee, committed by GitHub

fixed the example docstring for booster (#3795)

parent 788e07db
@@ -23,27 +23,28 @@ class Booster:
     training with different precision, accelerator, and plugin.

     Examples:
-        >>> colossalai.launch(...)
-        >>> plugin = GeminiPlugin(stage=3, ...)
-        >>> booster = Booster(precision='fp16', plugin=plugin)
-
-        >>> model = GPT2()
-        >>> optimizer = Adam(model.parameters())
-        >>> dataloader = Dataloader(Dataset)
-        >>> lr_scheduler = LinearWarmupScheduler()
-        >>> criterion = GPTLMLoss()
-
-        >>> model, optimizer, lr_scheduler, dataloader = booster.boost(model, optimizer, lr_scheduler, dataloader)
-
-        >>> for epoch in range(max_epochs):
-        >>>     for input_ids, attention_mask in dataloader:
-        >>>         outputs = model(input_ids, attention_mask)
-        >>>         loss = criterion(outputs.logits, input_ids)
-        >>>         booster.backward(loss, optimizer)
-        >>>         optimizer.step()
-        >>>         lr_scheduler.step()
-        >>>         optimizer.zero_grad()
+        ```python
+        colossalai.launch(...)
+        plugin = GeminiPlugin(stage=3, ...)
+        booster = Booster(precision='fp16', plugin=plugin)
+
+        model = GPT2()
+        optimizer = Adam(model.parameters())
+        dataloader = Dataloader(Dataset)
+        lr_scheduler = LinearWarmupScheduler()
+        criterion = GPTLMLoss()
+
+        model, optimizer, lr_scheduler, dataloader = booster.boost(model, optimizer, lr_scheduler, dataloader)
+
+        for epoch in range(max_epochs):
+            for input_ids, attention_mask in dataloader:
+                outputs = model(input_ids, attention_mask)
+                loss = criterion(outputs.logits, input_ids)
+                booster.backward(loss, optimizer)
+                optimizer.step()
+                lr_scheduler.step()
+                optimizer.zero_grad()
+        ```

     Args:
         device (str or torch.device): The device to run the training. Default: 'cuda'.
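The workflow the docstring illustrates — wrap the training components via `boost()`, then route `backward()` through the booster so a plugin can intervene (loss scaling, gradient sharding, etc.) — can be sketched without ColossalAI or torch. This is a minimal, self-contained illustration of the pattern only: `ToyModel`, `ToyOptimizer`, and `ToyBooster` are hypothetical stand-ins, not the real API, and the real `booster.backward` takes `(loss, optimizer)` rather than raw values.

```python
# Hypothetical stand-ins illustrating the boost/backward pattern from the
# docstring. Not the ColossalAI API: the real Booster wraps torch modules
# and requires a launched distributed context.

class ToyModel:
    """Scalar linear model y = w * x with a manually tracked gradient."""
    def __init__(self):
        self.w = 0.0
        self.grad = 0.0

    def __call__(self, x):
        return self.w * x


class ToyOptimizer:
    """Plain gradient descent on the single weight."""
    def __init__(self, model, lr=0.1):
        self.model = model
        self.lr = lr

    def step(self):
        self.model.w -= self.lr * self.model.grad

    def zero_grad(self):
        self.model.grad = 0.0


class ToyBooster:
    """Pass-through stand-in: boost() returns the wrapped objects and
    backward() accumulates the gradient of the squared-error loss."""
    def boost(self, model, optimizer):
        # A real plugin would cast precision, shard parameters, etc.
        return model, optimizer

    def backward(self, pred, target, x, model):
        # d/dw of (w*x - target)^2 = 2 * (pred - target) * x
        model.grad += 2.0 * (pred - target) * x


booster = ToyBooster()
model = ToyModel()
optimizer = ToyOptimizer(model, lr=0.05)
model, optimizer = booster.boost(model, optimizer)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x
for epoch in range(5):
    for x, y in data:
        pred = model(x)
        booster.backward(pred, y, x, model)
        optimizer.step()
        optimizer.zero_grad()

print(round(model.w, 3))  # → 2.0
```

The design point the example captures is that the booster, not the user, performs `backward`: because every gradient passes through one object, a plugin can transparently scale the loss for fp16 or reduce gradients across ranks without changing the training loop.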