Unverified Commit 00183ed5 authored by Shaden Smith, committed by GitHub

WIP tutorial warning (#224)

parent 53ac7947
@@ -3,6 +3,12 @@ title: "BERT Pre-training"
excerpt: ""
---
**Note:**
This tutorial is being updated to include new details for reproducing the
recent 44-minute [BERT pre-training record](https://www.microsoft.com/en-us/research/blog/zero-2-deepspeed-shattering-barriers-of-deep-learning-speed-scale/).
Please check back soon!
{: .notice--warning}
In this tutorial we will apply DeepSpeed to pre-train the BERT model
(**B**idirectional **E**ncoder **R**epresentations from **T**ransformers),
which is widely used for many Natural Language Processing (NLP) tasks. The
......