OpenDAS / deepspeed · Commits

Commit 664fa30c (unverified)
Ai scale (#271)
Authored Jun 19, 2020 by Shaden Smith; committed by GitHub on Jun 19, 2020.
Parent: 224494bd

Showing 2 changed files with 10 additions and 0 deletions (+10, -0):
README.md (+5, -0)
docs/index.md (+5, -0)

README.md

...
@@ -16,6 +16,11 @@ a language model (LM) with over 17B parameters called
 [Turing-NLG](https://www.microsoft.com/en-us/research/blog/turing-nlg-a-17-billion-parameter-language-model-by-microsoft),
 establishing a new SOTA in the LM category.
+
+DeepSpeed is an important part of Microsoft’s new
+[AI at Scale](https://www.microsoft.com/en-us/research/project/ai-at-scale/)
+initiative to enable next-generation AI capabilities at scale, where you can find more
+information [here](https://innovation.microsoft.com/en-us/exploring-ai-at-scale).
 **_For further documentation, tutorials, and technical deep-dives please see [deepspeed.ai](https://www.deepspeed.ai/)!_**
...

docs/index.md

...
@@ -18,6 +18,11 @@ a language model (LM) with over 17B parameters called
 [Turing-NLG](https://www.microsoft.com/en-us/research/blog/turing-nlg-a-17-billion-parameter-language-model-by-microsoft),
 establishing a new SOTA in the LM category.
+
+DeepSpeed is an important part of Microsoft’s new
+[AI at Scale](https://www.microsoft.com/en-us/research/project/ai-at-scale/)
+initiative to enable next-generation AI capabilities at scale, where you can find more
+information [here](https://innovation.microsoft.com/en-us/exploring-ai-at-scale).
 # What's New?
 {% assign news = site.posts | where: "sneak_preview", "false" %}
 {% for post in news limit:5 %}
...
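The Liquid snippet above is truncated in this hunk, so the loop body is not visible in the diff. As a rough sketch only, assuming standard Jekyll post front matter (`post.date`, `post.title`, `post.url` are not taken from this commit), a "What's New?" loop of this shape is typically closed and rendered along these lines:

```liquid
{% comment %} Hypothetical rendering of the truncated loop; not part of this commit. {% endcomment %}
{% assign news = site.posts | where: "sneak_preview", "false" %}
{% for post in news limit:5 %}
* [{{ post.date | date: "%Y/%m/%d" }}] [{{ post.title }}]({{ post.url }})
{% endfor %}
```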