Commit 02961717 authored by Tri Dao, committed by GitHub

Merge pull request #95 from Quentin-Anthony/patch-1

Add gpt-neox adoption
parents a6ec1782 d2a69a55
@@ -25,6 +25,8 @@ PR or email us. We'd very much like to hear from you!
[library](https://www.mosaicml.com/blog/gpt-3-quality-for-500k). Composer is a
library for efficient neural network training.
- EleutherAI's [GPT-NeoX](https://github.com/EleutherAI/gpt-neox/pull/725). This is a research library for training large language transformer models at scale based on NVIDIA's Megatron-LM and Microsoft's DeepSpeed.
## MLPerf benchmarks
[MLPerf](https://mlcommons.org/en/) is a competitive machine learning performance benchmark. FlashAttention
...