@@ -26,7 +26,9 @@ Please register [here](https://lu.ma/ygxbpzhl) and join us!
 ---
 
 *Latest News* 🔥
-- [2023/12] Added ROCm support to vLLM.
+- [2024/01] We hosted [the second vLLM meetup](https://lu.ma/ygxbpzhl) in SF! Please find the meetup slides [here](https://docs.google.com/presentation/d/12mI2sKABnUw5RBWXDYY-HtHth4iMSNcEoQ10jDQbxgA/edit?usp=sharing).
+- [2024/01] Added ROCm 6.0 support to vLLM.
+- [2023/12] Added ROCm 5.7 support to vLLM.
 - [2023/10] We hosted [the first vLLM meetup](https://lu.ma/first-vllm-meetup) in SF! Please find the meetup slides [here](https://docs.google.com/presentation/d/1QL-XPFXiFpDBh86DbEegFXBXFXjix4v032GhShbKf3s/edit?usp=sharing).
 - [2023/09] We created our [Discord server](https://discord.gg/jz7wjKhh6g)! Join us to discuss vLLM and LLM serving! We will also post the latest announcements and updates there.
 - [2023/09] We released our [PagedAttention paper](https://arxiv.org/abs/2309.06180) on arXiv!
...
@@ -45,7 +47,7 @@ vLLM is fast with:
 - Efficient management of attention key and value memory with **PagedAttention**