vllm · commit f9644932 (unverified)

[CI] Ensure documentation build is checked in CI (#2842)

Authored Feb 12, 2024 by Simon Mo; committed via GitHub on Feb 12, 2024
Parent: a4211a4d
Showing 5 changed files with 14 additions and 1 deletion (+14, -1).
.buildkite/test-pipeline.yaml                    +7  -0
.buildkite/test-template.j2                      +3  -1
docs/source/conf.py                              +2  -0
docs/source/index.rst                            +1  -0
docs/source/quantization/fp8_e5m2_kv_cache.rst   +1  -0
.buildkite/test-pipeline.yaml
@@ -49,3 +49,10 @@ steps:
   commands:
   - pip install aiohttp
   - bash run-benchmarks.sh
+
+- label: Documentation Build
+  working_dir: "/vllm-workspace/docs"
+  no_gpu: True
+  commands:
+  - pip install -r requirements-docs.txt
+  - SPHINXOPTS=\"-W\" make html
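The new step installs the docs requirements and runs the standard Sphinx Makefile with SPHINXOPTS=\"-W\", which passes -W to sphinx-build so every warning is treated as an error and fails the pipeline. As a rough local equivalent (a sketch only, assuming the usual Sphinx Makefile layout with sources in docs/source and HTML output under docs/build):

    import subprocess

    # Roughly what `SPHINXOPTS="-W" make html` runs under the default Sphinx Makefile:
    # sphinx-build in strict mode (-W turns warnings into errors), building HTML.
    subprocess.run(
        ["sphinx-build", "-W", "-b", "html", "source", "build/html"],
        cwd="docs",   # run from the repository's docs/ directory
        check=True,   # a non-zero exit (any Sphinx warning) fails the step
    )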
.buildkite/test-template.j2
@@ -35,13 +35,15 @@ steps:
       - image: "{{ docker_image }}"
         command: ["bash"]
         args:
-        - "-c"
+        - '-c'
         - "'cd {{ (step.working_dir or default_working_dir) | safe }} && {{ step.command or (step.commands | join(' && ')) | safe }}'"
+      {% if not step.no_gpu %}
         resources:
           requests:
             nvidia.com/gpu: "{{ step.num_gpus or default_num_gpu }}"
           limits:
             nvidia.com/gpu: "{{ step.num_gpus or default_num_gpu }}"
+      {% endif %}
         env:
           - name: HF_TOKEN
             valueFrom:
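The template change adds a no_gpu escape hatch: when a step sets no_gpu: True (as the new Documentation Build step does), the rendered pod spec simply omits the nvidia.com/gpu resource requests and limits, so the job can run on a CPU-only agent. A minimal sketch of how the Jinja2 conditional behaves, using a stripped-down fragment rather than the real template:

    from jinja2 import Template

    # Stripped-down stand-in for the resources block now wrapped in {% if not step.no_gpu %}.
    fragment = Template(
        "{% if not step.no_gpu %}"
        "resources: {nvidia.com/gpu: {{ step.num_gpus or default_num_gpu }}}"
        "{% endif %}"
    )

    # A normal test step still requests GPUs...
    print(fragment.render(step={"num_gpus": 4}, default_num_gpu=1))
    # -> resources: {nvidia.com/gpu: 4}

    # ...while the docs step (no_gpu: True) renders no resources block at all.
    print(fragment.render(step={"no_gpu": True}, default_num_gpu=1))
    # -> (empty string)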
docs/source/conf.py
@@ -94,3 +94,5 @@ class MockedClassDocumenter(autodoc.ClassDocumenter):
 
 
 autodoc.ClassDocumenter = MockedClassDocumenter
+
+navigation_with_keys = False
docs/source/index.rst
@@ -89,6 +89,7 @@ Documentation
    :caption: Quantization
 
    quantization/auto_awq
+   quantization/fp8_e5m2_kv_cache
 
 .. toctree::
    :maxdepth: 2
docs/source/quantization/fp8_e5m2_kv_cache.rst
@@ -9,6 +9,7 @@ The FP8 data format retains 2~3 mantissa bits and can convert float/fp16/bflaot1
 Here is an example of how to enable this feature:
 
 .. code-block:: python
+
     from vllm import LLM, SamplingParams
     # Sample prompts.
     prompts = [
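This hunk touches the doc page's example of enabling the FP8 E5M2 KV cache. For context, the full example on that page looks roughly like the following; the model name, the remaining prompts, and the generate loop come from the published documentation rather than from this diff, so treat them as illustrative:

    from vllm import LLM, SamplingParams

    # Sample prompts.
    prompts = [
        "Hello, my name is",
        "The future of AI is",
    ]
    # Create a sampling params object.
    sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

    # kv_cache_dtype="fp8_e5m2" enables the FP8 E5M2 KV cache this page documents.
    llm = LLM(model="facebook/opt-125m", kv_cache_dtype="fp8_e5m2")
    outputs = llm.generate(prompts, sampling_params)
    for output in outputs:
        print(output.outputs[0].text)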