Unverified Commit c9245c02 authored by Przemyslaw Tredak's avatar Przemyslaw Tredak Committed by GitHub

Move from Sphinx Autodoc to sphinx-autoapi (#92)



* Change from AutoDoc to AutoAPI
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>

* Fixes
Signed-off-by: Przemyslaw Tredak <ptredak@nvidia.com>

* WAR for the wrong autosummary generation
Signed-off-by: Przemyslaw Tredak <ptredak@nvidia.com>

* Change common to be in line with pytorch API docs
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>

* Add GitHub Action to build docs
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>

* Fix
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>

* Trying to fix the versions
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>

---------
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>
Signed-off-by: Przemyslaw Tredak <ptredak@nvidia.com>
parent 81429b80
# Copyright (c) 2022-2023, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
#
# See LICENSE for license information.

# A workflow to trigger the build of TE documentation on GitHub
name: 'Build documentation'

on:
  pull_request:
  workflow_dispatch:

jobs:
  build_docs:
    runs-on: ubuntu-latest
    steps:
      - name: 'Checkout'
        uses: actions/checkout@v3
      - name: 'Install dependencies'
        run: |
          pip install sphinx==5.1.1 sphinx_rtd_theme==1.0.0 nbsphinx==0.8.10 IPython ipython_genutils==0.2.0 ipywidgets==8.0.2
          pip install breathe==4.34.0 sphinx-autoapi==2.0.1
          sudo apt-get install -y pandoc graphviz doxygen
          export GIT_SHA=$(git show-ref --hash HEAD)
      - name: 'Build docs'
        run: |
          doxygen docs/Doxyfile
          cd docs
          make html
      - name: 'Upload docs'
        uses: actions/upload-artifact@v3
        with:
          name: te_docs
          path: docs/_build/html
          retention-days: 7
@@ -6,10 +6,7 @@
 Common API
 ==========

 Classes
 -------
-.. autoclass:: transformer_engine.common.recipe.Format
+.. autoapiclass:: transformer_engine.common.recipe.Format

-.. autoclass:: transformer_engine.common.recipe.DelayedScaling(margin=0, interval=1, fp8_format=Format.E4M3, amax_history_len=1, amax_compute_algo="most_recent", scaling_factor_compute_algo=None, override_linear_precision=(False, False, False))
+.. autoapiclass:: transformer_engine.common.recipe.DelayedScaling(margin=0, interval=1, fp8_format=Format.E4M3, amax_history_len=1, amax_compute_algo="most_recent", scaling_factor_compute_algo=None, override_linear_precision=(False, False, False))
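To make the documented signature above concrete, the defaults of `DelayedScaling` can be sketched as a plain dataclass. This is an illustrative stand-in only, not the real `transformer_engine.common.recipe` implementation; the `Format` values mirror the FP8 formats the library documents.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Optional, Tuple


class Format(Enum):
    # Mirrors the documented transformer_engine.common.recipe.Format members
    E4M3 = "E4M3"
    E5M2 = "E5M2"
    HYBRID = "HYBRID"


@dataclass
class DelayedScaling:
    # Stand-in mirroring the signature documented in the directive above
    margin: int = 0
    interval: int = 1
    fp8_format: Format = Format.E4M3
    amax_history_len: int = 1
    amax_compute_algo: str = "most_recent"
    scaling_factor_compute_algo: Optional[Callable] = None
    override_linear_precision: Tuple[bool, bool, bool] = (False, False, False)


recipe = DelayedScaling(margin=2)
print(recipe.fp8_format.name)  # E4M3
```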
@@ -6,29 +6,23 @@
 pyTorch
 =======

 Modules
 -------
-.. autoclass:: transformer_engine.pytorch.Linear(in_features, out_features, bias=True, **kwargs)
+.. autoapiclass:: transformer_engine.pytorch.Linear(in_features, out_features, bias=True, **kwargs)
   :members: forward

-.. autoclass:: transformer_engine.pytorch.LayerNorm(hidden_size, eps=1e-5, **kwargs)
+.. autoapiclass:: transformer_engine.pytorch.LayerNorm(hidden_size, eps=1e-5, **kwargs)

-.. autoclass:: transformer_engine.pytorch.LayerNormLinear(in_features, out_features, eps=1e-5, bias=True, **kwargs)
+.. autoapiclass:: transformer_engine.pytorch.LayerNormLinear(in_features, out_features, eps=1e-5, bias=True, **kwargs)
   :members: forward

-.. autoclass:: transformer_engine.pytorch.LayerNormMLP(hidden_size, ffn_hidden_size, eps=1e-5, bias=True, **kwargs)
+.. autoapiclass:: transformer_engine.pytorch.LayerNormMLP(hidden_size, ffn_hidden_size, eps=1e-5, bias=True, **kwargs)
   :members: forward

-.. autoclass:: transformer_engine.pytorch.DotProductAttention(num_attention_heads, kv_channels, **kwargs)
+.. autoapiclass:: transformer_engine.pytorch.DotProductAttention(num_attention_heads, kv_channels, **kwargs)
   :members: forward

-.. autoclass:: transformer_engine.pytorch.TransformerLayer(hidden_size, ffn_hidden_size, num_attention_heads, **kwargs)
+.. autoapiclass:: transformer_engine.pytorch.TransformerLayer(hidden_size, ffn_hidden_size, num_attention_heads, **kwargs)
   :members: forward

 Functions
 ---------
-.. autofunction:: transformer_engine.pytorch.fp8_autocast
+.. autoapifunction:: transformer_engine.pytorch.fp8_autocast

-.. autofunction:: transformer_engine.pytorch.checkpoint
+.. autoapifunction:: transformer_engine.pytorch.checkpoint
@@ -65,7 +65,8 @@ extensions = [
     'sphinx.ext.mathjax',
     'sphinx.ext.napoleon',
     'nbsphinx',
-    'breathe']
+    'breathe',
+    'autoapi.extension']

 templates_path = ['_templates']
 exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
@@ -97,3 +98,6 @@ napoleon_custom_sections = [('Parallelism parameters', 'params_style'),
 breathe_projects = {"TransformerEngine": os.path.abspath("doxygen/xml/")}
 breathe_default_project = "TransformerEngine"
+
+autoapi_generate_api_docs = False
+autoapi_dirs = ["../transformer_engine"]
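Taken together, the conf.py changes in this commit register sphinx-autoapi and point it at the package, while suppressing its default full API tree so that only the explicit `autoapiclass`/`autoapifunction` directives in the hand-written pages render. A sketch of how those pieces sit together in `docs/conf.py` (surrounding options are illustrative context, not part of the diff):

```python
# docs/conf.py -- sketch of the sphinx-autoapi pieces added in this commit
extensions = [
    'nbsphinx',
    'breathe',
    'autoapi.extension',  # registers sphinx-autoapi
]

# Let sphinx-autoapi parse the package (making autoapiclass/autoapifunction
# directives available) without emitting the auto-generated API tree.
autoapi_generate_api_docs = False
autoapi_dirs = ["../transformer_engine"]
```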
@@ -14,7 +14,7 @@ import torch
 from flash_attn.flash_attn_interface import flash_attn_unpadded_func

-from transformer_engine.pytorch import LayerNormLinear, Linear, LayerNormMLP, LayerNorm
+from transformer_engine.pytorch.module import LayerNormLinear, Linear, LayerNormMLP, LayerNorm
 from transformer_engine.pytorch.jit import (
     set_jit_fusion_options,
     warmup_jit_bias_dropout_add_all_dtypes,
......