Unverified commit 9e3b8b7a authored by Frank Lee, committed by GitHub

[doc] removed read-the-docs (#2932)

parent 77b88a38
colossalai.gemini.paramhooks
============================

.. automodule:: colossalai.gemini.paramhooks
   :members:
colossalai.gemini.placement\_policy
===================================

.. automodule:: colossalai.gemini.placement_policy
   :members:
colossalai.gemini
=================

.. automodule:: colossalai.gemini
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.gemini.memory_tracer
   colossalai.gemini.ophooks
   colossalai.gemini.paramhooks

.. toctree::
   :maxdepth: 2

   colossalai.gemini.chunk
   colossalai.gemini.chunk_mgr
   colossalai.gemini.gemini_context
   colossalai.gemini.gemini_mgr
   colossalai.gemini.placement_policy
   colossalai.gemini.stateful_tensor
   colossalai.gemini.stateful_tensor_container
   colossalai.gemini.stateful_tensor_mgr
   colossalai.gemini.tensor_placement_policy
   colossalai.gemini.tensor_utils
colossalai.gemini.stateful\_tensor
==================================

.. automodule:: colossalai.gemini.stateful_tensor
   :members:
colossalai.gemini.stateful\_tensor\_container
=============================================

.. automodule:: colossalai.gemini.stateful_tensor_container
   :members:
colossalai.gemini.stateful\_tensor\_mgr
=======================================

.. automodule:: colossalai.gemini.stateful_tensor_mgr
   :members:
colossalai.gemini.tensor\_placement\_policy
===========================================

.. automodule:: colossalai.gemini.tensor_placement_policy
   :members:
colossalai.gemini.tensor\_utils
===============================

.. automodule:: colossalai.gemini.tensor_utils
   :members:
colossalai.global\_variables
============================

.. automodule:: colossalai.global_variables
   :members:
colossalai.initialize
=====================

.. automodule:: colossalai.initialize
   :members:
colossalai.kernel.cuda\_native.layer\_norm
==========================================

.. automodule:: colossalai.kernel.cuda_native.layer_norm
   :members:
colossalai.kernel.cuda\_native.multihead\_attention
===================================================

.. automodule:: colossalai.kernel.cuda_native.multihead_attention
   :members:
colossalai.kernel.cuda\_native
==============================

.. automodule:: colossalai.kernel.cuda_native
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.kernel.cuda_native.layer_norm
   colossalai.kernel.cuda_native.multihead_attention
   colossalai.kernel.cuda_native.scaled_softmax
colossalai.kernel.cuda\_native.scaled\_softmax
==============================================

.. automodule:: colossalai.kernel.cuda_native.scaled_softmax
   :members:
colossalai.kernel.jit.bias\_dropout\_add
========================================

.. automodule:: colossalai.kernel.jit.bias_dropout_add
   :members:
colossalai.kernel.jit.bias\_gelu
================================

.. automodule:: colossalai.kernel.jit.bias_gelu
   :members:
colossalai.kernel.jit.option
============================

.. automodule:: colossalai.kernel.jit.option
   :members:
colossalai.kernel.jit
=====================

.. automodule:: colossalai.kernel.jit
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.kernel.jit.bias_dropout_add
   colossalai.kernel.jit.bias_gelu
   colossalai.kernel.jit.option
colossalai.kernel
=================

.. automodule:: colossalai.kernel
   :members:

.. toctree::
   :maxdepth: 2

   colossalai.kernel.cuda_native
   colossalai.kernel.jit
colossalai.logging.logger
=========================

.. automodule:: colossalai.logging.logger
   :members: