Commit cf5c8f47 authored by myhloli

docs: remove outdated documentation files

- Deleted .readthedocs.yaml files from multiple directories
- Removed outdated API and user guide documentation files
- Deleted command line usage examples
- Removed CUDA acceleration guide
parent cb57e84c
version: 2
build:
os: ubuntu-22.04
tools:
python: "3.10"
formats:
- epub
python:
install:
- requirements: next_docs/zh_cn/requirements.txt
sphinx:
configuration: next_docs/zh_cn/conf.py
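
This Read the Docs configuration installs the requirements from `next_docs/zh_cn/requirements.txt` and builds the Chinese docs with the Sphinx configuration in `next_docs/zh_cn/conf.py`. For a local preview outside Read the Docs, a minimal sketch (run from the repository root, assuming the requirements file pulls in Sphinx) is:

```bash
pip install -r next_docs/zh_cn/requirements.txt
# Build the HTML docs locally; output lands in next_docs/zh_cn/_build/html
sphinx-build -b html next_docs/zh_cn next_docs/zh_cn/_build/html
```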
# Ascend NPU Acceleration
## Introduction
This document describes how to run MinerU on Ascend NPUs. The content has been tested on a `Huawei Atlas 800T A2` server.
```
CPU: Kunpeng 920 aarch64 2.6GHz
NPU: Ascend 910B 64GB
OS: openEuler 22.03 (LTS-SP3) / Ubuntu 22.04.5 LTS
CANN: 8.0.RC2
Driver version: 24.1.rc2.1
```
Because setting up an environment adapted to Ascend NPUs is fairly involved, we recommend running MinerU in a Docker container.
Before running MinerU via Docker, make sure the host machine has a driver and firmware installed that support CANN 8.0.RC2.
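To confirm the host driver and firmware are in place before building the image, you can query the devices with the Ascend driver's `npu-smi` tool (run on the physical machine; the exact output depends on your driver version):

```bash
# List the NPUs and report driver/firmware status on the host
npu-smi info
```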
## Build the Image
Make sure your network connection is stable, then run the following commands to build the image.
```bash
wget https://gcore.jsdelivr.net/gh/opendatalab/MinerU@master/docker/ascend_npu/Dockerfile -O Dockerfile
docker build -t mineru_npu:latest .
```
If no errors occur during the build, the image was built successfully.
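As a quick sanity check (not part of the original guide), you can list the image you just built:

```bash
docker images mineru_npu:latest
```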
## Run the Container
```bash
docker run -it -u root --name mineru-npu --privileged=true \
--ipc=host \
--network=host \
--device=/dev/davinci0 \
--device=/dev/davinci1 \
--device=/dev/davinci2 \
--device=/dev/davinci3 \
--device=/dev/davinci4 \
--device=/dev/davinci5 \
--device=/dev/davinci6 \
--device=/dev/davinci7 \
--device=/dev/davinci_manager \
--device=/dev/devmm_svm \
--device=/dev/hisi_hdc \
-v /var/log/npu/:/usr/slog \
-v /usr/local/bin/npu-smi:/usr/local/bin/npu-smi \
-v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
mineru_npu:latest \
/bin/bash -c "echo 'source /opt/mineru_venv/bin/activate' >> ~/.bashrc && exec bash"
magic-pdf --help
```
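
As a hypothetical usage sketch (the `~/pdfs` host directory, the `/workspace` mount point, and the single `davinci0` device are placeholder assumptions, not part of the original guide), you can mount a directory of PDFs into the container and parse one directly:

```bash
# Mount a host directory containing PDFs and parse one inside the container
docker run -it -u root --privileged=true --ipc=host --network=host \
  --device=/dev/davinci0 \
  --device=/dev/davinci_manager \
  --device=/dev/devmm_svm \
  --device=/dev/hisi_hdc \
  -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
  -v ~/pdfs:/workspace \
  mineru_npu:latest \
  /bin/bash -c "source /opt/mineru_venv/bin/activate && magic-pdf -p /workspace/demo.pdf -o /workspace/output"
```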
# Ubuntu 22.04 LTS
### 1. Check if NVIDIA Drivers Are Installed
```sh
nvidia-smi
```
If you see information similar to the following, it means that the NVIDIA drivers are already installed, and you can skip Step 2.
> [!NOTE]
> The displayed `CUDA Version` should be >= 12.4; if it is lower than 12.4, please upgrade the driver.
```plaintext
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.133.07 Driver Version: 572.83 CUDA Version: 12.8 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name TCC/WDDM | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce RTX 3060 Ti WDDM | 00000000:01:00.0 On | N/A |
| 0% 51C P8 12W / 200W | 1489MiB / 8192MiB | 5% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
```
### 2. Install the Driver
If no driver is installed, use the following command:
```sh
sudo apt-get update
sudo apt-get install nvidia-driver-570-server
```
This installs the proprietary driver. After the installation completes, restart your computer:
```sh
reboot
```
### 3. Install Anaconda
If Anaconda is already installed, skip this step.
```sh
wget https://repo.anaconda.com/archive/Anaconda3-2024.06-1-Linux-x86_64.sh
bash Anaconda3-2024.06-1-Linux-x86_64.sh
```
In the final step, enter `yes`, close the terminal, and reopen it.
### 4. Create an Environment Using Conda
```bash
conda create -n mineru 'python=3.12' -y
conda activate mineru
```
### 5. Install Applications
```sh
pip install -U magic-pdf[full]
```
> [!TIP]
> After installation, you can check the version of `magic-pdf` using the following command:
>
> ```sh
> magic-pdf --version
> ```
### 6. Download Models
Refer to detailed instructions on [how to download model files](how_to_download_models_en.md).
### 7. Understand the Location of the Configuration File
After completing the [6. Download Models](#6-download-models) step, the script will automatically generate a `magic-pdf.json` file in the user directory and configure the default model path.
You can find the `magic-pdf.json` file in your user directory.
> [!TIP]
> The user directory for Linux is "/home/username".
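
If you want to confirm the file exists and inspect its contents, a quick look from the terminal is enough (the exact keys in the file depend on your MinerU version):

```sh
cat ~/magic-pdf.json
```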
### 8. First Run
Download a sample file from the repository and test it.
```sh
wget https://github.com/opendatalab/MinerU/raw/master/demo/pdfs/small_ocr.pdf
magic-pdf -p small_ocr.pdf -o ./output
```
### 9. Test CUDA Acceleration
If your graphics card has at least **6GB** of VRAM, follow these steps to test CUDA acceleration:
1. Modify the value of `"device-mode"` in the `magic-pdf.json` configuration file located in your home directory.
```json
{
"device-mode": "cuda"
}
```
2. Test CUDA acceleration with the following command:
```sh
magic-pdf -p small_ocr.pdf -o ./output
```
\ No newline at end of file
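
To get a rough sense of whether CUDA acceleration is taking effect, one simple approach (a sketch, assuming `"device-mode"` also accepts `"cpu"` for the baseline run) is to time the same document under both settings and compare:

```sh
# Run once with "device-mode": "cpu" and once with "device-mode": "cuda"
# in ~/magic-pdf.json, then compare the elapsed wall-clock time
time magic-pdf -p small_ocr.pdf -o ./output
```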
# Ubuntu 22.04 LTS
## 1. Check whether the NVIDIA driver is installed
```bash
nvidia-smi
```
If you see output similar to the following, the NVIDIA driver is already installed and you can skip step 2.
> [!NOTE]
> The displayed `CUDA Version` should be >= 12.4; if it is lower than 12.4, please upgrade the driver.
```plaintext
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.133.07 Driver Version: 572.83 CUDA Version: 12.8 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name TCC/WDDM | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce RTX 3060 Ti WDDM | 00000000:01:00.0 On | N/A |
| 0% 51C P8 12W / 200W | 1489MiB / 8192MiB | 5% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
```
## 2. Install the driver
If no driver is installed, install one with the following commands:
```bash
sudo apt-get update
sudo apt-get install nvidia-driver-570-server
```
This installs the proprietary driver. After the installation completes, restart the machine:
```bash
reboot
```
## 3. Install Anaconda
If conda is already installed, you can skip this step.
```bash
wget -U NoSuchBrowser/1.0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/Anaconda3-2024.06-1-Linux-x86_64.sh
bash Anaconda3-2024.06-1-Linux-x86_64.sh
```
In the final step, enter `yes`, then close the terminal and reopen it.
## 4. Create an environment with conda
```bash
conda create -n mineru 'python=3.12' -y
conda activate mineru
```
## 5. Install the application
```bash
pip install -U magic-pdf[full] -i https://mirrors.aliyun.com/pypi/simple
```
> [!TIP]
> After installation, you can check the version of `magic-pdf` with the following command:
>
> ```bash
> magic-pdf --version
> ```
## 6. Download models
For details, refer to [how to download model files](how_to_download_models_zh_cn.md).
## 7. Understand the location of the configuration file
After completing the [6. Download models](#6-download-models) step, the script automatically generates a `magic-pdf.json` file in your user directory and configures the default model path.
You can find the `magic-pdf.json` file in your user directory.
> [!TIP]
> The user directory on Linux is "/home/username".
## 8. First run
Download a sample file from the repository and test it:
```bash
wget https://gcore.jsdelivr.net/gh/opendatalab/MinerU@master/demo/pdfs/small_ocr.pdf
magic-pdf -p small_ocr.pdf -o ./output
```
## 9. Test CUDA acceleration
If your graphics card has at least **6GB** of VRAM, you can follow the steps below to test CUDA-accelerated parsing.
**1. Modify the value of `"device-mode"` in the `magic-pdf.json` configuration file in your user directory**
```json
{
"device-mode":"cuda"
}
```
**2. Run the following command to test CUDA acceleration**
```bash
magic-pdf -p small_ocr.pdf -o ./output
```
> [!TIP]
> Whether CUDA acceleration is in effect can be roughly judged from the per-stage timings printed in the log; CUDA-accelerated runs are normally faster than CPU runs.
# Windows 10/11
### 1. Install CUDA and cuDNN
You need to install a CUDA version that is compatible with torch's requirements. For details, please refer to the [official PyTorch website](https://pytorch.org/get-started/locally/).
- CUDA 11.8 https://developer.nvidia.com/cuda-11-8-0-download-archive
- CUDA 12.4 https://developer.nvidia.com/cuda-12-4-0-download-archive
- CUDA 12.6 https://developer.nvidia.com/cuda-12-6-0-download-archive
- CUDA 12.8 https://developer.nvidia.com/cuda-12-8-0-download-archive
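
After installing one of the toolkits above, you can confirm which CUDA toolkit version is on your PATH (a quick check; `nvcc` is installed by the CUDA toolkit):

```bash
nvcc --version
```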
### 2. Install Anaconda
If Anaconda is already installed, you can skip this step.
Download link: https://repo.anaconda.com/archive/Anaconda3-2024.06-1-Windows-x86_64.exe
### 3. Create an Environment Using Conda
```bash
conda create -n mineru 'python=3.12' -y
conda activate mineru
```
### 4. Install Applications
```
pip install -U magic-pdf[full]
```
> [!IMPORTANT]
> After installation, you can check the version of `magic-pdf` using the following command:
>
> ```bash
> magic-pdf --version
> ```
### 5. Download Models
Refer to detailed instructions on [how to download model files](how_to_download_models_en.md).
### 6. Understand the Location of the Configuration File
After completing the [5. Download Models](#5-download-models) step, the script will automatically generate a `magic-pdf.json` file in the user directory and configure the default model path.
You can find the `magic-pdf.json` file in your user directory.
> [!TIP]
> The user directory for Windows is "C:/Users/username".
### 7. First Run
Download a sample file from the repository and test it.
```powershell
wget https://github.com/opendatalab/MinerU/raw/master/demo/pdfs/small_ocr.pdf -O small_ocr.pdf
magic-pdf -p small_ocr.pdf -o ./output
```
### 8. Test CUDA Acceleration
If your graphics card has at least 6GB of VRAM, follow these steps to test CUDA-accelerated parsing performance.
1. **Overwrite the installation of torch and torchvision** with CUDA-enabled builds. (Please select the appropriate index-url based on your CUDA version; for more details, refer to the [PyTorch official website](https://pytorch.org/get-started/locally/).) A quick verification sketch follows after step 3.
```
pip install --force-reinstall torch torchvision --index-url https://download.pytorch.org/whl/cu124
```
2. **Modify the value of `"device-mode"`** in the `magic-pdf.json` configuration file located in your user directory.
```json
{
"device-mode": "cuda"
}
```
3. **Run the following command to test CUDA acceleration**:
```
magic-pdf -p small_ocr.pdf -o ./output
```
\ No newline at end of file
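
As a quick verification sketch (not part of the original guide), you can confirm that the reinstalled torch build actually detects the GPU before re-running the parse:

```bash
# Should print a +cuXXX torch build and "True" if CUDA is usable
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```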
# Windows 10/11
## 1. Install the CUDA environment
You need to install a CUDA version that meets torch's requirements; for details, refer to the [PyTorch official website](https://pytorch.org/get-started/locally/).
- CUDA 11.8 https://developer.nvidia.com/cuda-11-8-0-download-archive
- CUDA 12.4 https://developer.nvidia.com/cuda-12-4-0-download-archive
- CUDA 12.6 https://developer.nvidia.com/cuda-12-6-0-download-archive
- CUDA 12.8 https://developer.nvidia.com/cuda-12-8-0-download-archive
## 2. Install Anaconda
If conda is already installed, you can skip this step.
Download link:
https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/Anaconda3-2024.06-1-Windows-x86_64.exe
## 3. Create an environment with conda
```bash
conda create -n mineru 'python=3.12' -y
conda activate mineru
```
## 4. Install the application
```bash
pip install -U magic-pdf[full] -i https://mirrors.aliyun.com/pypi/simple
```
> [!IMPORTANT]
> After installation, you can check the version of `magic-pdf` with the following command:
>
> ```bash
> magic-pdf --version
> ```
## 5. Download models
For details, refer to [how to download model files](how_to_download_models_zh_cn.md).
## 6. Understand the location of the configuration file
After completing the [5. Download models](#5-download-models) step, the script automatically generates a `magic-pdf.json` file in your user directory and configures the default model path.
You can find the `magic-pdf.json` file in your user directory.
> [!TIP]
> The user directory on Windows is "C:/Users/username".
## 7. First run
Download a sample file from the repository and test it:
```powershell
wget https://github.com/opendatalab/MinerU/raw/master/demo/pdfs/small_ocr.pdf -O small_ocr.pdf
magic-pdf -p small_ocr.pdf -o ./output
```
## 8. Test CUDA acceleration
If your graphics card has at least **6GB** of VRAM, you can follow the steps below to test CUDA-accelerated parsing.
**1. Overwrite the installation of torch and torchvision with CUDA-enabled builds** (please select the appropriate index-url based on your CUDA version; for details, refer to the [PyTorch official website](https://pytorch.org/get-started/locally/))
```bash
pip install --force-reinstall torch torchvision --index-url https://download.pytorch.org/whl/cu124
```
**2. Modify the value of `"device-mode"` in the `magic-pdf.json` configuration file in your user directory**
```json
{
"device-mode":"cuda"
}
```
**3. Run the following command to test CUDA acceleration**
```bash
magic-pdf -p small_ocr.pdf -o ./output
```
> [!TIP]
> Whether CUDA acceleration is in effect can be roughly judged from the per-stage timings printed in the log; with CUDA acceleration enabled, runs are normally faster than on CPU.
Model downloads are divided into initial downloads and updates to the model directory. Please refer to the corresponding documentation for instructions on how to proceed.
# Initial download of model files
### Download the Model from Hugging Face
Use a Python Script to Download Model Files from Hugging Face
```bash
pip install huggingface_hub
wget https://github.com/opendatalab/MinerU/raw/master/scripts/download_models_hf.py -O download_models_hf.py
python download_models_hf.py
```
The Python script will automatically download the model files and configure the model directory in the configuration file.
The configuration file can be found in the user directory, with the filename `magic-pdf.json`.
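
To see where the script pointed the model directory, you can look for the directory entries in the generated config (a quick check; the exact key names, such as `models-dir`, vary by MinerU version and are an assumption here):

```bash
# Show the path entries written by the download script
grep -i "dir" ~/magic-pdf.json
```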
# How to update models previously downloaded
## 1. Models downloaded via Hugging Face or ModelScope
If you previously downloaded models via Hugging Face or ModelScope, you can rerun the Python script used for the initial download. This will automatically update the model directory to the latest version.
Model downloads are divided into an initial download and updates to the model directory. Please refer to the corresponding documentation for instructions on how to proceed.
# Initial download of model files
Model files can be downloaded from Hugging Face or ModelScope. Due to network issues, users in mainland China may fail to access Hugging Face; in that case, please use ModelScope.
<details>
<summary>Method 1: Download the models from Hugging Face</summary>
<p>Use a Python script to download the model files from Hugging Face</p>
<pre><code>pip install huggingface_hub
wget https://gcore.jsdelivr.net/gh/opendatalab/MinerU@master/scripts/download_models_hf.py -O download_models_hf.py
python download_models_hf.py</code></pre>
<p>The Python script automatically downloads the model files and configures the model directory in the configuration file</p>
</details>
## Method 2: Download the models from ModelScope
### Use a Python script to download the model files from ModelScope
```bash
pip install modelscope
wget https://gcore.jsdelivr.net/gh/opendatalab/MinerU@master/scripts/download_models.py -O download_models.py
python download_models.py
```
The Python script automatically downloads the model files and configures the model directory in the configuration file.
The configuration file can be found in the user directory, with the filename `magic-pdf.json`.
> [!TIP]
> The user directory on Windows is "C:\Users\username", on Linux "/home/username", and on macOS "/Users/username".
# How to update models previously downloaded
## 1. Models downloaded via Hugging Face or ModelScope
If you previously downloaded models via Hugging Face or ModelScope, you can rerun the download script used for the initial download; it will automatically update the model directory to the latest version.
version: 2
build:
os: ubuntu-22.04
tools:
python: "3.10"
formats:
- epub
python:
install:
- requirements: next_docs/requirements.txt
sphinx:
configuration: next_docs/en/conf.py
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
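
With the catch-all target above, any Sphinx builder name works as a make target; for example, running from the directory that contains this Makefile:

```bash
make help   # list the available Sphinx builders
make html   # build the HTML docs into _build/html
```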
[Deleted asset: "Read The Docs" logo, a 224×72 SVG; vector path data omitted]