Unverified Commit b9ceb03d authored by Michael, committed by GitHub

[docs] Indent ordered list in add_new_model.md (#29796)

parent de81a677
@@ -192,46 +192,46 @@ its attention layer, etc. We will be more than happy to help you.
2. Clone your `transformers` fork to your local disk, and add the base repository as a remote:

   ```bash
   git clone https://github.com/[your Github handle]/transformers.git
   cd transformers
   git remote add upstream https://github.com/huggingface/transformers.git
   ```

3. Set up a development environment, for instance by running the following command:

   ```bash
   python -m venv .env
   source .env/bin/activate
   pip install -e ".[dev]"
   ```

   Depending on your OS, and since the number of optional dependencies of Transformers is growing, you might get a
   failure with this command. If that's the case, make sure to install the Deep Learning framework you are working with
   (PyTorch, TensorFlow and/or Flax) and then do:

   ```bash
   pip install -e ".[quality]"
   ```

   which should be enough for most use cases. You can then return to the parent directory:

   ```bash
   cd ..
   ```

4. We recommend adding the PyTorch version of *brand_new_bert* to Transformers. To install PyTorch, please follow the
   instructions on https://pytorch.org/get-started/locally/.

   **Note:** You don't need to have CUDA installed. Making the new model work on CPU is sufficient.

5. To port *brand_new_bert*, you will also need access to its original repository:

   ```bash
   git clone https://github.com/org_that_created_brand_new_bert_org/brand_new_bert.git
   cd brand_new_bert
   pip install -e .
   ```

Now you have set up a development environment to port *brand_new_bert* to 🤗 Transformers.
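The remote layout produced by step 2 can be sanity-checked with `git remote -v`. Here is a hedged sketch that demonstrates it in a throwaway repository so it runs anywhere; in practice you run the `git remote` commands inside your actual `transformers` clone, and the fork URL below is a placeholder for your own handle:

```bash
# Throwaway repo just to illustrate the expected remote layout.
rm -rf /tmp/remote-demo && mkdir /tmp/remote-demo && cd /tmp/remote-demo
git init -q

# 'origin' points at your fork (placeholder URL), 'upstream' at the base repo.
git remote add origin https://github.com/your-github-handle/transformers.git
git remote add upstream https://github.com/huggingface/transformers.git

# Should list fetch/push URLs for both 'origin' and 'upstream'.
git remote -v
```

With this layout, `git fetch upstream` pulls the latest base-repository history while `git push origin` goes to your fork, which is exactly what the later rebase-and-push steps rely on.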
@@ -421,29 +421,29 @@ You should do the following:
1. Create a branch with a descriptive name from your main branch:

   ```bash
   git checkout -b add_brand_new_bert
   ```

2. Commit the automatically generated code:

   ```bash
   git add .
   git commit
   ```

3. Fetch and rebase to current main:

   ```bash
   git fetch upstream
   git rebase upstream/main
   ```

4. Push the changes to your account using:

   ```bash
   git push -u origin a-descriptive-name-for-my-changes
   ```

5. Once you are satisfied, go to the webpage of your fork on GitHub. Click on "Pull request". Make sure to add the
   GitHub handle of some members of the Hugging Face team as reviewers, so that the Hugging Face team gets notified for
@@ -759,7 +759,7 @@ In case you are using Windows, you should replace `RUN_SLOW=1` with `SET RUN_SLO

</Tip>

Second, all features that are special to *brand_new_bert* should be tested additionally in a separate test under
`BrandNewBertModelTester`/`BrandNewBertModelTest`. This part is often forgotten but is extremely useful in two
ways:

- It helps to transfer the knowledge you have acquired during the model addition to the community by showing how the
@@ -776,7 +776,7 @@ It is very important to find/extract the original tokenizer file and to manage t
Transformers' implementation of the tokenizer.

To ensure that the tokenizer works correctly, it is recommended to first create a script in the original repository
that inputs a string and returns the `input_ids`. It could look similar to this (in pseudo-code):

```python
input_str = "This is a long example input string containing special characters .$?-, numbers 2872 234 12 and words."
@@ -827,7 +827,7 @@ the community to add some *Tips* to show how the model should be used. Don't hes
regarding the docstrings.

Next, make sure that the docstring added to `src/transformers/models/brand_new_bert/modeling_brand_new_bert.py` is
correct and includes all necessary inputs and outputs. We have a detailed guide about writing documentation and our
docstring format [here](writing-documentation). It is always good to remind oneself that documentation should
be treated at least as carefully as the code in 🤗 Transformers since the documentation is usually the first contact
point of the community with the model.