![Logo](https://storage.googleapis.com/model_garden_artifacts/TF_Model_Garden.png)

# TensorFlow Official Models

The TensorFlow official models are a collection of models
that use TensorFlow’s high-level APIs.
They are intended to be well-maintained, tested, and kept up to date
with the latest TensorFlow API.

They should also be reasonably optimized for fast performance while still
being easy to read.
These models are used as end-to-end tests, ensuring that the models run
with the same or improved speed and performance with each new TensorFlow build.

## More models to come!

The team is actively developing new models.
In the near future, we will add:

* State-of-the-art language understanding models.
* State-of-the-art image classification models.
* State-of-the-art object detection and instance segmentation models.

## Table of Contents

- [Models and Implementations](#models-and-implementations)
  * [Computer Vision](#computer-vision)
    + [Image Classification](#image-classification)
    + [Object Detection and Segmentation](#object-detection-and-segmentation)
  * [Natural Language Processing](#natural-language-processing)
  * [Recommendation](#recommendation)
- [How to get started with the official models](#how-to-get-started-with-the-official-models)

## Models and Implementations

### Computer Vision

#### Image Classification

| Model | Reference (Paper) |
|-------|-------------------|
| [MNIST](vision/image_classification) | A basic model to classify digits from the [MNIST dataset](http://yann.lecun.com/exdb/mnist/) |
| [ResNet](vision/image_classification) | [Deep Residual Learning for Image Recognition](https://arxiv.org/abs/1512.03385) |
| [EfficientNet](vision/image_classification) | [EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks](https://arxiv.org/abs/1905.11946) |

#### Object Detection and Segmentation

| Model | Reference (Paper) |
|-------|-------------------|
| [RetinaNet](vision/detection) | [Focal Loss for Dense Object Detection](https://arxiv.org/abs/1708.02002) |
| [Mask R-CNN](vision/detection) | [Mask R-CNN](https://arxiv.org/abs/1703.06870) |
| [ShapeMask](vision/detection) | [ShapeMask: Learning to Segment Novel Objects by Refining Shape Priors](https://arxiv.org/abs/1904.03239) |
| [SpineNet](vision/detection) | [SpineNet: Learning Scale-Permuted Backbone for Recognition and Localization](https://arxiv.org/abs/1912.05027) |

### Natural Language Processing

| Model | Reference (Paper) |
|-------|-------------------|
| [ALBERT (A Lite BERT)](nlp/albert) | [ALBERT: A Lite BERT for Self-supervised Learning of Language Representations](https://arxiv.org/abs/1909.11942) |
| [BERT (Bidirectional Encoder Representations from Transformers)](nlp/bert) | [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805) |
| [NHNet (News Headline generation model)](nlp/nhnet) | [Generating Representative Headlines for News Stories](https://arxiv.org/abs/2001.09386) |
| [Transformer](nlp/transformer) | [Attention Is All You Need](https://arxiv.org/abs/1706.03762) |
| [XLNet](nlp/xlnet) | [XLNet: Generalized Autoregressive Pretraining for Language Understanding](https://arxiv.org/abs/1906.08237) |

### Recommendation

| Model | Reference (Paper) |
|-------|-------------------|
| [NCF](recommendation) | [Neural Collaborative Filtering](https://arxiv.org/abs/1708.05031) |

## How to get started with the official models

* The models in the master branch are developed using TensorFlow 2,
and they target the TensorFlow [nightly binaries](https://github.com/tensorflow/tensorflow#installation)
built from the
[master branch of TensorFlow](https://github.com/tensorflow/tensorflow/tree/master).
* The stable versions targeting releases of TensorFlow are available
as tagged branches or [downloadable releases](https://github.com/tensorflow/models/releases).
* Model repository version numbers match the target TensorFlow release; for
example,
[release v2.2.0](https://github.com/tensorflow/models/releases/tag/v2.2.0)
is compatible with
[TensorFlow v2.2.0](https://github.com/tensorflow/tensorflow/releases/tag/v2.2.0).
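
For example, to work against a specific TensorFlow release, you can check out
the matching release tag (shown here for v2.2.0; substitute the tag that
matches your installed TensorFlow version):

```shell
# Clone only the tag that matches your TensorFlow release (v2.2.0 as an example)
git clone --branch v2.2.0 --depth 1 https://github.com/tensorflow/models.git
```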

Please follow the steps below before running models in this repository.

### Requirements

* The latest TensorFlow Model Garden release and TensorFlow 2
  * If you are on a version of TensorFlow earlier than 2.2, please
upgrade your TensorFlow to [the latest TensorFlow 2](https://www.tensorflow.org/install/).

```shell
pip3 install tf-nightly
```
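
To confirm which TensorFlow build is active in your environment, you can run a
quick, optional check (this assumes `python3` points at the environment you
installed into):

```shell
# Print the installed TensorFlow version
python3 -c "import tensorflow as tf; print(tf.__version__)"
```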

### Installation

#### Method 1: Install the TensorFlow Model Garden pip package

**tf-models-official** is the stable Model Garden package.
pip will install all models and dependencies automatically.

```shell
pip install tf-models-official
```

If you are using the NLP packages, please also install **tensorflow-text**:

```shell
pip install tensorflow-text
```

Please check out our [example](colab/fine_tuning_bert.ipynb)
to learn how to use the pip package.
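
You can also verify the installation from the command line. This is an
optional sanity check; it assumes `pip` and `python3` refer to the same
environment used above, and that the package exposes the `official` namespace
used throughout this repository:

```shell
# Show the installed Model Garden package and its version
pip show tf-models-official

# Check that the `official` namespace is importable
python3 -c "import official; print(official.__name__)"
```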

Note that **tf-models-official** may not include the latest changes in this
GitHub repository. To include the latest changes, you can install
**tf-models-nightly**, the nightly Model Garden package, which is created
automatically every day.

```shell
pip install tf-models-nightly
```

#### Method 2: Clone the source

1. Clone the GitHub repository:

```shell
git clone https://github.com/tensorflow/models.git
```

2. Add the top-level ***/models*** folder to the Python path.

```shell
export PYTHONPATH=$PYTHONPATH:/path/to/models
```

If you are using a Colab notebook, please set the Python path with `os.environ`.

```python
import os
os.environ['PYTHONPATH'] += ":/path/to/models"
```

3. Install other dependencies:

```shell
pip3 install --user -r official/requirements.txt
```
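
After these steps, the repository modules should be importable. As an optional
check (assuming the `/path/to/models` directory from step 2 is on
`PYTHONPATH`), you can confirm that the `official` package resolves to your
cloned repository:

```shell
# Print the location of the `official` package picked up from PYTHONPATH
python3 -c "import official; print(official.__file__)"
```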

Finally, if you are using the NLP packages, please also install
**tensorflow-text-nightly**:

```shell
pip3 install tensorflow-text-nightly
```

## Contributions

If you want to contribute, please review the [contribution guidelines](https://github.com/tensorflow/models/wiki/How-to-contribute).

## Citing TF Official Model Garden

To cite this repository:

```
@software{tfmodels2020github,
  author = {Chen Chen and Xianzhi Du and Le Hou and Jaeyoun Kim and Pengchong
  Jin and Jing Li and Yeqing Li and Abdullah Rashwan and Hongkun Yu},
  title = {TensorFlow Official Model Garden},
  url = {https://github.com/tensorflow/models/tree/master/official},
  year = {2020},
}
```