Commit f3250a1d authored by Hongkun Yu, committed by A. Unique TensorFlower

Update readme: add note for the future change of transformer folder.

PiperOrigin-RevId: 288531984
parent 3991a54f
@@ -3,11 +3,7 @@
The academic paper which describes BERT in detail and provides full results on a
number of tasks can be found here: https://arxiv.org/abs/1810.04805.
This repository contains a TensorFlow 2 implementation for BERT.
N.B. This repository is under active development. Though we intend
to keep the top-level BERT Keras model interface stable, expect continued
changes to the training code, utility function interface and flags.
This repository contains a TensorFlow 2.x implementation for BERT.
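For readers who just want a Keras-compatible BERT encoder in TF 2.x, a common pattern is to load one as a `hub.KerasLayer`; the sketch below is illustrative only, and the TF Hub URL plus the three-tensor input convention are assumptions rather than this repository's interface.

```python
import tensorflow as tf
import tensorflow_hub as hub  # assumed available: pip install tensorflow-hub

max_seq_len = 128
input_word_ids = tf.keras.layers.Input(shape=(max_seq_len,), dtype=tf.int32, name="input_word_ids")
input_mask = tf.keras.layers.Input(shape=(max_seq_len,), dtype=tf.int32, name="input_mask")
segment_ids = tf.keras.layers.Input(shape=(max_seq_len,), dtype=tf.int32, name="segment_ids")

# Hypothetical TF Hub BERT module; substitute whatever checkpoint you actually use.
bert_layer = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/1",
    trainable=True)
pooled_output, sequence_output = bert_layer([input_word_ids, input_mask, segment_ids])

# A single dense head on the pooled [CLS] representation gives a simple classifier.
logits = tf.keras.layers.Dense(2, name="classifier")(pooled_output)
model = tf.keras.Model(inputs=[input_word_ids, input_mask, segment_ids], outputs=logits)
```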
## Contents
* [Contents](#contents)
@@ -110,8 +106,8 @@ pip install tf-nightly
```
Warning: More detailed TPU-specific set-up instructions and a tutorial should come
along with the official TF 2.x release for TPU. Note that this repo is not
officially supported by the Google Cloud TPU team yet.
along with the official TF 2.x release for TPU. Note that this repo is not
officially supported by the Google Cloud TPU team until TF 2.1 is released.
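Until those official instructions are published, a minimal TF 2.x TPU initialization typically looks like the sketch below; the TPU address is a placeholder assumption, and around TF 2.1 the strategy still lived under `tf.distribute.experimental`.

```python
import tensorflow as tf

# Placeholder TPU name/address; on Cloud TPU this is usually the TPU node's name or gRPC address.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="grpc://10.0.0.1:8470")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

strategy = tf.distribute.experimental.TPUStrategy(resolver)
with strategy.scope():
  # Build the model under the strategy scope so its variables are placed on the TPU.
  model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
```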
## Process Datasets
......
# Transformer Translation Model
This is an implementation of the Transformer translation model as described in
the [Attention is All You Need](https://arxiv.org/abs/1706.03762) paper. The
implementation leverages tf.keras and ensures compatibility with TF 2.0.
implementation leverages tf.keras and ensures compatibility with TF 2.x.
**Note: this transformer folder will be integrated into the official/nlp
folder. Due to its dependencies, we will finish the refactoring after the Model
Garden 2.1 release.**
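As a quick, generic illustration of the attention mechanism the paper describes (not this repository's implementation), scaled dot-product attention can be written directly with TF 2.x ops:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
  """Computes softmax(Q K^T / sqrt(d_k)) V, the core operation of the Transformer."""
  matmul_qk = tf.matmul(q, k, transpose_b=True)      # (..., seq_len_q, seq_len_k)
  d_k = tf.cast(tf.shape(k)[-1], tf.float32)
  logits = matmul_qk / tf.math.sqrt(d_k)
  if mask is not None:
    logits += (mask * -1e9)                          # push masked positions toward zero weight
  weights = tf.nn.softmax(logits, axis=-1)           # attention weights over the key dimension
  return tf.matmul(weights, v), weights
```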
## Contents
* [Contents](#contents)
......