Commit 30555418 authored by Zhilin Yang, committed by GitHub

Update README.md

parent 3a0086fa
# Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context

This repository contains the code in both **PyTorch** and **TensorFlow** for our paper
>[Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context](http://arxiv.org/abs/1901.02860)
>Zihang Dai\*, Zhilin Yang\*, Yiming Yang, William W. Cohen, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov (\*: equal contribution)
......