---
language:
- multilingual
---

# bert-base-multilingual-cased-sentence

Sentence Multilingual BERT (101 languages, cased, 12-layer, 768-hidden, 12-heads, 180M parameters)
is a representation-based sentence encoder for the 101 languages of Multilingual BERT.
It is initialized with Multilingual BERT and then fine-tuned on the English MultiNLI[1] corpus and on the dev set
of the multilingual XNLI[2] corpus.
Sentence representations are mean-pooled token embeddings, in the same manner as in Sentence-BERT[3].
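The mean pooling described above can be sketched as follows with the `transformers` library. This is a minimal sketch, not an official usage snippet: the hub model id is assumed from the card title, and `mean_pool` is a hypothetical helper name.

```python
import torch


def mean_pool(last_hidden_state: torch.Tensor,
              attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).to(last_hidden_state.dtype)
    summed = (last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # avoid division by zero
    return summed / counts


if __name__ == "__main__":
    # Model id assumed from the card title; adjust to the actual hub id.
    from transformers import AutoModel, AutoTokenizer

    name = "DeepPavlov/bert-base-multilingual-cased-sentence"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)

    batch = tokenizer(["Hello world", "Bonjour le monde"],
                      padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)

    # One 768-dimensional sentence vector per input sentence.
    embeddings = mean_pool(out.last_hidden_state, batch["attention_mask"])
    print(embeddings.shape)
```

Because padding tokens are masked out before averaging, sentences of different lengths in the same batch produce comparable vectors.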


[1]: Williams A., Nangia N. & Bowman S. (2017) A Broad-Coverage Challenge Corpus for Sentence Understanding
through Inference. arXiv preprint [arXiv:1704.05426](https://arxiv.org/abs/1704.05426)

[2]: Conneau A., Rinott R., Lample G., Williams A., Bowman S., Schwenk H. & Stoyanov V. (2018) XNLI: Evaluating Cross-lingual Sentence Representations.
arXiv preprint [arXiv:1809.05053](https://arxiv.org/abs/1809.05053)

[3]: Reimers N. & Gurevych I. (2019) Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks.
arXiv preprint [arXiv:1908.10084](https://arxiv.org/abs/1908.10084)