Unverified Commit f4ad3d8c authored by Joe Davison, committed by GitHub

minor typo fix

*negative* log-likelihood
parent 57c1749e
@@ -18,8 +18,8 @@ that the metric applies specifically to classical language models (sometimes cal
 models) and is not well defined for masked language models like BERT (see :doc:`summary of the models
 <model_summary>`).

-Perplexity is defined as the exponentiated average log-likelihood of a sequence. If we have a tokenized sequence
-:math:`X = (x_0, x_1, \dots, x_t)`, then the perplexity of :math:`X` is,
+Perplexity is defined as the exponentiated average negative log-likelihood of a sequence. If we have a tokenized
+sequence :math:`X = (x_0, x_1, \dots, x_t)`, then the perplexity of :math:`X` is,

 .. math::
...
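The corrected definition above can be sketched in a few lines of Python. This is a toy illustration of "exponentiated average negative log-likelihood", not part of the patch; the `perplexity` helper and its inputs are hypothetical:

```python
import math

def perplexity(log_probs):
    """Perplexity of a sequence from its per-token log-probabilities.

    Implements the corrected definition: the exponentiated average
    *negative* log-likelihood of the sequence.
    """
    # Average negative log-likelihood over the sequence...
    nll = -sum(log_probs) / len(log_probs)
    # ...then exponentiate to get perplexity.
    return math.exp(nll)

# Sanity check: a uniform model over a 4-symbol vocabulary assigns each
# token probability 1/4, so its perplexity should be 4 regardless of length.
uniform = [math.log(0.25)] * 10
print(perplexity(uniform))  # ≈ 4.0, up to floating-point rounding
```

Note that without the word "negative" the formula would be `exp(+mean log-likelihood)`, which for the uniform example above would give 0.25 rather than 4, so the one-word fix changes the stated quantity materially.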