Unverified Commit f6de0003 authored by Thomas Wolf, committed by GitHub

Merge pull request #1346 from BramVanroy/documentation

Add small note about the output of hidden states (closes #1332)
parents da2e47ad 15749bfc
@@ -452,6 +452,10 @@ outputs = model(input_ids, labels=labels)
loss, logits, attentions = outputs
```
### Using hidden states
By enabling the configuration option `output_hidden_states`, it was already possible to retrieve the hidden states of the encoder. In `pytorch-transformers` as well as `transformers`, the return value has changed slightly: `all_hidden_states` now also includes the hidden state of the embeddings in addition to those of the encoding layers. This allows users to easily access the embeddings' final state.
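As a minimal sketch of what this looks like in practice (the `bert-base-uncased` checkpoint and the tuple-style return ordering shown below are assumptions for illustration, matching the behaviour of the 2.x-era `transformers` API):

```python
import torch
from transformers import BertModel, BertTokenizer

# Illustrative checkpoint; any encoder model supports output_hidden_states the same way.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

input_ids = torch.tensor([tokenizer.encode("Hello, world!")])
with torch.no_grad():
    # With output_hidden_states=True (and attentions left off), the model returns
    # (last_hidden_state, pooler_output, all_hidden_states).
    outputs = model(input_ids)

all_hidden_states = outputs[2]
embedding_output = all_hidden_states[0]    # hidden state of the embeddings
last_layer_output = all_hidden_states[-1]  # hidden state of the final encoder layer
```

Note that `all_hidden_states` therefore contains one more tensor than the model has layers: the embedding output first, followed by the output of each encoding layer.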
### Serialization
Breaking change in the `from_pretrained()` method:
...