"model/git@developer.sourcefind.cn:OpenDAS/ollama.git" did not exist on "05982a95cb9e053fadf309e60ec9ff2bc58ba32e"
Commit 19c17b74 authored by Emanuele Bugliarello, committed by Facebook Github Bot

Add option to disable positional embeddings in TransformerModel (#421)

Summary:
Add argument `--no-token-positional-embeddings` to TransformerModel (currently only available in TransformerLanguageModel) to disable positional embeddings.
Pull Request resolved: https://github.com/pytorch/fairseq/pull/421
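
For reference, the new option follows standard argparse `store_true` semantics: it is off by default and enabled only when the flag is passed. A minimal, self-contained illustration of that behavior is sketched below; the parser here is a stand-in, not fairseq's actual argument registration code.

```python
import argparse

# Stand-in parser mirroring the option added in this commit; not fairseq's own code.
parser = argparse.ArgumentParser()
parser.add_argument('--no-token-positional-embeddings', default=False, action='store_true',
                    help='if set, disables positional embeddings (outside self attention)')

args = parser.parse_args([])
print(args.no_token_positional_embeddings)  # False: positional embeddings stay enabled

args = parser.parse_args(['--no-token-positional-embeddings'])
print(args.no_token_positional_embeddings)  # True: positional embeddings are disabled
```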

Differential Revision: D13548450

Pulled By: myleott

fbshipit-source-id: b352c702ed1609e3b84d9a8404941d3274a7f883
parent 03a57dec
```diff
@@ -89,6 +89,8 @@ class TransformerModel(FairseqModel):
         parser.add_argument('--share-all-embeddings', action='store_true',
                             help='share encoder, decoder and output embeddings'
                                  ' (requires shared dictionary and embed dim)')
+        parser.add_argument('--no-token-positional-embeddings', default=False, action='store_true',
+                            help='if set, disables positional embeddings (outside self attention)')
         parser.add_argument('--adaptive-softmax-cutoff', metavar='EXPR',
                             help='comma separated list of adaptive softmax cutoff points. '
                                  'Must be used with adaptive_loss criterion'),
```
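
The hunk above only registers the command-line option; it does not show where the flag takes effect. As a rough illustration, a model built from these arguments would typically skip constructing positional embeddings when `args.no_token_positional_embeddings` is true. The sketch below is a hypothetical, simplified encoder; the class name, the plain `nn.Embedding` used for positions, and the `max_source_positions` attribute are assumptions, not code from this commit.

```python
# Hypothetical sketch of how the flag is typically consumed when building a model.
# SketchEncoder, the plain nn.Embedding used for positions, and args.max_source_positions
# are illustrative assumptions; they are not part of this commit's diff.
import torch
import torch.nn as nn


class SketchEncoder(nn.Module):
    def __init__(self, args, embed_tokens):
        super().__init__()
        self.embed_tokens = embed_tokens
        embed_dim = embed_tokens.embedding_dim
        # When --no-token-positional-embeddings is passed, skip building the
        # positional embedding table entirely.
        self.embed_positions = (
            None
            if args.no_token_positional_embeddings
            else nn.Embedding(args.max_source_positions, embed_dim)
        )

    def forward(self, src_tokens):
        # src_tokens: (batch, seq_len) token indices
        x = self.embed_tokens(src_tokens)
        if self.embed_positions is not None:
            positions = torch.arange(src_tokens.size(1), device=src_tokens.device)
            x = x + self.embed_positions(positions)  # broadcast over the batch dim
        return x
```

With the flag set, the encoder output consists of token embeddings only, which is the behavior this commit makes available to the sequence-to-sequence TransformerModel in addition to TransformerLanguageModel.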