Fix the type annotations of three parameters found in two constructors (#1268)
Summary: As their names suggest, the parameters `embedding_dim`, `ffn_embedding_dim`, and `num_attention_heads` should be annotated as `int`, not `float`. This matches the annotations already used in https://github.com/pytorch/fairseq/blob/b5f41f828b0ec9b67fa60aceb0778073d1b368b2/fairseq/modules/sparse_transformer_sentence_encoder.py#L22#L24.

Pull Request resolved: https://github.com/pytorch/fairseq/pull/1268

Differential Revision: D18372518

Pulled By: myleott

fbshipit-source-id: 666739b6270a975536785886068a975e07312bb0
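A minimal sketch of the kind of fix this PR makes (the class name, defaults, and body below are illustrative placeholders, not copied from fairseq): dimensions and head counts are integral quantities, so their annotations should be `int`.

```python
class EncoderLayerSketch:
    """Hypothetical stand-in for a transformer encoder layer constructor.

    Before the fix, parameters like these were annotated as ``float``;
    since they count embedding dimensions and attention heads, ``int``
    is the correct type.
    """

    def __init__(
        self,
        embedding_dim: int = 768,        # was incorrectly annotated as float
        ffn_embedding_dim: int = 3072,   # was incorrectly annotated as float
        num_attention_heads: int = 8,    # was incorrectly annotated as float
    ) -> None:
        self.embedding_dim = embedding_dim
        self.ffn_embedding_dim = ffn_embedding_dim
        self.num_attention_heads = num_attention_heads
```

Static checkers such as mypy rely on these annotations, so a `float` annotation here would wrongly allow calls like `EncoderLayerSketch(embedding_dim=768.5)` to type-check.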