Add a default decoder_attention_mask for EncoderDecoderModel during training (#26752)
* Add a default decoder_attention_mask for EncoderDecoderModel during training

  Since we are already creating the default decoder_input_ids from the labels, we should also create a default decoder_attention_mask to go with it.

* Fix test constant that relied on manual_seed()

  The test was changed to use a decoder_attention_mask that ignores padding instead (which is the default one created by BERT when attention_mask is None).

* Create the decoder_attention_mask using decoder_input_ids instead of labels

* Fix formatting in test
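For context, a minimal sketch of the behaviour this change describes, assuming a standalone `shift_tokens_right` helper and illustrative `pad_token_id` / `decoder_start_token_id` values (this is not the exact library code):

```python
import torch

def shift_tokens_right(labels: torch.Tensor, pad_token_id: int, decoder_start_token_id: int) -> torch.Tensor:
    # Prepend the decoder start token and drop the last label to form decoder inputs.
    shifted = labels.new_zeros(labels.shape)
    shifted[:, 1:] = labels[:, :-1].clone()
    shifted[:, 0] = decoder_start_token_id
    # Replace the -100 loss-ignore index with the pad token so the ids are valid.
    shifted.masked_fill_(shifted == -100, pad_token_id)
    return shifted

pad_token_id, decoder_start_token_id = 0, 101  # illustrative values
labels = torch.tensor([[5, 6, 7, -100, -100],
                       [8, 9, 10, 11, -100]])

# Default decoder_input_ids derived from the labels.
decoder_input_ids = shift_tokens_right(labels, pad_token_id, decoder_start_token_id)

# Default decoder_attention_mask derived from decoder_input_ids (not labels),
# masking out the padding positions.
decoder_attention_mask = (decoder_input_ids != pad_token_id).long()
```

Building the mask from `decoder_input_ids` rather than `labels` matters because the shifted ids are offset by one position, so the mask must align with the tokens the decoder actually sees.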