```
Creation of the directory /Iterative_masking-master/results_new failed

Tokenize
MSA Transformer model imported
[10] 1 1
Number of sequences in /Iterative_masking-master/examples/PF00072.fasta: 73062
MSA Imported
We are using batch MSAs of 10 sequences
MSA converted into tokens tensor of size and type: torch.Size([1, 10, 113]) torch.int64

Generate Class
MSA Transformer model imported
[10] 1 1
Number of sequences in /Iterative_masking-master/examples/PF00072.fasta: 73062
MSA Imported
We are using batch MSAs of 10 sequences
MSA converted into tokens tensor of size and type: torch.Size([1, 10, 113]) torch.int64

Compute results from Class
Generating MSA with same size as the original one
Successfully created the directory /Iterative_masking-master/results_new/Generated_iter-20_pmask-0.1_seqs-100_(only-masked-sampled)
```

---

### 1

```
int([x]) -> integer
int(x, base=10) -> integer

Convert a number or string to an integer, or return 0 if no arguments
are given. If x is a number, return x.__int__(). For floating point
numbers, this truncates towards zero.

If x is not a number or if base is given, then x must be a string,
bytes, or bytearray instance representing an integer literal in the
given base. The literal can be preceded by '+' or '-' and be
surrounded by whitespace. The base defaults to 10. Valid bases are 0
and 2-36. Base 0 means to interpret the base from the string as an
integer literal.

>>> int('0b100', base=0)
4
```
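The paired "failed" / "Successfully created" directory messages in the log above suggest a simple `os.makedirs` guard. A minimal sketch of that pattern (the helper name `try_create` is hypothetical, not from the repository):

```python
import os
import tempfile

def try_create(path):
    """Hypothetical helper mirroring the log's directory messages."""
    try:
        os.makedirs(path)  # raises OSError if the directory already exists
    except OSError:
        return f"Creation of the directory {path} failed"
    return f"Successfully created the directory {path}"
```

On a first run the results directory is created and the success message is printed; on a repeat run the directory already exists, `os.makedirs` raises, and the failure message is printed instead, which matches the log's first line.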
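The `int` docstring above can be illustrated with a few hedged examples of the conversions it describes:

```python
# Examples of the int() behaviors listed in the docstring.
assert int() == 0                   # no arguments -> 0
assert int(3.9) == 3                # floats truncate towards zero
assert int(-3.9) == -3              # truncation, not flooring
assert int("  +42 ") == 42          # sign and surrounding whitespace allowed
assert int("ff", base=16) == 255    # any base from 2 to 36
assert int("0b100", base=0) == 4    # base 0 infers the base from the prefix
```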