nohup.out (1.41 KB), committed by yuhai
Creation of the directory /Iterative_masking-master/results_new failed
Tokenize
MSA Transformer model imported
[10] 1 1
Number of sequences in /Iterative_masking-master/examples/PF00072.fasta:  73062
MSA Imported
We are using batch MSAs of 10 sequences
MSA converted into tokens tensor of size and type:
torch.Size([1, 10, 113]) torch.int64
Generate Class
MSA Transformer model imported
[10] 1 1
Number of sequences in /Iterative_masking-master/examples/PF00072.fasta:  73062
MSA Imported
We are using batch MSAs of 10 sequences
MSA converted into tokens tensor of size and type:
torch.Size([1, 10, 113]) torch.int64
Compute results from Class
Generating MSA with same size as the original one
Successfully created the directory /Iterative_masking-master/results_new/Generated_iter-20_pmask-0.1_seqs-100_(only-masked-sampled) 
---

### 1. The `int` built-in



int([x]) -> integer
int(x, base=10) -> integer

Convert a number or string to an integer, or return 0 if no arguments
are given.  If x is a number, return x.__int__().  For floating point
numbers, this truncates towards zero.

If x is not a number or if base is given, then x must be a string,
bytes, or bytearray instance representing an integer literal in the
given base.  The literal can be preceded by '+' or '-' and be surrounded
by whitespace.  The base defaults to 10.  Valid bases are 0 and 2-36.
Base 0 means to interpret the base from the string as an integer literal.
>>> int('0b100', base=0)
4
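A few quick checks of the behaviors the docstring describes — truncation toward zero for floats, sign and whitespace handling in strings, explicit bases, and base 0 inferring the base from the literal's prefix:

```python
# Floats truncate toward zero, in both directions
assert int(3.9) == 3
assert int(-3.9) == -3

# String literals may carry a sign and surrounding whitespace
assert int("  -42 ") == -42

# Explicit base, and base 0 reading the base from the '0b' prefix
assert int("ff", 16) == 255
assert int("0b100", base=0) == 4

# With no arguments, int() returns 0
assert int() == 0
```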