Unverified Commit ed2a0adb authored by Nicolas Hug, committed by GitHub

Cleanup weight docs (#7074)



* _weight_size -> _file_size

* Better formatting of individual weights tables

* Remove file size from main tables to avoid confusion with weight size (as in RAM)

* Remove unnecessary (file size) suffix

* Fix CI error?

* Formatting
Co-authored-by: Philip Meier <github.pmeier@posteo.de>
parent 90cfb10d
@@ -364,7 +364,7 @@ class ViT_B_16_Weights(WeightsEnum):
                 }
             },
             "_ops": 17.564,
-            "_weight_size": 330.285,
+            "_file_size": 330.285,
             "_docs": """
                 These weights were trained from scratch by using a modified version of `DeIT
                 <https://arxiv.org/abs/2012.12877>`_'s training recipe.
@@ -390,7 +390,7 @@ class ViT_B_16_Weights(WeightsEnum):
                 }
             },
             "_ops": 55.484,
-            "_weight_size": 331.398,
+            "_file_size": 331.398,
             "_docs": """
                 These weights are learnt via transfer learning by end-to-end fine-tuning the original
                 `SWAG <https://arxiv.org/abs/2201.08371>`_ weights on ImageNet-1K data.
@@ -417,7 +417,7 @@ class ViT_B_16_Weights(WeightsEnum):
                 }
             },
             "_ops": 17.564,
-            "_weight_size": 330.285,
+            "_file_size": 330.285,
             "_docs": """
                 These weights are composed of the original frozen `SWAG <https://arxiv.org/abs/2201.08371>`_ trunk
                 weights and a linear classifier learnt on top of them trained on ImageNet-1K data.
@@ -443,7 +443,7 @@ class ViT_B_32_Weights(WeightsEnum):
                 }
             },
             "_ops": 4.409,
-            "_weight_size": 336.604,
+            "_file_size": 336.604,
             "_docs": """
                 These weights were trained from scratch by using a modified version of `DeIT
                 <https://arxiv.org/abs/2012.12877>`_'s training recipe.
@@ -469,7 +469,7 @@ class ViT_L_16_Weights(WeightsEnum):
                 }
             },
             "_ops": 61.555,
-            "_weight_size": 1161.023,
+            "_file_size": 1161.023,
             "_docs": """
                 These weights were trained from scratch by using a modified version of TorchVision's
                 `new training recipe
@@ -496,7 +496,7 @@ class ViT_L_16_Weights(WeightsEnum):
                 }
             },
             "_ops": 361.986,
-            "_weight_size": 1164.258,
+            "_file_size": 1164.258,
             "_docs": """
                 These weights are learnt via transfer learning by end-to-end fine-tuning the original
                 `SWAG <https://arxiv.org/abs/2201.08371>`_ weights on ImageNet-1K data.
@@ -523,7 +523,7 @@ class ViT_L_16_Weights(WeightsEnum):
                 }
             },
             "_ops": 61.555,
-            "_weight_size": 1161.023,
+            "_file_size": 1161.023,
             "_docs": """
                 These weights are composed of the original frozen `SWAG <https://arxiv.org/abs/2201.08371>`_ trunk
                 weights and a linear classifier learnt on top of them trained on ImageNet-1K data.
@@ -549,7 +549,7 @@ class ViT_L_32_Weights(WeightsEnum):
                 }
             },
             "_ops": 15.378,
-            "_weight_size": 1169.449,
+            "_file_size": 1169.449,
             "_docs": """
                 These weights were trained from scratch by using a modified version of `DeIT
                 <https://arxiv.org/abs/2012.12877>`_'s training recipe.
@@ -579,7 +579,7 @@ class ViT_H_14_Weights(WeightsEnum):
                 }
             },
             "_ops": 1016.717,
-            "_weight_size": 2416.643,
+            "_file_size": 2416.643,
             "_docs": """
                 These weights are learnt via transfer learning by end-to-end fine-tuning the original
                 `SWAG <https://arxiv.org/abs/2201.08371>`_ weights on ImageNet-1K data.
@@ -606,7 +606,7 @@ class ViT_H_14_Weights(WeightsEnum):
                 }
             },
             "_ops": 167.295,
-            "_weight_size": 2411.209,
+            "_file_size": 2411.209,
             "_docs": """
                 These weights are composed of the original frozen `SWAG <https://arxiv.org/abs/2201.08371>`_ trunk
                 weights and a linear classifier learnt on top of them trained on ImageNet-1K data.
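The rename from `_weight_size` to `_file_size` makes explicit that this metadata tracks the serialized checkpoint's size on disk, not the parameters' footprint in RAM. The two differ because serialization adds overhead (keys, tensor metadata, container framing). A minimal sketch of the distinction, using plain Python in place of real tensors (the toy state dict and its byte counts are illustrative assumptions, not torchvision code):

```python
import io
import pickle

# Toy "state dict": parameter name -> raw bytes standing in for
# float32 tensors (4 bytes per element).
state_dict = {
    "linear.weight": bytes(4 * 1000),  # 1000 float32 parameters
    "linear.bias": bytes(4 * 10),      # 10 float32 parameters
}

# In-RAM weight size: just the raw parameter bytes.
ram_bytes = sum(len(v) for v in state_dict.values())  # 4040

# On-disk file size: serialize and measure; pickle adds key names,
# length prefixes, and protocol framing on top of the raw bytes.
buf = io.BytesIO()
pickle.dump(state_dict, buf)
file_bytes = buf.getbuffer().nbytes

print(ram_bytes, file_bytes)  # file size exceeds the raw weight size
```

In torchvision itself the value is stored in megabytes (e.g. 330.285 for `ViT_B_16_Weights`' ImageNet-1K checkpoint) and, like the other underscore-prefixed fields, lives in each enum member's `meta` dict.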