[InternLM] Add support for InternLM (#26302)
* Add `config.bias` to LLaMA to allow InternLM models to be ported as LLaMA checkpoints
* Rename `bias` -> `attention_bias` and add docstring
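For illustration, a minimal sketch of how the new flag can be used when instantiating a LLaMA-architecture model for a converted InternLM checkpoint. The layer/head counts below are placeholders, not values from this PR:

```python
from transformers import LlamaConfig, LlamaForCausalLM

# attention_bias=True adds bias terms to the q/k/v/o attention projections,
# matching InternLM's attention module; plain LLaMA keeps the default False.
config = LlamaConfig(
    hidden_size=4096,        # placeholder values for illustration
    num_attention_heads=32,
    num_hidden_layers=32,
    attention_bias=True,
)
model = LlamaForCausalLM(config)
```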