Bump up transformers version & Remove MistralConfig (#1254)
@@ -6,8 +6,8 @@ pyarrow # Required for Ray data.
 sentencepiece # Required for LLaMA tokenizer.
 numpy
 torch == 2.0.1
-transformers >= 4.33.1 # Required for Code Llama.
-xformers == 0.0.22
+transformers >= 4.34.0 # Required for Mistral.
+xformers == 0.0.22 # Required for Mistral.
 fastapi
 uvicorn[standard]
 pydantic < 2 # Required for OpenAI server.
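With transformers >= 4.34.0, `MistralConfig` is available upstream, which is what allows the vendored copy to be removed in this PR. A minimal sketch of relying on the upstream class directly (assuming only that the import works on 4.34.0, which is the release that added Mistral support):

```python
# Requires transformers >= 4.34.0, where Mistral support (including MistralConfig)
# ships upstream, so no locally maintained config class is needed.
from transformers import MistralConfig

# Defaults correspond to the Mistral-7B architecture.
config = MistralConfig()
print(config.model_type)  # -> "mistral"
```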