Commit ad2f7136 authored by chenych

Fix FlashAttention "head dimension at most 128" bug

parent 77ee3a32
@@ -228,7 +228,12 @@ with torch.no_grad():
If a query has no negative texts, you can randomly sample passages from the whole corpus to use as negatives, as in [toy_finetune_data.jsonl](../../examples/finetune/toy_finetune_data.jsonl)
### Train
You can follow the steps below to train a reranker:
**Tips:**
Solutions for common error messages:
1. `RuntimeError: FlashAttention forward only supports head dimension at most 128`: add the `--use_flash_attn False` argument (see the command sketch at the end of this section);
**Normal reranker** (bge-reranker-base / bge-reranker-large / bge-reranker-v2-m3)
Reference: ../../examples/reranker
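
If you hit the FlashAttention head-dimension error above, the fix is to pass `--use_flash_attn False` to the training entry point. Below is a minimal sketch of what such a launch might look like; the module path (`FlagEmbedding.reranker.run`), model name, data path, and hyperparameter values are illustrative assumptions, so take the real command from ../../examples/reranker and simply append the flag.

```bash
# Sketch only: module path, model, data path, and hyperparameters are
# assumptions for illustration; copy the actual command from
# ../../examples/reranker and append --use_flash_attn False to it.
torchrun --nproc_per_node 1 \
  -m FlagEmbedding.reranker.run \
  --output_dir ./bge-reranker-finetuned \
  --model_name_or_path BAAI/bge-reranker-base \
  --train_data ../../examples/finetune/toy_finetune_data.jsonl \
  --learning_rate 6e-5 \
  --num_train_epochs 2 \
  --per_device_train_batch_size 1 \
  --train_group_size 8 \
  --max_len 512 \
  --use_flash_attn False  # work around the head-dimension-at-most-128 limit
```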