ModelZoo / FlagEmbedding_pytorch / Commits / ad2f7136

Commit ad2f7136 authored Aug 20, 2024 by chenych

Fix FA head dimension at most 128 bug

parent 77ee3a32

Changes: 1 changed file with 6 additions and 1 deletion

FlagEmbedding/llm_reranker/README.md (+6 −1)
...
@@ -228,7 +228,12 @@ with torch.no_grad():
 If there are no negative passages for a query, you can randomly sample passages from the whole corpus as negatives, e.g. [toy_finetune_data.jsonl](../../examples/finetune/toy_finetune_data.jsonl).
 ### Train
+You can follow the steps below to train the reranker:
+
+**Tips:**
+Solutions for error messages:
+1. `RuntimeError: FlashAttention forward only supports head dimension at most 128`: add the `--use_flash_attn False` argument;
-You can follow the steps below to train the reranker.
 **Standard reranker** (bge-reranker-base / bge-reranker-large / bge-reranker-v2-m3)
 See: ../../examples/reranker
...
...
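For context on the tip in the diff above: the `RuntimeError` appears because FlashAttention (at the time of this commit) only supported attention head dimensions up to 128, while head dimension is typically `hidden_size / num_attention_heads`. A minimal sketch of that check, assuming this standard head-dimension formula; the function name is hypothetical and not part of FlagEmbedding:

```python
def needs_flash_attn_disabled(hidden_size: int, num_attention_heads: int,
                              max_head_dim: int = 128) -> bool:
    """Return True if a model's per-head dimension exceeds the
    FlashAttention limit, i.e. the run should pass --use_flash_attn False.

    The 128 limit comes from the error message quoted in the diff above;
    the head-dimension formula is the usual transformer convention and is
    an assumption here, not taken from this repository.
    """
    head_dim = hidden_size // num_attention_heads
    return head_dim > max_head_dim


# e.g. hidden_size=4096 with 16 heads gives head_dim=256, which exceeds 128,
# so FlashAttention would need to be disabled for such a model.
print(needs_flash_attn_disabled(4096, 16))  # True
print(needs_flash_attn_disabled(4096, 32))  # False (head_dim=128 is allowed)
```

The numbers above are illustrative; in practice you would read `hidden_size` and `num_attention_heads` from the model's config before deciding whether to add the flag.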