
NLP: Bi-Encoders and Re-rankers (Cross-Encoder based Re-ranking)


Retrieve & Re-Rank
https://www.sbert.net/examples/applications/retrieve_rerank/README.html
Bi-Encoder vs. Cross-Encoder
https://www.sbert.net/examples/applications/cross-encoder/README.html

A Bi-Encoder uses BERT to encode each input text independently into an embedding, then filters texts by their cosine similarity scores. A Cross-Encoder instead takes both sentences together as one input and directly outputs a relevance score for the pair.

How should Bi- and Cross-Encoders be combined? First use the Bi-Encoder to retrieve the top-100 candidates, then use the Cross-Encoder to re-rank them and select the best one.

Combining Bi- and Cross-Encoders
Cross-Encoders achieve higher performance than Bi-Encoders; however, they do not scale well to large datasets.
Here, it can make sense to combine Cross- and Bi-Encoders, for example in Information Retrieval / Semantic Search scenarios:
First, you use an efficient Bi-Encoder to retrieve e.g. the top-100 most similar sentences for a query.
Then, you use a Cross-Encoder to re-rank these 100 hits by computing the score for every (query, hit) combination.
