bge-reranker-v2-m3

BGE-Reranker-v2-M3 is BAAI's multilingual cross-encoder reranker built on XLM-RoBERTa, designed for re-ranking retrieved passages in multilingual RAG or search pipelines. It jointly encodes query-passage pairs to produce relevance scores, providing higher accuracy than bi-encoder similarity for the same candidate set. Apache 2.0 licensed with text-embeddings-inference support.

Use cases

  • Re-ranking multilingual retrieval results in RAG pipelines for higher precision
  • Cross-lingual passage ranking (query and passage in different languages)
  • Second-stage ranking in multilingual search systems
  • Relevance scoring for multilingual FAQ and document retrieval
  • Improving retrieval quality over BGE-M3 dense retrieval as a reranker pair
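The BGE-M3 pairing above follows the standard two-stage pattern: a cheap, recall-oriented dense retriever gathers a candidate pool, then the cross-encoder re-scores each query-passage pair for precision. A minimal sketch of that pipeline, where `dense_retrieve` and `rerank_score` are hypothetical stand-ins (here implemented as toy word-overlap functions, not real model calls):

```python
from typing import Callable, List, Tuple

def two_stage_search(
    query: str,
    corpus: List[str],
    dense_retrieve: Callable[[str, List[str], int], List[str]],  # stage 1: fast, recall-oriented
    rerank_score: Callable[[str, str], float],                   # stage 2: cross-encoder, precision-oriented
    k: int = 100,
    top_n: int = 10,
) -> List[Tuple[str, float]]:
    """Retrieve a broad candidate pool, then re-rank it pair by pair."""
    candidates = dense_retrieve(query, corpus, k)
    scored = [(p, rerank_score(query, p)) for p in candidates]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_n]

# Toy stand-ins so the sketch runs end to end (word overlap ~ similarity).
def toy_retrieve(query: str, corpus: List[str], k: int) -> List[str]:
    qs = set(query.lower().split())
    return sorted(corpus, key=lambda p: -len(qs & set(p.lower().split())))[:k]

def toy_score(query: str, passage: str) -> float:
    qs = set(query.lower().split())
    return len(qs & set(passage.lower().split())) / max(len(qs), 1)

docs = [
    "pandas live in China",
    "rust is a systems language",
    "the giant panda eats bamboo",
]
results = two_stage_search("giant panda bamboo", docs, toy_retrieve, toy_score, k=3, top_n=2)
```

In a real deployment, `dense_retrieve` would be BGE-M3 embedding search over an index and `rerank_score` a bge-reranker-v2-m3 forward pass; the control flow stays the same.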

Pros

  • Multilingual support across 100+ languages from XLM-RoBERTa backbone
  • Apache 2.0 license; text-embeddings-inference compatible
  • Natural pairing with BGE-M3 as a two-stage retrieval system
  • Cross-encoder accuracy improvement over bi-encoder similarity for re-ranking

Cons

  • Re-ranking latency scales with candidate set size — impractical for large first-stage pools
  • Cannot index documents — must process each query-candidate pair
  • Quality gaps for low-resource languages inherited from the XLM-RoBERTa backbone
  • Larger and slower than English-only cross-encoders when the pipeline is English-only
  • Accuracy improvement over simpler rerankers varies by domain and language
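The latency point above can be made concrete with back-of-envelope arithmetic: a cross-encoder runs one forward pass per query-candidate pair, so per-query cost grows linearly with the candidate pool. The per-batch timing below is an assumed illustrative figure, not a benchmark of this model:

```python
import math

def rerank_latency_ms(num_candidates: int, per_batch_ms: float, batch_size: int = 32) -> float:
    """Estimated re-rank latency: ceil(k / batch_size) sequential batches."""
    return math.ceil(num_candidates / batch_size) * per_batch_ms

# Assuming 50 ms per batch of 32 pairs (illustrative, not measured):
small = rerank_latency_ms(100, 50.0)     # 4 batches  -> 200.0 ms
large = rerank_latency_ms(10_000, 50.0)  # 313 batches -> 15650.0 ms
```

This is why rerankers are applied to the top-k of a first-stage retriever (k in the tens or low hundreds) rather than to the full corpus.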

FAQ

What is bge-reranker-v2-m3 used for?

It is used for re-ranking multilingual retrieval results in RAG pipelines for higher precision, cross-lingual passage ranking (query and passage in different languages), second-stage ranking in multilingual search systems, and relevance scoring for multilingual FAQ and document retrieval. It also pairs naturally with BGE-M3 dense retrieval as the reranking stage of a two-stage system.

Is bge-reranker-v2-m3 free to use?

Yes. bge-reranker-v2-m3 is an open-source model published on HuggingFace under the Apache 2.0 license, which permits commercial use. Check the model card to confirm the current license terms.

How do I run bge-reranker-v2-m3 locally?

The model can be loaded with the transformers library as a sequence-classification cross-encoder, via the FlagEmbedding library, or served with text-embeddings-inference. See the model card for framework-specific instructions and hardware requirements.
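A minimal local-inference sketch using the transformers sequence-classification API, following the common usage pattern for BGE rerankers (downloads the model on first run; the example pairs are illustrative):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "BAAI/bge-reranker-v2-m3"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# Each input is a (query, passage) pair encoded jointly by the cross-encoder.
pairs = [
    ["what is a panda?", "The giant panda is a bear species endemic to China."],
    ["what is a panda?", "Rust is a systems programming language."],
]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True,
                       return_tensors="pt", max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1).float()
# Higher score = more relevant; the first pair should outscore the second.
```

For throughput-sensitive serving, the same model can instead be deployed behind text-embeddings-inference, which batches pairs automatically.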

Tags

sentence-transformers, safetensors, xlm-roberta, text-classification, transformers, text-embeddings-inference, multilingual, arxiv:2312.15503, arxiv:2402.03216, license:apache-2.0, endpoints_compatible, deploy:azure, region:us