bge-large-en-v1.5 vs multilingual-e5-large

bge-large-en-v1.5 and multilingual-e5-large are both feature-extraction (text embedding) models. The sections below summarize each model, then compare them.

bge-large-en-v1.5

Pipeline
feature extraction
Downloads
14,929,062
Likes
657

BGE-Large-EN-v1.5 is BAAI's highest-capacity English embedding model in the v1.5 series, producing 1024-dimensional vectors. It achieves top MTEB retrieval scores among its generation of English-only embedding models, at the cost of higher compute and storage than BGE-small or BGE-base. MIT licensed with ONNX export support.
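To make the storage side of that tradeoff concrete, here is a rough sizing sketch for 1024-dimensional float32 vectors. The corpus size is an illustrative assumption, not a figure from the model card.

```python
# Raw index size for flat float32 embeddings (metadata and index
# overhead excluded). DIM matches bge-large-en-v1.5's output dimension.
DIM = 1024
BYTES_PER_FLOAT = 4  # float32

def index_size_bytes(n_vectors, dim=DIM):
    """Bytes needed to store n_vectors dense float32 embeddings."""
    return n_vectors * dim * BYTES_PER_FLOAT

# One million documents at 1024 dims:
print(index_size_bytes(1_000_000) / 1024**3)  # ≈ 3.81 GiB
```

The same corpus embedded with a 384-dimensional small model would take proportionally less space, which is the storage cost the description refers to.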

multilingual-e5-large

Pipeline
feature extraction
Downloads
7,225,099
Likes
1,186

Multilingual-E5-Large is a 560-million-parameter multilingual embedding model from Microsoft Research, supporting 100+ languages via an XLM-RoBERTa backbone. Trained with E5's instruction-following approach (prepending 'query:' or 'passage:' prefixes), it achieves strong MTEB multilingual retrieval scores. MIT licensed with ONNX and OpenVINO export.
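The prefix convention above is easy to get wrong in practice: E5 embeddings degrade if inputs are encoded without their role prefix. A minimal sketch of the convention follows; the helper names are illustrative, and the model-loading step (shown in comments) assumes the sentence-transformers library.

```python
import numpy as np

def with_e5_prefix(texts, role):
    """Prepend the 'query: ' or 'passage: ' prefix E5 was trained with."""
    assert role in ("query", "passage")
    return [f"{role}: {t}" for t in texts]

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# With sentence-transformers installed and the model downloaded, usage
# would look roughly like this (not executed here):
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("intfloat/multilingual-e5-large")
#   q = model.encode(with_e5_prefix(["how do embeddings work"], "query"))
#   p = model.encode(with_e5_prefix(["Embeddings map text to vectors."], "passage"))
#   score = cosine_sim(q[0], p[0])
```

Queries get the 'query: ' prefix and documents the 'passage: ' prefix on both the indexing and search paths.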

Key differences

  • Language coverage: bge-large-en-v1.5 is English-only, while multilingual-e5-large supports 100+ languages via its XLM-RoBERTa backbone.
  • Input format: multilingual-e5-large requires a 'query:' or 'passage:' prefix on every input; bge-large-en-v1.5 does not require role prefixes.
  • Export targets: the multilingual-e5-large entry lists both ONNX and OpenVINO export, while bge-large-en-v1.5 lists ONNX.

Common ground

  • Both are MIT-licensed, open-weight feature-extraction models hosted on Hugging Face, both support ONNX export, and both report strong MTEB retrieval scores for their respective language scopes.

Which should you pick?

The language requirement is usually decisive: if your corpus and queries are English-only, bge-large-en-v1.5's top English MTEB retrieval scores make it the natural default; if you need non-English or cross-lingual retrieval, multilingual-e5-large is the only option of the two. Either way, budget for large-model inference cost and 1024-dimensional vector storage, and remember that choosing E5 commits you to adding the 'query:'/'passage:' prefixes throughout your pipeline.