bge-large-en-v1.5 vs bge-base-en-v1.5

bge-large-en-v1.5 and bge-base-en-v1.5 are both English text-embedding (feature-extraction) models from BAAI's BGE v1.5 series. The entries below summarize each.

bge-large-en-v1.5

Pipeline
feature extraction
Downloads
14,929,062
Likes
657

BGE-Large-EN-v1.5 is BAAI's highest-capacity English embedding model in the v1.5 series, producing 1024-dimensional vectors. It achieves top MTEB retrieval scores among its generation of English-only embedding models, at the cost of higher compute and storage than BGE-small or BGE-base. MIT licensed with ONNX export support.

bge-base-en-v1.5

Pipeline
feature extraction
Downloads
8,365,829
Likes
414

BGE-Base-EN-v1.5 is BAAI's mid-tier English embedding model in the v1.5 series, producing 768-dimensional vectors. It balances accuracy and compute cost between the small (384d) and large (1024d) variants, making it a practical default for English retrieval tasks where storage and inference overhead of the large model are undesirable. MIT licensed with ONNX export.
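Whichever variant you choose, retrieval compares the resulting vectors the same way: BGE v1.5 embeddings are typically L2-normalized, so cosine similarity reduces to a plain dot product. A minimal pure-Python sketch (the 4-dimensional vectors are made-up stand-ins for real 768-d or 1024-d embeddings):

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def cosine_similarity(a, b):
    """Cosine similarity; equals the dot product for unit-length vectors."""
    return sum(x * y for x, y in zip(a, b))

# Toy stand-ins for a query embedding and a document embedding.
query = l2_normalize([0.2, 0.8, 0.1, 0.3])
doc = l2_normalize([0.1, 0.9, 0.0, 0.2])
score = cosine_similarity(query, doc)
```

In a real pipeline the vectors would come from the model's encoder rather than hand-written lists; ranking documents by this score is the core of embedding-based retrieval.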

Key differences

  • Embedding dimension: 1024 (large) vs 768 (base), so the large model needs roughly a third more vector storage per document.
  • Accuracy vs cost: the large model achieves the higher MTEB retrieval scores of the two, at the price of higher compute and storage; the base model trades some accuracy for a lighter footprint.
  • Adoption: ~14.9M downloads and 657 likes (large) vs ~8.4M downloads and 414 likes (base).

Common ground

  • Both are MIT-licensed English embedding models from BAAI's BGE v1.5 series.
  • Both serve the feature-extraction pipeline and support ONNX export.
  • Both are hosted on Hugging Face.

Which should you pick?

Pick bge-large-en-v1.5 when retrieval accuracy matters most and you can afford the extra compute and the larger 1024-dimensional vectors. Pick bge-base-en-v1.5 as the practical default when the storage and inference overhead of the large model are undesirable, accepting a modest accuracy trade-off.
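The storage side of that trade-off is easy to estimate: at float32 precision (4 bytes per dimension), raw vector storage for an index is roughly N × dims × 4 bytes. A back-of-envelope sketch (the one-million-document corpus size is an illustrative assumption, and real indexes add overhead, e.g. HNSW graphs):

```python
def index_size_gib(num_vectors, dims, bytes_per_value=4):
    """Raw float32 vector storage in GiB, excluding index overhead."""
    return num_vectors * dims * bytes_per_value / 1024**3

docs = 1_000_000  # hypothetical corpus size
large = index_size_gib(docs, 1024)  # bge-large-en-v1.5
base = index_size_gib(docs, 768)    # bge-base-en-v1.5
print(f"bge-large (1024d): {large:.2f} GiB")
print(f"bge-base   (768d): {base:.2f} GiB")
```

At this scale the large model's index is about a gigabyte bigger, which compounds across replicas and in-memory serving.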