🤖 feature-extraction

bge-reranker-large

BAAI/bge-reranker-large

775.8K Downloads · ❤️ 455 Likes · 🏷️ 21 Tags · 📦 transformers Library
Model Details
Full Model ID: BAAI/bge-reranker-large
Pipeline / Task: feature-extraction
Library: transformers
Downloads (all-time): 775.8K
Likes: 455
Last Modified: 5/11/2024
Author / Org: BAAI
Private: No (public)
⚡ Quick Usage (Python)

Uses the 🤗 Transformers library. Install it with: pip install transformers

from transformers import pipeline

# Load the model
pipe = pipeline("feature-extraction", model="BAAI/bge-reranker-large")

# Run inference
result = pipe("Your input here")
print(result)
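The feature-extraction pipeline returns token-level hidden states (a nested list shaped roughly [1, seq_len, hidden_dim]). To compare two inputs, a common post-processing step is to mean-pool the tokens into a single vector and take cosine similarity. A minimal NumPy sketch of that step, using small dummy arrays in place of real pipeline output (the model's actual hidden size is much larger):

```python
import numpy as np

def mean_pool(features):
    """Mean-pool token-level features ([1, seq_len, hidden]) to one vector."""
    arr = np.asarray(features)[0]   # drop batch dim -> [seq_len, hidden]
    return arr.mean(axis=0)         # -> [hidden]

def cosine(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Dummy stand-ins for two `pipe("...")` outputs.
feat_a = [[[1.0, 0.0, 1.0], [1.0, 2.0, 1.0]]]
feat_b = [[[1.0, 1.0, 1.0], [1.0, 1.0, 1.0]]]

vec_a = mean_pool(feat_a)   # -> [1.0, 1.0, 1.0]
vec_b = mean_pool(feat_b)   # -> [1.0, 1.0, 1.0]
print(round(cosine(vec_a, vec_b), 3))  # → 1.0
```

The same pooling applies unchanged to real pipeline output; only the array shapes differ.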
🏷️ Tags
transformers, pytorch, onnx, safetensors, xlm-roberta, text-classification, mteb, feature-extraction, en, zh, arxiv:2401.03462, arxiv:2312.15503, arxiv:2311.13534, arxiv:2310.07554, arxiv:2309.07597, license:mit, model-index, text-embeddings-inference, endpoints_compatible, deploy:azure, region:us
More feature-extraction Models
See all →
all-MiniLM-L6-v2

sentence-transformers/all-MiniLM-L6-v2

203.9M Downloads · ❤️ 4.7K Likes

paraphrase-multilingual-MiniLM-L12-v2

sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2

34.4M Downloads · ❤️ 1.2K Likes

all-mpnet-base-v2

sentence-transformers/all-mpnet-base-v2

33.4M Downloads · ❤️ 1.3K Likes
🚀 Use This Model

Access model files, inference API, and full documentation on Hugging Face.

Open on Hugging Face →
Browse Model Files ↗
🤖 Task: feature-extraction

This model is designed for the feature-extraction task. Explore more models for this use case.

All feature-extraction Models →
📊 Popularity
Downloads: 775.8K
❤️ Community Likes: 455
🛠️ Requirements
  • Install: pip install transformers
  • Python 3.8+ recommended for Transformers.
  • GPU (CUDA) speeds up inference significantly.
  • Use model.half() for fp16 on limited VRAM.
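On the last point: model.half() casts the model's weights to 16-bit floats, roughly halving memory use at a small precision cost. The memory effect is easy to see with plain NumPy (the matrix size here is illustrative, not the model's real dimensions):

```python
import numpy as np

# A float32 weight matrix and its float16 copy
# (casting per tensor is what .half() does to a model's weights).
w32 = np.ones((1024, 1024), dtype=np.float32)
w16 = w32.astype(np.float16)

print(w32.nbytes // 1024, "KiB")  # 4096 KiB
print(w16.nbytes // 1024, "KiB")  # 2048 KiB, half the footprint
```

Note that fp16 inference mainly pays off on GPU; on CPU, float32 is usually as fast or faster.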