🧠 Model

UAE-Large-V1

by WhereIsAI (hf-model--whereisai--uae-large-v1)

Nexus Index: 42.1 (Top 100%)

- Semantic (S): 50
- Authority (A): 0
- Popularity (P): 72
- Recency (R): 58
- Quality (Q): 50
Tech Context

Vital Performance

- Downloads (30d): 2.6M
- FNI Score: 42.1 (audited)
- Size: Tiny (params not listed)
- Context length: not listed
- License: MIT (commercial use permitted)

Model Information Summary

Entity Passport

- Registry ID: hf-model--whereisai--uae-large-v1
- License: MIT
- Provider: huggingface
📜 Cite this model

Academic & Research Attribution

BibTeX

```bibtex
@misc{hf_model__whereisai__uae_large_v1,
  author = {WhereIsAI},
  title = {UAE-Large-V1 Model},
  year = {2026},
  howpublished = {\url{https://huggingface.co/whereisai/uae-large-v1}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
```
APA Style

WhereIsAI. (2026). UAE-Large-V1 [Model]. Free2AITools. https://huggingface.co/whereisai/uae-large-v1

🔬 Technical Deep Dive


Quick Commands

🤗 HF Download

```bash
huggingface-cli download WhereIsAI/UAE-Large-V1
```

📦 Install Lib

```bash
pip install -U transformers
```

âš–ī¸ Nexus Index V2.0

42.1
TOP 100% SYSTEM IMPACT
Semantic (S) 50
Authority (A) 0
Popularity (P) 72
Recency (R) 58
Quality (Q) 50

đŸ’Ŧ Index Insight

FNI V2.0 for Uae Large V1: Semantic (S:50), Authority (A:0), Popularity (P:72), Recency (R:58), Quality (Q:50).

Free2AITools Nexus Index

Verification Authority

Unbiased Data Node Refresh: VFS Live
---


[Universal AnglE Embedding](https://github.com/SeanLee97/AnglE)

📢 WhereIsAI/UAE-Large-V1 is licensed under MIT, so feel free to use it in any scenario. If you use it in academic papers, please cite us (see the Citation section below).


You can use AnglE to train and run inference for powerful sentence embeddings.

🏆 Achievements

  • 📅 May 16, 2024 | AnglE's paper is accepted to the ACL 2024 Main Conference.
  • 📅 Dec 4, 2023 | 🔥 Our universal English sentence embedding WhereIsAI/UAE-Large-V1 achieves SOTA on the MTEB Leaderboard with an average score of 64.64!



Usage

1. angle_emb

```bash
python -m pip install -U angle-emb
```

1.1 Non-Retrieval Tasks

There is no need to specify any prompts.

```python
from angle_emb import AnglE
from angle_emb.utils import cosine_similarity

# Load the model with CLS pooling and move it to the GPU.
angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls').cuda()
doc_vecs = angle.encode([
    'The weather is great!',
    'The weather is very good!',
    'i am going to bed'
], normalize_embedding=True)

# Print pairwise cosine similarities.
for i, dv1 in enumerate(doc_vecs):
    for dv2 in doc_vecs[i + 1:]:
        print(cosine_similarity(dv1, dv2))
```
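Because the example above passes `normalize_embedding=True`, the vectors are unit length and cosine similarity reduces to a plain dot product. A toy illustration in plain Python (hypothetical 3-dimensional vectors, not real model outputs):

```python
import math

def cosine_similarity(a, b):
    # cos(a, b) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

a = normalize([1.0, 2.0, 2.0])
b = normalize([2.0, 1.0, 2.0])

# For unit vectors, the dot product equals the cosine similarity.
dot = sum(x * y for x, y in zip(a, b))
print(round(dot, 4))  # → 0.8889
```

This is why normalized embeddings are convenient for large-scale search: a vector index only needs fast dot products, not full cosine computations.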
1.2 Retrieval Tasks

For retrieval tasks, apply the prompt `Prompts.C` to the query (not to the documents).

```python
from angle_emb import AnglE, Prompts
from angle_emb.utils import cosine_similarity

angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls').cuda()
# Wrap only the query in the retrieval prompt; documents are encoded as-is.
qv = angle.encode(Prompts.C.format(text='what is the weather?'))
doc_vecs = angle.encode([
    'The weather is great!',
    'it is rainy today.',
    'i am going to bed'
])

for dv in doc_vecs:
    print(cosine_similarity(qv[0], dv))
```
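`Prompts.C` is just a format string applied to the query side. The sketch below shows the wrapping mechanics with a stand-in prompt string; the exact text of `Prompts.C` is an assumption here, so verify it against `angle_emb.Prompts` in your installed version:

```python
# Hypothetical stand-in for angle_emb's Prompts.C -- check the library for the real value.
PROMPT_C = 'Represent this sentence for searching relevant passages: {text}'

query = 'what is the weather?'
wrapped = PROMPT_C.format(text=query)
print(wrapped)

# Documents are encoded without any prompt.
docs = ['The weather is great!', 'it is rainy today.']
```

The asymmetry matters: only queries get the instruction, so query and document embeddings land in the layout the model was trained for.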

2. sentence-transformers

```python
from angle_emb import Prompts
from scipy import spatial
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("WhereIsAI/UAE-Large-V1").cuda()

# Wrap only the query in the retrieval prompt; documents are encoded as-is.
qv = model.encode(Prompts.C.format(text='what is the weather?'))
doc_vecs = model.encode([
    'The weather is great!',
    'it is rainy today.',
    'i am going to bed'
])

# Cosine similarity = 1 - cosine distance.
for dv in doc_vecs:
    print(1 - spatial.distance.cosine(qv, dv))
```
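In practice you usually want the documents ranked by score rather than printed in encode order. A small plain-Python sketch, using toy scores standing in for the cosine similarities computed above:

```python
# Toy similarity scores standing in for real cosine similarities.
docs = ['The weather is great!', 'it is rainy today.', 'i am going to bed']
scores = [0.81, 0.74, 0.22]  # hypothetical values

# Pair each document with its score, then sort descending by score.
ranked = sorted(zip(docs, scores), key=lambda pair: pair[1], reverse=True)
for doc, score in ranked:
    print(f'{score:.2f}  {doc}')
```

Taking the top-k entries of `ranked` gives a minimal retrieval result list.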

3. Infinity

Infinity is an MIT-licensed server for OpenAI-compatible deployment.

```bash
docker run --gpus all -v $PWD/data:/app/.cache -p "7997":"7997" \
  michaelf34/infinity:latest \
  v2 --model-id WhereIsAI/UAE-Large-V1 --revision "369c368f70f16a613f19f5598d4f12d9f44235d4" \
  --dtype float16 --batch-size 32 --device cuda --engine torch --port 7997
```
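Once the container is up, Infinity serves an OpenAI-compatible embeddings route. A minimal sketch of the request body, assuming the OpenAI-style `/embeddings` endpoint on port 7997 (field names follow the OpenAI embeddings API; verify the path against your Infinity version):

```python
import json

# OpenAI-style embeddings request body.
payload = {
    'model': 'WhereIsAI/UAE-Large-V1',
    'input': ['The weather is great!', 'it is rainy today.'],
}
body = json.dumps(payload)
print(body)

# Send with any HTTP client once the server is running, e.g.:
#   curl http://localhost:7997/embeddings \
#     -H 'Content-Type: application/json' -d "$BODY"
```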

Citation

If you use our pre-trained models, please support us by citing our work:

```bibtex
@article{li2023angle,
  title={AnglE-optimized Text Embeddings},
  author={Li, Xianming and Li, Jing},
  journal={arXiv preprint arXiv:2309.12871},
  year={2023}
}
```

âš ī¸ Incomplete Data

Some information about this model is not available. Use with Caution - Verify details from the original source before relying on this data.


📝 Limitations & Considerations

  • Benchmark scores may vary based on evaluation methodology and hardware configuration.
  • VRAM requirements are estimates; actual usage depends on quantization and batch size.
  • FNI scores are relative rankings and may change as new models are added.

Social Proof

HuggingFace Hub: 2.6M downloads
🔄 Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.

📊 FNI Methodology | 📚 Knowledge Base | ℹ️ Verify with original source

đŸ›Ąī¸ Model Transparency Report

Technical metadata sourced from upstream repositories.

Open Metadata

🆔 Identity & Source

- id: hf-model--whereisai--uae-large-v1
- slug: whereisai--uae-large-v1
- source: huggingface
- author: WhereIsAI
- license: MIT
- tags: sentence-transformers, onnx, safetensors, openvino, bert, feature-extraction, mteb, sentence_embedding, feature_extraction, transformers, transformers.js, en, arxiv:2309.12871, license:mit, model-index, text-embeddings-inference, endpoints_compatible, deploy:azure, region:us

âš™ī¸ Technical Specs

architecture
null
params billions
null
context length
null
pipeline tag
feature-extraction

📊 Engagement & Metrics

- downloads: 2,593,451
- stars: 0
- forks: 0

Data indexed from public sources. Updated daily.