🧠 Model

Otel Embedding 568m

by farbodtavakkoli (hf-model--farbodtavakkoli--otel-embedding-568m)

Nexus Index: 47.5 (Top 100%)
S / A / P / R / Q Breakdown: Calibration Pending

Pillar scores are computed during the next indexing cycle.

Tech Context: 0.568B Params
Vital Performance: 332.6K DL / 30D (0.0%)
Audited: 47.5 FNI Score
Tiny: 0.568B Params
Context: -
Hot: 332.6K Downloads
8G GPU: ~2GB Est. VRAM
Commercial: Apache License
Model Information Summary

Entity Passport
Registry ID: hf-model--farbodtavakkoli--otel-embedding-568m
License: Apache-2.0
Provider: huggingface
💾 Compute Threshold

~1.7GB VRAM


* Static estimation assuming 4-bit quantization.

📜 Cite this model

Academic & Research Attribution

BibTeX
@misc{hf_model__farbodtavakkoli__otel_embedding_568m,
  author = {farbodtavakkoli},
  title = {Otel Embedding 568m Model},
  year = {2026},
  howpublished = {\url{https://huggingface.co/farbodtavakkoli/otel-embedding-568m}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
APA Style
farbodtavakkoli. (2026). Otel Embedding 568m [Model]. Free2AITools. https://huggingface.co/farbodtavakkoli/otel-embedding-568m

🔬 Technical Deep Dive

Full Specifications

Quick Commands

🦙 Ollama Run
ollama run otel-embedding-568m
🤗 HF Download
huggingface-cli download farbodtavakkoli/otel-embedding-568m

âš–ī¸ Nexus Index V2.0

47.5
ESTIMATED IMPACT TIER

Semantic (S): 0
Authority (A): 0
Popularity (P): 0
Recency (R): 0
Quality (Q): 0

💬 Index Insight

FNI V2.0 for Otel Embedding 568m: Semantic (S:0), Authority (A:0), Popularity (P:0), Recency (R:0), Quality (Q:0).

Free2AITools Nexus Index

---


OTel-Embedding-568M

OTel-Embedding-568M is a telecom-specialized embedding model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.

Model Details

Base Model: BAAI/bge-m3
Parameters: 568M
Training Method: Full parameter fine-tuning
Language: English
License: Apache 2.0

Training Data

The model was trained on high-quality telecom-focused data curated by 200+ domain experts from organizations including AT&T, RelationalAI, AMD, GSMA, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.

Data Sources:

  • GSMA Permanent Reference Documents
  • 3GPP Specifications
  • O-RAN Documentation
  • RFC Series
  • eSIM, terminals, security, networks, roaming, APIs
  • Industry whitepapers and telecom academic papers

Intended Use

This model is optimized for:

  • RAG applications in telecommunications
  • Question answering on telecom specifications and standards
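The RAG use case above boils down to nearest-neighbour search over embedding vectors. Below is a minimal sketch of that retrieval step, assuming the vectors have already been produced by the model; the 3-dimensional vectors, passage descriptions, and the `cosine_top_k` helper are toy placeholders for illustration, not real model outputs or an official API.

```python
import numpy as np

def cosine_top_k(query_vec, passage_vecs, k=1):
    """Return indices of the k passages most cosine-similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_vecs / np.linalg.norm(passage_vecs, axis=1, keepdims=True)
    scores = p @ q                        # cosine similarity per passage
    return np.argsort(scores)[::-1][:k]  # highest-scoring indices first

# Toy index: three "passages" as short vectors. In practice these would be
# the model's embeddings of telecom documents (3GPP, O-RAN, eSIM docs, ...).
passages = np.array([
    [0.9, 0.1, 0.0],   # e.g. a paragraph on roaming
    [0.0, 1.0, 0.2],   # e.g. a fronthaul note
    [0.1, 0.0, 1.0],   # e.g. an eSIM provisioning doc
])
query = np.array([0.85, 0.15, 0.05])     # toy query embedding

print(cosine_top_k(query, passages, k=1))  # → [0], the roaming passage
```

Normalizing both sides first makes the dot product equal to cosine similarity, so the same index can also be served by a vector database that ranks by inner product.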

The OTel Family spans Language Models, Embedding Models, and Reranker Models.

Training Infrastructure

  • Framework: ScalarLM (GPU-agnostic)
  • Compute: TensorWave with AMD GPUs and Azure with NVIDIA GPUs.

Citation

BibTeX
@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}

Contact

If you have any technical questions, please reach out to [email protected] or [email protected].

âš ī¸ Incomplete Data

Some information about this model is not available. Use with Caution - Verify details from the original source before relying on this data.

View Original Source →

📝 Limitations & Considerations

  • Benchmark scores may vary based on evaluation methodology and hardware configuration.
  • VRAM requirements are estimates; actual usage depends on quantization and batch size.
  • FNI scores are relative rankings and may change as new models are added.
  • ⚠ Verify licensing terms from the original source before commercial use.

Social Proof

HuggingFace Hub
332.6K Downloads
🔄 Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.

📊 FNI Methodology · 📚 Knowledge Base · ℹ️ Verify with original source

đŸ›Ąī¸ Model Transparency Report

Verified data manifest for traceability and transparency.

100% Data Disclosure Active

🆔 Identity & Source

id: hf-model--farbodtavakkoli--otel-embedding-568m
slug: farbodtavakkoli--otel-embedding-568m
source: huggingface
author: farbodtavakkoli
license: Apache-2.0
tags: safetensors, xlm-roberta, telecom, telecommunications, gsma, fine-tuned, feature-extraction, en, base_model:baai/bge-m3, base_model:finetune:baai/bge-m3, license:apache-2.0, region:us

âš™ī¸ Technical Specs

architecture
null
params billions
0.568
context length
null
pipeline tag
feature-extraction
vram gb
1.7
vram is estimated
true
vram formula
VRAM ≈ (params * 0.75) + 0.8GB (KV) + 0.5GB (OS)
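The formula string above can be checked with a small calculator. This is a sketch of the page's static heuristic only; the 0.75 GB-per-billion-parameters factor and the fixed KV/OS allowances come from that formula field, not from any official sizing guide, and real usage varies with quantization and batch size.

```python
def estimate_vram_gb(params_billions: float) -> float:
    """Static VRAM estimate in GB, per the page's disclosed formula."""
    weights_gb = params_billions * 0.75   # ~0.75 GB per billion parameters
    kv_cache_gb = 0.8                     # fixed KV-cache allowance
    os_overhead_gb = 0.5                  # fixed runtime/OS overhead
    return weights_gb + kv_cache_gb + os_overhead_gb

print(round(estimate_vram_gb(0.568), 1))  # → 1.7, matching the value above
```

For this 0.568B-parameter model the result is 0.426 + 0.8 + 0.5 ≈ 1.7 GB, consistent with the `vram gb` field.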

📊 Engagement & Metrics

downloads: 332,589
stars: 0
forks: 0

Free2AITools Constitutional Data Pipeline: Curated disclosure mode active. (V15.x Standard)