Otel Embedding 568m model by farbodtavakkoli
Technical Deep Dive
Daily sync (03:00 UTC)
AI Summary: Based on Hugging Face metadata. Not a recommendation.
Model Transparency Report
Verified data manifest for traceability and transparency.
100% Data Disclosure Active
Identity & Source
- id: hf-model--farbodtavakkoli--otel-embedding-568m
- slug: farbodtavakkoli--otel-embedding-568m
- source: huggingface
- author: farbodtavakkoli
- license: Apache-2.0
- tags: safetensors, xlm-roberta, telecom, telecommunications, gsma, fine-tuned, feature-extraction, en, base_model:baai/bge-m3, base_model:finetune:baai/bge-m3, license:apache-2.0, region:us
Technical Specs
- architecture: null
- params (billions): 0.568
- context length: null
- pipeline tag: feature-extraction
- vram (GB): 1.7
- vram is estimated: true
- vram formula: VRAM ≈ (params * 0.75) + 0.8 GB (KV) + 0.5 GB (OS)
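The 1.7 GB estimate above follows directly from the listed formula. A minimal sketch reproducing it (the function name is hypothetical; the 0.75 GB-per-billion-parameters weight factor and the fixed 0.8 GB KV / 0.5 GB OS reserves are taken verbatim from the manifest, not from any measurement):

```python
def estimate_vram_gb(params_billions: float) -> float:
    """Reproduce the manifest's VRAM heuristic:
    VRAM ~ (params * 0.75) + 0.8 GB (KV cache) + 0.5 GB (OS overhead)."""
    weights_gb = params_billions * 0.75  # rough weight-memory factor
    kv_cache_gb = 0.8                    # fixed KV-cache reserve
    os_overhead_gb = 0.5                 # fixed OS/runtime reserve
    return round(weights_gb + kv_cache_gb + os_overhead_gb, 1)

# For this 0.568B-parameter model the heuristic gives 1.7 GB,
# matching the "vram (GB)" field above.
print(estimate_vram_gb(0.568))
```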
Engagement & Metrics
- downloads: 332,589
- stars: 0
- forks: 0
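The pipeline tag in the specs is feature-extraction: the model emits per-token hidden states, and a sentence embedding is commonly produced by attention-mask-aware mean pooling over them. A minimal NumPy sketch of that pooling step (the array shapes and function name are illustrative assumptions; the actual model call is omitted):

```python
import numpy as np

def masked_mean_pool(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    hidden_states:  (seq_len, dim) per-token vectors from the encoder.
    attention_mask: (seq_len,) with 1 for real tokens, 0 for padding.
    """
    mask = attention_mask.astype(float)[:, None]        # (seq_len, 1)
    summed = (hidden_states * mask).sum(axis=0)         # sum over real tokens
    count = np.clip(mask.sum(), 1e-9, None)             # avoid division by zero
    return summed / count

# Toy example: two real tokens, one padding token.
tokens = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
mask = np.array([1, 1, 0])
print(masked_mean_pool(tokens, mask))  # averages only the first two rows
```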
Free2AITools Constitutional Data Pipeline: Curated disclosure mode active. (V15.x Standard)