DistilBERT SQuAD 256seq 8batch Test model by manishiitg
Technical Deep Dive
Daily sync (03:00 UTC)
AI Summary: Based on Hugging Face metadata. Not a recommendation.
Model Transparency Report
Technical metadata sourced from upstream repositories.
Identity & Source
- id: hf-model--manishiitg--distilbert-squad-256seq-8batch-test
- slug: manishiitg--distilbert-squad-256seq-8batch-test
- source: huggingface
- author: manishiitg
- license:
- tags: transformers, pytorch, distilbert, question-answering, endpoints_compatible, region:us
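The id and slug fields above encode the Hub repository path with "--" standing in for "/" between owner and model name. A minimal sketch of the reverse mapping (the helper name `slug_to_repo_id` is ours, not part of any API):

```python
def slug_to_repo_id(slug: str) -> str:
    """Convert a directory slug like 'owner--model-name' back to a
    Hub-style repo id 'owner/model-name'.

    Splits on the first '--' only, since model names themselves may
    contain single hyphens.
    """
    owner, _, name = slug.partition("--")
    return f"{owner}/{name}"


print(slug_to_repo_id("manishiitg--distilbert-squad-256seq-8batch-test"))
# manishiitg/distilbert-squad-256seq-8batch-test
```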
Technical Specs
- architecture: null
- params (billions): 8
- context length: 4,096
- pipeline tag: question-answering
- vram (GB): 7.3
- vram is estimated: true
- vram formula: VRAM ≈ (params_billions × 0.75 GB) + 0.8 GB (KV cache) + 0.5 GB (OS overhead)
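The listed 7.3 GB can be reproduced directly from the formula above. A small sketch, assuming parameters are counted in billions and weights cost roughly 0.75 GB per billion parameters (the function name is ours):

```python
def estimate_vram_gb(params_billions: float,
                     kv_cache_gb: float = 0.8,
                     os_overhead_gb: float = 0.5) -> float:
    """Rule-of-thumb VRAM estimate: weights + KV cache + OS overhead."""
    weights_gb = params_billions * 0.75  # ~0.75 GB per billion params
    return weights_gb + kv_cache_gb + os_overhead_gb


print(round(estimate_vram_gb(8), 2))  # 7.3, matching the listed estimate
```

Note this is a coarse heuristic; actual memory use depends on weight precision, batch size, and sequence length.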
Engagement & Metrics
- downloads: 5
- stars: 0
- forks: 0
Data indexed from public sources. Updated daily.