RoBERTa S2ORC BPE 32k model by nfliu
Technical Deep Dive
Daily sync (03:00 UTC).
Model Transparency Report
Technical metadata sourced from upstream repositories.
Identity & Source
- id: hf-model--nfliu--roberta_s2orc_bpe_32k
- slug: nfliu--roberta_s2orc_bpe_32k
- source: huggingface
- author: nfliu
- license: (not listed)
- tags: transformers, pytorch, camembert, fill-mask, endpoints_compatible, region:us
Technical Specs
- architecture: null
- params (billions): null
- context length: 32,768
- pipeline tag: fill-mask
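Since the card lists `fill-mask` as the pipeline tag and `transformers`/`pytorch` among the tags, the checkpoint can presumably be queried with the Hugging Face `fill-mask` pipeline. A minimal sketch, assuming `transformers` and PyTorch are installed and the repo id from the identity metadata above resolves on the Hub; the example prompt is illustrative only:

```python
# Minimal sketch: query the checkpoint with the fill-mask pipeline.
# Assumes the Hugging Face `transformers` library (with PyTorch) is installed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="nfliu/roberta_s2orc_bpe_32k")

# Use the tokenizer's own mask token instead of hard-coding one, since
# camembert-style tokenizers define their mask string themselves.
mask = fill_mask.tokenizer.mask_token
prompt = f"The mitochondrion is the powerhouse of the {mask}."

# Each prediction is a dict with the filled-in token and its probability.
for candidate in fill_mask(prompt):
    print(candidate["token_str"], round(candidate["score"], 3))
```

Top predictions are returned sorted by score, so the first element is the model's most likely completion for the masked position.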
Engagement & Metrics
- downloads: 15
- stars: 0
- forks: 0
Data indexed from public sources. Updated daily.