T5 Base Japanese model by sonoisa
Technical Deep Dive
Daily sync (03:00 UTC)
Model Transparency Report
Technical metadata sourced from upstream repositories.
Identity & Source
- id: hf-dataset--sonoisa--t5-base-japanese
- slug: sonoisa--t5-base-japanese
- source: huggingface
- author: sonoisa
- license: CC-BY-SA-4.0
- tags: transformers, pytorch, jax, safetensors, t5, feature-extraction, text2text-generation, seq2seq, ja, dataset:wikipedia, dataset:oscar, dataset:cc100, license:cc-by-sa-4.0, endpoints_compatible, deploy:azure, region:us
Technical Specs
- architecture: null
- params (billions): 5
- context length: 4,096
- pipeline tag: feature-extraction
- vram (GB): 5
- vram is estimated: true
- vram formula: VRAM ≈ (params × 0.75) + 0.8 GB (KV cache) + 0.5 GB (OS)
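The listed VRAM figure follows from the formula above. A minimal sketch of that arithmetic, assuming `params` is the parameter count in billions and the coefficients are exactly those shown on this page (the function name `estimate_vram_gb` is illustrative, not from the source):

```python
def estimate_vram_gb(params_billions: float) -> float:
    """Apply the page's estimate:
    VRAM ~= (params * 0.75) + 0.8 GB (KV cache) + 0.5 GB (OS overhead).
    All terms are in gigabytes; params_billions is the parameter count in billions.
    """
    return params_billions * 0.75 + 0.8 + 0.5

# With the listed 5 (billion) params: 5 * 0.75 + 0.8 + 0.5 = 5.05,
# which rounds to the listed estimate of 5 GB.
print(round(estimate_vram_gb(5)))  # → 5
```

Note this is a rough sizing heuristic; actual memory use depends on precision (fp32 vs. fp16), batch size, and sequence length.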
Engagement & Metrics
- downloads: 4,814
- stars: 0
- forks: 0
Data indexed from public sources. Updated daily.