🧠 Model
chatglm2-6b
by zai-org
---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---
💻 Github Repo • 🐦 Twitter • 📃 [GLM@ACL 22] [GitHub] • 📃 [GLM-130B@ICLR 23]
📅 Updated 12/19/2025
📋 Limitations & Considerations
- Benchmark scores may vary based on evaluation methodology and hardware configuration.
- VRAM requirements are estimates; actual usage depends on quantization and batch size.
- FNI scores are relative rankings and may change as new models are added.
- ⚠️ License Unknown: verify licensing terms before commercial use.
- Data source: Hugging Face (https://huggingface.co/zai-org/chatglm2-6b), fetched 2025-12-19, adapter version 3.2.0.
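To make the VRAM caveat above concrete, here is a minimal back-of-the-envelope sketch of how such estimates are typically derived: weight bytes (parameter count × bits per parameter) scaled by a rough overhead factor for activations and the KV cache. The 6.2B parameter count, the 1.2× overhead factor, and the function name are assumptions for illustration, not figures from the model card.

```python
def estimate_vram_gb(num_params: float, bits_per_param: int, overhead: float = 1.2) -> float:
    """Rough inference VRAM estimate in GB: weights only, scaled by an
    assumed overhead factor for activations/KV cache (a heuristic, not exact)."""
    weight_bytes = num_params * bits_per_param / 8
    return weight_bytes * overhead / 1e9

# ChatGLM2-6B has roughly 6.2B parameters (assumption based on the model name).
for label, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{label}: ~{estimate_vram_gb(6.2e9, bits):.1f} GB")
```

Actual usage will differ with batch size, sequence length, and framework overhead, which is why published requirements should be treated as estimates.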
📚 Related Resources
📄 Related Papers
No related papers linked yet. Check the model's official documentation for research papers.
📊 Training Datasets
Training data information not available. Refer to the original model card for details.