🧠 Model

Ministral-8B-Instruct-2410

by mistralai


πŸ• Updated 12/19/2025


About

- Library: vllm
- Languages: en, fr, de, es, it, pt, zh, ja, ru, ko
- License: other (MRL), link: https://mistral.ai/licenses/MRL-0.1.md
- Inference: false
- Gated-access prompt (excerpt): "# Mistral AI Research License. If You want to use a Mistral Model, a Derivative or an Output for any purpose that is not expressly authorized under this Agreement, You must request a license from Mistral AI, which Mistral AI may grant to You in Mistral AI's sole discretion. To discuss suc..."
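The metadata above declares vllm as the serving library. As a minimal sketch of what that implies (the Mistral-specific flags are assumptions based on vLLM's Mistral-format support and may vary by vLLM version; this is not an official quickstart), the model could be served locally like so:

```shell
# Sketch: serve Ministral-8B-Instruct-2410 with vLLM's OpenAI-compatible server.
# Requires a GPU with enough VRAM for the 8B weights.
pip install vllm

vllm serve mistralai/Ministral-8B-Instruct-2410 \
  --tokenizer-mode mistral \
  --config-format mistral \
  --load-format mistral
```

Once running, the server exposes an OpenAI-compatible endpoint (by default on port 8000) that standard client libraries can target.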

πŸ“ Limitations & Considerations

  • Benchmark scores may vary based on evaluation methodology and hardware configuration.
  • VRAM requirements are estimates; actual usage depends on quantization and batch size.
  • FNI scores are relative rankings and may change as new models are added.
  • Data source: Hugging Face (https://huggingface.co/mistralai/Ministral-8B-Instruct-2410), fetched 2025-12-19, adapter version 3.2.0.

📚 Related Resources

📄 Related Papers

No related papers linked yet. Check the model's official documentation for research papers.

📊 Training Datasets

Training data information not available. Refer to the original model card for details.

🔗 Related Models

No related models listed yet.
