📊 Dataset

Ultra Fineweb
by openbmb (hf-dataset--openbmb--ultra-fineweb)

Nexus Index: 35.8 (Top 100%)
Semantic (S): 50
Authority (A): 0
Popularity (P): 57
Recency (R): 50
Quality (Q): 30
Tech Context
Vital Performance: 0 downloads / last 30 days (0.0%)
Data Integrity: 35.8 FNI Score
Size: –
Rows: –
Format: Parquet
Tokens: –
Dataset Information Summary
Entity Passport
Registry ID hf-dataset--openbmb--ultra-fineweb
License Apache-2.0
Provider huggingface
📜

Cite this dataset

Academic & Research Attribution

BibTeX
@misc{hf_dataset__openbmb__ultra_fineweb,
  author = {openbmb},
  title = {Ultra Fineweb Dataset},
  year = {2026},
  howpublished = {\url{https://huggingface.co/datasets/openbmb/ultra-fineweb}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
APA Style
openbmb. (2026). Ultra Fineweb [Dataset]. Free2AITools. https://huggingface.co/datasets/openbmb/ultra-fineweb
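For programmatic attribution, the APA line above can be assembled from the passport metadata fields. A minimal sketch (the function name and field choices are illustrative, not part of any Free2AITools API):

```python
def apa_dataset_citation(author: str, year: int, title: str,
                         publisher: str, url: str) -> str:
    """Assemble an APA-style dataset reference from metadata fields."""
    return f"{author}. ({year}). {title} [Dataset]. {publisher}. {url}"

citation = apa_dataset_citation(
    "openbmb", 2026, "Ultra Fineweb", "Free2AITools",
    "https://huggingface.co/datasets/openbmb/ultra-fineweb",
)
# Reproduces the APA line shown above.
```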

🔬 Technical Deep Dive


âš–ī¸ Nexus Index V2.0


💬 Index Insight

FNI V2.0 for Ultra Fineweb: Semantic (S:50), Authority (A:0), Popularity (P:57), Recency (R:50), Quality (Q:30).
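The published composite (35.8) suggests the five components are combined as a weighted average, though the weights are not disclosed. A hedged sketch, assuming a simple weighted mean; note that with equal weights the result is 37.4, not 35.8, so the production formula evidently weights the components differently:

```python
def fni_composite(scores, weights=None):
    """Weighted mean of FNI components, equal weights by default.
    The real FNI V2.0 weighting is not published; this is a guess
    at the general shape of the formula, not its actual parameters."""
    if weights is None:
        weights = {k: 1.0 for k in scores}
    total = sum(weights.values())
    return round(sum(scores[k] * weights[k] for k in scores) / total, 1)

ultra_fineweb = {"S": 50, "A": 0, "P": 57, "R": 50, "Q": 30}
# Equal weights give 37.4, not the published 35.8.
```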

Free2AITools Nexus Index

Verification Authority

Unbiased Data · Node Refresh: VFS Live
⬇️ Downloads: 60,677

đŸ‘ī¸ Data Preview


Row-level preview not available for this dataset.

Schema structure is shown in the Field Logic panel when available.

🔗 Explore Full Dataset ↗

🧬 Field Logic


Schema not yet indexed for this dataset.

Dataset Specification

Social Proof

HuggingFace Hub
60.7K Downloads
🔄 Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.

📊 FNI Methodology · 📚 Knowledge Base · ℹ️ Verify with original source

đŸ›Ąī¸ Dataset Transparency Report

Technical metadata sourced from upstream repositories.

Open Metadata

🆔 Identity & Source

id: hf-dataset--openbmb--ultra-fineweb
slug: openbmb--ultra-fineweb
source: huggingface
author: openbmb
license: Apache-2.0
tags: task_categories:text-generation, language:en, language:zh, license:apache-2.0, size_categories:1b<n<10b, modality:text, arxiv:2505.05427, arxiv:2506.07900, arxiv:2412.04315, region:us, llm, pretraining, web-corpus, data-filtering, high-quality
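The tags field is a flat, comma-separated string that mixes namespaced pairs (e.g. language:en) with plain labels (e.g. llm). A small sketch for splitting it into structured form (the helper name is illustrative, not a Hugging Face API):

```python
def parse_hub_tags(raw):
    """Split a comma-separated tag string into namespaced
    key:value tags and plain labels."""
    namespaced, plain = {}, []
    for tag in (t.strip() for t in raw.split(",")):
        if not tag:
            continue
        if ":" in tag:
            key, _, value = tag.partition(":")
            namespaced.setdefault(key, []).append(value)
        else:
            plain.append(tag)
    return namespaced, plain

ns, labels = parse_hub_tags("language:en, language:zh, llm, pretraining")
# ns == {"language": ["en", "zh"]}; labels == ["llm", "pretraining"]
```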

âš™ī¸ Technical Specs

architecture
null
params billions
null
context length
null
pipeline tag

📊 Engagement & Metrics

downloads: 60,677
stars: 333
forks: 0

Data indexed from public sources. Updated daily.