🧠 Model

ChemBERTa-100M-MLM

by DeepChem
Nexus Index (FNI): 40.4 · Top 100% · Audited

Vital Performance
Downloads (30 days): 18.6K
Parameters: not listed (size class: Tiny)
Context length: not listed
License: MIT (commercial use permitted)
Model Information Summary
Entity Passport
Registry ID hf-model--deepchem--chemberta-100m-mlm
License MIT
Provider huggingface
📜 Cite this model

Academic & Research Attribution

BibTeX
```bibtex
@misc{hf_model__deepchem__chemberta_100m_mlm,
  author = {DeepChem},
  title = {ChemBERTa-100M-MLM Model},
  year = {2026},
  howpublished = {\url{https://huggingface.co/deepchem/chemberta-100m-mlm}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
```
APA Style
DeepChem. (2026). ChemBERTa-100M-MLM [Model]. Free2AITools. https://huggingface.co/deepchem/chemberta-100m-mlm

🔬 Technical Deep Dive


Quick Commands

🤗 HF Download
huggingface-cli download deepchem/chemberta-100m-mlm
📦 Install Lib
pip install -U transformers
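
Where a scripted download is preferred over the CLI, the same repository can be fetched with the huggingface_hub Python client. This is a minimal sketch, assuming the lowercase repo ID from the command above resolves the same way as the canonical one:

```python
from huggingface_hub import snapshot_download

# Download all files in the model repository into the local Hugging Face
# cache and return the path of the downloaded snapshot.
local_path = snapshot_download(repo_id="deepchem/chemberta-100m-mlm")
print(local_path)
```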

âš–ī¸ Nexus Index V2.0

40.4
TOP 100% SYSTEM IMPACT
Semantic (S) 50
Authority (A) 0
Popularity (P) 48
Recency (R) 60
Quality (Q) 65

💬 Index Insight

FNI V2.0 for ChemBERTa-100M-MLM: Semantic (S:50), Authority (A:0), Popularity (P:48), Recency (R:60), Quality (Q:65).

Free2AITools Nexus Index

Verification Authority

---


ChemBERTa-100M-MLM

ChemBERTa model pretrained on a subset of 100M molecules from the ZINC20 dataset using masked language modeling (MLM).

Usage

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("DeepChem/ChemBERTa-100M-MLM")
model = AutoModelForMaskedLM.from_pretrained("DeepChem/ChemBERTa-100M-MLM")
```
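
The checkpoint can then be exercised through the transformers fill-mask pipeline, which matches the fill-mask pipeline tag listed in the metadata below. This is a minimal sketch, not from the upstream card; the masked benzene SMILES string is an illustrative input only:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("DeepChem/ChemBERTa-100M-MLM")
model = AutoModelForMaskedLM.from_pretrained("DeepChem/ChemBERTa-100M-MLM")

# Wrap the masked-LM checkpoint in a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Mask one position in a benzene SMILES string (illustrative example input).
masked_smiles = f"C1=CC=CC{tokenizer.mask_token}C1"

# Print the top predicted tokens and their scores for the masked position.
for prediction in fill_mask(masked_smiles):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Each returned prediction also includes the fully reconstructed string under the "sequence" key, which can be inspected if chemical validity of the completion matters.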

âš ī¸ Incomplete Data

Some information about this model is not available. Use with caution and verify details from the original source before relying on this data.

View Original Source →

📝 Limitations & Considerations

  • Benchmark scores may vary based on evaluation methodology and hardware configuration.
  • VRAM requirements are estimates; actual usage depends on quantization and batch size.
  • FNI scores are relative rankings and may change as new models are added.
  • ⚠ License: listed as MIT; verify licensing terms with the original source before commercial use.

Social Proof

HuggingFace Hub
18.6K Downloads
🔄 Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.

📊 FNI Methodology · 📚 Knowledge Base · ℹ️ Verify with original source

đŸ›Ąī¸ Model Transparency Report

Technical metadata sourced from upstream repositories.

Open Metadata

🆔 Identity & Source

id: hf-model--deepchem--chemberta-100m-mlm
slug: deepchem--chemberta-100m-mlm
source: huggingface
author: DeepChem
license: MIT
tags: transformers, safetensors, roberta, fill-mask, cheminformatics, chemberta, masked-lm, license:mit, endpoints_compatible, region:us

âš™ī¸ Technical Specs

architecture
null
params billions
null
context length
null
pipeline tag
fill-mask

📊 Engagement & Metrics

downloads: 18,628
stars: 0
forks: 0

Data indexed from public sources. Updated daily.