πŸ“Š
Dataset

Bert Base En Fr Cased

by Geotrend (hf-model--geotrend--bert-base-en-fr-cased)
Nexus Index
26.4 Top 100%
S: Semantic 50
A: Authority 0
P: Popularity 0
R: Recency 100
Q: Quality 38
Tech Context
Vital Performance
Downloads (30 days): 0 (0.0%)
Data Integrity: 26.4 FNI Score
Size: —
Rows: —
Format: Parquet
Tokens: —
Dataset Information Summary
Entity Passport
Registry ID: hf-model--geotrend--bert-base-en-fr-cased
Provider: huggingface
πŸ“œ Cite this dataset

Academic & Research Attribution

BibTeX
@misc{hf_model__geotrend__bert_base_en_fr_cased,
  author = {Geotrend},
  title = {Bert Base En Fr Cased Dataset},
  year = {2026},
  howpublished = {\url{https://free2aitools.com/dataset/hf-model--geotrend--bert-base-en-fr-cased}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
APA Style
Geotrend. (2026). Bert Base En Fr Cased [Dataset]. Free2AITools. https://free2aitools.com/dataset/hf-model--geotrend--bert-base-en-fr-cased

πŸ”¬ Technical Deep Dive

Full Specifications

βš–οΈ Nexus Index V2.0

26.4
TOP 100% SYSTEM IMPACT
Semantic (S) 50
Authority (A) 0
Popularity (P) 0
Recency (R) 100
Quality (Q) 38

πŸ’¬ Index Insight

FNI V2.0 for Bert Base En Fr Cased: Semantic (S:50), Authority (A:0), Popularity (P:0), Recency (R:100), Quality (Q:38).
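The insight line above restates the five FNI components (S, A, P, R, Q), but the weighting that turns them into the 26.4 composite is not disclosed on this page. The sketch below shows how such a weighted composite could be computed; `nexus_composite` and the weight values are purely illustrative assumptions, not the actual FNI V2.0 formula.

```python
def nexus_composite(scores, weights):
    """Weighted average of component scores on a 0-100 scale."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Component scores taken from the breakdown above.
scores = {"S": 50, "A": 0, "P": 0, "R": 100, "Q": 38}

# Hypothetical weights -- the real FNI V2.0 weighting is not published here.
weights = {"S": 0.3, "A": 0.2, "P": 0.2, "R": 0.1, "Q": 0.2}

# β†’ 32.6 with these illustrative weights; the published 26.4 implies
# a different (undisclosed) weighting.
print(round(nexus_composite(scores, weights), 1))
```

With equal weights the same scores would average 37.6, so the published 26.4 necessarily reflects a non-uniform weighting.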

Free2AITools Nexus Index

Verification Authority

Unbiased Data Node Refresh: VFS Live

πŸ‘οΈ Data Preview


Row-level preview not available for this dataset.

Schema structure is shown in the Field Logic panel when available.

🧬 Field Logic


Schema not yet indexed for this dataset.

Dataset Specification

bert-base-en-fr-cased

We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.

Unlike distilbert-base-multilingual-cased, our versions produce exactly the same representations as the original model, which preserves the original accuracy.

For more information, please refer to our paper: Load What You Need: Smaller Versions of Multilingual BERT.

How to use

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-cased")
```
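The snippet above loads the tokenizer and model; `AutoModel` returns per-token hidden states, so a sentence-level vector typically requires pooling over tokens while ignoring padding. Below is a minimal, self-contained sketch of masked mean pooling using NumPy, with a dummy array standing in for a real `last_hidden_state` (the shapes and values here are illustrative, not model output).

```python
import numpy as np

def masked_mean_pool(token_embeddings, attention_mask):
    """Average token embeddings, skipping padding positions.

    token_embeddings: (seq_len, hidden) array, e.g. a model's last_hidden_state
    attention_mask: (seq_len,) array of 1s (real tokens) and 0s (padding)
    """
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1) for broadcasting
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens only
    count = mask.sum()                              # number of real tokens
    return summed / count

# Dummy stand-in for last_hidden_state: 4 tokens, hidden size 3,
# where the last position is padding and must be ignored.
emb = np.array([[1.0, 2.0, 3.0],
                [3.0, 2.0, 1.0],
                [2.0, 2.0, 2.0],
                [9.0, 9.0, 9.0]])  # padding row
mask = np.array([1, 1, 1, 0])

print(masked_mean_pool(emb, mask))  # β†’ [2. 2. 2.]
```

With real model output, `emb` would be `model(**tokenizer(text, return_tensors="pt")).last_hidden_state[0].detach().numpy()` and `mask` the corresponding attention mask.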

To generate other smaller versions of multilingual transformers please visit our Github repo.

How to cite

```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
  author={Abdaoui, Amine and Pradel, Camille and Sigel, GrΓ©goire},
  booktitle={SustaiNLP / EMNLP},
  year={2020}
}
```

Contact

Please contact [email protected] for any questions, feedback, or requests.

πŸ”„ Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.

πŸ“Š FNI Methodology Β· πŸ“š Knowledge Base Β· ℹ️ Verify with original source

πŸ›‘οΈ Dataset Transparency Report

Technical metadata sourced from upstream repositories.

Open Metadata

πŸ†” Identity & Source

id: hf-model--geotrend--bert-base-en-fr-cased
slug: geotrend--bert-base-en-fr-cased
source: huggingface
author: Geotrend
license: —
tags: —

βš™οΈ Technical Specs

architecture: null
params billions: null
context length: null
pipeline tag: —

πŸ“Š Engagement & Metrics

downloads: 0
stars: 0
forks: 0

Data indexed from public sources. Updated daily.