
cardiobertpt
by Pucpr Br (hf-model--pucpr-br--cardiobertpt)

Nexus Index: 25.1 (Top 100%)
  • Semantic (S): 50
  • Authority (A): 0
  • Popularity (P): 2
  • Recency (R): 15
  • Quality (Q): 50

Vital Performance: 16 downloads / 30 days (0.0% trend) · Audited FNI Score: 25.1
Size class: Tiny · Params: not listed · Context: not listed
License: Apache-2.0 (commercial use permitted)

Entity Passport
  • Registry ID: hf-model--pucpr-br--cardiobertpt
  • License: Apache-2.0
  • Provider: huggingface
πŸ“œ

Cite this model

Academic & Research Attribution

BibTeX
@misc{hf_model__pucpr_br__cardiobertpt,
  author = {Pucpr Br},
  title = {cardiobertpt Model},
  year = {2026},
  howpublished = {\url{https://huggingface.co/pucpr-br/cardiobertpt}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
APA Style
Pucpr Br. (2026). cardiobertpt [Model]. Free2AITools. https://huggingface.co/pucpr-br/cardiobertpt

πŸ”¬Technical Deep Dive

Quick Commands

πŸ€— HF Download
huggingface-cli download pucpr-br/cardiobertpt
πŸ“¦ Install Lib
pip install -U transformers

βš–οΈ Nexus Index V2.0

25.1
TOP 100% SYSTEM IMPACT
Semantic (S) 50
Authority (A) 0
Popularity (P) 2
Recency (R) 15
Quality (Q) 50

πŸ’¬ Index Insight

FNI V2.0 for cardiobertpt: Semantic (S:50), Authority (A:0), Popularity (P:2), Recency (R:15), Quality (Q:50).

Free2AITools Nexus Index

Verification Authority

Unbiased Data Node Refresh: VFS Live
---

CardioBERTpt - Portuguese Transformer-based Models for Clinical Language Representation in Cardiology

This model card describes CardioBERTpt, a clinical model trained on the cardiology domain for NER tasks in Portuguese. This model is a fine-tuned version of bert-base-multilingual-cased on a cardiology text dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4495
  • Accuracy: 0.8864

How to use the model

Load the model via the transformers library:

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("pucpr-br/cardiobertpt")
model = AutoModel.from_pretrained("pucpr-br/cardiobertpt")
```
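Since the model's pipeline tag is fill-mask, a quick way to sanity-check the checkpoint is the `transformers` fill-mask pipeline. This is a minimal sketch; the example sentence is illustrative and not from the model card, and the first call downloads the weights from the Hugging Face Hub:

```python
from transformers import pipeline

# Fill-mask pipeline; fetches pucpr-br/cardiobertpt from the Hub on first use.
fill = pipeline("fill-mask", model="pucpr-br/cardiobertpt")

# Illustrative Portuguese clinical sentence:
# "Patient with decompensated [MASK] failure."
preds = fill("Paciente com insuficiência [MASK] descompensada.")
for p in preds[:3]:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction dict includes the proposed token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).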

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15.0
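The `linear` scheduler listed above decays the learning rate from the 1e-05 peak toward zero over training. A minimal pure-Python sketch of that schedule (step counts are illustrative, and the optional warmup phase is an assumption, mirroring `get_linear_schedule_with_warmup` in transformers):

```python
PEAK_LR = 1e-05  # learning_rate from the training hyperparameters above


def linear_lr(step: int, total_steps: int, warmup_steps: int = 0) -> float:
    """Linear warmup (optional), then linear decay from PEAK_LR to zero."""
    if warmup_steps and step < warmup_steps:
        return PEAK_LR * step / warmup_steps
    remaining = max(0, total_steps - step)
    decay_span = max(1, total_steps - warmup_steps)
    return PEAK_LR * remaining / decay_span


total = 1000  # illustrative total optimizer steps, not from the model card
print(linear_lr(0, total))     # peak lr at the start (no warmup)
print(linear_lr(500, total))   # halfway through: half of peak
print(linear_lr(1000, total))  # final step: 0.0
```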

Framework versions

  • Transformers 4.17.0.dev0
  • Pytorch 1.8.0
  • Datasets 1.18.3
  • Tokenizers 0.11.0

More Information

Refer to the original paper, CardioBERTpt: Transformer-based Models for Cardiology Language Representation in Portuguese, for additional details and performance on Portuguese NER tasks.
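The card describes use on NER tasks, but the published checkpoint is a fill-mask model without a token-classification head. A hypothetical sketch of loading it for downstream NER fine-tuning (the `num_labels=5` label count is an assumption, not from the card; the head is randomly initialized):

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("pucpr-br/cardiobertpt")
# Attach a fresh token-classification head for NER fine-tuning.
# num_labels is illustrative; use your own NER tag set size.
model = AutoModelForTokenClassification.from_pretrained(
    "pucpr-br/cardiobertpt", num_labels=5
)
```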

Acknowledgements

This study was financed in part by the CoordenaΓ§Γ£o de AperfeiΓ§oamento de Pessoal de NΓ­vel Superior - Brasil (CAPES) - Finance Code 001, and by Foxconn Brazil and Zerbini Foundation as part of the research project Machine Learning in Cardiovascular Medicine.

Citation

```bibtex
@INPROCEEDINGS{10178779,
  author={Schneider, Elisa Terumi Rubel and Gumiel, Yohan Bonescki and de Souza, João Vitor Andrioli and Mie Mukai, Lilian and Emanuel Silva e Oliveira, Lucas and de Sa Rebelo, Marina and Antonio Gutierrez, Marco and Eduardo Krieger, Jose and Teodoro, Douglas and Moro, Claudia and Paraiso, Emerson Cabrera},
  booktitle={2023 IEEE 36th International Symposium on Computer-Based Medical Systems (CBMS)},
  title={CardioBERTpt: Transformer-based Models for Cardiology Language Representation in Portuguese},
  year={2023},
  volume={},
  number={},
  pages={378-381},
  doi={10.1109/CBMS58004.2023.00247}
}
```

Questions?

Post a GitHub issue on the CardioBERTpt repo.

⚠️ Incomplete Data

Some information about this model is unavailable. Use with caution: verify details against the original source before relying on this data.

View Original Source β†’

πŸ“ Limitations & Considerations

  • Benchmark scores may vary based on evaluation methodology and hardware configuration.
  • VRAM requirements are estimates; actual usage depends on quantization and batch size.
  • FNI scores are relative rankings and may change as new models are added.
  • License: listed as Apache-2.0; verify licensing terms before commercial use.

Social Proof

HuggingFace Hub
16 downloads
πŸ”„ Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.


πŸ›‘οΈ Model Transparency Report

Technical metadata sourced from upstream repositories.

Open Metadata

🆔 Identity & Source

  • id: hf-model--pucpr-br--cardiobertpt
  • slug: pucpr-br--cardiobertpt
  • source: huggingface
  • author: Pucpr Br
  • license: Apache-2.0
  • tags: transformers, pytorch, bert, fill-mask, pt, license:apache-2.0, endpoints_compatible, region:us

⚙️ Technical Specs

  • architecture: null
  • params (billions): null
  • context length: null
  • pipeline tag: fill-mask

📊 Engagement & Metrics

  • downloads: 16
  • stars: 0
  • forks: 0

Data indexed from public sources. Updated daily.