MultiClinNER Multilingual Models

Multilingual clinical NER models trained on all 7 languages (CZ, EN, ES, IT, NL, RO, SV).

Best Model

  • Model: FacebookAI-xlm-roberta-large-C64-H3-E3-Arandom-%0.25-P0.2-42
  • Branch: main

Usage

# Load the best model (main branch)
from transformers import AutoTokenizer, AutoModelForTokenClassification

model = AutoModelForTokenClassification.from_pretrained("IEETA/MultiClinNER-MIXED")
tokenizer = AutoTokenizer.from_pretrained("IEETA/MultiClinNER-MIXED")

# Load a specific model variant
model = AutoModelForTokenClassification.from_pretrained("IEETA/MultiClinNER-MIXED", revision="BRANCH_NAME")
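After inference, per-token predictions still need to be merged into entity spans. A minimal post-processing sketch, assuming the model emits BIO-style labels (e.g. "B-DISEASE", "I-DISEASE", "O") — the actual label set is defined in the model's config.json, and the label name used below is illustrative only:

```python
def merge_bio(tokens, labels):
    """Merge parallel lists of tokens and BIO labels into (type, text) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, label in zip(tokens, labels):
        if label.startswith("B-"):
            # A "B-" tag starts a new entity; flush any entity in progress.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [token], label[2:]
        elif label.startswith("I-") and current_type == label[2:]:
            # An "I-" tag of the same type continues the current entity.
            current_tokens.append(token)
        else:
            # "O" (or a mismatched "I-") ends the current entity, if any.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities


print(merge_bio(
    ["Patient", "has", "type", "2", "diabetes"],
    ["O", "O", "B-DISEASE", "I-DISEASE", "I-DISEASE"],
))
# → [('DISEASE', 'type 2 diabetes')]
```

Note that XLM-RoBERTa tokenizes into subwords, so in practice you would first align subword predictions back to whole words (e.g. by taking the label of each word's first subword) before merging.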

All Models (7 variants)

| Branch | Model | Best? |
|---|---|---|
| main | FacebookAI-xlm-roberta-large-C64-H3-E3-Arandom-%0.25-P0.2-42 | Yes |
| FacebookAI-xlm-roberta-large-C64-H3-E3-Arandom-pct0.25-P0.2-123 | FacebookAI-xlm-roberta-large-C64-H3-E3-Arandom-%0.25-P0.2-123 | |
| FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-pct0.1-P0.2-42 | FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-%0.1-P0.2-42 | |
| FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-pct0.25-P0.2-123 | FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-%0.25-P0.2-123 | |
| FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-pct0.25-P0.2-42 | FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-%0.25-P0.2-42 | |
| FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-pct0.25-P0.5-123 | FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-%0.25-P0.5-123 | |
| FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-pct0.25-P0.5-42 | FacebookAI-xlm-roberta-large-C64-H3-E3-Aukn-%0.25-P0.5-42 | |