EXAONE 3.5 7.8B AWQ - 119 EMT Chatbot

An AWQ 4-bit quantized version of EXAONE 3.5 7.8B, fine-tuned for Korean 119 emergency medical services (EMS).

Model Details

  • Base Model: LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct
  • Quantization: AWQ 4-bit (5.32GB)
  • Fine-tuning: Korean 119 EMT emergency protocols
  • Language: Korean
  • Use Case: Emergency medical guidance for pediatric patients

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# EXAONE ships a custom model architecture, so trust_remote_code is required.
model = AutoModelForCausalLM.from_pretrained(
    "hymmmm/exaone-3.5-7.8b-awq",
    device_map="auto",
    trust_remote_code=True,
)

tokenizer = AutoTokenizer.from_pretrained(
    "hymmmm/exaone-3.5-7.8b-awq",
    trust_remote_code=True,
)
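A minimal generation sketch, continuing from the loading snippet above. It assumes the tokenizer ships a chat template (as the EXAONE 3.5 base tokenizer does); the helper names (`build_messages`, `generate_answer`) and the Korean system prompt are illustrative, not part of the released model.

```python
def build_messages(question: str):
    # The system prompt below is an example, not the prompt used in training.
    return [
        {"role": "system",
         "content": "당신은 119 구급대원을 돕는 소아 응급처치 안내 챗봇입니다."},
        {"role": "user", "content": question},
    ]

def generate_answer(model, tokenizer, question: str, max_new_tokens: int = 512):
    # Format the conversation with the tokenizer's built-in chat template.
    inputs = tokenizer.apply_chat_template(
        build_messages(question),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(
        inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,
    )
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

# Example call (requires the model and tokenizer loaded above):
# print(generate_answer(model, tokenizer, "아이가 경련을 해요. 어떻게 해야 하나요?"))
```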

Training Data

Fine-tuned on data based on the "119 EMT Field Emergency Treatment Standard Guidelines 2023 (Pediatric)".
