Chytrej1.5-90M-Base

A language model pretrained fully from scratch on the LLaMA architecture.

Chytrej (Czech slang for "clever/smart") is a long-term model series by PingVortex Labs. Every model in the series is pretrained fully from scratch; some releases may later be instruction fine-tuned on the custom base. The ongoing goal: every release must at least know the capital of France.

Built by PingVortex Labs.


Model Details

  • Parameters: 90M (89.1M exact)
  • Context length: 8,192 tokens
  • Language: English only
  • Format: base model (no instruction tuning)
  • Architecture: LLaMA
  • Weights: safetensors, BF16
  • License: Apache 2.0

Benchmarks

Evaluated with lm-eval-harness, 0-shot:

Task      Metric    Chytrej1.5  Chytrej1
ARC-Easy  acc       41.46%      39.73%
ARC-Easy  acc_norm  37.04%      34.47%
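The numbers above can be reproduced with the lm-evaluation-harness CLI; a sketch of the invocation (assuming the harness is installed, e.g. via `pip install lm-eval`, and that the default Hub checkpoint is used):

```shell
lm_eval --model hf \
  --model_args pretrained=pvlabs/Chytrej1.5-90M-Base \
  --tasks arc_easy \
  --num_fewshot 0
```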

Usage

from transformers import LlamaForCausalLM, PreTrainedTokenizerFast

# Load the model and tokenizer from the Hub
model = LlamaForCausalLM.from_pretrained("pvlabs/Chytrej1.5-90M-Base")
tokenizer = PreTrainedTokenizerFast.from_pretrained("pvlabs/Chytrej1.5-90M-Base")

# This is a base model: use plain continuation prompts, not chat templates
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, repetition_penalty=1.3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
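The `repetition_penalty=1.3` argument discourages a small base model from looping on the same tokens. A minimal sketch of the rescaling rule that Transformers' `RepetitionPenaltyLogitsProcessor` applies: the logit of every already-generated token is divided by the penalty if positive, or multiplied by it if negative, so with a penalty above 1 the token's score always drops.

```python
def apply_repetition_penalty(logits, generated_ids, penalty):
    """Rescale logits of tokens that already appear in the output."""
    out = list(logits)
    for tid in set(generated_ids):
        # Dividing a positive logit (or multiplying a negative one) by
        # penalty > 1 always lowers that token's score.
        out[tid] = out[tid] / penalty if out[tid] > 0 else out[tid] * penalty
    return out

# Tokens 0 and 1 were already generated and get penalized; token 2 is untouched.
print(apply_repetition_penalty([2.0, -1.0, 0.5], [0, 1], 2.0))  # [1.0, -2.0, 0.5]
```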

