Chytrej
A fully custom pretrained language model built from scratch on the LLaMA architecture.
Chytrej (Czech slang for "clever/smart") is a long-term model series by PingVortex Labs. Every model in the series is fully custom pretrained from scratch, and may then be instruction fine-tuned on that custom base. The ongoing goal: every release must at least know the capital of France.
Built by PingVortex Labs.
Evaluated with lm-eval-harness, 0-shot:
| Task | Metric | Chytrej1.5 | Chytrej1 |
|---|---|---|---|
| ARC-Easy | acc | 41.46% | 39.73% |
| ARC-Easy | acc_norm | 37.04% | 34.47% |
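The table reports both `acc` and `acc_norm` because they can disagree: in lm-eval-harness, `acc` picks the multiple-choice answer with the highest total log-likelihood, while `acc_norm` first normalizes each score by the answer's byte length, removing the bias toward short answers. A minimal sketch with hypothetical scores (the answer strings and log-likelihood values are made up for illustration):

```python
# Hypothetical per-choice total log-likelihoods for one multiple-choice item.
choices = {
    "Paris": -4.0,             # short answer, higher total log-likelihood
    "the city of Lyon": -6.4,  # longer answer, lower total log-likelihood
}

# acc: argmax of the raw total log-likelihood.
acc_pick = max(choices, key=choices.get)

# acc_norm: argmax of log-likelihood divided by the answer's byte length.
acc_norm_pick = max(choices, key=lambda c: choices[c] / len(c.encode("utf-8")))

print(acc_pick)       # "Paris"  (-4.0 beats -6.4)
print(acc_norm_pick)  # "the city of Lyon"  (-0.4 per byte beats -0.8)
```

The two metrics choosing different answers on items like this is why their aggregate percentages differ in the table above.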
```python
from transformers import LlamaForCausalLM, PreTrainedTokenizerFast

# Load the base model and its tokenizer from the Hub
model = LlamaForCausalLM.from_pretrained("pvlabs/Chytrej1.5-90M-Base")
tokenizer = PreTrainedTokenizerFast.from_pretrained("pvlabs/Chytrej1.5-90M-Base")

# The series' sanity check: every release should be able to complete this
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, repetition_penalty=1.3)
print(tokenizer.decode(outputs[0]))
```