How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Local-Novel-LLM-project/Assistance")
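A minimal sketch of calling the pipeline; the prompt text and sampling settings below are illustrative assumptions, not taken from the model card:

# Example only: prompt and sampling parameters are illustrative assumptions
result = pipe(
    "Write a Python function that returns the n-th Fibonacci number.",
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(result[0]["generated_text"])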
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Local-Novel-LLM-project/Assistance")
model = AutoModelForCausalLM.from_pretrained("Local-Novel-LLM-project/Assistance")
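For direct use of the loaded model and tokenizer, a minimal generation sketch might look like the following; the prompt and the token budget are assumptions, not part of the model card:

# Example only: a minimal generate() sketch; prompt and settings are assumptions
import torch

prompt = "Explain the Euclidean algorithm for computing the GCD."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Since the checkpoint is stored in BF16, you may also want to pass torch_dtype=torch.bfloat16 and device_map="auto" to from_pretrained to keep the 7B model within memory on a single GPU; both are standard Transformers arguments.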
Quick Links

Our Models

THIS IS A WIP MODEL

This model takes Ninja and equips it with knowledge of code and mathematics rather than novel-writing ability.

Model size: 7B params
Tensor type: BF16
Format: Safetensors