# Model Card for LegalSafe-Falcon-7B
## Model Details
### Model Description
LegalSafe-Falcon-7B is a safety-aligned large language model fine-tuned for the Indian legal domain. It is trained to reduce toxic, biased, and unsafe outputs while preserving accuracy in legal reasoning. The alignment combines Constitutional AI principles with Reinforcement Learning from AI Feedback (RLAIF).
- Developed by: Gopi M
- Model type: Causal Language Model (LLM)
- Language(s): English (Legal Domain - Indian Context)
- License: Apache 2.0 (inherits from base model)
- Finetuned from model: ybelkada/falcon-7b-sharded-bf16
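To make the RLAIF alignment mentioned above concrete, the sketch below illustrates the preference-labeling step in an RLAIF pipeline: an AI "judge" scores candidate responses against constitutional principles, and the higher-scoring response becomes the preferred training example. This is an illustrative toy, not this model's actual training code (which is not published in this card); in a real pipeline the judge is another LLM prompted with the constitution, and the `UNSAFE_TERMS` blocklist and scoring rules here are hypothetical stand-ins.

```python
# Illustrative sketch of RLAIF-style preference labeling; NOT the
# training code for this model. A real judge is an LLM prompted with
# constitutional principles, not a keyword heuristic.

# Hypothetical terms the toy judge penalizes.
UNSAFE_TERMS = ("fabricate evidence", "bribe", "forge")

def judge_score(response: str) -> float:
    """Toy constitutional critique: reward hedged, safe legal language."""
    score = 0.0
    lowered = response.lower()
    if any(term in lowered for term in UNSAFE_TERMS):
        score -= 1.0
    if "consult a qualified lawyer" in lowered:
        score += 0.5
    return score

def label_preference(prompt: str, response_a: str, response_b: str) -> dict:
    """Build a (chosen, rejected) pair from AI feedback scores."""
    if judge_score(response_a) >= judge_score(response_b):
        chosen, rejected = response_a, response_b
    else:
        chosen, rejected = response_b, response_a
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}
```

Pairs produced this way would then feed a reward model or a direct preference-optimization step.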
## Uses
### Direct Use
- Legal question answering
- Legal document summarization
- Safe text generation in sensitive domains
- Educational tools for legal studies
### Downstream Use
- Integration into legal chatbots
- AI-based legal assistants
- Compliance and advisory systems
- Safety-critical NLP applications
### Out-of-Scope Use
- Providing official legal advice
- Use in high-risk legal decision-making without human oversight
- Generating harmful, biased, or illegal content
## Bias, Risks, and Limitations
- May still exhibit residual bias from training data
- Performance may vary outside Indian legal domain
- Not a substitute for professional legal consultation
- Risk of hallucinations in complex legal scenarios
### Recommendations
- Use with human oversight in legal applications
- Validate outputs before deployment in critical systems
- Avoid relying on model for final legal decisions
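One concrete form of the oversight recommended above is a lightweight post-generation check before any output is surfaced. The sketch below uses a hypothetical blocklist and disclaimer text; it is a minimal starting point to adapt to a deployment's actual safety policy, not a safety layer shipped with this model.

```python
# Minimal post-generation guardrail sketch. BLOCKLIST and DISCLAIMER
# are hypothetical placeholders, not values from this model card.
BLOCKLIST = ("destroy evidence", "intimidate the witness")
DISCLAIMER = "This is general legal information, not legal advice."

def validate_output(text: str) -> str:
    """Reject flagged outputs; append a disclaimer to everything else."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in BLOCKLIST):
        return "The model declined to answer; please consult a qualified lawyer."
    return f"{text}\n\n{DISCLAIMER}"
```

In production this check would typically sit alongside human review for high-stakes queries rather than replace it.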
## How to Get Started

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "gopi30/rlaif-safety-alligned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Load in bfloat16 to match the sharded bf16 base model and reduce memory use.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Explain the concept of fundamental rights in Indian law."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
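The card does not specify generation settings, so the values below are illustrative defaults one might start from, not tuned recommendations for this model:

```python
# Illustrative sampling settings (assumed, not from the model card).
generation_kwargs = dict(
    max_new_tokens=200,     # cap response length
    do_sample=True,         # sample rather than greedy-decode
    temperature=0.7,        # lower = more deterministic
    top_p=0.9,              # nucleus sampling cutoff
    repetition_penalty=1.1, # discourage verbatim loops
)
# outputs = model.generate(**inputs, **generation_kwargs)
```

For reproducible evaluation, set `do_sample=False` instead and drop the sampling parameters.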