Sharded GGUF version of double7/vicuna-160m.
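Sharded GGUF repositories split the model file into numbered pieces. As a minimal sketch (assuming the shards follow llama.cpp's split-file naming convention, `<prefix>-%05d-of-%05d.gguf`; the prefix and shard count here are illustrative, not taken from this repository), the shard filenames can be enumerated like this:

```python
def shard_names(prefix: str, total: int) -> list[str]:
    """Build the shard filenames for a GGUF split into `total` pieces,
    following llama.cpp's "<prefix>-00001-of-0000N.gguf" convention."""
    return [f"{prefix}-{i:05d}-of-{total:05d}.gguf" for i in range(1, total + 1)]

# Hypothetical example: a model split into two shards.
print(shard_names("vicuna-160m", 2))
# Loaders that understand split GGUF files (e.g. llama.cpp) are typically
# given only the first shard's path and locate the remaining shards themselves.
```

This is only a naming sketch; check the repository's file listing for the actual shard names before downloading.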
Model tree for Felladrin/gguf-sharded-vicuna-160m

Base model: double7/vicuna-160m