error with mmproj
#1
by mksystem - opened
I get an error with llama.cpp:
clip_init: failed to load model 'mmproj-LFM2.5-VL-1.6b-Q8_0.gguf': operator(): unable to find tensor mm.input_norm.weight
mtmd_init_from_file: error: Failed to load CLIP model from mmproj-LFM2.5-VL-1.6b-Q8_0.gguf
srv load_model: failed to load multimodal model, 'mmproj-LFM2.5-VL-1.6b-Q8_0.gguf'
srv operator(): operator(): cleaning up before exit...
main: exiting due to model loading error
My command: llama-server -m LFM2.5-VL-1.6B-Q8_0.gguf --mmproj mmproj-LFM2.5-VL-1.6b-Q8_0.gguf -ngl 99 -s 0
I get the same error with mmproj-LFM2.5-VL-1.6b-BF16.gguf.
@mksystem, use the latest llama.cpp release, which includes https://github.com/ggml-org/llama.cpp/pull/18594
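In case it helps, a minimal sketch of getting a build that contains the fix, using the standard llama.cpp CMake build steps and then rerunning the original command from this thread (the model file paths are the ones posted above; adjust them to wherever your GGUF files live):

```shell
# Build current llama.cpp from source (standard build steps from the repo README).
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Rerun the original command with the freshly built server binary.
./build/bin/llama-server -m LFM2.5-VL-1.6B-Q8_0.gguf \
    --mmproj mmproj-LFM2.5-VL-1.6b-Q8_0.gguf -ngl 99 -s 0
```

Alternatively, downloading a prebuilt binary from the llama.cpp releases page dated after that PR was merged should also work.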
tarek-liquid changed discussion status to closed