Please see our fork of llama.cpp for more details on running MiniCPM-Llama3-V 2.5 with llama.cpp.
MiniCPM-Llama3-V 2.5 can also be run with ollama; please refer to our ollama integration for setup instructions.
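
As a minimal sketch, the snippet below shows how a multimodal query might be sent through ollama's Python client once the model is available locally. The model tag `minicpm-llama3-v2.5` and the image path are assumptions for illustration; the actual tag depends on how the model was pulled or created on your machine.

```python
# Minimal sketch: querying MiniCPM-Llama3-V 2.5 through ollama's Python client.
# Assumes `pip install ollama`, a running ollama server, and that the model has
# already been pulled/created locally. The tag "minicpm-llama3-v2.5" and the
# image path are placeholders -- adjust them to match your local setup.
import ollama

response = ollama.chat(
    model="minicpm-llama3-v2.5",          # hypothetical local model tag
    messages=[
        {
            "role": "user",
            "content": "Describe this image in one sentence.",
            "images": ["./example.jpg"],  # path to a local image file
        }
    ],
)

print(response["message"]["content"])
```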