Offline image analysis with LLaVA 1.6 + Visual Studio build
by Baris
This project shows how to compile and run LLaVA locally on Windows, without any API, without the cloud, and without Python dependencies.
- Fully offline LLaVA Vision model
- Compiled llama-llava-cli (.exe)
- GGUF model support
- GUI + image understanding
- Python wrapper via `llava_cpp.py`
Project structure:

- `llama-llava-cli.exe`: compiled CLI interface for LLaVA
- `llava_cpp.py`: Python wrapper for CLI calls
- `example_usage.py`: test script
- `models/`: place your LLaVA + MMProj models here
- `screenshots/`: test images
- Main model: https://huggingface.co/liuhaotian/llava-v1.6-mistral-7b-GGUF
- MMProj encoder: https://huggingface.co/liuhaotian/llava-v1.6-mistral-7b-GGUF/tree/main/mmproj
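If you prefer scripting the download, the huggingface_hub package can fetch individual files. This is optional and not something the project itself depends on; the filenames below are illustrative placeholders, so check the repository pages above for the exact quantization you want.

```python
from huggingface_hub import hf_hub_download

# Filenames are illustrative; pick the quantization you actually want.
hf_hub_download(
    repo_id="liuhaotian/llava-v1.6-mistral-7b-GGUF",
    filename="llava-v1.6-mistral-7b.Q4_K_M.gguf",  # hypothetical main model file
    local_dir="models",
)
hf_hub_download(
    repo_id="liuhaotian/llava-v1.6-mistral-7b-GGUF",
    filename="mmproj/mmproj-model-f16.gguf",  # hypothetical MMProj file; lands in models/mmproj/
    local_dir="models",
)
```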
Put both files inside the models/ folder:
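For example (the exact filenames depend on the quantization you download; the names here are only illustrative):

```
models/
  llava-v1.6-mistral-7b.Q4_K_M.gguf    <- main model
  mmproj-model-f16.gguf                <- MMProj encoder
```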
Example usage (`example_usage.py`):

```python
from llava_cpp import analyze_image

# Ask the local LLaVA model to describe a screenshot
response = analyze_image("screenshots/test_gui.png")
print("Response:", response)
```