
llava-cpp-offline 🧠🖼️

Offline image analysis with LLaVA 1.6, built with Visual Studio
by Baris

This project shows how to compile and run LLaVA locally on Windows – no cloud API, no internet connection, and no heavy Python ML dependencies.

🔥 Features

  • Fully offline LLaVA Vision model
  • Compiled llama-llava-cli (.exe)
  • GGUF model support
  • GUI + image understanding
  • Python wrapper via llava_cpp.py

📦 Repository contents

  • llama-llava-cli.exe → compiled CLI interface for LLaVA
  • llava_cpp.py → Python wrapper for CLI calls
  • example_usage.py → test script
  • models/ → place your LLaVA + MMProj models here
  • screenshots/ → test images

🧠 Download models

👉 Main model:
https://huggingface.co/liuhaotian/llava-v1.6-mistral-7b-GGUF

👉 MMProj encoder:
https://huggingface.co/liuhaotian/llava-v1.6-mistral-7b-GGUF/tree/main/mmproj

Put both files (the main GGUF model and the MMProj encoder) inside the models/ folder.
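The exact GGUF filenames depend on which quantization you download; a typical layout might look like this (the filenames below are illustrative, not required names):

```
models/
├── llava-v1.6-mistral-7b.Q4_K_M.gguf   ← main model (example quantization)
└── mmproj-model-f16.gguf               ← MMProj vision encoder
```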

🐍 Python example

```python
from llava_cpp import analyze_image

response = analyze_image("screenshots/test_gui.png")
print("Response:", response)
```
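Under the hood, a wrapper like llava_cpp.py typically shells out to the compiled CLI with subprocess. The sketch below is an assumption about how such a wrapper could look, not the repository's actual implementation; the flags (`-m`, `--mmproj`, `--image`, `-p`) follow llama.cpp's LLaVA CLI conventions, and the model filenames are placeholders:

```python
import subprocess
from pathlib import Path

CLI = Path("llama-llava-cli.exe")            # compiled CLI shipped in this repo
MODEL = Path("models/llava-model.gguf")      # placeholder: your main GGUF model
MMPROJ = Path("models/mmproj.gguf")          # placeholder: your MMProj encoder

def build_command(image_path, prompt):
    """Assemble the CLI invocation for one image/prompt pair."""
    return [
        str(CLI),
        "-m", str(MODEL),            # main language model (GGUF)
        "--mmproj", str(MMPROJ),     # vision encoder (GGUF)
        "--image", str(image_path),  # image to analyze
        "-p", prompt,                # text prompt
    ]

def analyze_image(image_path, prompt="Describe this image."):
    """Run the CLI and return its stdout as the model's response."""
    result = subprocess.run(
        build_command(image_path, prompt),
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```

Keeping command assembly in its own function makes the wrapper easy to test without actually launching the executable.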

About

Offline AI vision system | Local LLaVA power | Barissee Dev
