Run Zush analysis locally on your Mac with Ollama. Your files are processed by a model on your computer instead of a cloud AI provider.
When Local (Ollama) is enabled in Zush, supported file analysis runs through your local Ollama server. Zush does not send analysis content to Zush cloud or third-party AI providers in this mode. You still control which model is installed, where Ollama stores it, and when Ollama is running.
Download Ollama for macOS from the official website, install it, and open the app once so the local server can start.
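Before moving on, you can confirm from Terminal that the local server is actually up. A minimal sketch, assuming Ollama's default host of 127.0.0.1:11434; the /api/version endpoint returns a small JSON blob when the server is running.

```shell
# ollama_up: succeed only if the local Ollama server answers on its
# default port (127.0.0.1:11434). Adjust the URL if you moved the server.
ollama_up() {
  curl -sf "http://127.0.0.1:11434/api/version" > /dev/null
}

if ollama_up; then
  echo "Ollama server is running"
else
  echo "Ollama server is not reachable - open the Ollama app first"
fi
```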
Zush works best with a vision-capable model because many files are images, screenshots, PDFs, or visual previews. Start with:
ollama pull qwen2.5vl:3b
Ollama usually runs in the background after you open it. If Zush cannot connect, start it from Terminal:
ollama serve
Open Zush, go to AI Setup, turn on Local (Ollama), refresh the model list, select your model, and run Test.
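You can also mimic the Test step from Terminal. A sketch, assuming the default host and that qwen2.5vl:3b is already pulled; /api/generate is Ollama's standard one-shot generation endpoint (how Zush itself talks to Ollama is an assumption here).

```shell
# One-shot request against the local server. "stream": false asks for a
# single JSON response instead of a stream of chunks.
payload='{"model": "qwen2.5vl:3b", "prompt": "Reply with OK", "stream": false}'
curl -s http://127.0.0.1:11434/api/generate -d "$payload"
```

If this returns a JSON response, Zush's Test against the same model should pass as well.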
Pick a model based on the job: qwen2.5vl:3b for speed, gemma3:4b for balance, or granite3.2-vision:2b for documents.
qwen2.5vl:3b (for speed): The fastest first choice for everyday screenshots and images. Use it when you want local processing to feel responsive on most Macs.
ollama pull qwen2.5vl:3b
gemma3:4b (for balance): A good default when you want better descriptions without jumping to a large model. Start here if speed and quality both matter.
ollama pull gemma3:4b
granite3.2-vision:2b (for documents): A compact vision model that is useful for document previews, scans, and structured visual content.
ollama pull granite3.2-vision:2b
Or choose another vision-capable model from the Ollama model catalog.
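Whichever model you pick, you can inspect it before pointing Zush at it. A sketch, assuming the model is already pulled; recent Ollama builds list a model's capabilities in ollama show, and the models above should report vision among them.

```shell
MODEL="qwen2.5vl:3b"   # swap in gemma3:4b or granite3.2-vision:2b

# Prints the model's details (parameters, context length, capabilities).
# Assumption: a reasonably recent Ollama build.
ollama show "$MODEL"
```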
If no models appear in Zush, run ollama list in Terminal. If the list is empty, pull a model first, then click refresh in Zush.
If Zush cannot connect to Ollama, make sure Ollama is running and that the host is set to http://127.0.0.1:11434 in Zush connection settings.
If local analysis feels slow, use a smaller model like qwen2.5vl:3b, close memory-heavy apps, or switch back to Cloud when you need faster batch processing.
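The connection and model checks above can be chained into one quick diagnostic. A sketch, assuming the default host; /api/tags is the endpoint behind ollama list.

```shell
# Diagnostic: is the server reachable, and which models does it have?
HOST="http://127.0.0.1:11434"   # Ollama's default; match Zush's host setting

if curl -sf "$HOST/api/tags" > /dev/null; then
  echo "Ollama reachable at $HOST; installed models:"
  ollama list
else
  echo "Cannot reach Ollama at $HOST - try: ollama serve"
fi
```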
Local mode is separate from Cloud and BYOK. Cloud uses Zush credits by default, BYOK uses your provider key, and Local uses Ollama on your Mac.