Models
Run open-source AI models on Apple devices.
Patagonia AI is built around local model execution, so people can experiment with AI while keeping their work and data on their own hardware.
Model approach
Patagonia AI is designed for running open-source models locally on Apple devices. The goal is to let users choose models optimized for their device, workflow, and privacy needs.
iPhone local LLM model families
Common small-model families suited to running locally on iPhone include Llama 3.2 1B and 3B, Qwen3 0.6B, Qwen3 1.7B, Qwen3 4B, and Gemma 3 1B. Patagonia AI users should confirm current availability in the in-app model catalog, because supported models can vary by app release and device.
Device expectations
Local AI performance depends on device generation, available memory, model size, and free storage. Smaller models usually start faster and use fewer resources, while larger models may produce better results on capable hardware.
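To make the memory-versus-model-size trade-off concrete, here is a minimal Swift sketch of a sizing heuristic: it maps a device's physical RAM to one of the model families mentioned above. The thresholds and the mapping itself are illustrative assumptions, not Patagonia AI's actual catalog logic.

```swift
import Foundation

// Hypothetical sizing heuristic: pick a model tier from physical RAM.
// The GiB thresholds below are assumptions for illustration only.
func suggestedModel(forPhysicalMemoryBytes bytes: UInt64) -> String {
    let gib = Double(bytes) / 1_073_741_824.0  // bytes -> GiB
    switch gib {
    case ..<4:  return "Qwen3 0.6B"   // lowest-memory devices
    case ..<6:  return "Llama 3.2 1B" // mid-range memory
    case ..<8:  return "Llama 3.2 3B"
    default:    return "Qwen3 4B"     // most capable hardware
    }
}

// On device, the app could pass the actual RAM reported by the system.
let ram = ProcessInfo.processInfo.physicalMemory
print(suggestedModel(forPhysicalMemoryBytes: ram))
```

A real catalog would also account for free storage and OS memory limits before offering a download, but the basic shape is the same: capability check first, model choice second.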
Offline use
Offline use requires the app and the needed model files to already be downloaded to the device. App Store downloads, model downloads, community links, and support email still require internet access.
For a practical app-selection checklist, see the offline LLM app for iPhone guide.
Apple platforms
Get the app
Download Patagonia AI from the App Store and follow the in-app model guidance for your device.