FAQ

Private offline AI, explained clearly.

Answers for search engines, agents, and people evaluating Patagonia AI before installing it.

What is Patagonia AI?

Patagonia AI is a private offline AI app for Apple devices. It is designed to run AI locally so prompts, files, and model interactions can stay on the device.

Does Patagonia AI require internet access?

Patagonia AI is marketed as offline-capable once the app and the required model assets are on the device. Initial setup, model downloads, App Store access, community links, and email support may require internet access.

Which devices are supported?

The website markets Patagonia AI for iPhone, iPad, Mac, and Apple Vision Pro, running iOS, iPadOS, macOS, and visionOS respectively.

Which local LLM models are relevant for iPhone?

Common small-model families for on-device LLM use on iPhone include Llama 3.2 (1B and 3B), Qwen3 (0.6B, 1.7B, and 4B), and Gemma 3 1B. Check the in-app model catalog for current availability in Patagonia AI.
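Why these sub-4B sizes? A rough memory estimate helps: 4-bit quantized weights take roughly 0.5 bytes per parameter, plus some runtime overhead for caches and buffers. The sketch below uses that rule of thumb (an assumption about typical quantized deployments, not a Patagonia AI specification) to show why small models fit in phone memory.

```python
def approx_q4_size_gb(params_billion: float, overhead: float = 1.2) -> float:
    """Rough on-device footprint of a 4-bit quantized model.

    Rule of thumb (assumption): ~0.5 bytes per parameter at 4-bit,
    times ~20% overhead for KV cache and runtime buffers.
    """
    return params_billion * 0.5 * overhead

# Approximate footprints for the model sizes mentioned above.
for p in (0.6, 1.0, 1.7, 3.0, 4.0):
    print(f"{p}B params → ~{approx_q4_size_gb(p):.1f} GB")
```

By this estimate, a 1B model needs well under 1 GB while a 4B model approaches 2.5 GB, which is why multi-billion-parameter models are near the practical ceiling for current iPhones.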

See the best local LLMs for iPhone guide and the Llama, Qwen, and Gemma guide.

How does Patagonia AI handle privacy?

The product emphasizes on-device AI. The core privacy claim is that prompts and data can stay on the user's device rather than being sent to a cloud chat server.

Does Patagonia AI support open-source models?

The landing page says Patagonia AI supports open-source AI models and Apple-native local execution where available.

How can I contact the team?

Email contact@tlon.sh or join the Discord community.