Local inference
Run 13 models on iPhone and iPad. No server round-trips, no cloud dependency for core use. Models →
Features
PDF Reader, Document Reader, File Writer, File Extraction, Web Fetcher, Web Search, Code Interpreter, Data Analyzer, Image Analyzer, OCR, Clipboard, Translation, Siri Bridge, and more.
Once models are downloaded, SoloLLM works without any internet connection. OCR, translation, code execution, and file operations all run locally.
Read and write documents directly through the iOS Files app. Work with PDFs, text files, and more without leaving SoloLLM.
Connect structured tools and external systems via Model Context Protocol. Technical depth for developers, invisible to everyone else. Learn more →
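For the developers in the room: MCP messages are plain JSON-RPC 2.0. A sketch of what a tool invocation looks like on the wire, using the protocol's `tools/call` method (the tool name and arguments here are illustrative, not SoloLLM-specific):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_notes",
    "arguments": { "query": "quarterly report" }
  }
}
```

The server replies with a result payload the model can read as context, so any MCP-compatible tool plugs in without custom glue code.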
Conversations persist, messages are editable, and everything exports to Markdown or JSON. Your work doesn't disappear.
OCR works on-device. Text is extracted from photos or files and inserted into the prompt for the model. Inference remains text-only.
The Translation tool uses Apple's Translate API to translate across dozens of languages entirely on-device, no internet required.
Give the model access to your clipboard, making it easy to pipe content in and out of any workflow without manual copy-paste.