
- Offline Functionality: works fully offline, with no internet access required.
- Device Optimization: runs models optimized for Apple silicon.
- Personalization Options: adjust themes, fonts, and system prompts to your preference.
- Shortcuts Integration: connect outputs from local models to other actions in your workflows.
Fullmoon ("local intelligence") lets users interact with large language models directly on their devices, keeping conversations private and convenient. Because it is designed to work fully offline, it suits users without consistent internet access. The app is optimized for Apple silicon and runs efficiently across iOS, iPadOS, macOS, and visionOS.
The app includes models such as Llama-3.2-1B-Instruct-4bit (0.7 GB download, roughly 1B parameters), Llama-3.2-3B-Instruct-4bit (1.8 GB download, roughly 3B parameters), and DeepSeek-R1-Distill-Qwen-1.5B, available in both 4-bit and 8-bit precision.
Fullmoon is a good fit for:
- Individuals seeking private interactions with AI models.
- Users who need AI assistance while offline.
- Developers looking to integrate local AI outputs into their workflows.
Which platforms does fullmoon support? It works on iOS, iPadOS, macOS, and visionOS.
Is fullmoon free? Yes, fullmoon is free and open source.
How do I get the latest features and models? You can install them early via TestFlight.