Announcing Gemma 3n preview: powerful, efficient, mobile-first AI
- #AI
- #Gemma3n
- #OnDevice
- Gemma 3n is a new open model optimized for on-device AI, designed to run efficiently on phones, tablets, and laptops.
- Developed in collaboration with mobile hardware leaders like Qualcomm, MediaTek, and Samsung, it supports fast, multimodal AI experiences.
- Introduces Per-Layer Embeddings (PLE) to cut accelerator RAM requirements, letting models with larger raw parameter counts run within a dynamic memory footprint of roughly 2GB-3GB.
- Enables real-time, private AI applications like speech transcription, translation, and contextual text generation on-device.
- Early preview available now, with broader availability on Android and Chrome later this year.
- Undergoes rigorous safety evaluations and aligns with Google's responsible AI policies.
- Initial access includes model weights, documentation, and tools for developers.
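The PLE idea above, reducing resident memory by keeping per-layer embedding parameters out of fast accelerator RAM, can be illustrated with some back-of-the-envelope arithmetic. This is a hedged sketch only: the parameter counts, PLE share, and bytes-per-parameter below are illustrative assumptions, not Gemma 3n's published specifications.

```python
# Illustrative memory-footprint arithmetic for Per-Layer Embeddings (PLE).
# All figures here are hypothetical assumptions, not official Gemma 3n specs.

GB = 1024 ** 3  # bytes per gibibyte


def resident_footprint_gb(total_params, ple_params, bytes_per_param=1.0):
    """Approximate accelerator RAM needed when PLE parameters are held
    outside fast accelerator memory (e.g. streamed from storage) rather
    than kept resident alongside the core transformer weights."""
    resident = total_params - ple_params
    return resident * bytes_per_param / GB


# Hypothetical model with 5B total parameters, of which 3B are per-layer
# embeddings that can live off-accelerator; ~0.5 bytes/param assumes
# 4-bit quantization.
print(f"{resident_footprint_gb(5e9, 3e9, 0.5):.2f} GB resident")  # → 0.93 GB resident
```

Under these assumed numbers, only the 2B non-PLE parameters need to stay resident, which is how a model with a larger raw parameter count can fit a memory budget closer to that of a much smaller model.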