Hasty Briefs

Raspberry Pi's New AI Hat Adds 8GB of RAM for Local LLMs

4 months ago
  • #LLM
  • #AI Hardware
  • #Raspberry Pi
  • Raspberry Pi launched the $130 AI HAT+ 2 with Hailo 10H and 8GB LPDDR4X RAM.
  • The Hailo 10H can run LLMs standalone, freeing the Pi's CPU and system RAM.
  • The chip delivers 40 TOPS of INT8 NPU inference and 26 TOPS of INT4 machine-vision performance.
  • Performance is limited by the 3W power cap, compared to the Pi's 10W CPU.
  • The HAT's 8GB RAM restricts which LLMs fit; a Pi 5 can have 16GB, making it better suited to medium-sized models.
  • The Hailo 10H is slightly more power-efficient but slower than the Pi's CPU at LLM inference.
  • Qwen 30B was compressed to fit on a 16GB Pi 5, showing potential for local LLMs.
  • AI HAT+ 2 excels in vision processing but similar performance is available with cheaper alternatives.
  • Mixed mode (vision + inference) suffered from segmentation faults and device-readiness issues.
  • The HAT is best for power-constrained applications needing both vision and inference.
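The "compressed to fit" point above comes down to weight quantization arithmetic. A minimal sketch of the estimate, assuming "compressed" means low-bit quantization (e.g. 4-bit weights, as in llama.cpp-style tooling; the article's exact method is not stated here):

```python
def weight_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate LLM weight storage in decimal GB.

    Ignores runtime overhead such as KV cache and activations,
    so real memory use is somewhat higher.
    """
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 30B-parameter model at 16-bit would need ~60 GB -- far beyond any Pi.
print(f"30B @ 16-bit: {weight_memory_gb(30, 16):.1f} GB")  # 60.0 GB
# At 4-bit it shrinks to ~15 GB, which just squeezes into a 16GB Pi 5.
print(f"30B @ 4-bit:  {weight_memory_gb(30, 4):.1f} GB")   # 15.0 GB
# The HAT's 8GB would cap 4-bit models at roughly 14-15B parameters.
print(f"14B @ 4-bit:  {weight_memory_gb(14, 4):.1f} GB")   # 7.0 GB
```

This is why the 8GB on the HAT is the binding constraint: halving the bit width halves the weight footprint, but a 30B-class model still needs the 16GB Pi 5's system RAM.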