Hasty Briefs beta

Was my $48K GPU server worth it?

a day ago
  • #independent research
  • #GPU server build
  • #cloud vs on-premise
  • Author quit FAANG job in 2024 to become an independent researcher and built a custom GPU server named 'grumbl' with 6x RTX 6000 Ada GPUs.
  • The server cost $48K, justified on the grounds that faster experiments would offset the lost income; the RTX 6000 Ada was chosen over the A100 and H100 on price/performance, favoring its FP8 support and inference speed.
  • Power constraints in an apartment led to using two power supplies on separate circuits; a professional builder ensured the setup was safe, and the server was later moved to a basement.
  • Compared to cloud rental, ownership required about a year at 85% utilization to break even, factoring in electricity (~$3,000) and ignoring time costs.
  • GPU usage was tracked over time; utilization averaged 76% overall and 85% since January 2025, below the expected 95%+.
  • Calculations as of March 13, 2026, show equivalent cloud rental would have cost $68,000, a saving of $17,000; with costs now covered, the server saves $90-$105 per day.
  • The primary goal was to build something cool and enable high-risk experiments, leading to a breakthrough in LLMs, with a product launch planned.
  • The author advises caution with custom builds, citing issues like slow GPU interconnects and riser failures, and suggests standard servers or colocation for most people.
  • Ownership shifted the author's mindset from weighing cost per experiment to feeling obligated to keep the GPUs busy, while avoiding the hassle of managing cloud instances.
  • Acknowledgments include sponsors and resources, with contact details for questions.
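The cost comparison in the bullets above can be sketched as rough arithmetic. The dollar figures are from the article; the implied per-GPU-hour rate at the break-even point is a back-of-the-envelope derivation assuming 6 GPUs running for one year at 85% utilization, not a quoted cloud price:

```python
# Rough break-even arithmetic using the figures from the article.
hardware_cost = 48_000    # server build
electricity_cost = 3_000  # approximate power cost to date
cloud_cost = 68_000       # estimated equivalent cloud rental as of 2026-03-13

total_owned = hardware_cost + electricity_cost
savings = cloud_cost - total_owned
print(f"Ownership cost: ${total_owned:,}")  # $51,000
print(f"Savings vs cloud: ${savings:,}")    # $17,000

# Implied cloud price per GPU-hour at the stated break-even point
# (assumption: 6 GPUs, one year, 85% utilization -- not a quoted rate).
gpu_hours = 6 * 365 * 24 * 0.85
implied_rate = total_owned / gpu_hours
print(f"Implied rate: ${implied_rate:.2f}/GPU-hour")
```

The $17,000 figure falls out directly: $68,000 of cloud rental minus the ~$51,000 total cost of ownership.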