30 Years of HPC: many hardware advances, little adoption of new languages
2 days ago
- #parallel computing
- #language design
- #HPC programming
- The keynote at HIPS 2025 reflected on 30 years of HPC, highlighting massive hardware improvements: core counts have grown by factors of hundreds to hundreds of thousands, and peak performance by factors of millions to tens of millions.
- HPC programming notations have largely stagnated: Fortran, C, and C++ remain the dominant languages, and MPI and OpenMP the key programming models, while GPU computing has introduced newer models such as CUDA and HIP.
- Hardware advances (e.g., vector instructions, multicore, GPUs) have mostly made programming harder by adding complexity; the notable exception is high-radix networks, which largely freed programmers from reasoning about network topology.
- New mainstream languages (e.g., Python, Rust, Julia) emphasize productivity and safety but lack HPC-specific features such as control over data locality, which hinders their adoption in HPC.
- Despite many attempts (e.g., Chapel, ZPL, HPF), no new HPC language has been broadly adopted, owing to reliance on legacy code, funding biased toward hardware over software, and the social challenges of adoption.
- Chapel is presented as a resilient language that abstracts data movement and has adapted to decades of hardware change, but its future depends on community support and sustained funding.
- To advance HPC programming, the recommendations include embracing the ubiquity of parallelism, funding the transition of research software into production, and encouraging users to experiment with new technologies.