Hasty Briefs (beta)

Show HN: Stochastic Gradient in Hilbert Spaces

12 hours ago
  • #optimization theory
  • #infinite-dimensional Hilbert spaces
  • #stochastic gradient methods
  • Development of a rigorous theory of stochastic gradient methods in infinite-dimensional Hilbert spaces.
  • Assembly of functional-analytic and measure-theoretic tools, including key inequalities.
  • Demonstration that various definitions of 'stochastic gradient' in function spaces agree under mild assumptions.
  • Establishment of well-posedness for the discrete- and continuous-time dynamics, linking them to gradient-flow PDEs (a schematic of this link follows the list).
  • Non-asymptotic convergence guarantees with explicit constants across several regimes: convex, strongly convex, and nonconvex landscapes, as well as heavy-tailed noise and composite models (a textbook-style template for the strongly convex case appears after this list).
  • Comparison of weak versus strong convergence and resolution of measurability issues in infinite dimensions.
  • Spectral analysis of the linearized dynamics, explaining mode-by-mode behavior and slow directions via the operator spectrum (see the toy example at the end of this list).
  • Extensions to Gaussian/RKHS settings and Hilbert manifolds, plus an analysis of which results carry over to Banach spaces.
  • Analysis of five practical discretizations, proving stability and consistency and hence convergence.
  • Case studies in quantum ground states, elasticity, optimal control, and Bayesian inverse problems.
  • Curated list of open problems for future work on stochastic optimization in infinite dimensions.
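
For context on the well-posedness and discretization bullets: the continuous-time object is typically a gradient flow on a Hilbert space H, and its simplest explicit-Euler stochastic discretization recovers the familiar SGD update. This is a generic schematic, not the post's notation:

    \dot u(t) = -\nabla E\big(u(t)\big) \ \text{in } H,
    \qquad
    u_{k+1} = u_k - \eta\,\big(\nabla E(u_k) + \xi_k\big),
    \qquad \mathbb{E}[\xi_k] = 0 .

Stability of the scheme plus consistency with the flow is the standard route from the discrete iterates to the continuous-time (PDE) limit.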
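
As a reference point for the strongly convex regime above (the post's constants and assumptions will differ), the classical bound for SGD on a \mu-strongly convex, L-smooth objective with unbiased gradient noise of variance at most \sigma^2 and step size \eta \le 1/L is

    \mathbb{E}\,\lVert u_k - u^\ast \rVert^2
    \;\le\; (1 - \eta\mu)^k \,\lVert u_0 - u^\ast \rVert^2
    \;+\; \frac{\eta\,\sigma^2}{\mu}.

Its proof uses only inner-product identities and co-coercivity, so the bound itself is dimension-free and carries over to Hilbert spaces; it is quoted here only as the familiar shape such guarantees take.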
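
To make the spectral-analysis bullet concrete, here is a small toy sketch (my own construction, not code from the post): SGD on a diagonal quadratic over a truncated sequence space, where the error in eigenmode i contracts at rate (1 - eta*lambda_i) per step, so modes with small eigenvalues are the slow directions.

    import numpy as np

    # Toy model: minimise f(x) = 0.5*<A x, x> - <b, x> on a truncated l^2 space,
    # with a diagonal operator A whose eigenvalues lambda_i = 1/i^2 decay to zero.
    rng = np.random.default_rng(0)
    n = 200                                   # truncation level
    lam = 1.0 / np.arange(1, n + 1) ** 2      # operator spectrum
    b = lam * rng.standard_normal(n)          # right-hand side
    x_star = b / lam                          # minimiser A^{-1} b

    eta = 0.9 / lam.max()                     # step size below 1/L
    sigma = 1e-3                              # gradient-noise scale
    x = np.zeros(n)

    for _ in range(5000):
        noise = sigma * rng.standard_normal(n)
        x -= eta * (lam * x - b + noise)      # stochastic gradient step

    err = np.abs(x - x_star)
    print("error in fastest mode (largest eigenvalue): ", err[0])
    print("error in slowest mode (smallest eigenvalue):", err[-1])

After a few thousand steps the large-eigenvalue modes have settled at the noise floor while the small-eigenvalue tail has barely moved, which is the mode-by-mode picture the post analyzes via the operator spectrum.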