Hasty Briefs (beta)

Forward propagation of errors through time

6 days ago
  • #Backpropagation Through Time
  • #Recurrent Neural Networks
  • #Forward Propagation
  • The article introduces a novel method called Forward Propagation of Errors Through Time (FPTT) as an alternative to Backpropagation Through Time (BPTT) for training recurrent neural networks (RNNs).
  • FPTT propagates errors forward in time using a warm-up phase to determine initial conditions, eliminating the need for backward passes and reducing memory requirements.
  • The method is theoretically sound and successfully trains deep RNNs on tasks like sequential MNIST, but suffers from numerical instability in 'forgetting' regimes.
  • FPTT's instability arises from a conflict: contractive RNNs (which are desirable for stable forgetting) have Jacobians with eigenvalue magnitudes below one, so inverting those Jacobians to propagate errors forward yields eigenvalue magnitudes above one, and the errors grow geometrically.
  • The article compares FPTT with BPTT and Real-Time Recurrent Learning (RTRL), highlighting FPTT's advantages (exact gradients, no backward pass) and limitations (numerical instability, Jacobian inversion costs).
  • Despite its potential for neuromorphic or analog hardware, FPTT is deemed impractical for widespread use due to its instability and computational overhead.
  • The authors share their findings to inspire new ideas in RNN training, emphasizing the importance of questioning conventional methods like BPTT.
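The instability mechanism summarized above can be sketched numerically. The snippet below is a minimal illustration (not the article's actual code) using a hypothetical linear RNN h_t = W h_{t-1} with a contractive weight matrix W: propagating an error signal forward in time means applying W⁻¹ at each step, and because W's eigenvalue magnitudes are below one, W⁻¹'s are above one, so the error norm explodes.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Build a contractive recurrent weight matrix (spectral radius ~0.9),
# i.e. the "desirable" forgetting regime the article refers to.
W = rng.standard_normal((n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Forward error propagation requires the inverse Jacobian; for a linear
# RNN that is simply W^{-1}, whose spectral radius exceeds 1/0.9.
W_inv = np.linalg.inv(W)

err = rng.standard_normal(n)
norms = [np.linalg.norm(err)]
for _ in range(50):
    err = W_inv @ err  # one forward error-propagation step
    norms.append(np.linalg.norm(err))

print(f"initial error norm: {norms[0]:.3f}")
print(f"after 50 steps:     {norms[-1]:.3e}")  # grows geometrically
```

Running this shows the error norm increasing by orders of magnitude over 50 steps, matching the article's point that the method is sensitive to eigenvalue magnitudes even when the underlying gradients are exact.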