IBM Patented Euler's 200-Year-Old Math Technique for 'AI Interpretability'
- #Patent Controversy
- #Continued Fractions
- #Neural Networks
- LeetArxiv is a successor to Papers With Code, which has since shut down.
- IBM holds a patent on using derivatives to find convergents of generalized continued fractions, essentially a PyTorch implementation of a 200-year-old technique associated with Euler, Gauss, and Ramanujan.
- The paper 'CoFrNets: Interpretable Neural Architecture Inspired by Continued Fractions' rebrands continued fractions as 'ladders' and basic division as the '1/z non-linearity'.
- Continued fractions have a long history: they have been used to approximate Pi and to design gear systems, and they appear throughout Ramanujan's work (a convergent computation is sketched after this list).
- The authors implemented a continued-fraction library in PyTorch, chaining linear layers with the reciprocal function as the non-linearity so that the network mirrors a generalized continued fraction (a minimal PyTorch sketch follows this list).
- Testing on a waveform dataset yielded 61% accuracy, below the state of the art, highlighting the differentiability problems of the reciprocal non-linearity, which blows up at zero.
- IBM's patent on this implementation could affect mechanical engineers working on gear design, mathematicians, and numerical analysts who routinely rely on continued fractions, despite the technique's long-standing prior existence.
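To ground the historical bullet above, here is a plain-Python sketch of how convergents of a generalized continued fraction can be computed with the classical fundamental recurrence, applied to the well-known expansion pi = 3 + 1^2/(6 + 3^2/(6 + 5^2/(6 + ...))). The helper name `gcf_convergents` and the choice of expansion are illustrative; this is not IBM's patented derivative-based method or the paper's code.

```python
from fractions import Fraction

def gcf_convergents(b0, a, b, n_terms):
    """Convergents h_n/k_n of b0 + a(1)/(b(1) + a(2)/(b(2) + ...))
    via the fundamental recurrence
    h_n = b_n*h_{n-1} + a_n*h_{n-2},  k_n = b_n*k_{n-1} + a_n*k_{n-2}."""
    h_prev, h = 1, b0   # h_{-1}, h_0
    k_prev, k = 0, 1    # k_{-1}, k_0
    convergents = [Fraction(h, k)]
    for n in range(1, n_terms + 1):
        h, h_prev = b(n) * h + a(n) * h_prev, h
        k, k_prev = b(n) * k + a(n) * k_prev, k
        convergents.append(Fraction(h, k))
    return convergents

# pi = 3 + 1^2/(6 + 3^2/(6 + 5^2/(6 + ...))):
# partial numerators a_n = (2n-1)^2, partial denominators b_n = 6.
for c in gcf_convergents(3, lambda n: (2 * n - 1) ** 2, lambda n: 6, 8):
    print(f"{c} = {float(c):.10f}")
```

The successive convergents oscillate around pi and tighten with every added term.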
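To make the 'ladder' idea concrete, below is a minimal PyTorch sketch under stated assumptions: each rung is an `nn.Linear` layer, the reciprocal plays the role of the '1/z non-linearity', numerators are fixed to 1, and a small epsilon clamp keeps the denominator away from zero. The class name `LadderSketch`, the clamp, and the placeholder sizes are assumptions for illustration, not the paper's or IBM's implementation.

```python
import torch
import torch.nn as nn

class LadderSketch(nn.Module):
    """Hypothetical continued-fraction 'ladder':
        f(x) = w_0(x) + 1 / (w_1(x) + 1 / (... + 1 / w_d(x)))
    where each w_i is a learned linear rung and division is the
    '1/z non-linearity'. Numerators are fixed to 1 here; a generalized
    continued fraction would learn them as well."""

    def __init__(self, in_features: int, depth: int, eps: float = 1e-3):
        super().__init__()
        self.rungs = nn.ModuleList([nn.Linear(in_features, 1) for _ in range(depth)])
        self.eps = eps  # assumed guard that keeps the denominator away from the pole at z = 0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Evaluate bottom-up, the way a continued fraction is collapsed.
        out = self.rungs[-1](x)
        for rung in reversed(self.rungs[:-1]):
            denom = torch.where(out >= 0, out.clamp(min=self.eps), out.clamp(max=-self.eps))
            out = rung(x) + 1.0 / denom
        return out

# Usage: the feature count and depth are arbitrary placeholders.
model = LadderSketch(in_features=16, depth=4)
y = model(torch.randn(8, 16))
print(y.shape)  # torch.Size([8, 1])
```

One ladder produces a single scalar per example; turning this into a classifier (for instance one ladder per class followed by a softmax) is left out of the sketch.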