Bit is all we need: binary normalized neural networks
- #Machine Learning
- #Binary Models
- #Neural Networks
- Introduction of binary normalized neural networks, in which every layer parameter is stored as a single bit (0 or 1).
- Binary normalized layers can be of any type (fully connected, convolutional, attention, etc.), differing only slightly from their conventional counterparts (see the sketch after this list).
- Demonstrated effectiveness on multiclass image classification and on language decoder models, with performance comparable to that of 32-bit models.
- Binary normalized models need roughly 1/32 of the memory of conventional 32-bit models.
- Straightforward to implement on current computers using 1-bit arrays, with no dedicated hardware required (a bit-packing example follows this list).
- Potential for deployment on simple, cheap hardware such as mobile devices or CPU-only machines.
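To make the layer idea concrete, here is a minimal sketch of a binary normalized fully connected layer in PyTorch. The normalize-then-threshold binarization rule and the straight-through estimator used for training are illustrative assumptions, not necessarily the paper's exact formulation, and the class name `BinaryNormalizedLinear` is hypothetical.

```python
import torch
import torch.nn as nn

class BinaryNormalizedLinear(nn.Module):
    """Sketch of a binary normalized fully connected layer (assumed formulation).

    Latent float weights are kept for training; the forward pass normalizes
    them and thresholds to {0, 1}, so only one bit per parameter is needed
    at inference time.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize the latent weights, then binarize to 0/1 by thresholding.
        w = self.weight
        w_norm = (w - w.mean()) / (w.std() + 1e-8)
        w_bin = (w_norm > 0).float()
        # Straight-through estimator (assumption): the forward pass uses the
        # binary weights, gradients flow back to the latent float weights.
        w_ste = w_bin + w_norm - w_norm.detach()
        return nn.functional.linear(x, w_ste, self.bias)
```

The same binarize-in-the-forward-pass pattern would apply to convolutional or attention projection weights, which is why the approach is not tied to one layer type.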
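The 1-bit storage claim can be reproduced with plain NumPy bit packing: each byte holds eight binary parameters, giving the stated 32x reduction relative to float32. This is a generic illustration of 1-bit arrays, not the paper's own implementation.

```python
import numpy as np

# Example binary (0/1) weight matrix; shape chosen arbitrarily for illustration.
weights = (np.random.rand(1024, 1024) > 0.5).astype(np.uint8)

packed = np.packbits(weights)  # 1 bit per parameter in memory / on disk
restored = np.unpackbits(packed)[: weights.size].reshape(weights.shape)
assert np.array_equal(weights, restored)

# Memory ratio of a float32 copy versus the packed 1-bit representation (~32x).
print(weights.astype(np.float32).nbytes / packed.nbytes)
```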