Xortran - A PDP-11 Neural Network With Backpropagation in Fortran IV
11 days ago
- #FORTRAN
- #Retro Computing
- #Neural Networks
- XORTRAN is a multilayer perceptron (MLP) written in FORTRAN IV and executed on a PDP-11/34A via the SIMH simulator.
- It solves the XOR problem using a single hidden layer of 4 neurons with leaky ReLU activation (see the forward-pass sketch after this list).
- Training uses backpropagation with mean squared error (MSE) loss and He-like weight initialization; an output-layer update is sketched below.
- The learning rate is annealed in steps (0.5 → 0.1 → 0.01) and the output neuron uses tanh activation; a skeleton of the epoch loop appears below.
- The code compiles with the 1974 DEC FORTRAN IV compiler and requires 32KB of memory and an FP11 floating-point processor.
- Training the network's 17 parameters takes only a few minutes on real hardware; the SIMH throttle is set to 500K to approximate realistic speed.
- The program prints the MSE loss every 100 epochs and converges to accurate XOR results after a few hundred epochs.
- Example output demonstrates successful approximation of XOR logic.
- Instructions are provided for running it under RT-11, including compilation and execution steps; a plausible session is sketched at the end of this list.
- Project highlights FORTRAN IV's capability for basic neural networks, bridging retro-computing and modern ML.
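
For readers unfamiliar with FORTRAN IV, a forward pass through a 2-4-1 network of the kind described above might look roughly like the sketch below. This is not the project's actual code: the array names (WH, BH, WO, BO, H), the 0.01 leak factor, and the function name FWD are assumptions for illustration.

```fortran
C     SKETCH: FORWARD PASS FOR A 2-4-1 MLP WITH A LEAKY RELU HIDDEN
C     LAYER AND A TANH OUTPUT.  ALL NAMES ARE ILLUSTRATIVE.
C       X(2)          ONE XOR INPUT PATTERN
C       WH(4,2),BH(4) HIDDEN-LAYER WEIGHTS AND BIASES
C       WO(4),BO      OUTPUT-LAYER WEIGHTS AND BIAS
C       H(4)          HIDDEN ACTIVATIONS (RETURNED FOR BACKPROP)
      FUNCTION FWD(X, WH, BH, WO, BO, H)
      DIMENSION X(2), WH(4,2), BH(4), WO(4), H(4)
      DO 10 J = 1, 4
      S = BH(J)
      DO 5 I = 1, 2
    5 S = S + WH(J,I) * X(I)
C     LEAKY RELU: KEEP POSITIVE SUMS, SCALE NEGATIVE SUMS BY 0.01
      H(J) = S
      IF (S .LT. 0.0) H(J) = 0.01 * S
   10 CONTINUE
      S = BO
      DO 20 J = 1, 4
   20 S = S + WO(J) * H(J)
C     TANH SQUASHES THE OUTPUT INTO (-1, 1)
      FWD = TANH(S)
      RETURN
      END
```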
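The backpropagation step for MSE loss reduces to a handful of array updates. The sketch below shows only the output-layer update and the error signal handed back to the hidden layer; the hidden-layer update and the He-like initialization are omitted, and all names are again assumptions rather than the project's identifiers.

```fortran
C     SKETCH: OUTPUT-LAYER UPDATE FOR MSE LOSS AND A TANH OUTPUT.
C       Y      NETWORK OUTPUT,  T  TARGET,  RATE  LEARNING RATE
C       H(4)   HIDDEN ACTIVATIONS FROM THE FORWARD PASS
C       DH(4)  ERROR SIGNAL PASSED BACK TO THE HIDDEN LAYER
      SUBROUTINE BPROP(Y, T, H, WO, BO, RATE, DH)
      DIMENSION H(4), WO(4), DH(4)
C     MSE GRADIENT (Y - T) TIMES THE TANH DERIVATIVE (1 - Y*Y)
      DELTA = (Y - T) * (1.0 - Y * Y)
      DO 10 J = 1, 4
C     RECORD THE ERROR REACHING HIDDEN UNIT J BEFORE WO CHANGES
      DH(J) = DELTA * WO(J)
C     GRADIENT DESCENT STEP ON THE OUTPUT WEIGHTS AND BIAS
      WO(J) = WO(J) - RATE * DELTA * H(J)
   10 CONTINUE
      BO = BO - RATE * DELTA
      RETURN
      END
```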
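The annealing schedule and the per-100-epoch loss report fit naturally into the main epoch loop. The skeleton below illustrates the idea; the 5000-epoch limit, the epoch cutoffs, the logical unit number 7, and the Hollerith FORMAT are illustrative assumptions, not the project's actual values.

```fortran
C     SKETCH: EPOCH LOOP WITH STEPPED LEARNING-RATE ANNEALING AND AN
C     MSE REPORT EVERY 100 EPOCHS.  LIMITS AND UNIT NUMBERS ARE ONLY
C     ILLUSTRATIVE.
      RATE = 0.5
      DO 100 IEP = 1, 5000
C     DROP THE LEARNING RATE AT FIXED EPOCH MILESTONES
      IF (IEP .EQ. 2001) RATE = 0.1
      IF (IEP .EQ. 4001) RATE = 0.01
      ERR = 0.0
C     ... FORWARD PASS AND BPROP OVER THE 4 XOR PATTERNS GO HERE,
C     ... ACCUMULATING SQUARED ERROR IN ERR ...
C     UNIT 7 IS ASSUMED TO BE THE CONSOLE TERMINAL
      IF (MOD(IEP,100) .EQ. 0) WRITE (7,200) IEP, ERR / 4.0
  100 CONTINUE
  200 FORMAT (1X, 6HEPOCH , I5, 6H MSE =, F10.6)
      STOP
      END
```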
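For anyone who wants to try it under SIMH, a typical RT-11 session might look like the transcript below, assuming the source file is named XOR.FOR and the FORTRAN IV compiler is installed; the project's own instructions are authoritative and may differ.

```
.FORTRAN XOR
.LINK XOR
.RUN XOR
```

The leading dot is the RT-11 keyboard monitor prompt: FORTRAN compiles the source to an object file, LINK builds an executable .SAV image, and RUN executes it.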