Bayes, Bits and Brains
13 days ago
- #probability
- #information theory
- #machine learning
- The site uses probability and information theory as a lens for understanding machine learning and the world.
- Opens with riddles to engage readers; the tools needed to solve them are built up over the course of the mini-course.
- The mini-course covers KL divergence, entropy, and cross-entropy, along with the intuitions behind them (the standard definitions are recapped after this list).
- Explores guiding principles of machine learning such as maximum likelihood and maximum entropy.
- Discusses how logits, softmax, and Gaussian distributions show up in machine learning (see the softmax sketch after this list).
- Offers guidance on setting up loss functions and on the role compression plays in LLMs.
- Encourages readers to start with the first chapter to explore these concepts in depth.
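
For reference, the standard definitions of the three central quantities, for a discrete data distribution $p$ and a model distribution $q$ (these are textbook definitions, not quoted from the course):

$$H(p) = -\sum_x p(x)\log p(x), \qquad H(p, q) = -\sum_x p(x)\log q(x),$$

$$D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x)\log\frac{p(x)}{q(x)} = H(p, q) - H(p).$$

Cross-entropy thus decomposes into the entropy of the data plus the KL penalty paid for modeling $p$ with $q$.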
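
A minimal sketch of how logits, softmax, and a cross-entropy loss fit together, in plain NumPy; the function names and example values are illustrative, not taken from the course:

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit for numerical stability; softmax is
    # invariant to this shift after renormalization.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy_loss(logits, target_index):
    # Cross-entropy between a one-hot target and the softmax of the
    # logits: -log of the probability assigned to the true class.
    probs = softmax(logits)
    return -np.log(probs[target_index])

# Example: three classes; the model favors class 0 but the target is class 2.
logits = np.array([2.0, 1.0, 0.1])
print(softmax(logits))                # probabilities summing to 1
print(cross_entropy_loss(logits, 2))  # large loss: confident and wrong
```

Minimizing this loss over a dataset is equivalent to maximum likelihood estimation for the categorical model, which is one of the connections the bullets above allude to.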