Stats Related Posts
An Introduction to Gradient Descent
While working on a personal project to learn about ML, I struggled with implementing multiple-variable gradient descent, so I wrote up a conceptual explanation of the method to make sure I had my head wrapped around it well enough. I referenced several sources on gradient descent, but the most influential was Andrew Ng's Supervised Machine Learning course on Coursera.
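For reference, here is a minimal sketch of the kind of multiple-variable gradient descent that post covers, assuming a NumPy-based linear regression with a mean-squared-error cost; the data below is synthetic and purely illustrative.

    import numpy as np

    def gradient_descent(X, y, alpha=0.01, iterations=1000):
        """Fit weights w and bias b by batch gradient descent on MSE."""
        m, n = X.shape
        w = np.zeros(n)
        b = 0.0
        for _ in range(iterations):
            predictions = X @ w + b      # model output for all examples
            error = predictions - y      # residuals
            grad_w = (X.T @ error) / m   # partial derivatives w.r.t. each weight
            grad_b = error.mean()        # partial derivative w.r.t. the bias
            w -= alpha * grad_w          # step opposite the gradient
            b -= alpha * grad_b
        return w, b

    # Illustrative usage with synthetic data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + 3.0
    w, b = gradient_descent(X, y)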
An Introduction to (some) Bayesian Statistics
As part of my work at MIT, I needed a solid understanding of Bayesian statistics, which I had not had much exposure to before. For that reason, I wrote up this document covering some introductory facets of Bayesian statistics. It's very incomplete, of course, but it's a useful tool for brushing up. Here, I walk through Kruschke's "Doing Bayesian Data Analysis." These are more my notes on the topic than any original lessons.
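As a quick illustration of the sort of Bayesian updating those notes deal with, here is a minimal beta-binomial coin-flip example. It's a standard introductory example, assuming SciPy, and is not taken from Kruschke's book directly.

    from scipy.stats import beta

    prior_a, prior_b = 1, 1   # Beta(1, 1) = uniform prior on the coin's bias theta
    heads, tails = 7, 3       # observed data

    # With a conjugate Beta prior, the posterior is Beta(a + heads, b + tails)
    post_a, post_b = prior_a + heads, prior_b + tails
    posterior = beta(post_a, post_b)

    print("posterior mean:", posterior.mean())                 # ~0.667
    print("95% credible interval:", posterior.interval(0.95))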
Linear Regression and Gradient Descent Applied to NBA Data
A while back I wanted to gain some experience with ML, so I started off with a very simple application of linear regression to NBA data, detailed here.
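For a sense of what that looks like, here is a minimal simple-linear-regression sketch; the stats and numbers below are made up for illustration and are not the actual NBA data from the post.

    import numpy as np

    # Hypothetical data: minutes played per game vs. points per game
    minutes = np.array([12.0, 18.5, 24.0, 30.2, 34.1, 36.8])
    points = np.array([4.1, 7.3, 10.2, 14.8, 18.9, 22.4])

    # Least-squares fit of points ~ slope * minutes + intercept
    slope, intercept = np.polyfit(minutes, points, deg=1)
    predicted = slope * minutes + intercept

    print(f"slope={slope:.3f}, intercept={intercept:.3f}")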