Noah Golmant

In Spring 2017, I was a (U)GSI for CS 189/289A, the machine learning course. Here are the posts I made during that semester.


Mar. 18, 2017 - Alex Francis (another TA) and I wrote an article deriving the univariate Gaussian, along with the Central Limit Theorem and some useful applications. I wrote the short section on multivariate normals, which I think is helpful for understanding why the density function looks the way it does. It also gives intuition for how covariance arises from affine transformations applied to a collection of independent, normally distributed variables. You can access it here.
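To illustrate that last point, here is a small NumPy sketch (my own example, not from the article; the matrix A and shift mu below are arbitrary) showing that if z has independent standard normal entries, then x = Az + mu has covariance AA^T:

```python
import numpy as np

rng = np.random.default_rng(0)

# Many samples of z, each with independent standard normal entries.
z = rng.standard_normal((100_000, 2))

# An arbitrary affine map x = Az + mu.
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
mu = np.array([1.0, -1.0])
x = z @ A.T + mu

# The empirical covariance of x should approximate A @ A.T.
print(np.cov(x, rowvar=False))  # ~ [[4, 2], [2, 2]]
print(A @ A.T)                  #   [[4, 2], [2, 2]]
```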

Mar. 16, 2017 - I've written a short note on the basics of information and entropy for understanding decision trees. You can find it here. I hope it helps explain why we define splits the way we do.
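For concreteness, here is a small NumPy sketch (my own toy example, not taken from the write-up) that computes the entropy of a label set and the information gain of a candidate split, which is the quantity a decision tree maximizes when choosing where to split:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy H(Y) = -sum_c p_c log2(p_c) of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, mask):
    """Reduction in entropy from splitting `labels` by the boolean `mask`."""
    left, right = labels[mask], labels[~mask]
    p_left = len(left) / len(labels)
    children = p_left * entropy(left) + (1 - p_left) * entropy(right)
    return entropy(labels) - children

# Toy example: a split that separates the two classes perfectly
# gains the full 1 bit of entropy.
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
split = np.array([True] * 4 + [False] * 4)
print(information_gain(y, split))  # 1.0
```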

Feb. 6, 2017 - I think this resource by Stanford professor Andrew Ng provides great intuition for the max-margin SVM formulation, presented slightly differently from how we do it here. We start by maximizing some margin of width gamma while constraining the norm of the weights to 1, which we can do if the data is linearly separable. The norm constraint makes this program too hard to solve directly, so we change the constraints, relying on the relation between the functional and geometric margins. We make one final change of variables to rescale the boundary so that the constraint inequality has 1 on its right-hand side.
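In symbols, the sequence of programs looks like this (my transcription of those steps, in the notation of Ng's notes, where hat-gamma denotes the functional margin):

```latex
% Step 1: maximize the geometric margin with the weights normalized.
\max_{\gamma,\, w,\, b} \;\; \gamma
\quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge \gamma, \;\; \|w\| = 1

% Step 2: drop the norm constraint using the fact that the geometric
% margin equals the functional margin \hat{\gamma} divided by \|w\|.
\max_{\hat{\gamma},\, w,\, b} \;\; \frac{\hat{\gamma}}{\|w\|}
\quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge \hat{\gamma}

% Step 3: rescale (w, b) so that \hat{\gamma} = 1; maximizing 1/\|w\|
% is then equivalent to the quadratic program
\min_{w,\, b} \;\; \frac{1}{2}\|w\|^2
\quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge 1
```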

We can now solve this program using standard quadratic programming techniques (for the curious); a common choice is coordinate descent. For the very curious, check out this paper on the version used by libraries like LIBLINEAR.
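To make the coordinate descent idea concrete, here is a minimal sketch of dual coordinate descent for the L1-loss linear SVM, in the spirit of that paper but without the shrinking and caching tricks that make LIBLINEAR fast (and without a bias term; one common fix is to append a constant feature):

```python
import numpy as np

def dcd_linear_svm(X, y, C=1.0, epochs=20, seed=0):
    """Dual coordinate descent for the L1-loss linear SVM.

    Dual problem: min_alpha (1/2) alpha^T Q alpha - 1^T alpha
    subject to 0 <= alpha_i <= C, with Q_ij = y_i y_j x_i^T x_j.
    Maintaining w = sum_i alpha_i y_i x_i keeps each update O(d).
    """
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    q_diag = np.einsum("ij,ij->i", X, X)  # Q_ii = ||x_i||^2
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(n):  # sweep coordinates in random order
            grad_i = y[i] * (w @ X[i]) - 1.0  # dual gradient in coordinate i
            new_alpha = np.clip(alpha[i] - grad_i / q_diag[i], 0.0, C)
            w += (new_alpha - alpha[i]) * y[i] * X[i]
            alpha[i] = new_alpha
    return w

# Toy usage: two separable Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w = dcd_linear_svm(X, y)
print(np.mean(np.sign(X @ w) == y))  # should print 1.0
```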

Finally, for matrix calculus, The Matrix Cookbook is a very useful reference.

Jan. 25, 2017 - Today I mentioned a useful way to find the gradient of a function when dealing with matrices. I made a short write-up on it, along with an example; you can find it here.
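As a taste of that kind of computation (this example is mine, not necessarily the one in the write-up): for f(W) = ||XW - Y||_F^2, the gradient with respect to W is 2 X^T (XW - Y), which is easy to sanity-check with finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
Y = rng.standard_normal((5, 2))
W = rng.standard_normal((3, 2))

# Analytic gradient of f(W) = ||XW - Y||_F^2 with respect to W.
grad = 2 * X.T @ (X @ W - Y)

# Central finite-difference check of a single entry, df/dW[0, 1].
f = lambda M: np.sum((X @ M - Y) ** 2)
eps = 1e-6
E = np.zeros_like(W)
E[0, 1] = eps
print((f(W + E) - f(W - E)) / (2 * eps), grad[0, 1])  # should agree closely
```

The following resources will also be very helpful this semester.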