ML I, Week 5: Homework Questions

  1. How would you use Bayes' rule together with a set of class-conditional density estimators to solve a classification problem?
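(For reference, a minimal sketch of the plug-in approach this question has in mind, assuming 1-D Gaussian density estimators per class; all names and the toy data below are illustrative, not from the sheet:)

```python
import math

def fit_gaussian(xs):
    # Maximum-likelihood mean and variance for one class's data (assumed Gaussian).
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

def gaussian_pdf(x, mu, var):
    # Density estimate p(x | class) under the fitted Gaussian.
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bayes_classify(x, models, priors):
    # Bayes' rule: posterior is proportional to likelihood * prior,
    # so pick the class k maximizing p(x | k) * P(k).
    scores = {k: gaussian_pdf(x, *models[k]) * priors[k] for k in models}
    return max(scores, key=scores.get)

# Toy data: class 0 centered near 0, class 1 near 4.
data = {0: [-0.5, 0.1, 0.4, -0.2], 1: [3.8, 4.1, 4.4, 3.6]}
models = {k: fit_gaussian(v) for k, v in data.items()}
priors = {k: len(v) / sum(len(u) for u in data.values()) for k, v in data.items()}

print(bayes_classify(0.2, models, priors))  # → 0
print(bayes_classify(4.0, models, priors))  # → 1
```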

  2. Why is it a bad idea to solve a two-class classification problem by linear least-squares regression onto the class indicator function? Characterize two solutions that work in terms of the loss functions they minimize.

  3. Verify (by calculating the partial derivative ∂E/∂w) the claim that regression with a logistic sigmoid output and a cross-entropy loss function for a two-class classification problem gives the same gradient descent rule as linear least-squares regression.
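(For reference, the standard setup behind question 3, in commonly used notation; the symbols below are assumptions, not from the sheet. The derivation itself is left to you:)

```latex
y_n = \sigma(w^\top x_n), \qquad \sigma(a) = \frac{1}{1 + e^{-a}},
\qquad
E(w) = -\sum_n \bigl[ t_n \ln y_n + (1 - t_n)\ln(1 - y_n) \bigr],
\quad t_n \in \{0, 1\}.
```

The claim is that ∂E/∂w takes the form Σ_n (y_n − t_n) x_n, which matches the gradient of the sum-of-squares error for the linear model y_n = wᵀx_n.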

due Monday, 25 Nov 2002