Machine Learning and Computational Statistics DS-GA 1003 · Spring 2015 · NYU Center for Data Science

Instructor David Rosenberg
Lecture Wednesdays 5:10pm–7pm, Warren Weaver Hall 109
Lab Thursdays 6:10pm–7pm, Warren Weaver Hall 109
Office Hours Instructor: Thursday 7pm–8pm, Warren Weaver Hall 109
Graders: Monday 2pm–4pm / Tuesday 1pm–2pm, CDS common area

This week

Topics

  • Information Theory
  • EM Algorithm (General)

References

Slides and Notes

Homework 7

Tikhonov, Ivanov, Square Hinge, and GLMs

Due: April 21st, 4pm

About This Course

This course covers a wide variety of topics in machine learning and statistical modeling. While mathematical methods and theoretical aspects will be covered, the primary goal is to provide students with the tools and principles needed to solve both traditional and novel data science problems found in practice. This course will also serve as a foundation on which more specialized courses and further independent study can build.

This is a required course for the Center for Data Science's Master's degree in Data Science, and it is designed for the students in that program. Other interested students who satisfy the prerequisites are welcome to take the class as well. Note that this class is a continuation of DS-GA-1001 Intro to Data Science, which covers some important, fundamental data science topics that may not be explicitly covered in this class (e.g. data cleaning, cross-validation, and sampling bias).

Course details can be found in the syllabus.

This term we will be using Piazza for class discussion. The system is designed to get you help quickly and efficiently from classmates, the lab instructor, graders, and the instructor. Rather than emailing questions to the teaching staff, you are encouraged to post your questions on Piazza. If you have any problems or feedback for the developers, email team@piazza.com. You can also view an anonymized version of our Piazza board without registering.

See the Course Calendar for all schedule information.

For registration information, please contact Varsha Tiger.

Prerequisites

Grading

Problem sets (60%) + Midterm exam (20%) + Project (20%)

Up to 2% extra credit can be earned by answering student questions on Piazza and by making positive contributions to class and lab discussions. There will be additional extra-credit opportunities in the homework assignments in the form of optional problems and competitions.

The course conforms to NYU's policy on academic integrity for students.

Resources

Textbooks

Machine Learning: A Probabilistic Perspective (Kevin P. Murphy)
This will be the required textbook for the class. It was chosen for several reasons: First, it covers an unusually broad set of topics, and it will serve you well as a reference for many topics beyond the scope of this course. Second, it has surprisingly extensive coverage of recent advances in the field, incorporating material that would otherwise only be available in research papers. Finally, if you're a Matlab coder, there is extensive software support.
The Elements of Statistical Learning (Hastie, Tibshirani, and Friedman)
This book is available as a free PDF, and it will serve as a nice complement to Murphy's book. Despite its popularity and the pretty pictures, this is not an easy book. It's written by three statisticians who invented many of the techniques discussed.
Convex Optimization (Boyd and Vandenberghe)
This book (also available as a free PDF) was an instant hit in the machine learning community when it was published in 2004. Most of the optimization problems that we know how to solve are convex problems. We will be making light use of this book, mostly for its coverage of Lagrangians and duality. However, it's a good book to get familiar with: it's very well written, and it covers many techniques used in the more advanced machine learning literature.
An Introduction to Statistical Learning (James, Witten, Hastie, and Tibshirani)
This book (also available as a free PDF) is written by two of the same authors as The Elements of Statistical Learning. It's much less mathematically intense, and it's good as a lighter introduction to the topics.

Other tutorials and references

(If you find additional references that you recommend, please share them on Piazza and we can add them here.)

Software

Lectures

Week 1

Lecture Jan 28 Video

  • Course mechanics
  • Statistical learning theory framework
  • Excess risk decomposition
  • Gradient and stochastic gradient descent (see the sketch below)
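
To make the gradient-descent material concrete, here is a minimal sketch of stochastic gradient descent on the empirical square loss. This is an illustrative example, not code from the course materials; the learning rate, epoch count, and synthetic data are assumptions.

import numpy as np

def sgd_least_squares(X, y, lr=0.01, epochs=20, seed=0):
    # Minimize (1/n) * sum_i (w . x_i - y_i)^2, one randomly ordered example per step.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad_i = 2.0 * (X[i] @ w - y[i]) * X[i]  # gradient of the i-th loss term
            w -= lr * grad_i
    return w

# Usage on synthetic data: the recovered weights should be close to w_true.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(100)
print(sgd_least_squares(X, y))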

References

Lab Jan 29 Video

  • Matrix differentiation (Shanshan Ding)

Slides and Notes

Week 2

Lecture Feb 4 Video

  • Excess Risk Decomposition (cont.)
  • L1/L2 regularization
  • Optimization methods for Lasso (see the sketch below)
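
As a pointer for the Lasso optimization topic, here is a minimal sketch of proximal gradient descent (ISTA), whose key ingredient is the soft-thresholding operator. The step size and default regularization strength are illustrative assumptions, not prescriptions from the course.

import numpy as np

def soft_threshold(v, t):
    # Prox operator of t * ||.||_1: shrink each coordinate toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam=0.1, n_iters=500):
    # Proximal gradient (ISTA) for min_w (1/(2n)) ||Xw - y||^2 + lam * ||w||_1.
    n, d = X.shape
    w = np.zeros(d)
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1/L, with L the Lipschitz constant of the smooth part
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n        # gradient of the smooth (least-squares) term
        w = soft_threshold(w - step * grad, step * lam)
    return w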

References

Lab Feb 5

  • Subgradient descent

Slides and Notes

Week 3

Lecture Feb 11 Video

  • Loss functions
  • Convex Optimization
  • SVM (see the sketch below)
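
Connecting the hinge loss to the optimization material, here is a minimal sketch of stochastic subgradient descent for the L2-regularized SVM objective (a Pegasos-style update). The step-size schedule and default parameters are illustrative assumptions.

import numpy as np

def svm_subgradient(X, y, lam=0.1, epochs=50, seed=0):
    # Minimize (1/n) * sum_i max(0, 1 - y_i w.x_i) + (lam/2) ||w||^2, labels y_i in {-1, +1}.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            step = 1.0 / (lam * t)                # decreasing step size
            hinge_active = y[i] * (X[i] @ w) < 1  # does example i violate the margin?
            g = lam * w - (y[i] * X[i] if hinge_active else 0.0)  # subgradient
            w -= step * g
    return w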

References

  • BV Ch. 1–5
  • HTF 12.2.1–12.2.2

Lab Feb 12 Video

  • Projections
  • SVM (Geometric Motivation)

References

  • HTF 3.2.0, 4.5
  • HTF pp. 417–419

Slides and Notes

Week 4

Lecture Feb 18 Video

  • Features
  • Kernel Methods
  • Kernelized Ridge Regression (see the sketch below)
  • Kernelized SVM
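
As a companion to the kernelized ridge regression topic, here is a minimal sketch with a Gaussian RBF kernel: fit by solving (K + lam * I) alpha = y, then predict with f(x) = sum_i alpha_i k(x_i, x). The kernel bandwidth and regularization values are illustrative assumptions.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # K[i, j] = exp(-gamma * ||a_i - b_j||^2)
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, lam=1.0, gamma=1.0):
    # Solve (K + lam * I) alpha = y for the dual coefficients alpha.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    # f(x) = sum_i alpha_i * k(x_i, x)
    return rbf_kernel(X_new, X_train, gamma) @ alpha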

References

  • KPM 14.2.1–14.2.6
  • KPM 14.3.1–14.4.3
  • HTF 12.3.1
  • KPM 14.5

Lab Feb 19

  • Kernelizations

References

Slides and Notes

Week 5

Lecture Feb 25 Video

  • Trees
  • Bias and Variance

References

  • HTF 9.2
  • KPM 16–16.2.4

Slides and Notes

Lab Feb 26 Video

  • Homework Review

References

Week 6

Lecture Mar 4 Video

  • Bagging
  • Random Forests
  • Boosting

References

  • HTF 8.7
  • HTF Ch. 15
  • HTF Ch. 10
  • KPM 16.4

Lab Mar 5 Video

  • Probability Distribution Review (Shanshan Ding)

References

Slides and Notes

Week 7

Lecture Mar 11 Video

  • Gradient Boosting
  • Neural Networks

Project Adviser Meetings Mar 12

References

Slides and Notes

Week 8

Lecture Mar 25 Video

  • Probabilistic Modeling
  • Conditional Probability Models

References

Lab Mar 26 Video

  • Generalized Linear Models

Slides and Notes

Week 9

Lecture Apr 1 Video

  • Midterm Exam Review

References

Slides and Notes

Lab Apr 2 Video

  • Midterm Exam Review

References

Slides and Notes

Week 10

Midterm Exam Apr 8

References

Slides and Notes

Midterm Recap Apr 9

  • Midterm Recap

References

Slides and Notes

Week 11

Lecture Apr 15 Video

  • Bayesian Networks

References

  • KPM Ch. 10

Slides and Notes

Project Adviser Meetings Apr 16

References

Slides and Notes

Week 12

Lecture Apr 22 Video

  • Bayesian Methods
  • Bayesian Regression

References

  • KPM Ch. 5
  • KPM 7.6

Slides and Notes

Lab Apr 23 Video

  • Beta-Binomial Model (see the sketch below)
  • Naive Bayes
  • Bag-of-Words (Bernoulli Version)
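
As a numeric illustration of the Beta-Binomial model from this lab: with a Beta(a, b) prior on a coin's heads probability and k heads observed in n flips, the posterior is Beta(a + k, b + n - k). The prior and data below are made up purely for illustration.

# Beta-Binomial conjugate update.
a, b = 2.0, 2.0    # illustrative Beta prior pseudo-counts
k, n = 7, 10       # made-up data: 7 heads in 10 flips

a_post, b_post = a + k, b + (n - k)       # posterior: Beta(a + k, b + n - k)
post_mean = a_post / (a_post + b_post)    # posterior mean of the heads probability
mle = k / n                               # maximum-likelihood estimate, for comparison
print(f"posterior Beta({a_post}, {b_post}): mean {post_mean:.3f} vs. MLE {mle:.3f}")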

References

Week 13

Lecture Apr 29 Video

  • k-means Clustering (see the sketch below)
  • Gaussian Mixture Models
  • EM Algorithm (Introduction)
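
To make the k-means topic concrete, here is a minimal sketch of Lloyd's algorithm, alternating the assignment and mean-update steps; the initialization scheme and iteration cap are illustrative assumptions.

import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    # Lloyd's algorithm: alternate nearest-center assignment and mean updates.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # random initial centers
    for _ in range(n_iters):
        # Assignment step: label each point with its nearest center.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its assigned points.
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    return centers, labels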

References

Slides and Notes

Lab (Shanshan) Apr 30 Video

  • SVD, PCA, Linear Discriminant Analysis

References

Slides and Notes

Week 14

Lecture May 6 Video

  • Information Theory
  • EM Algorithm (General)

References

Project Adviser Meetings May 7

References

Slides and Notes

Assignments

Homework Submission: Homework should be submitted through NYU Classes.

Late Policy: Homeworks are due at 4pm on the date specified. They will still be accepted for up to 48 hours after this deadline, but with a 20% penalty.

Collaboration Policy: You may discuss problems with your classmates. However, you must write up the homework solutions and the code from scratch, without referring to notes from your joint session. In your solution to each problem, you must write down the name of each person with whom you discussed it; this will not affect your grade.

Homework 1

Ridge regression and SGD

Due: February 6th, 4pm

Homework 2

Lasso regression

Due: February 13th, 4pm

Homework 3

SVM and Sentiment Analysis

Due: February 23rd, 4pm

Homework 4

Kernel Methods and Lagrangian Duality

Due: March 3rd, 4pm

Homework 5

Trees and Ensemble Methods

Due: March 25th, 4pm

Homework 6

Midterm Review

Due: April 10th, 4pm

Homework 7

Tikhonov, Ivanov, Square Hinge, and GLMs

Due: April 21st, 4pm

People

Instructor


David Rosenberg

David is Chief Scientist of YP Mobile Labs at YP.

Lab Instructor (Mathematical Foundations)


Shanshan Ding

Shanshan is a research data scientist in the YP Mobile Labs group at YP.

Graders

Project Advisers