CS 513 (400/401), Applied Machine Learning (e-campus), Spring 2022

“Equations are just the boring part of mathematics. I attempt to see things in terms of geometry.”
-- Stephen Hawking (1942--2018)

Coordinates [Syllabus] [Canvas] [Registrar] [Slack Channel]
Instructor Liang Huang (liang.huang@oregonstate.edu)
TA & Office Hours Junkun Chen (chenjun2@oregonstate.edu); office hours: W/F 5-6pm on his Zoom.
Ning Dai (dain@oregonstate.edu); office hours: M 5-6pm on the same Zoom.
Instructor office hours are available the weeks before HWs are due (usually 3pm Tuesdays), on the same Zoom.
Prerequisites
  • CS: algorithms and data structures; fluency in at least one mainstream language (Python, C/C++, or Java).
    HWs will be done in Python+numpy only.
  • Math: linear algebra, calculus, and basic probability theory; geometric intuition.

Textbooks
  • Daume. A Course in Machine Learning (CIML). The default reference.
  • Bishop (2006). Pattern Recognition and Machine Learning (PRML). I do not actually recommend it for beginners, but the figures are pretty and I use them in my slides.
Grading
  • Background survey (on Canvas): each student gets 2% for submitting on time.
  • Quizzes (on Canvas, autograded): 10% + 8% = 18%. Everyone has two attempts on each quiz.
  • HWs 1-4 (programming): 20% + 15% + 15% + 15% = 65%.
    In Python+numpy only, on a Unix-like environment (Linux or macOS). Windows is not supported or recommended; IDEs are not necessary either.
  • HW5 (paper review): 15%. Cutting-edge machine learning research.

  • HWs are generally due on Mondays; Quizzes are generally due on Fridays.
  • Late Penalty: each student may submit late by up to 24 hours, once, without penalty. No other late submissions will be accepted.
Communication We use three tools for communication:
  • Course homepage: textbooks, handouts, slides, ipython notebooks and demo programs, lecture videos, homework, and data.
  • Canvas: announcements (you'll receive emails), homework submission, (autograded) quizzes and surveys, and grades.
  • Slack: discussions. Please post all course-related questions on Slack so that the whole class may benefit from our conversation.
Please contact us privately only for matters of a personal nature. As a strictly enforced course policy, we will not reply to any technical questions via email.

Machine learning revolves around a central question: how can we make computers learn from experience without being explicitly programmed? In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, accurate spam filters, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that everybody uses it dozens of times a day without knowing it.

This course will survey the most important algorithms and techniques in the field of machine learning. The treatment of math will be rigorous, but unlike most other machine learning courses, which feature tons of equations, my version focuses on geometric intuitions and the algorithmic perspective. I will try my best to visualize every concept.

Even though machine learning appears "mathy" on the surface, it is not abstract in the way mainstream CS (algorithms, theory, programming languages, etc.) is. In fact, machine learning is so applied and empirical that it is more like alchemy, so we will also discuss practical issues and implementation details.
Some preparatory materials:
Weekly Materials
Week | Topics (CIML References) | Slides/Handouts | Videos | Homework/Exercises/Quizzes
Unit 1 (weeks 1-3): ML intro, k-NN, and math/numpy review
1 1-2. Introduction (§0, §2.7)
3. Training, Test, and Generalization Errors (§2.5--2.6), Underfitting and overfitting (§2.4), and Leave-one-out cross-validation (§5.6).
4. k-nearest neighbor classifier (k-NN) (§3)

5. viewing HW1 data on terminal
6-7. data pre-processing: binarization
slides (topics 1-5)

notebook (python3) (topics 6-7) and toy.txt
background survey (required)
Quiz 1 (ML basics)
HW1 out (k-NN) [tex] [data] [validate.py] [random_output.py]
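To give a feel for what HW1 asks for, here is a minimal k-NN classifier in numpy. This is an illustrative sketch only; the actual HW1 interface, data format, and distance conventions are specified in the assignment handout.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Predict the label of x by majority vote among its k nearest
    training points under Euclidean distance. A sketch, not the
    HW1 reference solution."""
    dists = np.linalg.norm(X_train - x, axis=1)   # broadcasting: (n,d) - (d,)
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority label

# toy example: three points of class 0 near the origin, one of class 1 far away
X = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.]])
y = np.array([0, 0, 0, 1])
print(knn_predict(X, y, np.array([0.2, 0.2]), k=3))  # -> 0
```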
2 Geometric Review of Linear Algebra
Numpy Tutorial (also matplotlib):
1. ipython notebook; ndarray; %pylab; +/-/*, dot, concat
2. linear regression; np.polyfit; np.random.rand(); broadcasting
3. visualizing vectors operations; dot product, projection
notebook (python3) (topics 1-3)

linear algebra:
handout slides

Quiz 2 (numpy/linear algebra)
you should finish at least parts 1-2 of HW1 by week 2.
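The week-2 numpy topics (broadcasting, dot products and projection, `np.polyfit`) can be sketched in a few lines. The variable names below are my own for illustration:

```python
import numpy as np

# broadcasting: subtract a row vector from every row of a matrix
A = np.arange(6.).reshape(3, 2)       # shape (3, 2)
mu = A.mean(axis=0)                   # shape (2,): column means
centered = A - mu                     # (3,2) - (2,) broadcasts row-wise

# dot product and projection of b onto a
a = np.array([3., 4.])
b = np.array([2., 0.])
proj = (a @ b) / (a @ a) * a          # projection of b onto the line through a

# linear regression with np.polyfit (degree-1 least-squares fit)
x = np.array([0., 1., 2., 3.])
y = 2 * x + 1
slope, intercept = np.polyfit(x, y, 1)  # recovers 2 and 1
```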
3 finish HW1! HW1 DUE
Unit 2 (weeks 4-5): Linear Classification and Perceptron Algorithm
4 Linear Classification and Perceptron
1. Historical Overview; Bio-inspired Learning (§4.1)
2. Linear Classifier (§4.3-4.4); Augmented space (not in CIML)
3. Perceptron Algorithm (§4.2)
4. Convergence Proof (§4.5)
5. Limitations and Non-Linear Feature Map (§4.7, §5.4)

6. ipynb demo
slides

notebook (python3) (from [181])
Quiz 3 (perceptron)
HW2 out [tex]
(same data as HW1)
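The vanilla perceptron from topics 2-3 (linear classification in the augmented space) can be sketched as follows. This is a minimal illustration with +1/-1 labels, not the HW2 reference solution:

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """Perceptron in the augmented space: a constant-1 feature folds
    the bias into the weight vector. Labels must be +1/-1."""
    X = np.hstack([np.ones((len(X), 1)), X])  # augment: x -> (1, x)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:            # mistake (or on the boundary)
                w += yi * xi                  # push w toward the correct side
                mistakes += 1
        if mistakes == 0:                     # converged on separable data
            break
    return w

# linearly separable toy data
X = np.array([[2., 2.], [1., 3.], [-2., -1.], [-1., -3.]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
```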
5 Perceptron Extensions; Perceptron in Practice
1. Python demo
2. Perceptron Extensions: voted and averaged (§4.6)
3. MIRA and aggressive MIRA (not in CIML)
4. Practical Issues (§5.1-5.4)
5. Perceptron vs. Logistic Regression (§9.6)
slides

demo
HW2 due
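The averaged perceptron extension (§4.6) keeps a running sum of the weight vector and returns its average, which typically generalizes better than the final weights. A sketch under the same +1/-1 label convention as above, not the HW2 reference solution:

```python
import numpy as np

def averaged_perceptron(X, y, epochs=10):
    """Averaged perceptron: sum the weight vector after every example
    and return the average over all examples seen."""
    X = np.hstack([np.ones((len(X), 1)), X])   # augmented space
    w = np.zeros(X.shape[1])                   # current weights
    w_sum = np.zeros(X.shape[1])               # running sum of weights
    c = 0                                      # examples seen so far
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:
                w += yi * xi
            w_sum += w
            c += 1
    return w_sum / c                           # averaged weight vector

X = np.array([[2., 2.], [1., 3.], [-2., -1.], [-1., -3.]])
y = np.array([1, 1, -1, -1])
w_avg = averaged_perceptron(X, y)
```

(A more efficient "lazy" update avoids summing on every example; this naive version is easier to read.)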
Unit 3 (weeks 6-7): Linear and Polynomial Regression
6-7 linear and polynomial regression (mostly not in CIML, but mentioned in §7.6) HW3 (housing price prediction)
kaggle data tex
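Polynomial regression in numpy amounts to a least-squares fit over polynomial features, which `np.polyfit` handles directly. A sketch of the idea behind HW3; the housing data, features, and evaluation are specified in the assignment:

```python
import numpy as np

# fit a degree-2 polynomial to noisy quadratic data
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 1.5 * x**2 - 2 * x + 0.5 + 0.1 * rng.standard_normal(50)

coeffs = np.polyfit(x, y, deg=2)        # [a, b, c] for a*x^2 + b*x + c
y_hat = np.polyval(coeffs, x)           # predictions on the training x
mse = np.mean((y - y_hat) ** 2)         # training error
```

Raising `deg` far beyond the true degree is an easy way to see the overfitting behavior from §2.4 in action.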
Unit 4 (weeks 8-9): Applications: Text Classification
8-9 Application: Text Classification
Sentiment Analysis (thumbs up?)
HW4 [tex]
data and code
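Text classification with a linear model starts from turning each document into a feature vector. Below is a minimal binary bag-of-words featurizer; the documents, variable names, and vocabulary handling are hypothetical, and the actual HW4 pipeline may differ:

```python
import numpy as np

docs = ["thumbs up great movie", "thumbs down terrible movie"]
labels = np.array([1, -1])              # +1 = positive, -1 = negative

# build the vocabulary from the training documents
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

def featurize(doc):
    """Binary bag-of-words vector: 1 if the word occurs, else 0.
    Unknown words are silently ignored."""
    x = np.zeros(len(vocab))
    for w in doc.split():
        if w in index:
            x[index[w]] = 1.0
    return x

X = np.array([featurize(d) for d in docs])  # shape (n_docs, vocab_size)
```

These vectors feed directly into the perceptron from Unit 2.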
Unit 5 (week 10): Exposure to cutting-edge ML research
10 Paper Review: cutting-edge ML topics and papers HW5 (paper review)

Classical Papers:
Tech Giants Are Paying Huge Salaries for Scarce A.I. Talent (New York Times)