“Equations are just the boring part of mathematics. I attempt to see things in terms of geometry.”
-- Stephen Hawking (1942--2018)
Coordinates | [Canvas] [Registrar] [Ed Discussion Forum] |
Instructor | Liang Huang (liang.huang@oregonstate.edu); office hours: Thu 3-4pm on this Zoom. |
TA | Ning Dai (dain@oregonstate.edu); office hours: M/F 4-5pm on the same Zoom. |
Prerequisites | |
Textbooks | |
Grading | |
Unit 1 (weeks 1-3): ML intro, k-NN, and math/numpy review | |
---|---|
1.0 | Introduction |
1.1 | Machine Learning Settings |
1.2 | Basic Machine Learning Concepts |
1.3 | Nearest Neighbor Classifier |
1.4 | Linear Algebra and Numpy Tutorials |
HW1 | k-NN for income classification [pdf] [data] |
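For orientation, here is a minimal numpy sketch of the Unit 1 / HW1 idea: k-nearest-neighbor classification by Euclidean distance with a majority vote. It is only an illustration, not the HW1 reference solution; the toy feature matrix, labels, and choice of k are made up and are not the HW1 income data.

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    """Predict the label of x_query by majority vote among its k nearest
    training points under Euclidean distance (Unit 1.3)."""
    dists = np.linalg.norm(X_train - x_query, axis=1)   # distance to every training point
    nearest = np.argsort(dists)[:k]                      # indices of the k closest points
    votes = y_train[nearest]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]                     # majority label

# Toy example (made-up data, not the HW1 income dataset):
X_train = np.array([[1.0, 2.0], [2.0, 1.0], [8.0, 9.0], [9.0, 8.0]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([7.5, 8.5]), k=3))  # -> 1
```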
Unit 2 (weeks 4-5): linear classification and perceptron | |
---|---|
2.1 | History of Perceptron |
2.2 | Linear Classification |
2.3 | The Perceptron Algorithm |
2.4 | Convergence Theorem and Proof |
2.5 | Inseparable Cases and Feature Engineering |
2.6 | Voted and Averaged Perceptrons |
HW2 | perceptron for sentiment [pdf] [data] |
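As a rough companion to Unit 2, here is a minimal vanilla-perceptron sketch (not the voted/averaged variants of 2.6, and not the HW2 reference solution); the toy separable data below is made up, not the sentiment dataset.

```python
import numpy as np

def perceptron_train(X, y, epochs=10):
    """Vanilla perceptron (Unit 2.3): labels in {-1, +1}; the bias is folded
    in as an extra constant-1 feature. Update w only on mistakes."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])   # augment with bias feature
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:          # mistake (or on the boundary)
                w += y_i * x_i                     # perceptron update
    return w

def perceptron_predict(w, X):
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.where(X @ w >= 0, 1, -1)

# Toy linearly separable data (made up, not the HW2 sentiment set):
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
print(perceptron_predict(w, X))   # recovers y on separable data
```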
Unit 3 (weeks 6-7): linear and polynomial regression | |
---|---|
3.1 | Linear Regression |
3.2 | Regularization |
3.3 | Gradient Descent |
3.4 | Normal Equation |
3.5 | Nonlinear Regression |
HW3 | regression for housing price prediction [pdf] [data] |
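For Unit 3, here is a small sketch contrasting the two fitting routes covered in 3.3-3.4: the closed-form least-squares solution and batch gradient descent (with an optional L2 penalty in the spirit of 3.2). The one-feature toy data is made up, not the HW3 housing dataset.

```python
import numpy as np

def fit_normal_equation(X, y):
    """Closed-form least squares (Unit 3.4): solve min_w ||Xw - y||^2 with a
    bias column appended; lstsq is used instead of an explicit inverse."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def fit_gradient_descent(X, y, lr=0.05, epochs=5000, lam=0.0):
    """Batch gradient descent (Unit 3.3) on 1/(2n)*||Xw - y||^2 + (lam/2)*||w||^2."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    n = len(y)
    for _ in range(epochs):
        grad = Xb.T @ (Xb @ w - y) / n + lam * w   # gradient of the regularized loss
        w -= lr * grad
    return w

# Toy data (made up, not HW3 housing prices): y is roughly 3*x + 2
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([2.0, 5.1, 7.9, 11.0])
print(fit_normal_equation(X, y))      # roughly [3, 2]
print(fit_gradient_descent(X, y))     # converges to nearly the same weights
```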
Unit 4 (weeks 8-9): a taste of deep learning | |
---|---|
4.1 | Multilayer Neural Networks |
4.2 | Word Embeddings |
HW4 | redo HW2 with word embeddings [pdf] (HW2 data + embeddings) |
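One common way to "redo HW2 with word embeddings" is to average pretrained word vectors into a dense sentence feature and feed that to the Unit 2 perceptron in place of sparse bag-of-words features. The sketch below illustrates only that averaging step; the tiny embedding table is made up, whereas the actual HW4 would load the provided pretrained embeddings.

```python
import numpy as np

def sentence_vector(words, embeddings, dim=50):
    """Represent a sentence as the mean of its known words' embedding vectors;
    out-of-vocabulary words are skipped. `embeddings` maps word -> np.ndarray."""
    vecs = [embeddings[w] for w in words if w in embeddings]
    if not vecs:
        return np.zeros(dim)          # no known word: fall back to the zero vector
    return np.mean(vecs, axis=0)

# Tiny made-up embedding table (HW4 would load the provided pretrained vectors):
emb = {"good": np.array([0.9, 0.1]),
       "bad": np.array([-0.8, 0.2]),
       "movie": np.array([0.0, 0.5])}
x = sentence_vector("a good movie".split(), emb, dim=2)
print(x)   # average of the vectors for "good" and "movie"
```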