CS 534, Machine Learning, Fall 2017
“Equations are just the boring part of mathematics. I attempt to see things in terms of geometry.”
-- Stephen Hawking (b. 1942)
T/Th 12-1:20pm, WNGR 149
LH: TBD, KEC 2069
TAs: Dezhong T/Th 10-11am, Yilin W/F 4-5pm, both at KEC Atrium.
Additional office hours available before exams.
- CS: algorithms and data structures; fluency in at least one mainstream language (Python, C/C++, Java).
HWs will be done in Python+numpy only.
- Math: linear algebra, calculus, and basic probability theory; a good sense of geometric intuition.
- Hal Daume III. A Course in Machine Learning (CIML). The default reference; easy to understand.
- Tom Mitchell (1997). Machine Learning. A classical textbook and an easy read;
outdated, but still more helpful than most recent ones.
- Mohri et al. (2012). Foundations of Machine Learning. A theory perspective; covers more recent advances, such as SVMs, that are not in Mitchell.
- Bishop (2007). Pattern Recognition and Machine Learning (PRML). I actually do not recommend it, and definitely not for beginners, but the figures are pretty and I use them in my slides.
- Midterm: 25%. NO FINAL EXAM.
- Project (groups of up to 3): 25% (5% proposal, 5% presentation, 15% report).
No late submissions allowed.
- HWs (programming, groups of up to 3): 10% x 3 = 30%.
- EXs (theoretical, individual): 3% x 2 = 6%.
- Class Participation: 6%.
- Quiz (tentatively before Thanksgiving): 8%.
- Late Penalty: Each student may submit late (by up to 24 hours) only once, without penalty;
no other late submissions will be accepted. If a group submission is late,
it counts as late for all teammates.
E.g., if a team of A and B submits late, and it is the first late submission for A
but the second for B,
then A will receive credit for this submission but B will not.
Machine learning revolves around the following central question:
How can we make computers act without being explicitly programmed, and improve with experience?
In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, accurate spam filters, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it.
This course will survey the most important algorithms and techniques
in the field of machine learning.
The treatment of the math will be rigorous,
but unlike most other machine learning courses,
which bury you in equations,
this version will focus on geometric intuitions and the algorithmic perspective.
I will try my best to visualize every concept.
The overall structure is quite similar to previous offerings by Prof. Xiaoli Fern.
Some new aspects in this offering include:
- It covers fewer topics, and goes deeper on some (e.g., structured prediction).
- HWs and the project must be done in Python.
- It will have in-class demos (ipynb).
You can study the exams from previous offerings, but do not copy HW solutions
(since we have different HWs).
See also my previous offering of this course at CUNY.
Some preparatory materials:
- Linear Algebra:
- Probability Theory:
- Python+numpy Tutorial:
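As a quick self-check on the Python+numpy prerequisite, here is a minimal sketch of the kind of vectorized code the HWs will expect: a basic perceptron training loop on hypothetical toy data (illustrative only, not part of any assignment).

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """Basic perceptron: X is (n, d), y has labels in {-1, +1}.
    Illustrative sketch; the actual HWs will specify their own interface."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:  # mistake (or on boundary): update
                w += y_i * x_i
    return w

# Hypothetical linearly separable data, with a constant bias feature appended.
X = np.array([[1., 2., 1.],
              [2., 3., 1.],
              [-1., -2., 1.],
              [-2., -1., 1.]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
print(np.sign(X @ w))  # → [ 1.  1. -1. -1.], matching y
```

If this loop (and the vectorized `X @ w` prediction) reads naturally to you, you are in good shape for the programming HWs.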
week  topic           HW/EX due
2     logistic        ex1 (perceptron theory)
3     multi, linear   hw1 (perceptron, logistic)
5     SVM             hw2 (linear, multiclass, SVM)
6     midterm         ex2 (kernel theory, SVM/KKT theory)
7     struct predict  project proposal
8     kmeans, EM      hw3 (struct perc)
Slides and Assignments: