Bishop (2007). Pattern Recognition and Machine Learning (PRML). I actually do not recommend it for beginners, but the figures are pretty and I use them in my slides.
EXs (theory, concepts): 5% x 3 = 15%. Due on Saturdays. Graded by completeness, not correctness.
Quizzes (on Canvas): 5% x 3 = 15%. Due on Saturdays. Everybody has two attempts.
HWs (programming): 15% x 3 = 45%. Due on Tuesdays. Graded by correctness, including accuracy of predictions.
In Python+numpy only,
on a Unix-like environment (Linux or Mac OS X).
Windows is not supported. IDEs are neither necessary nor recommended.
Midterm: 20% (on Canvas). NO FINAL EXAM.
Class Participation: 5%.
Late Penalty: Each student may submit late (by up to 24 hours) only once without penalty. No other late submissions will be accepted.
Curve: A/A-: ~45%; B+/B/B-: ~50%; C+ and below: ~5%.
Please post all course-related questions on Canvas so that the whole class may benefit from our conversation.
Please contact us privately only for matters of a personal nature
(by default, please cc all TAs unless you want to complain about a TA).
As a course policy we will not reply to any technical questions via email.
Machine Learning revolves around the following central question:
How can we make computers act without being explicitly programmed, and improve with experience?
In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, accurate spam filters, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it.
This course will survey the most important algorithms and techniques
in the field of machine learning.
The treatment of math will be rigorous,
but unlike most other machine learning courses,
which bury the ideas under tons of equations,
my version will focus on geometric intuitions and the algorithmic perspective.
I will try my best to visualize every concept.
Even though machine learning appears to be "mathy" on the surface,
it is not abstract at all, unlike mainstream CS (algorithms, theory, programming languages, etc.).
In fact, machine learning is so applied and empirical
that it is more like "alchemy".
So we will also discuss practical issues.
Some preparatory materials:
Perceptron Extensions; Perceptron in Practice
1. Python demo
2. Perceptron Extensions: voted and averaged (4.6)
3. MIRA and aggressive MIRA (not in CIML)
4. Practical Issues (5.1, 5.2, 5.3, 5.4)
5. Perceptron vs. Logistic Regression (9.6)
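To give a flavor of item 2, here is a minimal NumPy sketch of the averaged perceptron (in the spirit of CIML 4.6). This is an illustration only, not the course's reference implementation; the data format and function name are my own assumptions.

```python
import numpy as np

def averaged_perceptron(X, y, epochs=5):
    """Averaged perceptron sketch.
    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Returns the average of the weight vector over all steps,
    which is typically more stable than the final weights."""
    n, d = X.shape
    w = np.zeros(d)        # current weight vector
    w_sum = np.zeros(d)    # running sum of weights, for averaging
    steps = 0
    for _ in range(epochs):
        for i in range(n):
            if y[i] * (w @ X[i]) <= 0:   # mistake: perceptron update
                w = w + y[i] * X[i]
            w_sum += w                    # accumulate after every example
            steps += 1
    return w_sum / steps

# Toy usage on linearly separable 2-D data.
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w_avg = averaged_perceptron(X, y)
preds = np.sign(X @ w_avg)
```

(A real implementation would use the lazy-update trick from CIML to avoid summing the weights at every step; this version trades efficiency for clarity.)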
Aizerman et al. (1964). Theoretical foundations of the potential function method in pattern recognition learning. (Translated from Russian, in the same journal; the origin of kernels and the kernelized perceptron.)