Bishop (2007). Pattern Recognition and Machine Learning (PRML). I do not actually recommend it for beginners, but the figures are pretty and I use them in my slides.
Background survey (on Canvas):
Each student gets 2% by submitting on time.
Quizzes (on Canvas): 10% + 5% = 15%. Everybody has two attempts.
EX (theory, concepts): 8%. Graded by completeness, not correctness.
HWs (programming): 15% x 5 = 75%. Graded by correctness.
In Python+numpy only,
on a Unix-like environment (Linux or Mac OS X).
Windows is not supported. IDEs are neither necessary nor recommended.
Late Penalty: Each student may be late by up to 24 hours, only once, without penalty. No other late submissions will be accepted.
Curve: A/A-: ~45%; B+/B/B-: ~50%; C+ and below: ~5%.
This course is organized into 5 Units,
each spanning 2 weeks with 1 programming HW,
usually released on a Monday and due on the Saturday of the following week.
See syllabus for more details.
We post the textbook, slides, videos, homework, data, and readings here on this homepage.
Canvas is only used for announcements, discussions, homework submission, quizzes, and grades.
Please post all course-related questions on Canvas so that the whole class may benefit from our conversation.
Please post questions on Canvas under the corresponding Units (e.g. "Unit 1 Q/A").
Please contact us privately only for matters of a personal nature.
As a strictly enforced course policy, we will not reply to any technical questions via email.
Machine learning revolves around a central question:
How can we make computers
learn from experience without being explicitly programmed?
In the past decade, machine learning has given us
practical speech recognition,
effective web search,
accurate spam filters,
and a vastly improved understanding of the human genome.
Machine learning is so pervasive today that everybody uses it
dozens of times a day without knowing it.
This course will survey the most important algorithms and techniques
in the field of machine learning.
The treatment of math will be rigorous,
but unlike most other machine learning courses,
which rely on tons of equations,
my version will focus on the geometric intuitions
and the algorithmic perspective.
I will try my best to visualize every concept.
Even though machine learning appears to be "mathy" on the surface,
it is not abstract in any sense,
unlike mainstream CS (algorithms, theory, programming languages, etc.).
In fact, machine learning is so applied and empirical
that it is more like alchemy.
So we will also discuss practical issues.
Some preparatory materials:
Perceptron Extensions; Perceptron in Practice
1. Python demo
2. Perceptron Extensions: voted and averaged (4.6)
3. MIRA and aggressive MIRA (not in CIML)
4. Practical Issues (5.1, 5.2, 5.3, 5.4)
5. Perceptron vs. Logistic Regression (9.6)
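As a taste of item 2 above, the averaged perceptron (CIML 4.6) can be sketched in a few lines of numpy. This is a minimal illustration under my own made-up toy data and function names, not the course's reference implementation:

```python
import numpy as np

def averaged_perceptron(X, y, epochs=10):
    """Averaged perceptron: return the running average of all weight
    vectors seen during training, rather than only the final one."""
    n, d = X.shape
    w = np.zeros(d)       # current weight vector
    w_sum = np.zeros(d)   # running sum of weight vectors
    count = 0
    for _ in range(epochs):
        for i in range(n):
            if y[i] * np.dot(w, X[i]) <= 0:  # mistake-driven update
                w = w + y[i] * X[i]
            w_sum += w
            count += 1
    return w_sum / count

# toy linearly separable data; the bias is folded in as a constant feature
X = np.array([[1., 2., 1.], [2., 1., 1.], [-1., -2., 1.], [-2., -1., 1.]])
y = np.array([1, 1, -1, -1])
w_avg = averaged_perceptron(X, y)
preds = np.sign(X @ w_avg)  # agrees with y on this toy set
```

Averaging damps the "thrashing" of the vanilla perceptron on noisy data while costing only one extra accumulator, which is why it is the variant used in practice.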
Aizerman et al. (1964). Theoretical foundations of the potential function method in pattern recognition learning. (Translated from Russian, in the same journal; origin of kernels and the kernelized perceptron.)