Machine Learning, EX1
TO BE DONE INDIVIDUALLY. 4%.
--------------------------------------
Due Wed 10/4 at 11:59pm on Canvas.
Only .txt or .pdf files are accepted.
LaTeXing is recommended but not required.
This EX prepares for HW1.
--------------------------------------
1. Prove that the perceptron converges regardless of the initial weight vector and for any constant learning rate.

2. Work out the MIRA update formula *geometrically* for the negative-example case.

3. (optional) Come up with a variable learning rate scheme and prove convergence.

4. For real-valued features, we often transform each feature to be zero-mean and unit-variance. Geometrically, why does this help learning?

5. Suppose you are building a spam detector with a vocabulary of size V. Each email has a maximum length of n, and there are D training emails. You will run the perceptron over the dataset T times, making U updates in total (U<
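
For reference while working on Question 1, here is a minimal sketch of the vanilla perceptron with a constant learning rate and an arbitrary initial weight vector (the function and parameter names are our own, not part of the exercise); the proof should not depend on `lr` or `w0`:

```python
import numpy as np

def perceptron_train(X, y, w0=None, lr=1.0, max_epochs=100):
    """Vanilla perceptron: update w whenever an example is misclassified.

    lr is the constant learning rate from Question 1; w0 is an arbitrary
    initial weight vector. Labels y are assumed to be +1/-1.
    """
    n, d = X.shape
    w = np.zeros(d) if w0 is None else w0.astype(float).copy()
    for _ in range(max_epochs):
        updates = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:  # mistake: on or past the boundary
                w += lr * yi * xi        # move w toward the misclassified example
                updates += 1
        if updates == 0:                 # no mistakes in a full pass: converged
            break
    return w
```

On linearly separable data this loop terminates with a separating w; the exercise asks you to prove that this happens for any `lr > 0` and any `w0`.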
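
For Question 4, the transformation itself (not the geometric argument, which is what you must supply) can be sketched as follows; the helper name is ours:

```python
import numpy as np

def standardize(X):
    """Make each feature (column) zero-mean and unit-variance.

    Subtracting the per-feature mean centers the data at the origin;
    dividing by the per-feature standard deviation puts all axes on a
    comparable scale.
    """
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # guard: leave constant features unscaled
    return (X - mu) / sigma
```

Your answer should explain geometrically why learning on the standardized data is easier than on the raw data.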