EECS 545: Machine Learning
University of Michigan, Fall 2015

Instructor: Clayton Scott (clayscot)
Classroom: GG Brown 1571
Time: MW 10:30-12:00
Office: 4433 EECS
Office hours: Monday 1-4 PM or by appointment
GSI: Efren Cruz ([email protected])
GSI office hours: Tuesday 12-3, room EECS 2420, or by appointment

Required text: None.

Recommended texts (on reserve at the Art, Architecture, and Engineering Library):
Hastie, Tibshirani, and Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer, Second Edition, pdf available for download. This book is also available online to UM users, who may purchase a $25 print version by clicking on the "buy now" link from the online book.
Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.
Mohri, Rostamizadeh, and Talwalkar, Foundations of Machine Learning, MIT Press, 2012, available online to UM users.
Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
Duda, Hart, and Stork, Pattern Classification, Wiley, 2001, available online to UM users.
Sutton and Barto, Reinforcement Learning: An Introduction, MIT Press, 1998, available online to UM users.

To access the books available online through the library, follow one of the links above, and then click the words "available online," which are not highlighted.

Additional references:
Scholkopf and Smola, Learning with Kernels, MIT Press, 2002, available online to UM users.
Mardia, Kent, and Bibby, Multivariate Analysis, Academic Press, 1979 (good for PCA, MDS, and factor analysis).
Boyd and Vandenberghe, Convex Optimization, Cambridge University Press, 2004, pdf available for download, and also available online to UM users (only 7 users at a time).
Shalev-Shwartz and Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.

Prerequisites: (The formal prerequisite is currently listed as EECS 492, Artificial Intelligence, but this is inaccurate.)
Probability: jointly distributed random variables, multivariate densities and mass functions, expectation, independence, conditional distributions, Bayes rule, the multivariate normal distribution
Linear algebra: rank, nullity, linear independence, inner products, orthogonality, positive (semi-)definite matrices, eigenvalue decompositions
Multivariable calculus: partial derivatives, gradients, chain rule
It is expected that students will have a good working knowledge of these topics. Students with most but not all of this background should be able to catch up during the semester with some additional effort.

Topics:
Statistical machine learning
Unconstrained optimization
Bayes classifiers
Linear discriminant analysis
Naive Bayes
Logistic regression
Separating hyperplanes
Linear regression
Empirical risk minimization
Kernels
Kernel ridge regression
Constrained optimization
Support vector machines
Model selection
Principal component analysis
PCA, the SVD, and the generalized Rayleigh quotient
k-means
Expectation maximization and Gaussian mixture models
Kernel density estimation
Spectral clustering
Multidimensional scaling
Nonlinear dimensionality reduction
Feature selection
The alternating direction method of multipliers
Reinforcement learning
Markov decision processes
Optimal planning
Learning policies from experience
Decision trees
Ensemble methods
Boosting
Neural networks
Reproducing kernel Hilbert spaces
Learning theory
Additional topics

Grading:
Homework: 45%
Midterm exam: 30%. Thursday, Nov. 19, 6-9 PM, location TBA.
Final project: 25%.
Project due Thurs. Dec. 17 at 5 PM; reviews due Mon. Dec. 21 at 12 noon.

Homeworks: Homework will be assigned weekly. Applications will be developed through MATLAB programming exercises, including face recognition, spam filtering, handwritten digit recognition, image compression, and image segmentation. Most assignments will involve some computer programming. MATLAB will serve as the official programming language of the course. I will sometimes provide you with data, fragments of code, or suggested commands in MATLAB (see the short example sketch at the end of this page).

Exam: You may use three cheat sheets (front and back); no other materials are allowed. Please notify me during the first week of class if you have a conflict.

Final Project: There will be a final project, and groups will be allowed. The project must explore a methodology or application not covered in the lectures. You will be asked to select a paper on a methodology not covered in class and implement the method. Because of the expected large enrollment, students will assist in grading by reviewing and evaluating other projects in the class.

Collaboration on homeworks: Each student will prepare the final write-up/coding of his or her homework solutions without reference to any other person or source, aside from the student's own notes or scrap work. Students may consult classmates for the purpose of brainstorming, but not for obtaining the details of solutions. Under no circumstances may you copy solutions or code from a classmate or other source.

Computer use in class: Please refrain from using computers or personal electronic devices during class, as these are distracting to me and your classmates. If you wish to use a laptop or tablet to take notes during class, please consult me first for permission.

Honor Code: All undergraduate and graduate students are expected to abide by the College of Engineering Honor Code as stated in the Student Handbook and the Honor Code Pamphlet.

Students with Disabilities: Any student with a documented disability needing academic adjustments or accommodations is requested to speak with me during the first two weeks of class. All discussions will remain confidential.
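
Example MATLAB sketch: as a minimal illustration of the kind of exercise described under "Homeworks," the lines below compress a grayscale image with a rank-k SVD approximation (the SVD and image compression are both listed topics). The file name 'face.png' and the choice k = 20 are placeholders for illustration, not part of any actual assignment.

    % Illustrative sketch: low-rank image compression via the SVD.
    % Assumes a grayscale image file 'face.png' (hypothetical) is on the MATLAB path.
    A = double(imread('face.png')) / 255;     % load image and scale pixel values to [0,1]
    [U, S, V] = svd(A, 'econ');               % thin SVD of the image matrix
    k = 20;                                   % number of singular values to keep (illustrative choice)
    Ak = U(:,1:k) * S(1:k,1:k) * V(:,1:k)';   % best rank-k approximation in the Frobenius norm
    subplot(1,2,1); imagesc(A);  colormap gray; axis image; title('Original');
    subplot(1,2,2); imagesc(Ak); colormap gray; axis image; title(sprintf('Rank-%d', k));
    fprintf('Relative Frobenius error: %.3f\n', norm(A - Ak, 'fro') / norm(A, 'fro'));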
