EE787: Fundamentals of Machine Learning
Announcements
The class will not meet on Thursday, Nov 8th.
Homework #3 was posted.
The class will not meet on Thursday, Oct 18th. You may submit your homework on the following Tuesday.
Homework #2 was posted.
Welcome to EE787: Fundamentals of machine learning.
Course Info.
Course description
Fundamental concepts and theories in machine learning; supervised and unsupervised learning; regression and classification; loss function selection and its effect on learning; regularization and robustness to outliers; numerical experiments on data from a wide variety of engineering and other disciplines.
Lectures
Office hours
Prerequisites
Previous exposure to linear algebra, probability, and programming.
Working knowledge of optimization is a plus.
Reference textbooks
Grading policy
Lecture notes
The course material is adapted from EE104: Introduction to Machine Learning by Sanjay Lall and Stephen Boyd at Stanford University, with their kind permission.
Course overview
Supervised learning via empirical risk minimization
Least squares linear regression
Validation
Features
Regularization
House prices example
Non-quadratic losses
Non-quadratic regularizers
Optimization
Prox-gradient method
Boolean classification
Multi-class classification
Neural networks
Unsupervised learning
Principal component analysis
Assignments
Several homework sets will be assigned over the course of the semester.
You are encouraged to work in groups; however, everyone should turn in their own work.
Homework 1 (due 10/4): nearest_neighbor_data.json, fitting_outliers.json, all_pairs_data.json
Homework 2 (due 10/18): inductor_data.json, rational_fit_data.json, prostate_cancer_data.json, to_one_hot.jl
Homework 3 (due 11/15): power_demand_data.json
Homework 4 (due 11/27): tomodata_fullysampled.json, tomodata_undersampled.json, line_pixel_length.jl, TV_inpainting.ipynb
Homework 5 (optional): homework_scores.csv
Julia
Julia language
We will be using Julia, which excels at high-performance technical computing, for homework assignments.
You are not expected to have a strong background in programming (with Julia or otherwise), because the programs you will write use only a tiny subset of Julia's (many and powerful) features.
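As a taste of the kind of code involved — a minimal sketch, not part of any assignment — here is least squares linear regression, one of the first topics in the course, written in a few lines of Julia using the backslash operator:

```julia
using LinearAlgebra, Random

Random.seed!(0)

# Synthetic data: 100 samples with 3 features, plus a little noise.
X = randn(100, 3)
theta_true = [1.0, -2.0, 0.5]
y = X * theta_true + 0.1 * randn(100)

# The backslash operator solves the least squares problem
# minimize ||X * theta - y||^2 in one line.
theta_hat = X \ y

println("estimation error: ", norm(theta_hat - theta_true))
```

Most of the homework notebooks stay at roughly this level of language complexity.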
Reference webpages
Files
These are data files and Julia code (.ipynb notebook files) used for lectures and homework assignments.
To run an .ipynb file, first download it by right-clicking “code” and selecting “Save link as…”.
Then upload the file to JuliaBox using the “upload” button under the “Jupyter” tab.
Julia example: KHU logo was a guitar pick. (khulogo.jpg, khu_logo_compression.ipynb)
Julia example: Huber-based dynamical system estimation. (huber-based_dynamical_system_estimation.ipynb)
JHK is safe from diabetes. (diabetes.data, diabetes.ipynb)
Polynomial fit. (polynomial_fit.ipynb)
Monomial-exponential fit. (monomial-exponential_fit.ipynb)
.json file read example. (readclassjson.jl)
Different norms. (Different_norms.ipynb)
Sound signal recovery. (Sound_signal_recovery.ipynb)
TV_inpainting. (neo_bw.png, neo_bw_corrupted.png, corrupted_image.json, TV_inpainting.ipynb)
Network topology identification. (Network_topology.ipynb)
Least squares classifier, logistic regression, and support vector machine. (ls_lr_svm.ipynb)
Iris classification. (iris.csv, iris.ipynb)
Handwriting image classification. (mnist_train.csv, mnist_test.csv, handwriting.ipynb)