# MECH-8290-43

Introduction to Machine Learning using Python

## Course Description

Broad introduction to machine learning, data mining, and statistical pattern recognition. The course aims to familiarize students with the main topics in the field, explain the underlying concepts, and apply them in numerous case studies. Topics covered include (i) supervised learning (parametric/non-parametric algorithms, support vector machines, kernels, neural networks) and (ii) unsupervised learning (clustering, dimensionality reduction).

## Class and lab information

Lecture: 3 hours/week

Additional, approximate study hours: 5 hours/week, as the student requires

Estimated division of learning hours:

- Lecture: 20%
- Hands-on labs and activities: 10%
- Class discussion (during lecture time): 10%
- Individual work (including homework, project, and tutorials): 50%
- Group work: N/A

Credit weight: 3.0

Course format: face-to-face content summaries, examples, and questions, supplemented by online and textbook-based readings, videos, tutorials, etc.

## Pre-requisites

You should understand basic probability and statistics, algebra, and calculus.

You should also have some background in programming; familiarity with MATLAB or Python is helpful. Python is used for most of the course, and some tutorials will be done in class. Your project code must be written in Python.
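As a rough gauge of the Python level assumed, here is a minimal sketch (the data is synthetic and the example is illustrative, not course material) that fits a least-squares line with NumPy — comfort reading code like this is sufficient:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.5, size=x.size)

# Design matrix with a bias column, solved via least squares
X = np.column_stack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = coef
print(slope, intercept)  # roughly 2 and 1
```

Fitting a line this way is also where the course itself begins (linear regression in week 01).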

## Resources

Course Blackboard site: available via student Blackboard login

Interactive in-class activities via tools discussed at the beginning of the course.

There should be no costs to students for using these interactive tools.

### Primary text

Bishop, C. M. (2006). Pattern recognition and machine learning. New York, NY: Springer.

### Additional resources

Murphy, K. P. (2013). Machine learning: A probabilistic perspective. Cambridge, MA: MIT Press.

Mello, R. F., & Ponti, M. A. (2018). Machine learning: A practical approach on the statistical learning theory. Cham, Switzerland: Springer.

### Web resources

Organizations: Leddy Library, Knovel

## Course Schedule

The following course schedule is approximate; (L) denotes a lecture and (T) a tutorial. The readings are from Bishop.

| Week | Topics | Readings (Bishop) |
|------|--------|-------------------|
| 01 | L01: Course Introduction + Linear Regression | 1.0, 1.1, 1.2, 3.1 |
| 02 | L02: Probability + MLE + MAP + Gradient Descent + Cross Validation | 2.0, 2.1, 2.3, 4.2.4, 1.3 |
| 03 | L03: Linear Classification + Logistic Regression | pp. 179-195, 203-207 |
| 04 | L04: Non-parametric (Nearest Neighbor) + Multi-class Classification (KNN) | 2.5, pp. 179-184, 4.1.2, 4.3.4 |
| 05 | L05: Probabilistic Classifiers (GDA + Naive Bayes) | 4.2.2, pp. 380-381 |
| 06 | L06: Neural Networks (MLP) | 5.1-5.3 |
| 07 | L07: Clustering (k-means) + Mixture of Gaussians (GMM, EM) | 9.1, 9.2, 9.3, 2.3.9 |
| 08 | L08: Principal Component Analysis (PCA) & Autoencoders | 12.1, 4.1 |
| 09 | L09: Support Vector Machines (SVM) | 7.1, 4.1.1, 4.1.2, 6.1, 6.2, pp. 325-337 |
| 10 | L10: Ensemble Methods | 14.2-14.3 |
| 11 | L11: Project Presentation | N/A |
| 12 | L12: Final Exam Review | N/A |
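As a taste of the in-class Python tutorials, here is a minimal gradient-descent sketch (gradient descent is a week 02 topic); the objective, starting point, and step size below are illustrative choices, not course material:

```python
# Minimise f(w) = (w - 3)**2 by gradient descent.
# The gradient is f'(w) = 2 * (w - 3), so each step moves w toward 3.
def grad(w):
    return 2 * (w - 3)

w = 0.0        # starting point
lr = 0.1       # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)

print(w)  # converges to approximately 3
```

Each update multiplies the error (w - 3) by (1 - 2 * lr) = 0.8, so the iterate contracts geometrically toward the minimiser at w = 3.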