Machine Learning for Analytics Syllabus / 4/27/2015

MSA 8150: Machine Learning for Analytics

©2015 William N. Robinson

Table of Contents

MSA 8150: Machine Learning
1 Catalog Description
1.1 Prerequisites
1.2 Sections
2 Instructor
2.1 Contact the instructor… Please!
2.2 Course web site
3 Overview
3.1 Intended audience
3.2 Learning objectives
4 Schedule
5 Readings by Session
5.1 References
5.2 E-book from Books24x7
5.3 Software
5.4 Data sets
6 Evaluation
7 In class exercises
7.1 Demonstration: R, StatET Eclipse
7.2 Demonstration: KNIME
7.3 Demonstration: Python
7.4 Demonstration: Octave
7.5 Data Mining Headlines
8 Homework
8.1 HW Bayes nets
8.2 HW k-means
8.3 HW HMM
8.4 HW Support Vector Machines
8.5 HW Gaussian processes
8.6 HW Neural Networks
8.7 HW Gibbs sampling
8.8 HW Variational Bayes
8.9 HW Boltzmann machines
9 Examinations
9.1 Exam 1
9.2 Exam 2
10 How to scan Computing literature
10.1 Software
10.2 Literature review
11 Workload Expectations
12 Student Behavior
12.1 Discrimination and harassment
12.2 Official department class policies

As with any document, be aware that this may contain clerical errors. Please tell me if you spot one.

The instructor reserves the right to modify the syllabus as necessary to improve student learning and provide appropriate evaluation. Students will be notified of any such modification in-class and via the web site.

1  Catalog Description

The current catalog description of this course can be found in the University’s Catalog:

http://www.gsu.edu/es/catalogs_courses.html

A recent university catalog description follows:

The course will cover theory, methods, and tools for automated inference from data. This introductory course will include (1) supervised learning, (2) unsupervised learning, (3) graphical structure models, and (4) deep learning. The course will prepare students in the fundamentals of machine learning, as well as provide practical skills in applying current software tools to machine inference from large data sets.

1.1  Prerequisites

Required: some prior experience with probability

1.2  Sections

Room / Days / Time

2  Instructor

Dr. William N. Robinson; http://wrobinson.cis.gsu.edu;

Office (404) 413-7374; Dept: (404) 413-7360; FAX: (404) 413-7394

Office hours: TBA & by Appointment. Ask me about Instant Messaging (MS Messenger Live).

2.1  Contact the instructor… Please!

During the term, it is highly recommended that you contact the instructor, in-person or via email. I am available to help you focus your projects, gain access to resources, and answer your questions. Please try to see me, phone me, or e-mail me at least once during the term to discuss your project. Your class members are also a good source of help.

2.2  Course web site

The web site for our course is on www.onedrive.com. See your email for details.

3  Overview

This class covers the principles of machine learning emphasizing fundamentals, methods, and tools.

3.1  Intended audience

Anyone with a keen interest in data mining and machine inference will do well in this course. It is mainly geared toward producing data analysts.

3.2  Learning objectives

Upon successful completion of this course, you will accomplish the following objectives and outcomes. In particular, students who complete this course will gain “Ready for work” skills (along with theory), including:

1.  Specifying and reasoning about data mining problems

2.  Applying data mining tools & techniques

Specific objectives include the following:

  1. Understand the data mining context
  2. Describe the challenges of data mining and machine learning
  3. Define machine learning
  4. Understand common machine learning models:
     ·  Supervised learning: linear regression, classification, neural networks, support vector machines
     ·  Unsupervised learning
     ·  Graphical model structure learning
     ·  Deep learning models, including Boltzmann machines
  5. Understand and apply common mining tools:
     ·  R and R environments
     ·  Python libraries for machine learning
     ·  Data workflow tools (e.g., KNIME)
  6. Create descriptive and predictive models from data
  7. Know current concepts in machine learning, data mining, and prediction
  8. Demonstrate critical thinking, integrative reasoning, & communication skills

4  Schedule

The following table defines the schedule. However, the topics and readings may change according to the interests and abilities of the class. See the Academic Calendar.
On the web, the underlined items link to supporting information. Materials may be updated 24 hours prior to class; please check before attending class.

# / Date / Readings / In class / Due /
1 / / Introduction: §1, 2, 5, 6 (Murphy 2012) / Demonstration: R, StatET Eclipse / /
2 / / Discrete models (chains, trees): §3, 10 (Murphy 2012); §2 (Schutt et al. 2013) / Demonstration: KNIME / HW Bayes nets; G1 Student presentation /
3 / / Gaussian models: §4 (Murphy 2012) / Demonstration: Python / G2 Student presentation /
4 / / Linear regression: §7–9 (Murphy 2012); §3 (Schutt et al. 2013); either (Lantz 2013) or (Harrington 2012) / Demonstration: Octave / G3 Student presentation /
5 / / Mixture models (k-means): §11 (Murphy 2012); §4 (Schutt et al. 2013) / Demonstration: KNIME / G4 Student presentation /
6 / / Hidden Markov model: §17 (Murphy 2012) / / HW k-means /
7 / / Exam 1 / / HW HMM /
8 / / Kernels and kernel methods: §13, 14 (Murphy 2012) / / /
9 / / Gaussian processes: §15 (Murphy 2012) / / HW Support Vector Machines; G5 Student presentation /
10 / / Adaptive basis function models, neural networks: §16 (Murphy 2012) / / HW Gaussian processes; G6 Student presentation /
11 / / Markov random fields, MCMC introduction: §19, 23 (Murphy 2012) / / HW Neural Networks; G7 Student presentation /
12 / / MCMC inference: §24 (Murphy 2012) / / G8 Student presentation /
13 / / Variational inference: §21, 22 (Murphy 2012) / / G9 Student presentation /
14 / / Graphical model structure: §26 (Murphy 2012) / / HW Gibbs sampling /
15 / / Deep learning: §27, 28 (Murphy 2012); §15–16 (Schutt et al. 2013) / / HW Variational Bayes /
16 / / Exam 2 / / HW Boltzmann machines /

5  Readings by Session

Readings provide content for class discussions, so they must be read prior to class. For example, week 1 readings must be read before the week 1 class. The readings are listed in order of importance; thus, where there are many readings, you may only need to scan the last articles.

Don’t get more than one week ahead of the class in the readings. Occasionally (though rarely), readings may be changed up to one week before their presentation in class.

1.  Introduction: §1,2,5,6(Murphy 2012); §1(Schutt et al. 2013)

2.  Discrete models (chains, trees): §3, 10(Murphy 2012); §2(Schutt et al. 2013)

3.  Gaussian models: §4(Murphy 2012)

4.  Linear regression: §7 – 9(Murphy 2012); §3(Schutt et al. 2013); Either: (Lantz 2013) or (Harrington 2012)

5.  Mixture models(k-means): §11(Murphy 2012); §4(Schutt et al. 2013)

6.  Hidden Markov model: §17(Murphy 2012)

7.  Exam 1

8.  Kernels and kernel methods: §13, 14(Murphy 2012)

9.  Gaussian processes: §15(Murphy 2012)

10.  Adaptive basis function models, neural networks: §16(Murphy 2012)

11.  Markov random fields, MCMC introduction: §19, 23(Murphy 2012)

12.  MCMC inference: §24(Murphy 2012)

13.  Variational inference: §21, 22(Murphy 2012)

14.  Graphical model structure: §26(Murphy 2012)

15.  Deep learning: §27, 28(Murphy 2012); §15-16(Schutt et al. 2013)

16.  Exam 2

5.1  References

Students must have access to the primary textbook:

Primary Textbook: Murphy, K.P. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.

Some books can be accessed as E-books from Books24x7. Most articles have a URL, which can be used to download the article. (This assumes that you are on the university network, directly or via VPN. You may be prompted for your campus ID and password.)

Some articles may be available only from our web site. To find other articles, use the method described in section 10, How to scan Computing literature.

Readings

  1. Harrington, P. Machine Learning in Action. Manning Publications Co., 2012.
  2. Lantz, B. Machine Learning with R. Packt Publishing Ltd, 2013.
  3. Murphy, K.P. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.
  4. Schutt, R., and O'Neil, C. Doing Data Science: Straight Talk from the Frontline. O'Reilly Media, Inc., 2013.

5.2  E-book from Books24x7

Consider the E-books a good resource; they are free to our students. See this note: http://www2.cis.gsu.edu/cis/news/newandnoteworthy2.asp

Access them from the GSU online library: http://homer.gsu.edu/search/databases/proxy/GLL25038; select the link Books24x7. You can also scroll down to Books24x7 in the list of “databases”: http://homer.gsu.edu/search/databases/alphabetical#B

5.3  Software

Additionally, much of the software is available for download, either from the instructor or through the CIS agreements with MSDNAA and the IBM Academic Initiative.

In particular, students may select their tool set from R, Python, or KNIME (or a combination):

·  The R Project for Statistical Computing

·  R support

o  RStudio

o  StatET Eclipse plugin (for R)

·  KNIME analytics platform

·  Python

5.4  Data sets

These datasets are from the UCI Machine Learning Repository.

Datasets and description files.
DATASETS / DATATYPES / DESCRIPTIONS
Iris (CSV) / Real / Iris description (TXT)
Wine (CSV) / Integer, real / Wine description (TXT)
Haberman’s Survival (CSV) / Integer / Haberman description (TXT)
Housing (TXT) / Categorical, integer, real / Housing description (TXT)
Blood Transfusion Service Center (CSV) / Integer / Transfusion description (TXT)
Car evaluation (CSV) / Categorical / Car description (TXT)
Mushroom (CSV - 1.9MB) / Binary / Mushroom description (TXT)
Pen-based recognition of handwritten digits (CSV) / Integer / Digits description (TXT)
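As an illustration (not part of any assignment), the sketch below shows one way to parse records laid out like the Iris CSV above using Python's standard csv module. The sample rows are hard-coded in memory here rather than downloaded, so the values serve only as an example of the format: real-valued measurements followed by a class label.

```python
import csv
import io

# A few rows in the same comma-separated layout as the UCI Iris file:
# four real-valued measurements followed by a class label.
sample = io.StringIO(
    "5.1,3.5,1.4,0.2,Iris-setosa\n"
    "7.0,3.2,4.7,1.4,Iris-versicolor\n"
    "6.3,3.3,6.0,2.5,Iris-virginica\n"
)

rows = []
for record in csv.reader(sample):
    # Split each record into numeric features and the trailing label.
    *measurements, label = record
    rows.append(([float(v) for v in measurements], label))

print(len(rows))    # number of parsed rows -> 3
print(rows[0][1])   # class label of the first row -> Iris-setosa
```

The same pattern applies to the other UCI files listed above: replace the in-memory sample with an open() call on the downloaded CSV, and adjust the feature/label split to match that file's description (TXT).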

6  Evaluation

Students are evaluated by the deliverables summarized in Table 1; course credit is earned according to the weights it shows.

Table 1 Relative weights assigned to course deliverables.

Assignment / Percentage
Exam 1 / 25
Exam 2 / 30
In class exercises / 5
Data Mining Headlines (team) / 10
Homework / 30
Total / 100

The following table overviews how credit will be assigned. Note that all group work includes a peer review, which can distinguish an individual’s assigned points from the group’s assigned points. (See Self-Managed Teams in the Workload Expectations section.)

Table 2 Grading standards.

Work quality / Percent
Absolutely fantastic, walk on water, overflow grade / 110
Excellent answer on all counts / 100
Excellent answer on most counts / 90
Very good answer, but not excellent / 80
Professionally done and adequate / 70
Inadequate, needs work / 60
Varying degrees of inadequacy / 0 - 50

The following breakout depicts how grades will be assigned under this system.

Grade / Percentage
A+ / ≥ 97
A / ≥ 90
A- / ≥ 87
B+ / ≥ 83
B / ≥ 80
B- / ≥ 77
C+ / ≥ 73
C / ≥ 70
C- / ≥ 67
D / ≥ 60
F / < 60

7  In class exercises

Each exercise is intended as a group effort that illustrates important concepts introduced in the associated readings. More detailed descriptions and associated materials can be found on the course web site.

·  Deliver your results to the course web site during class (only).

o  Authors shall receive credit for each in-class exercise.

o  Place the authors’ names prominently at the top of the delivered document.

o  Do not include the name of anyone who is absent or did not contribute. Doing so will result in zero credit for all ‘authors’.

o  Late deliverables (after class) shall receive zero credit.

7.1  Demonstration: R, StatET Eclipse

7.2  Demonstration: KNIME

7.3  Demonstration: Python

7.4  Demonstration: Octave

7.5  Data Mining Headlines

Your group will present a data mining issue that has been in the headlines within the last 10 years (or 20 years if it is still considered a significant case). The goals of the assignment are to:

·  Show the relevance of data mining for everyone

·  Present data mining course materials in the context of real, ongoing, problems

·  Generate discussion about data mining —in particular, tradeoffs, decision-making, and consequences of data mining for organizations and people

In your presentation:

·  Show the news article(s), blogs, etc.

·  Present a few PowerPoint slides that summarize the articles and the data mining issues, and pose issues and questions for subsequent discussion

·  Moderate a brief discussion

Deliver to our web site:

·  Your PowerPoint slides

·  Any notes that might be relevant to aid further study

8  Homework

See the web site for the most recent and detailed information on these assignments. The following is provided as an introduction to each assignment.

8.1  HW Bayes nets

8.2  HW k-means

8.3  HW HMM

8.4  HW Support Vector Machines

8.5  HW Gaussian processes

8.6  HW Neural Networks

8.7  HW Gibbs sampling

8.8  HW Variational Bayes

8.9  HW Boltzmann machines

9  Examinations

Online review guides will be updated one-half week prior to each exam.

9.1  Exam 1

See the online exam review for a description.

9.2  Exam 2

Comprehensive! Similar in nature to a certification exam. See the online exam review for a description.

10 How to scan Computing literature

10.1  Software

Install EndNote:

  1. Free EndNote @ GSU

10.2  Literature review

Search for peer reviewed articles using keywords:

  1. Scan the web
     ·  www.google.com
  2. Scan the web using scholar search engines
     ·  http://scholar.google.com/
        Set the Google Scholar preferences to “Show library access links for Georgia State University” and “Show links to import citations into EndNote”.
     ·  http://academic.live.com/
     ·  http://citeseer.ist.psu.edu/
  3. Scan using library databases (@GSU)
     ·  http://www.galileo.usg.edu
     ·  In particular, the following databases: ABI/INFORM Complete, ACM Digital Library, IEEE Xplore

11 Workload Expectations

Students should plan for 2 - 3 hours of work outside of class each week for each course credit hour. Thus, a 3-credit course averages between 6 and 9 hours of student work outside of the classroom, each week. See GSU sites for Academic Success:

·  http://www2.gsu.edu/~wwwcam/incept/successtips.html

·  http://www2.gsu.edu/~wwwctr/sac/StudySkills.htm

Students must take responsibility for their learning. In contrast to high school, college offers fewer opportunities for student-teacher interaction. Consequently, students must prepare to gain the most from each interaction.