G53MLE Machine Learning Web Page (2011/2012)

This module is part of the Intelligent Systems theme in the School of Computer Science. Machine learning aims to build computer systems that learn from experience or data. Instead of being programmed by humans to follow the rules of human experts, learning systems develop their own rules from trial-and-error experience to solve problems. These systems require learning algorithms that specify how they should change as a result of the experience or examples they are shown. Machine learning is an exciting interdisciplinary field with roots in computer science, pattern recognition, mathematics and even neuroscience. The field is developing rapidly and has found numerous real-world applications. This course gives an introduction to the principles, techniques and applications of machine learning. Topics covered include:
(i) Introduction and review of basic maths, (ii) Decision tree learning, (iii) Artificial neural networks, (iv) Bayesian learning, (v) Data processing and representations, (vi) Instance based learning, (vii) Support vector machines, (viii) Clustering analysis, (ix) Emerging machine learning paradigms and (x) Applications of machine learning.

Timetable

Lectures: Tuesday 12:00 - 13:00 JC-AMEN-B18 || Friday 15:00 - 16:00 JC-DEARING-C41
Labs: Thursday 15:00 - 16:00 JC-COMPSCI-C11
 
Labs 2011/12

Lab 1 (L1 matlab)   Lab 2 (L2 matlab)   Lab 3 (L3 matlab)   Lab 4 (L4 matlab)   Lab 5 (L5 matlab)

Coursework 2011/12

Coursework
Out: 20th February 2012
Deadline: 4:00 PM, 4th May 2012
Coursework Marks & Feedback

Main References

Main Textbook

Tom M. Mitchell, Machine Learning, McGraw-Hill, 1997

Further Reading
  1. C. Bishop, Pattern Recognition and Machine Learning, Springer, 2006
  2. I. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, Elsevier, 2005
  3. J. Han and M. Kamber, Data Mining: Concepts and Techniques, Elsevier, 2006
  4. S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice-Hall, 1999

Slides and Handouts

I have divided my slides into distinct topics; sometimes a topic will be presented within one lecture, on other occasions it will be spread across multiple lectures. Slides and supporting materials will appear at least one week before they are presented in the course. Slides and handouts are no replacement for textbooks: you are expected to study the recommended reading materials and do the exercise questions.

Topic 1 – Introduction and review of basic maths

Short Notes 01, Slides (.ppt .pdf)
Readings - Chapter 1 of Mitchell, The Discipline of Machine Learning
Exercises (see end of above slides)

Topic 2 – Artificial neural networks

Short Notes 02, Slides: perceptron (.ppt .pdf), ADALINE & MSE (.ppt .pdf), MLP (.ppt .pdf), Short Notes 03
Readings - Chapter 4 of Mitchell, Partial derivative, Gradient
Exercises (see end of above slide sets)
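
As a rough illustration of the perceptron training rule covered in these slides, here is a minimal MATLAB sketch (not taken from the lab files; the function name and variables are my own):

  % Perceptron training rule: w <- w + eta*(t - o)*x, where o = sign(w'*x).
  % X is an N-by-d matrix of training examples, t an N-by-1 vector of
  % target labels in {-1,+1}, eta the learning rate.
  function w = train_perceptron(X, t, eta, nEpochs)
      [N, d] = size(X);
      X = [ones(N,1) X];                        % prepend a constant bias input
      w = zeros(d+1, 1);                        % initial weights (including bias)
      for epoch = 1:nEpochs
          for n = 1:N
              o = sign(X(n,:) * w);             % current output for example n
              if o == 0, o = -1; end            % treat sign(0) as -1
              w = w + eta * (t(n) - o) * X(n,:)';   % no change when o == t(n)
          end
      end
  end

The ADALINE/MSE and MLP slide sets keep the same loop structure but replace the sign-based update with a gradient-based one.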

Topic 3 – Bayesian learning 

Slides (.ppt .pdf)
Readings - Chapter 6 of Mitchell
Exercises (see end of above slides)
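
As a small worked example of Bayes' rule in MATLAB (the numbers below are made up, purely for illustration):

  % Bayes rule over a discrete set of hypotheses:
  % P(h|D) = P(D|h) P(h) / sum over h' of P(D|h') P(h').
  prior      = [0.6 0.3 0.1];              % P(h) for three candidate hypotheses
  likelihood = [0.2 0.5 0.9];              % P(D|h) for the observed data D
  posterior  = likelihood .* prior;
  posterior  = posterior / sum(posterior); % normalise so the posteriors sum to 1
  [~, hMAP]  = max(posterior);             % index of the MAP hypothesis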

Topic 4 – Instance based learning

Slides, K-NN (.ppt, .pdf)
Readings - Chapter 8 of Mitchell
Exercises - (see end of above slides)
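
A minimal MATLAB sketch of k-nearest-neighbour classification, the core idea of this topic (function name and arguments are my own, not from the course materials):

  % Classify one query point x by a majority vote among its k nearest
  % training examples. Xtrain is N-by-d, ytrain is N-by-1 (numeric labels).
  function label = knn_classify(Xtrain, ytrain, x, k)
      N = size(Xtrain, 1);
      d2 = sum((Xtrain - repmat(x, N, 1)).^2, 2);   % squared Euclidean distances
      [~, idx] = sort(d2);                          % nearest examples first
      label = mode(ytrain(idx(1:k)));               % majority vote
  end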

Topic 5 – Clustering analysis

Slides (.ppt, .pdf)
Readings - A survey paper on data clustering algorithms; please read the relevant sections covered in class.
Exercises - (see end of above slides)
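
To make the clustering material concrete, a bare-bones k-means sketch in MATLAB (a simplified illustration only; MATLAB's own kmeans function does the same job more robustly):

  % Basic k-means: alternately assign each point to its nearest centre
  % and move each centre to the mean of its assigned points.
  function [idx, C] = simple_kmeans(X, k, nIters)
      N = size(X, 1);
      perm = randperm(N);
      C = X(perm(1:k), :);                       % initialise centres from random points
      for it = 1:nIters
          D = zeros(N, k);
          for j = 1:k
              D(:,j) = sum((X - repmat(C(j,:), N, 1)).^2, 2);  % distance to centre j
          end
          [~, idx] = min(D, [], 2);              % assignment step
          for j = 1:k                            % update step
              if any(idx == j)
                  C(j,:) = mean(X(idx == j, :), 1);
              end
          end
      end
  end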

Topic 6 – Data processing and representations

Slides (.ppt .pdf)
Readings - A tutorial on PCA by Jonathon Shlens; also see relevant chapters of the above references
Exercises: see examples in the slides
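
The core computation in Shlens' tutorial, PCA via the covariance matrix, fits in a few MATLAB lines (X is assumed here to be an N-by-d data matrix with one observation per row):

  Xc = X - repmat(mean(X,1), size(X,1), 1);   % centre the data
  S  = cov(Xc);                               % d-by-d sample covariance
  [V, D] = eig(S);                            % columns of V are principal directions
  [~, order] = sort(diag(D), 'descend');      % order by explained variance
  V = V(:, order);
  Z = Xc * V(:, 1:2);                         % project onto the top two components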

Topic 7 – Support vector machines & other machine learning paradigms

Slides (.ppt .pdf)
Readings - Chapter 6 of reference 4 in the reading list above, and tutorial articles
Exercises: examples in the slides and Lab 3
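
The prediction side of an SVM is simple to write down; the MATLAB sketch below evaluates the decision function with an RBF kernel, assuming the support vectors, coefficients and bias have already been found by a training routine (the training step, solving the dual optimisation, is not shown):

  % f(x) = sign( sum_i alpha_i*y_i*K(x_i, x) + b ), summed over support vectors.
  % SV is m-by-d (support vectors), alphaY is m-by-1 holding alpha_i*y_i,
  % b is the bias and gamma the RBF kernel width parameter.
  function f = svm_predict(SV, alphaY, b, x, gamma)
      K = exp(-gamma * sum((SV - repmat(x, size(SV,1), 1)).^2, 2));  % kernel values K(x_i, x)
      f = sign(alphaY' * K + b);
  end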

Topic 8 –  Decision tree learning

Slides (.ppt .pdf), 2012 slides
Readings - Chapter 3 of Mitchell
Exercises - See slides and textbook
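
A short MATLAB sketch of entropy and information gain, the quantities ID3 uses to choose a split attribute in Chapter 3 of Mitchell (assumes numeric class labels and attribute values; the helper names are my own):

  % Information gain of splitting the labels y on attribute values a.
  function g = info_gain(y, a)
      g = entropy_of(y);
      vals = unique(a);
      for k = 1:numel(vals)                     % subtract the weighted entropy
          subset = y(a == vals(k));             % of each branch of the split
          g = g - numel(subset)/numel(y) * entropy_of(subset);
      end
  end

  % Entropy H(y) = -sum_c p_c log2 p_c over the class proportions p_c.
  function H = entropy_of(y)
      classes = unique(y);
      p = zeros(numel(classes), 1);
      for k = 1:numel(classes)
          p(k) = sum(y == classes(k)) / numel(y);
      end
      H = -sum(p .* log2(p));
  end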

Topic 9 – Summary
    

Past Exam Papers
– 10/11