Learning Machines
Taught by Patrick Hebron at ITP, Fall 2016
Previous Edition: Fall 2015
Overview:
This course introduces machine learning, a complex and quickly evolving subject. In the first half of the semester, we will investigate the conceptual and technical workings of a few key machine learning models, the mathematics that underlies them, and their philosophical value in understanding the general phenomena of learning and experience. In the second half of the semester, we will focus on developing software projects that apply machine learning to real-world problems.
Required Text:
- Anderson, Britt. Computational Neuroscience and Cognitive Modelling: A Student's Introduction to Methods and Procedures. Los Angeles: SAGE, 2014.
Syllabus Overview:
Week 1:
- Introductions
- What is Learning?
- Categories of Machine Learning Algorithms
- Machine Memorization vs. Machine Learning
- Getting Started in Python
Week 2:
- Homework Review
- A Brief Tour of Fuzzy Logic
- A Brief Tour of Graph Theory
- Dimensions from 1 to N
- Linear Algebra Primer (Part 1)
Week 3:
- Homework Review
- A Brief Look at k-means Clustering
- The Perceptron
- Getting Started with Plotting in Python and Matplotlib
Week 4:
- Homework Review
- Linear Algebra Primer (Part 2)
- Calculus Primer
- The Multilayer Perceptron
Week 5:
- Homework Review
- A General Methodology for Working with Neural Networks
- Working with Data
- Practical Limitations in Supervised Learning
- Index, Icon and Symbol: Charles Peirce's Theory of Signs
- An Introduction to Unsupervised Learning
Week 6:
- Homework Review
- The Restricted Boltzmann Machine
- The Deep Belief Network
- What Deep Learning Can Tell Us About Learning in General
Week 7:
- Homework Review
- Embedding Spaces: Self-Organizing Maps, t-SNE and word2vec
- The Design Implications of Machine Learning
Week 8:
- Homework Review
- Practical Considerations for Application Development
- Project Development: From Intuition to Exploration
Week 9:
- Homework Review
- An Introduction to Recurrent Neural Networks
- Project Development: From Exploration to Formalization
Week 10:
- Homework Review
- An Introduction to Convolutional Neural Networks
- Project Development: From Formalization to Iteration
Week 11:
- Homework Review
- What is Learning?
- Project Development: From Iteration to Reflection on Process
Week 12:
- Student Project Presentations
- Final Thoughts
A Note About Primary Sources:
Before an advancement in machine learning is distilled into textbooks, tutorials, blog posts and open-source implementations, it is generally introduced in the form of an academic research paper. Many of these papers can be found on arXiv and the other sites listed in the Academic Research Tools section below. These documents are not easy to read: they often describe ideas in dense mathematical nomenclature and assume that the reader is already familiar with the subject. Yet research papers remain the best way to access the current cutting edge of machine learning. For this reason, it is important to become familiar with their format and to learn how to decipher their contents. To aid this process, we will read and discuss one primary source research paper each week. The primary source readings are labeled as such in the syllabus.
Additional Resources:
Python Installation Resources:
Python Resources:
Math for Machine Learning:
Academic Research Tools:
Going Further: