The course will:

- run in Fall 2018 and be open to students from any department. Undergraduates may take the course with permission of the instructor. Due to the large number of interested students, graduate students must register for the course (i.e. they cannot audit).
- cover both theory and applications of deep learning.
- require a working knowledge of linear algebra and probability.
- ask students to write an original article with the goal of submitting to ICML 2019.

Resources:

- Andrej Karpathy's ArXiv Sanity Preserver
- Geoff Hinton's Course - general introduction
- Fei Fei Li's Course - neural nets for machine vision
- Chris Manning and Richard Socher's Course - neural nets for natural language processing
- Python-based neural net libraries: Keras and PyTorch
- Yoshua Bengio, Aaron Courville, and Ian Goodfellow's Textbook
- Google's Dataset Search (in beta)
- Adam Harley's 3D MNIST Visualizations: fully connected and convolutional
- TensorFlow Word Vector Visualization
- Bayesian ML - TensorFlow or Python

Lectures: [password-protected - I am happy to provide the password if you email me (bhanin at tamu dot edu)]

- Lecture 1: Introduction
- Lecture 2: Initialization
- Lecture 3: Gradient Descent by Backpropagation
- Lecture 4: Categorical Variables, Learning Rate vs. Batch Size
- Lecture 5: Expressivity of Deep ReLU Nets
- Lecture 6: Expressivity of Shallow Nets
- Lecture 7: ConvNets for Machine Vision
- Lecture 8: ResNets, DenseNets, BatchNorm
- Lecture 9: Loss Surface in Linear Networks
- Lecture 10: Loss Surface and SGD in Non-Linear Networks
- Lecture 11: RNNs and Word Embeddings
- Lecture 12: Encoder/Decoders for NLP
- Lecture 13: Wide Nets via Mean Field Theory 1
- Lecture 14: Wide Nets via Mean Field Theory 2
- Lecture 15: Bayesian Neural Nets 1
- Lecture 16: Bayesian Neural Nets 2
- Lecture 17: VAEs and PCA 1
- Lecture 18: VAEs and PCA 2
- Lecture 19: GANs 1
- Lecture 20: GANs 2
- Lecture 21: Generalization 1
- Lecture 22: Generalization 2