Course Director
Reza Abbasi-Asl (Reza.AbbasiAsl@ucsf.edu)
Guest Lecturers
- Hani Goodarzi (hani.goodarzi@ucsf.edu)
- Jan Christoph (jan.christoph@ucsf.edu)
- Rima Arnaout (rima.arnaout@ucsf.edu)
- Maryam Bijanzadeh (maryam.bijanzadeh@ucsf.edu)
Invited Seminars by:
- Ehsan Adeli, Stanford University
- Sri Nagarajan, UCSF
- Yusuf Roohani, Arc Institute
- Nilah Ioannidis, UC Berkeley
- Yasaman Bahri, Google DeepMind
- Erdrin Azemi, Apple
- Serina Chang, UC Berkeley
Teaching Assistant
Clay Smyth (clay.smyth@ucsf.edu)
Course Description
This 10-week course will establish the foundations of practical deep learning through a hands-on approach in Python and PyTorch. We will cover the basics of regression and classification, the optimization and training of neural networks, and model architectures including autoencoders, convolutional neural networks, and transformers. The primary goal of this course is to equip students with the necessary foundations to apply basic neural networks to their own research. Students will have an opportunity to apply deep learning to problems of their own interest through a final, team-based project, with presentations on the final day of the course.
Course Information
Schedule – Jan 8th – March 14th, 2025
Lectures (Required) – Wednesdays 1-3 PM
Seminars (Required) – Fridays 1-2 PM (Except Jan 17th and 31st which will be 2-3 PM)
Recitations and Lab Sessions – Fridays 2-3 PM (Except Jan 17th and 31st which will be 1-2 PM)
Class location info: All classes will be held in Mission Hall 2108, Mission Bay, except on the following days:
- Wed, Jan 8th: Helen Diller Family Cancer Research Building, HD-160, Mission Bay
- Wed, Jan 15th: Genentech Hall S-201
- Wed, Jan 22nd: Helen Diller Family Cancer Research Building, HD-160, Mission Bay
- Wed, Jan 29th: Helen Diller Family Cancer Research Building, HD-160, Mission Bay
Prerequisites – Prior formal Python coursework or its equivalent is required for the course, including general familiarity with basic statistics and data analysis, and experience with Python modules such as NumPy, Pandas, Matplotlib, and Scikit-learn. We ask that you complete a quick, 5-minute Python self-assessment test and let us know as soon as possible if you have concerns. If you need a short refresher on NumPy and Jupyter notebooks, please check out this excellent page from Stanford's CS231n notes by Justin Johnson.
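As a rough gauge of the expected level, the short snippet below is the kind of NumPy code you should be comfortable reading and writing (an illustration only, not the actual self-assessment):

```python
# Illustrative only -- not the actual self-assessment. If this reads easily,
# you should be well prepared for the coding in this course.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))               # 100 samples, 3 features
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# Standardize each feature column, then fit ordinary least squares.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
w, *_ = np.linalg.lstsq(X_std, y, rcond=None)
print("estimated weights:", w)
```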
Grading – The course will be graded based on attendance, participation, and completion of the homework assignments and course project.
Collaboration Policy – Collaboration and discussion with others in the course is highly encouraged. Feel free to work in small teams on the homework assignments. However, the final submission must be your own, original work (your own written code and answers to questions) and cannot be copied and pasted from your collaborators.
Course Materials – Lectures, notebooks, and homework assignments will be posted to Google Classroom.
Slack workspace – We have a dedicated workspace on Slack for general discussion and posting useful links and resources.
Homework Assignments – There will be 1-2 coding homework assignments in this class, meant to help you gain familiarity with PyTorch.
Class Project – A final project using deep learning will be due on the last lecture day. The project should be done in small groups (please try to form a group of 3-4) and can focus on any deep learning application to the biological sciences (loosely defined). 15-minute project presentations will be held during the final lectures. To make sure you have adequate time to scope out a project, we ask that you find a group by the second lecture day and submit a short, one-paragraph project proposal by the end of the third lecture day. Additional information on the project will be distributed during the first week of class.
Optional but Recommended Reading – There are many great books and other reading resources that we’ll point to along the way, including links and references in the lectures. You may find it helpful to supplement the lectures and discussions with additional reading material. For this class, we suggest the following additional readings:
- Dive into Deep Learning, by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander Smola, a free online textbook with coding examples in Python (although not PyTorch)
- the Stanford CS231n Course Notes, which provide an excellent and detailed description of many of these foundational concepts
- the free online textbook Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Other highly recommended books that require purchasing include:
- Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 2nd Edition, by Aurélien Géron
- Deep Learning with Python, by François Chollet
- Fundamentals of Deep Learning, by Nikhil Buduma
Lecture Schedule
Lecture 1: Jan 8th – Class Introduction and Overview of Machine and Deep Learning (Reza Abbasi-Asl)
Recommended Reading:
- Dive into Deep Learning, Chapter 2: Preliminaries.
- CS231n Notes: NumPy/Jupyter Refresher
Seminar 1: Jan 10th – Reza Abbasi-Asl: Interpretable machine learning for scientific discovery
Reading materials:
- Interpretable machine learning: definitions, methods, and applications, PNAS, 2019.
- Unsupervised pattern identification in spatial gene expression atlas reveals mouse brain regions beyond established ontology, PNAS, 2024. (Code)
- Interpretable video-based tracking and quantification of parkinsonism clinical motor states, npj Parkinson’s Disease, 2024. (Code)
Lecture 2: Jan 15th – Basics of Machine Learning (Reza Abbasi-Asl)
→ Project Teams Due After Class
Recommended Reading:
- Dive into Deep Learning, Chapter 3: Linear Neural Networks
- CS231n Notes: Linear Classification
Seminar 2: Jan 17th – Ehsan Adeli, Stanford University: Data-Driven Exploration of the Interplay between Human Actions and Neural Circuitry
Reading materials:
- Data-Driven Discovery of Movement-Linked Heterogeneity in Neurodegenerative Diseases, Nature Machine Intelligence, 2024. [PDF] [Code]
- GAMMA-PD: Graph-based Analysis of Multi-Modal Motor Impairment Assessments in Parkinson's Disease, MICCAI GRaphs in biomedicAl Image anaLysis (GRAIL) Workshop, 2024. [PDF]
- An Explainable Geometric-Weighted Graph Attention Network for Identifying Functional Networks Associated with Gait Impairment, MICCAI 2023, Lecture Notes in Computer Science. [PDF] [Code]
- GaitForeMer: Self-Supervised Pre-Training of Transformers via Human Motion Forecasting for Few-Shot Gait Impairment Severity Estimation, MICCAI 2022. [PDF] [Code] [Video]
- Quantifying Parkinson's Disease Motor Severity Under Uncertainty Using MDS-UPDRS Videos, Medical Image Analysis, 73:102179, 2021. [PDF] [Code]
Lecture 3: Jan 22nd – Feed-Forward Neural Networks and Backpropagation (Maryam Bijanzadeh)
→ Team Project Proposal Due Before Class
→ Homework 1 Released: Basics of neural networks
Recommended Reading:
- Dive into Deep Learning, Chapters 4.1-4.3 and 4.7: Multi-Layer Perceptrons
- CS231n Notes: Optimization and Backpropagation and the Chain Rule
Seminar 3: Jan 24th – Sri Nagarajan, UCSF
Lecture 4: Jan 29th – Optimization, Loss Functions, and Neural Network Training (Jan Christoph)
Recommended Reading:
- Dive into Deep Learning, Chapters 11.1-11.6: Optimization
- CS231n Notes: Neural Network Training Pt. 1, Pt. 2, Pt. 3
Seminar 4: Jan 31st – Yusuf Roohani, Arc Institute
Lecture 5: Feb 5th – Special Workshop (Rima Arnaout)
→ Homework 1 Due Before Class
- Title: “Designing Deep Learning Experiments for Healthcare”
- Goal: The goal of this workshop is to think about the experimental design behind using deep learning to solve biomedical problems, beyond just implementing the algorithm. This is not intended to be a deep dive; rather, think of it as a rehearsal for the process you'll go through when defining a rotation or thesis project.
Seminar 5: Feb 7th – Nilah Ioannidis, UC Berkeley
Lecture 6: Feb 12th – Convolutional Neural Networks Pt. 1 (Reza Abbasi-Asl)
Recommended Reading:
- Dive into Deep Learning, Chapter 6: Convolutional Neural Networks
- CS231n Notes: Convolutional Neural Networks
Seminar 6: Feb 14th – Yasaman Bahri, Google DeepMind
Lecture 7: Feb 19th – Convolutional Neural Networks Pt. 2 (Reza Abbasi-Asl)
→ Homework 2 Released: MNIST Autoencoder
Recommended Reading:
- Dive into Deep Learning, Chapter 7: Modern Convolutional Neural Networks
- CS231n Notes: Transfer Learning
Seminar 7: Feb 21st – TBD
Lecture 8: Feb 26th – Autoencoders (Jan Christoph)
Recommended Reading:
- this blog post: https://gertjanvandenburg.com/blog/autoencoder/
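For orientation before the lecture (and before Homework 2's MNIST autoencoder), here is a minimal, illustrative sketch of a fully-connected autoencoder in PyTorch; the layer sizes are arbitrary choices for this sketch, not the homework specification:

```python
# Minimal fully-connected autoencoder sketch (layer sizes are illustrative).
import torch
from torch import nn

class Autoencoder(nn.Module):
    def __init__(self, dim=784, latent=32):
        super().__init__()
        # The encoder compresses the input to a low-dimensional code...
        self.encoder = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent))
        # ...and the decoder reconstructs the input from that code.
        self.decoder = nn.Sequential(nn.Linear(latent, 128), nn.ReLU(),
                                     nn.Linear(128, dim), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
x = torch.rand(16, 784)                     # a batch of flattened 28x28 images
loss = nn.functional.mse_loss(model(x), x)  # reconstruction loss: output vs. input
loss.backward()
```

Training then proceeds exactly like a supervised model, except the input itself serves as the target.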
Seminar 8: Feb 28th – Erdrin Azemi, Apple
Lecture 9: March 5th – Attention mechanism in DNNs (Hani Goodarzi)
→ Homework 2 Due Before Class
Recommended Reading:
- Dive into Deep Learning, Chapter 10: Attention Mechanisms
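To give a flavor of the topic before the lecture, here is a minimal sketch of scaled dot-product attention, the building block behind the attention mechanisms covered in the reading (a generic illustration, not lecture material):

```python
# Sketch of scaled dot-product attention (generic illustration).
import math
import torch

def attention(Q, K, V):
    # Similarity of each query to each key, scaled by sqrt(d) for stability.
    scores = Q @ K.transpose(-2, -1) / math.sqrt(Q.shape[-1])
    weights = torch.softmax(scores, dim=-1)  # each row sums to 1
    return weights @ V                       # weighted average of the values

x = torch.randn(5, 8)     # a sequence of 5 tokens with 8-dim embeddings
out = attention(x, x, x)  # self-attention: queries, keys, values all from x
print(out.shape)          # torch.Size([5, 8])
```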
Seminar 9: March 7th – Serina Chang, UC Berkeley
Lectures 10 and 11: March 12th and 14th – Class Project Presentations.
→ All Final Projects Due before Lecture 10.
Recitation/Lab Schedule
All recitations and labs are optional but highly encouraged. They are meant to solidify topics from the lectures through additional discussion and coding examples. They also double as office hours and as a place to work with others on projects and homework assignments.
We've allocated one hour for each recitation: roughly the first 20-30 minutes are for Q&A on the previous lectures, followed by a short coding example to walk through. A rough outline of topics for the recitations is presented below, but these topics are flexible based on student interests and questions.
Recitation 1
- Review: Basics of Machine Learning
- Practical: Getting Set Up with Google Colab and PyTorch Basics
Recitation 2
- Review: Basics of Machine Learning
- Practical: Getting Set Up with Google Colab and PyTorch Basics (see the sketch below)
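As a preview of these two sessions, here is a minimal sketch of PyTorch tensor and autograd basics (an illustration of the topic, not the actual recitation notebook):

```python
# PyTorch basics of the kind covered in Recitations 1-2.
# On Google Colab, torch comes preinstalled; locally: pip install torch
import torch

# Tensors behave much like NumPy arrays, but can live on a GPU.
x = torch.arange(6, dtype=torch.float32).reshape(2, 3)
print(x.sum(), x.mean(), x @ x.T)

# Autograd: record operations on a tensor, then differentiate through them.
w = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
loss = (w ** 2).sum()
loss.backward()
print(w.grad)  # d(loss)/dw = 2w -> tensor([2., 4., 6.])

# Move computation to a GPU if one is available
# (on Colab: Runtime > Change runtime type).
device = "cuda" if torch.cuda.is_available() else "cpu"
x = x.to(device)
```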
Recitation 3
- Review: Practical Neural Network Training
- Coding: PyTorch MNIST with Fully-Connected Neural Networks
Recitation 4
- Review: Practical Neural Network Training
- Coding: PyTorch MNIST with Fully-Connected Neural Networks (see the sketch below)
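A minimal sketch of the kind of fully-connected MNIST pipeline these sessions build up (hyperparameters are illustrative choices, not the recitation's exact code):

```python
# Sketch of a fully-connected MNIST classifier (illustrative hyperparameters).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

train_set = datasets.MNIST("data", train=True, download=True,
                           transform=transforms.ToTensor())
loader = DataLoader(train_set, batch_size=64, shuffle=True)

model = nn.Sequential(
    nn.Flatten(),              # 28x28 image -> 784-dim vector
    nn.Linear(784, 128), nn.ReLU(),
    nn.Linear(128, 10),        # logits for the 10 digit classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(2):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()        # backpropagation
        optimizer.step()       # gradient update
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```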
Recitation 5
- Review: Autoencoders and Unsupervised Learning
- Coding: MNIST and CIFAR10 with Convolutional Neural Networks in PyTorch
Recitation 6
- Review: Autoencoders and Unsupervised Learning
- Coding: MNIST and CIFAR10 with Convolutional Neural Networks in PyTorch (see the sketch below)
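A minimal sketch of the kind of small CNN these sessions use; the architecture is an illustrative choice, and the training loop is the same as in the Recitation 3-4 sketch:

```python
# Sketch of a small CNN for MNIST (1 input channel; for CIFAR10, use 3 input
# channels and adjust the flattened size for 32x32 inputs).
import torch
from torch import nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28, 16 maps
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 14x14 -> 14x14, 32 maps
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                    # logits for 10 classes
)

x = torch.randn(8, 1, 28, 28)  # a batch of 8 fake MNIST images
print(model(x).shape)          # torch.Size([8, 10])
```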
Recitation 7
- Review: Convolutional Neural Networks
- Coding: Fine-Tuning a Pretrained ImageNet Model
Recitation 8
- Review: Convolutional Neural Networks
- Coding: Fine-Tuning a Pretrained ImageNet Model (see the sketch below)
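A minimal sketch of the transfer-learning recipe these sessions cover, assuming a recent torchvision and an arbitrary 5-class target task:

```python
# Sketch of fine-tuning a pretrained ImageNet model (illustrative; assumes
# torchvision >= 0.13 for the weights API).
import torch
from torch import nn
from torchvision import models

# Load a ResNet-18 with ImageNet-pretrained weights.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final classification layer for a hypothetical 5-class task.
model.fc = nn.Linear(model.fc.in_features, 5)

# Optimize only the new layer's parameters; train as usual from here.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```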
Recitation 9
- Review: Transformers and attention mechanism
- Additional Feedback for Final Projects