Course Director
Reza Abbasi-Asl (Reza.AbbasiAsl@ucsf.edu)
Guest Lecturers
- Hani Goodarzi (hani.goodarzi@ucsf.edu)
- Jan Christoph (jan.christoph@ucsf.edu)
- Rima Arnaout (rima.arnaout@ucsf.edu)
- Maryam Bijanzadeh (maryam.bijanzadeh@ucsf.edu)
Invited Seminars by:
- Yasaman Bahri, Google DeepMind
- Ziad Obermeyer, UC Berkeley
- Reza Negahdar, NVIDIA
- Serina Chang, UC Berkeley
- Ehsan Sedaghat-Nejad, Precision Neuroscience
Teaching Assistants
- Vishvak Subramanyam (vishvak.subramanyam@ucsf.edu)
- Alex Ho (alex.ho2@ucsf.edu)
Course Description
This 10-week course will establish the foundations of practical deep learning through a hands-on approach in Python and PyTorch. We will cover the basics of regression and classification, the optimization and training of neural networks, and model architectures including autoencoders, convolutional neural networks, and transformers. The primary goal of this course is to equip students with the necessary foundations to apply basic neural networks to their own research. Students will have an opportunity to apply deep learning to problems of their own interest through a final, team-based project, with presentations on the final day of the course.
Course Information
Schedule – Jan 7th – March 13th, 2026
Lectures (Required) – Wednesdays 1-3 PM
Seminars (Required) – Fridays 1-2 PM
Recitations and Lab Sessions – Fridays 2-3 PM
Location – All classes will be held in Mission Hall 1400
Prerequisites – Prior formal Python coursework or its equivalent is required for the course, including general familiarity with basic statistics and data analysis, and experience with Python modules such as NumPy, Pandas, Matplotlib, and Scikit-learn. We ask you to complete a quick, 5-minute Python self-assessment test and to let us know as soon as possible if you have concerns. If you need a short refresher on NumPy and Jupyter notebooks, please check out this excellent page from Stanford’s CS231n notes by Justin Johnson.
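For a rough sense of the level we assume, here is a short illustrative sketch (not the actual self-assessment) using NumPy, Pandas, and Matplotlib; if lines like these feel unfamiliar, please reach out before the course begins.

```python
# Illustrative only, not the actual self-assessment.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.normal(size=100)                      # 100 standard-normal draws
df = pd.DataFrame({"x": x, "y": 2 * x + rng.normal(scale=0.5, size=100)})

print(df.describe())                          # summary statistics
print(df.corr())                              # correlation matrix

df.plot.scatter(x="x", y="y")                 # quick scatter plot
plt.show()
```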
Grading – The course will be graded based on attendance, participation, and completion of the homework assignments and course project.
Collaboration Policy – Collaboration and discussion with others in the course is highly encouraged. Feel free to work in small teams on the homework assignments. However, your final submissions must be your own, original work (your own written code and answers to questions) and cannot be copied and pasted from your collaborators.
Course Materials – Lectures, notebooks, and homework assignments will be posted to Google Classroom.
Slack workspace – We have a dedicated workspace on Slack for general discussion and posting useful links and resources.
Homework Assignments – There will be three coding homework assignments in this class, meant to help you gain familiarity with PyTorch.
Class Project – A final project using deep learning will be due on the last lecture day. Projects should be done in small groups (please try to form a group of 3-4) and can focus on any deep learning application to the biological sciences (loosely defined). 15-minute project presentations will be held during the final lecture. To make sure that you have adequate time to scope out a project, we ask that you find a group by the second lecture day and submit a short, one-paragraph project proposal by the end of the third lecture day. Additional information on the project will be distributed during the first week of class.
Optional but Recommended Reading – There are many great books and other reading resources that we’ll point to along the way, including links and references in the lectures. You may find it helpful to supplement the lectures and discussions with additional reading material. For this class, we suggest the following additional readings:
- Dive into Deep Learning, by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander Smola: a free online textbook with coding examples in Python (although not PyTorch)
- the Stanford CS231n Course Notes, which provide an excellent and detailed description of many of these foundational concepts
- the free online textbook Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Other highly recommended books that require purchasing include:
- Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 2nd Edition, by Aurélien Géron
- Deep Learning with Python, by François Chollet
- Fundamentals of Deep Learning, by Nikhil Buduma
Lecture Schedule
Lecture 1: Jan 7th – Class Introduction and Overview of Machine and Deep Learning (Reza Abbasi-Asl)
Recommended Reading:
- Dive into Deep Learning, Chapter 2: Preliminaries.
- CS231n Notes: NumPy/Jupyter Refresher
Focused Discussion 1: Jan 9th – Manifold Hypothesis
Lecture 2: Jan 14th – Basics of Machine Learning (Reza Abbasi-Asl)
→ Project Teams Due After Class
Recommended Reading:
- Dive into Deep Learning, Chapter 3: Linear Neural Networks
- CS231n Notes: Linear Classification
Seminar 1: Jan 16th – Reza Abbasi-Asl: Interpretable machine learning for scientific discovery
Reading materials:
- Interpretable machine learning: definitions, methods, and applications, PNAS, 2019.
- Data-driven fine-grained region discovery in the mouse brain with transformers, Nature Comm., 2025. (Code)
- Interpretable video-based tracking and quantification of parkinsonism clinical motor states, npj Parkinson’s Disease, 2024. (Code)
Lecture 3: Jan 21st – Feed-Forward Neural Networks and Backpropagation (Maryam Bijanzadeh)
→ Team Project Proposal Due Before Class
→ Homework 1 Released: Basics of neural networks
Recommended Reading:
- Dive into Deep Learning, Chapter 4.1 – 4.3, 4.7: Multi-Layer Perceptrons
- CS231n Notes: Optimization and Backpropagation and the Chain Rule
Focused Discussion 2: Jan 23rd – The Unreasonable Effectiveness of Deep Learning
Lecture 4: Jan 28th – Optimization, Loss Functions, and Neural Network Training (Jan Christoph)
Recommended Reading:
- Dive into Deep Learning, Chapter 11.1, 11.2-11.6: Optimization
- CS231n Notes: Neural Network Training Pt. 1, Pt. 2, Pt. 3
Seminar 2: Jan 30th – Jan Christoph: Applying Convolutional Autoencoders in Cardiovascular Imaging
Recommended Reading:
- C. Ounkomol et al., Label-free prediction of three-dimensional fluorescence images from transmitted light microscopy, Nature Methods, 2018
- This blog post: https://gertjanvandenburg.com/blog/autoencoder/
Lecture 5: Feb 4th – Special Workshop (Rima Arnaout)
→ Homework 1 Due Before Class
- Title: “Designing Deep Learning Experiments for Healthcare”
- Goal: The goal of this workshop is to think through the experimental design behind using deep learning to solve biomedical problems, beyond just implementing the algorithm. This is not intended to be a deep dive; rather, think of it as a rehearsal for the process you’ll go through when defining a rotation or thesis project.
Seminar 3: Feb 6th – Ziad Obermeyer (UC Berkeley): Bedside to bench
Reading Materials: TBD
Lecture 6: Feb 11th – Convolutional Neural Networks Pt. 1 (Reza Abbasi-Asl)
Recommended Reading:
- Dive into Deep Learning, Chapter 6: Convolutional Neural Networks
- CS231n Notes: Convolutional Neural Networks
Seminar 4: Feb 13th – Serina Chang, UC Berkeley: Graph Neural Networks and Generative AI to Model Human Networks in Public Health
Reading Materials:
- Moritz U. G. Kraemer, Joseph L.-H. Tsui, Serina Chang, Spyros Lytras, Mark P. Khurana, […], and Samir Bhatt. “Artificial intelligence for modelling infectious disease epidemics.” Nature 2025.
- Serina Chang, Adam Fourney, and Eric Horvitz. “Measuring vaccination coverage and concerns of vaccine holdouts from web search logs.” Nature Communications 2024.
- Serina Chang, Alicja Chaszczewicz, Emma Wang, Maya Josifovska, Emma Pierson, and Jure Leskovec. “LLMs generate structurally realistic social networks but overestimate political homophily.” arXiv 2024 / under review.
Lecture 7: Feb 18th – Convolutional Neural Networks Pt. 2 (Reza Abbasi-Asl)
→ Homework 2 Released: MNIST Autoencoder
Recommended Reading:
- Dive into Deep Learning, Chapter 7: Modern Convolutional Neural Networks
- CS231n Notes: Transfer Learning
Seminar 5: Feb 20th – Ehsan Sedaghat-Nejad (Precision Neuroscience)
Seminar 6: Feb 25th – TBD
Seminar 7: Feb 27th – Yasaman Bahri, Google DeepMind: Sequence modeling, RNNs, and elements of building foundation models
→ Homework 2 Due Before Seminar
→ Homework 3 Released: Transformers
Reading materials:
- Quantum many-body physics calculations with large language models
- Dive into Deep Learning, Ch. 9
- Sequence modeling and design from molecular to genome scale with Evo
Lecture 8: March 4th – Attention Mechanisms in DNNs (Hani Goodarzi)
Recommended Reading:
- Dive into Deep Learning, Chapter 10: Attention Mechanisms
Seminar 8: March 6th – Reza Negahdar (NVIDIA)
→ Homework 3 Due Before Seminar
Lectures 9 and 10: March 11th and 13th – Class Project Presentations
→ All Final Projects Due Before Lecture 10
Recitation/Lab Schedule
All recitations and labs are optional but highly encouraged. They are meant to solidify topics from the lectures through additional discussion and coding examples. They also double as office hours, and as a place to work with others on projects and homework assignments.
We’ve allocated one hour for each recitation: roughly the first 20-30 minutes for additional Q&A from previous lectures, followed by a short coding example to walk through. A rough outline of recitation topics is presented below, but these are flexible based on student interests and questions.
Recitation 1
- Review: Basics of Machine Learning
- Practical: Getting Set Up with Google Colab and PyTorch Basics
Recitation 2
- Review: Basics of Machine Learning
- Practical: Getting Set Up with Google Colab and PyTorch Basics (see the sketch below)
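If you’d like a preview of what “PyTorch Basics” covers, the minimal sketch below shows the two ideas the first recitations build on: tensors (with optional GPU placement in Colab) and automatic differentiation. The specific values are illustrative.

```python
# A minimal sketch of PyTorch basics: tensors, device placement, autograd.
import torch

# Tensors behave much like NumPy arrays.
a = torch.randn(3, 4)          # random 3x4 matrix
b = torch.ones(4, 2)
c = a @ b                      # matrix multiplication, shape (3, 2)

# In Colab, a GPU (enabled under Runtime > Change runtime type) is used like this:
device = "cuda" if torch.cuda.is_available() else "cpu"
c = c.to(device)

# Autograd: PyTorch tracks operations on tensors with requires_grad=True
# and computes gradients with .backward().
x = torch.tensor(2.0, requires_grad=True)
y = x ** 2 + 3 * x             # y = x^2 + 3x
y.backward()
print(x.grad)                  # dy/dx = 2x + 3 = 7 at x = 2
```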
Recitation 3
- Review: Practical Neural Network Training
- Coding: PyTorch MNIST with Fully-Connected Neural Networks
Recitation 4
- Review: Practical Neural Network Training
- Coding: PyTorch MNIST with Fully-Connected Neural Networks (see the sketch below)
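As a preview of the MNIST exercise, here is a minimal training-loop sketch; the layer sizes, learning rate, and epoch count are placeholder choices, not the recitation’s exact settings.

```python
# A small fully-connected classifier trained on MNIST.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

train_data = datasets.MNIST("data", train=True, download=True,
                            transform=transforms.ToTensor())
loader = DataLoader(train_data, batch_size=64, shuffle=True)

model = nn.Sequential(
    nn.Flatten(),              # 28x28 image -> 784-dim vector
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),        # 10 digit classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()        # backpropagation
        optimizer.step()       # gradient update
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```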
Recitation 5
- Review: Autoencoders and Unsupervised Learning
- Coding: MNIST and CIFAR10 with Convolutional Neural Networks in PyTorch
Recitation 6
- Review: Autoencoders and Unsupervised Learning
- Coding: MNIST and CIFAR10 with Convolutional Neural Networks in PyTorch (see the sketch below)
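The convolutional model for this exercise will look roughly like the sketch below. Channel counts and kernel sizes are illustrative, and the classifier dimensions assume 28x28 MNIST inputs; CIFAR10’s 32x32 RGB images need in_channels=3 and a correspondingly resized linear layer.

```python
# A small convolutional network, sized here for MNIST.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, in_channels=1, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SmallCNN()
out = model(torch.randn(8, 1, 28, 28))  # batch of 8 MNIST-sized images
print(out.shape)                        # torch.Size([8, 10])
```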
Recitation 7
- Review: Convolutional Neural Networks
- Coding: Fine-Tuning a Pretrained ImageNet Model
Recitation 8
- Review: Convolutional Neural Networks
- Coding: Fine-Tuning a Pretrained ImageNet Model (see the sketch below)
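Fine-tuning in the recitation will follow the standard torchvision recipe sketched below: load pretrained weights, freeze the backbone, and replace the classification head. The 2-class head and optimizer settings here are hypothetical placeholders.

```python
# Transfer learning: freeze a pretrained ResNet and train a new head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze all pretrained parameters so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Swap the 1000-class ImageNet head for a new, trainable classifier.
model.fc = nn.Linear(model.fc.in_features, 2)

# Optimize only the parameters that still require gradients.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```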
Recitation 9
- Review: Transformers and attention mechanism
- Additional Feedback for Final Projects