BMI/BioE 212: Deep Learning for Biological and Clinical Research

Course Director

Reza Abbasi-Asl (Reza.AbbasiAsl@ucsf.edu)

Guest Lecturers

Invited Seminars by:

  • Ehsan Adeli, Stanford University
  • Sri Nagarajan, UCSF
  • Yusuf Roohani, Arc Institute
  • Nilah Ioannidis, UC Berkeley
  • Yasaman Bahri, Google DeepMind
  • Erdrin Azemi, Apple
  • Serina Chang, UC Berkeley

Teaching Assistant

Clay Smyth (clay.smyth@ucsf.edu)

Course Description

This 10-week course will establish the foundations of practical deep learning through a hands-on approach in Python and PyTorch. We will cover the basics of regression and classification, the optimization and training of neural networks, and model architectures including autoencoders, convolutional neural networks, and transformers. The primary goal of this course is to equip students with the necessary foundations to apply basic neural networks to their own research. Students will have an opportunity to apply deep learning to problems of their own interest through a final, team-based project, with presentations on the final day of the course.

Course Information

Schedule – Jan 8th – March 14th, 2025

Lectures (Required) – Wednesdays 1-3 PM

Seminars (Required) – Fridays 1-2 PM (Except Jan 17th, 31st, and Feb 21st, which will be 2-3 PM)

Recitations and Lab Sessions – Fridays 2-3 PM (Except Jan 17th, 31st, and Feb 21st, which will be 1-2 PM)

Class location info: All classes will be held in the Helen Diller Family Cancer Research Building, HD-160, Mission Bay, except on the following days:

  • Jan 24th: Mission Hall 1400, Mission Bay
  • Jan 31st: Genentech GH 106, Byers Auditorium
  • Feb 5th: Genentech GH N-114
  • Feb 14th: Mission Hall 1400, Mission Bay
  • Feb 19th: Rock Hall 102, Pottruck Auditorium
  • March 5th: Mission Hall 1400, Mission Bay

Prerequisites – Prior formal Python coursework or its equivalent is required for this course, including general familiarity with basic statistics and data analysis, and experience with Python modules such as NumPy, Pandas, Matplotlib, and Scikit-learn. We ask students to complete a quick, 5-minute Python self-assessment test and to let us know as soon as possible if you have concerns. If you need a short refresher on NumPy and Jupyter notebooks, please check out this excellent page from Stanford's CS231n notes by Justin Johnson.

Grading – The course will be graded based on attendance, participation, and completion of the homework assignments and course project.

Collaboration Policy – Collaboration and discussion with others in the course are highly encouraged. Feel free to work in small teams on the homework assignments. However, the final submission must be your own, original work (your own written code and answers to questions) and cannot be copied from your collaborators.

Course Materials – Lectures, notebooks, and homework assignments will be posted to Google Classroom.

Slack workspace – We have a dedicated workspace on Slack for general discussion and posting useful links and resources.

Homework Assignments – There will be 1-2 coding homework assignments in this class, meant to help you gain familiarity with PyTorch.

Class Project – A final project using deep learning will be due on the last lecture day. Projects should be done in small groups (please try to find a group of 3-4) and can focus on any deep learning application to the biological sciences (loosely defined). 15-minute project presentations will be held during the final lectures. To make sure that you have adequate time to scope out a project, we ask that you find a group by the second lecture day and submit a short, one-paragraph project proposal by the end of the third lecture day. Additional information on the project will be distributed during the first week of class.

Optional but Recommended Reading – There are many great books and other reading resources that we’ll point to along the way, including links and references in the lectures. You may find it helpful to supplement the lectures and discussions with additional reading material. For this class, we suggest the following additional readings:

  1. Dive Into Deep Learning, by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander Smola: a free online textbook with coding examples in Python (although not PyTorch)
  2. The Stanford CS231n Course Notes, which provide an excellent and detailed description of many of these foundational concepts
  3. The free online textbook Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville

Other highly recommended books that require purchasing include:

  1. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 2nd Edition, by Aurélien Géron
  2. Deep Learning with Python, by François Chollet
  3. Fundamentals of Deep Learning, by Nikhil Buduma

Lecture Schedule

Lecture 1: Jan 8th  – Class Introduction and Overview of Machine and Deep Learning (Reza Abbasi-Asl)

Recommended Reading:


Seminar 1: Jan 10th  – Reza Abbasi-Asl: Interpretable machine learning for scientific discovery

Reading materials:


Lecture 2: Jan 15th  – Basics of Machine Learning (Reza Abbasi-Asl)

Project Teams Due After Class

Recommended Reading:


Seminar 2: Jan 17th  – Ehsan Adeli, Stanford University: Data-Driven Exploration of the Interplay between Human Actions and Neural Circuitry

Reading materials:

  • Data-Driven Discovery of Movement-Linked Heterogeneity in Neurodegenerative Diseases, Nature Machine Intelligence 2024 [PDF] [Code]
  • GAMMA-PD: Graph-based Analysis of Multi-Modal Motor Impairment Assessments in Parkinson’s Disease, MICCAI GRaphs in biomedical Image analysis 2024 [PDF]
  • An Explainable Geometric-Weighted Graph Attention Network for Identifying Functional Networks Associated with Gait Impairment, Medical Image Computing and Computer Assisted Intervention (MICCAI 2023), Lecture Notes in Computer Science, [PDF][Code]
  • GaitForeMer: Self-Supervised Pre-Training of Transformers via Human Motion Forecasting for Few-Shot Gait Impairment Severity Estimation, 25th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2022), Resort World Convention Centre, Singapore, September 18-22, 2022. [PDF] [Code][Video]
  • Quantifying Parkinson’s Disease Motor Severity Under Uncertainty Using MDS-UPDRS Videos, Medical Image Analysis, Volume 73, October 2021, 102179. [PDF] [Code]

Lecture 3: Jan 22nd – Feed-Forward Neural Networks and Backpropagation (Maryam Bijanzadeh)

Team Project Proposal Due Before Class

Homework 1 Released: Basics of Neural Networks

Recommended Reading:


Seminar 3: Jan 24th – Sri Nagarajan, UCSF: Bayesian inference with deep learning in medical imaging

In this lecture I will review two ways in which deep learning and Bayesian inference are combined. For MRI reconstruction applications, deep non-linear inversion techniques can be used for fast reconstruction with parallel imaging. Deep neural networks can also be used for black-box Bayesian inference of parameters in non-linear models. The attached papers illustrate these applications to medical imaging data.

Reading materials:


Lecture 4: Jan 29th – Optimization, Loss Functions, and Neural Network Training (Jan Christoph)

Recommended Reading:


Seminar 4: Jan 31st  –  Yusuf Roohani, Arc Institute: Foundation models for cell and molecular biology

Reading materials:

  • Universal Cell Embeddings [Link]
  • How to build the virtual cell with artificial intelligence: Priorities and opportunities [Link]
  • Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences [Link]
  • Predicting transcriptional outcomes of novel multigene perturbations with GEARS [Link]

Lecture 5: Feb 5th – Special Workshop (Rima Arnaout)

Homework 1 Due Before Class

  • Title: “Designing Deep Learning Experiments for Healthcare”
  • Goal: The goal of this workshop is to think about the experimental design behind using deep learning to solve biomedical problems, beyond just implementing the algorithm. This is not intended to be a deep dive; rather, think of this as a rehearsal for the process you’ll go through defining a rotation or thesis project.

Seminar 5: Feb 7th – Nilah Ioannidis, UC Berkeley: Personal genome interpretation with sequence-based genomic deep learning models

Reading materials:


Lecture 6: Feb 12th – Convolutional Neural Networks Pt. 1 (Reza Abbasi-Asl)

Recommended Reading:


Seminar 6: Feb 14th – Yasaman Bahri, Google DeepMind


Lecture 7: Feb 19th – Convolutional Neural Networks Pt. 2 (Reza Abbasi-Asl)

Homework 2 Released: MNIST Autoencoder

Recommended Reading:

  • Dive into Deep Learning, Chapter 7: Modern Convolutional Neural Networks
  • CS231n Notes: Transfer Learning

Seminar 7: Feb 21st – Irene Y. Chen, UC Berkeley/UCSF


Lecture 8: Feb 26th – Autoencoders (Jan Christoph)

Recommended Reading:


Seminar 8: Feb 28th – Erdrin Azemi, Apple


Lecture 9: March 5th – Attention mechanism in DNNs (Hani Goodarzi)

Homework 2 Due Before Class

Recommended Reading:

  • Dive into Deep Learning, Chapter 10: Attention Mechanisms

Seminar 9: March 7th – Serina Chang, UC Berkeley


Lectures 10 and 11: March 12th and 14th – Class Project Presentations.

All Final Projects Due before Lecture 10.

Recitation/Lab Schedule

All recitations and labs are optional but highly encouraged. They are meant to solidify topics from the lectures through additional discussion and coding examples. They also double as office hours and as a place to work with others on projects and homework assignments.

We’ve allocated 1 hour for each recitation: the first ~20-30 minutes are reserved for additional Q&A on previous lectures, followed by a short coding example to walk through. A rough outline of topics for the recitations is presented below, but these topics are flexible based on student interests and questions.

Recitation 1

  • Review: Basics of Machine Learning
  • Practical: Getting Set Up with Google Colab and PyTorch Basics
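The PyTorch basics covered in this session can be previewed with a short sketch like the following (tensor creation, NumPy-style elementwise operations, and autograd; the values here are illustrative, not part of any assignment):

```python
import torch

# Create tensors and apply NumPy-style elementwise operations.
x = torch.arange(6, dtype=torch.float32).reshape(2, 3)  # 2x3 tensor: 0..5
y = x * 2 + 1                                           # elementwise ops broadcast like NumPy

# Autograd: track gradients through a small computation.
w = torch.tensor(3.0, requires_grad=True)
loss = ((w * x) ** 2).sum()   # loss = w^2 * sum(x^2) = 9 * 55 = 495
loss.backward()               # d(loss)/dw = 2 * w * sum(x^2) = 6 * 55

print(y.shape)        # torch.Size([2, 3])
print(w.grad.item())  # 330.0
```

The same tensor and autograd mechanics underlie everything else in the course, so it is worth running a few small examples like this one in Colab before the first lab.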

Recitation 2

  • Review: Basics of Machine Learning
  • Practical: Getting Set Up with Google Colab and PyTorch Basics

Recitation 3

  • Review: Practical Neural Network Training
  • Coding: PyTorch MNIST with Fully-Connected Neural Networks
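As a rough preview of the MNIST exercise, one training step for a small fully-connected classifier might look like the sketch below. Random tensors stand in for real data here; the actual lab would load MNIST (e.g. via torchvision.datasets.MNIST), and the layer sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

# A minimal fully-connected classifier for 28x28 MNIST digits.
model = nn.Sequential(
    nn.Flatten(),            # (N, 1, 28, 28) -> (N, 784)
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),      # 10 digit classes
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a stand-in batch of random "images" and labels.
images = torch.randn(32, 1, 28, 28)
labels = torch.randint(0, 10, (32,))

optimizer.zero_grad()
logits = model(images)       # (32, 10) class scores
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
```

The full training loop in the lab simply repeats this step over batches from a DataLoader.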

Recitation 4  

  • Review: Practical Neural Network Training
  • Coding: PyTorch MNIST with Fully-Connected Neural Networks

Recitation 5

  • Review: Autoencoders and Unsupervised Learning
  • Coding: MNIST and CIFAR10 with Convolutional Neural Networks in PyTorch
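For reference, a small convolutional network of the kind used in this exercise might look like the sketch below, sized here for 28x28 grayscale MNIST images (for 32x32 RGB CIFAR10, the input channels and linear layer size would change); the exact architecture used in the lab may differ:

```python
import torch
import torch.nn as nn

# A small two-block CNN for 28x28 grayscale inputs.
class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # (N, 16, 28, 28)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # (N, 16, 14, 14)
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # (N, 32, 14, 14)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # (N, 32, 7, 7)
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

cnn = SmallCNN()
out = cnn(torch.randn(8, 1, 28, 28))  # stand-in batch; logits of shape (8, 10)
```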

Recitation 6

  • Review: Autoencoders and Unsupervised Learning
  • Coding: MNIST and CIFAR10 with Convolutional Neural Networks in PyTorch

Recitation 7

  • Review: Convolutional Neural Networks
  • Coding: Fine-Tuning a Pretrained ImageNet Model

Recitation 8

  • Review: Convolutional Neural Networks
  • Coding: Fine-Tuning a Pretrained ImageNet Model

Recitation 9

  • Review: Transformers and Attention Mechanisms
  • Additional Feedback for Final Projects