Deep learning has been called the “new electricity” of modern science and technology. A good understanding of modern deep learning methods is critical for statisticians interested in “big data” research. 2023 was probably one of the most notable years in recent history, in which AI became a daily buzzword: not a single day goes by without us hearing something about it. As data scientists and biostatisticians, we need to know AI to be successful in our future careers.
This course is jointly taught by Drs. Sen Yang and Dajiang Liu. It covers basic theory and applications of modern deep learning methods in genomics and health informatics. We will survey the fundamentals of deep neural networks, convolutional neural networks, sequence models, generative adversarial networks, etc.
We will not follow any single textbook, but the majority of the fundamental deep learning material will be drawn from the following sources:
1. Deep Learning book by Ian Goodfellow and Yoshua Bengio and Aaron Courville (link)
2. Deep Learning with Python book by Francois Chollet (link)
3. Deep Learning with R book also by Francois Chollet (link)
All three books can be read for free online. In addition, many newer topics will be drawn directly from research papers.
The programming language used in this course will be Python or R. I am more experienced with deep learning modeling in Python. More recently, much progress has been made adapting Torch and TensorFlow to R; still, this semester we will stick with Python for most of the illustrations.
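To give a flavor of what Python code in this course will look like, here is a minimal sketch of defining and running a small feed-forward network in PyTorch. The layer sizes and batch size are arbitrary choices for illustration, not part of any course assignment.

```python
import torch
import torch.nn as nn

# A tiny feed-forward network: 10 inputs -> 32 hidden units -> 1 output.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)

x = torch.randn(4, 10)   # a batch of 4 examples with 10 features each
y = model(x)             # forward pass through the network
print(y.shape)           # torch.Size([4, 1])
```

The equivalent model can also be written in R with the `torch` package, using a very similar API.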
Lecture slides and notes
- Introduction (slides)
- Deep feed-forward network (slides notes)
- Python and PyTorch tutorials (link1 link2 link3)
- Computation (slides notes)
- Convolutional neural network (slides notes)
- CNN hands-on (link)
- Recurrent neural network (slides notes)
- Generative models (slides notes)
- Attention and transformer (slides)