IFML Seminar: Fantastic Sparse Masks and Where to Find Them

Join us Friday, January 20 at 12 pm CT! The Institute for Foundations of Machine Learning will welcome Shiwei Liu, Postdoctoral Fellow in the VITA group.

Shiwei Liu

Friday, January 20, 2023

12-1 pm CT

Register at


Abstract: Sparsity has widely shown its versatility in model compression, robustness improvement, and overfitting mitigation by selectively masking out a portion of parameters. However, traditional methods for obtaining such masks usually involve pre-training a dense model. As powerful foundation models become prevalent, the cost of this pre-training step can be prohibitive. In this talk, I will present our recent work on efficient methods to obtain such fantastic masks by training sparse neural networks from scratch, without the need for any dense pre-training steps.
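To make the core idea concrete, here is a minimal sketch (not from the talk) of what a sparse mask is: a binary array that zeroes out a chosen fraction of a weight matrix. The random layer-wise mask below is only an illustrative assumption; the speaker's methods learn and update such masks during sparse-from-scratch training.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_sparse_mask(shape, sparsity, rng):
    """Binary mask that keeps a (1 - sparsity) fraction of weights,
    chosen uniformly at random (illustrative only)."""
    n = int(np.prod(shape))
    k = int(n * (1 - sparsity))  # number of weights to keep
    flat = np.zeros(n, dtype=bool)
    flat[rng.choice(n, size=k, replace=False)] = True
    return flat.reshape(shape)

W = rng.normal(size=(4, 4))          # a dense weight matrix
mask = random_sparse_mask(W.shape, sparsity=0.75, rng=rng)
W_sparse = W * mask                  # masked-out weights are exactly zero

print(int(mask.sum()))               # 4 weights kept out of 16
```

Sparse training methods replace the random choice above with criteria such as weight magnitude or gradient information, and periodically regrow and prune connections as training proceeds.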


Bio: Shiwei Liu joined IFML in Fall 2022 as a postdoctoral fellow in the VITA group, under the supervision of Dr. Atlas Wang. His research interests include (1) designing efficient (sparse) neural networks and training recipes in pursuit of efficient training and inference; and (2) studying the behavior of deep learning from a scientific perspective through empirical experiments. Shiwei obtained his Ph.D. from the Eindhoven University of Technology (TU/e), the Netherlands, where he received the Distinguished Dissertation Award, under the supervision of Prof. Mykola Pechenizkiy and Dr. Decebal Constantin Mocanu. He recently co-received the Best Paper Award at the inaugural Learning on Graphs (LoG) Conference 2022 (as the senior author).

