
ML+X Seminar with George Biros: N-body Hessians

Join us Friday, April 23 at 3 pm CT! We will be joined by George Biros, W. A. "Tex" Moncrief Chair in Simulation-Based Engineering Sciences in the Oden Institute for Computational Engineering and Sciences at UT Austin.


The Machine Learning Lab is hosting a series of talks that highlight the diverse applications of machine learning. ML+X seminars welcome faculty from across UT Austin whose work intersects with machine learning and are held every other Friday during the semester from 3-4 pm CT. These talks spark engaging conversation and collaboration. In this upcoming seminar, Prof. George Biros will join us to talk about N-body methods.


Friday, April 23, 2021
3:00 PM – 4:00 PM CT
Register at https://utexas.qualtrics.com/jfe/form/SV_0jIIlDVLpEIT0y2 


Abstract

N-body methods are ubiquitous in science and engineering and are a core computational primitive for many production codes in simulation and data analysis. In machine learning, these methods can be used to approximate large dense matrices; examples include covariance matrices, kernel matrices, Hessians, and inverse graph Laplacians. In the first part of my talk, I will describe GOFMM (Geometry-Oblivious Fast Multipole Method), an N-body method that can be used to compress an arbitrary SPD matrix. For many (but not all) problems of practical interest, GOFMM enables an approximate matrix-vector multiplication in O(N log N) or even O(N) time. Compression requires O(N log N) storage and work. In general, GOFMM belongs to the family of hierarchical matrix approximation methods. In particular, it generalizes FMM to a purely algebraic setting by requiring only the ability to sample matrix entries. Neither geometric information (i.e., point coordinates) nor knowledge of how the matrix entries have been generated is required, hence the term "geometry oblivious". In the second part of my talk, I will describe using GOFMM to approximate the Gauss-Newton Hessian of a fully connected multilayer perceptron. The algorithm uses double randomization, both for GOFMM and for the entries of the Hessian. I will present results for simple autoencoders and compare with the K-FAC and global low-rank Hessian approximations.
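To give a flavor of the idea, the sketch below shows a minimal, one-level hierarchical approximation of an SPD kernel matrix in NumPy: diagonal blocks are kept dense while off-diagonal blocks are replaced by truncated SVDs, yielding a cheaper approximate matrix-vector product. This is an illustrative toy, not GOFMM itself: GOFMM uses multilevel trees, algebraic skeletonization from sampled entries, and randomization rather than a full SVD, and the `gaussian_kernel` and rank choices here are assumptions for the example.

```python
import numpy as np

def gaussian_kernel(X, h=1.0):
    # Pairwise Gaussian kernel matrix; SPD up to numerical rank.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h * h))

def one_level_compress(K, rank):
    # Split K into a 2x2 block structure: keep the diagonal blocks
    # dense, approximate each off-diagonal block by a rank-r SVD.
    m = K.shape[0] // 2
    low_rank = {}
    for (i, j) in [(0, 1), (1, 0)]:
        B = K[i * m:(i + 1) * m, j * m:(j + 1) * m]
        U, s, Vt = np.linalg.svd(B, full_matrices=False)
        low_rank[(i, j)] = (U[:, :rank] * s[:rank], Vt[:rank])
    return K[:m, :m], K[m:, m:], low_rank, m

def approx_matvec(factors, x):
    # Apply the compressed operator: dense diagonal blocks plus
    # low-rank off-diagonal corrections.
    D00, D11, low_rank, m = factors
    y = np.empty_like(x)
    y[:m] = D00 @ x[:m] + low_rank[(0, 1)][0] @ (low_rank[(0, 1)][1] @ x[m:])
    y[m:] = D11 @ x[m:] + low_rank[(1, 0)][0] @ (low_rank[(1, 0)][1] @ x[:m])
    return y

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
K = gaussian_kernel(X) + 1e-6 * np.eye(200)   # regularized SPD matrix
factors = one_level_compress(K, rank=20)
x = rng.normal(size=200)
rel_err = (np.linalg.norm(approx_matvec(factors, x) - K @ x)
           / np.linalg.norm(K @ x))
```

Because smooth kernels tend to have off-diagonal blocks of rapidly decaying numerical rank, a modest rank already gives a small relative error; applying this recursively over a cluster tree is what brings the cost toward O(N log N).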

Speaker Bio

George Biros is the W. A. "Tex" Moncrief Chair in Simulation-Based Engineering Sciences in the Oden Institute for Computational Engineering and Sciences and holds Full Professor appointments in the departments of Mechanical Engineering and Computer Science (by courtesy) at the University of Texas at Austin. From 2008 to 2011, he was an Associate Professor in the School of Computational Science and Engineering at Georgia Tech and in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. From 2003 to 2008, he was an Assistant Professor in Mechanical Engineering and Applied Mechanics at the University of Pennsylvania. He received his BS in Mechanical Engineering from Aristotle University in Greece (1995), his MS in Biomedical Engineering from Carnegie Mellon (1996), and his PhD in Computational Science and Engineering, also from Carnegie Mellon (2000). He was a postdoctoral associate at the Courant Institute of Mathematical Sciences from 2000 to 2003. With collaborators, he received the ACM Gordon Bell Prize in 2003 and in 2010.

Contact us: ML-Lab@austin.utexas.edu