Analysis of stochastic Lanczos quadrature methods for log-determinant approximation

Abstract:

Calculating the logarithm of the determinant of a large positive definite matrix is a fundamental problem with wide applications, such as Gaussian process kernel learning, Markov random fields, and Bayesian inference. Classical methods for this problem, such as Cholesky factorization, are computationally prohibitive for large dense matrices and do not scale. The stochastic Lanczos quadrature (SLQ) method is a promising approach to scale up the computation, but its analysis involves some sophisticated mathematics. In this talk, we will present our recent results on the analysis of SLQ methods with asymmetric quadrature nodes. We will then apply the variance reduction technique of multilevel Monte Carlo to accelerate the SLQ method. A probabilistic approach is introduced to construct a suboptimal subspace for variance reduction. New randomized numerical linear algebra techniques and new matrix concentration inequalities are used to analyze the accelerated SLQ methods.
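
For readers unfamiliar with the basic idea, the following is a minimal sketch (not the speaker's implementation) of plain SLQ for log det(A) = trace(log(A)), assuming A is symmetric positive definite; the function names and parameters (num_probes, lanczos_steps) are illustrative only.

import numpy as np

def lanczos_tridiag(A, v, k):
    # Run k steps of the Lanczos iteration on A starting from v (no reorthogonalization).
    # Returns the tridiagonal matrix T built from the recurrence coefficients.
    n = v.size
    alphas, betas = [], []
    q_prev = np.zeros(n)
    q = v / np.linalg.norm(v)
    beta = 0.0
    for _ in range(k):
        w = A @ q - beta * q_prev
        alpha = q @ w
        w -= alpha * q
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:          # invariant subspace reached; stop early
            break
        q_prev, q = q, w / beta
    m = len(alphas)
    return np.diag(alphas) + np.diag(betas[:m-1], 1) + np.diag(betas[:m-1], -1)

def slq_logdet(A, num_probes=30, lanczos_steps=50, seed=None):
    # Estimate log det(A) = trace(log(A)) for SPD A via stochastic Lanczos quadrature.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    estimate = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)      # Rademacher probe vector, ||z||^2 = n
        T = lanczos_tridiag(A, z, lanczos_steps)
        theta, W = np.linalg.eigh(T)             # Ritz values and vectors
        # Gauss quadrature rule: nodes are Ritz values, weights are squared
        # first components of the Ritz vectors, so z^T log(A) z ~ n * sum_j w_j log(theta_j)
        estimate += n * np.sum(W[0, :]**2 * np.log(theta))
    return estimate / num_probes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    B = rng.standard_normal((500, 500))
    A = B @ B.T + 500 * np.eye(500)              # well-conditioned SPD test matrix
    print("SLQ estimate :", slq_logdet(A, seed=1))
    print("exact logdet :", np.linalg.slogdet(A)[1])

The variance reduction and subspace construction discussed in the talk would modify how the probe vectors are drawn and combined; they are not reflected in this basic sketch.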


Bio:

Dr. Shenxin Zhu received a bachelor's degree in information and computational mathematics from Xiamen University, an MSc degree in computational mathematics from CAEP, and a PhD degree in numerical analysis from the University of Oxford. He is currently an associate professor at the BNU-UIC joint research centre for mathematics. Dr. Zhu has broad research interests, in particular high-performance numerical linear algebra, high-dimensional approximation, learning theory, and knowledge discovery. Dr. Zhu collaborates closely with industry, is involved in several industrial projects, and serves as a consultant in industrial mathematics.