We study sketching algorithms for approximating the full eigenspectrum of a symmetric n × n matrix A. We present a simple sublinear time algorithm that approximates all eigenvalues of A up to additive error ±epsilon n ||A||_infty using the eigenvalues of a randomly sampled O(log^3 n / epsilon^3) × O(log^3 n / epsilon^3) principal submatrix. Our result can be viewed as a concentration bound on the complete eigenspectrum of a random submatrix, significantly extending known bounds, which cover only the singular values (the magnitudes of the eigenvalues). We give improved error bounds of ±epsilon nnz(A)^{1/2} and ±epsilon ||A||_F when the rows of A can be sampled with probabilities proportional to their sparsities or their squared ell_2 norms, respectively. Here nnz(A) is the number of non-zero entries in A and ||A||_F is its Frobenius norm. Even for the strictly easier problems of approximating the singular values or testing the existence of large negative eigenvalues (Bakshi, Chepurko, and Jayaram, FOCS '20), our results are the first to take advantage of non-uniform sampling to obtain improved error bounds. From a technical perspective, our results require several new eigenvalue concentration and perturbation bounds for matrices with bounded entries. Our non-uniform sampling bounds require a new algorithmic approach, which judiciously zeroes out entries of a randomly sampled submatrix to reduce variance before computing the eigenvalues of that submatrix as estimates for those of A.
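The uniform-sampling estimator described in the abstract can be sketched as follows: sample a random principal submatrix and rescale its eigenvalues by n/s. This is an illustrative sketch only (the function name and interface are ours, and the non-uniform variants with their entry-zeroing step are omitted), assuming numpy.

```python
import numpy as np

def approx_eigenvalues(A, s, rng=None):
    """Illustrative sketch of the uniform-sampling estimator: sample s row/column
    indices uniformly without replacement, take the corresponding s x s principal
    submatrix, and scale its eigenvalues by n/s as estimates for those of A.
    (The paper's non-uniform variants additionally zero out entries of the
    sampled submatrix to reduce variance; that step is not shown here.)"""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    idx = rng.choice(n, size=s, replace=False)
    sub = A[np.ix_(idx, idx)]          # random s x s principal submatrix
    return (n / s) * np.linalg.eigvalsh(sub)
```

Per the abstract's guarantee, with s = O(log^3 n / epsilon^3) the s returned values (padded with zeros) approximate the eigenvalues of a symmetric A to additive error ±epsilon n ||A||_infty; the sketch reads only an s × s submatrix, hence sublinear time in the size of A.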

