Abstract
I will discuss ideas and techniques from approximation theory for approximating functions such as $x^s$, $x^{-1}$, and $e^{-x}$, and demonstrate how such approximation results play a crucial role in the design of fast algorithms for problems that are increasingly relevant. The key is that these results imply faster ways to compute primitives such as $A^s v$, $A^{-1} v$, $\exp(-A) v$, eigenvalues, and eigenvectors, which are fundamental to many spectral algorithms. Indeed, many fast algorithms reduce to the computation of such primitives, which have proved useful for speeding up core computations such as random walk simulation, graph partitioning, and solving systems of linear equations.
Based on a recent monograph, "Faster Algorithms via Approximation Theory," with Sushant Sachdeva.
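
To make the connection between function approximation and fast primitives concrete, below is a minimal sketch (not the specific constructions from the talk or the monograph) of how a low-degree polynomial approximation to $e^{-x}$ yields an approximation to $\exp(-A)v$ using only matrix-vector products with $A$. The helper names `chebyshev_coeffs` and `apply_chebyshev`, the use of Chebyshev interpolation, and the choice of degree are illustrative assumptions on my part.

```python
import numpy as np

def chebyshev_coeffs(f, deg, a, b):
    """Chebyshev interpolation coefficients of f on [a, b].

    Returns c with f(x) ~= c[0]/2 + sum_{k=1}^{deg} c[k] * T_k(t),
    where t = (2x - (a + b)) / (b - a) maps [a, b] onto [-1, 1].
    """
    n = deg + 1
    t = np.cos(np.pi * (np.arange(n) + 0.5) / n)   # Chebyshev nodes in [-1, 1]
    x = 0.5 * (b - a) * t + 0.5 * (a + b)          # nodes mapped to [a, b]
    k = np.arange(n)
    return (2.0 / n) * np.cos(np.pi * np.outer(k, np.arange(n) + 0.5) / n) @ f(x)

def apply_chebyshev(A, v, c, a, b):
    """Evaluate the Chebyshev series c at the symmetric matrix A, applied to v.

    Uses Clenshaw's recurrence, so the only access to A is through
    matrix-vector products (one per coefficient beyond the constant term).
    Assumes the eigenvalues of A lie in [a, b].
    """
    scaled = lambda w: (2.0 * (A @ w) - (a + b) * w) / (b - a)  # spectrum -> [-1, 1]
    b1 = np.zeros_like(v)
    b2 = np.zeros_like(v)
    for ck in c[:0:-1]:                      # c[deg], ..., c[1]
        b1, b2 = 2.0 * scaled(b1) - b2 + ck * v, b1
    return scaled(b1) - b2 + 0.5 * c[0] * v

# Demo: approximate exp(-A) v for a random symmetric PSD matrix A.
rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200))
A = (M @ M.T) / 200.0                        # eigenvalues roughly in [0, 4]
v = rng.standard_normal(200)
lam_max = np.linalg.eigvalsh(A)[-1]          # in practice an upper bound suffices

c = chebyshev_coeffs(lambda x: np.exp(-x), 25, 0.0, lam_max)
approx = apply_chebyshev(A, v, c, 0.0, lam_max)

# Reference value via a full eigendecomposition (only to check the error).
w, U = np.linalg.eigh(A)
exact = U @ (np.exp(-w) * (U.T @ v))
print("relative error:", np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

In this sketch, a degree-$d$ approximation costs $d$ matrix-vector products with $A$, which is the reason polynomial (and rational) approximations translate into fast algorithms when $A$ is sparse or otherwise cheap to multiply by.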