CMU-CS-20-119
Computer Science Department
School of Computer Science, Carnegie Mellon University



Automatic Differentiation of Sketched Regression

Hang Liao

M.S. Thesis

August 2020



Keywords: Theory, Deep Learning, Automatic Differentiation, Sketching, Least Squares Regression

In this work, we explore the possibility of applying sketching, or dimensionality reduction, to the least squares regression (LLS) problem in differentiable programming settings. To motivate automatic differentiation (AD) for systems with a sketched regression component, we need to answer the following questions: do differentiable programming systems yield similar derivatives (AD transformations) with LLS and with sketched LLS? In practice, does a system containing sketched LLS converge faster during training than the same system with LLS? How close are the results after convergence? To answer these questions, we first provide a bound on the operator norm of a sketched pseudoinverse matrix product, which is useful when analyzing the derivatives of sketched regression. We then analyze the approximation errors of the derivatives in two proposed variants of sketched regression. Finally, we run experiments on both synthetic and real-world datasets to evaluate the performance of our sketching methods.
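As a minimal illustration of the sketching idea the abstract describes (not code from the thesis), the following NumPy snippet compares an exact least squares solve with a Gaussian-sketched solve; the dimensions n, d, and sketch size m are illustrative assumptions, as is the choice of a dense Gaussian sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdetermined system: A is n x d with n >> d (sizes are assumptions).
n, d, m = 2000, 20, 200          # m is the assumed sketch size, d < m << n
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Exact least squares: argmin_x ||A x - b||_2
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# Sketched least squares: argmin_x ||S (A x - b)||_2 for a random sketch S,
# solving a much smaller m x d problem instead of the full n x d one.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sk, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# The sketched solution approximates the exact one at a fraction of the cost.
rel_err = np.linalg.norm(x_sk - x_ls) / np.linalg.norm(x_ls)
print(rel_err)
```

Because both solves reduce to differentiable linear-algebra primitives, the same construction can sit inside an AD system, which is the setting whose derivatives the thesis analyzes.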

42 pages

Thesis Committee:
David P. Woodruff (Chair)
J. Zico Kolter

Srinivasan Seshan, Head, Computer Science Department
Martial Hebert, Dean, School of Computer Science

