CMU-CS-19-109
Computer Science Department
School of Computer Science, Carnegie Mellon University



Differentiable Optimization-Based Modeling for Machine Learning

Brandon Amos

Ph.D. Thesis

May 2019

CMU-CS-19-109.pdf


Keywords: Machine learning, statistical modeling, convex optimization, deep learning, control, reinforcement learning

Domain-specific modeling priors and specialized components are becoming increasingly important to the machine learning field. These components integrate the specialized knowledge that we have as humans into the model. We argue in this thesis that optimization methods provide an expressive set of operations that should be part of the machine learning practitioner's modeling toolbox.

We present two foundational approaches for optimization-based modeling: 1) the OptNet architecture that integrates optimization problems as individual layers in larger end-to-end trainable deep networks, and 2) the input-convex neural network (ICNN) architecture that helps make inference and learning in deep energy-based models and structured prediction more tractable.
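
As an illustration of the second approach, the following is a minimal PyTorch sketch of a fully input-convex network. It is not code from the thesis; the class name ICNN, the layer sizing, and the clamp_convexity helper are illustrative choices. The output is convex in the input because the hidden-to-hidden weights are kept non-negative and the ReLU activations are convex and non-decreasing.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ICNN(nn.Module):
        """Sketch of a fully input-convex network: the scalar output is
        convex in the input y because every hidden-to-hidden weight matrix
        is kept non-negative and ReLU is convex and non-decreasing."""

        def __init__(self, dim_in, dim_hidden, n_hidden=2):
            super().__init__()
            # Passthrough connections from the input y into every layer.
            self.Wy = nn.ModuleList(
                [nn.Linear(dim_in, dim_hidden) for _ in range(n_hidden)]
                + [nn.Linear(dim_in, 1)])
            # Hidden-to-hidden connections; their weights are clamped >= 0.
            self.Wz = nn.ModuleList(
                [nn.Linear(dim_hidden, dim_hidden, bias=False)
                 for _ in range(n_hidden - 1)]
                + [nn.Linear(dim_hidden, 1, bias=False)])

        def forward(self, y):
            z = F.relu(self.Wy[0](y))
            for Wy, Wz in zip(self.Wy[1:-1], self.Wz[:-1]):
                z = F.relu(Wy(y) + Wz(z))
            # Final layer stays affine; a non-negative combination of convex
            # functions plus an affine function of y is still convex in y.
            return self.Wy[-1](y) + self.Wz[-1](z)

        def clamp_convexity(self):
            # Project hidden-to-hidden weights onto the non-negative orthant,
            # e.g. after each optimizer step, to maintain convexity in y.
            for Wz in self.Wz:
                Wz.weight.data.clamp_(min=0)

A training loop would call clamp_convexity() after each optimizer step so that the non-negativity constraint, and hence convexity in the input, is preserved throughout training.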

We then show how to use the OptNet approach 1) as a way of combining model-free and model-based reinforcement learning and 2) for top-k learning problems. We conclude by showing how to differentiate cone programs and turn the cvxpy domain-specific language into a differentiable optimization layer that enables rapid prototyping of the approaches in this thesis.
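
The following is a minimal sketch of the kind of usage such a layer enables, assuming the open-source cvxpylayers package and its PyTorch interface are installed; the specific problem below is an illustrative example rather than one taken from the thesis.

    import cvxpy as cp
    import torch
    from cvxpylayers.torch import CvxpyLayer

    # A small parametrized convex problem written in cvxpy.
    n, m = 2, 3
    x = cp.Variable(n)
    A = cp.Parameter((m, n))
    b = cp.Parameter(m)
    problem = cp.Problem(cp.Minimize(cp.pnorm(A @ x - b, p=1)), [x >= 0])

    # Wrap the parametrized problem as a differentiable PyTorch layer.
    layer = CvxpyLayer(problem, parameters=[A, b], variables=[x])

    A_t = torch.randn(m, n, requires_grad=True)
    b_t = torch.randn(m, requires_grad=True)

    # The forward pass solves the problem; the backward pass differentiates
    # through the cone program it compiles to.
    solution, = layer(A_t, b_t)
    solution.sum().backward()

Here the gradients with respect to A_t and b_t are obtained by differentiating through the cone program that cvxpy compiles the problem into, which is the mechanism described above.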

The source code for this thesis document is available in open source form.

147 pages

Thesis Committee:
J. Zico Kolter (Chair)
Barnabás Póczos
Jeff Schneider
Vladlen Koltun (Intel Labs)

Srinivasan Seshan, Head, Computer Science Department
Tom M. Mitchell, Interim Dean, School of Computer Science

