Tyler Johnson

Graduate Student
Machine Learning

318 Allen Center
University of Washington

Overview

I am a PhD student at the University of Washington, working with Carlos Guestrin on problems in machine learning.

In 2012, I received bachelor's degrees in electrical engineering and computer science from the University of Michigan in Ann Arbor. At Michigan, I also worked on some research projects with Clayton Scott.

Research

I research methods for large-scale machine learning. In particular, I am interested in how existing algorithms can be modified to require less computation. Currently I am exploring several ideas along these lines in the context of sparse optimization.
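A canonical example of the kind of problem I mean is L1-regularized least squares (the Lasso):

\[
\min_{w \in \mathbb{R}^d} \; \tfrac{1}{2}\,\lVert y - Xw \rVert_2^2 + \lambda \lVert w \rVert_1,
\]

where the L1 penalty drives many coordinates of w to exactly zero, so much of the computation can in principle be focused on the few coordinates that matter.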

Publications

Unified Methods for Exploiting Piecewise Linear Structure in Convex Optimization.

Tyler B. Johnson and Carlos Guestrin.

Advances in Neural Information Processing Systems 29, 2016.

We develop methods for rapidly identifying important components of a convex optimization problem for the purpose of achieving fast convergence times. By considering a novel problem formulation—the minimization of a sum of piecewise functions—we describe a principled and general mechanism for exploiting piecewise linear structure in convex optimization. This result leads to a theoretically justified working set algorithm and a novel screening test, which generalize and improve upon many prior results on exploiting structure in convex optimization. In empirical comparisons, we study the scalability of our methods. We find that screening scales surprisingly poorly with the size of the problem, while our working set algorithm convincingly outperforms alternative approaches.
@inproceedings{johnson2016piecewise,
  author    = {Tyler B. Johnson and Carlos Guestrin},
  title     = {Unified Methods for Exploiting Piecewise Linear Structure in Convex Optimization},
  booktitle = {Advances in Neural Information Processing Systems 29},
  year      = {2016}
}
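For intuition about the screening idea, the sketch below implements a standard gap-based safe screening rule for the Lasso in NumPy. It is an illustration of the general mechanism only, not the test developed in the paper: given any primal iterate, it builds a dual-feasible point, bounds the duality gap, and certifies features whose coefficients must be zero at the solution.

import numpy as np

def lasso_gap_screen(X, y, lam, w):
    """Return a boolean mask of features that are provably inactive
    (zero at the optimum) given the current iterate w."""
    residual = y - X @ w
    # A dual-feasible point obtained by rescaling the residual.
    scale = max(lam, np.max(np.abs(X.T @ residual)))
    theta = residual / scale
    # Primal and dual objective values give the duality gap.
    primal = 0.5 * residual @ residual + lam * np.abs(w).sum()
    dual = 0.5 * (y @ y) - 0.5 * np.sum((lam * theta - y) ** 2)
    gap = max(primal - dual, 0.0)
    # The dual solution lies within this radius of theta.
    radius = np.sqrt(2.0 * gap) / lam
    # Feature j can be discarded if |X_j' theta| + radius * ||X_j|| < 1.
    correlations = np.abs(X.T @ theta)
    norms = np.linalg.norm(X, axis=0)
    return correlations + radius * norms < 1.0

Discarded features can be removed from the problem without changing its solution, which is what makes screening attractive when the number of features is large.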

Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization.

Tyler B. Johnson and Carlos Guestrin.

International Conference on Machine Learning, 2015.

By reducing optimization to a sequence of small subproblems, working set methods achieve fast convergence times for many challenging problems. Despite excellent performance, theoretical understanding of working sets is limited, and implementations often resort to heuristics to determine subproblem size, makeup, and stopping criteria. We propose Blitz, a fast working set algorithm accompanied by useful guarantees. Making no assumptions on data, our theory relates subproblem size to progress toward convergence. This result motivates methods for optimizing algorithmic parameters and discarding irrelevant variables as iterations progress. Applied to L1-regularized learning, Blitz convincingly outperforms existing solvers in sequential, limited-memory, and distributed settings. Blitz is not specific to L1-regularized learning, making the algorithm relevant to many applications involving sparsity or constraints.
@inproceedings{johnson2015blitz,
  author    = {Tyler B. Johnson and Carlos Guestrin},
  title     = {Blitz: A Principled Meta-Algorithm for Scaling Sparse Optimization},
  booktitle = {International Conference on Machine Learning},
  year      = {2015}
}
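For a flavor of what a working set method looks like in code, here is a simplified sketch for the Lasso. It is not the Blitz implementation, and it leans on NumPy and scikit-learn (an assumption for illustration, not a dependency of the paper) to solve each restricted subproblem. The point is only the outer loop: grow a set of promising features, solve the subproblem over that set, and stop once no feature outside the set violates the optimality conditions.

import numpy as np
from sklearn.linear_model import Lasso

def working_set_lasso(X, y, lam, max_outer=20, set_size=50, tol=1e-6):
    n, d = X.shape
    w = np.zeros(d)
    active = set()
    for _ in range(max_outer):
        # Score features by correlation with the current residual.
        residual = y - X @ w
        scores = np.abs(X.T @ residual)
        # Grow the working set with the highest-scoring features.
        active |= set(np.argsort(-scores)[:set_size].tolist())
        idx = np.array(sorted(active))
        # Solve the subproblem restricted to the working set.
        sub = Lasso(alpha=lam / n, fit_intercept=False, max_iter=10000)
        sub.fit(X[:, idx], y)
        w = np.zeros(d)
        w[idx] = sub.coef_
        # Stop when no feature outside the working set violates optimality.
        residual = y - X @ w
        outside = np.setdiff1d(np.arange(d), idx)
        if outside.size == 0 or np.max(np.abs(X[:, outside].T @ residual)) <= lam + tol:
            break
    return w

Each subproblem involves only a small fraction of the features, which is where the speedups come from; the questions Blitz actually addresses (how large the working set should be, which features to include, and how accurately to solve each subproblem) are handled here with arbitrary fixed choices.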

Software

A Python package implementing the Blitz algorithm for sparse regression is available on GitHub.

Teaching

I have enjoyed helping teach the following courses: