IDK Cascades: Fast Deep Learning by Learning not to Overthink

Xin Wang

Advances in deep learning have led to substantial increases in prediction accuracy but have been accompanied by increases in the cost of rendering predictions. We conjecture that for a majority of real-world inputs, the recent advances in deep learning have created models that effectively "overthink" on simple inputs. In this paper, we revisit the classic question of building model cascades that primarily leverage class asymmetry to reduce cost. We introduce the "I Don't Know" (IDK) prediction cascades framework, a general framework for systematically composing a set of pre-trained models to accelerate inference without a loss in prediction accuracy. We propose two search-based methods for constructing cascades as well as a new cost-aware objective within this framework. The proposed IDK cascade framework can be easily adopted in existing model serving systems without additional model re-training. We evaluate the proposed techniques on a range of benchmarks to demonstrate the effectiveness of the proposed framework.
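The core idea of an IDK cascade can be illustrated with a minimal sketch: a cheap model handles an input first, and only when it effectively says "I don't know" (here approximated by a simple softmax-confidence threshold, one of several possible gating functions) does the input fall through to the expensive model. The function and model names below are illustrative, not the paper's API; the paper's actual gating functions and cost-aware training objective are richer than this confidence cutoff.

```python
import numpy as np

def idk_cascade(x, fast_model, slow_model, threshold=0.9):
    """Two-stage IDK cascade sketch (illustrative, not the paper's API).

    Runs the cheap model first; if its top softmax probability clears
    the threshold, its prediction is returned and the expensive model
    is never invoked. Otherwise the input is deferred.
    """
    probs = fast_model(x)                  # class-probability vector
    if np.max(probs) >= threshold:
        return int(np.argmax(probs))       # confident: stop early
    return int(np.argmax(slow_model(x)))   # "IDK": defer to slow model
```

Because most real-world inputs are easy, the fast model resolves the bulk of the traffic and the average inference cost drops, while hard inputs still receive the accurate model's prediction.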

Published On: July 18, 2018

Presented At/In: Conference on Uncertainty in Artificial Intelligence (UAI) 2018


Authors: Xin Wang, Yujia Luo, Dan Crankshaw, Alexey Tumanov, Fisher Yu, Joseph Gonzalez