Structured Prediction Cascades

CIS Colloquium, Sep 28, 2010, 11:00AM – 12:00PM, TECH Center Room 111

Ben Taskar, University of Pennsylvania

Structured prediction tasks pose a fundamental bias-computation trade-off: complex models are needed to increase predictive power, yet computational resources for inference in the exponentially sized output spaces are limited. We formulate and develop structured prediction cascades to address this trade-off: a sequence of increasingly complex models that progressively filters the space of possible outputs. We represent an exponentially large set of filtered outputs using max-marginals and propose a novel convex loss for learning cascades that balances filtering error against filtering efficiency. We derive generalization bounds for the error and efficiency losses and evaluate our approach on several natural language and vision problems: handwriting recognition, part-of-speech tagging, and articulated pose estimation in images and videos. We find that the learned cascades can reduce the complexity of inference by up to several orders of magnitude, enabling the use of models that incorporate higher-order dependencies and features and yield significantly higher accuracy.

Ben Taskar received his bachelor’s and doctoral degrees in Computer Science from Stanford University. After a postdoc at the University of California at Berkeley, he joined the faculty of the University of Pennsylvania Computer and Information Science Department in 2007, where he currently co-directs PRiML: Penn Research in Machine Learning. His research interests include machine learning, natural language processing, and computer vision. He has been awarded a Sloan Research Fellowship and selected for the Young Investigator Program by the Office of Naval Research and for the DARPA Computer Science Study Group. His work on structured prediction has received best paper awards at the NIPS and EMNLP conferences.