Foundations and Trends® in Machine Learning > Vol 5 > Issue 2–3

Determinantal Point Processes for Machine Learning

By Alex Kulesza, University of Michigan, USA, kulesza@umich.edu | Ben Taskar, University of Pennsylvania, USA, taskar@cis.upenn.edu

 
Suggested Citation
Alex Kulesza and Ben Taskar (2012), "Determinantal Point Processes for Machine Learning", Foundations and Trends® in Machine Learning: Vol. 5: No. 2–3, pp 123-286. http://dx.doi.org/10.1561/2200000044

Publication Date: 18 Dec 2012
© 2012 A. Kulesza and B. Taskar
 
Subjects
Graphical models, Kernel methods, Nonparametric methods, Spectral methods
 

In this article:
1 Introduction 
2 Determinantal Point Processes 
3 Representation and Algorithms 
4 Learning 
5 k-DPPs 
6 Structured DPPs 
7 Conclusion 
References 

Abstract

Determinantal point processes (DPPs) are elegant probabilistic models of repulsion that arise in quantum physics and random matrix theory. In contrast to traditional structured models like Markov random fields, which become intractable and hard to approximate in the presence of negative correlations, DPPs offer efficient and exact algorithms for sampling, marginalization, conditioning, and other inference tasks. We provide a gentle introduction to DPPs, focusing on the intuitions, algorithms, and extensions that are most relevant to the machine learning community, and show how DPPs can be applied to real-world applications like finding diverse sets of high-quality search results, building informative summaries by selecting diverse sentences from documents, modeling nonoverlapping human poses in images or video, and automatically building timelines of important news stories.
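The determinant-based repulsion described in the abstract can be made concrete with an L-ensemble: a positive semidefinite kernel L assigns each subset Y a probability proportional to det(L_Y), normalized by det(L + I). A minimal NumPy sketch (the ground set and features here are invented for illustration, not taken from the monograph):

```python
import numpy as np
from itertools import chain, combinations

# Hypothetical ground set of 4 items with random 3-d feature vectors;
# L = B^T B is a positive semidefinite similarity kernel (an L-ensemble).
rng = np.random.default_rng(0)
B = rng.normal(size=(3, 4))   # columns are item feature vectors
L = B.T @ B

def l_ensemble_prob(L, Y):
    """P(Y) = det(L_Y) / det(L + I): sets of similar items rarely co-occur."""
    L_Y = L[np.ix_(Y, Y)]                      # principal submatrix indexed by Y
    Z = np.linalg.det(L + np.eye(L.shape[0]))  # normalizer: sum of det(L_Y) over all Y
    return np.linalg.det(L_Y) / Z

# Sanity check: probabilities over all 2^4 subsets sum to 1.
subsets = chain.from_iterable(combinations(range(4), k) for k in range(5))
total = sum(l_ensemble_prob(L, list(Y)) for Y in subsets)
```

Because det(L + I) equals the sum of det(L_Y) over every subset Y, `total` comes out to 1; this closed-form normalizer is what makes the exact inference claimed in the abstract tractable.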

DOI:10.1561/2200000044
ISBN: 978-1-60198-628-3 (print)
ISBN: 978-1-60198-629-0 (e-book, PDF)
166 pp.

Determinantal Point Processes for Machine Learning

While determinantal point processes have been studied extensively by mathematicians, giving rise to a deep and beautiful theory, they are relatively new in machine learning. Determinantal Point Processes for Machine Learning provides a comprehensible introduction to DPPs, focusing on the intuitions, algorithms, and extensions that are most relevant to the machine learning community, and presents the general mathematical background along with a range of modeling extensions, efficient algorithms, and theoretical results that aim to enable practical modeling and learning.
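One of the exact-inference results behind the efficiency claims above is that marginal probabilities are also determinants: with the marginal kernel K = L(L + I)^(-1), the probability that a set A is contained in the sampled subset is det(K_A). A brute-force check of this identity for a single item (again on an invented kernel, as a sketch rather than the monograph's own code):

```python
import numpy as np
from itertools import chain, combinations

rng = np.random.default_rng(1)
B = rng.normal(size=(4, 5))
L = B.T @ B                           # PSD L-ensemble kernel over 5 items
N = L.shape[0]
K = L @ np.linalg.inv(L + np.eye(N))  # marginal kernel K = L(L + I)^(-1)

def subset_prob(Y):
    """P(Y) = det(L_Y) / det(L + I) under the L-ensemble."""
    return np.linalg.det(L[np.ix_(Y, Y)]) / np.linalg.det(L + np.eye(N))

# Sum P(Y) over every subset containing item 0; the marginal kernel
# predicts this sum equals det(K_{{0}}) = K[0, 0].
all_subsets = chain.from_iterable(combinations(range(N), k) for k in range(N + 1))
marginal = sum(subset_prob(list(Y)) for Y in all_subsets if 0 in Y)
```

The brute-force sum over 2^5 subsets agrees with the single entry K[0, 0], which is why marginalization (and, by similar identities, conditioning) needs only linear algebra rather than enumeration.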

 