# Overview

This page collects material and references related to submodular optimization, with a particular focus on applications in machine learning and AI. Convex optimization has become a main workhorse for many machine learning algorithms during the past ten years. When minimizing a convex loss function, e.g., to train a Support Vector Machine, we can be confident of efficiently finding an optimal solution, even for large problems. In recent years, another fundamental problem structure with similarly beneficial properties has emerged as very useful in a variety of machine learning applications: submodularity. Submodularity is an intuitive diminishing-returns property, stating that adding an element to a smaller set helps more than adding it to a larger set: a set function f is submodular if f(A ∪ {e}) − f(A) ≥ f(B ∪ {e}) − f(B) whenever A ⊆ B and e ∉ B. Similarly to convexity, submodularity allows one to efficiently find provably (near-)optimal solutions.

# Tutorials

- Tutorial on Submodularity in Machine Learning -- New Directions at ICML 2013 by Stefanie Jegelka and Andreas Krause [pdf part 1] [pdf part 2].
- Tutorials on Submodularity in Machine Learning and Computer Vision at DAGM 2012 and ECAI 2012 by Stefanie Jegelka and Andreas Krause [preliminary pdf].
- Invited tutorial Intelligent Optimization with Submodular Functions at LION 2012 by Andreas Krause. Slides: [pdf]
- Intelligent Information Gathering and Submodular Function Optimization at IJCAI 2009 by Andreas Krause and Carlos Guestrin. Slides: [ppt]
- Tutorial "Beyond Convexity: Submodularity in Machine Learning" at ICML 2008 by Andreas Krause and Carlos Guestrin. Video (recorded Oct 17, 2008 at Carnegie Mellon University): Part I on minimizing submodular functions, Part II on maximizing submodular functions.
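The tutorials above cover both minimizing and maximizing submodular functions. As a toy illustration of the diminishing-returns property and of the classic greedy algorithm for monotone submodular maximization (which achieves a (1 − 1/e) approximation guarantee), here is a minimal, self-contained sketch using a coverage function; the names (`coverage`, `greedy`) and the small ground set are illustrative, not taken from any of the toolboxes listed below.

```python
# Toy example: coverage functions are a standard monotone submodular family.
# f(S) = number of universe elements covered by the sets indexed by S.

SETS = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
    "d": {1, 6},
}

def coverage(S):
    """f(S): size of the union of the sets indexed by S."""
    covered = set()
    for i in S:
        covered |= SETS[i]
    return len(covered)

# Diminishing returns: for A ⊆ B and e ∉ B,
#   f(A ∪ {e}) - f(A) >= f(B ∪ {e}) - f(B)
A, B, e = {"a"}, {"a", "b"}, "c"
gain_small = coverage(A | {e}) - coverage(A)  # marginal gain w.r.t. smaller set
gain_large = coverage(B | {e}) - coverage(B)  # marginal gain w.r.t. larger set
assert gain_small >= gain_large

def greedy(k):
    """Pick k indices, each time adding the one with the largest marginal
    gain; gives a (1 - 1/e) approximation for monotone submodular f."""
    S = set()
    for _ in range(k):
        best = max(SETS.keys() - S,
                   key=lambda x: coverage(S | {x}) - coverage(S))
        S.add(best)
    return S

print(greedy(2))  # a size-2 subset covering all 6 universe elements
```

The greedy loop is the workhorse behind many of the applications discussed in these tutorials (sensor placement, summarization, influence maximization), since each step only needs marginal-gain evaluations of f.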

# Software, Materials and References

- High-performance implementation of the minimum norm point algorithm for submodular function minimization with several applications [link]
- MATLAB Toolbox for submodular function optimization [link] maintained by Andreas Krause. Journal of Machine Learning Research Open Source Software paper [pdf]
- Survey on Submodular Function Maximization by Daniel Golovin and Andreas Krause. To appear as chapter in Tractability: Practical Approaches to Hard Problems (This draft is for personal use only. No further distribution without permission).
- Class on Submodular Functions by Jeff Bilmes
- Annotated bibliography.

# Related Meetings and Workshops

- Cargese Workshop on Combinatorial Optimization, Topic: Submodular Functions organized by Samuel Fiorini, Gianpaolo Oriolo, Gautier Stauffer and Paolo Ventura.
- NIPS 2012 Workshop on Discrete Optimization in Machine Learning: Structure and Scalability organized by Stefanie Jegelka, Andreas Krause, Pradeep Ravikumar, Jeff Bilmes.
- Modern Aspects of Submodularity workshop at GeorgiaTech organized by Shabbir Ahmed, Nina Balcan, Satoru Iwata and Prasad Tetali.
- NIPS 2011 Workshop on Discrete Optimization in Machine Learning: Uncertainty, Generalization and Feedback organized by Andreas Krause, Pradeep Ravikumar, Jeff Bilmes and Stefanie Jegelka. [videos]
- NIPS 2010 Workshop on Discrete Optimization in Machine Learning: Structures, Algorithms and Applications organized by Andreas Krause, Pradeep Ravikumar, Jeff Bilmes and Stefanie Jegelka. [videos]
- NIPS 2009 Workshop on Discrete Optimization in Machine Learning: Submodularity, Sparsity and Polyhedra organized by Andreas Krause, Pradeep Ravikumar and Jeff Bilmes.