Continuous and discrete optimization have historically followed two largely distinct trajectories. The study of discrete optimization has been intertwined with that of theoretical computer science: the foundations of computational complexity and algorithm design, including NP-completeness, approximation algorithms and inapproximability, all blossomed around the study of discrete optimization problems. In contrast, continuous optimization has been grounded in the well-developed mathematical theory of convex analysis and geometry. It has been inspired by numerous applications in operations research, statistics, control theory and, most recently, machine learning. The substantial overlap with scientific computing has led continuous optimization to become a basic tool in many areas of science that adopt continuous models to describe and understand natural phenomena.

While there have always been connections between the foundations of continuous and discrete optimization, the active areas of research in these two fields had significantly differed until recently. In the last decade, partly stimulated by the growth of machine learning and by the proliferation of massive datasets, a number of new research areas have emerged at the intersection of the two fields. The expanded interface between continuous and discrete optimization has already led to a number of breakthroughs in both areas, including, among many others, faster algorithms for maximum flow problems in graphs, improved interior-point method solvers, novel approaches for fundamental discrete problems such as the asymmetric traveling salesman problem, and stronger impossibility results on the capability of extended formulations to approximate NP-hard problems.

In this program, we aim to bring together these two communities and stimulate further interaction in areas of shared interest, examples of which can be found among the themes of the program workshops. We welcome participants from all areas of discrete and continuous optimization, as well as their applications.

Note: This is a "jumbo" program and will run alone in its semester; it will be roughly twice the size of a standard program.

This program is supported in part by the National Science Foundation, as part of the DIMACS/Simons Institute Collaboration on Bridging Continuous and Discrete Optimization.


Organizers

Seffi Naor (Technion Israel Institute of Technology)

Long-Term Participants (including Organizers)

Seffi Naor (Technion Israel Institute of Technology)
Mike Luby (International Computer Science Institute)

Research Fellows

Nima Anari (Stanford University; Microsoft Research Fellow)
Ludwig Schmidt (Massachusetts Institute of Technology; Microsoft Research Fellow)

Visiting Graduate Students and Postdocs

Yue Sun (University of Washington)
Lin Yang (UCLA; Facebook/Novi Research Fellow)