About this Research Topic
The goals of this Research Topic are broadly defined by the following two questions: How can discrete optimization problems arising in machine learning and data science applications be solved efficiently and exactly (or with provable approximation guarantees)? Conversely, how can general-purpose or problem-specific discrete optimization algorithms -- in particular, those based on branch-and-bound -- be improved by utilizing machine learning techniques?
Recent results have demonstrated that exact discrete optimization can be brought to bear efficiently, scalably, and successfully on several important problems, such as sparse regression or the design of optimal decision trees for classification. The crucial ingredients for such results include novel relaxations, mixed-integer modeling tricks, problem-specific cutting planes, and other algorithmic solver components. Similarly, recent advances have shown that even sophisticated modern branch-and-bound solvers for discrete problems (usually modeled as (mixed-)integer linear or nonlinear programs) can be further improved by incorporating machine learning approaches, e.g., to learn good branching selections, to schedule the execution of heuristics, or to automatically discover problem decompositions.
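To make the first direction concrete: sparse regression with an explicit cardinality constraint can be formulated as a mixed-integer program and solved exactly by branch-and-bound. The following minimal sketch is illustrative only and is not drawn from any particular contribution; it uses SciPy's HiGHS-based `milp` solver, a least-absolute-deviations loss (so the model stays linear), and an assumed big-M bound `M` on the coefficient magnitudes. Binary indicators `z_j` switch coefficients on or off, and a budget constraint limits the number of nonzeros to `k`.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Toy data: the response depends only on the first of three features.
X = np.array([[1., 0., 0.],
              [2., 0., 1.],
              [3., 1., 0.],
              [4., 0., 0.],
              [5., 1., 1.],
              [6., 0., 0.]])
y = 2.0 * X[:, 0]
n, p = X.shape
k = 1      # sparsity budget: at most k nonzero coefficients
M = 10.0   # assumed big-M bound on |beta_j| (problem-specific in practice)

# Variable layout: [beta (p continuous), z (p binary), t (n residual bounds)]
nv = 2 * p + n
c = np.concatenate([np.zeros(2 * p), np.ones(n)])  # minimize sum_i t_i

rows, lbs, ubs = [], [], []
for i in range(n):
    # t_i >= y_i - x_i @ beta   and   t_i >= -(y_i - x_i @ beta)
    r1 = np.zeros(nv); r1[:p] = X[i];  r1[2 * p + i] = 1.0
    rows.append(r1); lbs.append(y[i]);  ubs.append(np.inf)
    r2 = np.zeros(nv); r2[:p] = -X[i]; r2[2 * p + i] = 1.0
    rows.append(r2); lbs.append(-y[i]); ubs.append(np.inf)
for j in range(p):
    # |beta_j| <= M * z_j, i.e. z_j = 0 forces beta_j = 0
    r1 = np.zeros(nv); r1[j] = 1.0;  r1[p + j] = -M
    rows.append(r1); lbs.append(-np.inf); ubs.append(0.0)
    r2 = np.zeros(nv); r2[j] = -1.0; r2[p + j] = -M
    rows.append(r2); lbs.append(-np.inf); ubs.append(0.0)
# Cardinality constraint: sum_j z_j <= k
rc = np.zeros(nv); rc[p:2 * p] = 1.0
rows.append(rc); lbs.append(0.0); ubs.append(float(k))

cons = LinearConstraint(np.array(rows), lbs, ubs)
integrality = np.concatenate([np.zeros(p), np.ones(p), np.zeros(n)])
bounds = Bounds(np.concatenate([-M * np.ones(p), np.zeros(p), np.zeros(n)]),
                np.concatenate([ M * np.ones(p), np.ones(p), np.full(n, np.inf)]))

res = milp(c, constraints=cons, integrality=integrality, bounds=bounds)
beta = res.x[:p]  # recovers beta = [2, 0, 0] on this toy instance
```

The same cardinality-constrained model with a least-squares loss requires a mixed-integer quadratic solver; the linear variant above keeps the sketch within SciPy's MILP interface while preserving the essential branch-and-bound structure.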
These examples are just the beginning -- we hope the Research Topic inspires contributions that further advance the state of the art in discrete optimization with and for machine learning.
Research Topic contributions should address either:
- the solution of a discrete/combinatorial (optimization) problem from machine learning/data science, exactly or with certifiable solution quality bounds, or
- the improvement of general-purpose or problem-specific solution methods for discrete/combinatorial (optimization) problems by utilizing machine learning techniques.
Topics of interest and applications include, but are not limited to:
- learning-based components of mixed-integer linear and nonlinear programming frameworks, e.g., branching, node or cut selection, and other algorithmic parameter/selection rules
- learning-based solution methods for discrete problems with provable approximation bounds
- novel models and (exact) solution approaches for discrete problems arising in machine learning or data science, e.g., subset selection, sparse regression, classification, clustering, neural architecture search, etc.
Theoretical and empirical analyses are both welcome; note that "provable bounds" means results with rigorous theoretical (preferably non-probabilistic) proofs. All submissions will be screened by the editors and, if deemed suitable, undergo peer review.
Keywords: discrete optimization, machine learning, mixed-integer programming, combinatorial optimization, data science
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.