Many problems arising in machine learning and data science are of an inherently discrete or combinatorial nature. Nevertheless, such problems are often tackled with suboptimal heuristics or easier but inexact relaxations, despite the availability of powerful modern algorithmic techniques, e.g., from mixed-integer linear and nonlinear programming. One reason for this may be the typical need for scalability in machine learning applications, but several recent results have demonstrated that sophisticated discrete models and problem-specific solvers can, in fact, enable exact solutions even in large-scale regimes. Conversely, discrete optimization can also benefit from machine learning techniques, e.g., by means of learning-enhanced heuristics or by replacing expert-designed algorithmic decisions such as branching within a branch-and-cut mixed-integer solver framework. Expanding, improving, and further investigating such aspects at the intersection of discrete optimization and machine learning motivates this Research Topic.
The goals of this Research Topic are broadly defined by the following two questions: How can discrete optimization problems arising in machine learning and data science applications be solved efficiently and exactly (or with provable approximation guarantees)? Conversely, how can general-purpose or problem-specific discrete optimization algorithms -- in particular, those based on branch-and-bound -- be improved by utilizing machine learning techniques?
Recent results have demonstrated that exact discrete optimization can be brought to bear efficiently, scalably, and successfully on several important problems, such as sparse regression or the design of optimal decision trees for classification. The crucial ingredients behind such results include novel relaxations, mixed-integer modeling tricks, and problem-specific cutting planes and other algorithmic solver components. Similarly, recent advances have shown that even sophisticated modern branch-and-bound solvers for discrete problems (usually modeled as (mixed-)integer linear or nonlinear programs) can be further improved by incorporating machine learning approaches, e.g., to learn good branching decisions, to schedule the execution of heuristics, or to automatically discover problem decompositions.
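To make the first of these themes concrete: exact best-subset (sparse) regression can, in its simplest form, be solved by explicit enumeration of candidate supports; modern mixed-integer approaches replace this enumeration with branch-and-cut over formulations with binary support indicators. The following self-contained sketch (all names and data are illustrative, not taken from any particular paper) solves a tiny cardinality-constrained least-squares instance exactly:

```python
from itertools import combinations

def solve_normal_eqs(A, b):
    # Gaussian elimination with partial pivoting on the (small) normal system A x = b.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def best_subset(X, y, k):
    """Exact best-subset regression: minimize ||y - X_S beta||^2 over supports |S| <= k.

    Exhaustive enumeration for illustration; a MIP solver explores the same
    search space implicitly via branch-and-cut."""
    n, p = len(X), len(X[0])
    best = (float("inf"), None, None)  # (residual sum of squares, support, coefficients)
    for size in range(1, k + 1):
        for S in combinations(range(p), size):
            XS = [[row[j] for j in S] for row in X]
            G = [[sum(XS[i][a] * XS[i][b] for i in range(n)) for b in range(size)]
                 for a in range(size)]
            c = [sum(XS[i][a] * y[i] for i in range(n)) for a in range(size)]
            beta = solve_normal_eqs(G, c)
            rss = sum((y[i] - sum(XS[i][a] * beta[a] for a in range(size))) ** 2
                      for i in range(n))
            if rss < best[0]:
                best = (rss, S, beta)
    return best

# Toy data: y depends only on columns 0 and 2 (y = 2*x0 - x2).
X = [[1, 0, 2, 1], [0, 1, 1, 0], [2, 1, 0, 3], [1, 2, 1, 1], [0, 3, 2, 2]]
y = [0.0, -1.0, 4.0, 1.0, -2.0]
```

On this instance, `best_subset(X, y, 2)` recovers the true support `(0, 2)` with coefficients `(2, -1)` and zero residual. The point of the exact formulation is precisely this kind of certifiable optimality, which heuristic feature selection cannot guarantee.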
These examples are just the beginning -- we hope the Research Topic inspires contributions that further advance the state of the art in discrete optimization with and for machine learning.
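The second theme centers on algorithmic decision points inside branch-and-bound, such as variable selection, that a learned policy can replace. The toy 0/1-knapsack solver below is a sketch of this idea only (the solver, the `branch_rule` interface, and the hand-crafted ratio rule are all illustrative assumptions, not a production design): the branching rule is a pluggable scoring function, which is exactly the slot a trained model would fill.

```python
def knapsack_bnb(values, weights, capacity, branch_rule):
    """Tiny exact 0/1-knapsack branch and bound (positive weights assumed).

    `branch_rule(free, values, weights)` picks the next variable to fix --
    the decision a learned policy could replace."""
    n = len(values)
    best = {"val": 0}

    def ub(fixed, used, val):
        # Linear-relaxation bound: greedily add fractional items by value/weight ratio.
        rem, bound = capacity - used, val
        for i in sorted((i for i in range(n) if i not in fixed),
                        key=lambda i: -values[i] / weights[i]):
            take = min(1.0, rem / weights[i])
            bound += take * values[i]
            rem -= take * weights[i]
            if rem <= 0:
                break
        return bound

    def rec(fixed, used, val):
        best["val"] = max(best["val"], val)
        free = [i for i in range(n) if i not in fixed]
        if not free or ub(fixed, used, val) <= best["val"]:
            return  # leaf reached, or node pruned by the relaxation bound
        i = branch_rule(free, values, weights)
        if used + weights[i] <= capacity:
            rec({**fixed, i: 1}, used + weights[i], val + values[i])
        rec({**fixed, i: 0}, used, val)

    rec({}, 0, 0)
    return best["val"]

# A hand-crafted rule: branch on the highest value/weight ratio first.
# A learned policy would replace this scoring function, e.g., with a model
# trained to imitate an expensive expert rule such as strong branching.
ratio_rule = lambda free, v, w: max(free, key=lambda i: v[i] / w[i])
```

For example, `knapsack_bnb([60, 100, 120], [10, 20, 30], 50, ratio_rule)` returns the optimum 220. Note that any branching rule yields the same optimal value; what a learned rule changes is the size of the search tree, which is why imitation of strong branching has been a natural target in the literature.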
Research Topic contributions should address either:
- the solution of a discrete/combinatorial (optimization) problem from machine learning/data science, exactly or with certifiable solution quality bounds, or
- the improvement of general-purpose or problem-specific solution methods for discrete/combinatorial (optimization) problems by utilizing machine learning techniques.
Topics of interest and applications include, but are not limited to:
- learning-based components of mixed-integer linear and nonlinear programming frameworks, e.g., branching, node or cut selection, and other algorithmic parameter/selection rules
- learning-based solution methods for discrete problems with provable approximation bounds
- novel models and (exact) solution approaches for discrete problems arising in machine learning or data science, e.g., subset selection, sparse regression, classification, clustering, neural architecture search, etc.
Theoretical and empirical analyses are both welcome; note that "provable bounds" means results with rigorous theoretical (preferably non-probabilistic) proofs. All submissions will be screened by the editors and, if deemed suitable, undergo peer review.