
A Sublevel Moment-SOS Hierarchy for Polynomial Optimization

Abstract: We introduce a sublevel Moment-SOS hierarchy in which each SDP relaxation can be viewed as an intermediate (or interpolation) between the d-th and (d+1)-th order SDP relaxations of the Moment-SOS hierarchy (dense or sparse version). By flexibly choosing the size (level) and number (depth) of subsets in the SDP relaxation, one can obtain different improvements over the d-th order relaxation, depending on the available machine memory. In particular, we provide numerical experiments for d=1 and various types of problems, both in combinatorial optimization (Max-Cut, Mixed Integer Programming) and deep learning (robustness certification, Lipschitz constants of neural networks), where the standard Lasserre relaxation (or its sparse variant) is computationally intractable. In our numerical results, the lower bounds from the sublevel relaxations improve on the bound from Shor's relaxation (the first-order Lasserre relaxation) and are significantly closer to the optimal value or to the best-known lower/upper bounds.
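As background for the abstract, the dense d-th order SOS relaxation of the standard Moment-SOS (Lasserre) hierarchy can be sketched as follows; the notation below (f, g_j, Σ[x], ρ_d) is a common convention and is not taken from this record. For a polynomial optimization problem f* = min { f(x) : g_j(x) ≥ 0, j = 1,…,m }, one solves

```latex
\rho_d \;=\; \sup_{\lambda,\,\sigma_j} \Bigl\{\, \lambda \;:\;
  f - \lambda \;=\; \sigma_0 + \sum_{j=1}^{m} \sigma_j\, g_j, \\
  \sigma_0 \in \Sigma[x],\ \deg \sigma_0 \le 2d,\quad
  \sigma_j \in \Sigma[x],\ \deg(\sigma_j g_j) \le 2d \,\Bigr\},
```

where Σ[x] denotes sums of squares of polynomials. Each ρ_d is an SDP whose value satisfies ρ_d ≤ ρ_{d+1} ≤ f*, but the matrix sizes grow combinatorially with d; the sublevel hierarchy of this paper aims to produce bounds between ρ_d and ρ_{d+1} at a memory cost the user can control.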
Document type :
Preprints, Working Papers, ...
Contributor: Tong Chen
Submitted on: Thursday, January 14, 2021 - 10:56:41 AM
Last modification on: Wednesday, June 9, 2021 - 10:00:24 AM



  • HAL Id: hal-03109978, version 1
  • arXiv: 2101.05167


Tong Chen, Jean-Bernard Lasserre, Victor Magron, Edouard Pauwels. A Sublevel Moment-SOS Hierarchy for Polynomial Optimization. 2021. ⟨hal-03109978⟩


