DRL*: A hierarchy of strong block-decomposable linear relaxations for 0-1 MIPs
Authors: M. Minoux
Institution:
  • Université Paris 6, Paris, France
Abstract: In this paper we introduce DRL*, a new hierarchy of linear relaxations for 0-1 mixed integer linear programs (MIPs), based on the idea of Reformulation-Linearization, and explore its links with the Lift-and-Project (L&P) hierarchy and the Sherali-Adams (RLT) hierarchy. The relaxations of the new hierarchy are shown to be intermediate in strength between the L&P and RLT relaxations, and examples are given for which DRL* yields significantly stronger bounds than those obtained from Lift-and-Project relaxations. On the other hand, as opposed to the RLT relaxations, a key advantage of the DRL* relaxations is that they feature a decomposable structure when formulated in extended space, and therefore lend themselves to more efficient solution algorithms that properly exploit decomposition. Links between DRL* and both the L&P and RLT hierarchies are further explored, and the constraints that have to be added to the rank d L&P relaxation (resp. the rank d DRL* relaxation) to make it coincide with the rank d DRL* relaxation (resp. the rank d RLT relaxation) are identified. Furthermore, a full characterization of those 0-1 MIPs for which the DRL* and RLT relaxations coincide is obtained. As an application, we show that the RLT and DRL* relaxations coincide up to rank d for the problem of optimizing a pseudo-Boolean function of degree d over a polyhedron. We report computational results comparing the strengths of the rank 2 L&P, DRL* and RLT relaxations. The potential impact on more efficient computation of bounds for the quadratic assignment problem, together with other directions for future research, is discussed in the conclusions.
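To illustrate the Reformulation-Linearization step underlying these hierarchies, here is a generic rank-1 sketch of the standard RLT construction (not the paper's specific DRL* definition). Take a constraint $\sum_i a_i x_i \ge b$ of the 0-1 MIP with binary variables $x_1,\dots,x_n$. Multiplying it by the bound factors $x_j$ and $(1-x_j)$, replacing each product $x_i x_j$ ($i \ne j$) by a new variable $y_{ij}$, and using $x_j^2 = x_j$ gives the linearized constraints

\[
\sum_{i \ne j} a_i\, y_{ij} + a_j x_j \;\ge\; b\, x_j,
\qquad
\sum_{i \ne j} a_i\,(x_i - y_{ij}) \;\ge\; b\,(1 - x_j).
\]

Projecting the polyhedron defined by all such constraints back onto the $x$ space yields a relaxation at least as tight as the ordinary LP relaxation; iterating with products of up to d bound factors gives the rank d RLT relaxation, while generating only a restricted family of products leads to weaker but more structured relaxations, which is where the abstract places L&P and DRL* relative to full RLT.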
Keywords: Integer programming; Disjunctive programming; Lift-and-Project; Reformulation-Linearization Technique; Pseudo-Boolean optimization
This article is indexed in ScienceDirect and other databases.