Paper Title
Conic Reformulations for Kullback-Leibler Divergence Constrained Distributionally Robust Optimization and Applications
Paper Authors
Paper Abstract
In this paper, we consider a distributionally robust optimization (DRO) model in which the ambiguity set is defined as the set of distributions whose Kullback-Leibler (KL) divergence to an empirical distribution is bounded. Utilizing the fact that KL divergence is an exponential cone representable function, we obtain the robust counterpart of the KL divergence constrained DRO problem as a dual exponential cone constrained program under mild assumptions on the underlying optimization problem. The resulting conic reformulation of the original optimization problem can be directly solved by a commercial conic programming solver. We specialize our generic formulation to two classical optimization problems, namely, the Newsvendor Problem and the Uncapacitated Facility Location Problem. Our computational study in an out-of-sample analysis shows that the solutions obtained via the DRO approach yield significantly better performance in terms of the dispersion of the cost realizations while the central tendency deteriorates only slightly compared to the solutions obtained by stochastic programming.
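As a concrete illustration of the kind of conic model the abstract describes (not the paper's exact formulation), the Python/CVXPY sketch below solves a small KL-divergence constrained DRO newsvendor instance. It relies on the standard dual form min over alpha >= 0 of alpha*rho + alpha*log E_phat[exp(h(q,d)/alpha)] for the worst-case expectation over a KL ball of radius rho, and models the resulting perspective of log-sum-exp with relative-entropy constraints that the modeling layer compiles to exponential cone constraints. The cost parameters c and r, the demand sample d, and the radius rho are hypothetical placeholders.

# Minimal sketch: KL-divergence constrained DRO newsvendor via exponential cone programming.
# Dual form assumed (standard result, not necessarily the paper's derivation):
#   max_{P: KL(P||P_hat) <= rho} E_P[h(q,d)]
#     = min_{alpha >= 0} alpha*rho + alpha*log( sum_i p_hat_i * exp(h(q,d_i)/alpha) ).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
d = rng.poisson(50, size=30).astype(float)   # hypothetical demand sample
N = len(d)
p_hat = np.full(N, 1.0 / N)                  # empirical distribution
c, r = 4.0, 10.0                             # unit cost and revenue (assumed)
rho = 0.1                                    # KL radius (assumed)

q = cp.Variable(nonneg=True)                 # order quantity
alpha = cp.Variable(nonneg=True)             # multiplier of the KL constraint
s = cp.Variable()                            # epigraph of alpha*log-sum-exp term
u = cp.Variable(N, nonneg=True)              # auxiliary cone variables
v = cp.Variable(N)                           # per-scenario cost epigraph

constraints = [
    # v_i >= h(q, d_i) = c*q - r*min(q, d_i)
    v >= c * q - r * d,
    v >= (c - r) * q,
    # sum_i p_hat_i * u_i <= alpha, together with
    # alpha*exp((v_i - s)/alpha) <= u_i, encodes s >= alpha*log sum_i p_hat_i*e^{v_i/alpha}
    p_hat @ u <= alpha,
]
# alpha*exp((v_i - s)/alpha) <= u_i  <=>  v_i - s + alpha*log(alpha/u_i) <= 0
constraints += [v[i] - s + cp.rel_entr(alpha, u[i]) <= 0 for i in range(N)]

prob = cp.Problem(cp.Minimize(alpha * rho + s), constraints)
prob.solve(solver=cp.SCS)                    # any exponential-cone-capable solver, e.g. MOSEK
print("DRO order quantity:", q.value, "worst-case expected cost:", prob.value)

CVXPY reduces the rel_entr constraints to exponential cone constraints, which is why an exponential-cone-capable solver such as SCS or MOSEK is needed; this mirrors the abstract's point that the reformulated problem can be handed directly to a conic programming solver.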