Paper Title


Approximate Solutions To Constrained Risk-Sensitive Markov Decision Processes

Authors

Uday Kumar M, Sanjay P. Bhat, Veeraruna Kavitha, Nandyala Hemachandra

Abstract


This paper considers the problem of finding near-optimal Markovian randomized (MR) policies for finite-state-action, infinite-horizon, constrained risk-sensitive Markov decision processes (CRSMDPs). Constraints are in the form of standard expected discounted cost functions as well as expected risk-sensitive discounted cost functions over finite and infinite horizons. The main contribution is to show that the problem possesses a solution if it is feasible, and to provide two methods for finding an approximate solution in the form of an ultimately stationary (US) MR policy. The latter is achieved through two approximating finite-horizon CRSMDPs which are constructed from the original CRSMDP by time-truncating the original objective and constraint cost functions, and suitably perturbing the constraint upper bounds. The first approximation gives a US policy which is $ε$-optimal and feasible for the original problem, while the second approximation gives a near-optimal US policy whose violation of the original constraints is bounded above by a specified $ε$. A key step in the proofs is an appropriate choice of a metric that makes the set of infinite-horizon MR policies and the feasible regions of the three CRSMDPs compact, and the objective and constraint functions continuous. A linear-programming-based formulation for solving the approximating finite-horizon CRSMDPs is also given.
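For orientation, the risk-sensitive discounted costs referred to above are typically defined through an exponential (multiplicative) utility. One standard form, assuming the usual notation (the abstract itself does not fix notation, so the symbols below are illustrative), is

$$
J^{\pi}_{\theta}(x) \;=\; \frac{1}{\theta}\,\log \mathbb{E}^{\pi}_{x}\!\left[\exp\!\left(\theta \sum_{t=0}^{T} \alpha^{t}\, c(X_t, A_t)\right)\right],
$$

with risk-sensitivity parameter $\theta > 0$, discount factor $\alpha \in (0,1)$, per-stage cost $c$, and horizon $T \le \infty$. The CRSMDP then minimizes one such functional over MR policies $\pi$ subject to upper bounds on other risk-sensitive functionals and on standard expected discounted costs of the form $\mathbb{E}^{\pi}_{x}\left[\sum_{t} \alpha^{t} d(X_t, A_t)\right]$.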
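The paper's linear-programming formulation targets the approximating finite-horizon CRSMDPs and must account for the risk-sensitive costs; the details are in the paper. As a minimal sketch of the underlying occupation-measure idea in the simpler risk-neutral special case (the function and variable names here are hypothetical, and this is not the paper's exact formulation), a finite-horizon constrained MDP reduces to an LP over time-indexed state-action occupation measures:

```python
import numpy as np
from scipy.optimize import linprog

def solve_finite_horizon_cmdp(P, c, d, bound, mu0, T, alpha=0.95):
    """LP over occupation measures rho[t, s, a] for a finite-horizon
    constrained MDP (risk-neutral illustration, not the paper's CRSMDP LP).

    P[s, a, s']: transition probabilities; c, d: objective / constraint
    stage costs, each of shape (S, A); bound: upper bound on the discounted
    d-cost; mu0: initial state distribution; T: horizon; alpha: discount.
    """
    S, A = c.shape
    n = T * S * A                                   # one variable per (t, s, a)
    idx = lambda t, s, a: (t * S + s) * A + a

    disc = np.array([alpha**t for t in range(T)])
    obj = np.array([disc[t] * c[s, a]
                    for t in range(T) for s in range(S) for a in range(A)])

    # Flow constraints: sum_a rho[0, s, a] = mu0[s], and for each t,
    # sum_a rho[t+1, s', a] = sum_{s, a} P[s, a, s'] * rho[t, s, a].
    A_eq = np.zeros((T * S, n))
    b_eq = np.zeros(T * S)
    for s in range(S):
        for a in range(A):
            A_eq[s, idx(0, s, a)] = 1.0
        b_eq[s] = mu0[s]
    for t in range(T - 1):
        for s2 in range(S):
            row = (t + 1) * S + s2
            for a in range(A):
                A_eq[row, idx(t + 1, s2, a)] = 1.0
            for s in range(S):
                for a in range(A):
                    A_eq[row, idx(t, s, a)] -= P[s, a, s2]

    # Single inequality: discounted constraint cost stays below the bound.
    A_ub = np.array([[disc[t] * d[s, a]
                      for t in range(T) for s in range(S) for a in range(A)]])
    b_ub = np.array([bound])

    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    if not res.success:
        raise ValueError("infeasible problem or solver failure")
    rho = res.x.reshape(T, S, A)
    # A Markovian randomized policy is recovered by normalizing rho over
    # actions; rows with zero mass correspond to unreachable (t, s) pairs.
    policy = rho / np.maximum(rho.sum(axis=2, keepdims=True), 1e-12)
    return policy, res.fun
```

The design point this sketch illustrates is why MR policies suffice: the LP variables are joint time-state-action distributions, and any feasible occupation measure induces a Markovian randomized policy by per-state normalization, so optimizing over occupation measures loses nothing relative to history-dependent policies in the risk-neutral case.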
