Paper Title

Distributionally Robust Parametric Maximum Likelihood Estimation

Paper Authors

Viet Anh Nguyen, Xuhui Zhang, Jose Blanchet, Angelos Georghiou

Paper Abstract

We consider the parameter estimation problem of a probabilistic generative model prescribed using a natural exponential family of distributions. For this problem, the typical maximum likelihood estimator usually overfits under limited training sample size, is sensitive to noise and may perform poorly on downstream predictive tasks. To mitigate these issues, we propose a distributionally robust maximum likelihood estimator that minimizes the worst-case expected log-loss uniformly over a parametric Kullback-Leibler ball around a parametric nominal distribution. Leveraging the analytical expression of the Kullback-Leibler divergence between two distributions in the same natural exponential family, we show that the min-max estimation problem is tractable in a broad setting, including the robust training of generalized linear models. Our novel robust estimator also enjoys statistical consistency and delivers promising empirical results in both regression and classification tasks.
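
For readers skimming the abstract, the two mathematical objects it references can be written out concisely. The first display below is the standard closed-form KL divergence within a natural exponential family with log-partition function A; the second is a schematic of the min-max estimator in our own notation (the nominal parameter \hat\theta and radius \rho are labels introduced here; the paper's exact ambiguity set, e.g. its conditional form for generalized linear models, may differ).

    % Natural exponential family density and its closed-form KL divergence
    p_\theta(x) = h(x)\exp\big(\theta^\top x - A(\theta)\big),
    \qquad
    D_{\mathrm{KL}}\big(P_{\theta_1}\,\|\,P_{\theta_2}\big)
      = A(\theta_2) - A(\theta_1) - (\theta_2 - \theta_1)^\top \nabla A(\theta_1).

    % Schematic of the distributionally robust estimator (notation ours)
    \min_{\theta}\;
    \sup_{\theta'\,:\, D_{\mathrm{KL}}(P_{\theta'}\,\|\,P_{\hat\theta}) \le \rho}\;
    \mathbb{E}_{x \sim P_{\theta'}}\big[-\log p_{\theta}(x)\big].

Because the inner supremum ranges over parameters of the same family rather than over arbitrary distributions, the closed-form divergence above is what makes the worst-case expected log-loss amenable to optimization, which is the tractability claim of the abstract.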
