Paper Title

Fast and Robust Sparsity Learning over Networks: A Decentralized Surrogate Median Regression Approach

Paper Authors

Weidong Liu, Xiaojun Mao, Xin Zhang

Paper Abstract

Decentralized sparsity learning has recently attracted significant attention due to its rapidly growing range of applications. To obtain robust and sparse estimators, a natural idea is to adopt the non-smooth median loss combined with an $\ell_1$ sparsity regularizer. However, most existing methods suffer from slow convergence caused by the {\em doubly} non-smooth objective. To accelerate the computation, in this paper we propose a decentralized surrogate median regression (deSMR) method for efficiently solving the decentralized sparsity learning problem. We show that our proposed algorithm enjoys a linear convergence rate with a simple implementation. We also investigate the statistical guarantees, showing that our proposed estimator achieves a near-oracle convergence rate without any restriction on the number of network nodes. Moreover, we establish theoretical results for sparse support recovery. Thorough numerical experiments and a real-data study are provided to demonstrate the effectiveness of our method.
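For context, the doubly non-smooth objective referenced above is $\ell_1$-penalized median (least absolute deviation) regression, $\min_\beta \frac{1}{n}\sum_{i=1}^n |y_i - x_i^\top \beta| + \lambda \|\beta\|_1$. The sketch below is not the paper's deSMR algorithm; it is a minimal single-node subgradient method included only to make the objective concrete, with all function names, step-size choices, and data assumed for illustration.

```python
import numpy as np

def lad_lasso_subgradient(X, y, lam=0.1, n_iters=5000, step0=1.0):
    """Illustrative subgradient method (hypothetical helper, not deSMR) for
    the doubly non-smooth objective:
        (1/n) * sum_i |y_i - x_i^T beta| + lam * ||beta||_1
    """
    n, p = X.shape
    beta = np.zeros(p)
    for t in range(1, n_iters + 1):
        r = y - X @ beta
        # Subgradient of the median (LAD) loss: -(1/n) * X^T sign(r)
        g = -(X.T @ np.sign(r)) / n
        # Subgradient of the l1 penalty: lam * sign(beta)
        g += lam * np.sign(beta)
        beta -= (step0 / np.sqrt(t)) * g  # diminishing step size
    return beta

# Illustrative usage on synthetic sparse data with heavy-tailed noise,
# the setting where the median loss is preferred over least squares.
rng = np.random.default_rng(0)
n, p = 200, 50
beta_true = np.zeros(p)
beta_true[:5] = 1.0
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.1 * rng.standard_cauchy(n)
beta_hat = lad_lasso_subgradient(X, y)
```

Because both terms are non-smooth, plain subgradient schemes like this one converge slowly (sublinearly); the paper's contribution is a smooth surrogate for the median loss that restores a linear convergence rate in the decentralized setting.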
