Paper Title

Model adaptation and unsupervised learning with non-stationary batch data under smooth concept drift

Paper Authors

Das, Subhro, Lade, Prasanth, Srinivasan, Soundar

Paper Abstract

Most predictive models assume that training and test data are generated from a stationary process. However, this assumption does not hold true in practice. In this paper, we consider the scenario of gradual concept drift due to the underlying non-stationarity of the data source. While previous work has investigated this scenario under supervised learning and adaptation conditions, few have addressed the common, real-world scenario in which labels are only available during training. We propose a novel, iterative algorithm for unsupervised adaptation of predictive models. We show that the performance of our batch-adapted prediction algorithm is better than that of its corresponding unadapted version. The proposed algorithm provides similar (or, in most cases, better) performance with significantly less run time compared to other state-of-the-art methods. We validate our claims through extensive numerical evaluations on both synthetic and real data.
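To make the problem setting concrete (labels only at training time, then smoothly drifting unlabeled batches), here is a minimal sketch of a generic self-training baseline that updates a classifier with its own pseudo-labels on each incoming batch. This is not the algorithm proposed in the paper; the synthetic rotating-boundary stream and the use of scikit-learn's SGDClassifier are illustrative assumptions only.

```python
# Generic sketch of unsupervised batch adaptation under gradual concept drift
# via self-training (pseudo-labels). NOT the paper's algorithm; it only
# illustrates the scenario described in the abstract.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

def make_batch(angle, n=500):
    """Two-class 2-D data whose true decision boundary rotates by `angle` (drift)."""
    X = rng.normal(size=(n, 2))
    w = np.array([np.cos(angle), np.sin(angle)])  # drifting boundary normal
    y = (X @ w > 0).astype(int)
    return X, y

# Labels are available only for the initial batch (angle = 0).
X0, y0 = make_batch(0.0)
adapted = SGDClassifier(random_state=0)
adapted.partial_fit(X0, y0, classes=[0, 1])
unadapted = SGDClassifier(random_state=0)
unadapted.partial_fit(X0, y0, classes=[0, 1])

# Subsequent batches arrive unlabeled; the boundary drifts smoothly.
for t, angle in enumerate(np.linspace(0.1, 0.8, 8), start=1):
    Xt, yt_true = make_batch(angle)      # yt_true used only for evaluation
    pseudo = adapted.predict(Xt)         # pseudo-labels from the current model
    adapted.partial_fit(Xt, pseudo)      # unsupervised incremental update
    print(f"batch {t}: adapted={adapted.score(Xt, yt_true):.3f} "
          f"unadapted={unadapted.score(Xt, yt_true):.3f}")
```

Under slow drift, the adapted model typically tracks the moving boundary better than the frozen, unadapted one, which mirrors the comparison the abstract reports at a high level.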
