Paper Title

Inverse Kernel Decomposition

Authors

Chengrui Li, Anqi Wu

Abstract

State-of-the-art dimensionality reduction approaches largely rely on complicated optimization procedures. On the other hand, closed-form approaches that require merely an eigen-decomposition do not have enough sophistication and nonlinearity. In this paper, we propose a novel nonlinear dimensionality reduction method -- Inverse Kernel Decomposition (IKD) -- based on an eigen-decomposition of the sample covariance matrix of the data. The method is inspired by Gaussian process latent variable models (GPLVMs) and has comparable performance to GPLVMs. To deal with very noisy data with weak correlations, we propose two solutions -- blockwise and geodesic -- that make use of locally correlated data points and provide better and numerically more stable latent estimates. We use synthetic datasets and four real-world datasets to show that IKD is a better dimensionality reduction method than other eigen-decomposition-based methods, and achieves performance comparable to optimization-based methods while running faster. An open-source IKD implementation in Python is available at \url{https://github.com/JerrySoybean/ikd}.
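For intuition, the sketch below illustrates the core idea the abstract describes: estimate the sample covariance between data points, invert a kernel to turn covariances into latent distances, and recover the latents via an eigen-decomposition. This is a minimal illustration under assumptions of our own choosing (a squared-exponential kernel with known hyperparameters, a clipping threshold for noisy entries, and a classical-MDS embedding step); the function name `ikd_sketch` is hypothetical, and the authors' actual implementation is in the linked repository.

```python
import numpy as np

def ikd_sketch(Y, latent_dim=2, variance=1.0, lengthscale=1.0):
    """Illustrative sketch (not the authors' implementation) of an
    inverse-kernel-decomposition-style estimator.

    Y: array of shape (n_points, n_observed_dims); rows are data points.
    """
    # Sample covariance between data points, averaged over observed dimensions.
    C = np.cov(Y)

    # Invert an assumed squared-exponential kernel
    #   k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))
    # to map covariances to squared latent distances. Clipping keeps the
    # log well-defined for noisy or weakly correlated entries.
    r = np.clip(C / variance, 1e-12, 1.0)
    D2 = -2.0 * lengthscale**2 * np.log(r)

    # Classical MDS: double-center the squared distances, eigen-decompose,
    # and keep the top `latent_dim` components as the latent estimates.
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:latent_dim]
    return eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))

# Toy usage: noisy high-dimensional observations generated from a 1-D latent.
rng = np.random.default_rng(0)
z = np.linspace(-2, 2, 100)[:, None]            # true 1-D latents
W = rng.normal(size=(1, 20))                    # random readout weights
Y = np.tanh(z @ W) + 0.05 * rng.normal(size=(100, 20))
Z_hat = ikd_sketch(Y, latent_dim=1)             # estimated latents, shape (100, 1)
```

The blockwise and geodesic variants mentioned in the abstract address the regime where many covariance entries are near zero (weak correlations), in which case the log-inversion above becomes unreliable for distant pairs; they instead rely on locally correlated points. See the repository for the actual algorithms.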
