Paper Title
The Teaching Dimension of Kernel Perceptron
Paper Authors
Paper Abstract
Algorithmic machine teaching has been studied under the linear setting where exact teaching is possible. However, little is known about teaching nonlinear learners. Here, we establish the sample complexity of teaching, a.k.a. the teaching dimension, for kernelized perceptrons under different families of feature maps. As a warm-up, we show that the teaching complexity is $\Theta(d)$ for the exact teaching of linear perceptrons in $\mathbb{R}^d$, and $\Theta(d^k)$ for kernel perceptrons with a polynomial kernel of order $k$. Furthermore, under certain smoothness assumptions on the data distribution, we establish a rigorous bound on the complexity of approximately teaching a Gaussian kernel perceptron. We provide numerical examples of the optimal (approximate) teaching sets under several canonical settings for linear, polynomial, and Gaussian kernel perceptrons.
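To make the notion of a teaching set concrete, below is a minimal, self-contained sketch (not the paper's construction) of teaching a polynomial kernel perceptron of order $k = 2$. A standard dual-form perceptron learner is trained on a hypothetical four-point teaching set for the target concept $\mathrm{sign}(x_1 x_2)$ in $\mathbb{R}^2$; the learner, kernel, and teaching set are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kernel_perceptron(X, y, kernel, max_epochs=100):
    """Dual-form (kernelized) perceptron: alpha[i] counts the mistakes
    made on teaching example i; training stops once the learner is
    consistent with the entire teaching set."""
    n = len(X)
    K = np.array([[kernel(a, b) for b in X] for a in X])
    alpha = np.zeros(n)
    for _ in range(max_epochs):
        mistakes = 0
        for i in range(n):
            if y[i] * np.dot(alpha * y, K[:, i]) <= 0:  # mistake (or zero margin)
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:
            break
    return alpha

def poly2(a, b):
    """Homogeneous polynomial kernel of order k = 2."""
    return np.dot(a, b) ** 2

# Hypothetical teaching set for the target concept sign(x1 * x2) in R^2.
# Its size (4) matches the Theta(d^k) scaling for d = k = 2.
X = np.array([[1.0, 1.0], [-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

alpha = kernel_perceptron(X, y, poly2)

def predict(x):
    # f(x) = sum_i alpha_i * y_i * K(x_i, x)
    return np.sign(sum(a * yi * poly2(xi, x) for a, yi, xi in zip(alpha, y, X)))

# Check: the learned classifier agrees with sign(x1 * x2) on random points.
rng = np.random.default_rng(0)
test = rng.standard_normal((1000, 2))
agree = np.mean([predict(x) == np.sign(x[0] * x[1]) for x in test])
print(f"agreement with target on random test points: {agree:.3f}")  # expect 1.000
```

On this teaching set the learner converges to $f(x) = (x_1 + x_2)^2 - (x_1 - x_2)^2 = 4 x_1 x_2$, i.e., exactly the target direction in the degree-2 feature space, which is the sense in which a small teaching set can pin down a nonlinear learner.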