Paper Title

Activation functions are not needed: the ratio net

Paper Authors

Zhou, Chi-Chun, Tu, Hai-Long, Hou, Yue-Jie, Ling, Zhen, Liu, Yi, Hua, Jian

Paper Abstract

A deep neural network for classification tasks essentially consists of two components: a feature extractor and a function approximator. They usually work as an integrated whole; however, an improvement to either component can boost the performance of the whole algorithm. This paper focuses on designing a new function approximator. Conventionally, a function approximator is built from nonlinear activation functions or nonlinear kernel functions, yielding classical networks such as the feed-forward neural network (MLP) and the radial basis function network (RBF). In this paper, a new function approximator that is both effective and efficient is proposed. Instead of designing new activation functions or kernel functions, the proposed network uses a fractional form; for convenience, we name it the ratio net. We compare the effectiveness and efficiency of the ratio net against the RBF and the MLP with various activation functions on classification tasks over the MNIST database of handwritten digits and the Internet Movie Database (IMDb), a binary sentiment analysis dataset. The results show that, in most cases, the ratio net converges faster and outperforms both the MLP and the RBF.
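The abstract does not spell out the exact fractional form. As a minimal illustrative sketch only, one plausible reading is a layer whose output is an elementwise ratio of two affine maps, so that no activation function is applied; the parameter names, layer shape, and the eps stabilizer below are our assumptions, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(0)

def ratio_layer(x, w_num, b_num, w_den, b_den, eps=1e-6):
    """Hypothetical 'ratio' layer: the output is an elementwise quotient
    of two affine maps, replacing the usual nonlinear activation.
    The eps term keeping the denominator away from zero is our own
    numerical-stability choice, not taken from the paper."""
    num = x @ w_num + b_num
    den = x @ w_den + b_den
    return num / (np.abs(den) + eps)

# Toy forward pass: a batch of 4 inputs with 8 features, 3 output classes.
x = rng.normal(size=(4, 8))
w_num = 0.1 * rng.normal(size=(8, 3))
b_num = 0.1 * rng.normal(size=3)
w_den = 0.1 * rng.normal(size=(8, 3))
b_den = 0.1 * rng.normal(size=3)
logits = ratio_layer(x, w_num, b_num, w_den, b_den)
print(logits.shape)  # (4, 3)
```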
