Title

Multi-Objective Optimisation of Multi-Output Neural Trees

Authors

Varun Ojha, Giuseppe Nicosia

Abstract

We propose an algorithm and a new method for tackling classification problems. We propose a multi-output neural tree (MONT) algorithm, an evolutionary learning algorithm trained by the non-dominated sorting genetic algorithm (NSGA)-III. Since evolutionary learning is stochastic, the hypothesis found in the form of a MONT is unique for each run of evolutionary learning, i.e., each generated hypothesis (tree) bears distinct properties compared to any other hypothesis in both topological space and parameter space. This leads to a challenging optimisation problem whose aim is to minimise tree size and maximise classification accuracy. Pareto-optimality of the resulting solutions was therefore assessed through hypervolume indicator analysis. We used nine benchmark classification learning problems to evaluate the performance of the MONT. Our experiments produced MONTs able to tackle the classification problems with high accuracy. On the set of problems tackled in this study, MONT performed better than a set of well-known classifiers: multilayer perceptron, reduced-error pruning tree, naive Bayes classifier, decision tree, and support vector machine. Moreover, comparing three versions of MONT trained with genetic programming, NSGA-II, and NSGA-III suggests that NSGA-III gives the best Pareto-optimal solutions.
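The two competing objectives and the hypervolume indicator mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the candidate trees, their sizes, and the reference point below are hypothetical, and accuracy is converted to classification error so that both objectives are minimised.

```python
def pareto_front(points):
    """Return the non-dominated subset of 2-D minimisation points (tree size, error)."""
    front = []
    for p in points:
        # p is dominated if some other point is at least as good in both objectives
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

def hypervolume_2d(front, ref):
    """Area dominated by a 2-D minimisation front with respect to reference point ref."""
    pts = sorted(front)            # ascending in the first objective (tree size)
    hv, prev_err = 0.0, ref[1]
    for size, err in pts:
        hv += (ref[0] - size) * (prev_err - err)   # rectangle added by this point
        prev_err = err
    return hv

# Hypothetical (tree_size, classification_error) pairs for candidate MONTs
population = [(5, 0.30), (9, 0.12), (15, 0.05), (12, 0.12), (20, 0.05)]
front = pareto_front(population)   # (12, 0.12) and (20, 0.05) are dominated
print(front)
print(hypervolume_2d(front, ref=(25, 0.5)))
```

A larger hypervolume means the front trades off tree size against accuracy better, which is how fronts produced by different trainers (e.g. NSGA-II vs. NSGA-III) can be compared on a single scale.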
