Paper Title
NasHD: Efficient ViT Architecture Performance Ranking using Hyperdimensional Computing
Paper Authors
Paper Abstract
Neural Architecture Search (NAS) is an automated architecture engineering method for deep learning design automation, serving as an alternative to the manual and error-prone process of model development, selection, evaluation, and performance estimation. However, one major obstacle to NAS is its extremely demanding computational resource requirements and time-consuming iterations, particularly as the dataset scales. In this paper, targeting the emerging vision transformer (ViT), we present NasHD, a hyperdimensional computing (HDC) based supervised learning model that ranks performance given the architectures and configurations. Unlike other learning-based methods, NasHD is faster thanks to the highly parallel processing of the HDC architecture. We also evaluate two HDC encoding schemes for NasHD, Gram-based and Record-based, in terms of performance and efficiency. On the VIMER-UFO benchmark dataset of 8 applications from a diverse range of domains, NasHD Record can rank the performance of nearly 100K vision transformer models in about 1 minute while still achieving results comparable to sophisticated models.
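As a rough illustration of the record-based encoding scheme mentioned in the abstract, the minimal NumPy sketch below shows how a normalized architecture configuration vector could be mapped to a bipolar hypervector by binding per-feature ID hypervectors with quantized level hypervectors and bundling the results. This is a generic HDC record-based encoding, not the authors' released code; the dimensionality, feature count, and number of quantization levels are assumptions chosen for illustration.

```python
# Sketch of record-based hyperdimensional encoding (illustrative only).
import numpy as np

D = 10_000          # hypervector dimensionality (assumed)
N_FEATURES = 12     # e.g. ViT config fields: depth, heads, MLP ratio, ... (assumed)
N_LEVELS = 32       # quantization levels for continuous feature values (assumed)

rng = np.random.default_rng(0)

# One random bipolar "ID" hypervector per feature position.
id_hvs = rng.choice([-1, 1], size=(N_FEATURES, D))

# Correlated "level" hypervectors: start from a random vector and flip a small
# slice per level, so that nearby levels remain similar (standard construction).
level_hvs = np.empty((N_LEVELS, D), dtype=np.int8)
level_hvs[0] = rng.choice([-1, 1], size=D)
flips_per_level = D // (2 * (N_LEVELS - 1))
for i in range(1, N_LEVELS):
    level_hvs[i] = level_hvs[i - 1]
    idx = rng.choice(D, size=flips_per_level, replace=False)
    level_hvs[i, idx] *= -1

def encode_record(config):
    """Record-based encoding: bind each feature's level HV with its ID HV, then bundle."""
    levels = np.clip((config * (N_LEVELS - 1)).astype(int), 0, N_LEVELS - 1)
    bound = id_hvs * level_hvs[levels]     # elementwise binding, shape (N_FEATURES, D)
    return np.sign(bound.sum(axis=0))      # bundling via majority sign

# Example: one architecture configuration, normalized to [0, 1] per feature.
config = rng.random(N_FEATURES)
hv = encode_record(config)
```

In a ranking setup like the one described in the abstract, such hypervectors would then be compared (e.g. by cosine similarity) against class or score prototypes learned from training architectures; that downstream step is omitted here.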