Paper Title

Active Tuning

Paper Authors

Sebastian Otte, Matthias Karlbauer, Martin V. Butz

Paper Abstract

We introduce Active Tuning, a novel paradigm for optimizing the internal dynamics of recurrent neural networks (RNNs) on the fly. In contrast to the conventional sequence-to-sequence mapping scheme, Active Tuning decouples the RNN's recurrent neural activities from the input stream, using the unfolding temporal gradient signal to tune the internal dynamics into the data stream. As a consequence, the model output depends only on its internal hidden dynamics and the closed-loop feedback of its own predictions; its hidden state is continuously adapted by means of the temporal gradient resulting from backpropagating the discrepancy between the signal observations and the model outputs through time. In this way, Active Tuning infers the signal actively but indirectly based on the originally learned temporal patterns, fitting the most plausible hidden state sequence into the observations. We demonstrate the effectiveness of Active Tuning on several time series prediction benchmarks, including multiple superimposed sine waves, a chaotic double pendulum, and spatiotemporal wave dynamics. Active Tuning consistently improves the robustness, accuracy, and generalization abilities of all evaluated models. Moreover, networks trained for signal prediction and denoising can be successfully applied to a much larger range of noise conditions with the help of Active Tuning. Thus, given a capable time series predictor, Active Tuning enhances its online signal filtering, denoising, and reconstruction abilities without the need for additional training.
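To make the scheme concrete, below is a minimal PyTorch sketch of one Active Tuning update, read directly from the abstract's description. It is an illustrative interpretation, not the authors' reference implementation: the predictor interface, the retrospective window, and the hyperparameters `tune_steps` and `eta` are all assumptions introduced here.

```python
import torch

class Predictor(torch.nn.Module):
    """Toy one-step predictor (GRU cell + linear readout), standing in
    for a pretrained time series model. Hypothetical; not from the paper."""
    def __init__(self, dim, hidden):
        super().__init__()
        self.cell = torch.nn.GRUCell(dim, hidden)
        self.readout = torch.nn.Linear(hidden, dim)

    def forward(self, x, h):
        h_next = self.cell(x, h)
        return self.readout(h_next), h_next

def active_tuning_step(model, h, y, observations, tune_steps=10, eta=0.01):
    """Adapt the hidden state h by gradient descent on the discrepancy
    between closed-loop predictions and the recent (noisy) observations.

    model:        called as model(input, hidden) -> (prediction, hidden)
    h:            hidden state at the start of the retrospective window
    y:            model output preceding the window (closed-loop seed)
    observations: tensor of shape (window, batch, dim)
    """
    for _ in range(tune_steps):
        h_opt = h.detach().requires_grad_(True)
        y_t, h_t = y.detach(), h_opt
        loss = torch.zeros(())
        for obs in observations:
            # Closed loop: the model's own prediction is its next input;
            # the noisy observation never enters the forward pass directly.
            y_t, h_t = model(y_t, h_t)
            loss = loss + torch.mean((y_t - obs) ** 2)
        # Backpropagate the prediction error through time into the
        # initial hidden state and take one gradient step on it.
        (grad,) = torch.autograd.grad(loss, h_opt)
        h = h_opt - eta * grad
    return h.detach()

# Usage sketch: tune the hidden state against a noisy sine window
# (the model is untrained here purely for brevity).
dim, hidden, window = 1, 32, 20
model = Predictor(dim, hidden)          # in practice: a trained predictor
signal = torch.sin(torch.linspace(0.0, 3.0, window)).view(window, 1, dim)
noisy = signal + 0.1 * torch.randn(window, 1, dim)
h_tuned = active_tuning_step(model, torch.zeros(1, hidden),
                             torch.zeros(1, dim), noisy)
```

The essential design choice, per the abstract, is that observations influence the model only through this gradient signal: the forward pass runs purely in closed loop, so the tuned hidden state stays consistent with the temporal patterns the RNN learned during training, which is what enables denoising and reconstruction without retraining.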
