PP: A dual-stage attention-based recurrent neural network for time series prediction
2021-04-19 16:28
Problem: time series prediction.

The nonlinear autoregressive exogenous (NARX) model predicts the current value of a time series from its previous values together with the current and past values of multiple driving (exogenous) series. However, few NARX models can appropriately capture long-term temporal dependencies or select the relevant driving series for prediction.

Two issues:
1. capturing long-term temporal dependencies;
2. selecting the relevant driving series to make a prediction.

The authors propose a dual-stage attention-based RNN (DA-RNN) to address these two issues (a sketch of the two stages follows below):
1. First stage: an input attention mechanism extracts the relevant driving series at each time step.
2. Second stage: a temporal attention mechanism selects relevant encoder hidden states across all time steps.

Background: attention-based encoder-decoder networks (LSTM/GRU) for time series prediction. One problem with plain encoder-decoder networks is that their performance deteriorates rapidly as the length of the input sequence increases.

Contribution: the two-stage attention mechanism, i.e. input attention over the driving series and temporal attention over all time stamps. The input attention selects the relevant driving series, while the temporal attention captures the temporal information.

Supplementary knowledge:
1. What is a driving series? A driving (exogenous) series is an external input series whose current and past values are used, alongside the target series' own history, to predict the target.
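Roughly, the NARX goal is to learn ŷ_T = F(y_1, …, y_{T−1}, x_1, …, x_T), where x_t collects the driving series at time t. Below is a minimal PyTorch sketch of the two attention stages; the layer sizes, the simplified attention scoring functions, and the decoder input (using only the past target values rather than the paper's combination of target and context) are assumptions for illustration, not the paper's exact equations.

```python
# Minimal sketch of a dual-stage attention RNN (assumed, simplified layer sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualStageAttnRNN(nn.Module):
    def __init__(self, n_series, enc_hidden=64, dec_hidden=64, window=10):
        super().__init__()
        self.window = window
        # Stage 1: input attention scores each driving series at every step.
        self.input_attn = nn.Linear(2 * enc_hidden + window, 1)
        self.encoder = nn.LSTMCell(n_series, enc_hidden)
        # Stage 2: temporal attention scores each encoder hidden state.
        self.temporal_attn = nn.Linear(2 * dec_hidden + enc_hidden, 1)
        self.decoder = nn.LSTMCell(1, dec_hidden)
        self.out = nn.Linear(dec_hidden + enc_hidden, 1)

    def forward(self, x, y_prev):
        # x: (batch, window, n_series) driving series
        # y_prev: (batch, window, 1) past values of the target series
        B, T, N = x.shape
        h = x.new_zeros(B, self.encoder.hidden_size)
        c = x.new_zeros(B, self.encoder.hidden_size)
        enc_states = []
        for t in range(T):
            # Input attention: weight each of the N driving series using the
            # current encoder state and that series' whole input window.
            state_rep = torch.cat([h, c], dim=1).unsqueeze(1).expand(B, N, -1)
            series = x.permute(0, 2, 1)                          # (B, N, T)
            score = self.input_attn(torch.cat([state_rep, series], dim=2))
            alpha = F.softmax(score.squeeze(-1), dim=1)          # (B, N)
            h, c = self.encoder(alpha * x[:, t, :], (h, c))
            enc_states.append(h)
        enc = torch.stack(enc_states, dim=1)                     # (B, T, enc_hidden)

        d = x.new_zeros(B, self.decoder.hidden_size)
        s = x.new_zeros(B, self.decoder.hidden_size)
        for t in range(T):
            # Temporal attention: weight all encoder states with the decoder state.
            state_rep = torch.cat([d, s], dim=1).unsqueeze(1).expand(B, T, -1)
            score = self.temporal_attn(torch.cat([state_rep, enc], dim=2))
            beta = F.softmax(score.squeeze(-1), dim=1)           # (B, T)
            context = (beta.unsqueeze(-1) * enc).sum(dim=1)      # (B, enc_hidden)
            d, s = self.decoder(y_prev[:, t, :], (d, s))
        return self.out(torch.cat([d, context], dim=1))          # one-step prediction


# Hypothetical usage: batch of 8, window of 10, 5 driving series.
model = DualStageAttnRNN(n_series=5)
pred = model(torch.randn(8, 10, 5), torch.randn(8, 10, 1))       # (8, 1)
```

The point of the sketch is only the structure: the first softmax re-weights the driving series before they enter the encoder, and the second softmax re-weights the encoder hidden states before the final prediction, which is what lets the model cope with long input sequences better than a plain encoder-decoder.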