Time Series Forecasting with Multi-Headed Attention-Based Deep Learning for Residential Energy Consumption
- Authors
- Bu, Seok-Jun; Cho, Sung-Bae
- Issue Date
- Sep-2020
- Publisher
- Multidisciplinary Digital Publishing Institute (MDPI)
- Keywords
- convolutional recurrent neural network; multi-headed attention; time-series forecasting; energy consumption prediction
- Citation
- Energies, v.13, no.18
- Indexed
- SCIE; SCOPUS
- Journal Title
- Energies
- Volume
- 13
- Number
- 18
- URI
- https://scholarworks.gnu.ac.kr/handle/sw.gnu/73646
- DOI
- 10.3390/en13184722
- ISSN
- 1996-1073
- Abstract
- Predicting residential energy consumption is tantamount to forecasting a multivariate time series. From a specific window of several sensor signals, various features can be extracted to forecast energy consumption with a prediction model. However, this remains a challenging task because of irregular patterns within the data, including hidden correlations between power attributes. In order to extract the complicated irregular energy patterns and selectively learn the spatiotemporal features that reduce the translational variance between energy attributes, we propose a deep learning model that combines multi-headed attention with a convolutional recurrent neural network. It exploits attention scores, calculated with softmax and dot-product operations in the network, to model the transient and impulsive nature of energy demand. Experiments on the University of California, Irvine (UCI) household electric power consumption dataset, consisting of a total of 2,075,259 time-series measurements, show that the proposed model reduces the prediction error by 31.01% compared to the state-of-the-art deep learning model. In particular, multi-headed attention further improves prediction performance by up to 27.91% over single attention.
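The attention mechanism summarized in the abstract — scores obtained from softmax over dot products, computed independently per head — can be sketched as follows. This is a minimal NumPy illustration with randomly initialized projection weights (stand-ins for what a trained model would learn), not the authors' implementation; shapes such as 24 time steps and 8 power attributes are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads, rng):
    """X: (seq_len, d_model) window of multivariate sensor features."""
    seq_len, d_model = X.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    # Illustrative random projections; a trained model learns these weights.
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                  for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(num_heads):
        s = slice(h * d_k, (h + 1) * d_k)
        # Attention scores: scaled dot product followed by softmax.
        scores = softmax(Q[:, s] @ K[:, s].T / np.sqrt(d_k))
        heads.append(scores @ V[:, s])
    # Concatenate per-head outputs back to (seq_len, d_model).
    return np.concatenate(heads, axis=1)

rng = np.random.default_rng(0)
X = rng.standard_normal((24, 8))  # e.g., a 24-step window of 8 power attributes
out = multi_head_attention(X, num_heads=4, rng=rng)
print(out.shape)  # (24, 8)
```

Each head attends over the full window but only a `d_k`-dimensional slice of the features, which is what lets multiple heads specialize on different correlations between power attributes.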
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- ETC > Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.