
Publication | Open Access

Hybrid transformer model with liquid neural networks and learnable encodings for buildings’ energy forecasting

Year: 2025 · Citations: 24 · References: 41

Abstract

Highlights

• Hybrid Transformer with Liquid Neural Networks for building energy forecasting.
• Convolutional Neural Network encoder that captures temporal dynamics in energy data through spatial mappings.
• Reservoir processing module implemented with Liquid Neural Networks to capture non-linear relations in energy data.
• Validation across diverse building contexts, from large apartment buildings to small households.

Accurate forecasting of buildings' energy demand is essential for building operators to manage loads and resources efficiently, and for grid operators to balance local production with demand. However, current models still struggle to capture nonlinear relationships driven by external factors such as weather and consumer behavior, often assume constant variance in energy data over time, and frequently fail to model sequential dependencies. To address these limitations, we propose a hybrid Transformer-based model with Liquid Neural Networks and learnable encodings for building energy forecasting. The model uses Dense Layers to learn non-linear mappings that produce embeddings capturing underlying patterns in time-series energy data. A Convolutional Neural Network encoder is integrated to strengthen the model's ability to capture temporal dynamics through spatial mappings. To overcome the limitations of classic attention mechanisms, we implement a reservoir processing module based on Liquid Neural Networks, which introduces a controlled non-linearity through dynamic reservoir computing and enables the model to capture complex patterns in the data. For evaluation, we used both pilot data and state-of-the-art datasets to assess performance across various building contexts, including large apartment and commercial buildings and small households, with and without on-site energy production.
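The reservoir processing module described above can be illustrated with a minimal liquid time-constant (LTC) style state update in NumPy. This is an illustrative sketch of the general mechanism only, not the authors' implementation: the layer sizes, weight initialization, forward-Euler discretization, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class LiquidReservoir:
    """Sketch of a liquid time-constant (LTC) style reservoir.

    The hidden state follows dx/dt = -(1/tau + f(x, u)) * x + f(x, u) * A,
    so the effective time constant depends on the current input and state;
    this input-dependent ("liquid") dynamics is the controlled non-linearity
    the hybrid model's reservoir module is meant to introduce.
    Integrated here with a simple forward-Euler step.
    """

    def __init__(self, n_in, n_hidden, dt=0.1):
        self.dt = dt
        self.W_in = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.W_rec = rng.normal(0.0, 0.5, (n_hidden, n_hidden))
        self.b = np.zeros(n_hidden)
        self.A = rng.normal(0.0, 1.0, n_hidden)  # target biases
        self.tau = np.ones(n_hidden)             # base time constants

    def step(self, x, u):
        # gating nonlinearity; its value modulates the decay rate
        f = 1.0 / (1.0 + np.exp(-(self.W_in @ u + self.W_rec @ x + self.b)))
        dx = -(1.0 / self.tau + f) * x + f * self.A
        return x + self.dt * dx

    def run(self, inputs):
        # roll the reservoir over a (T, n_in) input sequence
        x = np.zeros(self.W_rec.shape[0])
        states = []
        for u in inputs:
            x = self.step(x, u)
            states.append(x.copy())
        return np.stack(states)
```

In a full model, the stacked reservoir states would feed the downstream attention/output layers; here they simply expose the state trajectory.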
The proposed model demonstrates strong predictive accuracy and training-time efficiency across building types and testing configurations. SMAPE scores show prediction-error reductions ranging from 1.5% to 50% over basic Transformer, LSTM, and ANN baselines, while higher R² values further confirm the model's reliability in capturing the variance of energy time series. An 8% reduction in training time relative to the basic Transformer highlights the hybrid model's computational efficiency without compromising accuracy.
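The SMAPE scores reported above can be computed as follows. This uses the common symmetric-MAPE definition with the mean of |y| and |ŷ| in the denominator; since the abstract does not spell out the exact variant used, that choice is an assumption.

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric mean absolute percentage error, in percent.

    Common definition: 100 * mean(|yhat - y| / ((|y| + |yhat|) / 2)).
    A small floor on the denominator guards against division by zero
    when both the actual and predicted values are zero.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    return 100.0 * np.mean(np.abs(y_pred - y_true) / np.maximum(denom, 1e-8))
```

For example, a forecast of 110 against an actual of 100 yields a SMAPE of about 9.52%.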
