Publication | Closed Access
Precipitation Nowcasting Using Diffusion Transformer With Causal Attention
Year: 2024 | Citations: 30 | References: 17
Short-term precipitation forecasting remains challenging due to the difficulty of capturing long-term spatiotemporal dependencies. Current deep learning methods fall short in establishing effective dependencies between conditions and forecast results, and they also lack interpretability. To address these issues, we propose a precipitation nowcasting model using a diffusion transformer with causal attention (DTCA). Our model leverages the transformer and incorporates a causal attention mechanism to establish spatiotemporal queries between conditional information (causes) and forecast results (effects). This design enables the model to capture long-term dependencies effectively, allowing forecast results to maintain strong causal relationships with the input conditions over a wide range of time and space. We explore four variants of spatiotemporal information interaction for DTCA and demonstrate that global spatiotemporal labeling interaction yields the best performance. In addition, we introduce a channel-to-batch shift (CTBS) operation to further enhance the model's ability to represent complex rainfall dynamics. We conducted experiments on two datasets. Compared to state-of-the-art U-Net-based methods, our approach improves the critical success index (CSI) for heavy-precipitation prediction by approximately 15% and 8%, respectively, achieving state-of-the-art performance. Our project is open source and available on GitHub at <uri xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">https://github.com/ybu-lxd/DTCA</uri>.
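The abstract names a channel-to-batch shift (CTBS) operation but does not define it; see the linked repository for the authors' implementation. As an illustration only, a minimal sketch of one plausible reading — folding channel groups into the batch dimension so that subsequent layers process them independently — might look as follows (the function name, `groups` parameter, and tensor layout are all assumptions, not taken from the paper):

```python
import numpy as np

def channel_to_batch_shift(x, groups):
    """Hypothetical CTBS sketch: fold channel groups into the batch axis.

    x: array of shape (B, C, H, W); returns (B * groups, C // groups, H, W).
    """
    b, c, h, w = x.shape
    assert c % groups == 0, "channel count must divide evenly into groups"
    # (B, C, H, W) -> (B, G, C/G, H, W) -> (B*G, C/G, H, W)
    return x.reshape(b, groups, c // groups, h, w).reshape(
        b * groups, c // groups, h, w
    )

# Example: 2 samples with 8 channels, split into 4 groups.
x = np.zeros((2, 8, 16, 16))
y = channel_to_batch_shift(x, groups=4)
print(y.shape)  # (8, 2, 16, 16)
```

The inverse (reshaping back from batch to channels) would be applied after the grouped computation to restore the original layout.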