In the field of meteorological remote sensing data prediction, capturing both low-frequency trends and high-frequency oscillations poses significant challenges. Conventional models often excel at long-term trend prediction but struggle with short-term oscillatory components, leading to suboptimal performance in highly dynamic atmospheric systems. To address these issues, this study proposes a two-stage framework that combines wavelet transform for signal decomposition with adaptive high- and low-frequency fusion strategies.
First, a wavelet transform decomposes the meteorological data into a low-frequency (trend) component and a high-frequency (oscillation) component. An improved Transformer-based model is then trained independently on each component, effectively capturing their distinct patterns. Subsequently, two fusion strategies are employed to integrate the predictions: (1) Residual Prediction Fusion, which treats the high-frequency model as a residual predictor that refines the low-frequency predictions, and (2) Dynamic Weight Fusion, in which a neural network learns to adjust the weights of the low-frequency and high-frequency components dynamically, based on features of the signal. These fusion methods aim to balance long-term trend stability with sensitivity to short-term variability.
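The pipeline above can be sketched in a few lines. The following is a minimal illustration only, not the paper's implementation: it uses a one-level Haar wavelet (the simplest wavelet basis; the actual wavelet family and decomposition depth are not specified here), stands in for the trained Transformer outputs with plain arrays, and reduces Dynamic Weight Fusion to a softmax over two scalar scores, where the full model would produce those scores with a learned network.

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar split of a signal into a low-frequency
    (trend) part and a high-frequency (oscillation) part."""
    x = np.asarray(x, dtype=float)
    lo = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation: trend
    hi = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail: oscillation
    return lo, hi

def residual_fusion(lo_pred, hi_pred):
    """Strategy 1 (Residual Prediction Fusion): the high-frequency
    model acts as a residual correction on the trend forecast."""
    return lo_pred + hi_pred

def dynamic_weight_fusion(lo_pred, hi_pred, score_lo, score_hi):
    """Strategy 2 (Dynamic Weight Fusion), simplified: the two scalar
    scores stand in for the outputs of a small weighting network;
    a softmax turns them into normalized component weights."""
    w = np.exp([score_lo, score_hi])
    w /= w.sum()
    return w[0] * lo_pred + w[1] * hi_pred

# Toy usage: decompose a short series, then fuse mock predictions.
lo, hi = haar_decompose([1.0, 1.0, 2.0, 4.0])
fused_residual = residual_fusion(lo, hi)
fused_weighted = dynamic_weight_fusion(lo, hi, 0.0, 0.0)  # equal scores
```

With equal scores, the softmax weights are both 0.5, so Dynamic Weight Fusion reduces to a simple average of the two component predictions; the learned network departs from this baseline only where the signal's features favor one component.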
The proposed methodology is designed to enhance prediction accuracy for meteorological remote sensing datasets, such as those from the FY-4A and Himawari-8 satellites, as well as other datasets including ERA5 and Weather. Experimental results demonstrate that the proposed method achieves a 5% to 20% reduction in prediction error across the different datasets, effectively capturing both chaotic and deterministic atmospheric behavior. This improvement highlights the model's potential to deliver more accurate short- to medium-term weather forecasts, thereby advancing the understanding of atmospheric dynamics.