Abstract:
Accurate prediction of dam deformation is paramount for structural health monitoring, operational safety, and risk mitigation of hydraulic infrastructure. However, achieving high precision remains challenging because deformation monitoring sequences are inherently complex, exhibiting significant noise interference and pronounced nonlinear features. To address these challenges and enhance predictive performance, this paper proposes a novel hybrid forecasting model integrating Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (ICEEMDAN), Elite Breed Quantum Particle Swarm Optimization (EBQPSO), Least Squares Support Vector Machine (LSSVM), and Long Short-Term Memory (LSTM) networks, denoted as the ICEEMDAN‑EBQPSO‑LSSVM‑LSTM model.
The proposed methodology follows a multi-stage decomposition–prediction–integration framework. In the first stage, the raw, noisy dam deformation time series is decomposed by ICEEMDAN into a finite set of relatively stationary, simpler, and more predictable sub-components (intrinsic mode functions, IMFs) together with a residual trend component, thereby mitigating the noise and non-stationarity inherent in the original data. This decomposition step simplifies the subsequent prediction tasks by isolating the different frequency characteristics and underlying trends contained in the complex signal.
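A minimal sketch of this decomposition stage is given below. It is an illustration under assumptions, not the paper's implementation: it uses a synthetic deformation series and the CEEMDAN class from the open-source PyEMD package (distributed as EMD-signal) as a stand-in for ICEEMDAN, which PyEMD does not ship; the ensemble size and noise amplitude shown are placeholders rather than values from the paper.

```python
import numpy as np
from PyEMD import CEEMDAN  # pip install EMD-signal; CEEMDAN used here as a stand-in for ICEEMDAN

# Synthetic stand-in for a measured dam deformation series (trend + periodicity + noise)
t = np.linspace(0.0, 10.0, 1000)
deformation = 0.5 * t + 2.0 * np.sin(2 * np.pi * 0.3 * t) + 0.2 * np.random.randn(t.size)

decomposer = CEEMDAN(trials=100, epsilon=0.005)   # ensemble size and noise scale are illustrative
imfs = decomposer(deformation)                    # array of shape (n_imfs, len(deformation))
residual = deformation - imfs.sum(axis=0)         # remaining trend component

print(f"Obtained {imfs.shape[0]} IMFs plus a residual trend component")
```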
In the second stage, each decomposed sub-component (IMF) undergoes preliminary prediction. LSSVM models are employed for this task because they handle nonlinear regression well and, by reformulating the SVM quadratic program as a system of linear equations, obtain a global solution at lower computational cost than standard Support Vector Machines (SVMs). However, the predictive performance of LSSVM is highly sensitive to its hyperparameters, notably the regularization parameter (γ) and the kernel parameter (σ²) of the Radial Basis Function (RBF) kernel. To overcome the limitations of manual or suboptimal parameter tuning, this paper introduces the EBQPSO algorithm. EBQPSO enhances basic Quantum‑behaved Particle Swarm Optimization (QPSO) with improved mechanisms for particle encoding, position updating, and convergence control, allowing it to explore the hyperparameter space more effectively and efficiently. It is applied to automatically optimize the critical hyperparameters of the LSSVM model for each individual IMF sub-component, ensuring that every model attains the configuration best suited to the characteristics of its particular sub-signal.
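The sketch below illustrates this stage under simplifying assumptions: an LS-SVM regressor with an RBF kernel trained in closed form, and a plain QPSO loop searching over (γ, σ²) by minimizing validation RMSE. The elite-breeding operators that distinguish EBQPSO from basic QPSO are described in the paper and are not reproduced here; all bounds, swarm sizes, and iteration counts are placeholders.

```python
import numpy as np

def rbf_kernel(A, B, sigma2):
    """RBF kernel K(a, b) = exp(-||a - b||^2 / (2 * sigma2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma2))

def lssvm_train(X, y, gamma, sigma2):
    """Solve the LS-SVM linear system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma2) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                            # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, sigma2, X_new):
    return rbf_kernel(X_new, X_train, sigma2) @ alpha + b

def qpso_tune(X_tr, y_tr, X_val, y_val, n_particles=20, n_iter=50,
              bounds=((1e-2, 1e3), (1e-2, 1e2))):
    """Plain QPSO search over (gamma, sigma2) minimizing validation RMSE (no elite breeding)."""
    lo = np.array([b[0] for b in bounds]); hi = np.array([b[1] for b in bounds])
    pos = lo + np.random.rand(n_particles, 2) * (hi - lo)
    pbest, pbest_cost = pos.copy(), np.full(n_particles, np.inf)

    def cost(p):
        b, a = lssvm_train(X_tr, y_tr, p[0], p[1])
        pred = lssvm_predict(X_tr, b, a, p[1], X_val)
        return np.sqrt(np.mean((pred - y_val) ** 2))

    for it in range(n_iter):
        for i in range(n_particles):
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest_cost[i], pbest[i] = c, pos[i].copy()
        gbest = pbest[np.argmin(pbest_cost)]
        mbest = pbest.mean(axis=0)                    # mean of personal best positions
        beta = 1.0 - 0.5 * it / n_iter                # contraction-expansion coefficient
        phi = np.random.rand(n_particles, 2)
        p_local = phi * pbest + (1 - phi) * gbest     # local attractor per particle
        u = np.random.rand(n_particles, 2)
        sign = np.where(np.random.rand(n_particles, 2) < 0.5, -1.0, 1.0)
        pos = np.clip(p_local + sign * beta * np.abs(mbest - pos) * np.log(1.0 / u), lo, hi)
    return pbest[np.argmin(pbest_cost)]
```

In the full model, such a search would be run independently for each IMF, typically with lagged values of the sub-series supplied as the regression features.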
Despite the optimized LSSVM predictions for each IMF, the preliminary forecasts inevitably contain residual errors, arising from imperfect decomposition, limits of model generalization, and inherent stochasticity not fully captured by the EBQPSO‑optimized LSSVM models. Recognizing that these residuals often exhibit complex temporal dependencies, the third stage employs an LSTM network. LSTM networks, a specialized type of recurrent neural network (RNN), are renowned for their ability to learn long‑range temporal dependencies and sequential patterns in time series data. Here, an LSTM model is trained specifically to forecast the residual sequence, i.e., the difference between the observed deformation series and the aggregated preliminary predictions of the ICEEMDAN sub-components. This residual correction step captures and compensates for intricate temporal patterns overlooked by the initial prediction stage, thereby further enhancing the overall accuracy of the model.
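A compact sketch of this residual-correction stage follows, assuming synthetic stand-ins for the observed series and the aggregated preliminary predictions and using a single-layer Keras LSTM; the look-back window, layer width, and training schedule are illustrative only.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-ins: `observed` mimics the measured deformation series and
# `prelim` the aggregated EBQPSO-LSSVM preliminary predictions over the same period.
rng = np.random.default_rng(0)
observed = np.cumsum(rng.normal(size=500))
prelim = observed + rng.normal(scale=0.1, size=500)
residual = observed - prelim                      # sequence the LSTM learns to forecast

lag = 12                                          # look-back window length (illustrative)
X = np.stack([residual[i:i + lag] for i in range(len(residual) - lag)])[..., None]
y = residual[lag:].reshape(-1, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(lag, 1)),
    tf.keras.layers.LSTM(32),                     # layer width is illustrative
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, batch_size=16, verbose=0)

# Final forecast = preliminary prediction + predicted residual correction
residual_hat = model.predict(X, verbose=0).ravel()
corrected = prelim[lag:] + residual_hat
```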