Abstract:
As important facilities for water resource regulation and flood control, earth-rock dams have a safety and stability that is directly related to the safety of people's lives and property. In recent years, extreme rainstorms have occurred frequently due to the impact of global climate change and human activities, leading to greater uncertainty in the change rate of reservoir water level and placing higher demands on the safe operation of earth-rock dams. In particular, when facing sudden changes in reservoir water level, the seepage field of earth-rock dams exhibits typical time-varying and nonlinear characteristics, making it difficult to accurately predict and diagnose their seepage behavior using traditional methods. Therefore, accurately predicting and effectively monitoring the seepage state of earth-rock dams has become one of the key challenges in ensuring the safe operation of such dams. Previous research on seepage monitoring models for earth-rock dams often overlooked the rapid changes in reservoir water level that may occur in practice and ignored the impact of the water level change rate on the seepage state. In fact, the change rate of reservoir water level has a significant influence on the seepage field of earth-rock dams. Existing studies show that when the reservoir water level changes rapidly, the stress state and pore pressure within the dam change accordingly, in proportion to the rate of water level change. It is therefore necessary to consider both the change rate of reservoir water level and its lag effect to improve the reliability and accuracy of the seepage monitoring model, which is of great significance for evaluating the safety performance of earth-rock dams under extreme conditions, particularly in cases of sudden water level changes. To enhance the prediction accuracy of seepage monitoring models under rapidly changing reservoir water levels, this paper proposes an improved seepage monitoring model for earth-rock dams.
The proposed model integrates the change rate of reservoir water level with a lag effect function to optimize the calculation method of the water pressure component in traditional seepage models, thereby more accurately reflecting the influence of reservoir water level changes on the seepage of earth-rock dams. Based on measured seepage monitoring data from three sets of observation points (S2, S8, and S16) in a reservoir, the effectiveness of the proposed model is validated, and a comparative analysis with existing methods is presented. The results show that, in terms of fitting performance, the multiple correlation coefficient of the proposed model improved significantly. In terms of prediction accuracy, the proposed model demonstrates good adaptability and predictive capability. The maximum prediction errors for points S2, S8, and S16 are 0.32, 0.41, and 0.40 m, respectively. For observation point S2, the mean absolute error (e_MAE), mean absolute percentage error (e_MAPE), and root mean square error (e_RMSE) decreased by 34.5%, 33.3%, and 36.1%, respectively. For observation point S8, the e_MAE, e_MAPE, and e_RMSE decreased by 19.3%, 19.2%, and 25.8%, respectively. For observation point S16, the e_MAE, e_MAPE, and e_RMSE decreased by 47.1%, 48.3%, and 55.5%, respectively. Particularly in cases of sudden changes in reservoir water level, the proposed model can still adapt well to the dynamic changes in the seepage state of earth-rock dams, effectively capturing the steep increases and sudden decreases in piezometer water levels. Across the three sets of observation points, the e_MAE, e_MAPE, and e_RMSE decreased by 42.6%–58.0%, 42.7%–58.2%, and 42.1%–55.2%, respectively. Therefore, the model proposed in this paper is feasible for seepage monitoring of earth-rock dams. Compared with existing models, both the regression and prediction accuracy of the proposed model improved significantly. The research results can provide a theoretical basis and technical support for evaluating the safety performance of earth-rock dams under extreme conditions.
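The idea of replacing the instantaneous reservoir level in the water pressure component with a lagged equivalent level plus a change-rate term can be sketched as follows. This is a minimal illustration only: the normal-density lag function and the parameters `x1` (lag in days) and `x2` (spread of the lag curve) are common choices in hysteresis-effect seepage models and are assumptions here, not the paper's exact formulation.

```python
import numpy as np

def equivalent_water_level(levels, x1=5.0, x2=2.0):
    """Lagged equivalent reservoir level for each day.

    levels : daily reservoir water levels, oldest first
    x1     : peak lag of the influence curve in days (assumed)
    x2     : spread of the lag curve in days (assumed)

    Each day's equivalent level is a weighted average of the
    preceding levels, with weights from a normal-density lag
    function normalized to sum to 1.
    """
    levels = np.asarray(levels, dtype=float)
    n = len(levels)
    out = np.empty(n)
    for t in range(n):
        taus = np.arange(t + 1)                      # lags 0..t days
        w = np.exp(-((taus - x1) ** 2) / (2.0 * x2 ** 2))
        w /= w.sum()                                 # normalize weights
        out[t] = np.dot(w, levels[t - taus])         # weighted past levels
    return out

def level_change_rate(levels):
    """Daily change rate of the reservoir level (first difference)."""
    levels = np.asarray(levels, dtype=float)
    return np.diff(levels, prepend=levels[0])
```

In an HST-type statistical monitoring model, the two series returned above would enter the regression as the water pressure factors, alongside the usual temperature and aging terms; both would be fitted against the measured piezometer levels at points such as S2, S8, and S16.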