Environmental consequences of deep subsurface drainage in irrigated lands of arid and semi-arid areas with shallow saline groundwater pose a significant threat to sustainable crop production. In this regard, controlled drainage has been introduced as the next logical step towards improving water management in irrigated agriculture and reducing the environmental impacts of subsurface drainage flow. In addition, data collected from field experiments, combined with deterministic agro-hydrological simulation models, offer the opportunity to gain detailed insight into the behavior of the agricultural system in space and time. Deterministic soil and water balance models such as the field-scale Soil-Water-Atmosphere-Plant (SWAP) model quantify all water and salt balance components and their interactions in the soil-water-plant-atmosphere continuum during the simulation period. The accuracy of these predictive models depends on proper identification of the required model input parameters. Moreover, the current edition of the SWAP model does not implement the simulation of controlled subsurface drainage. Despite the capability of field-scale agro-hydrological models to quantify the interactions between soil water flow, salt transport, and crop growth, applying these models under the actual operational conditions of the present case study faces a significant hurdle in reliably characterizing the farming system. Because of the large area of the sugarcane farms of Khuzestan, southern Iran, completing a single irrigation event may take more than five days; hence, representing the agricultural system as a single soil column would be questionable. The main objective of this work was to enable the application of such models under actual operational conditions in large fields with surface and/or subsurface drainage systems, under combined free/controlled subsurface drainage management.
To this aim, a distributed agro-hydrological modeling scheme with sub-daily calibration capability was developed by combining a modified version of the field-scale SWAP model with an improved variant of the Unified Particle Swarm Optimization (UPSO) algorithm.
The source code of the SWAP model was modified and extended to account for the duration of irrigation events, the simulation of intra-daily reference evapotranspiration, intra-daily precipitation interception, ratooning, and the implementation of controlled subsurface drainage during the simulation period. To calibrate the unknown model parameters, the model was coupled with an intra-daily inverse modeling scheme developed via an improved variant of the Unified Particle Swarm Optimization (UPSO) algorithm. The developed modeling scheme was applied to a dataset collected from a field under combined free/controlled (70-cm depth) subsurface drainage management, located at the farms of the Shoaybiyeh Sugarcane Agro-industrial Company. The simulation was performed from 2010-07-19 to 2011-12-11 (481 days) for planted sugarcane (cultivar CP48-103). A soil profile of 550 cm depth (the depth of the impermeable layer) was specified in the simulations and divided into two layers. To account for the inhomogeneity of irrigation scheduling across the studied field, the field area was divided into ten homogeneous simulation units (each 84.5 × 250 m), termed hydrotopes. The hydrotopes share the same agro-hydrological properties and differ only in irrigation scheduling. The model was calibrated in a parallel manner against the measured soil moisture profile, soil solute concentration profile, groundwater level, subsurface drainage outflow, drainage outflow salinity, Leaf Area Index (LAI), cane yield, and sucrose yield. The weighted average of the values simulated for the individual hydrotopes was compared with the corresponding measured data. In total, 45 parameters were estimated through the inverse modeling scheme.
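The parameter search described above can be illustrated with a minimal particle swarm optimizer. This is a sketch of the generic PSO idea only, not the paper's improved UPSO variant or its sub-daily objective function; the toy misfit function standing in for a SWAP run, the parameter bounds, and all swarm settings are illustrative assumptions.

```python
import random

def pso_calibrate(objective, bounds, n_particles=20, n_iter=100,
                  w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO: minimize `objective` over the box constraints in `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions inside the bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clip it back into the feasible box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy misfit standing in for a SWAP run: sum of squared residuals against
# a hypothetical "true" parameter set (purely illustrative).
true_params = [0.12, 3.5, 45.0]
def misfit(p):
    return sum((a - b) ** 2 for a, b in zip(p, true_params))

best, best_val = pso_calibrate(
    misfit, bounds=[(0.0, 1.0), (0.0, 10.0), (0.0, 100.0)])
```

In the actual scheme, each objective evaluation would involve running the modified SWAP model for all hydrotopes and comparing the weighted-average output against the measured series, which is why a derivative-free, population-based search such as UPSO is a natural fit.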
The accuracy of the model in the calibration and validation stages was evaluated using the mean error (ME), the mean absolute error (MAE), the root mean square error (RMSE), the normalized root mean square error (NRMSE), Pearson's correlation coefficient (r), and the Nash-Sutcliffe model efficiency coefficient (EF).
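The six statistics listed above have standard definitions; a minimal, dependency-free sketch is given below. Note that NRMSE is normalized by the observed mean in some conventions and by the observed range in others, so the mean-normalized form used here is an assumption about the paper's variant.

```python
from math import sqrt

def fit_statistics(obs, sim):
    """Standard goodness-of-fit measures between observed and simulated series."""
    n = len(obs)
    mean_obs = sum(obs) / n
    mean_sim = sum(sim) / n
    resid = [s - o for o, s in zip(obs, sim)]
    me = sum(resid) / n                          # mean error (bias)
    mae = sum(abs(e) for e in resid) / n         # mean absolute error
    rmse = sqrt(sum(e * e for e in resid) / n)   # root mean square error
    nrmse = rmse / mean_obs                      # normalized by observed mean (assumed convention)
    # Pearson's correlation coefficient r.
    cov = sum((o - mean_obs) * (s - mean_sim) for o, s in zip(obs, sim))
    var_o = sum((o - mean_obs) ** 2 for o in obs)
    var_s = sum((s - mean_sim) ** 2 for s in sim)
    r = cov / sqrt(var_o * var_s)
    # Nash-Sutcliffe efficiency: 1 minus residual variance over observed variance.
    ef = 1.0 - sum(e * e for e in resid) / var_o
    return {"ME": me, "MAE": mae, "RMSE": rmse, "NRMSE": nrmse, "r": r, "EF": ef}

# Illustrative call on dummy data (not the paper's measurements).
stats = fit_statistics(obs=[1.0, 2.0, 3.0, 4.0], sim=[1.1, 1.9, 3.2, 3.8])
```

An EF of 1 indicates a perfect match, while EF ≤ 0 means the model is no better a predictor than the mean of the observations, which is why EF is reported alongside the scale-dependent error measures.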
The results demonstrated the success of the developed modeling scheme in reproducing the measured soil moisture, groundwater level, and subsurface drainage outflow (with EF values of 0.875, 0.933, and 0.751 for the calibration dataset, and 0.804, 0.760, and 0.739 for the validation dataset, respectively), the soil water solute concentration and subsurface drainage outflow salinity (with NRMSE values of 0.092 and 0.119 for the calibration dataset, and 0.142 and 0.042 for the validation dataset, respectively), and the LAI, cane yield, and sucrose yield (with EF values of 0.996, 0.996, and 0.999, respectively). Except for drainage outflow salinity and cane yield, lack of correlation was the main source of disagreement between measured and simulated data. For the measured subsurface drainage outflow salinity, the model's inability to reproduce the temporal oscillations in the measured data was the main source of disagreement (contributing 93.225% and 89.645% of the overall disagreement in the calibration and validation stages, respectively). Based on the simulated solute balance components over the simulation period, ~33.72 ton salt ha-1 was added to the soil by saline irrigation water, and ~54.94 ton salt ha-1 was discharged into the receiving water bodies via the field drains. Ultimately, the calibrated and validated model can serve as a helpful decision support tool for deriving integrated irrigation and drainage water management scenarios in the study area.
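Taken together, the two solute balance components quoted above imply a net salt export of roughly 21.22 ton ha-1 beyond the irrigation input over the 481-day period, consistent with leaching of salts already stored in the profile and contributed by the shallow saline groundwater. A one-line check (the abstract reports no other balance terms, so they are ignored here):

```python
salt_in_irrigation = 33.72   # ton salt ha-1 added by saline irrigation water
salt_out_drains = 54.94      # ton salt ha-1 discharged via the field drains
# Net export beyond the irrigation input, other balance terms ignored.
net_removal = salt_out_drains - salt_in_irrigation  # ~21.22 ton ha-1
```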