Article ID: iaor2007862
Country: Netherlands
Volume: 44
Issue: 5/6
Start Page Number: 485
End Page Number: 498
Publication Date: Sep 2006
Journal: Mathematical and Computer Modelling
Authors: Dandy Graeme C., Maier Holger R., Nixon John B., Holmes Mike, Gibbs M.S., Morgan N.
Keywords: neural networks
Drinking water contaminated by micro-organisms can pose a major risk to public health. Disinfection is used to destroy micro-organisms that are potentially dangerous to humans, and to prevent bacterial regrowth it is also desirable to maintain a disinfectant residual throughout the water distribution system. The most commonly used disinfectant is chlorine. If the chlorine dosing rate is too low, there may be insufficient residual at the end of the distribution system, resulting in bacterial regrowth. On the other hand, adding too much chlorine can lead to customer complaints about taste and odour, corrosion of the pipe network, and the formation of potentially carcinogenic by-products. Consequently, determining the optimal chlorine dosing rate requires the ability to predict chlorine decay in the network. In this paper, three data-driven techniques are used to predict chlorine concentrations at two key locations in the Hope Valley water distribution system, located to the north of Adelaide, South Australia. The methods applied are a linear regression model and two artificial neural networks: the Multilayer Perceptron (MLP) and the General Regression Neural Network (GRNN). A five-year data set containing routinely measured parameters is used for model development and validation. The results indicate that data-driven techniques are relatively successful in predicting chlorine concentrations in the distribution system, despite the absence of a hydraulic model of the system and the fact that only routinely collected data were used for model development.
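As a rough illustration of the modelling setup the abstract describes, the sketch below fits a linear regression and an MLP with scikit-learn and implements a GRNN directly as a Gaussian-kernel weighted average of training targets (Specht's formulation). The synthetic inputs, feature count, smoothing parameter `sigma`, and network size are all illustrative assumptions, not the authors' actual data or configuration.

```python
# Hedged sketch: comparing the three data-driven techniques named in the
# abstract (linear regression, MLP, GRNN) on a synthetic stand-in for
# routinely measured water-quality parameters. All names and values here
# are hypothetical, not the paper's actual configuration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

def grnn_predict(X_train, y_train, X_test, sigma=0.5):
    """GRNN prediction: a kernel-weighted average of training targets,
    governed by a single smoothing parameter sigma (assumed value here)."""
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))  # weighted mean of targets
    return np.array(preds)

# Synthetic stand-in for a multi-year record of routine measurements
# (e.g. upstream dose, temperature, flow); the real inputs are whatever
# is logged for the Hope Valley system.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=500)
X_train, X_test, y_train, y_test = X[:400], X[400:], y[:400], y[400:]

models = {
    "linear": LinearRegression().fit(X_train, y_train),
    "mlp": MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=0).fit(X_train, y_train),
}
for name, m in models.items():
    rmse = np.sqrt(np.mean((m.predict(X_test) - y_test) ** 2))
    print(f"{name} RMSE: {rmse:.3f}")

grnn_rmse = np.sqrt(np.mean((grnn_predict(X_train, y_train, X_test) - y_test) ** 2))
print(f"grnn RMSE: {grnn_rmse:.3f}")
```

One design point worth noting: the GRNN requires no iterative training, only a choice of bandwidth, which is one reason it is attractive when, as in this study, only routinely collected data and no hydraulic model are available.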