Author:
Bernats M., Osterhus S. W., Dzelzitis K., Juhna T.
Abstract
Corrosion in water supply networks is an unwanted process that causes pipe material loss and subsequent pipe failures. Pipe replacement strategies are most often based on pipe age, which is not always the most important factor in pipe burst rates. This study presents a methodology for developing a mathematical model to predict the decrease of pipe wall thickness in large cast iron networks. Water quality, temperature and the water flow regime were the main factors taken into account in the corrosion model. The effects of water quality and flow rate were determined by measuring the corrosion rate of metal coupons over a period of one year at different flow regimes. The obtained constants were then introduced into a calibrated hydraulic model (Epanet), and the corrosion model was validated by measuring the decrease of wall thickness in pipe samples removed during a regular pipe replacement event. The validated model was run for 30 yr to simulate the water distribution system of Riga (Latvia). The corrosion rate in the first year was 8.0–9.5 times greater than in subsequent years, with an average long-term decrease in pipe wall thickness of 0.013–0.016 mm per year. The optimal cast iron pipe exploitation period was concluded to be 30–35 yr (for a pipe wall thickness of 5.50 mm and a metal density of 7.5 t m−3). The combined error of the initial corrosion model and measurements was 33%; after validation, the error was reduced to below 15%.
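As a rough illustration of the reported corrosion kinetics, the sketch below projects remaining pipe wall thickness over a 30-year horizon using the figures stated in the abstract (a first-year loss 8.0–9.5 times the long-term rate of 0.013–0.016 mm per year, starting from a 5.50 mm wall). All function and parameter names here are hypothetical; the authors' actual model couples these constants to a calibrated Epanet hydraulic simulation, which is not reproduced in this sketch.

# Hypothetical sketch: project pipe wall thickness using the corrosion
# rates reported in the abstract. This is NOT the authors' Epanet-coupled
# model; it only illustrates the arithmetic of the reported constants.

def wall_thickness_over_time(
    initial_mm: float = 5.50,        # initial wall thickness from the abstract
    long_term_rate: float = 0.016,   # mm/yr, upper bound of the reported range
    first_year_factor: float = 9.5,  # first-year rate is 8.0-9.5x the long-term rate
    years: int = 30,
) -> list[float]:
    """Return remaining wall thickness (mm) at the end of each year."""
    thickness = initial_mm
    series = []
    for year in range(1, years + 1):
        # Elevated corrosion in year 1, constant long-term rate afterwards.
        rate = long_term_rate * first_year_factor if year == 1 else long_term_rate
        thickness -= rate
        series.append(round(thickness, 3))
    return series

if __name__ == "__main__":
    profile = wall_thickness_over_time()
    print(f"Wall thickness after year 1:  {profile[0]} mm")   # ~5.348 mm
    print(f"Wall thickness after year 30: {profile[-1]} mm")  # ~4.884 mm

Note that under these rates most of the wall remains after 30 years, so the quoted 30–35 yr optimum presumably reflects burst-rate and replacement-cost considerations rather than complete wall consumption.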
Cited by
1 article.