BOOK
Benchmarking of Control Strategies for Wastewater Treatment Plants
Krist V. Gernaey | Ulf Jeppsson | Peter A. Vanrolleghem | John B. Copp
(2014)
Abstract
Wastewater treatment plants are large non-linear systems subject to significant perturbations in wastewater flow rate, load and composition. Nevertheless, these plants have to be operated continuously while meeting increasingly strict regulations. Many control strategies have been proposed in the literature for improved and more efficient operation of wastewater treatment plants. Unfortunately, their evaluation and comparison, whether practical or simulation-based, is difficult. This is partly due to the variability of the influent, the complexity of the biological and biochemical phenomena, and the wide range of time constants involved (from a few minutes to several days). The lack of standard evaluation criteria is a further major obstacle. To genuinely enhance the acceptance of innovative control strategies, such an evaluation needs to be based on a rigorous methodology including a simulation model, plant layout, controllers, sensors, performance criteria and test procedures, i.e. a complete benchmarking protocol.
This book is a Scientific and Technical Report produced by the IWA Task Group on Benchmarking of Control Strategies for Wastewater Treatment Plants. The goals of the Task Group include developing models and simulation tools that encompass the most typical unit processes within a wastewater treatment system (primary treatment, activated sludge, sludge treatment, etc.), as well as tools that enable the evaluation of long-term control strategies and monitoring tasks (i.e. automatic detection of sensor and process faults). Work on these extensions has been carried out by the Task Group during the past five years, and the main results are summarized in Benchmarking of Control Strategies for Wastewater Treatment Plants. Besides a description of the final version of the already well-known Benchmark Simulation Model no. 1 (BSM1), the book includes the Benchmark Simulation Model no. 1 Long-Term (BSM1_LT), which focuses on benchmarking of process monitoring tasks, and the plant-wide Benchmark Simulation Model no. 2 (BSM2).
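As a purely illustrative sketch of the kind of effluent-based evaluation criteria listed in Chapter 6 (95th percentiles, number of violations, percentage of time in violation), the following Python snippet computes these three quantities for a synthetic effluent ammonia series against a hypothetical limit. The limit value, sampling interval and data below are assumptions for demonstration only, not the BSM specification.

```python
# Illustrative sketch only; not the book's reference implementation.
# Computes three effluent criteria named in Chapter 6.2: the 95th percentile,
# the number of limit violations and the percentage of time in violation.
import numpy as np

def effluent_criteria(conc, limit):
    """conc: effluent concentration time series (g/m3) sampled at a fixed interval."""
    conc = np.asarray(conc, dtype=float)
    above = conc > limit
    # A violation event starts whenever the series crosses the limit upwards;
    # count the first sample as an event if the series starts above the limit.
    n_violations = int(np.sum(np.diff(above.astype(int)) == 1) + above[0])
    return {
        "95th percentile (g/m3)": float(np.percentile(conc, 95)),
        "number of violations": n_violations,
        "time in violation (%)": float(100.0 * above.mean()),
    }

# Hypothetical example: one week of 15-minute samples, assumed NH4 limit of 4 g N/m3.
t = np.arange(0.0, 7.0, 15.0 / (24 * 60))    # time in days
nh4 = 2.0 + 2.5 * np.sin(2.0 * np.pi * t)    # synthetic diurnal pattern, not BSM influent data
print(effluent_criteria(nh4, limit=4.0))
```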
Authors: Krist V. Gernaey, Technical University of Denmark, Lyngby, Denmark; Ulf Jeppsson, Lund University, Sweden; Peter A. Vanrolleghem, Université Laval, Quebec, Canada; and John B. Copp, Primodal Inc., Hamilton, Ontario, Canada
Table of Contents
Section Title | Page
---|---
Cover | Cover
Contents | v
Nomenclature | ix
List of technical reports | xvii
Preface | xix
Chapter 1: Introduction | 1
1.1 What is Meant by a ‘Benchmark Simulation Model’? | 1
1.2 What is the Purpose of the Benchmark Simulation Models? | 2
1.3 Who Should Use the Benchmark Simulation Models? | 2
1.4 How Should the Benchmark Simulation Models be Used? | 3
1.5 Who has been Involved in the Development of the Benchmark Simulation Models? | 3
1.6 How Should this Scientific and Technical Report be Read? | 3
Chapter 2: Benchmark overview | 5
2.1 Benchmark Simulation Model No. 1 | 5
2.2 Benchmark Simulation Model No. 1 Long-Term | 6
2.3 Benchmark Simulation Model No. 2 | 7
2.4 The Benchmark Simulation Model Set | 8
Chapter 3: Benchmark plant description | 9
3.1 Benchmark Simulation Model No. 1 | 9
3.2 Benchmark Simulation Model No. 1 Long-Term | 10
3.3 Benchmark Simulation Model No. 2 | 10
3.4 Characteristics Summary | 12
Chapter 4: Benchmark models | 15
4.1 Influent Modelling | 16
4.1.1 BSM1 influent | 16
4.1.2 BSM1_LT and BSM2 influent | 17
4.2 Unit Process Models | 23
4.2.1 Activated Sludge Model No. 1 (ASM1) | 23
4.2.2 Anaerobic Digestion Model No. 1 (ADM1) | 24
4.2.2.1 Elemental balances | 24
4.2.2.2 Acid-base equations | 26
4.2.2.3 pH inhibition equations | 26
4.2.2.4 Gas phase equations | 27
4.2.2.5 DAE simplifications and simulation speed | 27
4.2.2.6 Model parameters | 29
4.2.3 ASM/ADM interfacing | 29
4.2.3.1 ASM1 to ADM1 conversion | 30
4.2.3.2 ADM1 to ASM1 conversion | 31
4.2.3.3 Further remarks | 31
4.2.4 Solids separation models | 32
4.2.4.1 Primary clarifier | 32
4.2.4.2 Secondary clarifier | 33
4.2.4.3 Thickener | 35
4.2.4.4 Dewatering unit | 36
4.2.5 Reject water storage tank | 36
4.3 Sensors and Actuators | 36
4.3.1 Sensors | 37
4.3.1.1 Concept | 37
4.3.1.2 Time response | 38
4.3.2 Actuators | 39
4.3.3 Faults and failures | 40
4.4 Inhibition and Toxicity | 44
4.4.1 Biological processes | 44
4.4.2 Physical processes | 46
4.4.3 Modelling inhibitory/toxic substances | 46
4.5 Risk Assessment Modelling | 48
4.5.1 Concept | 48
4.5.2 Application to filamentous bulking | 48
4.5.2.1 Decision tree | 48
4.5.2.2 Modelling approach | 49
4.5.2.3 Temperature effect | 51
4.5.2.4 Risk assessment outcomes | 51
4.6 Temperature | 51
Chapter 5: Benchmarking of control strategies | 55
5.1 BSM1 and BSM1_LT Controllers | 55
5.1.1 Default BSM1 control strategy | 55
5.1.2 Other BSM1 control handles | 56
5.1.3 BSM1_LT control strategy | 56
5.2 BSM2 Controllers | 57
5.2.1 Default BSM2 control strategy | 57
5.2.2 Testing other BSM2 control strategies | 57
Chapter 6: Evaluation criteria | 59
6.1 Effluent and Influent Quality Indices | 59
6.2 Effluent Concentrations | 61
6.2.1 Ninety-five (95) percentiles | 61
6.2.2 Number of violations | 61
6.2.3 Percentage of time plant is in violation | 62
6.3 Operational Cost Index | 62
6.3.1 Aeration energy | 63
6.3.2 Pumping energy | 64
6.3.3 Sludge production for disposal | 64
6.3.4 External carbon | 65
6.3.5 Mixing energy | 65
6.3.6 Methane production | 66
6.3.7 Heating energy | 66
6.4 Controller Assessment | 67
6.4.1 Controlled variable tracking | 67
6.4.2 Actuator performance | 68
6.4.3 Risk-related evaluation criteria | 69
6.5 Monitoring Performance Assessment | 69
6.6 Evaluation Summary | 73
Chapter 7: Simulation procedure | 75
7.1 BSM1 | 75
Steady state simulations | 75
Dynamic simulations | 75
7.2 BSM1_LT | 76
7.3 BSM2 | 78
Chapter 8: Ring-testing | 81
8.1 Steady State Verification | 82
8.2 Dynamic Verification | 83
8.3 Findings | 86
Chapter 9: BSM limitations | 89
9.1 BSM as a Toolbox | 89
9.2 Model Structures | 90
9.2.1 Biokinetic models | 90
9.2.2 Aeration | 91
9.2.3 Solid/Liquid separation models | 92
9.2.4 Other models | 92
9.3 Model Parameters | 93
9.4 Evaluation Criteria | 93
9.5 Model Simulation | 94
9.6 Application Extension | 95
9.7 Conclusion | 96
Chapter 10: Conclusions and perspectives | 97
10.1 Lessons Learned: Development of the Benchmark Platforms | 97
10.2 Lessons Learned: Use of the Benchmark Platforms, Verified Process Models and Generic Tools | 98
10.2.1 Portability | 98
10.2.2 Extensions | 99
10.3 Looking Ahead: Future Extensions of the BSM Platforms | 99
10.3.1 Temporal extensions | 100
10.3.2 Spatial extensions | 100
10.3.3 Process extensions | 100
10.3.4 Realism of models used in BSM | 101
10.3.5 Control strategy extensions | 101
10.3.6 Extended evaluation tools | 101
10.4 The ‘Benchmarking Spirit’ | 102
References | 103
Appendix A: Model Parameters | 109
Appendix B: Simulation Output | 119
Index | 141