Addressing the Evaluation of EDSS-maintenance

Clàudia Turon1, Joaquim Comas1, Ulises Cortés2 and Manel Poch1

1 Laboratori d'Enginyeria Química i Ambiental, University of Girona, Campus Montilivi, 17071, Girona, Spain. [e-mail addresses: claudia@lequia.udg.es; quim@lequia.udg.es; manel@lequia.udg.es]
2 Knowledge Engineering and Machine Learning Group, Technical University of Catalonia, c/ Jordi Girona 1&3, 08034 Barcelona, Spain. [e-mail address: ia@lsi.upc.es]

Abstract: Daily operation and maintenance tasks are needed to guarantee the correct performance of constructed wetlands. Defining these activities is a complex task, since the actions vary according to the characteristics of each facility. To support the definition of these operation and maintenance protocols, an Environmental Decision Support System (EDSS) has been constructed (EDSS-maintenance). The methodology used to develop EDSS-maintenance is based on the following five steps: environmental problem analysis, data and knowledge acquisition, model selection, model implementation and evaluation process. The first four steps have been completed; the evaluation process is ongoing. This document presents a new approach for this step: two numerical indices allow (a) verifying the performance of the EDSS-maintenance and (b) validating the compliance of the protocols with the user requirements. Moreover, another index enables an easy revision and improvement of the knowledge bases (problems, causes and actions) and so enhances the decision support system.

Keywords: Constructed wetlands; Environmental decision support system; Evaluation process; Validation; Verification.

1. INTRODUCTION

Daily operation and maintenance tasks are needed to guarantee the correct performance of Constructed Wetlands (CWs). Defining these activities is a complex task, since the actions vary according to (1) the technology, configuration and design of the wastewater treatment plant, (2) the community characteristics and (3) the features of the receiving media. To support the definition of these actions, an Environmental Decision Support System (EDSS) has been constructed (EDSS-maintenance).

The methodology used to develop EDSS-maintenance is based on the following five steps (Poch et al., 2004): (1) environmental problem analysis, (2) data and knowledge acquisition, (3) model selection, (4) model implementation and (5) evaluation process. The first four steps have been completed (Turon et al., 2005): the data and knowledge required to solve the environmental problem were acquired and translated into a knowledge base composed of IF-THEN rules. The evaluation process is ongoing; there are no clear guidelines on EDSS evaluation, and it remains an open problem. This document presents a new approach for this step.

2. EVALUATION PROCESS

The evaluation process has to guarantee both the functioning of the EDSS-maintenance and the compliance with the user requirement specifications. These user requirements are: (1) identify the CWs' problems, (2) identify the causes unleashing these disturbances and (3) propose the most appropriate preventive and corrective actions. That is to say, this evaluation process includes both the verification and the validation of the knowledge-based system.
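As an illustration of the kind of rule-based knowledge being verified and validated, the following minimal Python sketch shows how IF-THEN rules could chain observed facts into problems, causes and proposed actions. The facts, rule contents and function names are hypothetical assumptions for illustration only; they are not taken from the actual EDSS-maintenance knowledge base.

```python
# Hypothetical sketch of an IF-THEN rule base of the kind described above;
# the facts, problems, causes and actions are illustrative, not the real
# EDSS-maintenance knowledge base.

from dataclasses import dataclass


@dataclass
class Rule:
    conditions: frozenset  # facts that must all hold (IF part)
    conclusion: str        # fact derived when they do (THEN part)


RULES = [
    Rule(frozenset({"effluent_TSS_high", "surface_ponding"}), "problem:clogging"),
    Rule(frozenset({"problem:clogging", "high_organic_load"}), "cause:organic_overloading"),
    Rule(frozenset({"cause:organic_overloading"}), "action:rest_and_alternate_beds"),
]


def forward_chain(observed_facts):
    """Fire rules repeatedly until no new problem, cause or action is derived."""
    derived = set(observed_facts)
    changed = True
    while changed:
        changed = False
        for rule in RULES:
            if rule.conditions <= derived and rule.conclusion not in derived:
                derived.add(rule.conclusion)
                changed = True
    return derived


if __name__ == "__main__":
    facts = {"effluent_TSS_high", "surface_ponding", "high_organic_load"}
    print(sorted(f for f in forward_chain(facts) if ":" in f))
```

Verification checks the internal consistency of such a rule set, while validation checks whether the problems, causes and actions it derives match those observed in real CWs.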
Although many verification and validation techniques and tools have been proposed, developed and implemented (Ayel and Laurent, 1991; Lydiard, 1992; O'Keefe and O'Leary, 1993; Rosenwald and Liu, 1997; Tsai et al., 1999; Preece, 2001), the evaluation procedure is still an imprecise art. The specific steps of verification and validation vary depending on the system under investigation. The EDSS-maintenance evaluation procedure is carried out in the following stages: checking the syntax and semantics of the rules (Step-1 Evaluation), comparing the tasks proposed by the EDSS-maintenance with real operation and maintenance protocols (Step-2 Evaluation), expert evaluation of the guidelines proposed by the EDSS-maintenance (Step-3 Evaluation), and evaluating the results of applying the operation and maintenance protocols in new CWs (Step-4 Evaluation).

In the near future the protocols proposed by the EDSS-maintenance will be applied in new CWs. The technicians of these CWs will be responsible for evaluating the usefulness of the protocols, and the results of these evaluations will constitute the Step-4 Evaluation. In this case the results obtained will be completely subjective, and the evaluation of the usefulness of a given protocol will be based on the CW performance.

The Step-1 Evaluation (or verification step) was carried out during the construction of the EDSS-maintenance. The objective of this first evaluation step was to check the consistency and completeness of the EDSS-maintenance. The system was checked for redundant rules, conflicting rules, subsumed rules, unnecessary IF conditions, circular rules, dead-end rules and unreachable rules. The results of this step are easily interpreted: the rule system either works or it does not. Verification (Step-1 Evaluation) should be done before validation (Step-2, Step-3 and Step-4 Evaluation) to guarantee that the software provides the expected outputs through scientific and logical relationships, rather than simply through calibration and correlation of inputs and outputs (Sojda, 2005).

In the Step-2 Evaluation, the operation and maintenance tasks proposed by the EDSS-maintenance for thirteen real CWs were compared with the operation and maintenance protocols applied in these facilities. The results of this validation step are difficult to treat because the comparison is made against real protocols which guarantee the CW performance but cannot be assumed to be 100 % correct. The Step-2 Evaluation may or may not be carried out by domain experts.

The Step-3 Evaluation poses another dilemma. In this case, the EDSS-maintenance was applied to thirty-one CWs planned (but not yet constructed) for the Fluvià river basin within the Urban Wastewater Treatment Program of the Catalan Government (Alemany et al., 2005). Therefore, there was no real standard against which to compare the EDSS-maintenance outputs, and the protocols were instead evaluated by a pool of experts. The results of this validation are completely subjective, and the evaluation of a given protocol can vary among experts. Hence, it is also difficult to quantify the usefulness of the protocols.

The validation stage (Step-2, Step-3 and Step-4 Evaluation) starts once the system is complete, coherent and logical from the modelling and programming perspective. To quantify the usefulness of the operation and maintenance protocols proposed by the EDSS-maintenance we suggest using the following mathematical index:

U(\%) = \frac{1}{3} \sum_{\text{Step-2}}^{\text{Step-4}} \left( \frac{\sum_{Vi=1}^{n} U_{Vi}}{n} \right) \times 100    (Equation 1)

Where:
U: Utility index of the protocols proposed by the EDSS-maintenance.
U_{Vi}: Utility index of the protocols proposed by the EDSS-maintenance according to the Step-2, Step-3 and Step-4 Evaluation; U_{Vi} is calculated with Equation 2.
n: Number of protocols evaluated in each evaluation stage.
Remark: If one of the validation steps has not been carried out, the corresponding U_{Vi} utility index is 0.

U_{Vi} = \frac{1}{3} \left( \frac{\sum_{j=1}^{n} P_j^{\text{EDSS-maintenance}}}{\sum_{j=1}^{n} P_j^{\text{real}}} + \frac{\sum_{k=1}^{n} C_k^{\text{EDSS-maintenance}}}{\sum_{k=1}^{n} C_k^{\text{real}}} + \frac{\sum_{l=1}^{n} A_l^{\text{EDSS-maintenance}}}{\sum_{l=1}^{n} A_l^{\text{real}}} \right)    (Equation 2)

Where:
P_j EDSS-maintenance: Number of problems proposed by the EDSS-maintenance that have appeared in a CW (Step-2 and Step-4 Evaluation) or are thought to have appeared (Step-3 Evaluation).
P_j real: Number of problems that have appeared in a CW (Step-2 and Step-4 Evaluation) or are thought to have appeared (Step-3 Evaluation).
C_k EDSS-maintenance: Number of causes proposed by the EDSS-maintenance for a specific problem and identified in a CW (Step-2 and Step-4 Evaluation) or thought to have been the origin of disturbances (Step-3 Evaluation).
C_k real: Number of causes identified in a CW (Step-2 and Step-4 Evaluation) or thought to be identified in a CW (Step-3 Evaluation).
A_l EDSS-maintenance: Number of actions proposed by the EDSS-maintenance for a specific problem and applied in a CW (Step-2 and Step-4 Evaluation) or thought to be applicable in a CW (Step-3 Evaluation).
A_l real: Number of actions applied in a CW (Step-2 and Step-4 Evaluation) or thought to be applicable in a CW (Step-3 Evaluation).

These equations allow specifying how useful the protocols provided by the EDSS-maintenance are. Nevertheless, some questions remain open. Part of the information provided by the EDSS-maintenance is not useful; how should this useless knowledge be expressed and evaluated? Likewise, the EDSS-maintenance cannot consider some expert knowledge or empirical experience; how should this useful knowledge be expressed and evaluated? To confront this dilemma we propose studying the probability that a problem appears, the probability that a cause is the origin of a disturbance and the probability that an action is applied. To calculate these probabilities we propose the following equations:

P(\%) = \frac{\sum_{i=1}^{n} \text{Problem}_i}{n} \times 100    (Equation 3)

Where:
P: Percentage of cases in which a problem can appear.
Problem: Number of CWs that have had or can have the problem.
n: Number of CWs studied.

C(\%) = \frac{\sum_{i=1}^{m} \text{Cause}_i}{m} \times 100    (Equation 4)

Where:
C: Percentage of cases in which a cause can be the origin of a problem.
Cause: Number of CWs in which the cause has been or can be the origin of a problem.
m: Number of CWs studied.

A(\%) = \frac{\sum_{i=1}^{r} \text{Action}_i}{r} \times 100    (Equation 5)

Where:
A: Percentage of cases in which an action has been or can be applied.
Action: Number of CWs in which the action has been or can be applied to solve a problem.
r: Number of CWs studied.

If in the Step-2, Step-3 or Step-4 Evaluation a new problem, cause or action is identified and the probability of its occurrence is greater than 10 %, we recommend including it in the knowledge base of the EDSS. Conversely, if the EDSS-maintenance proposes a problem, cause or action which is not identified or applied in real CWs, or which is discarded by experts, and the probability of its occurrence is less than 10 %, we recommend removing it from the EDSS-maintenance.
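To make the use of these indices concrete, the following Python sketch computes Equations 1-5 from hypothetical evaluation counts and applies the 10 % revision threshold. All data structures, names and figures below are illustrative assumptions rather than results from the actual evaluation of the EDSS-maintenance.

```python
# Illustrative computation of Equations 1-5; all counts are hypothetical.

def utility_per_protocol(p_edss, p_real, c_edss, c_real, a_edss, a_real):
    """Equation 2: mean ratio of matched problems, causes and actions for one CW."""
    return (p_edss / p_real + c_edss / c_real + a_edss / a_real) / 3


def overall_utility(step_results):
    """Equation 1: average the per-protocol indices U_Vi within each validation
    step (Step-2, Step-3, Step-4), then over the three steps; a step that has
    not been carried out contributes 0."""
    step_means = []
    for step in ("Step-2", "Step-3", "Step-4"):
        u_values = step_results.get(step, [])
        step_means.append(sum(u_values) / len(u_values) if u_values else 0.0)
    return sum(step_means) / 3 * 100


def occurrence_percentage(flags):
    """Equations 3-5: share of CWs in which a problem appears, a cause is the
    origin of a disturbance, or an action is applied (flags of 0/1 per CW)."""
    return sum(flags) / len(flags) * 100


# Hypothetical counts for two CWs evaluated in the Step-2 Evaluation.
step_results = {
    "Step-2": [
        utility_per_protocol(4, 5, 3, 4, 6, 7),
        utility_per_protocol(2, 3, 2, 2, 4, 5),
    ],
    # Step-3 and Step-4 Evaluations not yet performed -> contribution of 0.
}
print(f"U = {overall_utility(step_results):.1f} %")

# 10 % threshold for revising the knowledge base: hypothetical occurrence of a
# newly identified problem across thirteen CWs (1 = observed, 0 = not observed).
problem_flags = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
p = occurrence_percentage(problem_flags)
print("include in knowledge base" if p > 10 else "candidate for removal", f"({p:.0f} %)")
```

With these hypothetical figures, U is roughly 27 %, which mainly reflects that two of the three validation steps have not yet been carried out and therefore contribute 0 to the index.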
3. CONCLUSIONS

Equation 1 and Equation 2 allow the evaluation of both the EDSS-maintenance performance and the compliance of the protocols with the user requirements. Moreover, Equation 3, Equation 4 and Equation 5 make the revision and improvement of the knowledge bases easy and so enhance the decision support system. Therefore, this evaluation procedure achieves the verification and validation goal: to provide a protocol to measure the quality of knowledge in a knowledge base and to indicate where work needs to be done to rectify anomalous knowledge.

4. ACKNOWLEDGEMENTS

This research has been supported by the Agència Catalana de l'Aigua of the Generalitat de Catalunya through the project "Sistema de suport a la decision per a l'establiment de protocol d'explotació a les depuradores del PSARU 2002".

5. REFERENCES

Alemany, J., Comas, J., Turon, C., Balaguer, M.D., Poch, M., Puig, M.A., and Bou, J., Evaluating the application of a decision support system in identifying adequate wastewater treatment for small communities. A case study: the Fluvià River Basin, Water Science and Technology, 51 (10), 179-186, 2005.

Ayel, M., and Laurent, J.P., Validation, Verification, and Test of Knowledge-Based Systems, John Wiley and Sons, Chichester, U.K., 1991.

Lydiard, T.J., Overview of current practice and research initiatives for the verification and validation of KBS, Knowledge Engineering Review, 7 (2), 101-113, 1992.

O'Keefe, R.M., and O'Leary, D.E., Expert system verification and validation: A survey and tutorial, Artificial Intelligence Review, 7 (1), 3-42, 1993.

Poch, M., Comas, J., Rodríguez-Roda, I., Sànchez-Marré, M., and Cortés, U., Designing and building real environmental decision support systems, Environmental Modelling & Software, 19, 857-873, 2004.

Preece, A., Evaluating verification and validation methods in knowledge engineering, In R. Roy (ed.), Micro-Level Knowledge Management, Morgan Kaufmann, 123-145, 2001.

Rosenwald, G.W., and Liu, C.C., Rule-based system validation through automatic identification of equivalence classes, IEEE Transactions on Knowledge and Data Engineering, 9 (1), 24-31, 1997.

Sojda, R., Empirical evaluation of decision support systems: Needs, definitions, potential methods, and an example pertaining to waterfowl management, Environmental Modelling & Software, in press, 2005.

Tsai, W.T., Vishnuvajjala, R., and Zhang, D., Verification and validation of knowledge-based systems, IEEE Transactions on Knowledge and Data Engineering, 11 (1), 202-212, 1999.

Turon, C., Alemany, J., Bou, J., Comas, J., and Poch, M., Optimal maintenance of constructed wetlands using an environmental decision support system, Water Science and Technology, 51 (10), 109-117, 2005.