887 results for Quality models
Abstract:
Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains accurate accounting of the uncertainty associated with pollutant processes, which is a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study developed three theoretical scenarios based on research findings that variations in particle size fractions <150µm and >150µm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of the variability characteristics of pollutant build-up and wash-off processes into stormwater quality models. The study outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes.
Abstract:
The Water Framework Directive (WFD; European Commission 2000) is a framework for European environmental legislation that aims at improving water quality by using an integrated approach to implement the necessary societal and technical measures. Assessments to guide, support, monitor and evaluate policies, such as the WFD, require scientific approaches which integrate biophysical and human aspects of ecological systems and their interactions, as outlined by the International Council for Science (2002). These assessments need to be based on sound scientific principles and address the environmental problems in a holistic way. End-users need help to select the most appropriate methods and models. Advice on the selection and use of a wide range of water quality models has been developed within the project Benchmark Models for the Water Framework Directive (BMW). In this article, the authors summarise the role of benchmarking in the modelling process and explain how such an archive of validated models can be used to support the implementation of the WFD.
Abstract:
Purpose – The aim of this paper is to analyse how critical incidents or organisational crises can be used to check and legitimise quality management change efforts in relation to the fundamental principles of quality. Design/methodology/approach – Multiple case studies analyse critical incidents that demonstrate the importance of legitimisation, normative evaluation and conflict constructs in this process. A theoretical framework composed of these constructs is used to guide the analysis. Findings – The cases show that the critical incidents leading to the legitimisation of continuous improvement (CI) were diverse. However, all resulted in the need for significant ongoing cost reduction to achieve or retain competitiveness. In addition, attempts at legitimising CI were coupled with attempts at destabilising the existing normative practice. This destabilisation process in some cases advocated supplementing the existing approaches and in others replacing them. In all cases, significant conflict arose in these legitimising and normative evaluation processes. Research limitations/implications – It is suggested that further research could involve a critical analysis of existing quality models, tools and techniques in relation to how they incorporate, and are built upon, fundamental quality management principles. Furthermore, such studies could probe the dangers of the quality curriculum becoming divorced from business and market reality and thus creating a parallel existence. Practical implications – As demonstrated by the case studies, models, tools and techniques are not valued for their intrinsic worth but rather for what they contribute to addressing business needs. Thus, in addition to being an opportunity for quality management, critical incidents present a challenge to the field. Quality management must be shown to make a contribution in these circumstances. Originality/value – This paper is of value to both academics and practitioners.
Abstract:
Over the last ten years, the cost of maintaining object-oriented systems has grown to account for more than 70% of their total cost. This situation is due to several factors, the most important of which are imprecise user specifications, a rapidly changing execution environment, and the poor internal quality of the systems. Of all these factors, the only one over which we have real control is the internal quality of systems. Numerous quality models have been proposed in the literature to help control quality. However, most of these models use class metrics (for example, the number of methods in a class) or metrics of relationships between classes (for example, the coupling between two classes) to measure the internal attributes of systems. Yet the quality of object-oriented systems does not depend solely on the structure of their classes, which is what these metrics measure, but also on the way the classes are organised, that is, on their design, which generally manifests itself through design patterns and anti-patterns. In this thesis, we propose the DEQUALITE method, which makes it possible to systematically build quality models that take into account not only the internal attributes of systems (through metrics) but also their design (through design patterns and anti-patterns). The method uses a learning approach based on Bayesian networks and builds on the results of a series of experiments evaluating the impact of design patterns and anti-patterns on system quality.
These experiments, carried out on 9 large open-source object-oriented systems, allow us to draw the following conclusions: • Contrary to intuition, design patterns do not always improve system quality; highly coupled implementations of design patterns, for example, affect the structure of classes and have a negative impact on their change- and fault-proneness. • Classes participating in anti-patterns are much more likely to change and to be involved in fault fixes than the other classes of a system. • A non-negligible percentage of classes are involved simultaneously in design patterns and in anti-patterns; design patterns have a positive effect in that they mitigate the anti-patterns. We apply and validate our method on three open-source object-oriented systems to demonstrate the contribution of system design to quality assessment.
Abstract:
Modern societies depend more and more on computer systems, and there is consequently growing pressure on development teams to produce high-quality software. Many companies use quality models, suites of programs that analyse and evaluate the quality of other programs, but building quality models is difficult because several questions remain unanswered in the literature. We studied quality modelling practices in a large company and identified three dimensions where additional research is desirable: support for the subjectivity of quality, techniques for tracking quality as software evolves, and the composition of quality across different levels of abstraction. Regarding subjectivity, we proposed the use of Bayesian models because they can handle ambiguous data. We applied our models to the problem of design-defect detection. In a study of two open-source programs, we found that our approach outperforms the rule-based techniques described in the state of the art. To support software evolution, we treated the scores produced by a quality model as signals that can be analysed using data-mining techniques to identify patterns of quality evolution. We studied how design defects appear in and disappear from software. Software is typically designed as a hierarchy of components, but quality models do not take this organisation into account. In the last part of the dissertation, we present a two-level quality model.
These models have three parts: a model at the component level, a model that evaluates the importance of each component, and another that evaluates the quality of a composite by combining the quality of its components. The approach was tested on predicting change-prone classes from the quality of their methods. We found that our two-level models allow better identification of change-prone classes. Finally, we applied our two-level models to evaluating the navigability of web sites from the quality of their pages. Our models were able to distinguish between sites of very high quality and randomly chosen sites. Throughout the dissertation, we present not only theoretical problems and their solutions, but also experiments conducted to demonstrate the advantages and limitations of our solutions. Our results indicate that the state of the art can be improved along the three dimensions presented. In particular, our work on quality composition and importance modelling is the first to target this problem. We believe that our two-level models are an interesting starting point for further research.
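The two-level composition idea described in this abstract can be sketched in a few lines. This is a minimal illustration under assumed simplifications (a plain importance-weighted average rather than the dissertation's actual Bayesian formulation; the scores and weights are invented for the example):

```python
# Minimal sketch of two-level quality composition (illustrative only):
# component-level scores are aggregated into a composite score using
# importance weights, here via a plain weighted average.
def composite_quality(component_scores, importance_weights):
    assert len(component_scores) == len(importance_weights)
    total = sum(importance_weights)
    return sum(s * w for s, w in zip(component_scores, importance_weights)) / total

# e.g. a class whose three methods score 0.9, 0.4 and 0.7, with the
# second method judged three times as important as the others:
score = composite_quality([0.9, 0.4, 0.7], [1.0, 3.0, 1.0])
print(f"{score:.2f}")
```

The key design point the abstract makes is that the importance weights are themselves learned by a model rather than fixed by hand, so a low-quality but unimportant component drags the composite down less than a low-quality critical one.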
Abstract:
Water quality models generally require a relatively large number of parameters to define their functional relationships, and since prior information on parameter values is limited, these are commonly defined by fitting the model to observed data. In this paper, the identifiability of water quality parameters and the associated uncertainty in model simulations are investigated. A modification to the water quality model 'Quality Simulation Along River Systems' is presented in which an improved flow component is used within the existing water quality model framework. The performance of the model is evaluated in an application to the Bedford Ouse river, UK, using a Monte Carlo analysis toolbox. The essential framework of the model proved to be sound, and calibration and validation performance was generally good. However, some supposedly important water quality parameters associated with algal activity were found to be completely insensitive, and hence non-identifiable, within the model structure, while others (nitrification and sedimentation) had optimum values at or close to zero, indicating that those processes were not detectable from the data set examined.
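The identifiability check described above can be illustrated with a minimal Monte Carlo sketch: parameters are sampled from prior ranges, a toy model is run for each sample, and a parameter counts as identifiable when the best-fitting samples cluster in a narrow range. The first-order decay model, the parameter ranges and the synthetic data below are assumptions for illustration, not the QUASAR model or the Bedford Ouse data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy first-order decay water quality model: C(t) = C0 * exp(-k * t)
def simulate(c0, k, t):
    return c0 * np.exp(-k * t)

t = np.linspace(0.0, 10.0, 25)
observed = simulate(8.0, 0.3, t) + rng.normal(0.0, 0.2, t.size)  # synthetic data

# Monte Carlo sampling over assumed prior parameter ranges
n = 5000
c0_samples = rng.uniform(1.0, 15.0, n)
k_samples = rng.uniform(0.01, 1.0, n)
sse = np.array([np.sum((simulate(c0, k, t) - observed) ** 2)
                for c0, k in zip(c0_samples, k_samples)])

# A parameter is identifiable if the best fits cluster in a narrow range;
# an insensitive parameter would show best fits spread across its whole prior.
best = sse < np.quantile(sse, 0.01)
print("C0 among best fits:", c0_samples[best].min(), "-", c0_samples[best].max())
print("k  among best fits:", k_samples[best].min(), "-", k_samples[best].max())
```

For a non-identifiable parameter, the printed range would span essentially the entire prior interval, which is exactly the symptom the paper reports for the algal-activity parameters.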
Abstract:
Many urban surface energy balance models now exist. These vary in complexity from simple schemes that represent the city as a concrete slab, to those which incorporate detailed representations of momentum and energy fluxes distributed within the atmospheric boundary layer. While many of these schemes have been evaluated against observations, with some models even compared with the same data sets, such evaluations have not been undertaken in a controlled manner to enable direct comparison. For other types of climate model, for instance the Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS) experiments (Henderson-Sellers et al., 1993), such controlled comparisons have been shown to provide important insights into both the mechanics of the models and the physics of the real world. This paper describes the progress that has been made to date on a systematic and controlled comparison of urban surface schemes. The models to be considered, and their key attributes, are described, along with the methodology to be used for the evaluation.
Abstract:
In spite of trying to understand processes in the same spatial domain, the catchment hydrology and water quality scientific communities are relatively disconnected, and so are their respective models. This is emphasized by an inadequate representation of transport processes in both catchment-scale hydrological and water quality models. While many hydrological models at the catchment scale only account for pressure propagation and not for mass transfer, catchment-scale water quality models are typically limited by overly simplistic representations of flow processes. With the objective of raising awareness of this issue and outlining potential ways forward, we provide a non-technical overview of (1) the importance of hydrology-controlled transport through catchment systems as the link between hydrology and water quality; (2) the limitations of current-generation catchment-scale hydrological and water quality models; (3) the concept of transit times as tools to quantify transport and (4) the benefits of transit time based formulations of solute transport for catchment-scale hydrological and water quality models. There is emerging evidence that an explicit formulation of transport processes, based on the concept of transit times, has the potential to improve the understanding of the integrated system dynamics of catchments and to provide a stronger link between catchment-scale hydrological and water quality models.
Abstract:
Background: In recent years, Spain has implemented a number of air quality control measures that are expected to lead to a future reduction in fine particle concentrations and an ensuing positive impact on public health. Objectives: We aimed to assess the impact on mortality attributable to a reduction in fine particle levels in Spain in 2014 relative to the estimated level for 2007. Methods: To estimate exposure, we constructed fine particle distribution models for Spain for 2007 (reference scenario) and 2014 (projected scenario) with a spatial resolution of 16×16 km². In a second step, we used the concentration-response functions proposed by cohort studies carried out in Europe (European Study of Cohorts for Air Pollution Effects and Rome longitudinal cohort) and North America (American Cancer Society cohort, Harvard Six Cities study and Canadian national cohort) to calculate the number of attributable annual deaths corresponding to all causes, all non-accidental causes, ischemic heart disease and lung cancer among persons aged over 25 years (2005-2007 mortality rate data). We examined the effect of the Spanish demographic shift in our analysis using 2007 and 2012 population figures. Results: Our model suggested that there would be a mean overall reduction in fine particle levels of 1 µg/m³ by 2014. Taking into account 2007 population data, between 8 and 15 all-cause deaths per 100,000 population could be postponed annually by the expected reduction in fine particle levels. For specific subgroups, estimates varied from 10 to 30 deaths for all non-accidental causes, from 1 to 5 for lung cancer, and from 2 to 6 for ischemic heart disease. The expected burden of preventable mortality would be even higher in the future due to Spanish population growth. Taking into account the population older than 30 years in 2012, the absolute mortality impact estimate would increase by approximately 18%.
Conclusions: Effective implementation of air quality measures in Spain, in a scenario with a short-term projection, would amount to an appreciable decline in fine particle concentrations, and this, in turn, would lead to notable health-related benefits. Recent European cohort studies strengthen the evidence of an association between long-term exposure to fine particles and health effects, and could enhance health impact quantification in Europe. Air quality models can contribute to improved assessment of air pollution health impact estimates, particularly in study areas without air pollution monitoring data.
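The attributable-mortality arithmetic behind estimates like those above can be sketched with a standard log-linear concentration-response calculation. All numbers here (the relative risk, baseline mortality rate and population) are invented for illustration and are not the study's inputs:

```python
import math

# Health-impact sketch using a log-linear concentration-response function.
# Hypothetical inputs, not the study's actual values:
beta = math.log(1.07) / 10.0   # assumed RR of 1.07 per 10 µg/m³ of PM2.5
delta_c = 1.0                  # modelled reduction in PM2.5 (µg/m³)
baseline_rate = 900e-5         # assumed all-cause deaths per person-year
population = 100_000

# Attributable fraction for the avoided exposure, then attributable deaths
af = 1.0 - math.exp(-beta * delta_c)
postponed = af * baseline_rate * population
print(f"Deaths postponed per 100,000: {postponed:.1f}")
```

The spread in the paper's estimates (8 to 15 per 100,000 for all causes) comes largely from plugging different cohort-derived relative risks into this same calculation.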
Inherent errors in pollutant build-up estimation in considering urban land use as a lumped parameter
Abstract:
Stormwater quality modelling results are subject to uncertainty. The variability of input parameters is an important source of overall model error. An in-depth understanding of the variability associated with input parameters can provide knowledge on the uncertainty associated with these parameters and consequently assist in the uncertainty analysis of stormwater quality models and the decision making based on modelling outcomes. This paper discusses the outcomes of a research study undertaken to analyse the variability related to pollutant build-up parameters in stormwater quality modelling. The study was based on the analysis of pollutant build-up samples collected from 12 road surfaces in residential, commercial and industrial land uses. It was found that build-up characteristics vary appreciably even within the same land use. Therefore, using land use as a lumped parameter would introduce significant uncertainty into stormwater quality modelling. It was also found that the variability in pollutant build-up can be significant depending on the pollutant type. This underlines the importance of taking into account specific land use characteristics and targeted pollutant species when undertaking uncertainty analysis of stormwater quality models or when interpreting the modelling outcomes.
Abstract:
Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR assuming probability-distributed data provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches commonly used in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which is primarily influenced by surface characteristics and rainfall intensity.
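The OLSR-versus-WLSR comparison can be illustrated with a small fitting sketch. The exponential wash-off form, the synthetic data and the heteroscedastic error model below are assumptions for illustration (the study's own Bayesian/Gibbs sampling machinery is not reproduced here); the point is that WLS down-weights the noisier observations that OLS treats as equally reliable:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def washoff(depth, k):
    # Common exponential wash-off form: fraction of available load removed
    return 1.0 - np.exp(-k * depth)

k_true = 0.12
depth = np.linspace(1.0, 25.0, 40)           # cumulative rainfall depth (mm)
sigma = 0.01 + 0.002 * depth                 # assumed heteroscedastic error
obs = washoff(depth, k_true) + rng.normal(0.0, sigma)

# OLS: every observation weighted equally
k_ols, _ = curve_fit(washoff, depth, obs, p0=[0.05])
# WLS: observations weighted by 1/sigma^2 (absolute_sigma keeps the scale)
k_wls, _ = curve_fit(washoff, depth, obs, p0=[0.05],
                     sigma=sigma, absolute_sigma=True)
print(f"k_true={k_true}, k_ols={k_ols[0]:.3f}, k_wls={k_wls[0]:.3f}")
```

When the error variance genuinely grows with the measured value, the WLS estimate and, more importantly, its uncertainty interval are the more realistic ones, which matches the paper's conclusion about WLSR.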
Abstract:
Variability in the pollutant wash-off process is a concept which needs to be understood in depth in order to better assess the outcomes of stormwater quality models, and thereby strengthen stormwater pollution mitigation strategies. Current knowledge about the wash-off process does not extend to a clear understanding of the influence of the initially available pollutant build-up on the variability of the pollutant wash-off load and composition. Consequently, pollutant wash-off process variability is poorly characterised in stormwater quality models, which can result in inaccurate stormwater quality predictions. Mathematical simulation of particulate wash-off from three urban road surfaces confirmed that the wash-off loads of particle size fractions <150µm and >150µm after a storm event vary with the build-up of the respective particle size fractions available at the beginning of the event. Furthermore, the pollutant load and composition associated with the initially available build-up of <150µm particles predominantly influence the variability in washed-off pollutant load and composition. The influence of the build-up of pollutants associated with >150µm particles on wash-off process variability is significant only for relatively short-duration storm events.
Abstract:
Forest management is facing new challenges under climate change. By adjusting thinning regimes, conventional forest management can be adapted to various objectives of utilization of forest resources, such as wood quality, forest bioenergy, and carbon sequestration. This thesis aims to develop and apply a simulation-optimization system as a tool for an interdisciplinary understanding of the interactions between wood science, forest ecology, and forest economics. In this thesis, the OptiFor software was developed for forest resources management. The OptiFor simulation-optimization system integrated the process-based growth model PipeQual, wood quality models, biomass production and carbon emission models, as well as energy wood and commercial logging models, into a single optimization model. Osyczka's direct and random search algorithm was employed to identify optimal values for a set of decision variables. The numerical studies in this thesis broadened our current knowledge and understanding of the relationships between wood science, forest ecology, and forest economics. The results for timber production show that optimal thinning regimes depend on site quality and initial stand characteristics. Taking wood properties into account, our results show that increasing the intensity of thinning resulted in lower wood density and shorter fibers. The addition of nutrients accelerated volume growth, but lowered wood quality for Norway spruce. Integrating energy wood harvesting into conventional forest management showed that conventional forest management without energy wood harvesting was still superior in sparse stands of Scots pine. Energy wood from pre-commercial thinning turned out to be optimal for dense stands. When carbon balance is taken into account, our results show that changing carbon assessment methods leads to very different optimal thinning regimes and average carbon stocks.
Raising the carbon price resulted in longer rotations and a higher mean annual increment, as well as a significantly higher average carbon stock over the rotation.
Abstract:
Assessing build-up and wash-off process uncertainty is important for the accurate interpretation of model outcomes, to facilitate informed decision making for developing effective stormwater pollution mitigation strategies. Uncertainty inherent to pollutant build-up and wash-off processes influences the variations in pollutant loads entrained in stormwater runoff from urban catchments. However, build-up and wash-off predictions from stormwater quality models do not adequately represent such variations due to poor characterisation of the variability of these processes in mathematical models. Changing the mathematical form of current models to incorporate process variability facilitates accounting for process uncertainty without significantly affecting model prediction performance. Moreover, the investigation of uncertainty propagation from build-up to wash-off confirmed that uncertainty in the build-up process significantly influences wash-off process uncertainty. Specifically, the behaviour of particles <150µm during build-up primarily influences uncertainty propagation, resulting in appreciable variations in the pollutant load and composition during a wash-off event.
Abstract:
Sediments are an important location in determining the fate of nutrients entering the estuary, and their role needs to be incorporated into water quality models. The purpose of this study was to estimate the portion of sediment oxygen consumption (SOC) and sediment ammonium (NH4+) release directly attributable to benthic invertebrates via the respiratory use of oxygen and the catabolic release of ammonium. Samples were collected at 8 locations from August 1985 through November 1988.