88 results for Quality Model
in CentAUR: Central Archive, University of Reading - UK
Abstract:
A new dynamic model of water quality, Q(2), has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q(2) for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, once approximate parameter values have been estimated it is more important to obtain high-quality forcing functions than improved parameter estimates. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q(2) model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
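The behavioural/non-behavioural splitting at the heart of a GSA can be sketched in a few lines. The toy dissolved-oxygen response, the parameter ranges, and the acceptance threshold below are illustrative assumptions, not values taken from Q(2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": DO response driven by two parameters; by construction,
# only the reaeration rate really matters (illustrative only).
def simulate_do(k_rea, k_bod):
    return 8.0 * np.exp(-k_rea) + 0.01 * k_bod  # mg/L

n = 5000
k_rea = rng.uniform(0.1, 2.0, n)  # reaeration rate (assumed prior)
k_bod = rng.uniform(0.1, 2.0, n)  # BOD decay rate (assumed prior)
do = simulate_do(k_rea, k_bod)

# Behavioural criterion: e.g. simulated DO above 4 mg/L.
behavioural = do > 4.0

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (max CDF separation)."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# A large separation between the behavioural and non-behavioural
# marginal distributions marks a parameter as behaviour-controlling.
d_rea = ks_stat(k_rea[behavioural], k_rea[~behavioural])
d_bod = ks_stat(k_bod[behavioural], k_bod[~behavioural])
```

In this toy setup `d_rea` comes out large and `d_bod` small, which is the pattern the GSA uses to separate key parameters from insensitive ones.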
Abstract:
Water quality models generally require a relatively large number of parameters to define their functional relationships, and since prior information on parameter values is limited, these are commonly defined by fitting the model to observed data. In this paper, the identifiability of water quality parameters and the associated uncertainty in model simulations are investigated. A modification to the water quality model 'Quality Simulation Along River Systems' is presented in which an improved flow component is used within the existing water quality model framework. The performance of the model is evaluated in an application to the Bedford Ouse river, UK, using a Monte Carlo analysis toolbox. The essential framework of the model proved to be sound, and calibration and validation performance was generally good. However, some supposedly important water quality parameters associated with algal activity were found to be completely insensitive, and hence non-identifiable, within the model structure, while others (nitrification and sedimentation) had optimum values at or close to zero, indicating that those processes were not detectable from the data set examined.
Abstract:
The aim of this study was, within a sensitivity analysis framework, to determine whether additional model complexity gives a better capability to model the hydrology and nitrogen dynamics of a small Mediterranean forested catchment, or whether the additional parameters cause over-fitting. Three nitrogen models of varying hydrological complexity were considered. For each model, general sensitivity analysis (GSA) and Generalized Likelihood Uncertainty Estimation (GLUE) were applied, each based on 100,000 Monte Carlo simulations. The results highlighted the most complex structure as the most appropriate, providing the best representation of the non-linear patterns observed in the flow and streamwater nitrate concentrations between 1999 and 2002. Its 5% and 95% GLUE bounds, obtained using a multi-objective approach, provide the narrowest band for streamwater nitrogen, which suggests increased model robustness, though all models exhibit periods of inconsistently good and poor fits between simulated outcomes and observed data. The results confirm the importance of the riparian zone in controlling the short-term (daily) streamwater nitrogen dynamics in this catchment, but not the overall flux of nitrogen from the catchment. It was also shown that as the complexity of a hydrological model increases over-parameterisation occurs, but the converse is true for a water quality model, where additional process representation leads to additional acceptable model simulations. Water quality data help constrain the hydrological representation in process-based models. Increased complexity was justifiable for modelling river-system hydrochemistry.
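The GLUE procedure behind those 5% and 95% bounds can be sketched compactly: sample parameters, score each simulation with an informal likelihood, reject the unacceptable ones, and read off likelihood-weighted prediction quantiles. The one-parameter toy model, tolerance, and likelihood form below are illustrative assumptions, not the nitrogen models of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
obs = 2.5  # a single observed concentration (illustrative)

# Toy one-parameter model: the prediction is the parameter itself,
# which keeps the sketch transparent.
theta = rng.uniform(0.0, 5.0, 20000)
sim = theta

# Informal likelihood: inverse squared error, zero outside a tolerance.
err = (sim - obs) ** 2
like = np.where(err < 1.0, 1.0 / (err + 1e-4), 0.0)
ok = like > 0                      # "acceptable" simulations only
w = like[ok] / like[ok].sum()      # normalised likelihood weights

# Likelihood-weighted 5% and 95% prediction bounds.
order = np.argsort(sim[ok])
cum = np.cumsum(w[order])
lo = sim[ok][order][np.searchsorted(cum, 0.05)]
hi = sim[ok][order][np.searchsorted(cum, 0.95)]
```

In a real application the likelihood would aggregate fit over a full time series (and, as here, over several objectives at once), but the bound construction is the same.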
Abstract:
A water quality model is used to assess the impact of possible climate change on dissolved oxygen (DO) in the Thames. The Thames catchment is densely populated and, typically, many pressures are anthropogenic. However, that same population also relies on the river for potable water supply and as a disposal route for treated wastewater. Thus, future water quality will be highly dependent on future activity. Dynamic and stochastic modelling has been used to assess the likely impacts on DO dynamics along the river system and the probability distributions associated with future variability. The modelling predictions indicate that warmer river temperatures and drought act to reduce dissolved oxygen concentrations in lowland river systems.
Abstract:
The aim of the present work is to study the occupants' exposure to fine particulate concentrations in ten nightclubs (NCs) in Athens, Greece. Measurements of PM1 and PM2.5 were made in the outdoor and indoor environment of each NC. The average indoor PM1 and PM2.5 concentrations were found to be 181.77 μg m−3 and 454.08 μg m−3 respectively, while the corresponding outdoor values were 11.04 μg m−3 and 32.19 μg m−3. Ventilation and resuspension rates were estimated through consecutive numerical experiments with an indoor air quality model and were found to be remarkably lower than the minimum values recommended by national standards. The relative effects of ventilation and smoking on the occupants' exposures were examined using multiple regression techniques. It was found that, given the low ventilation rates, the effects of smoking and occupancy are of the highest importance. Numerical evaluations showed that if the ventilation rates were at the minimum values set by national standards, the indoor exposures would be reduced to 70% of the present exposure values.
Abstract:
The relative contribution of the main mechanisms that control indoor air quality in residential flats was examined. Indoor and outdoor concentrations of different types of pollutants (black carbon, SO2, O3, NO, NO2) were monitored in three naturally ventilated residential flats in Athens, Greece. At each apartment, experiments were conducted during both the cold and the warm period of the year. The controlling parameters of the transport and deposition mechanisms were calculated from the experimental data. Deposition rates of the same pollutant differ according to the site (different construction characteristics) and to the measuring period for the same site (variations in relative humidity and differences in furnishing). Differences in the black carbon deposition rates were attributed to different black carbon size distributions. The highest deposition rates were observed for O3 in the residential flats with the oldest construction and the highest humidity levels. The calculated parameters as well as the measured outdoor concentrations were used as input data for a one-compartment indoor air quality model, and the indoor concentrations, production, and loss rates of the different pollutants were calculated. The model-calculated concentrations are in good agreement with the measured values. Model simulations revealed that the mechanism that mainly affected the change rate of indoor black carbon concentrations was transport from the outdoor environment, while removal due to deposition was insignificant. During model simulations, it was also established that the change rate of SO2 concentrations was governed by the interaction between the transport and deposition mechanisms, while NOX concentrations were mainly controlled through photochemical reactions and transport from outdoors.
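The one-compartment mass balance used in studies of this kind can be written as dC_in/dt = λ(C_out − C_in) − k_dep·C_in + S/V, with an air exchange rate λ, a deposition rate k_dep, and an indoor source strength per volume S/V. A minimal sketch, with illustrative parameter values not taken from the paper:

```python
# One-compartment indoor air quality model (illustrative parameters).
lam = 1.0       # air exchange (ventilation) rate, 1/h
k_dep = 0.5     # deposition (loss) rate, 1/h
c_out = 10.0    # outdoor concentration, ug/m3
s_over_v = 2.0  # indoor source strength per unit volume, ug/(m3 h)

def step(c_in, dt=0.01):
    """One explicit Euler step of dC/dt = lam*(C_out - C) - k_dep*C + S/V."""
    return c_in + dt * (lam * (c_out - c_in) - k_dep * c_in + s_over_v)

c = 0.0
for _ in range(5000):  # 50 h of simulated time, long past equilibrium
    c = step(c)

# Analytic steady state for comparison: (lam*C_out + S/V) / (lam + k_dep)
c_ss = (lam * c_out + s_over_v) / (lam + k_dep)
```

With these numbers the indoor concentration settles at 8 μg m−3, below the outdoor level: deposition removes mass faster than the indoor source adds it, which is the kind of balance the fitted transport and deposition parameters quantify.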
Abstract:
MOTIVATION: The accurate prediction of the quality of 3D models is a key component of successful protein tertiary structure prediction methods. Currently, clustering- or consensus-based Model Quality Assessment Programs (MQAPs) are the most accurate methods for predicting 3D model quality; however, they are often CPU-intensive as they carry out multiple structural alignments in order to compare numerous models. In this study, we describe ModFOLDclustQ - a novel MQAP that compares 3D models of proteins without the need for CPU-intensive structural alignments by utilising the Q measure for model comparisons. The ModFOLDclustQ method is benchmarked against the top established methods in terms of both accuracy and speed. In addition, the ModFOLDclustQ scores are combined with those from our older ModFOLDclust method to form a new method, ModFOLDclust2, that aims to provide increased prediction accuracy with negligible computational overhead. RESULTS: The ModFOLDclustQ method is competitive with leading clustering-based MQAPs for the prediction of global model quality, yet it is up to 150 times faster than the previous version of the ModFOLDclust method at comparing models of small proteins (<60 residues) and over 5 times faster at comparing models of large proteins (>800 residues). Furthermore, a significant improvement in accuracy can be gained over the previous clustering-based MQAPs by combining the scores from ModFOLDclustQ and ModFOLDclust to form the new ModFOLDclust2 method, with little impact on the overall time taken for each prediction. AVAILABILITY: The ModFOLDclustQ and ModFOLDclust2 methods are available to download from: http://www.reading.ac.uk/bioinf/downloads/ CONTACT: l.j.mcguffin@reading.ac.uk.
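The appeal of an alignment-free comparison is that two models of the same sequence can be scored against each other directly from their internal geometry, with no superposition step. A minimal sketch in that spirit, comparing corresponding inter-residue distances with a Gaussian agreement term; the score form, the width `sigma`, and the random-walk "models" are illustrative assumptions, not the published Q-measure definition:

```python
import numpy as np

def pairwise_dists(coords):
    """All-against-all Euclidean distances between residue coordinates."""
    d = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((d ** 2).sum(-1))

def q_score(a, b, sigma=1.0):
    """Alignment-free similarity of two same-length models: mean Gaussian
    agreement of corresponding inter-residue distances (1.0 = identical)."""
    i, j = np.triu_indices(len(a), k=1)
    da = pairwise_dists(a)[i, j]
    db = pairwise_dists(b)[i, j]
    return float(np.mean(np.exp(-((da - db) ** 2) / (2 * sigma ** 2))))

# Toy "models": a random-walk chain and a slightly perturbed copy.
rng = np.random.default_rng(2)
m1 = np.cumsum(rng.normal(0.0, 1.5, (30, 3)), axis=0)
m2 = m1 + rng.normal(0.0, 0.1, m1.shape)
```

Because only distance matrices are compared, the cost per pair of models is a single O(n²) pass rather than an iterative structural alignment, which is where the reported speed-up comes from.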
Abstract:
The development of effective methods for predicting the quality of three-dimensional (3D) models is fundamentally important for the success of tertiary structure (TS) prediction strategies. Since CASP7, the Quality Assessment (QA) category has existed to gauge the ability of various model quality assessment programs (MQAPs) at predicting the relative quality of individual 3D models. For the CASP8 experiment, automated predictions were submitted in the QA category using two methods from the ModFOLD server: ModFOLD version 1.1 and ModFOLDclust. ModFOLD version 1.1 is a single-model machine-learning-based method, which was used for automated predictions of global model quality (QMODE1). ModFOLDclust is a simple clustering-based method, which was used for automated predictions of both global and local quality (QMODE2). In addition, manual predictions of model quality were made using ModFOLD version 2.0, an experimental method that combines the scores from ModFOLDclust and ModFOLD v1.1. Predictions from the ModFOLDclust method were the most successful of the three in terms of global model quality, whilst the ModFOLD v1.1 method was comparable in performance to other single-model based methods. In addition, the ModFOLDclust method performed well at predicting the per-residue, or local, model quality scores. Predictions of the per-residue errors in our own 3D models, selected using the ModFOLD v2.0 method, were also the most accurate compared with those from other methods. All of the MQAPs described are publicly accessible via the ModFOLD server at: http://www.reading.ac.uk/bioinf/ModFOLD/. The methods are also freely available to download from: http://www.reading.ac.uk/bioinf/downloads/.
Abstract:
Background: Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can consistently separate the highest accuracy models from the lowest. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results: The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers; however, further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection, when applied as a post filter in order to re-rank models. Conclusion: MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering-based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available. Conversely, the true MQAP methods tested can often be used as effective post filters for re-ranking few models from individual fold recognition servers, and further improvements can be achieved using a consensus of these methods.
Abstract:
Controlled human intervention trials are required to confirm the hypothesis that dietary fat quality may influence insulin action. The aim was to develop a food-exchange model, suitable for use in free-living volunteers, to investigate the effects of four experimental diets distinct in fat quantity and quality: high SFA (HSFA); high MUFA (HMUFA); and two low-fat (LF) diets, one supplemented with 1.24 g EPA and DHA/d (LFn-3). A theoretical food-exchange model was developed. The average quantity of exchangeable fat was calculated as the sum of fat provided by added fats (spreads and oils), milk, cheese, biscuits, cakes, buns and pastries, using data from the National Diet and Nutrition Survey of UK adults. Most of the exchangeable fat was replaced by specifically designed study foods. Also critical to the model was the use of carbohydrate exchanges to ensure the diets were isoenergetic. Volunteers from eight centres across Europe completed the dietary intervention. Results indicated that compositional targets were largely achieved, with significant differences in fat quantity between the high-fat diets (39.9 (SEM 0.6) and 38.9 (SEM 0.51) percentage energy (%E) from fat for the HSFA and HMUFA diets respectively) and the low-fat diets (29.6 (SEM 0.6) and 29.1 (SEM 0.5) %E from fat for the LF and LFn-3 diets respectively), and in fat quality (17.5 (SEM 0.3) and 10.4 (SEM 0.2) %E from SFA and 12.7 (SEM 0.3) and 18.7 (SEM 0.4) %E from MUFA for the HSFA and HMUFA diets respectively). In conclusion, a robust, flexible food-exchange model was developed and implemented successfully in the LIPGENE dietary intervention trial.
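The percentage-of-energy (%E) targets quoted above follow from a simple conversion: grams of a macronutrient times its energy density, divided by total dietary energy. A minimal sketch using standard Atwater factors (the carbohydrate value of 4 kcal/g is an assumption; UK food tables sometimes use 3.75 kcal/g for carbohydrate expressed as monosaccharide equivalents):

```python
# Atwater energy densities, kcal per gram (carbohydrate value assumed).
KCAL_PER_G = {"fat": 9.0, "cho": 4.0, "protein": 4.0, "alcohol": 7.0}

def percent_energy(grams, total_kcal, nutrient="fat"):
    """%E contributed by a nutrient in a diet of `total_kcal` kcal/day."""
    return 100.0 * grams * KCAL_PER_G[nutrient] / total_kcal

# Illustrative: 89 g of fat in a 2000 kcal diet is about 40 %E,
# roughly the fat target of the high-fat diets described above.
pe_fat = percent_energy(89, 2000)
```

This is the arithmetic the carbohydrate exchanges have to preserve: when fat grams are removed, enough carbohydrate energy must be added back for `total_kcal` to stay constant, keeping the diets isoenergetic.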
Abstract:
Our objective in this study was to develop and implement an effective intervention strategy to manipulate the amount and composition of dietary fat and carbohydrate (CHO) in free-living individuals in the RISCK study. The study was a randomized, controlled dietary intervention study conducted in 720 participants identified as at higher risk for, or with, metabolic syndrome. All followed a 4-wk run-in reference diet [high saturated fatty acids (SF)/high glycemic index (GI)]. Volunteers were randomized to continue this diet for a further 24 wk or to 1 of 4 isoenergetic prescriptions [high monounsaturated fatty acids (MUFA)/high GI; high MUFA/low GI; low fat (LF)/high GI; and LF/low GI]. We developed a food-exchange model to implement each diet. Dietary records and plasma phospholipid fatty acids were used to assess the effectiveness of the intervention strategy. Reported fat intake from the LF diets was significantly reduced to 28% of energy (%E) compared with 38 %E from the HM and reference diets. SF intake was successfully decreased in the HM and LF diets to ~10 %E compared with 17 %E in the reference diet (P = 0.001). Dietary MUFA in the HM diets was ~17 %E, significantly higher than in the reference (12 %E) and LF diets (10 %E) (P = 0.001). Changes in plasma phospholipid fatty acids provided further evidence for the successful manipulation of fat intake. The GI of the HGI and LGI arms differed by ~9 points (P = 0.001). The food-exchange model provided an effective dietary strategy for the design and implementation, across multiple sites, of 5 experimental diets with specific targets for the proportion of fat and CHO. J. Nutr. 139: 1534-1540, 2009.
Abstract:
The IntFOLD-TS method was developed according to the guiding principle that model quality assessment would be the most critical stage for our template-based modelling pipeline. Thus, the IntFOLD-TS method first generates numerous alternative models, using in-house versions of several different sequence-structure alignment methods, which are then ranked in terms of global quality using our top performing quality assessment method, ModFOLDclust2. In addition to the predicted global quality scores, predictions of local errors are also provided in the resulting coordinate files, using scores that represent the predicted deviation of each residue in the model from the equivalent residue in the native structure. The IntFOLD-TS method was found to generate high quality 3D models for many of the CASP9 targets, whilst also providing highly accurate predictions of their per-residue errors. This important information may help to make the 3D models produced by the IntFOLD-TS method more useful for guiding future experimental work.