61 results for Confusion Assessment Method
Abstract:
Project managers in the construction industry increasingly seek to learn from other industrial sectors. Knowledge sharing between different contexts is thus viewed as an essential source of competitive advantage. It is therefore important for project managers from all sectors to address and develop appropriate methods of knowledge sharing. However, too often it is assumed that knowledge freely exists and can be captured and shared between contexts. Such assumptions belie complexities and problems awaiting the unsuspecting knowledge-sharing protagonist. Knowledge per se is a problematic, esoteric concept that does not lend itself easily to codification. Specifically, tacit knowledge possessed by individuals presents particular methodological issues for those considering harnessing its utility in return for competitive advantage. The notion that knowledge is also embedded in specific social contexts compounds this complexity. It is argued that knowledge is highly individualistic and concomitant with the various surrounding contexts within which it is shaped and enacted. Indeed, these contexts are themselves shaped as a consequence of knowledge, adding further complexity to the problem domain. Current methods of knowledge capture, transfer and sharing fall short of addressing these problematic issues. Research is presented that addresses these problems and proposes an alternative method of knowledge sharing. Drawing on data and observations collected from its application, the findings clearly demonstrate the crucial role of re-contextualisation, social interaction and dialectic debate in understanding knowledge sharing.
Abstract:
Until recently, there has been little investigation of poor indoor air quality (IAQ) in classrooms. Despite evidence that the educational building systems in many UK institutions have significant defects that may degrade IAQ, systematic assessments of IAQ measurements have rarely been undertaken. When undertaking IAQ measurement, representing and characterizing the environment parameters is a difficult task. Although technologies exist to measure these parameters, direct measurements, especially in naturally ventilated spaces, are often difficult. This paper presents a methodology for characterizing indoor environment flow parameters as well as Carbon Dioxide (CO2) concentrations. CO2 concentration levels can be influenced by differences in the selection of sampling points and heights. However, because this research focuses on natural ventilation in classrooms, air exchange is provided mainly by air infiltration. It is hoped that the methodology developed and evaluated in this research can effectively simplify the process of estimating the parameters for a systematic assessment of IAQ measurements in naturally ventilated classrooms.
Abstract:
A generic model of Exergy Assessment is proposed for the Environmental Impact of the Building Lifecycle, with a special focus on the natural environment. Three environmental impacts: energy consumption, resource consumption and pollutant discharge have been analyzed with reference to energy-embodied exergy, resource chemical exergy and abatement exergy, respectively. The generic model of Exergy Assessment of the Environmental Impact of the Building Lifecycle thus formulated contains two sub-models, one from the aspect of building energy utilization and the other from building materials use. Combined with theories by ecologists such as Odum, the paper evaluates a building's environmental sustainability through its exergy footprint and environmental impacts. A case study from Chongqing, China illustrates the application of this method. From the case study, it was found that energy consumption constitutes 70–80% of the total environmental impact during a 50-year building lifecycle, in which the operation phase accounts for 80% of the total environmental impact, the building material production phase for 15%, and the other phases for 5%.
Abstract:
In any undergraduate engineering programme there is a need to assess the balance and flavour of the various educational strands. In order for a quality assurance of these programmes to be met, there is a need to evaluate the course load, academic content and the assessment marks of each course in the undergraduate programme. The existing range of QA methods for these programmes is focused on one or two of these issues and does not provide a comprehensive assessment procedure. Following a review of the existing QA methods, this paper will define a three-dimensional approach to the assessment of the educational aspects of an undergraduate course. Various features of this method will be described and potential benefits explained.
Abstract:
Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
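The abstract does not name the four confidence factors or the points of its ordinal scale, so the following is only a minimal sketch of ordinal, factor-based confidence scoring under assumed factor names and a weakest-link aggregation rule (all hypothetical, not the paper's actual method):

```python
# Hypothetical ordinal scale and confidence factors -- the paper's actual
# four factors and scale may differ.
SCALE = ["none", "low", "medium", "high"]

def overall_confidence(ratings):
    """Weakest-link aggregation: overall confidence in the requirements
    set is the lowest rating among the individual confidence factors."""
    return min(ratings.values(), key=SCALE.index)

ratings = {
    "stakeholder agreement": "high",
    "requirements coverage": "medium",
    "stability over time": "medium",
    "testability": "low",
}
print(overall_confidence(ratings))  # -> low
```

A weakest-link rule is one plausible reading of "is it safe to proceed": a single low-confidence factor should block progression to design regardless of how strong the others are.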
Abstract:
The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed by using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran's I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
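The Moran's I subsampling step described above can be sketched as follows. The binary distance-band weights matrix, band width, acceptance threshold and synthetic points are all illustrative assumptions (the study itself used a significance test on Moran's I, not a raw threshold):

```python
import numpy as np

def morans_i(values, coords, band=1000.0):
    """Moran's I for point data using a binary distance-band weights matrix
    (w_ij = 1 if 0 < d_ij <= band, else 0)."""
    x = np.asarray(values, dtype=float)
    n = len(x)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = ((d > 0) & (d <= band)).astype(float)  # no self-neighbours
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

rng = np.random.default_rng(0)
coords = rng.uniform(0, 5000, size=(200, 2))   # hypothetical shoreline points (m)
elev = rng.normal(10.0, 0.3, size=200)         # hypothetical water elevations (m)

# Draw random subsamples and keep only those showing little spatial
# dependency; a crude |I| threshold stands in for a proper permutation test.
kept = []
for _ in range(50):
    idx = rng.choice(200, size=40, replace=False)
    if abs(morans_i(elev[idx], coords[idx])) < 0.1:
        kept.append(idx)
```

The model would then be calibrated once per subsample in `kept` and the resulting optimum friction parameters averaged, as the abstract describes.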
Abstract:
Integrated Arable Farming Systems (IAFS), which involve a reduction in the use of off-farm inputs, are attracting considerable research interest in the UK. The objectives of these systems experiments are to compare their financial performance with that from conventional or current farming practices. To date, this comparison has taken little account of any environmental benefits (or disbenefits) of the two systems. The objective of this paper is to review the assessment methodologies available for the analysis of environmental impacts. To illustrate the results of this exercise, the methodology and environmental indicators chosen are then applied to data from one of the LINK - Integrated Farming Systems experimental sites. Data from the Pathhead site in Southern Scotland are used to evaluate the use of invertebrates and nitrate loss as environmental indicators within IAFS. The results suggest that between 1992 and 1995 the biomass of earthworms fell by 28 kg per hectare on the integrated rotation and rose by 31 kg per hectare on the conventional system. This led to environmental costs ranging between £2.24 and £13.44 per hectare for the integrated system and gains of between £2.48 and £14.88 for the conventional system. In terms of nitrate, the integrated system had an estimated loss of £72.21 per hectare in comparison to £149.40 per hectare on the conventional system. Conclusions are drawn about the advantages and disadvantages of this type of analytical framework. Keywords: Farming systems; IAFS; Environmental valuation; Economics; Earthworms; Nitrates; Soil fauna
Abstract:
Accumulating data suggest that diets rich in flavanols and procyanidins are beneficial for human health. In this context, there has been a great interest in elucidating the systemic levels and metabolic profiles at which these compounds occur in humans. While recent progress has been made, there still exist considerable differences and various disagreements with regard to the mammalian metabolites of these compounds, which in turn is largely a consequence of the lack of availability of authentic standards that would allow for the directed development and validation of expedient analytical methodologies. In the present study, we developed a method for the analysis of structurally-related flavanol metabolites using a wide range of authentic standards. Applying this method in the context of a human dietary intervention study using comprehensively characterized and standardized flavanol- and procyanidin-containing cocoa, we were able to identify the structurally-related (−)-epicatechin metabolites (SREM) postprandially extant in the systemic circulation of humans. Our results demonstrate that (−)-epicatechin-3′-β-D-glucuronide, (−)-epicatechin-3′-sulfate, and a 3′-O-methyl(−)-epicatechin-5/7-sulfate are the predominant SREM in humans, and further confirm the relevance of the stereochemical configuration in the context of flavanol metabolism. In addition, we also identified plausible causes for the previously reported discrepancies regarding flavanol metabolism, consisting to a significant extent of inter-laboratory differences in sample preparation (enzymatic treatment and sample conditioning for HPLC analysis) and detection systems. Thus, these findings may also aid in the establishment of consensus on this topic.
Abstract:
This paper describes a new method for the assessment of palaeohydrology through the Holocene. A palaeoclimate model was linked with a hydrological model, using a weather generator to correct bias in the rainfall estimates, to simulate the changes in the flood frequency and the groundwater response through the late Pleistocene and Holocene for the Wadi Faynan in southern Jordan, a site considered internationally important due to its rich archaeological heritage spanning the Pleistocene and Holocene. This is the first study to describe the hydrological functioning of the Wadi Faynan, a meso-scale (241 km²) semi-arid catchment, setting this description within the framework of contemporary archaeological investigations. Historic meteorological records were collated and supplemented with new hydrological and water quality data. The modelled outcomes indicate that environmental changes, such as deforestation, had a major impact on the local water cycle and this amplified the effect of the prevailing climate on the flow regime. The results also show that increased rainfall alone does not necessarily imply better conditions for farming and highlight the importance of groundwater. The discussion focuses on the utility of the method and the importance of the local hydrology to the sustained settlement of the Wadi Faynan through pre-history and history.
Abstract:
The assessment of building energy efficiency is one of the most effective measures for reducing building energy consumption. This paper proposes a holistic method (HMEEB) for assessing and certifying building energy efficiency based on the D-S (Dempster-Shafer) theory of evidence and the Evidential Reasoning (ER) approach. HMEEB has three main features: (i) it provides both a method to assess and certify building energy efficiency, and exists as an analytical tool to identify improvement opportunities; (ii) it combines a wealth of information on building energy efficiency assessment, including identification of indicators and a weighting mechanism; and (iii) it provides a method to identify and deal with inherent uncertainties within the assessment procedure. This paper demonstrates the robustness, flexibility and effectiveness of the proposed method, using two examples to assess the energy efficiency of two residential buildings, both located in the ‘Hot Summer and Cold Winter’ zone in China. The proposed certification method provides detailed recommendations for policymakers in the context of carbon emission reduction targets and promoting energy efficiency in the built environment. The method is transferable to other countries and regions, using an indicator weighting system to modify local climatic, economic and social factors.
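The D-S combination step at the core of such evidential-reasoning aggregation can be illustrated with Dempster's rule. The grades, indicators and mass assignments below are hypothetical examples, not HMEEB's actual indicator set or weights:

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; mass on conflicting (disjoint) pairs is
    discarded and the remainder renormalised."""
    combined = {}
    conflict = 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence from two indicators about a building's efficiency
# grade in the frame {A, B, C}; masses are illustrative only.
theta = frozenset({"A", "B", "C"})
m_envelope = {frozenset({"A"}): 0.6, theta: 0.4}   # envelope indicator
m_hvac = {frozenset({"A", "B"}): 0.5, theta: 0.5}  # HVAC indicator
m = dempster(m_envelope, m_hvac)
```

Assigning residual mass to the whole frame `theta` is how D-S theory represents the "inherent uncertainty" the abstract mentions: evidence that cannot discriminate between grades is kept as ignorance rather than forced onto a single grade.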
Abstract:
Purpose NANA is a 3-year project using sensitively designed technology to improve data collection and integrate information on nutrition, physical and cognitive function and mental health, to identify individuals at risk of under-nourishment and improve the targeting of interventions. This research will also improve our understanding of the interactions between these factors, in order to improve medical treatment and social provision. The toolkit has potential for commercial development for additional segments of the population. Method This is a multi-disciplinary programme involving psychology, nutrition, engineering and software engineering. The first phase is a user needs analysis and will involve consulting with a broad cross-section of older people, caregivers, and health professionals, to establish what technical approaches would be useful and acceptable. The second phase focuses on the development of an integrated measurement toolkit. There are three inter-related subsections: (i) an iterative programme to develop the assessment technology, (ii) techniques for dietary assessment in older people, and (iii) a parallel investigation of measures of cognition and mental health in older people. It includes a full validation of the assessment toolkit and will comprise a comparison of the new, integrated assessment with traditional 'pen and paper' methods, with volunteers having the equipment installed in their homes.
Abstract:
The accurate assessment of dietary exposure is important in investigating associations between diet and disease. Research in nutritional epidemiology, which has resulted in a large amount of information on associations between diet and chronic diseases in the last decade, relies on accurate assessment methods to identify these associations. However, most dietary assessment instruments rely to some extent on self-reporting, which is prone to systematic bias affected by factors such as age, gender, social desirability and approval. Nutritional biomarkers are not affected by these and therefore provide an additional, alternative method to estimate intake. However, there are also some limitations in their application: they are affected by inter-individual variations in metabolism and other physiological factors, and they are often limited to estimating intake of specific compounds and not entire foods. It is therefore important to validate nutritional biomarkers to determine specific strengths and limitations. In this perspective paper, criteria for the validation of nutritional markers and future developments are discussed.
Abstract:
Motivation: The ability of a simple method (MODCHECK) to determine the sequence–structure compatibility of a set of structural models generated by fold recognition is tested in a thorough benchmark analysis. Four Model Quality Assessment Programs (MQAPs) were tested on 188 targets from the latest LiveBench-9 automated structure evaluation experiment. We systematically test and evaluate whether the MQAP methods can successfully detect native-like models. Results: We show that, compared with the other three methods tested, MODCHECK is the most reliable method for consistently performing the best top model selection and for ranking the models. In addition, we show that the choice of model similarity score used to assess a model's similarity to the experimental structure can influence the overall performance of these tools. Although these MQAP methods fail to improve the model selection performance for methods that already incorporate protein three-dimensional (3D) structural information, an improvement is observed for methods that are purely sequence-based, including the best profile–profile methods. This suggests that even the best sequence-based fold recognition methods can still be improved by taking into account the 3D structural information.
Abstract:
Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2, to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence to structure alignment methods tested. In addition, using accurate global model quality for re-ranking alignments, prior to selection, further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
Abstract:
The estimation of prediction quality is important because without quality measures, it is difficult to determine the usefulness of a prediction. Currently, methods for ligand binding site residue predictions are assessed in the function prediction category of the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) experiment, utilizing the Matthews Correlation Coefficient (MCC) and Binding-site Distance Test (BDT) metrics. However, the assessment of ligand binding site predictions using such metrics requires the availability of solved structures with bound ligands. Thus, we have developed a ligand binding site quality assessment tool, FunFOLDQA, which utilizes protein feature analysis to predict ligand binding site quality prior to the experimental solution of the protein structures and their ligand interactions. The FunFOLDQA feature scores were combined using: simple linear combinations, multiple linear regression and a neural network. The neural network produced significantly better results for correlations to both the MCC and BDT scores, according to Kendall's τ, Spearman's ρ and Pearson's r correlation coefficients, when tested on both the CASP8 and CASP9 datasets. The neural network also produced the largest Area Under the Curve (AUC) score when Receiver Operating Characteristic (ROC) analysis was undertaken for the CASP8 dataset. Furthermore, the FunFOLDQA algorithm incorporating the neural network is shown to add value to FunFOLD, when both methods are employed in combination. This results in a statistically significant improvement over all of the best server methods, the FunFOLD method (6.43%), and one of the top manual groups (FN293) tested on the CASP8 dataset. The FunFOLDQA method was also found to be competitive with the top server methods when tested on the CASP9 dataset.
To the best of our knowledge, FunFOLDQA is the first attempt to develop a method that can be used to assess ligand binding site prediction quality, in the absence of experimental data.
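The score-combination and rank-correlation evaluation described above can be sketched as follows, using synthetic feature scores in place of the real FunFOLDQA features; only the multiple-linear-regression variant is shown, and the feature weights, noise level and sample size are illustrative assumptions:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no tie correction), as used to compare
    predicted quality scores against observed MCC/BDT scores."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

rng = np.random.default_rng(1)
features = rng.normal(size=(100, 3))     # synthetic stand-ins for feature scores
true_quality = features @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 0.1, 100)

# Multiple linear regression: least-squares fit of feature weights + intercept.
X = np.hstack([features, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(X, true_quality, rcond=None)
predicted = X @ w
rho = spearman_rho(predicted, true_quality)
```

In the paper's setting, `true_quality` corresponds to the observed MCC or BDT scores, and the fitted combination is the baseline that the neural network was shown to outperform.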