270 results for "Validation par connaissance expert"
Abstract:
This research-in-progress paper reports preliminary findings of a study designed to identify the characteristics of an expert in the discipline of Information Systems (IS). The paper presents a formative research model that depicts the characteristics of an expert through three additive constructs, using concepts derived from psychology, knowledge management and social-behaviour research. The paper then explores the formation and application of ‘expertise’ using four investigative questions in the context of System Evaluations. Data have been gathered from 220 respondents representing three medium-sized companies in India that use the SAP Enterprise Resource Planning system. The paper summarizes the planned data analyses for construct validation, model testing and model application. A validated construct of IS expertise will have a wide range of implications for research and practice.
Abstract:
This paper will report on the way expert science teachers conceive of scientific literacy in their classrooms, the values related to scientific literacy they hold, and how this conception and the underpinning values affect their teaching practice. Three science teachers perceived as experts, who teach at both senior and middle school levels and across the range of sub-disciplines (one senior biology, one senior chemistry and one senior physics), were interviewed about their understanding of scientific literacy and how it influenced their teaching practice. The three teachers were video recorded teaching a junior science class and a senior science class. The data were analysed to identify the values that underpin their conceptions of science and science education. The analysis focussed on matching the verbalised conceptions and values with their practice of teaching science. This paper will report on these data.
Abstract:
Microbial pollution in water periodically affects human health in Australia, particularly in times of drought and flood, and there is an increasing need for the control of waterborne microbial pathogens. Methods that allow the origin of faecal contamination in water to be determined are generally referred to as Microbial Source Tracking (MST). Various approaches have been evaluated as indicators of microbial pathogens in water samples, including the detection of different microorganisms and of various host-specific markers. However, to date there is no universal MST method that can reliably determine the source (human or animal) of faecal contamination, so the use of multiple approaches is frequently advised. MST is currently recognised as a research tool rather than something to be included in routine practice. The main focus of this research was to develop novel and universally applicable methods to meet the demand for MST methods in the routine testing of water samples. Escherichia coli was chosen initially as the target organism for our studies because, historically and globally, it is the standard indicator of microbial contamination in water. In this thesis, three approaches are described: single nucleotide polymorphism (SNP) genotyping, clustered regularly interspaced short palindromic repeats (CRISPR) screening using high resolution melt analysis (HRMA), and phage detection based on CRISPR types. The advantage of combining SNP genotyping and CRISPR screening is discussed in this study. For the first time, a highly discriminatory single nucleotide polymorphism interrogation of an E. coli population was applied to identify host-specific clusters. Six human-specific and one animal-specific SNP profiles were revealed. SNP genotyping was successfully applied in field investigations of the Coomera watershed, South-East Queensland, Australia. Four human-specific profiles [11], [29], [32] and [45] and the animal-specific SNP profile [7] were detected in water. Two human-specific profiles, [29] and [11], were found to be prevalent in the samples over a period of years. Rainfall (24 and 72 hours), tide height and time, general land use (rural, suburban), season, distance from the river mouth and salinity showed no relationship with the diversity of SNP profiles present in the Coomera watershed (p values > 0.05). Nevertheless, the SNP genotyping method is able to identify and distinguish between human- and non-human-specific E. coli isolates in water sources within one day. In some samples, only mixed profiles were detected. To further investigate host-specificity in these mixed profiles, a CRISPR screening protocol was developed and applied to the set of E. coli isolates previously analysed for SNP profiles. CRISPR loci, which record previous attacks by DNA coliphages, were considered a promising tool for detecting host-specific markers in E. coli. Spacers in CRISPR loci could also reveal the dynamics of virulence in E. coli as well as in other waterborne pathogens. Although host-specificity was not observed in the set of E. coli analysed, CRISPR alleles were shown to be useful in detecting the geographical site of sources. HRMA allows ‘different’ and ‘same’ CRISPR alleles to be determined and can be introduced into water monitoring as a cost-effective and rapid method.
Overall, we show that the identified human-specific SNP profiles [11], [29], [32] and [45] can be useful as marker genotypes globally for the identification of human faecal contamination in water. The SNP typing approach developed in the current study can be used in water monitoring laboratories as an inexpensive, high-throughput and easily adapted protocol. A unique approach based on E. coli spacers was developed to search for unknown phages and examine host-specificity in phage sequences. Preliminary experiments on recombinant plasmids showed that this method can recover phage sequences. Future studies will determine the host-specificity of DNA phage genotyping as soon as the first reliable sequences are acquired. Ultimately, only the application of multiple approaches in MST will allow the nature of microbial contamination to be identified with higher confidence and reliability.
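To make the attribution step concrete, here is a minimal sketch, assuming isolates have already been SNP-typed: it matches the profiles found in a water sample against the human-specific profiles [11], [29], [32] and [45] and the animal-specific profile [7] reported above. Only the profile numbers come from the abstract; the data structures and function name are illustrative.

```python
# Minimal sketch of MST source attribution from SNP profiles.
# Profile numbers are from the study; everything else is illustrative.

HUMAN_PROFILES = {11, 29, 32, 45}   # human-specific SNP profiles
ANIMAL_PROFILES = {7}               # animal-specific SNP profile

def attribute_source(sample_profiles):
    """Classify a water sample from the SNP profiles of its E. coli isolates."""
    hits_human = HUMAN_PROFILES & sample_profiles
    hits_animal = ANIMAL_PROFILES & sample_profiles
    if hits_human and hits_animal:
        return "mixed human and animal contamination"
    if hits_human:
        return "human faecal contamination"
    if hits_animal:
        return "animal faecal contamination"
    return "source unresolved"

print(attribute_source({29, 11}))   # -> human faecal contamination
```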
Abstract:
An introduction to thinking about and understanding probability, highlighting the main pitfalls and traps in logical reasoning.
Abstract:
An introduction to the elicitation of experts' probabilities, illustrating common problems with reasoning and how to circumvent them during elicitation.
Abstract:
An introduction to the design of processes for eliciting knowledge from experts.
Abstract:
An introduction to eliciting a conditional probability table in a Bayesian Network model, highlighting three efficient methods for populating a CPT.
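The abstract does not name the three CPT-population methods it highlights; a minimal sketch of one widely used method of this kind, the noisy-OR parameterization, is given below under that assumption. It reduces the elicitation burden for n binary causes from 2^n table rows to n link probabilities.

```python
# Minimal sketch: populating a conditional probability table (CPT)
# with the noisy-OR parameterization. The expert supplies one link
# probability per cause instead of one probability per table row.
# Variable names and numbers are illustrative only.
from itertools import product

def noisy_or_cpt(link_probs):
    """Build P(effect=True | causes) for every combination of n binary
    causes, given per-cause link probabilities elicited from an expert."""
    cpt = {}
    for states in product([False, True], repeat=len(link_probs)):
        # P(effect) = 1 - prod(1 - p_i) over the causes that are present
        p_no_effect = 1.0
        for present, p in zip(states, link_probs):
            if present:
                p_no_effect *= (1.0 - p)
        cpt[states] = 1.0 - p_no_effect
    return cpt

# Three causes: the expert supplies 3 numbers instead of 8 rows.
cpt = noisy_or_cpt([0.9, 0.7, 0.3])
print(cpt[(True, False, True)])   # P(effect | cause1, not cause2, cause3) = 0.93
```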
Abstract:
In recent years, unsustainable subdivision development has become a significant problem in the Bangkok Metropolitan Region (BMR), Thailand. A number of government departments and agencies have tried to address the problem by introducing rating tools that encourage higher sustainability levels in subdivision development in BMR, such as the Environmental Impact Assessment Monitoring Award (EIA-MA) and the Thai’s Rating for Energy and Environmental Sustainability of New construction and major renovation (TREES-NC). However, the EIA-MA includes neighbourhood designs in its assessment criteria but applies to large projects only, while TREES-NC focuses on large-scale buildings such as condominiums and office buildings and is not specific to subdivision neighbourhood designs. Recently, a new rating tool named “Rating for Subdivision Neighbourhood Sustainability Design (RSNSD)” has been developed, but it still requires validation. This paper aims to validate this new rating tool for subdivision neighbourhood design in BMR. The RSNSD has been validated by applying it to eight case-study subdivisions; the results generated by surveying these subdivisions are compared with the existing results from the EIA-MA. The selected cases include one “Excellent Award”, two “Very Good Award”, and five non-rated subdivision developments. This paper aims to establish the credibility of RSNSD before it is introduced into real subdivision development practice. The RSNSD could be useful for encouraging higher levels of sustainable subdivision design, and thereby preventing problems arising from further subdivision development in BMR.
Abstract:
Six Sigma is considered an important management philosophy for achieving customer satisfaction, but financial service organisations have so far been slow to adopt it. Despite the extensive effort that has been invested and the benefits that can be obtained, the systematic implementation of Six Sigma in financial service organisations remains limited. As a company-wide implementation framework has been missing, this paper tries to fill that gap. Based on theory, a conceptual framework is developed and evaluated by experts from financial institutions. The results show that it is very important to link Six Sigma with both the strategic and the operational level. Furthermore, although Six Sigma is a very important method for improving the quality of processes, others such as Lean Management are also used; this requires superior project portfolio management to coordinate the resources and projects of Six Sigma with the other methods in use. Besides its theoretical contribution, the framework can be used by financial service companies to evaluate their Six Sigma activities. Thus, the framework, grounded in literature and empirical data, will be a useful guide for the sustainable and successful implementation of a Six Sigma initiative in financial service organisations.
Abstract:
This is a creative piece inspired within and by the decolonising stream that emerged at the World Congress on Action Research and Action Learning, held in Melbourne, Australia, in September 2010. It complements the other works within the ALAR Journal's special edition on research experiences within the decolonising space.
Abstract:
Precise protein quantification is essential in clinical dietetics, particularly in the management of renal, burn and malnourished patients. The EP-10 was developed to expedite the estimation of dietary protein for nutritional assessment and recommendation. The main objective of this study was to compare the validity and efficacy of the EP-10 with the American Dietetic Association’s “Exchange List for Meal Planning” (ADA-7g) in quantifying dietary protein intake, against computerised nutrient analysis (CNA). Protein intake from 197 food records kept by healthy adult subjects in Singapore was determined three times using three different methods: (1) the EP-10, (2) the ADA-7g and (3) CNA using the SERVE program (Version 4.0). Assessments using the EP-10 and ADA-7g were performed by two assessors in a blind crossover manner, while a third assessor performed the CNA; all assessors were blind to each other’s results. The time taken to assess a subsample (n = 165) using the EP-10 and ADA-7g was also recorded. The mean difference in protein intake quantification compared with the CNA was statistically non-significant for the EP-10 (1.4 ± 16.3 g, P = .239) and statistically significant for the ADA-7g (-2.2 ± 15.6 g, P = .046). Both the EP-10 and ADA-7g had clinically acceptable agreement with the CNA as determined via Bland-Altman plots, although the EP-10 tended to overestimate protein intakes above 150 g. The EP-10 required significantly less time for protein intake quantification than the ADA-7g (mean time of 65 ± 36 seconds vs. 111 ± 40 seconds, P < .001). The EP-10 and ADA-7g are valid clinical tools for protein intake quantification in an Asian context, with the EP-10 being more time-efficient. However, a dietician’s discretion is needed when the EP-10 is used for protein intakes above 150 g.
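As an illustration of the agreement analysis used above, here is a minimal Bland-Altman sketch: it computes the mean difference (bias) and the 95% limits of agreement between a quick estimation method and a reference analysis. The numbers are invented; only the analysis pattern mirrors the abstract.

```python
# Minimal Bland-Altman agreement sketch (illustrative data only).
import numpy as np

def bland_altman(method, reference):
    """Return the bias (mean difference) and 95% limits of agreement."""
    diff = np.asarray(method, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()                          # mean difference, g protein
    sd = diff.std(ddof=1)                       # sample SD of differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

ep10 = [72, 95, 110, 64, 88]   # hypothetical quick estimates (g)
cna  = [70, 98, 105, 66, 85]   # hypothetical reference CNA values (g)
bias, (lo, hi) = bland_altman(ep10, cna)
print(f"bias = {bias:.1f} g, 95% LoA = ({lo:.1f}, {hi:.1f}) g")
```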
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability to display and manipulate information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, among others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content falls entirely on developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed to enable reasoning about the colour and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and colour spaces to a single, environment-independent, geometric transformation space; such a mapping was used to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution for testing their virtual world software and digital content.
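One simple way to quantify visual correctness in the spirit described above is a reference-image comparison; the sketch below flags a rendered frame whose mean per-pixel error exceeds a tolerance. This is an illustrative stand-in, not the thesis's actual model-based or connectionist techniques.

```python
# Minimal sketch of automated visual-consistency checking:
# compare a rendered frame to a known-good reference image.
import numpy as np

def frame_is_consistent(rendered, reference, tol=2.0):
    """True if mean absolute per-channel error is within tol
    (in 8-bit intensity units)."""
    rendered = np.asarray(rendered, dtype=np.float64)
    reference = np.asarray(reference, dtype=np.float64)
    if rendered.shape != reference.shape:
        return False                      # resolution mismatch is a failure
    return np.abs(rendered - reference).mean() <= tol

# Hypothetical 4x4 RGB frames: identical except one corrupted pixel.
ref = np.zeros((4, 4, 3), dtype=np.uint8)
out = ref.copy()
out[0, 0] = (255, 0, 0)                   # simulated rendering defect
print(frame_is_consistent(out, ref))      # -> False (defect exceeds tolerance)
```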
Abstract:
The lack of a satisfactory consensus for characterizing system intelligence, and of structured analytical decision models, has prevented developers and practitioners from understanding and configuring optimum intelligent building systems in a fully informed manner. So far, little research has been conducted in this area. This research is designed to identify the key intelligence indicators and to develop analytical models for computing the system intelligence score of a smart building system in an intelligent building. The integrated building management system (IBMS) was used as an illustrative example to present the framework. The models presented in this study apply system intelligence theory and a conceptual analytical framework. A total of 16 key intelligence indicators were first identified from a general survey. Then, two multi-criteria decision making (MCDM) approaches, the analytic hierarchy process (AHP) and the analytic network process (ANP), were employed to develop the system intelligence analytical models. The top intelligence indicators of the IBMS include: self-diagnosis of operation deviations; an adaptive limiting control algorithm; and year-round time schedule performance. The conceptual framework was then transformed into a practical model, whose effectiveness was evaluated by means of expert validation. The main contribution of this research is to promote understanding of the intelligence indicators, and to lay the foundation for a systemic framework that provides developers and building stakeholders with a consolidated, inclusive tool for evaluating the system intelligence of proposed component design configurations.
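To illustrate the AHP step, the sketch below derives priority weights for three of the reported top indicators from a pairwise comparison matrix via the principal eigenvector, with Saaty's consistency check. The matrix values are invented for illustration; they are not the study's elicited judgements.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix.
import numpy as np

indicators = ["self-diagnosis", "adaptive limiting control", "time-schedule performance"]

# Saaty-scale comparisons (illustrative): A[i][j] = importance of i over j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # normalised priority weights

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58                             # random index RI = 0.58 for n = 3
for name, weight in zip(indicators, w):
    print(f"{name}: {weight:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.1)")
```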
Abstract:
The underlying objective of this study was to develop a novel approach to evaluating the potential for commercialisation of a new technology. More specifically, this study examined the 'ex-ante' evaluation of the technology transfer process. For this purpose, a technology originating from the high technology sector was used. The technology relates to the application of software for the detection of weak signals from space, an established method of signal processing in the field of radio astronomy. This technology has the potential to be used in commercial and industrial areas other than astronomy, such as detecting water leakage in pipes. Its applicability to detecting water leakage was chosen owing to several problems with detection in the industry, as well as the impact it can have on saving water in the environment. This study therefore demonstrates the importance of interdisciplinary technology transfer. The study employed both technical and business evaluation methods, including laboratory experiments and the Delphi technique, to address the research questions. There are several findings from this study. Firstly, scientific experiments were conducted that resulted in a proof of concept for the chosen technology. Secondly, criteria from the literature that can be used for 'ex-ante' evaluation of technology transfer were validated and refined. Additionally, after testing the chosen technology's overall transfer potential using the modified set of criteria, it was found that the technology is still in its early stages and will require further development before it can be commercialised. Furthermore, a final evaluation framework was developed encompassing all the criteria found to be important. This framework can help in assessing the overall readiness of a technology for transfer, as well as in recommending a viable mechanism for commercialisation. On the whole, the commercial potential of the chosen technology was tested through expert opinion, thereby focusing on the impact of a new technology and the feasibility of alternate and potential future applications.
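As an illustration of how a Delphi round can be summarised, the sketch below aggregates hypothetical expert ratings of transfer-readiness criteria into a median and interquartile range, the feedback typically returned to the panel between rounds. The criterion names and ratings are invented for illustration.

```python
# Minimal Delphi-round aggregation sketch (illustrative data only).
import statistics

def delphi_round(ratings):
    """Summarise one Delphi round: median rating and interquartile range."""
    q = statistics.quantiles(ratings, n=4)
    return statistics.median(ratings), (q[0], q[2])

# Hypothetical 1-5 ratings from five experts per criterion.
criteria = {
    "technical feasibility":  [4, 5, 4, 3, 4],
    "market readiness":       [2, 3, 2, 2, 4],
    "alternate applications": [3, 4, 4, 5, 3],
}
for name, ratings in criteria.items():
    med, (q1, q3) = delphi_round(ratings)
    print(f"{name}: median={med}, IQR=({q1}, {q3})")
```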