97 results for Genotyping uncertainty
at Queensland University of Technology - ePrints Archive
Abstract:
Sales growth and employment growth are the two most widely used growth indicators for new ventures; yet, sales growth and employment growth are not interchangeable measures of new venture growth. Rather, they are related, but somewhat independent constructs that respond differently to a variety of criteria. Most of the literature treats this as a methodological technicality. However, sales growth with or without accompanying employment growth has very different implications for managers and policy makers. A better understanding of what drives these different growth metrics has the potential to lead to better decision making. To improve that understanding we apply transaction cost economics reasoning to predict when sales growth will be or will not be accompanied by employment growth. Our results indicate that our predictions are borne out consistently in resource-constrained contexts but not in resource-munificent contexts.
Abstract:
Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure such as for maintenance, rehabilitation and construction works can pose risks, and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concept of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts.
For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
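The likelihood-consequence rating approach mentioned above for risks that resist quantification can be sketched as a simple scoring matrix. The scale labels and banding rule below are illustrative assumptions for this sketch, not the actual scales of the Australian Standard or the Australian Defence Organisation system.

```python
# Illustrative qualitative risk rating via a likelihood-consequence matrix.
# Scale labels and band thresholds are assumptions, not the standard's values.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
CONSEQUENCE = {"insignificant": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

def risk_rating(likelihood: str, consequence: str) -> str:
    """Combine qualitative likelihood and consequence into a risk band."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# e.g. a "possible" event with "major" consequences rates as "high"
print(risk_rating("possible", "major"))
```

Rated risks can then be triaged: only those in the upper bands warrant explicit mitigation within the project scope.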
Abstract:
A study has been conducted to investigate current practices in decision-making under risk and uncertainty for infrastructure project investments. It was found that many countries, including the UK, France and Germany in Europe as well as Australia, use scenarios to investigate the effects of risk and uncertainty on project investments. Different alternative scenarios are mostly considered during the engineering economic cost-benefit analysis stage. For instance, the World Bank requires an analysis of risks in all project appraisals. Risk in economic evaluation needs to be addressed by calculating the sensitivity of the rate of return for a number of events. Risks and uncertainties of project developments arise from various sources of error, including data, model and forecasting errors. It was found that the most influential factors affecting risk and uncertainty resulted from forecasting errors; data errors and model errors have trivial effects. It was argued by many analysts that scenarios do not forecast what will happen but only indicate what can happen under the given alternatives. It was suggested that probability distributions of the end-products of project appraisal, such as cost-benefit ratios that take forecasting errors into account, are feasible decision tools for economic evaluation. Political, social and environmental as well as economic and other related risk issues have been addressed and included in decision-making frameworks, such as multi-criteria decision-making frameworks. However, no suggestion has been made on how to incorporate risk into the investment decision-making process.
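The suggestion that probability distributions of appraisal end-products such as cost-benefit ratios can serve as decision tools can be illustrated with a minimal Monte Carlo sketch; the forecast values and error spreads below are hypothetical.

```python
# Minimal Monte Carlo sketch: a benefit-cost ratio becomes a probability
# distribution once forecasting errors are modelled. All numbers are invented.
import random

random.seed(42)

def simulated_bcr(benefit_forecast, cost_forecast, benefit_error_sd, cost_error_sd, n=10_000):
    """Sample benefit-cost ratios under multiplicative (lognormal) forecast errors."""
    ratios = []
    for _ in range(n):
        benefit = benefit_forecast * random.lognormvariate(0.0, benefit_error_sd)
        cost = cost_forecast * random.lognormvariate(0.0, cost_error_sd)
        ratios.append(benefit / cost)
    return ratios

ratios = simulated_bcr(120.0, 100.0, 0.25, 0.10)
prob_viable = sum(r > 1.0 for r in ratios) / len(ratios)
print(f"P(BCR > 1) is roughly {prob_viable:.2f}")
```

A decision-maker then sees not a single point estimate of 1.2 but the probability that the project clears the break-even ratio.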
Abstract:
Purpose: Choosing the appropriate procurement system for construction projects is a complex and challenging task for clients, particularly when professional advice has not been sought. To assist with the decision-making process, a range of procurement selection tools and techniques have been developed by both academic and industry bodies. Public sector clients in Western Australia (WA) remain uncertain about the pairing of procurement method to bespoke construction project and how this decision will ultimately impact upon project success. This paper examines ‘how and why’ a public sector agency selected particular procurement methods.
Methodology/Approach: An analysis of two focus group workshops (with 18 senior project and policy managers involved with procurement selection) is reported upon.
Findings: The traditional lump sum (TLS) method is still the preferred procurement path even though alternative forms such as design and construct or public-private partnerships could optimise the project outcome. Paradoxically, workshop participants agreed that alternative procurement forms should be considered, but an embedded culture of uncertainty avoidance invariably meant that TLS methods were selected. Senior managers felt that only a limited number of contractors have the resources and experience to deliver projects using the non-traditional methods considered.
Research limitations/implications: The research identifies a need to develop a framework that public sector clients can use to select an appropriate procurement method. A procurement framework should be able to guide the decision-maker rather than provide a prescriptive solution. Learning from previous experiences with regard to procurement selection will further provide public sector clients with knowledge about how best to deliver their projects.
Abstract:
Information uncertainty, which is inherent in many real-world applications, brings additional complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2) to review and assess current progress; and (3) to discuss remaining research challenges. We focus on four areas: (1) information uncertainty modelling; (2) visualisation techniques; (3) management of information uncertainty modelling, propagation and visualisation; and (4) the uptake of uncertainty visualisation in application domains.
Abstract:
The effects of particulate matter on the environment and public health have been widely studied in recent years. A number of studies in the medical field have tried to identify the specific effect on human health of particulate exposure, but agreement amongst these studies on the relative importance of particle size and origin with respect to health effects is still lacking. Nevertheless, air quality standards, like epidemiological attention, are moving towards a greater focus on smaller particles. Current air quality standards only regulate the mass of particulate matter less than 10 μm in aerodynamic diameter (PM10) and less than 2.5 μm (PM2.5). The most reliable method for measuring Total Suspended Particles (TSP), PM10, PM2.5 and PM1 is the gravimetric method, since it directly measures PM concentration, guaranteeing effective traceability to international standards. This technique, however, cannot capture short-term intra-day variations in atmospheric parameters that influence ambient particle concentration and size distribution (emission strengths of particle sources, temperature, relative humidity, wind direction and speed, and mixing height), or in human activity patterns that may also vary over periods considerably shorter than 24 hours. A continuous method to measure the number size distribution and total number concentration in the range 0.014 – 20 μm is the tandem system constituted by a Scanning Mobility Particle Sizer (SMPS) and an Aerodynamic Particle Sizer (APS). In this paper, an uncertainty budget model of the measurement of airborne particle number, surface area and mass size distributions is proposed and applied to several typical aerosol size distributions.
The estimation of such an uncertainty budget presents several difficulties due to (i) the complexity of the measurement chain and (ii) the fact that the SMPS and APS can properly guarantee traceability to the International System of Measurements only in terms of number concentration; the surface area and mass concentration must be estimated on the basis of separately determined average density and particle morphology.
Keywords: SMPS-APS tandem system, gravimetric reference method, uncertainty budget, ultrafine particles.
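As a minimal illustration of the point about deriving mass from number concentration, a first-order (GUM-style) combination of relative uncertainties for a single size bin might look like this; the input uncertainty values are invented, and the sketch ignores correlations and the rest of the measurement chain.

```python
# First-order uncertainty combination for a mass concentration derived as
# m = N * rho * (pi/6) * d^3. For a product of quantities, relative variances
# add; the diameter enters cubed, so its sensitivity coefficient is 3.
# The example input uncertainties are hypothetical.
import math

def mass_concentration_uncertainty(u_number, u_density, u_diameter):
    """Relative standard uncertainty of the derived mass concentration."""
    return math.sqrt(u_number**2 + u_density**2 + (3 * u_diameter)**2)

# e.g. 5% on number concentration, 10% on assumed density, 3% on diameter
u_m = mass_concentration_uncertainty(0.05, 0.10, 0.03)
print(f"relative uncertainty of mass concentration is roughly {u_m:.1%}")
```

Note how the cubed diameter dependence makes even a modest sizing uncertainty a dominant contributor, which is one reason the mass estimate is so much harder to make traceable than the number concentration.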
Abstract:
One of the new challenges in aeronautics is combining and accounting for multiple disciplines while considering uncertainties or variability in the design parameters or operating conditions. This paper describes a methodology for robust multidisciplinary design optimisation when there is uncertainty in the operating conditions. The methodology, which is based on canonical evolutionary algorithms, is enhanced by its coupling with an uncertainty analysis technique. The paper illustrates the use of this methodology on two practical test cases related to Unmanned Aerial Systems (UAS). These are ideal candidates due to the multi-physics involved and the variability of the missions to be performed. Results obtained from the optimisation show that the method is effective in finding useful Pareto non-dominated solutions and demonstrates the use of robust design techniques.
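One generic way to realise robustness in an evolutionary search, not necessarily this paper's exact formulation, is to score each candidate on the mean and spread of its performance over sampled operating conditions rather than at a single design point; the toy drag model below is invented for illustration.

```python
# Generic robust-evaluation sketch: each design is judged on the mean and
# spread of performance over sampled operating conditions. An evolutionary
# optimiser would treat (mean, spread) as two objectives to minimise.
# The drag model and all numbers are invented.
import random
import statistics

random.seed(0)

def drag(design_area: float, mach: float) -> float:
    """Toy performance model: drag grows with area and Mach number."""
    return design_area * (1.0 + 0.5 * mach**2)

def robust_objectives(design_area, mach_nominal=0.6, mach_sd=0.05, n=200):
    """(mean, stdev) of drag over sampled uncertain operating conditions."""
    samples = [drag(design_area, random.gauss(mach_nominal, mach_sd)) for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

mean_d, sd_d = robust_objectives(2.0)
print(f"mean drag {mean_d:.3f}, spread {sd_d:.3f}")
```

Designs that look best at the nominal Mach number but degrade sharply off-design acquire a large spread and are pushed off the Pareto front.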
Abstract:
Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental monetary net benefits of $948 per catheter; but there was a 62% probability of error in this conclusion.
Although the minocycline and rifampicin-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good quality evidence in this area.
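The headline decision metric reported above, net monetary benefit, has a simple closed form that can be computed directly; the numbers in the example are placeholders, not the study's estimates.

```python
# Incremental net monetary benefit: NMB = wtp * delta_QALYs - delta_cost,
# compared between the coated and uncoated catheters. Example inputs are
# illustrative placeholders, not the study's figures.

def incremental_nmb(delta_qalys: float, delta_cost: float, wtp: float) -> float:
    """Incremental net monetary benefit at willingness-to-pay `wtp` per QALY.
    Positive values favour the new intervention."""
    return wtp * delta_qalys - delta_cost

# e.g. an intervention gaining 0.02 QALYs while *saving* $100 per patient,
# valued at $50,000 per QALY:
print(incremental_nmb(0.02, -100.0, 50_000.0))  # 1100.0
```

In a probabilistic analysis this quantity is recomputed for each sampled parameter set, and the fraction of samples in which it stays positive gives the probability that the adoption decision is correct.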
Abstract:
Using the generative processes developed over two stages of creative development and the performance of The Physics Project at the Loft at the Creative Industries Precinct at the Queensland University of Technology (QUT) from 5th – 8th April 2006 as a case study, this exegesis considers how the principles of contemporary physics can be reframed as aesthetic principles in the creation of contemporary performance. The Physics Project is an original performance work that melds live performance, video and web-casting and overlaps an exploration of personal identity with the physics of space, time, light and complementarity. It considers the acts of translation between the language of physics and the language of contemporary performance that occur via process and form. This exegesis also examines the devices in contemporary performance making and contemporary performance that extend the reach of the performance, including the integration of the live and the mediated and the use of metanarratives.
Abstract:
The highly variable flagellin-encoding flaA gene has long been used for genotyping Campylobacter jejuni and Campylobacter coli. High-resolution melting (HRM) analysis is emerging as an efficient and robust method for discriminating DNA sequence variants. The objective of this study was to apply HRM analysis to flaA-based genotyping. The initial aim was to identify a suitable flaA fragment. It was found that the PCR primers commonly used to amplify the flaA short variable repeat (SVR) yielded a mixed PCR product unsuitable for HRM analysis. However, a PCR primer set composed of the upstream primer used to amplify the fragment used for flaA restriction fragment length polymorphism (RFLP) analysis and the downstream primer used for flaA SVR amplification generated a very pure PCR product, and this primer set was used for the remainder of the study. Eighty-seven C. jejuni and 15 C. coli isolates were analyzed by flaA HRM and also partial flaA sequencing. There were 47 flaA sequence variants, and all were resolved by HRM analysis. The isolates used had previously also been genotyped using single-nucleotide polymorphisms (SNPs), binary markers, CRISPR HRM, and flaA RFLP. flaA HRM analysis provided resolving power multiplicative to the SNPs, binary markers, and CRISPR HRM and largely concordant with the flaA RFLP. It was concluded that HRM analysis is a promising approach to genotyping based on highly variable genes.
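HRM resolves sequence variants because even single-base changes shift a product's melting behaviour. As a crude intuition for why base composition matters, the Wallace rule below estimates melting temperature for short oligonucleotides from base counts; real HRM instruments compare whole normalised melt curves rather than a single Tm, and the sequences shown are invented.

```python
# Wallace-rule Tm estimate (valid as a rough rule for short oligos only),
# used here purely to illustrate why sequence variants melt differently.
# The two "variants" are invented example sequences.

def wallace_tm(seq: str) -> int:
    """Estimate melting temperature as 2*(A+T) + 4*(G+C) degrees C."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

variant_1 = "ATGCATGCATGC"
variant_2 = "ATGCATGCATAC"  # single G -> A substitution
print(wallace_tm(variant_1), wallace_tm(variant_2))  # 36 34
```

Even this toy model shows a single substitution shifting the estimated melting temperature, which is the signal HRM exploits at far finer resolution.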
Abstract:
Genetically modified (GM) food products are the source of much controversy, and in the context of consumer behaviour, the way in which consumers perceive such food products is of paramount importance both theoretically and practically. Despite this, relatively little research has focused on GM food products from a consumer perspective, and as such, this study seeks to better understand what affects consumer willingness to buy GM food products among Australian consumers.
Abstract:
CFD has been successfully used in the optimisation of aerodynamic surfaces for a given set of parameters such as Mach number and angle of attack. In carrying out a multidisciplinary design optimisation, one deals with situations where the parameters have some uncertainty attached. Any optimisation carried out for fixed values of the input parameters gives a design which may be totally unacceptable under off-design conditions. The challenge is to develop a robust design procedure which takes into account the fluctuations in the input parameters. In this work, we attempt this using a modified Taguchi approach, incorporated into an evolutionary algorithm with many features developed in-house. The method is tested on a UCAV design which simultaneously handles aerodynamics, electromagnetics and manoeuvrability. Results demonstrate that the method has considerable potential.
Abstract:
Campylobacter jejuni followed by Campylobacter coli contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to large scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost effective, interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. This present study showed that SNP + binary typing alone or in combination are effective at detecting epidemiological linkage between chicken derived Campylobacter isolates and enable data comparisons with other MLST based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. 
Common genotypes between the two collections of isolates were identified and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated to the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM and fits the PHRANA (Progressive hierarchical resolving assays using nucleic acids) approach for genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) “Minimum SNPs” and ii) the new ‘HRMtype’ software packages. Species specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The association of clonal complexes (CC) with each isolate by ‘Minim typing’ and SNP + binary typing was used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost effective method to interrogate MLST genes. However, it is not expected to be independent of, or meet the resolution of, sequence based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform and is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost effective and ideally suited to rapid and high throughput genotyping of these common food-borne pathogens.
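The "resolving power" that a combinatorial typing scheme adds is commonly quantified with Simpson's index of diversity over the genotype partition: the probability that two randomly drawn isolates receive different types. The example counts below are invented.

```python
# Simpson's index of diversity for a genotyping scheme:
# D = 1 - sum n_i (n_i - 1) / (N (N - 1)), where n_i is the number of
# isolates assigned to type i. Higher D means finer discrimination.
# The example type counts are invented.

def simpsons_diversity(type_counts):
    """Probability that two randomly drawn isolates have different types."""
    n = sum(type_counts)
    if n < 2:
        return 0.0
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# 10 isolates split into four genotypes vs. two coarser genotypes:
print(round(simpsons_diversity([4, 3, 2, 1]), 3))  # 0.778
print(round(simpsons_diversity([6, 4]), 3))        # 0.533
```

Combining markers that evolve at different rates, as in the SNP + binary + flaA HRM scheme, refines the partition and so pushes this index upward.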
Abstract:
In this thesis, the issue of incorporating uncertainty for environmental modelling informed by imagery is explored by considering uncertainty in deterministic modelling, measurement uncertainty and uncertainty in image composition. Incorporating uncertainty in deterministic modelling is extended for use with imagery using the Bayesian melding approach. In the application presented, slope steepness is shown to be the main contributor to total uncertainty in the Revised Universal Soil Loss Equation. A spatial sampling procedure is also proposed to assist in implementing Bayesian melding given the increased data size with models informed by imagery. Measurement error models are another approach to incorporating uncertainty when data is informed by imagery. These models for measurement uncertainty, considered in a Bayesian conditional independence framework, are applied to ecological data generated from imagery. The models are shown to be appropriate and useful in certain situations. Measurement uncertainty is also considered in the context of change detection when two images are not co-registered. An approach for detecting change in two successive images is proposed that is not affected by registration. The procedure uses the Kolmogorov-Smirnov test on homogeneous segments of an image to detect change, with the homogeneous segments determined using a Bayesian mixture model of pixel values. Using the mixture model to segment an image also allows for uncertainty in the composition of an image. This thesis concludes by comparing several different Bayesian image segmentation approaches that allow for uncertainty regarding the allocation of pixels to different ground components. Each segmentation approach is applied to a data set of chlorophyll values and shown to have different benefits and drawbacks depending on the aims of the analysis.
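The change-detection step described above reduces to a two-sample Kolmogorov-Smirnov comparison of pixel values within matching homogeneous segments of the two images. A pure-Python version of the KS statistic (the maximum gap between the two empirical CDFs) can be sketched as follows; the segmentation and the Bayesian mixture model are outside its scope, and the sample values are invented.

```python
# Two-sample Kolmogorov-Smirnov statistic over pixel-value samples from
# matching segments: D = max |F_a(x) - F_b(x)| over the pooled values.
# Segment extraction is assumed to have happened upstream.
import bisect

def ks_statistic(sample_a, sample_b):
    """Maximum absolute gap between the two empirical CDFs."""
    a = sorted(sample_a)
    b = sorted(sample_b)
    d = 0.0
    for x in a + b:
        # empirical CDFs evaluated just to the right of x
        fa = bisect.bisect_right(a, x) / len(a)
        fb = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(fa - fb))
    return d

# identical segments give 0.0; disjoint value ranges give the maximum, 1.0
print(ks_statistic([1, 2, 3], [1, 2, 3]))
print(ks_statistic([1, 2, 3], [10, 11, 12]))
```

Because the statistic depends only on the value distributions within each segment, not on pixel positions, the comparison is unaffected by mis-registration between the two images, which is the property the thesis exploits.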