932 results for HMM, Nosocomial Pathogens, Genotyping, Statistical Modelling, VRE
Abstract:
The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by implementing multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistics-based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach in which not only is the importance of each individual protein explicit, but the proteins are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
Abstract:
Airport systems are complex, and the passenger dynamics within them are equally complicated. Passenger behaviours outside the standard processes are regarded as more significant in terms of public hazard and service rate issues. In this paper, we devise an individual agent decision model to simulate stochastic passenger behaviour in an airport departure terminal. Bayesian networks are incorporated into the decision-making model to infer the probabilities that passengers choose to use particular in-airport facilities. We aim to understand the dynamics of passengers' discretionary activities.
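The paper does not publish its network structure or probability tables, so as a purely illustrative sketch, the toy Bayesian network below (two nodes, made-up numbers) shows the kind of inference involved: marginalising over a hidden variable to get a facility-use probability, and inverting it with Bayes' rule.

```python
# Hypothetical two-node network (structure and numbers are assumptions, not
# taken from the paper): discretionary time influences a retail visit.
p_time = {"long": 0.4, "short": 0.6}               # prior over available time
p_visit_given_time = {"long": 0.7, "short": 0.2}   # CPT: P(visit | time)

# Marginal probability of a retail visit: sum out the time variable.
p_visit = sum(p_time[t] * p_visit_given_time[t] for t in p_time)

# Bayesian inversion: P(time = "long" | the passenger was seen visiting).
p_long_given_visit = p_time["long"] * p_visit_given_time["long"] / p_visit
```

In an agent-based simulation, each simulated passenger would sample a decision from probabilities inferred this way, conditioned on that agent's own state.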
Abstract:
Fire safety of light gauge cold-formed steel frame (LSF) stud walls is significant in the design of buildings. In this research, finite element thermal models of both the traditional LSF wall panels with cavity insulation and the new LSF composite wall panels were developed to simulate their thermal behaviour under standard and real design fire conditions. Suitable thermal properties were proposed for plasterboards and insulations based on laboratory tests and a literature review. The developed models were then validated by comparing their results with available fire test results. This paper presents the details of the developed finite element models of load bearing LSF wall panels and the thermal analysis results. It shows that finite element models can be used to simulate the thermal behaviour of load bearing LSF walls with varying configurations of insulations and plasterboards. Failure times of load bearing LSF walls were also predicted based on the results from the finite element thermal analyses.
Abstract:
Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection and Integrated Pest Management (IPM), to determine grain quality and to satisfy importing nations' biosecurity concerns, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Primarily, research has focussed on developing statistical methodologies for in-storage sampling strategies concentrating on the detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to the two fields of research. This review considers the development of statistically based in-storage sampling and surveillance strategies and identifies methods that may be useful for both surveillance and in-storage sampling. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics, fault trees and more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
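A basic quantitative building block common to both in-storage sampling and surveillance design is the probability of detecting at least one infested unit given a prevalence and a sample size. The sketch below is a minimal illustration of that standard binomial-detection calculation (the prevalence and confidence figures are arbitrary examples, not values from the review):

```python
import math

def detection_probability(prevalence, n_samples):
    """Probability that at least one infested unit appears in n independent samples."""
    return 1.0 - (1.0 - prevalence) ** n_samples

def samples_required(prevalence, confidence=0.95):
    """Smallest n achieving at least the target detection probability."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# Example: 1% infestation rate, 95% confidence of detection.
n95 = samples_required(0.01)
p_detect = detection_probability(0.01, n95)
```

The same arithmetic underpins both contexts: in storage it sizes a grain-sampling plan, while in area surveillance it sizes the number of inspection sites needed to support a claim of pest absence at a given design prevalence.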
Abstract:
AIMS: As recent conflicting reports describe a genetic association between both the C- and the T-alleles of the dopamine D2 receptor (DRD2) C957T polymorphism (rs6277) and alcohol dependence, our aim was to examine this polymorphism and TaqIA (rs1800497) in Australian alcohol-dependent subjects. METHODS: The C957T polymorphism was genotyped in 228 patients with alcohol dependence (72 females and 156 males) and 228 healthy controls. RESULTS: The C-allele and C/C genotype of C957T were associated with alcohol dependence, whereas the TaqIA polymorphism was not. When C957T was analysed separately by sex, males showed an even stronger association with the C-allele while females showed no association. C957T and TaqIA haplotyping revealed a strong association with alcohol dependence, and a double-genotype analysis (combining C957T and TaqIA genotypes) revealed that the relative risk of different genotypes varied by up to 27-fold, with the TT/A1A2 genotype having an 8.5-fold lower risk of alcohol dependence than other genotypes. CONCLUSION: Decreased DRD2 binding associated with the C-allele of the DRD2 C957T polymorphism is likely to be important in the underlying pathophysiology of at least some forms of alcohol dependence, and this effect appears to be limited to males.
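The allele-association result above rests on a standard case-control odds-ratio calculation. The sketch below shows that calculation on hypothetical allele counts (illustrative numbers only, not the study's data), with Woolf's method for the 95% confidence interval:

```python
import math

# Hypothetical allele counts (illustrative only -- not the study's data):
# C versus T allele carriage in cases and controls.
cases_c, cases_t = 170, 58
controls_c, controls_t = 140, 88

# Odds ratio for the C-allele in cases relative to controls.
odds_ratio = (cases_c / cases_t) / (controls_c / controls_t)

# 95% confidence interval via the log-odds standard error (Woolf's method).
se = math.sqrt(1 / cases_c + 1 / cases_t + 1 / controls_c + 1 / controls_t)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)
```

An odds ratio above 1 with a confidence interval excluding 1 is the usual evidence threshold for an allele-disease association of this kind.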
Abstract:
Fire safety has become an important part of structural design due to the ever-increasing loss of property and lives during fires. Conventionally, the fire rating of load bearing wall systems made of Light gauge Steel Frames (LSF) is determined using fire tests based on the standard time-temperature curve given in ISO 834 (ISO, 1999). This curve originated from the application of wood-burning furnaces in the early 1900s. However, modern commercial and residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed fire research study into the performance of LSF walls was undertaken using real fire curves developed from the Eurocode parametric curves (ECS, 2002) and Barnett's BFD curves (Barnett, 2002), through both full scale fire tests and numerical studies. It included LSF walls without any insulation, and the recently developed externally insulated composite panel system. This paper presents the details of the numerical studies and the results. It also includes brief details of the development of real building fire curves and the experimental studies.
Abstract:
A model has been developed to track the flow of cane constituents through the milling process. While previous models have tracked the flow of fibre, brix and water through the process, this model tracks the soluble and insoluble solid cane components using modelling theory and experimental data, assisting in further understanding the flow of constituents into mixed juice and final bagasse. The work provided an opportunity to understand the factors which affect the distribution of the cane constituents between juice and bagasse. Application of the model should lead to improvements in the overall performance of the milling train.
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is a key factor for organizations and even governments. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
After ensuring clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis efforts in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach enables us to obtain highly informative estimates for change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared to a priori known causes, detected by control charts in monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in this quality control context may also extend to the industrial and business domains where quality monitoring was initially developed.
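The simplest of the change scenarios above, a step change in a Poisson rate, can be sketched with a deliberately stripped-down Bayesian estimator: a uniform prior over the change time and (for simplicity, unlike the thesis's hierarchical MCMC estimators) rates assumed known. The data below are illustrative made-up counts, not hospital data.

```python
import math

# Illustrative counts: rate ~2 before the change at t = 30, rate ~5 after.
counts = [2, 1, 3, 2, 1, 2, 3, 1, 2, 2] * 3 + [5, 4, 6, 5, 4, 5, 6, 4, 5, 6] * 3

def log_lik(tau, lam1=2.0, lam2=5.0):
    """Poisson log-likelihood with a step change from lam1 to lam2 at time tau."""
    ll = 0.0
    for t, y in enumerate(counts):
        lam = lam1 if t < tau else lam2
        ll += y * math.log(lam) - lam - math.lgamma(y + 1)
    return ll

# Posterior over tau under a uniform prior: normalise the likelihoods
# (subtracting the max log-likelihood first, for numerical stability).
logs = [log_lik(tau) for tau in range(1, len(counts))]
m = max(logs)
weights = [math.exp(l - m) for l in logs]
total = sum(weights)
posterior = [w / total for w in weights]
tau_map = 1 + posterior.index(max(posterior))  # maximum a posteriori change time
```

Because the output is a full posterior distribution over the change time rather than a point estimate, the root-cause search window can be reported with explicit probability content, which is the advantage the abstract claims for the Bayesian framework.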
Abstract:
This paper presents an approach to modelling the resilience of a generic (potable) water supply system. The system is contextualized as a meta-system consisting of three subsystems representing the natural catchment, the water treatment plant and the water distribution infrastructure for urban use. An abstract mathematical model of the meta-system is disaggregated progressively to form a cascade of equations forming a relational matrix of models. This allows the investigation of commonly implicit relationships between various operational components within the meta-system, an in-depth understanding of specific system components and influential factors, and the incorporation of explicit disturbances to explore system behaviour. Consequently, this will facilitate long-term decision making to achieve sustainable solutions for issues such as meeting a growing demand or managing supply-side influences in the meta-system under diverse water availability regimes. This approach is based on the hypothesis that the means to achieve resilient supply of water may be better managed by modelling the effects of changes at specific levels that have a direct, or in some cases indirect, impact on higher-order outcomes. Additionally, the proposed strategy allows the definition of approaches to combine disparate data sets to synthesise previously missing or incomplete higher-order information, a scientifically robust means to define and carry out meta-analyses using knowledge from diverse yet relatable disciplines relevant to different levels of the system, and an enhanced understanding of the dependencies and inter-dependencies of variable factors at various levels across the meta-system. The proposed concept introduces an approach for modelling a complex infrastructure system as a meta-system consisting of a combination of bio-ecological, technical and socio-technical subsystems.
Abstract:
Problems involving the solution of advection-diffusion-reaction equations on domains and subdomains whose growth affects and is affected by these equations, commonly arise in developmental biology. Here, a mathematical framework for these situations, together with methods for obtaining spatio-temporal solutions and steady states of models built from this framework, is presented. The framework and methods are applied to a recently published model of epidermal skin substitutes. Despite the use of Eulerian schemes, excellent agreement is obtained between the numerical spatio-temporal, numerical steady state, and analytical solutions of the model.
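The published model couples the equations to domain growth, which is beyond a short sketch, but the core building block, an Eulerian finite-difference scheme for a reaction-diffusion equation on a fixed domain, can be illustrated. The snippet below is a deliberately simplified fixed-domain sketch with arbitrary parameters (a Fisher-type logistic reaction term), not the paper's model.

```python
# Explicit Euler scheme for u_t = D u_xx + r u (1 - u) on a fixed 1D domain,
# with zero Dirichlet boundaries. Parameters are illustrative only.
D, r = 0.1, 1.0
nx, dx, dt, steps = 101, 0.1, 0.01, 800   # D*dt/dx^2 = 0.1, within stability limit

u = [0.0] * nx
u[nx // 2] = 1.0  # initial seed in the centre of the domain

for _ in range(steps):
    new = u[:]
    for i in range(1, nx - 1):
        lap = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / dx**2   # discrete Laplacian
        new[i] = u[i] + dt * (D * lap + r * u[i] * (1.0 - u[i]))
    u = new
```

A steady state can be approximated by iterating a scheme like this until successive updates fall below a tolerance; the paper's framework additionally maps the growing domain back to a fixed computational domain at each step.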
Abstract:
AR process modelling movie presented at Gartner BPM Summit in Sydney, August, 2011. Video showing us using the MS Surface at QUT to perform collaborative process modelling.
Abstract:
Prototyping is an established and accepted practice used by the design community. Prototypes play a valuable role during the design process and can greatly affect the designed outcome. The concept of a business model prototype, however, is not well understood by the design and business communities. Design industry trends indicate a move away from product and service innovation towards business model innovation. Therefore, it stands to reason that the role of prototypes and prototyping in this context should also be considered. This paper is conceptual and presents a process for creating and enabling business model prototypes. Specifically, the focus is on building emotional connections across the value chain to enable internal growth within firms. To do this, the authors have relied on personal observations and critical reflection from multiple industry engagements. The outcomes of this critical reflective practice are presented, and the opportunities and challenges of this approach are discussed. Future research opportunities are also detailed and presented within the context of the emotional business model.
Abstract:
A re-examination of design education at all levels is needed to ensure global economic competitiveness and social and environmental sustainability. This paper presents an emerging research agenda for modelling design-led innovation approaches from the business sector into the secondary education curriculum. To do this, a review of the literature is provided and current knowledge gaps surrounding design education are detailed. A regional secondary school design immersion program is outlined as a future research case study using action research. A framework and recommendations for developing and delivering pedagogical approaches for 21st century skill outcomes in secondary education are briefly introduced, and future research objectives are overviewed and discussed.