932 results for HMM, Nosocomial Pathogens, Genotyping, Statistical Modelling, VRE
Abstract:
Anisotropic damage distribution and evolution have a profound effect on borehole stress concentrations. Damage evolution is an irreversible process that is not adequately described within classical equilibrium thermodynamics. Therefore, we propose a constitutive model, based on non-equilibrium thermodynamics, that accounts for anisotropic damage distribution, anisotropic damage threshold and anisotropic damage evolution. We implemented this constitutive model numerically, using the finite element method, to calculate stress–strain curves and borehole stresses. The resulting stress–strain curves are distinctively different from linear elastic-brittle and linear elastic-ideal plastic constitutive models and realistically model experimental responses of brittle rocks. We show that the onset of damage evolution leads to an inhomogeneous redistribution of material properties and stresses along the borehole wall. The classical linear elastic-brittle approach to borehole stability analysis systematically overestimates the stress concentrations on the borehole wall, because dissipative strain-softening is underestimated. The proposed damage mechanics approach explicitly models dissipative behaviour and leads to non-conservative mud window estimations. Furthermore, anisotropic rocks with preferential planes of failure, like shales, can be addressed with our model.
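The softening response described above can be illustrated with a much simpler one-dimensional, isotropic damage law; the sketch below uses a generic exponential-softening rule with assumed parameter values, not the anisotropic non-equilibrium model proposed in the paper. Below the threshold strain the response is linear elastic; beyond it the damage variable grows and the stress softens, which is the qualitative behaviour that distinguishes such models from elastic-brittle and elastic-ideal plastic idealisations.

```python
# One-dimensional isotropic damage sketch with an exponential softening law;
# a toy illustration of strain-softening stress-strain behaviour, not the
# anisotropic, non-equilibrium constitutive model proposed in the paper.
import numpy as np

E = 40e9            # Young's modulus (Pa), assumed
eps0 = 1.0e-3       # damage-threshold strain, assumed
eps_f = 2.0e-3      # softening parameter controlling the post-peak slope, assumed

def damage(eps):
    """Damage variable D in [0, 1): zero below the threshold, exponential softening above it."""
    if eps <= eps0:
        return 0.0
    return 1.0 - (eps0 / eps) * np.exp(-(eps - eps0) / eps_f)

strain = np.linspace(0.0, 8e-3, 200)
stress = np.array([(1.0 - damage(e)) * E * e for e in strain])   # effective-stress concept

print(f"peak stress {stress.max()/1e6:.1f} MPa at strain {strain[stress.argmax()]:.4f}; "
      f"stress at 0.8% strain {stress[-1]/1e6:.1f} MPa (softened)")
```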
Abstract:
Vacuum circuit breaker (VCB) overvoltages and the catastrophic failures they cause during shunt reactor switching have been analyzed through computer simulations of multiple reignitions using statistical VCB models found in the literature. However, a systematic review (SR) of multiple reignitions with a statistical VCB model does not yet exist. Therefore, this paper analyzes and explores multiple reignitions with a statistical VCB model. Following the SR search, it examines the salient points, research gaps and limitations of the multiple reignition phenomenon to assist future investigations. Based on the SR results, seven issues and two approaches to enhance the current statistical VCB model are identified. These results will be useful as input for improving computer modeling accuracy as well as for developing a reignition switch model with point-on-wave controlled switching for condition monitoring.
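For orientation, the core of most statistical VCB reignition models is a dielectric race between the recovering gap withstand and the transient recovery voltage (TRV). The Monte Carlo sketch below illustrates that idea only; the waveform shapes, probability distributions and parameter values are assumptions, not any of the models reviewed in the paper.

```python
# Minimal Monte Carlo sketch of the dielectric-race idea behind statistical VCB
# reignition models: a (first) reignition is counted when the TRV exceeds the
# recovering gap withstand. All waveforms, distributions and parameter values
# are illustrative assumptions, not a validated model.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2e-3, 4000)            # 2 ms window after the current zero

def trial():
    t_arc = rng.uniform(0.0, 5e-3)               # arcing time: contact separation to current zero
    rrds = max(rng.normal(20e6, 5e6), 0.0)       # rate of rise of dielectric strength (V/s), assumed
    withstand = rrds * (t + t_arc)               # gap has been opening since contact separation
    peak, f, tau = 40e3, 5e3, 1e-3               # TRV idealised as a damped (1 - cos) oscillation
    trv = 0.5 * peak * (1.0 - np.cos(2 * np.pi * f * t)) * np.exp(-t / tau)
    return np.any(np.abs(trv) > withstand)

n_trials = 10_000
p = sum(trial() for _ in range(n_trials)) / n_trials
print(f"estimated first-reignition probability ≈ {p:.2f}")
```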
Abstract:
Parametric and generative modelling methods are ways of making computer models more flexible and of formalising domain-specific knowledge. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the basis for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then proceeds to describe an open standard through which the interchange of components could be implemented. As an illustrative example of generative design, Frazer’s ‘Reptiles’ project from 1968 is reinterpreted.
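As a minimal illustration of the parametric idea (derived quantities expressed as functions of named parameters, so a parameter change propagates automatically), the following sketch is generic and uses its own invented names; it is not the IFC-based interchange representation discussed in the paper.

```python
# Generic sketch of a parametric component: geometry expressed as a function of
# named parameters, so changing a parameter regenerates the dependent values.
# Names and the 1/20 span-to-depth rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ParametricBeam:
    span: float                   # driving parameter
    depth_ratio: float = 1 / 20   # assumed rule of thumb

    @property
    def depth(self) -> float:
        return self.span * self.depth_ratio   # derived (dependent) dimension

beam = ParametricBeam(span=6.0)
print(beam.depth)     # 0.3
beam.span = 9.0       # change a parameter; the derived dimension follows
print(beam.depth)     # 0.45
```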
Abstract:
Intra-host sequence data from RNA viruses have revealed the ubiquity of defective viruses in natural viral populations, sometimes at surprisingly high frequency. Although defective viruses have long been known to laboratory virologists, their relevance in clinical and epidemiological settings has not been established. The discovery of long-term transmission of a defective lineage of dengue virus type 1 (DENV-1) in Myanmar, first seen in 2001, raised important questions about the emergence of transmissible defective viruses and their role in viral epidemiology. By combining phylogenetic analyses and dynamical modelling, we investigate how evolutionary and ecological processes at the intra-host and inter-host scales shaped the emergence and spread of the defective DENV-1 lineage. We show that this lineage of defective viruses emerged between June 1998 and February 2001, and that the defective virus was transmitted primarily through co-transmission with the functional virus to uninfected individuals. We provide evidence that, surprisingly, this co-transmission route has a higher transmission potential than transmission of functional dengue viruses alone. Consequently, we predict that the defective lineage should increase overall incidence of dengue infection, which could account for the historically high dengue incidence reported in Myanmar in 2001-2002. Our results show the unappreciated potential for defective viruses to impact the epidemiology of human pathogens, possibly by modifying the virulence-transmissibility trade-off, or to emerge as circulating infections in their own right. They also demonstrate that interactions between viral variants, such as complementation, can open new pathways to viral emergence.
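A toy compartmental sketch can make the co-transmission argument concrete: two infection routes with separate transmission rates, where the co-transmission route is assigned the higher rate. The structure and all parameter values below are illustrative assumptions, not the fitted dynamical model of the study.

```python
# Toy two-route SIR sketch contrasting transmission of functional-only dengue with
# co-transmission of functional + defective virus. Parameter values are assumed.
from scipy.integrate import solve_ivp

gamma = 1 / 7.0      # recovery rate (1/days), assumed
beta_f = 0.20        # transmission rate, functional virus alone (assumed)
beta_c = 0.25        # transmission rate, co-transmission route (assumed higher)

def rhs(t, y):
    s, i_f, i_c = y
    new_f = beta_f * s * i_f
    new_c = beta_c * s * i_c          # defective + functional virus transmitted as a unit
    return [-(new_f + new_c), new_f - gamma * i_f, new_c - gamma * i_c]

y0 = [0.999, 5e-4, 5e-4]
sol = solve_ivp(rhs, (0, 365), y0)
print("basic reproduction numbers: functional only R0 =", round(beta_f / gamma, 2),
      ", co-transmission R0 =", round(beta_c / gamma, 2))
print("final susceptible fraction ≈", round(sol.y[0, -1], 3))
```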
Abstract:
AR process modelling movie presented at the Gartner BPM Summit in Sydney, August 2011.
Abstract:
Video presented as part of the BPM2011 demonstration (France). In this video we show a prototype BPMN process modelling tool that uses Augmented Reality techniques to increase the sense of immersion when editing a process model. The avatar represents a remotely logged-in user and gives greater insight into the editing actions of the collaborator than current 2D web-based approaches to collaborative process modelling provide. We modified the Second Life client to integrate the ARToolkit in order to support pattern-based AR.
Abstract:
Hosted on YouTube and shown in various locations. A video showing members of the QUT BPM research group using a Mimio pen-based tabletop system for collaborative process modelling.
Abstract:
Controlled drug delivery is a key topic in modern pharmacotherapy, where controlled drug delivery devices are required to prolong the period of release, maintain a constant release rate, or release the drug with a predetermined release profile. In the pharmaceutical industry, the development of a controlled drug delivery device may be facilitated enormously by mathematical modelling of drug release mechanisms, directly decreasing the number of necessary experiments. Such mathematical modelling is difficult because several mechanisms are involved during the drug release process. The main drug release mechanisms of a controlled release device depend on the device’s physicochemical properties, and include diffusion, swelling and erosion. In this thesis, four controlled drug delivery models are investigated. These four models selectively involve the solvent penetration into the polymeric device, the swelling of the polymer, the polymer erosion and the drug diffusion out of the device, but all share two common key features. The first is that the solvent penetration into the polymer causes the transition of the polymer from a glassy state into a rubbery state. The interface between the two states of the polymer is modelled as a moving boundary, and the speed of this interface is governed by a kinetic law. The second feature is that drug diffusion only happens in the rubbery region of the polymer, with a nonlinear diffusion coefficient that depends on the concentration of solvent. These models are analysed using both formal asymptotics and numerical computation, where front-fixing methods and the method of lines with finite difference approximations are used to solve the models numerically. This numerical scheme is conservative, accurate and easily applied to moving boundary problems, and is thoroughly explained in Section 3.2. The small time asymptotic analysis in Sections 5.3.1, 6.3.1 and 7.2.1 shows that these models exhibit the non-Fickian behaviour referred to as Case II diffusion, together with an initial constant rate of drug release, which is appealing to the pharmaceutical industry because it indicates zero-order release. The numerical results of the models qualitatively confirm the experimental behaviour identified in the literature. The knowledge obtained from investigating these models can help to develop more complex multi-layered drug delivery devices in order to achieve sophisticated drug release profiles. A multi-layer matrix tablet, which consists of a number of polymer layers designed to provide sustained and constant drug release or bimodal drug release, is also discussed in this research. The moving boundary problem describing solvent penetration into the polymer also arises in melting and freezing problems, which have been modelled as the classical one-phase Stefan problem. The classical one-phase Stefan problem has unrealistic singularities at the complete melting time. Hence we investigate the effect of including kinetic undercooling in the melting problem; this problem is called the one-phase Stefan problem with kinetic undercooling. Interestingly, we discover that the unrealistic singularities of the classical one-phase Stefan problem at the complete melting time are regularised, and the small time asymptotic analysis in Section 3.3 shows that the small time behaviour of the one-phase Stefan problem with kinetic undercooling differs from that of the classical one-phase Stefan problem.
In the case of melting very small particles, it is known that surface tension effects are important. The effect of including surface tension in the melting problem for nanoparticles (without kinetic undercooling) has been investigated in the past; however, the one-phase Stefan problem with surface tension exhibits finite-time blow-up. We therefore investigate the effect of including both surface tension and kinetic undercooling in the melting problem for nanoparticles, and find that the solution continues to exist until complete melting. The investigation of including kinetic undercooling and surface tension in the melting problems reveals more insight into the regularisation of unphysical singularities in the classical one-phase Stefan problem. This investigation gives a better understanding of the melting of a particle, and contributes to the current body of knowledge related to melting and freezing due to heat conduction.
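A minimal sketch of the generic numerical approach described above (a front-fixing transformation combined with the method of lines and finite differences) is given below for a classical one-phase Stefan problem; the scaling, boundary data and parameter values are assumptions, not the drug delivery or nanoparticle formulations of the thesis.

```python
# Front-fixing + method-of-lines sketch for a dimensionless one-phase Stefan problem:
# u_t = u_xx on 0 < x < s(t), u(0,t) = 1, u(s,t) = 0, ds/dt = -u_x(s,t).
# The Landau transform xi = x/s(t) maps the melted region onto the fixed interval [0, 1].
import numpy as np
from scipy.integrate import solve_ivp

N = 50                        # interior grid points in xi
xi = np.linspace(0.0, 1.0, N + 2)
dxi = xi[1] - xi[0]

def rhs(t, y):
    """y = [U_1 ... U_N, s]: scaled temperature at interior nodes plus front position."""
    U = np.empty(N + 2)
    U[1:-1] = y[:-1]
    U[0] = 1.0                # fixed temperature at x = 0
    U[-1] = 0.0               # melting temperature at the moving front x = s(t)
    s = y[-1]
    # Stefan condition: ds/dt = -u_x(s,t) = -U_xi(1,t)/s (one-sided difference)
    dsdt = -(U[-1] - U[-2]) / (dxi * s)
    # Transformed PDE: U_t = U_xixi / s^2 + xi * (ds/dt / s) * U_xi
    Uxx = (U[2:] - 2 * U[1:-1] + U[:-2]) / dxi**2
    Ux = (U[2:] - U[:-2]) / (2 * dxi)
    dUdt = Uxx / s**2 + xi[1:-1] * (dsdt / s) * Ux
    return np.append(dUdt, dsdt)

s0 = 0.01                                  # small initial front position avoids the t = 0 singularity
y0 = np.append(1.0 - xi[1:-1], s0)         # linear initial profile (assumed)
sol = solve_ivp(rhs, (0.0, 1.0), y0, method="BDF", rtol=1e-6)
print("front position s(t=1) ≈", round(sol.y[-1, -1], 3))
```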
Abstract:
Topic modeling has been widely utilized in the fields of information retrieval, text mining and text classification. Most existing statistical topic modeling methods, such as LDA and pLSA, generate a term-based representation of a topic by selecting single words from the multinomial word distribution over that topic. There are two main shortcomings: firstly, popular or common words occur very often across different topics, which makes the topics ambiguous to interpret; secondly, single words lack the coherent semantic meaning needed to accurately represent topics. In order to overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed method outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
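For context, the sketch below shows the baseline term-based topic representation that the paper argues is ambiguous: the top single words drawn from each topic's word distribution under LDA. The corpus and parameters are illustrative assumptions, and the proposed two-stage pattern-based method is not reproduced here.

```python
# Baseline single-word topic representation with LDA (the representation the
# paper improves upon); the toy corpus and parameters are assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "pattern mining finds frequent itemsets in transaction data",
    "topic models assign word distributions to latent topics",
    "text classification uses labelled documents to train a classifier",
    "frequent pattern mining supports discriminative topic representations",
]
vec = CountVectorizer(stop_words="english").fit(docs)
dtm = vec.transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):        # word distribution per topic
    top = topic.argsort()[::-1][:5]                # top-5 single words
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```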
Abstract:
Stigmergy is a biological term originally used when discussing insect or swarm behaviour, and describes a model supporting environment-based communication that separates artefacts from agents. This phenomenon is demonstrated in the behaviour of ants and their food foraging supported by pheromone trails, or similarly termites and their nest-building process. What is interesting about this mechanism is that highly organised societies are formed without an apparent central management function. We see design features in web sites that mimic stigmergic mechanisms as part of the user interface, and we have created generalisations of these patterns. Software development and web site development techniques have evolved significantly over the past 20 years. Recent progress in this area proposes languages for modelling web applications that accommodate the nuances specific to these developments. These modelling languages provide a suitable framework for building reusable components encapsulating our design patterns of stigmergy. We hypothesise that incorporating stigmergy as a separate feature of a site’s primary function will ultimately lead to enhanced user coordination.
Abstract:
Light Gauge Steel Framing (LSF) walls are made of cold-formed, thin-walled steel lipped channel studs with plasterboard linings on both sides. However, these thin-walled steel sections heat up quickly and lose their strength under fire conditions despite the protection provided by the plasterboards. A new composite wall panel was recently proposed to improve the fire resistance rating of LSF walls, in which an insulation layer is placed externally between the plasterboards on both sides of the wall frame instead of in the cavity. A research study using both fire tests and numerical studies was undertaken to investigate the structural and thermal behaviour of load-bearing LSF walls made of both the conventional and the new composite panels under standard fire conditions, and to determine their fire resistance rating. This paper presents the details of finite element models of LSF wall studs developed to simulate the structural performance of LSF wall panels under standard fire conditions. Finite element analyses were conducted under both steady state and transient state conditions using the time-temperature profiles measured during the fire tests. The developed models were validated using the fire test results of 11 LSF wall panels with various plasterboard/insulation configurations and load ratios, and were able to predict the fire resistance rating to within five minutes. The use of accurate numerical models allowed the inclusion of various complex structural and thermal effects, such as local buckling, thermal bowing and neutral axis shift, that occur in thin-walled steel studs under non-uniform elevated temperature conditions. The finite element analyses also demonstrated the improvements offered by the new composite panel system over the conventional cavity-insulated system.
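A highly simplified illustration of the transient-state idea is sketched below: given a measured stud temperature history, failure is assumed when the temperature-reduced yield strength falls below the applied load ratio. The temperature history is invented and the reduction factors follow EN 1993-1-2-style values; this is only a capacity-check sketch, not the finite element models developed in the study.

```python
# Transient-state capacity-check sketch (not the thesis's FE models): the stud is
# deemed to fail when its temperature-reduced yield strength drops below the load ratio.
import numpy as np

# Yield-strength reduction factors vs temperature (°C), after EN 1993-1-2 Table 3.1
T_pts = np.array([20, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200])
k_y   = np.array([1.0, 1.0, 0.78, 0.47, 0.23, 0.11, 0.06, 0.04, 0.02, 0.0])

# Assumed hot-flange time-temperature history (minutes, °C) for one wall configuration
time_min = np.array([0, 15, 30, 45, 60, 75, 90, 105, 120])
temp_C   = np.array([20, 80, 150, 250, 350, 450, 540, 620, 700])

load_ratio = 0.4                               # applied load / ambient capacity (assumed)
capacity = np.interp(temp_C, T_pts, k_y)       # reduced capacity over time
failed = capacity < load_ratio
frr = time_min[failed][0] if failed.any() else None
print(f"predicted failure time ≈ {frr} min" if frr is not None
      else "no failure within the heating period")
```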
Abstract:
LiFePO4 is a commercially available battery material with good theoretical discharge capacity, excellent cycle life and increased safety compared with competing Li-ion chemistries. It has been the focus of considerable experimental and theoretical scrutiny in the past decade, resulting in LiFePO4 cathodes that perform well at high discharge rates. This scrutiny has raised several questions about the behaviour of LiFePO4 material during charge and discharge. In contrast to many other battery chemistries that intercalate homogeneously, LiFePO4 can phase-separate into lithium-rich and lithium-poor phases, with intercalation proceeding by the advance of an interface between these two phases. The main objective of this thesis is to construct mathematical models of LiFePO4 cathodes that can be validated against experimental discharge curves, in an attempt to understand some of the multi-scale dynamics of LiFePO4 cathodes that are difficult to determine experimentally. The first section of this thesis constructs a three-scale mathematical model of LiFePO4 cathodes that uses a simple Stefan problem (which has been used previously in the literature) to describe the assumed phase change. LiFePO4 crystals have been observed agglomerating in cathodes to form a porous collection of crystals, and this morphology motivates the use of three size scales in the model. The multi-scale model validates well against experimental data, and this validated model is then used to examine the role of manufacturing parameters (including the agglomerate radius) on battery performance. The remainder of the thesis investigates phase-field models as a replacement for the aforementioned Stefan problem. Phase-field models have recently been applied to LiFePO4 and are a far more accurate representation of experimentally observed crystal-scale behaviour. They are based around the Cahn-Hilliard-reaction (CHR) IBVP, a fourth-order PDE with electrochemical (flux) boundary conditions that is very stiff and possesses multiple time and space scales. Numerical solutions to the CHR IBVP can be difficult to compute, and hence a least-squares based Finite Volume Method (FVM) is developed for discretising both the full CHR IBVP and the more traditional Cahn-Hilliard IBVP. Phase-field models are subject to two main physicality constraints, and the numerical scheme presented performs well under these constraints. This least-squares based FVM is then used to simulate the discharge of individual crystals of LiFePO4 in two dimensions. This discharge is subject to isotropic Li+ diffusion, based on experimental evidence suggesting that the normally orthotropic transport of Li+ in LiFePO4 may become more isotropic in the presence of lattice defects. Numerical investigation shows that two-dimensional Li+ transport results in crystals that phase-separate, even at very high discharge rates. This is very different from results in the literature, where phase separation in LiFePO4 crystals is suppressed during discharge with orthotropic Li+ transport. Finally, the three-scale cathodic model used at the beginning of the thesis is modified to simulate modern, high-rate LiFePO4 cathodes. High-rate cathodes typically do not contain (large) agglomerates, and therefore a two-scale model is developed. The Stefan problem used previously is also replaced with the phase-field models examined in earlier chapters.
The results from this model are then compared with experimental data and fit poorly, although a significant parameter regime could not be investigated numerically. Many-particle effects, however, are evident in the simulated discharges, which matches the conclusions of recent literature. These effects result in crystals that are subject to local currents very different from the discharge rate applied to the cathode, which affects the phase-separating behaviour of the crystals and raises questions about the validity of using cathode-scale experimental measurements to determine crystal-scale behaviour.
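The phase-separation behaviour referred to above can be illustrated with a toy one-dimensional Cahn-Hilliard computation; the cell-centred finite-volume discretisation, free energy and parameter values below are generic assumptions, not the thesis's least-squares FVM or the CHR flux boundary conditions.

```python
# 1D Cahn-Hilliard sketch with a cell-centred finite-volume discretisation and
# explicit time stepping; a toy illustration of phase separation only.
import numpy as np

N, L = 100, 1.0
dx = L / N
kappa, M = 5e-4, 1.0                  # gradient-energy and mobility coefficients (assumed)
dt, steps = 1e-6, 50_000              # explicit scheme needs a small time step

rng = np.random.default_rng(1)
c = 0.5 + 0.01 * rng.standard_normal(N)   # near-uniform lithium fraction plus noise

def laplacian(u):
    # Zero-gradient (no-flux) ghost cells at both ends keep the update conservative
    up = np.concatenate(([u[0]], u, [u[-1]]))
    return (up[2:] - 2 * up[1:-1] + up[:-2]) / dx**2

for _ in range(steps):
    mu = 2 * c * (1 - c) * (1 - 2 * c) - kappa * laplacian(c)   # chemical potential, f = c^2(1-c)^2
    c += dt * M * laplacian(mu)                                 # conservative Cahn-Hilliard update

print("fraction of cells in lithium-poor / lithium-rich phase:",
      round((c < 0.5).mean(), 2), "/", round((c > 0.5).mean(), 2))
```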
Abstract:
This thesis reports on an investigation to develop an advanced and comprehensive milling process model of the raw sugar factory. Although the new model can be applied to both four-roller and six-roller milling units, it is primarily developed for the six-roller mills that are widely used in the Australian sugar industry. The approach taken was to gain an understanding of the previous milling process simulation model, MILSIM, developed at the University of Queensland nearly four decades ago. Although the MILSIM model was widely adopted in the Australian sugar industry for simulating the milling process, it did have some incorrect assumptions. The study aimed to eliminate all the incorrect assumptions of the previous model and to develop an advanced model that represents the milling process correctly and tracks the flow of other cane components which have not been considered in previous models. The development of the milling process model was done in three stages. Firstly, an enhanced milling unit extraction model (MILEX) was developed to assess the mill performance parameters and predict the extraction performance of the milling process. New definitions for the milling performance parameters were developed, and a complete milling train along with the juice screen was modelled. The MILEX model was validated with factory data, and the variation in the mill performance parameters was observed and studied. Some case studies were undertaken to study the effect of fibre in juice streams, juice in cush return and imbibition % fibre on the extraction performance of the milling process. It was concluded from the study that the empirical relations developed for the mill performance parameters in the MILSIM model were not applicable to the new model; new empirical relations have to be developed before the model can be applied with confidence. Secondly, a soluble and insoluble solids model was developed using modelling theory and experimental data to track the flow of sucrose (pol), reducing sugars (glucose and fructose), soluble ash, true fibre and mud solids entering the milling train through the cane supply, and their distribution in juice and bagasse streams. The soluble impurities and mud solids in cane affect the performance of the milling train and the further processing of juice and bagasse. New mill performance parameters were developed in the model to track the flow of cane components. The developed model is the first of its kind and provides additional insight into the flow of soluble and insoluble cane components and the factors affecting their distribution in juice and bagasse. The model proved to be a good extension to the MILEX model for studying the overall performance of the milling train. Thirdly, the developed models were incorporated in the proprietary software package SysCAD for advanced operational efficiency and for availability in the 'whole of factory' model. The MILEX model was developed in SysCAD to represent a single milling unit; eventually the entire milling train and the juice screen were developed in SysCAD using a series of controllers and features of the software. The models developed in SysCAD can be run from a macro-enabled Excel file, and reports can be generated in Excel sheets. The flexibility of the software, its ease of use and other advantages are described broadly in the relevant chapter. The MILEX model is developed in both static and dynamic modes; the application of the dynamic mode is still in progress.
Abstract:
The invited presentation was delivered at the Queensland Department of Main Roads, Brisbane, Australia, on 17 June 2013.
Abstract:
Vehicle speed is an important attribute of the utility of a transport mode, and the speed relationship between multiple modes of transport is of interest to traffic planners and operators. This paper quantifies the relationship between bus speed and average car speed by integrating Bluetooth data and Transit Signal Priority data from the urban network in Brisbane, Australia. The method proposed in this paper is the first of its kind to relate bus speed and average car speed by integrating multi-source traffic data in a corridor-based method. Three transferable regression models are proposed, relating average car speed to not-in-service buses, in-service buses during peak periods, and in-service buses during off-peak periods. The models are cross-validated and the interrelationships are significant.
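A toy version of such a corridor-level regression with cross-validation might look like the sketch below; the synthetic data, single predictor and model form are assumptions, not the paper's Bluetooth/TSP dataset or model specification.

```python
# Toy corridor-level linear regression relating bus speed to average car speed,
# with 5-fold cross-validation; the data are synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
bus_speed = rng.uniform(15, 40, 200)                        # km/h, synthetic observations
car_speed = 5.0 + 1.2 * bus_speed + rng.normal(0, 3, 200)   # assumed linear relationship + noise

X = bus_speed.reshape(-1, 1)
model = LinearRegression().fit(X, car_speed)
r2 = cross_val_score(model, X, car_speed, cv=5, scoring="r2")
print(f"slope = {model.coef_[0]:.2f}, intercept = {model.intercept_:.2f}, "
      f"cross-validated R^2 = {r2.mean():.2f} ± {r2.std():.2f}")
```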