972 results for Bayesian Modelling
Abstract:
Unsaturated water flow in soil is commonly modelled using Richards' equation, which requires the hydraulic properties of the soil (e.g., porosity and hydraulic conductivity) to be characterised. Naturally occurring soils, however, are heterogeneous: they are composed of a number of interwoven homogeneous soils, each with its own set of hydraulic properties. When the length scale of these soil heterogeneities is small, numerical solution of Richards' equation is computationally impractical due to the immense effort and refinement required to mesh the actual heterogeneous geometry. A classic way forward is to use a macroscopic model, where the heterogeneous medium is replaced with a fictitious homogeneous medium that attempts to reproduce the average flow behaviour at the macroscopic scale (i.e., at a scale much larger than that of the heterogeneities). Using homogenisation theory, a macroscopic equation can be derived that takes the form of Richards' equation with effective parameters. A disadvantage of the macroscopic approach, however, is that it fails when the assumption of local equilibrium does not hold. This limitation has seen the introduction of two-scale models that include, at each point in the macroscopic domain, an additional flow equation at the scale of the heterogeneities (the microscopic scale). This report outlines a well-known two-scale model and contributes to the literature a number of important advances in its numerical implementation. These include the use of an unstructured control volume finite element method and image-based meshing techniques, which allow irregular micro-scale geometries to be treated, and the use of an exponential time integration scheme that permits both scales to be resolved simultaneously in a completely coupled manner.
Numerical comparisons against a classical macroscopic model confirm that only the two-scale model correctly captures the important features of the flow for a range of parameter values.
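For reference, the mixed (water-content/pressure-head) form of Richards' equation underlying both models can be written as follows; this is the standard textbook form, not a formula reproduced from the report itself:

```latex
\frac{\partial \theta(\psi)}{\partial t}
  = \nabla \cdot \big[\, K(\psi)\, \nabla(\psi + z) \,\big]
```

where \(\theta\) is the volumetric water content, \(\psi\) the pressure head, \(K(\psi)\) the unsaturated hydraulic conductivity and \(z\) the vertical coordinate. In the homogenised macroscopic model described above, \(\theta(\psi)\) and \(K(\psi)\) are replaced by effective relations for the fictitious homogeneous medium.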
Abstract:
This paper presents a robust place recognition algorithm for mobile robots that can be used for planning and navigation tasks. The proposed framework combines nonlinear dimensionality reduction, nonlinear regression under noise, and Bayesian learning to create consistent probabilistic representations of places from images. These generative models are incrementally learnt from very small training sets and used for multi-class place recognition. Recognition can be performed in near real time and accounts for complexities such as changes in illumination, occlusions, blurring and moving objects. The algorithm was tested with a mobile robot in indoor and outdoor environments, with sequences of 1579 and 3820 images, respectively. The framework has several potential applications, such as map building, autonomous navigation, search-and-rescue tasks and context recognition.
Abstract:
Cold-formed steel members are increasingly used as primary structural elements in buildings due to the availability of thin, high-strength steels and advanced cold-forming technologies. Cold-formed lipped channel beams (LCBs) are commonly used as flexural members such as floor joists and bearers. Many research studies have evaluated the behaviour and design of LCBs subject to pure bending, but limited research has been undertaken on their shear behaviour and strength. Hence a numerical study was undertaken to investigate the shear behaviour and strength of LCBs. Finite element models of simply supported LCBs with aspect ratios of 1.0 and 1.5 under a mid-span load were developed. These models were validated by comparing their results with test results and then used in a detailed parametric study of the shear buckling and post-buckling behaviour of LCBs. Experimental and numerical results showed that the current design rules in cold-formed steel design codes are very conservative for the shear design of LCBs. Improved design equations were therefore proposed for the shear strength of LCBs. This paper presents the details of this numerical study and its results.
Abstract:
Sustainability is a key driver for decisions in the management and future development of organisations and industries. However, quantifying and comparing sustainability across the triple bottom line (TBL) of economic, environmental and social impact has been problematic. There is a need for a tool which can measure the complex interactions within and between the environmental, economic and social systems which affect the sustainability of an industry in a transparent, consistent and comparable way. The authors acknowledge that there are currently numerous ways in which sustainability is measured and multiple methodologies by which these measurement tools were designed. The purpose of this book is to showcase how Bayesian network modelling can be used to identify and measure environmental, economic and social sustainability variables and to understand their impact on and interaction with each other. This book introduces the Sustainability Scorecard and describes it through a case study on the sustainability of the Australian dairy industry. This study was conducted in collaboration with the Australian dairy industry.
Abstract:
Chronic leg ulcers are costly to manage for health service providers. Although evidence-based care leads to improved healing rates and reduced costs, a significant evidence-practice gap is known to exist. Lack of access to specialist skills in wound care is one reason suggested for this gap. The aim of this study was to model the change to total costs and health outcomes under two modes of health service delivery for patients with leg ulcers: routine health services for community-living patients, and care provided by specialist wound clinics. Mean weekly treatment and health service costs were estimated from participants' data (n=70) for the twelve months prior to their entry to a study specialist wound clinic, and prospectively for 24 weeks after entry. For the retrospective phase, mean weekly costs of care were AU$130.30 (SD $12.64); these fell to AU$53.32 (SD $6.47) for the prospective phase. Analysis at a population level suggests that if 10,000 individuals receive 12 weeks of specialist evidence-based care, the cost savings are likely to be AU$9,238,800. Significant savings could be made by the adoption of evidence-based care such as that provided by the community and outpatient specialist wound clinics in this study.
Abstract:
Presentation by Dr Caroline Grant, Science & Engineering Faculty, IHBI, at Managing your research data seminar, 2012
Abstract:
In this paper, we explore how BIM functionalities, together with novel management concepts and methods, have been utilized in thirteen hospital projects in the United States and the United Kingdom. Secondary data collection and analysis were used as the method. Initial findings indicate that the utilization of BIM enables a holistic view of project delivery and helps to integrate project parties into a collaborative process. The initiative to implement BIM must come from the top down to enable early involvement of all key stakeholders. It appears that resistance from people to adapting to a new way of working and thinking, rather than the immaturity of the technology, is what hinders the utilization of BIM.
Abstract:
This paper investigates the mutual relations of three current drivers of construction: lean construction, building information modelling and sustainability. These drivers arose from rarely occurring, and only incidentally simultaneous, changes in their respective domains. It is contended that the drivers are mutually supportive and thus synergistic. They are aligned in the sense that all require, promote or enable collaboration. It is argued that these three drivers should be implemented in a unified manner for rapid and robust improvements in construction industry performance and in the quality of the constructed facilities and their benefits for stakeholders and wider society.
Abstract:
Building with Building Information Modelling (BIM) changes design and production processes. But can BIM be used to support process changes designed according to lean production and lean construction principles? To begin to answer this question we provide a conceptual analysis of the interaction of lean construction and BIM for improving construction. This was investigated by compiling a detailed listing of lean construction principles and BIM functionalities which are relevant from this perspective. These were drawn from a detailed literature survey. A research framework for analysis of the interaction between lean and BIM was then compiled. The goal of the framework is to both guide and stimulate research; as such, the approach adopted up to this point is constructive. Ongoing research has identified 55 such interactions, the majority of which show positive synergy between the two.
Abstract:
This paper is a modified version of a lecture describing the synthesis, structure and reactivity of some neutral molecules of stellar significance. The neutrals are formed in the collision cell of a mass spectrometer following vertical Franck-Condon one-electron oxidation of anions of known bond connectivity. Neutrals are characterised by conversion to positive ions and by extensive theoretical studies at the CCSD(T)/aug-cc-pVDZ//B3LYP/6-31G(d) level of theory. Four systems are considered in detail, viz. (i) the formation of linear C4 and its conversion to rhombic C4, (ii) linear C5 and the atom scrambling of this system when energised, (iii) the stable cumulene oxide CCCCCO, and (iv) the elusive species O2C-CO. This paper is not intended to be a review of interstellar chemistry: examples are selected from our own work in this area. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, developing simulation-based optimal design methods that search over both continuous and discrete design spaces. Although Bayesian inference is commonly performed on nonlinear mixed effects models, there has been little research into Bayesian optimal design for such models when searches must be performed over several design variables. This is likely because optimal experimental design for nonlinear mixed effects models is far more computationally intensive than inference in the Bayesian framework. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times, for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be precisely estimated subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, were found to differ substantially between the examples considered in this work, which highlights that such designs are problem-dependent and require optimisation using the methods presented in this paper.
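To make simulation-based design concrete, the sketch below performs a random search over candidate sampling times using a pseudo-Bayesian D-optimality criterion: the log-determinant of the Fisher information, averaged over draws from a prior. The one-compartment model, prior distributions and search ranges here are invented for illustration; they are not the horse pharmacokinetic model or priors of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_logdet(times, theta):
    """log|F| for a one-compartment model c(t) = (1/V) * exp(-k t)
    (unit dose, additive unit-variance noise), theta = (V, k)."""
    V, k = theta
    t = np.asarray(times, float)
    c = np.exp(-k * t) / V
    # Sensitivities dc/dV and dc/dk form the Jacobian
    J = np.column_stack([-c / V, -t * c])
    F = J.T @ J
    sign, logdet = np.linalg.slogdet(F)
    return logdet if sign > 0 else -np.inf

def expected_utility(times, n_prior=200):
    """Pseudo-Bayesian D-optimality: mean log|F| over prior draws."""
    thetas = np.column_stack([
        rng.lognormal(np.log(10.0), 0.2, n_prior),  # volume V
        rng.lognormal(np.log(0.5), 0.3, n_prior),   # elimination rate k
    ])
    return np.mean([fisher_logdet(times, th) for th in thetas])

# Random search over candidate schedules of 2 samples in [0.5, 12] hours
best_times, best_u = None, -np.inf
for _ in range(300):
    cand = np.sort(rng.uniform(0.5, 12.0, size=2))
    u = expected_utility(cand)
    if u > best_u:
        best_times, best_u = cand, u

print("best sampling times (h):", np.round(best_times, 2))
```

In the setting of the paper, the search would additionally range over the (discrete) number of subjects and samples per subject, with the utility penalised by sampling costs.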
Abstract:
Passenger experience has become a major factor influencing the success of an airport. In this context, passenger flow simulation has been used in designing and managing airports. However, most passenger flow simulations fail to consider group dynamics when developing passenger flow models. In this paper, an agent-based model is presented to simulate passenger behaviour during the airport check-in and evacuation processes. The simulation results show that passenger behaviour can significantly influence the performance and utilisation of services in airport terminals. The model was created using AnyLogic software, and its parameters were initialised using recent research data published in the literature.
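A minimal sketch of the underlying idea, assuming invented arrival, group-size and service parameters (this is not the AnyLogic model of the paper): passengers arrive in groups, join a common queue, and are served by a fixed number of check-in desks, with the mean waiting time reported.

```python
import random

def simulate_checkin(n_desks=3, horizon=480, mean_group=2.0,
                     arrival_rate=0.5, service_time=4.0, seed=1):
    """Minimal time-stepped model of an airport check-in hall.
    Groups of passengers arrive by a Poisson-like process and join a
    single queue; each desk serves one passenger at a time.
    Returns the mean waiting time in minutes."""
    random.seed(seed)
    queue = []               # arrival times of waiting passengers
    desks = [0.0] * n_desks  # time at which each desk becomes free
    waits = []
    t = 0.0
    while t < horizon:
        t += random.expovariate(arrival_rate)          # next group arrives
        group = max(1, int(random.expovariate(1.0 / mean_group)))
        queue.extend([t] * group)
        # Serve waiting passengers at any desk that is free by time t
        for i in range(n_desks):
            while queue and desks[i] <= t:
                arrived = queue.pop(0)
                waits.append(t - arrived)
                desks[i] = t + random.expovariate(1.0 / service_time)
    return sum(waits) / len(waits) if waits else 0.0

print("mean wait (min):", round(simulate_checkin(), 1))
```

Varying `n_desks` or `mean_group` illustrates, even in this toy setting, how group arrivals and resourcing drive queue performance and service utilisation.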
Computation of ECG signal features using MCMC modelling in software and FPGA reconfigurable hardware
Abstract:
Computational optimisation of clinically important electrocardiogram (ECG) signal features within a single heart beat, using a Markov chain Monte Carlo (MCMC) method, is undertaken. A detailed, efficient, data-driven software implementation of an MCMC algorithm is presented. Software parallelisation is explored first, and it is shown that, despite the large degree of inter-dependency among model parameters, parallelisation is possible. An initial reconfigurable hardware approach is also explored for future applicability to real-time computation on a portable ECG device under continuous extended use.
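To illustrate the general flavour of the method, the sketch below uses a random-walk Metropolis-Hastings sampler (the simplest MCMC variant) to recover the amplitude, location and width of a single Gaussian-shaped wave from a noisy synthetic trace. The waveform model, bounds and step sizes are illustrative assumptions, not the paper's ECG feature model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "heartbeat": one Gaussian-shaped wave plus noise
t = np.linspace(0.0, 1.0, 200)
true_a, true_mu, true_w = 1.0, 0.5, 0.05
data = true_a * np.exp(-0.5 * ((t - true_mu) / true_w) ** 2) \
       + rng.normal(0.0, 0.05, t.size)

def log_post(theta, sigma=0.05):
    """Gaussian likelihood with a flat prior inside hard bounds."""
    a, mu, w = theta
    if not (0 < a < 5 and 0 < mu < 1 and 0.005 < w < 0.5):
        return -np.inf
    model = a * np.exp(-0.5 * ((t - mu) / w) ** 2)
    return -0.5 * np.sum((data - model) ** 2) / sigma ** 2

# Random-walk Metropolis-Hastings
theta = np.array([0.5, 0.4, 0.1])     # deliberately poor start
lp = log_post(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, [0.05, 0.01, 0.005])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

post = np.array(chain[2000:])          # discard burn-in
print("posterior mean (a, mu, w):", post.mean(axis=0).round(3))
```

The accept/reject step is the serial bottleneck the abstract alludes to: each iteration depends on the previous state, so parallelisation must exploit structure within the likelihood evaluation or run multiple chains.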
Abstract:
A bioeconomic model was developed to evaluate the potential performance of brown tiger prawn stock enhancement in Exmouth Gulf, Australia. This paper presents the framework for the bioeconomic model and a risk assessment for all components of a stock enhancement operation, i.e. hatchery, grow-out, releasing, population dynamics, fishery and monitoring, for a commercial-scale enhancement of about 100 metric tonnes, a 25% increase in the average annual catch in Exmouth Gulf. The model incorporates uncertainty in parameter estimates by using a distribution for each parameter over a certain range, based on experiments, published data or similar studies. Monte Carlo simulation was then used to quantify the effects of these uncertainties on the model output and on the economic potential of a particular production target. The model incorporates density-dependent effects in the nursery grounds of brown tiger prawns. The results predict that a release of 21 million 1 g prawns would produce an estimated enhanced prawn catch of about 100 t. This scale of enhancement has a 66.5% chance of making a profit. The largest contributor to the overall uncertainty of the enhanced prawn catch was post-release mortality, followed by the density-dependent mortality caused by released prawns. These two mortality rates are the most difficult to estimate in practice and are much under-researched in stock enhancement.
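The Monte Carlo step can be sketched as follows: draw each uncertain parameter from its distribution, push the draws through the catch and profit calculation, and read off the probability of profit. Every distribution and value below is a hypothetical placeholder, not an estimate from the Exmouth Gulf model:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sim = 10_000

# Hypothetical parameter distributions (illustration only)
released = 21e6                            # 1 g juveniles released
survival = rng.beta(5, 5, n_sim)           # post-release survival
capture = rng.uniform(0.3, 0.6, n_sim)     # fraction of survivors caught
weight = rng.normal(0.020, 0.003, n_sim)   # kg per prawn at capture
price = rng.normal(18.0, 2.0, n_sim)       # AU$ per kg
cost = rng.normal(1.2e6, 0.2e6, n_sim)     # AU$ cost of the operation

# Propagate the draws through the model
catch_kg = released * survival * capture * weight
profit = catch_kg * price - cost

print("median enhanced catch (t):", round(np.median(catch_kg) / 1000.0, 1))
print("P(profit > 0):", round(float(np.mean(profit > 0)), 3))
```

Ranking parameters by their contribution to output variance (e.g. by correlating each draw vector with `catch_kg`) is what identifies post-release and density-dependent mortality as the dominant uncertainties in the paper.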
Abstract:
We consider the problem of combining opinions from different experts in an explicitly model-based way to construct a valid subjective prior in a Bayesian statistical approach. We propose a generic approach based on a hierarchical model that accounts for various sources of variation as well as for potential dependence between experts. We apply this approach to two problems. The first is a food risk assessment problem involving dose-response modelling for Listeria monocytogenes contamination of mice. Two hierarchical levels of variation are considered (between and within experts), with a complex mathematical situation due to the use of an indirect probit regression. The second concerns the time taken by PhD students to submit their theses in a particular school. It illustrates a complex situation where three hierarchical levels of variation are modelled, but with a simpler underlying probability distribution (log-Normal).
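A minimal sketch of the pooling idea, for the simplest case of normal opinions and a single level of between-expert variation (the elicited numbers and the fixed between-expert standard deviation `tau` are invented; the paper's models are richer, with probit and log-Normal components):

```python
import numpy as np

# Hypothetical elicited opinions: each expert's (mean, sd) for a
# quantity of interest, e.g. log time-to-submission
experts = [(4.2, 0.3), (3.8, 0.5), (4.6, 0.4)]

def pool(experts, tau=0.25):
    """Precision-weighted pooling under m_j ~ N(theta, s_j^2 + tau^2),
    where tau captures between-expert variation and theta has a flat
    prior. Returns the pooled posterior mean and sd for theta."""
    m = np.array([e[0] for e in experts])
    v = np.array([e[1] ** 2 for e in experts]) + tau ** 2
    w = 1.0 / v                              # precision weights
    theta_hat = np.sum(w * m) / np.sum(w)
    theta_sd = np.sqrt(1.0 / np.sum(w))
    return theta_hat, theta_sd

theta_hat, theta_sd = pool(experts)
print(f"pooled prior: N({theta_hat:.2f}, {theta_sd:.2f}^2)")
```

Treating `tau` as unknown with its own prior, and adding further levels, gives the hierarchical structure described in the abstract; ignoring `tau` entirely would overstate the agreement between experts.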