803 results for Bayesian modelling


Relevance: 20.00%

Abstract:

Cost estimating is a key task within Quantity Surveyors' (QS) offices. Provision of an accurate estimate is vital to ensure that the objectives of the client are met by staying within the client's budget. Building Information Modelling (BIM) is an evolving technology that has gained attention in construction industries all over the world. Benefits from the use of BIM include cost and time savings, provided the processes used by the procurement team are adapted to maximise them. BIM can be used by QSs to automate aspects of quantity take-off and the preparation of estimates, decreasing turnaround time and assisting in controlling errors and inaccuracies. The Malaysian government has decided to require the use of BIM for its projects beginning in 2016. However, slow uptake of BIM is reported both within companies and in supporting collaboration across the Malaysian industry. It has been recommended that QSs start evaluating the impact of BIM on their practices. This paper reviews the perspectives of QSs in Malaysia on the use of BIM to achieve more dependable results in their cost estimating practice. The objectives of this paper include identifying strategies for improving practice and potential adoption drivers that lead QSs to use BIM in their construction projects. From the expert interviews, it was found that, despite still using traditional methods and not practising BIM, the interviewees have acquired some limited knowledge of BIM. Several drivers could potentially motivate them to employ BIM in their practices: client demands, innovation in traditional methods, speed in estimating costs, reduced time and costs, improvement in practices and self-awareness, efficiency in projects, and competition from other companies. The findings of this paper identify the potential drivers for encouraging Malaysian Quantity Surveyors to adopt BIM in their construction projects.

Relevance: 20.00%

Abstract:

Living cells are the functional units of organs, controlling their reactions to the external environment. However, the mechanics of living cells can be difficult to characterize because their microscale structures and the associated dynamic cellular processes are hard to observe directly. Fortunately, multiscale modelling provides a powerful simulation tool for studying the mechanical properties of these soft, hierarchical biological systems. This paper reviews recent developments in hierarchical multiscale modelling techniques aimed at understanding cytoskeleton mechanics. Discussion covers the main cytoskeletal components: intermediate filaments, microtubules, and microfilament networks. The mechanical performance of the different cytoskeletal components is discussed with respect to their structural and material properties. Explicit granular simulation methods with different coarse-graining strategies are adopted for these components, and the simulation details are introduced in this review.
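
As a rough illustration of what an explicit granular, coarse-grained representation can look like, here is a minimal bead-spring filament sketch in Python; the harmonic energy forms, stiffness values, and bead spacing are illustrative assumptions, not parameters from any of the reviewed models.

```python
# Hypothetical coarse-grained bead-spring filament: one chain of beads with
# harmonic stretching between neighbours and harmonic bending between
# consecutive bond directions. All parameter values are illustrative.
import numpy as np

k_stretch = 10.0   # bond stiffness (arbitrary units)
k_bend = 2.0       # angle stiffness (arbitrary units)
r0 = 1.0           # equilibrium bead spacing

def filament_energy(pos):
    """Total stretching + bending energy of a chain of beads (N x 3 array)."""
    bonds = pos[1:] - pos[:-1]
    lengths = np.linalg.norm(bonds, axis=1)
    e_stretch = 0.5 * k_stretch * np.sum((lengths - r0) ** 2)

    # Bending: penalise the angle between consecutive unit bond vectors.
    u = bonds / lengths[:, None]
    cos_theta = np.clip(np.sum(u[:-1] * u[1:], axis=1), -1.0, 1.0)
    e_bend = 0.5 * k_bend * np.sum(np.arccos(cos_theta) ** 2)
    return e_stretch + e_bend

# A straight 10-bead filament has zero energy; perturb one bead to excite both terms.
chain = np.zeros((10, 3))
chain[:, 0] = np.arange(10.0)
chain[5, 1] = 0.3
print(filament_energy(chain))
```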

Relevance: 20.00%

Abstract:

Computer modelling has been used extensively in some processes in the sugar industry to achieve significant gains. This paper reviews the investigations carried out over approximately the last twenty-five years, including the successes but also areas where problems and delays have been encountered. In that time the capabilities of both hardware and software have increased dramatically. For some processes, such as cane cleaning, cane billet preparation, and sugar drying, the application of computer modelling to improved equipment design and operation has been quite limited. A particular problem has been the large number of particles and particle interactions in these…

Relevance: 20.00%

Abstract:

Background: Multilevel and spatial models are being increasingly used to obtain substantive information on area-level inequalities in cancer survival. Multilevel models assume independent geographical areas, whereas spatial models explicitly incorporate geographical correlation, often via a conditional autoregressive prior. However, the relative merits of these methods for large population-based studies have not been explored. Using a case-study approach, we report on the implications of using multilevel and spatial survival models to study geographical inequalities in all-cause survival.

Methods: Multilevel discrete-time and Bayesian spatial survival models were used to study geographical inequalities in all-cause survival for a population-based colorectal cancer cohort of 22,727 cases aged 20–84 years diagnosed during 1997–2007 in Queensland, Australia.

Results: Both approaches were viable on this large dataset and produced similar estimates of the fixed effects. After adding area-level covariates, the between-area variability in survival estimated by the multilevel discrete-time models was no longer significant. Spatial inequalities in survival were also markedly reduced after adjusting for aggregated area-level covariates. Only the multilevel approach, however, provided an estimate of the contribution of geographical variation to the total variation in survival between individual patients.

Conclusions: With little difference observed between the two approaches in the estimation of fixed effects, multilevel models should be favored if there is a clear hierarchical data structure and measuring the independent impact of individual- and area-level effects on survival differences is of primary interest. Bayesian spatial analyses may be preferred if spatial correlation between areas is important and if the priority is to assess small-area variations in survival and map spatial patterns. Both approaches can be readily fitted to geographically enabled survival data from international settings.
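
For reference, standard formulations of the two model classes compared above are sketched here; the abstract does not give its exact linear predictors, so the forms and notation below are assumptions.

```latex
% Discrete-time multilevel survival model: hazard for patient i in area j
% during interval t, with an independent normal random intercept per area.
\operatorname{logit}(h_{ijt}) = \alpha_t + \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j,
\qquad u_j \sim N(0, \sigma_u^2)

% Bayesian spatial alternative: the area effects instead receive an intrinsic
% conditional autoregressive (CAR) prior over the m_j neighbours of area j,
% where k ~ j denotes adjacency.
u_j \mid \mathbf{u}_{-j} \sim N\!\left( \frac{1}{m_j} \sum_{k \sim j} u_k,\; \frac{\sigma_u^2}{m_j} \right)
```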

Relevance: 20.00%

Abstract:

Statistical comparison of oil samples is an integral part of oil spill identification, which deals with the process of linking an oil spill with its source of origin. In current practice, a frequentist hypothesis test is often used to evaluate evidence in support of a match between a spill and a source sample. As frequentist tests can only evaluate evidence against a hypothesis, not in support of it, we argue that this leads to unsound statistical reasoning. Moreover, currently only verbal conclusions on a very coarse scale can be made about the match between two samples, whereas a finer quantitative assessment would often be preferred. To address these issues, we propose a Bayesian predictive approach for evaluating the similarity between the chemical compositions of two oil samples. We derive the underlying statistical model from some basic assumptions about modeling assays in analytical chemistry, and, to further facilitate and improve numerical evaluations, we develop analytical expressions for the key elements of Bayesian inference for this model. The approach is illustrated with both simulated and real data and is shown to have appealing properties in comparison with both standard frequentist and Bayesian approaches.
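
A minimal sketch of the general idea, not the paper's actual derivation: under a conjugate normal-inverse-gamma model fitted per compound, the posterior predictive density of the spill measurements given the source sample has a closed Student-t form, and its value can serve as a quantitative match score. All priors and data below are invented for illustration.

```python
# Score how well a candidate source sample predicts a spill sample via the
# posterior predictive density under a normal-inverse-gamma (NIG) model.
import numpy as np
from scipy import stats

def log_posterior_predictive(train, test, mu0=0.0, kappa0=1e-3, alpha0=0.5, beta0=0.5):
    """Sum of Student-t log predictive densities of `test` given `train`."""
    n = len(train)
    xbar = train.mean()
    ss = ((train - xbar) ** 2).sum()
    # Standard NIG posterior updates.
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n)
    # Posterior predictive is Student-t with these parameters.
    scale = np.sqrt(beta_n * (kappa_n + 1) / (alpha_n * kappa_n))
    return stats.t.logpdf(test, df=2 * alpha_n, loc=mu_n, scale=scale).sum()

rng = np.random.default_rng(1)
source = rng.normal(5.0, 0.2, size=8)        # replicate assays of one compound
spill_match = rng.normal(5.0, 0.2, size=8)   # spill consistent with the source
spill_other = rng.normal(6.5, 0.2, size=8)   # spill from a different origin

print(log_posterior_predictive(source, spill_match))   # higher score
print(log_posterior_predictive(source, spill_other))   # much lower score
```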

Relevance: 20.00%

Abstract:

Aim: Determining how ecological processes vary across space is a major focus in ecology. Current methods for investigating such effects remain constrained by important limiting assumptions. Here we provide an extension to geographically weighted regression (GWR) in which local regression and spatial weighting are used in combination. This method can be used to investigate non-stationarity and spatial-scale effects using any regression technique that can accommodate uneven weighting of observations, including machine learning.

Innovation: We extend the use of spatial weights to generalized linear models and boosted regression trees using simulated data for which the results are known, and compare these local approaches with existing alternatives such as GWR. The spatial weighting procedure (1) explained up to 80% of the deviance in simulated species richness, (2) improved the normality of model residuals relative to GWR when applied to generalized linear models, and (3) detected nonlinear relationships and interactions between response variables and their predictors when applied to boosted regression trees. Predictor ranking changed with spatial scale, highlighting the scales at which different species–environment relationships need to be considered.

Main conclusions: GWR is useful for investigating spatially varying species–environment relationships. However, local weights implemented in alternative modelling techniques can help detect nonlinear relationships and high-order interactions that were previously unassessed. This method therefore not only informs us how location and scale influence our perception of patterns and processes, it also offers a way to deal with the different ecological interpretations that can emerge as different areas of spatial influence are considered during model fitting.
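
A minimal sketch of the spatial-weighting idea, assuming a Gaussian distance kernel and scikit-learn estimators that accept sample weights; the bandwidth, simulated data, and choice of learners are illustrative, not the authors' implementation.

```python
# Fit any weighted learner "locally" by down-weighting observations far from
# a focal location, so the fitted relationship can vary across space.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(500, 2))   # site locations
X = rng.normal(size=(500, 3))                 # environmental predictors
# Simulated richness whose slope on X[:, 0] drifts with longitude
# (i.e. a non-stationary species-environment relationship).
y = (1 + coords[:, 0] / 100) * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 500)

def local_fit(model, focal, bandwidth=20.0):
    d = np.linalg.norm(coords - focal, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)    # Gaussian spatial kernel
    return model.fit(X, y, sample_weight=w)    # any learner accepting weights

west = local_fit(LinearRegression(), focal=np.array([10.0, 50.0]))
east = local_fit(LinearRegression(), focal=np.array([90.0, 50.0]))
print(west.coef_[0], east.coef_[0])            # local slope increases eastward

# The same weights drop straight into a boosted regression tree:
brt = local_fit(GradientBoostingRegressor(), focal=np.array([50.0, 50.0]))
```

The bandwidth plays the role of spatial scale: refitting with several bandwidths shows how predictor rankings change as the area of spatial influence widens.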

Relevance: 20.00%

Abstract:

The focus of this paper is two-dimensional computational modelling of water flow in unsaturated soils consisting of weakly conductive disconnected inclusions embedded in a highly conductive connected matrix. When the inclusions are small, a two-scale model based on Richards' equation has been proposed in the literature, taking the form of an equation with effective parameters governing the macroscopic flow, coupled with a microscopic equation, defined at each point in the macroscopic domain, governing the flow in the inclusions. This paper is devoted to a number of advances in the numerical implementation of this model. Namely, by treating the micro-scale as a two-dimensional problem, our solution approach, based on a control volume finite element method, can be applied to irregular inclusion geometries and, if necessary, modified to account for additional phenomena (e.g. imposing the macroscopic gradient on the micro-scale via a linear approximation of the macroscopic variable along the microscopic boundary). This is achieved with the help of an exponential integrator for advancing the solution in time. This time integration method completely avoids generation of the Jacobian matrix of the system and hence eases the computation when solving the two-scale model in a completely coupled manner. Numerical simulations are presented for a two-dimensional infiltration problem.
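
For reference, the macroscopic equation is of Richards' type; a standard mixed form is shown below. The paper's effective parameters and the coupling term to the inclusion problem are not reproduced here.

```latex
% Mixed form of Richards' equation: volumetric water content \theta, pressure
% head \psi, hydraulic conductivity K, and vertical coordinate z. In the
% two-scale model, effective parameters and a source-like coupling to the
% micro-scale (inclusion) problem enter this macroscopic equation.
\frac{\partial \theta(\psi)}{\partial t}
  = \nabla \cdot \bigl( K(\psi)\, \nabla(\psi + z) \bigr)
```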

Relevance: 20.00%

Abstract:

In this paper the issue of finding uncertainty intervals for queries in a Bayesian network is reconsidered. The investigation focuses on Bayesian nets with discrete nodes and finite populations. An earlier asymptotic approach is compared with a simulation-based approach, together with further alternatives: one based on a single sample of the Bayesian net at a particular finite population size, and another which uses expected population sizes together with exact probabilities. We conclude that a query of a Bayesian net should be expressed as a probability embedded in an uncertainty interval. Based on an investigation of two Bayesian net structures, the preferred method is the simulation method. However, both the single-sample method and the expected-sample-size method may be useful, and both are simpler to compute. Any method at all is more useful than none when assessing a Bayesian net under development, or when drawing conclusions from an 'expert' system.
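
A hedged sketch of what a simulation-based uncertainty interval can look like on a toy two-node net; the structure, counts, and Dirichlet updating below are illustrative assumptions, not the paper's method.

```python
# Toy net Disease -> Test: draw conditional probability tables from
# Dirichlet posteriors implied by finite-population counts, evaluate the
# query for each draw, and report a percentile interval around it.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical finite-population counts.
n_disease = np.array([30, 970])        # [D=1, D=0]
n_test_d1 = np.array([27, 3])          # [T=1, T=0] among D=1
n_test_d0 = np.array([48, 922])        # [T=1, T=0] among D=0

queries = []
for _ in range(10_000):
    p_d = rng.dirichlet(n_disease + 1)[0]          # P(D=1)
    p_t_d1 = rng.dirichlet(n_test_d1 + 1)[0]       # P(T=1 | D=1)
    p_t_d0 = rng.dirichlet(n_test_d0 + 1)[0]       # P(T=1 | D=0)
    # Query: P(D=1 | T=1) by Bayes' rule.
    queries.append(p_d * p_t_d1 / (p_d * p_t_d1 + (1 - p_d) * p_t_d0))

lo, hi = np.percentile(queries, [2.5, 97.5])
print(f"P(D=1 | T=1) = {np.mean(queries):.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```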

Relevance: 20.00%

Abstract:

Bayesian networks (BNs) are tools for representing expert knowledge or evidence. They are especially useful for synthesising evidence or belief concerning a complex intervention, assessing the sensitivity of outcomes to different situations or contextual frameworks, and framing decision problems that involve alternative types of intervention. Bayesian networks are useful extensions to logic maps when initiating a review, and can facilitate synthesis and bridge the gap between evidence acquisition and decision-making. Formal elicitation techniques allow development of BNs on the basis of expert opinion. Such applications are useful alternatives to 'empty' reviews, which identify knowledge gaps but fail to support decision-making. Where review evidence exists, it can inform the development of a BN. We illustrate the construction of a BN using a motivating example that demonstrates how BNs can ensure coherence, transparently structure the problem addressed by a complex intervention, and assess sensitivity to context, all of which are critical components of robust reviews of complex interventions. We suggest that BNs be routinely used to synthesise evidence in reviews of complex interventions, and in empty reviews where decisions must be made despite poor evidence.

Relevance: 20.00%

Abstract:

This thesis investigates the use of fusion techniques and mathematical modelling to increase the robustness of iris recognition systems against iris image quality degradation, pupil size changes, and partial occlusion. The proposed techniques improve recognition accuracy and enhance security. They can be further developed for better iris recognition in less constrained environments that do not require user cooperation. A framework to analyse the consistency of different regions of the iris is also developed. This can be applied to improve recognition systems using partial iris images, and to cancelable biometric signatures or biometric-based cryptography for privacy protection.

Relevance: 20.00%

Abstract:

The process of spray drying is applied in a number of contexts. One such application is the production of a synthetic rock used for storage of nuclear waste. To establish a framework for a model of the spray drying process for this application, we here develop a model describing evaporation from droplets of pure water, such that the model may be extended to account for the presence of colloid within the droplet. We develop a spherically symmetric model and formulate continuum equations describing mass, momentum, and energy balance in both the liquid and gas phases from first principles. We establish appropriate boundary conditions at the surface of the droplet, including a generalised Clapeyron equation that accurately describes the temperature at the surface of the droplet. To account for the experimental design, we introduce a simplified platinum ball-and-wire model into the system using a thin-wire problem. The resulting system of equations is transformed in order to simplify a finite volume solution scheme. The results from numerical simulation are compared with data collected for validation, and the sensitivity of the model to variations in key parameters, and to the use of Clausius–Clapeyron and generalised Clapeyron equations, is investigated. Good agreement is found between the model and experimental data, despite the simplicity of the platinum phase model.
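
For context, these are the textbook relations that a generalised Clapeyron equation extends: the Clapeyron equation along the liquid-vapour coexistence curve, and the Clausius–Clapeyron approximation obtained by neglecting the liquid volume and treating the vapour as ideal. The paper's generalised form is not reproduced here.

```latex
% Clapeyron equation (latent heat L, specific-volume change \Delta v between
% phases), followed by the Clausius-Clapeyron approximation for an ideal
% vapour with specific gas constant R_v.
\frac{dP}{dT} = \frac{L}{T \, \Delta v},
\qquad
\frac{dP}{dT} \approx \frac{L\,P}{R_v\,T^{2}}
```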

Relevance: 20.00%

Abstract:

Gene expression is arguably the most important indicator of biological function. Thus identifying differentially expressed genes is one of the main aims of high-throughput studies that use microarray and RNAseq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge in this topic is estimating gene expression variance, due to the large amount of 'background noise' generated by biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we show that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent than the widely used Cyber-T tool, which successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly web-based graphical user interface for biologists to use with microarray and RNAseq data. Bayesian inference can compensate for the instability of variance estimates caused by small numbers of biological replicates by using pseudo replicates as prior knowledge. We also show that our new strategy for selecting pseudo replicates improves the performance of the analysis.
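
A minimal sketch of the variance-shrinkage idea, assuming a sliding window of similar-intensity genes serves as the pseudo replicates; the window size, prior weight, and the precision-weighted combination below are illustrative, not the paper's (or Cyber-T's) exact estimator.

```python
# Shrink each gene's noisy sample variance toward a background variance
# estimated from genes of similar mean expression (pseudo replicates).
import numpy as np

def moderated_variance(expr, nu0=6, window=50):
    """expr: genes x replicates matrix; returns shrunken per-gene variances."""
    n = expr.shape[1]
    s2 = expr.var(axis=1, ddof=1)            # unstable with few replicates
    order = np.argsort(expr.mean(axis=1))    # rank genes by mean intensity

    # Background variance: average variance over a window of genes with
    # similar intensity, acting as pseudo replicates / prior knowledge.
    s2_bg = np.empty_like(s2)
    for rank, g in enumerate(order):
        lo = max(0, rank - window // 2)
        s2_bg[g] = s2[order[lo:lo + window]].mean()

    # Weighted combination of prior (weight nu0) and observed variance.
    return (nu0 * s2_bg + (n - 1) * s2) / (nu0 + n - 1)

rng = np.random.default_rng(7)
expr = rng.normal(8.0, 1.0, size=(2000, 3))  # 2000 genes, 3 replicates
print(moderated_variance(expr)[:5])
```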

Relevance: 20.00%

Abstract:

Ship seakeeping operability refers to the quantification of motion performance in waves relative to mission requirements. This is used to make decisions about preferred vessel designs, but it can also be used as a comprehensive assessment of the benefits of ship-motion-control systems. Traditionally, operability computation aggregates statistics of motion computed over the envelope of likely environmental conditions in order to determine a coefficient in the range from 0 to 1 called operability. When used for assessment of motion-control systems, the increase in operability is taken as the key performance indicator. The operability coefficient is often given the interpretation of the percentage of time operable. This paper considers an alternative probabilistic approach to this traditional computation of operability. It characterises operability not as a number to which a frequency interpretation is attached, but as a hypothesis that a vessel will attain the desired performance in one mission considering the envelope of likely operational conditions. This enables the use of Bayesian theory to compute the probability that this hypothesis is true, conditional on data from simulations. Thus, the metric considered is the probability of operability. This formulation not only adheres to recent developments in reliability and risk analysis, but also allows more accurate descriptions of ship-motion-control systems to be incorporated into the analysis, since the analysis is not limited to linear ship responses in the frequency domain. The paper also discusses an extension of the approach to the assessment of increased levels of autonomy for unmanned marine craft.
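
A minimal sketch, assuming each simulated mission over the environmental envelope is scored as a Bernoulli trial of the hypothesis "performance criteria are met", with a Beta prior on the success probability; the counts and threshold are invented, and this is only one plausible reading of the approach.

```python
# Posterior over the probability of operability from simulated missions.
from scipy import stats

n_missions = 200    # simulated missions spanning the operational envelope
n_operable = 168    # missions meeting all motion-performance criteria

# Beta(1, 1) uniform prior updated with the simulation outcomes.
posterior = stats.beta(1 + n_operable, 1 + n_missions - n_operable)

print("posterior mean operability:", posterior.mean())
# Probability the vessel is operable at least 80% of the time, given the data:
print("P(theta > 0.80 | data):", posterior.sf(0.80))
```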

Relevance: 20.00%

Abstract:

Atheromatous plaque rupture is the cause of the majority of strokes and heart attacks in the developed world. The role of calcium deposits and their contribution to plaque vulnerability are controversial. Some studies have suggested that calcified plaque tends to be more stable, whereas others have suggested the opposite. This study uses a finite element model to evaluate the effect of calcium deposits on the stress within the fibrous cap by varying their location and size. The plaque fibrous cap, lipid pool, and calcification were modeled as hyperelastic, isotropic, (nearly) incompressible materials with different properties for large-deformation analysis, with time-dependent pressure loading assigned on the lumen wall. The stress and strain contours were plotted for each condition for comparison. Von Mises stress increases by only up to 1.5% when the location of a calcification within the lipid pool, distant from the fibrous cap, is varied. Calcification in the fibrous cap leads to a 43% increase in von Mises stress compared with calcification in the lipid pool. A 100% increase in calcification area leads to a 15% stress increase in the fibrous cap. Calcification in the lipid pool does not increase fibrous cap stress when it is distant from the fibrous cap, whilst large areas of calcification close to or within the fibrous cap may lead to a high stress concentration within the fibrous cap, which may cause plaque rupture. This study highlights the application of a computational model to the simulation of clinical problems, and it may provide insights into the mechanism of plaque rupture.
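
The abstract does not name the constitutive law; one common nearly incompressible hyperelastic choice consistent with its description is the neo-Hookean strain-energy density, shown here as an assumed example rather than the study's actual model.

```latex
% Nearly incompressible neo-Hookean strain-energy density: \bar{I}_1 is the
% first invariant of the isochoric right Cauchy-Green tensor, J = \det F, and
% C_{10}, D_1 are material parameters (small D_1 enforces near-incompressibility).
W = C_{10}\,\bigl( \bar{I}_1 - 3 \bigr) + \frac{1}{D_1}\,\bigl( J - 1 \bigr)^2
```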

Relevance: 20.00%

Abstract:

This paper proposes new metrics and a performance-assessment framework for vision-based weed and fruit detection and classification algorithms. In order to compare algorithms, and make a decision on which one to use for a particular application, it is necessary to take into account that the performance obtained in a series of tests is subject to uncertainty. Such characterisation of uncertainty seems not to be captured by the performance metrics currently reported in the literature. Therefore, we pose the problem as a general problem of scientific inference, which arises out of incomplete information, and propose as a metric of performance the (posterior) predictive probabilities that the algorithms will provide a correct outcome for target and background detection. We detail the framework through which these predictive probabilities can be obtained, which is Bayesian in nature. As an illustrative example, we apply the framework to the assessment of the performance of four algorithms that could potentially be used in the detection of capsicums (peppers).
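
A minimal sketch of a posterior predictive performance metric, assuming a uniform Beta(1, 1) prior on each algorithm's per-image success probability; the paper's framework may differ, and the test counts below are invented.

```python
# With a Beta(1, 1) prior on the success probability, the posterior predictive
# probability that the next detection is correct is (s + 1) / (n + 2).
def predictive_correct(successes, trials):
    """Posterior predictive probability of a correct outcome on the next image."""
    return (successes + 1) / (trials + 2)

# Two hypothetical capsicum detectors evaluated on different test-set sizes:
print(predictive_correct(46, 50))   # 0.904: strong and well supported
print(predictive_correct(5, 5))     # 0.857: perfect so far, but little data
```

Note how the detector with a perfect 5/5 record scores below the 46/50 detector: the predictive metric discounts performance that rests on little evidence, which is exactly the uncertainty that raw accuracy fails to capture.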