937 results for "Bose-Einstein condensation statistical model"


Relevance: 100.00%

Abstract:

A novel in-cylinder pressure method for determining ignition delay has been proposed and demonstrated. This method uses a new Bayesian statistical model to resolve the start of combustion, defined as the point at which the band-pass in-cylinder pressure deviates from background noise and the combustion resonance begins. Further, it is demonstrated that the method remains accurate when noise is present. The start of combustion can be resolved for each cycle without the need for ad hoc methods such as cycle averaging. Therefore, this method allows for analysis of consecutive cycles and inter-cycle variability studies. Ignition delay obtained by this method and by the net rate of heat release have been shown to give good agreement. However, the use of combustion resonance to determine the start of combustion is preferable over the net rate of heat release method because it does not rely on knowledge of heat losses and still functions accurately in the presence of noise. Results are presented for a six-cylinder turbo-charged common-rail diesel engine run with neat diesel fuel at full, three-quarter and half load. Under these conditions the ignition delay increased as the load was decreased, with a significant increase in ignition delay at half load compared with the three-quarter and full loads.
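The detection idea in this abstract, flagging the point where the band-pass pressure signal departs from background noise, can be illustrated with a much simpler stand-in. The sketch below replaces the paper's Bayesian model with a hypothetical k-sigma threshold plus a consecutive-sample run requirement; the synthetic signal, window length and threshold values are all invented for illustration.

```python
import math
import random

def start_of_combustion(signal, noise_window=200, k=4.0, run_length=5):
    """Return the first index at which `signal` departs from background noise.

    A simplified threshold detector: the paper's Bayesian model is replaced
    here by a k-sigma rule with a consecutive-sample run requirement.
    """
    noise = signal[:noise_window]
    mean = sum(noise) / len(noise)
    std = math.sqrt(sum((x - mean) ** 2 for x in noise) / len(noise))
    run = 0
    for i, x in enumerate(signal):
        if abs(x - mean) > k * std:
            run += 1
            if run >= run_length:
                return i - run_length + 1  # first sample of the run
        else:
            run = 0
    return None  # no departure from background noise detected

# Synthetic band-pass trace: noise, then a decaying resonance from sample 500.
random.seed(1)
trace = [random.gauss(0.0, 0.1) for _ in range(500)]
trace += [5.0 * math.exp(-0.01 * t) * math.sin(0.5 * t) + random.gauss(0.0, 0.1)
          for t in range(500)]
soc = start_of_combustion(trace)
```

Because detection requires several consecutive exceedances, isolated noise spikes do not trigger a false start, which mimics the robustness-to-noise property the abstract emphasises.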

Relevance: 100.00%

Abstract:

Information that is elicited from experts can be treated as 'data', and so can be analysed using a Bayesian statistical model to formulate a prior model. Typically, methods for encoding a single expert's knowledge have been parametric, constrained by the extent of an expert's knowledge and energy regarding a target parameter. Interestingly, these methods have often been deterministic, in that all elicited information is treated at 'face value', without error. Here we sought a parametric and statistical approach for encoding assessments from multiple experts. Our recent work proposed and demonstrated the use of a flexible hierarchical model for this purpose. In contrast to previous mathematical approaches like linear or geometric pooling, our new approach accounts for several sources of variation: elicitation error, encoding error and expert diversity. Of interest are the practical, mathematical and philosophical interpretations of this form of hierarchical pooling (which is both statistical and parametric), and how it fits within the subjective Bayesian paradigm. Case studies from a bioassay and project management (on PhDs) are used to illustrate the approach.
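The "previous mathematical approaches" this abstract contrasts against can be written down directly. Below is a minimal sketch of linear (arithmetic) and geometric pooling of two hypothetical experts' probability assessments; the expert vectors and weights are invented, and the hierarchical model itself is not reproduced here.

```python
import math

def linear_pool(dists, weights):
    """Weighted arithmetic mixture of expert probability vectors."""
    return [sum(w * d[i] for d, w in zip(dists, weights))
            for i in range(len(dists[0]))]

def geometric_pool(dists, weights):
    """Weighted geometric mixture, renormalised to sum to one."""
    raw = [math.prod(d[i] ** w for d, w in zip(dists, weights))
           for i in range(len(dists[0]))]
    total = sum(raw)
    return [r / total for r in raw]

# Two hypothetical experts assessing a three-category parameter.
expert_a = [0.7, 0.2, 0.1]
expert_b = [0.4, 0.4, 0.2]
w = [0.5, 0.5]
lp = linear_pool([expert_a, expert_b], w)
gp = geometric_pool([expert_a, expert_b], w)
```

Both pools treat the elicited numbers at face value, with no term for elicitation or encoding error, which is exactly the limitation the hierarchical approach is designed to address.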

Relevance: 100.00%

Abstract:

Purpose: Flat-detector, cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. Methods: The rich sources of prior information in IGRT are incorporated into a hidden Markov random field (MRF) model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk (OAR). The voxel labels are estimated using the iterated conditional modes (ICM) algorithm. Results: The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom (CIRS, Inc. model 062). The mean voxel-wise misclassification rate was 6.2%, with Dice similarity coefficient of 0.73 for liver, muscle, breast and adipose tissue. Conclusions: By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.
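The labelling step described here, iterated conditional modes over a hidden Markov random field, can be sketched on a toy lattice. The code below is a generic ICM implementation with a Gaussian data term and a Potts smoothness prior; the image, class means and parameter values are invented and far simpler than the CBCT setting.

```python
def icm_segment(image, means, sigma=1.0, beta=1.5, iters=5):
    """Label each pixel by iterated conditional modes (ICM).

    image: 2D list of intensities; means: per-class mean intensities.
    Energy = Gaussian data term + Potts smoothness term over 4-neighbours.
    """
    h, w = len(image), len(image[0])
    # Initialise with the maximum-likelihood label (nearest class mean).
    labels = [[min(range(len(means)), key=lambda k: abs(image[r][c] - means[k]))
               for c in range(w)] for r in range(h)]
    for _ in range(iters):
        for r in range(h):
            for c in range(w):
                best, best_e = labels[r][c], float("inf")
                for k in range(len(means)):
                    data = (image[r][c] - means[k]) ** 2 / (2 * sigma ** 2)
                    disagree = sum(
                        1 for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= r + dr < h and 0 <= c + dc < w
                        and labels[r + dr][c + dc] != k)
                    e = data + beta * disagree
                    if e < best_e:
                        best, best_e = k, e
                labels[r][c] = best
    return labels

# Hypothetical two-tissue image: dark background with one noisy bright pixel.
img = [[0.1, 0.0, 0.2],
       [0.1, 3.0, 0.1],   # isolated outlier, smoothed away by the prior
       [0.0, 0.1, 0.2]]
seg = icm_segment(img, means=[0.0, 5.0])
```

The Potts term is what lets spatial prior information override a noisy voxel intensity: the central outlier initially takes the bright label but is reassigned once its neighbours are taken into account.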

Relevance: 100.00%

Abstract:

We estimated the heritability and correlations between body and carcass weight traits in a cultured stock of giant freshwater prawn (GFP) (Macrobrachium rosenbergii) selected for harvest body weight in Vietnam. The data set consisted of 18,387 body and 1,730 carcass records, as well as full pedigree information collected over four generations. Variance and covariance components were estimated by restricted maximum likelihood fitting a multi-trait animal model. Across generations, estimates of heritability for body and carcass weight traits were moderate and ranged from 0.14 to 0.19 and 0.17 to 0.21, respectively. Body trait heritabilities estimated for females were significantly higher than for males whereas carcass weight trait heritabilities estimated for females and males were not significantly different (P > 0.05). Maternal effects for body traits accounted for 4 to 5% of the total variance and were greater in females than in males. Genetic correlations among body traits were generally high in the mixed sexes. Genetic correlations between body and carcass weight traits were also high. Although some issues remain regarding the best statistical model to be fitted to GFP data, our results suggest that selection for high harvest body weight based on breeding values estimated by fitting an animal model to the data can significantly improve mean body and carcass weight in GFP.
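The reported quantities are related by simple arithmetic: narrow-sense heritability is the additive genetic variance as a proportion of total phenotypic variance, and the maternal effect is the maternal variance as a proportion of the same total. The variance components below are illustrative only, chosen to fall within the ranges quoted in the abstract rather than taken from the paper.

```python
def heritability(sigma2_additive, sigma2_maternal, sigma2_residual):
    """Narrow-sense heritability and maternal-effect proportion from
    variance components (as estimated by REML under an animal model)."""
    sigma2_p = sigma2_additive + sigma2_maternal + sigma2_residual
    h2 = sigma2_additive / sigma2_p
    m2 = sigma2_maternal / sigma2_p
    return h2, m2

# Illustrative values only (not the paper's estimates): these give h2 = 0.17
# and a maternal proportion of 0.05, within the ranges reported.
h2, m2 = heritability(sigma2_additive=0.17, sigma2_maternal=0.05,
                      sigma2_residual=0.78)
```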

Relevance: 100.00%

Abstract:

Cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. The rich sources of prior information in IGRT are incorporated into a hidden Markov random field model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk. The voxel labels are estimated using iterated conditional modes. The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom. The mean voxel-wise misclassification rate was 6.2%, with Dice similarity coefficient of 0.73 for liver, muscle, breast and adipose tissue. By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.

Relevance: 100.00%

Abstract:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. Therefore it is very important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and for future planning. The aim of this study was to provide estimates of present day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study has been undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) hydrodynamic depth averaged model has been configured for the whole coastline of Australia using the Danish Hydraulics Institute’s Mike21 modelling suite of tools. The model has been forced with astronomical tidal levels, derived from the TPX07.2 global tidal model, and meteorological fields, from the US National Center for Environmental Prediction’s global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output has been validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and have been fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms. 
However, as the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australian region, with characteristics based on the observed tropical cyclones over the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present-day extreme water level probabilities around the whole coastline of Australia.
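The stage-1 fitting step, extreme value distributions fitted to annual maxima and converted to exceedance probabilities or return levels, can be sketched with a simple Gumbel fit. The method-of-moments estimator below is a simplified stand-in for the fitting actually used in the study, and the synthetic 61-year series is invented.

```python
import math
import random

EULER_GAMMA = 0.5772156649

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima.

    A simplified stand-in for the extreme value fitting in the study,
    which may use fuller GEV or r-largest methods.
    """
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale

def return_level(loc, scale, T):
    """Water level exceeded on average once every T years."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Synthetic 61-year series of annual maximum water levels (metres).
random.seed(42)
maxima = [1.5 + 0.2 * random.gammavariate(2.0, 1.0) for _ in range(61)]
loc, scale = gumbel_fit(maxima)
z100 = return_level(loc, scale, 100)
```

The sparsity problem the second stage addresses is visible here: a 61-year record supports a 100-year return level only through extrapolation, whereas 10,000 years of synthetic cyclone events allow rare levels to be estimated more directly.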

Relevance: 100.00%

Abstract:

Bayesian experimental design is a fast growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, facilitating more complex design problems to be solved. The Bayesian framework provides a unified approach for incorporating prior information and/or uncertainties regarding the statistical model with a utility function which describes the experimental aims. In this paper, we provide a general overview on the concepts involved in Bayesian experimental design, and focus on describing some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
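A minimal instance of the simulation-based design idea described above: estimate the expected utility of a design by averaging the utility over data simulated from the prior predictive distribution, then compare designs. The sketch below uses a hypothetical Bernoulli experiment with a conjugate Beta prior and negative posterior variance as the utility; the utility choice, prior and sample sizes are all illustrative, not taken from the paper.

```python
import random

def expected_utility(n, alpha=1.0, beta=1.0, draws=2000, rng=random.Random(0)):
    """Monte Carlo estimate of the expected utility of observing n Bernoulli
    trials, with utility = negative posterior variance of the success
    probability under a conjugate Beta(alpha, beta) prior."""
    total = 0.0
    for _ in range(draws):
        theta = rng.betavariate(alpha, beta)             # draw from the prior
        y = sum(rng.random() < theta for _ in range(n))  # prior predictive data
        a, b = alpha + y, beta + n - y                   # conjugate posterior
        post_var = a * b / ((a + b) ** 2 * (a + b + 1))
        total += -post_var
    return total / draws

# Searching over the (tiny) design space {n=10, n=40}: the larger design
# should leave less expected posterior uncertainty.
u10, u40 = expected_utility(10), expected_utility(40)
```

Real problems replace the conjugate update with Markov chain Monte Carlo, sequential Monte Carlo or approximate Bayes estimates of the utility, and replace the two-point comparison with a search algorithm over a continuous design space.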

Relevance: 100.00%

Abstract:

Objectives This study builds on research undertaken by Bernasco and Nieuwbeerta and explores the generalizability of a theoretically derived offender target selection model in three cross-national study regions. Methods Taking a discrete spatial choice approach, we estimate the impact of both environment- and offender-level factors on residential burglary placement in the Netherlands, the United Kingdom, and Australia. Combining cleared burglary data from all study regions in a single statistical model, we make statistical comparisons between environments. Results In all three study regions, the likelihood an offender selects an area for burglary is positively influenced by proximity to their home, the proportion of easily accessible targets, and the total number of targets available. Furthermore, in two of the three study regions, juvenile offenders under the legal driving age are significantly more influenced by target proximity than adult offenders. Post hoc tests indicate the magnitudes of these impacts vary significantly between study regions. Conclusions While burglary target selection strategies are consistent with opportunity-based explanations of offending, the impact of environmental context is significant. As such, the approach undertaken in combining observations from multiple study regions may aid criminology scholars in assessing the generalizability of observed findings across multiple environments.
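The discrete spatial choice model estimated in this study is a conditional logit: each offender's probability of selecting an area is a softmax over area utilities. The sketch below uses invented areas and coefficients, with signs matching the reported effects (proximity, accessible-target share and target counts all positive).

```python
import math

def choice_probabilities(attributes, coefs):
    """Discrete spatial choice (conditional logit): probability that an
    offender selects each area, given area attributes and coefficients."""
    utilities = [sum(b * x for b, x in zip(coefs, area)) for area in attributes]
    m = max(utilities)
    exps = [math.exp(u - m) for u in utilities]  # numerically stable softmax
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical areas: (proximity to home, accessible-target share, log targets).
areas = [(1.0, 0.30, 5.0),   # near home
         (0.2, 0.30, 5.0),   # far from home
         (1.0, 0.60, 5.0)]   # near home, more accessible targets
coefs = (1.5, 2.0, 0.5)      # invented positive effects, as in the abstract
probs = choice_probabilities(areas, coefs)
```

Cross-national comparison then amounts to estimating the coefficient vector separately (or jointly, with interactions) for each study region and testing whether the coefficients differ.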

Relevance: 100.00%

Abstract:

Meta-analysis is a method to obtain a weighted average of results from various studies. In addition to pooling effect sizes, meta-analysis can also be used to estimate disease frequencies, such as incidence and prevalence. In this article we present methods for the meta-analysis of prevalence. We discuss the logit and double arcsine transformations to stabilise the variance. We note the special situation of multiple category prevalence, and propose solutions to the problems that arise. We describe the implementation of these methods in the MetaXL software, and present a simulation study and the example of multiple sclerosis from the Global Burden of Disease 2010 project. We conclude that the double arcsine transformation is preferred over the logit, and that the MetaXL implementation of multiple category prevalence is an improvement in the methodology of the meta-analysis of prevalence.
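The two variance-stabilising transformations discussed can be written out directly. The sketch below computes the logit and Freeman-Tukey double arcsine transforms, with their usual approximate variances, for a hypothetical rare-outcome study; note that the double arcsine variance depends only on n, which is the stabilisation property the article relies on.

```python
import math

def logit_transform(x, n):
    """Logit of prevalence p = x/n, with approximate variance 1/x + 1/(n-x)."""
    p = x / n
    return math.log(p / (1 - p)), 1.0 / x + 1.0 / (n - x)

def double_arcsine_transform(x, n):
    """Freeman-Tukey double arcsine transform; its approximate variance is
    1/(n + 0.5) regardless of p, so it remains stable near 0 and 1."""
    t = (math.asin(math.sqrt(x / (n + 1)))
         + math.asin(math.sqrt((x + 1) / (n + 1))))
    return t, 1.0 / (n + 0.5)

# A hypothetical rare-outcome study: 3 cases among 1,000 subjects.
lt, lv = logit_transform(3, 1000)
dt, dv = double_arcsine_transform(3, 1000)
```

For this rare outcome the logit variance is dominated by the 1/x term, while the double arcsine variance stays near 1/n, which illustrates why the article prefers the latter for pooling prevalence.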

Relevance: 100.00%

Abstract:

BACKGROUND This paper describes the first national burden of disease study for South Africa. The main focus is the burden due to premature mortality, i.e. years of life lost (YLLs). In addition, estimates of the burden contributed by morbidity, i.e. the years lived with disability (YLDs), are obtained to calculate disability-adjusted life years (DALYs); and the impact of AIDS on premature mortality in the year 2010 is assessed. METHOD Owing to the rapid mortality transition and the lack of timely data, a modelling approach has been adopted. The total mortality for the year 2000 is estimated using a demographic and AIDS model. The non-AIDS cause-of-death profile is estimated using three sources of data: Statistics South Africa, the National Department of Home Affairs, and the National Injury Mortality Surveillance System. A ratio method is used to estimate the YLDs from the YLL estimates. RESULTS The top single cause of mortality burden was HIV/AIDS followed by homicide, tuberculosis, road traffic accidents and diarrhoea. HIV/AIDS accounted for 38% of total YLLs, which is proportionately higher for females (47%) than for males (33%). Pre-transitional diseases, usually associated with poverty and underdevelopment, accounted for 25%, non-communicable diseases 21% and injuries 16% of YLLs. The DALY estimates highlight the fact that mortality alone underestimates the burden of disease, especially with regard to unintentional injuries, respiratory disease, and nervous system, mental and sense organ disorders. The impact of HIV/AIDS is expected to more than double the burden of premature mortality by the year 2010. CONCLUSION This study has drawn together data from a range of sources to develop coherent estimates of premature mortality by cause. South Africa is experiencing a quadruple burden of disease comprising the pre-transitional diseases, the emerging chronic diseases, injuries, and HIV/AIDS. 
Unless interventions that reduce mortality and delay morbidity become widely available, the burden due to HIV/AIDS can be expected to grow very rapidly in the next few years. An improved base of information is needed to assess the morbidity impact more accurately.
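The YLL/YLD/DALY bookkeeping described above reduces to simple arithmetic. The sketch below shows the basic identities with invented numbers (no discounting or age weighting), including the ratio method the study uses to approximate YLDs from YLLs.

```python
def ylls(deaths, life_expectancy_at_death):
    """Years of life lost: deaths multiplied by the standard life expectancy
    at the age of death (no discounting or age weighting in this sketch)."""
    return deaths * life_expectancy_at_death

def dalys_from_ylls(yll, yld_to_yll_ratio):
    """The ratio method scales YLLs by a cause-specific YLD/YLL ratio to
    approximate the morbidity component, then sums to DALYs."""
    yld = yll * yld_to_yll_ratio
    return yll + yld

# Hypothetical cause: 10,000 deaths at a mean remaining life expectancy of
# 30 years, with an assumed YLD/YLL ratio of 0.2.
yll = ylls(10_000, 30)
daly = dalys_from_ylls(yll, 0.2)
```

The gap between `yll` and `daly` is exactly the point the abstract makes: mortality alone underestimates the burden for causes with a large morbidity component.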

Relevance: 100.00%

Abstract:

Background Summarizing the epidemiology of major depressive disorder (MDD) at a global level is complicated by significant heterogeneity in the data. The aim of this study is to present a global summary of the prevalence and incidence of MDD, accounting for sources of bias, and dealing with heterogeneity. Findings are informing MDD burden quantification in the Global Burden of Disease (GBD) 2010 Study. Method A systematic review of prevalence and incidence of MDD was undertaken. Electronic databases Medline, PsycINFO and EMBASE were searched. Community-representative studies adhering to suitable diagnostic nomenclature were included. A meta-regression was conducted to explore sources of heterogeneity in prevalence and guide the stratification of data in a meta-analysis. Results The literature search identified 116 prevalence and four incidence studies. Prevalence period, sex, year of study, depression subtype, survey instrument, age and region were significant determinants of prevalence, explaining 57.7% of the variability between studies. The global point prevalence of MDD, adjusting for methodological differences, was 4.7% (4.4–5.0%). The pooled annual incidence was 3.0% (2.4–3.8%), clearly at odds with the pooled prevalence estimates and the previously reported average duration of 30 weeks for an episode of MDD. Conclusions Our findings provide a comprehensive and up-to-date profile of the prevalence of MDD globally. Region and study methodology influenced the prevalence of MDD. This needs to be considered in the GBD 2010 study and in investigations into the ecological determinants of MDD. Good-quality estimates from low-/middle-income countries were sparse. More accurate data on incidence are also required.
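Pooling heterogeneous prevalence studies, as described above, is commonly done with a random-effects model. The sketch below implements DerSimonian-Laird pooling on invented logit-scale estimates; this is a generic, widely used estimator, not necessarily the exact model fitted in the study.

```python
def dersimonian_laird(estimates, variances):
    """Random-effects pooling of study estimates (DerSimonian-Laird).

    Returns the pooled estimate and the between-study variance tau^2,
    which quantifies the heterogeneity the abstract discusses."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    return pooled, tau2

# Hypothetical logit-prevalence estimates from five heterogeneous studies.
y = [-3.2, -2.9, -3.6, -2.5, -3.1]
v = [0.04, 0.06, 0.05, 0.08, 0.03]
pooled, tau2 = dersimonian_laird(y, v)
```

A meta-regression extends this by modelling the study estimates as a function of covariates (prevalence period, sex, instrument, region), so that part of tau-squared is explained rather than left as residual heterogeneity.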

Relevance: 100.00%

Abstract:

We describe an investigation into how Massey University's Pollen Classifynder can accelerate the understanding of pollen and its role in nature. The Classifynder is an imaging microscopy system that can locate, image and classify slide-based pollen samples. Given the laboriousness of purely manual image acquisition and identification, it is vital to exploit assistive technologies like the Classifynder to enable acquisition and analysis of pollen samples. It is also vital that we understand the strengths and limitations of automated systems so that they can be used (and improved) to complement the strengths and weaknesses of human analysts to the greatest extent possible. This article reviews some of our experiences with the Classifynder system and our exploration of alternative classifier models to enhance both accuracy and interpretability. Our experiments in the pollen analysis problem domain have been based on samples from the Australian National University's pollen reference collection (2,890 grains, 15 species) and images bundled with the Classifynder system (400 grains, 4 species). These samples have been represented using the Classifynder image feature set. We additionally work through a real-world case study where we assess the ability of the system to determine the pollen make-up of samples of New Zealand honey. In addition to the Classifynder's native neural network classifier, we have evaluated linear discriminant, support vector machine, decision tree and random forest classifiers on these data with encouraging results. Our hope is that our findings will help enhance the performance of future releases of the Classifynder and other systems for accelerating the acquisition and analysis of pollen samples.

Relevance: 100.00%

Abstract:

Busway stations are the interface between passengers and services. The station is crucial to line operation as it is typically the only location where buses can pass each other. Congestion may occur here when buses manoeuvring into and out of the platform lane interfere with bus flow, or when a queue of buses forms upstream of the platform lane, blocking the passing lane. Further, some systems include operation where express buses do not observe the station, resulting in a proportion of non-stopping buses. It is important to understand the operation of the station under this type of operation and its effect on busway capacity. This study uses microscopic simulation to treat the busway station operation and to analyse the relationship between station potential capacity where all buses stop, and mixed potential capacity where there is a mixture of stopping and non-stopping buses. First, the microsimulation technique is used to analyse the All Stopping Buses (ASB) scenario, and a statistical model is tuned and calibrated for a specified range of controlled dwell-time scenarios. Subsequently, a mathematical model is developed for Mixed Stopping Buses (MSB) potential capacity by introducing different proportions of express (or non-stopping) buses. The proposed models for busway station bus capacity provide a better understanding of operation and are useful to transit agencies in busway planning, design and operation.
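For orientation, the all-stopping-buses capacity can be approximated with the standard loading-area formula from transit capacity practice, and a crude adjustment shows why a non-stopping share raises throughput. This is an illustrative simplification, not the simulation-calibrated model the study develops: all parameter values are invented, and the 1/(1-p) mixed adjustment is a hypothetical upper bound, not the study's MSB model.

```python
def station_capacity(dwell, clearance, cv_dwell=0.4, z=1.28):
    """Loading-area bus capacity (buses/hour), TCQSM-style formula:
    capacity = 3600 / (clearance + dwell + z * cv * dwell),
    where z * cv * dwell is an operating margin for dwell-time variability."""
    return 3600.0 / (clearance + dwell + z * cv_dwell * dwell)

def mixed_capacity(stopping_capacity, p_nonstop):
    """Hypothetical upper-bound adjustment: if non-stopping buses bypass the
    platform via the passing lane, total throughput scales roughly by
    1 / (1 - p_nonstop) stopping-equivalent buses."""
    return stopping_capacity / (1.0 - p_nonstop)

# Invented values: 30 s mean dwell, 10 s clearance, 30% express buses.
all_stop = station_capacity(dwell=30.0, clearance=10.0)
mixed = mixed_capacity(all_stop, p_nonstop=0.3)
```

The study's contribution is precisely that this simple scaling breaks down under queueing interference between stopping and non-stopping buses, which is why a calibrated simulation-based model is needed.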


Relevance: 100.00%

Abstract:

Traditional sensitivity and elasticity analyses of matrix population models have been used to inform management decisions, but they ignore the economic costs of manipulating vital rates. For example, the growth rate of a population is often most sensitive to changes in adult survival rate, but this does not mean that increasing that rate is the best option for managing the population because it may be much more expensive than other options. To explore how managers should optimize their manipulation of vital rates, we incorporated the cost of changing those rates into matrix population models. We derived analytic expressions for locations in parameter space where managers should shift between management of fecundity and survival, for the balance between fecundity and survival management at those boundaries, and for the allocation of management resources to sustain that optimal balance. For simple matrices, the optimal budget allocation can often be expressed as simple functions of vital rates and the relative costs of changing them. We applied our method to management of the Helmeted Honeyeater (Lichenostomus melanops cassidix; an endangered Australian bird) and the koala (Phascolarctos cinereus) as examples. Our method showed that cost-efficient management of the Helmeted Honeyeater should focus on increasing fecundity via nest protection, whereas optimal koala management should focus on manipulating both fecundity and survival simultaneously. These findings are contrary to the cost-negligent recommendations of elasticity analysis, which would suggest focusing on managing survival in both cases. A further investigation of Helmeted Honeyeater management options, based on an individual-based model incorporating density dependence, spatial structure, and environmental stochasticity, confirmed that fecundity management was the most cost-effective strategy. Our results demonstrate that decisions that ignore economic factors will reduce management efficiency. 
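The core decision rule above, weighing each vital rate's effect on population growth against the cost of changing it, can be sketched as a benefit-per-cost comparison. The numbers below are invented, arranged only to reproduce the qualitative finding that a cost-negligent elasticity ranking (survival first) reverses once unit costs enter; the paper's analytic boundary expressions are not reproduced here.

```python
def best_management_action(sensitivities, unit_costs):
    """Rank vital rates by population-growth sensitivity per dollar.

    Classical elasticity analysis ranks by `sensitivities` alone; dividing
    by `unit_costs` (cost of a unit change in each vital rate) can reverse
    the ranking, as in the Helmeted Honeyeater example."""
    ratio = {k: sensitivities[k] / unit_costs[k] for k in sensitivities}
    return max(ratio, key=ratio.get), ratio

# Hypothetical numbers: growth is more sensitive to adult survival, but
# raising survival costs far more per unit than raising fecundity.
sens = {"adult_survival": 0.60, "fecundity": 0.25}
cost = {"adult_survival": 50_000.0, "fecundity": 8_000.0}
best, ratios = best_management_action(sens, cost)
```

Here elasticity alone would pick adult survival (0.60 > 0.25), but the per-dollar ratio favours fecundity management, mirroring the nest-protection recommendation in the abstract.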
©2006 Society for Conservation Biology.