957 results for Probability distributions


Relevance:

20.00%

Publisher:

Abstract:

Despite the prominent use of the Suchey-Brooks (S-B) method of age estimation in forensic anthropological practice, it is subject to intrinsic limitations, with reports of differential inter-population error rates between geographical locations. This study assessed the accuracy of the S-B method when applied to a contemporary adult population in Queensland, Australia, and provides robust age parameters calibrated for our population. Three-dimensional surface reconstructions were generated from computed tomography scans of the pubic symphysis of male and female Caucasian individuals aged 15–70 years (n = 195) in Amira® and Rapidform®. Error was analyzed on the basis of bias, inaccuracy and percentage correct classification for left and right symphyseal surfaces. Application of transition analysis and chi-square statistics demonstrated 63.9% and 69.7% correct age classification associated with the left symphyseal surface of Australian males and females, respectively, using the S-B method. Using Bayesian statistics, probability density distributions for each S-B phase were calculated, providing refined age parameters for our population. Mean inaccuracies of 6.77 (±2.76) and 8.28 (±4.41) years were reported for the left surfaces of males and females, respectively, with positive biases for younger individuals (<55 years) and negative biases for older individuals. Significant sexual dimorphism in the application of the S-B method was observed, and asymmetry in phase classification of the pubic symphysis was a frequent phenomenon. These results suggest that the S-B method should be applied with caution in medico-legal death investigations of Queensland skeletal remains, and warrant further investigation of reliable age estimation techniques.
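For orientation, the bias and inaccuracy statistics reported above are conventionally the mean signed error and the mean absolute error between estimated and known ages. A minimal sketch of that arithmetic (generic definitions, not the study's code; the ages are fabricated):

```python
# Bias and inaccuracy as conventionally defined in age-estimation
# validation studies (generic sketch, not the authors' code).
import numpy as np

def bias(estimated, known):
    """Mean signed error: positive = systematic over-ageing."""
    return float(np.mean(np.asarray(estimated) - np.asarray(known)))

def inaccuracy(estimated, known):
    """Mean absolute error between estimated and known ages."""
    return float(np.mean(np.abs(np.asarray(estimated) - np.asarray(known))))

# Fabricated ages, echoing the pattern reported above: over-ageing of
# younger individuals and under-ageing of older ones.
known = [23.0, 34.0, 48.0, 61.0]
estimated = [28.0, 37.0, 45.0, 52.0]
print(bias(estimated, known), inaccuracy(estimated, known))
```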

Relevance:

20.00%

Publisher:

Abstract:

The first representative chemical, structural, and morphological analysis of the solid particles from a single collection surface has been performed. This collection surface sampled the stratosphere between 17 and 19 km in altitude in the summer of 1981, and therefore before the 1982 eruptions of El Chichón. A particle collection surface was washed free of all particles with rinses of Freon and hexane, and the resulting wash was directed through a series of vertically stacked Nucleopore filters. The size cutoff for the solid particle collection process in the stratosphere is found to be considerably less than 1 μm. The total stratospheric number density of solid particles larger than 1 μm in diameter at the collection time is calculated to be about 2.7 × 10⁻¹ particles per cubic meter, of which approximately 95% are smaller than 5 μm in diameter. Previous classification schemes are expanded to explicitly recognize low atomic number material. With the single exception of the calcium-aluminum-silicate (CAS) spheres, all solid particle types show a logarithmic increase in number concentration with decreasing diameter. The aluminum-rich particles are unique in showing bimodal size distributions. In addition, spheres constitute only a minor fraction of the aluminum-rich material. About 2/3 of the particles examined were found to be shards of rhyolitic glass. This abundant volcanic material could not be correlated with any eruption plume known to have vented directly to the stratosphere. The micrometeorite number density calculated from this data set is 5 × 10⁻² micrometeorites per cubic meter of air, an order of magnitude greater than the best previous estimate. At the collection altitude, the maximum collision frequency of solid particles >5 μm in average diameter is calculated to be 6.91 × 10⁻¹⁶ collisions per second, which indicates negligible contamination of extraterrestrial particles in the stratosphere by solid anthropogenic particles.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we investigate the distribution of the product of Rayleigh distributed random variables. Considering the Mellin-Barnes inversion formula and using the saddle point approach, we obtain an upper bound for the product distribution. The accuracy of this tail approximation increases as the number of random variables in the product increases.
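A quick empirical counterpart to such a bound is a Monte Carlo estimate of the tail of a Rayleigh product. The sketch below is our illustration, not the paper's derivation; the number of factors, the scale, and the threshold are arbitrary:

```python
# Empirically estimate the tail probability P(X1 * ... * Xn > t) for
# independent Rayleigh random variables (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
n, samples = 4, 1_000_000                 # n factors in the product
x = rng.rayleigh(scale=1.0, size=(samples, n))
product = x.prod(axis=1)

threshold = 10.0
tail = float(np.mean(product > threshold))  # empirical tail probability
print(f"P(product > {threshold}) ~ {tail:.2e}")
```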

Relevance:

20.00%

Publisher:

Abstract:

This volume brings together the work of a group of distinguished scholars and active researchers in the field of media and communication studies to reflect upon the past, present, and future of new media research. The chapters examine the implications of new media technologies for everyday life, existing social institutions, and society at large at various levels of analysis. Macro-level analyses of changing techno-social formations – such as discussions of the rise of the surveillance society and the "fifth estate" – are combined with studies of concrete and specific new media phenomena, such as the rise of Pro-Am collaboration and "fan labor" online. In the process, prominent concepts in the field of new media studies, such as social capital, displacement, and convergence, are critically examined, while new theoretical perspectives are proposed and explicated. Reflecting the interdisciplinary nature of new media studies and communication research in general, the chapters interrogate these problems through a range of theoretical and methodological approaches. The book offers students and researchers interested in the social impact of new media both critical reviews of the existing literature and inspiration for developing new research questions.

Relevance:

20.00%

Publisher:

Abstract:

Context: Anti-Müllerian hormone (AMH) concentration reflects ovarian aging and is argued to be a useful predictor of age at menopause (AMP). It is hypothesized that AMH falling below a critical threshold corresponds to follicle depletion, which results in menopause. With this threshold, theoretical predictions of AMP can be made. Comparisons of such predictions with observed AMP from population studies support the role for AMH as a forecaster of menopause. Objective: The objective of the study was to investigate whether previous relationships between AMH and AMP are valid using a much larger data set. Setting: AMH was measured in 27 563 women attending fertility clinics. Study Design: From these data a model of age-related AMH change was constructed using a robust regression analysis. Data on AMP from subfertile women were obtained from the population-based Prospect-European Prospective Investigation into Cancer and Nutrition (Prospect-EPIC) cohort (n = 2249). By constructing a probability distribution of the age at which AMH falls below a critical threshold and fitting this to Prospect-EPIC menopausal age data using maximum likelihood, such a threshold was estimated. Main Outcome: The main outcome was conformity between observed and predicted AMP. Results: To get a distribution of AMH-predicted AMP that fit the Prospect-EPIC data, we found the critical AMH threshold should vary among women in such a way that women with low age-specific AMH would have lower thresholds, whereas women with high age-specific AMH would have higher thresholds (mean 0.075 ng/mL; interquartile range 0.038–0.15 ng/mL). Such a varying AMH threshold for menopause is a novel and biologically plausible finding. AMH became undetectable (<0.2 ng/mL) approximately 5 years before the occurrence of menopause, in line with a previous report. Conclusions: The conformity of the observed and predicted distributions of AMP supports the hypothesis that declining population averages of AMH are associated with menopause, making AMH an excellent candidate biomarker for AMP prediction. Further research will help establish the accuracy of AMH levels to predict AMP within individuals.
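To illustrate the threshold mechanism, the sketch below solves for the age at which an assumed AMH decline curve crosses a menopause threshold. The exponential form and all parameters are invented for illustration; they are not the authors' fitted model:

```python
# Find the age at which an assumed AMH decline curve falls below a
# critical threshold (hypothetical curve; not the fitted model).
import numpy as np
from scipy.optimize import brentq

def amh(age, a=2.0, b=0.12):
    """Assumed decline: AMH (ng/mL) = a * exp(-b * (age - 25))."""
    return a * np.exp(-b * (age - 25.0))

def predicted_amp(threshold=0.075, lo=30.0, hi=70.0):
    """Predicted age at menopause: first crossing of the threshold."""
    return brentq(lambda t: amh(t) - threshold, lo, hi)

print(predicted_amp())  # ~52 years with these illustrative parameters
```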

Relevance:

20.00%

Publisher:

Abstract:

In condition-based maintenance (CBM), effective diagnostic and prognostic tools are essential for maintenance engineers to identify imminent faults and predict the remaining useful life before components finally fail. This enables remedial actions to be taken in advance, and production to be rescheduled if necessary. All machine components are subject to degradation processes in real environments, and they have certain failure characteristics which can be related to the operating conditions. This paper describes a technique for accurate assessment of the remnant life of bearings based on health state probability estimation and historical knowledge embedded in a closed-loop diagnostics and prognostics system. The technique uses a Support Vector Machine (SVM) classifier as a tool for estimating the health state probability of the machine degradation process, in order to provide long-term prediction. To validate the feasibility of the proposed model, real-life historical fault data from bearings of High Pressure Liquefied Natural Gas (HP-LNG) pumps were analysed and used to obtain optimal predictions of remaining useful life (RUL). The results obtained were very encouraging and showed that the proposed prognosis system based on health state probability estimation has the potential to be used as an estimation tool for remnant life prediction in industrial machinery.
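As a minimal sketch of the health-state probability step, an off-the-shelf SVM with probability outputs can classify discrete degradation states; the features, labels and data below are fabricated stand-ins, not the HP-LNG pump data:

```python
# SVM classifier producing health-state probabilities over discrete
# degradation states (generic sketch; data are fabricated).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Fabricated condition-monitoring features for four health states,
# 0 = healthy ... 3 = near failure, with a drifting mean per state.
y = np.repeat(np.arange(4), 50)
X = rng.normal(size=(200, 5)) + 0.8 * y[:, None]

clf = SVC(probability=True).fit(X, y)   # Platt-scaled probabilities
state_probs = clf.predict_proba(X[:1])  # P(health state | features)
print(state_probs)                      # would feed the RUL estimate
```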

Relevance:

20.00%

Publisher:

Abstract:

Dose kernels may be used to calculate dose distributions in radiotherapy (as described by Ahnesjo et al., 1999). Their calculation requires the use of Monte Carlo methods, usually by forcing interactions to occur at a point. The Geant4 Monte Carlo toolkit provides a capability to force interactions to occur in a particular volume. We have modified this capability and created a Geant4 application to calculate dose kernels in cartesian, cylindrical, and spherical scoring systems. The simulation considers monoenergetic photons incident at the origin of a 3 m × 3 m × 3 m water volume. Photons interact via Compton scattering, the photoelectric effect, pair production, and Rayleigh scattering. By default, Geant4 models photon interactions by sampling a physical interaction length (PIL) for each process; the process returning the smallest PIL is then considered to occur. In order to force the interaction to occur within a given length, L_FIL, we scale each PIL according to the formula PIL_forced = L_FIL × (1 - exp(-PIL/PIL_0)), where PIL_0 is a constant. This ensures that the process occurs within L_FIL, whilst correctly modelling the relative probability of each process. Dose kernels were produced for incident photon energies of 0.1, 1.0, and 10.0 MeV. In order to benchmark the code, dose kernels were also calculated using the EGSnrc Edknrc user code. Identical scoring systems were used; namely, the collapsed cone approach of the Edknrc code. Relative dose difference images were then produced. Preliminary results demonstrate the ability of the Geant4 application to reproduce the shape of the dose kernels; median relative dose differences of 12.6%, 5.75%, and 12.6% were found for incident photon energies of 0.1, 1.0, and 10.0 MeV, respectively.
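The forcing step can be shown directly in code. This is our reading of the rescaling formula above, not the authors' Geant4 application, and the PIL values are invented:

```python
# Force an interaction inside L_FIL while preserving the relative
# probability of each competing process (sketch of the formula above).
import math

def forced_pil(pil, l_fil, pil0):
    """Map a sampled PIL onto (0, L_FIL) via 1 - exp(-PIL/PIL_0)."""
    return l_fil * (1.0 - math.exp(-pil / pil0))

# As in default Geant4 process competition, the process with the
# smallest (forced) PIL is taken to occur. Values are illustrative.
pils = {"compton": 12.0, "photoelectric": 35.0, "pair": 80.0}
forced = {p: forced_pil(v, l_fil=1.0, pil0=20.0) for p, v in pils.items()}
print(min(forced, key=forced.get), forced)
```

Because the mapping is monotonic, the ordering of the competing processes implied by the raw PILs is unchanged; only the interaction location is compressed into the forcing length.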

Relevance:

20.00%

Publisher:

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the 'gold standard' for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims.

Methods: (1a) Plan importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field.

(1b) Dose simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient.

(2) Dose comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial-resolution independent and able to interpolate for comparisons.

Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.

Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.

Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
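To make the comparison step concrete, the following is a minimal one-dimensional sketch of a global gamma evaluation, one of the metrics named above; it is a simplified illustration, not the project's implementation, and the profiles and tolerances are invented:

```python
# Global gamma evaluation in 1D: each reference point searches the
# evaluated distribution for the minimum combined dose/distance error.
import numpy as np

def gamma_1d(ref, evald, x, dose_tol=0.03, dist_tol=3.0):
    """Gamma index per reference point; a point passes where gamma <= 1."""
    gammas = np.empty_like(ref)
    for i, (xi, di) in enumerate(zip(x, ref)):
        dose_term = (evald - di) / (dose_tol * ref.max())  # global norm
        dist_term = (x - xi) / dist_tol                    # mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

x = np.linspace(0.0, 100.0, 101)                  # positions in mm
ref = np.exp(-((x - 50.0) / 20.0) ** 2)           # toy reference profile
evald = 1.01 * np.exp(-((x - 51.0) / 20.0) ** 2)  # shifted, scaled copy
print("pass rate:", float((gamma_1d(ref, evald, x) <= 1.0).mean()))
```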

Relevance:

20.00%

Publisher:

Abstract:

A technique for analysing exhaust emission plumes from unmodified locomotives under real-world conditions is described and applied to the task of characterizing plumes from railway trains servicing an Australian shipping port. The method utilizes the simultaneous measurement, downwind of the railway line, of the following pollutants: particle number, PM2.5 mass fraction, SO2, NOx and CO2, with the last of these being used as an indicator of fuel combustion. Emission factors are then derived, in terms of the number of particles and mass of pollutant emitted per unit mass of fuel consumed. Particle number size distributions are also presented. The practical advantages of the method are discussed, including the capacity to routinely collect emission factor data for passing trains and thereby build up a comprehensive real-world database for a wide range of pollutants. Samples from 56 train movements were collected, analyzed and presented. The quantitative results for emission factors are: EF(N) = (1.7±1) × 10¹⁶ kg⁻¹, EF(PM2.5) = (1.1±0.5) g·kg⁻¹, EF(NOx) = (28±14) g·kg⁻¹, and EF(SO2) = (1.4±0.4) g·kg⁻¹. The findings are compared with previously published work. Statistically significant (p < α, α = 0.05) correlations within the group of locomotives sampled were found between the emission factors for particle number and those for both SO2 and NOx.
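In essence, fuel-based emission factors of this kind are ratios of background-subtracted pollutant excess to CO2 excess in the plume, scaled by the CO2 emitted per kilogram of fuel burned. A hedged sketch of that arithmetic (the fuel constant is approximate and the plume values are fabricated):

```python
# Fuel-based emission factor from background-subtracted plume excesses
# (our reading of the CO2-tracer method; values are illustrative).
import numpy as np

CO2_PER_KG_FUEL = 3.18e3  # g CO2 per kg diesel burned (approximate)

def emission_factor(delta_x, delta_co2):
    """Pollutant emitted per kg fuel, via the pollutant/CO2 mass ratio."""
    return (np.sum(delta_x) / np.sum(delta_co2)) * CO2_PER_KG_FUEL

# Fabricated plume time series, in the same units above background:
delta_nox = np.array([4.0, 9.0, 6.0, 2.0])          # ug/m^3 excess
delta_co2 = np.array([0.5e3, 1.2e3, 0.8e3, 0.3e3])  # ug/m^3 excess
print(emission_factor(delta_nox, delta_co2), "g NOx per kg fuel")
```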

Relevance:

20.00%

Publisher:

Abstract:

The mean action time is the mean of a probability density function and can be interpreted as a critical time: a finite estimate of the time taken for the transient solution of a reaction-diffusion equation to effectively reach steady state. For high-variance distributions, the mean action time under-approximates the critical time, since it neglects the spread about the mean. We can improve the estimate of the critical time by calculating the higher moments of the probability density function, called the moments of action, which provide additional information regarding the spread about the mean. Existing methods for calculating the nth moment of action require the solution of n nonhomogeneous boundary value problems, which can be difficult and tedious to solve exactly. Here we present a simplified approach using Laplace transforms which allows us to calculate the nth moment of action without solving this family of boundary value problems and also without solving for the transient solution of the underlying reaction-diffusion problem. We demonstrate the generality of our method by calculating exact expressions for the moments of action for three problems from the biophysics literature. While the first problem we consider can be solved using existing methods, the second problem, which is readily solved using our approach, is intractable using previous techniques. The third problem illustrates how the Laplace transform approach can be used to study coupled linear reaction-diffusion equations.
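The simplification can be stated compactly: the Laplace transform of the density plays the role of a moment generating function, so each moment of action follows from differentiating the transform at the origin. In our notation, which is not necessarily the paper's:

```latex
% f(t): the probability density whose mean is the mean action time;
% F(s): its Laplace transform. The nth moment of action is then
\[
  M_n = \int_0^\infty t^n f(t)\,\mathrm{d}t
      = (-1)^n \left.\frac{\mathrm{d}^n F(s)}{\mathrm{d}s^n}\right|_{s=0},
\]
% so the moments follow from F(s) alone, with no need to solve the n
% boundary value problems or the transient reaction-diffusion problem.
```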

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a novel framework for the modelling of passenger facilitation in a complex environment. The research is motivated by the challenges in the airport complex system, where there are multiple stakeholders, differing operational objectives, and complex interactions and interdependencies between different parts of the airport system. Traditional methods for airport terminal modelling do not explicitly address the need for understanding causal relationships in a dynamic environment. Additionally, existing Bayesian Network (BN) models, which provide a means for capturing causal relationships, only present a static snapshot of a system. A method to integrate a BN complex systems model with stochastic queuing theory is developed based on the properties of the Poisson and Exponential distributions. The resultant Hybrid Queue-based Bayesian Network (HQBN) framework enables the simulation of arbitrary factors, their relationships, and their effects on passenger flow, and vice versa. A case study implementation of the framework is demonstrated on the inbound passenger facilitation process at Brisbane International Airport. The predicted outputs of the model, in terms of cumulative passenger flow at intermediary and end points in the inbound process, are found to have an R² goodness of fit of 0.9994 and 0.9982, respectively, over a 10 hour test period. The utility of the framework is demonstrated on a number of usage scenarios, including real-time monitoring and 'what-if' analysis. This framework provides the ability to analyse and simulate a dynamic complex system, and can be applied to other socio-technical systems such as hospitals.
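A minimal sketch of the queueing building block underlying the framework (Poisson arrivals and exponential service at a single FIFO server) is given below; it is a generic illustration with assumed rates, not the HQBN implementation, in which BN factors would modulate such rates:

```python
# Single-server FIFO queue with Poisson arrivals and exponential
# service times (Lindley-style recursion; rates are assumptions).
import numpy as np

rng = np.random.default_rng(2)

def single_server_queue(arrival_rate, service_rate, n_passengers):
    """Arrival and departure times (minutes) for each passenger."""
    arrivals = np.cumsum(rng.exponential(1 / arrival_rate, n_passengers))
    services = rng.exponential(1 / service_rate, n_passengers)
    departures = np.empty(n_passengers)
    t = 0.0
    for i, (a, s) in enumerate(zip(arrivals, services)):
        t = max(t, a) + s  # service starts once server and passenger are free
        departures[i] = t
    return arrivals, departures

arr, dep = single_server_queue(2.0, 2.5, 500)  # 2/min in, 2.5/min served
print("mean time in system (min):", float((dep - arr).mean()))
```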

Relevance:

20.00%

Publisher:

Abstract:

This article presents new theoretical and empirical evidence on the forecasting ability of prediction markets. We develop a model which predicts that the time until expiration of a prediction market should negatively affect the accuracy of prices as a forecasting tool, in the direction of a 'favourite/longshot bias'. That is, high-likelihood events are underpriced, and low-likelihood events are overpriced. We confirm this result using a large data set of prediction market transaction prices. Prediction markets are reasonably well calibrated when time to expiration is relatively short, but prices are significantly biased for events farther in the future. When the time value of money is considered, the miscalibration can be exploited to earn excess returns only when the trader has a relatively low discount rate.
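The role of the discount rate can be illustrated with simple present-value arithmetic; the numbers below are invented and are not from the paper's data set:

```python
# A contract pays $1 if the event occurs. With true probability q,
# price p, and continuous discount rate r over time-to-expiration T,
# buying has positive present value only if q > p * exp(r * T).
import math

p, q, T = 0.80, 0.85, 1.0   # underpriced favourite, one year out
for r in (0.02, 0.10):
    breakeven = p * math.exp(r * T)
    verdict = "exploitable" if q > breakeven else "not exploitable"
    print(f"r = {r:.2f}: need q > {breakeven:.3f} -> {verdict}")
```

With these illustrative numbers, the mispricing is profitable at a 2% discount rate but not at 10%, matching the conclusion above.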

Relevance:

20.00%

Publisher:

Abstract:

A known limitation of the Probability Ranking Principle (PRP) is that it does not cater for dependence between documents. Recently, the Quantum Probability Ranking Principle (QPRP) has been proposed, which implicitly captures dependencies between documents through “quantum interference”. This paper explores whether this new ranking principle leads to improved performance for subtopic retrieval, where novelty and diversity are required. In a thorough empirical investigation, models based on the PRP, as well as other recently proposed ranking strategies for subtopic retrieval (i.e. Maximal Marginal Relevance (MMR) and Portfolio Theory (PT)), are compared against the QPRP. On the given task, it is shown that the QPRP outperforms these other ranking strategies. Unlike MMR and PT, the QPRP requires no parameter estimation or tuning, making it both simple and effective. This research demonstrates that the application of quantum theory to problems within information retrieval can lead to significant improvements.
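A hedged sketch of a QPRP-style greedy ranking is given below, with the interference term approximated by a signed document similarity; this is our simplification for illustration, not the authors' estimation procedure, and estimating the interference is the real crux of the method:

```python
# Greedy QPRP-style ranking: the next document maximises its relevance
# probability plus interference terms with already-ranked documents.
# cos(theta) is crudely approximated here by negated similarity, so
# documents similar to those already ranked are penalised.
import numpy as np

def qprp_rank(p_rel, similarity):
    ranked, remaining = [], list(range(len(p_rel)))
    while remaining:
        def score(d):
            interference = sum(
                2.0 * np.sqrt(p_rel[d] * p_rel[r]) * -similarity[d, r]
                for r in ranked)
            return p_rel[d] + interference
        best = max(remaining, key=score)
        ranked.append(best)
        remaining.remove(best)
    return ranked

p = np.array([0.9, 0.85, 0.5])          # relevance probabilities
sim = np.array([[1.0, 0.9, 0.1],        # docs 0 and 1 are near-duplicates
                [0.9, 1.0, 0.1],
                [0.1, 0.1, 1.0]])
print(qprp_rank(p, sim))  # [0, 2, 1]: the diverse doc 2 is promoted
```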

Relevance:

20.00%

Publisher:

Abstract:

In this work, we summarise the development of a ranking principle based on quantum probability theory, called the Quantum Probability Ranking Principle (QPRP), and we also provide an overview of the initial experiments performed employing the QPRP. The main difference between the QPRP and the classic Probability Ranking Principle is that the QPRP implicitly captures the dependencies between documents by means of "quantum interference". Subsequently, the optimal ranking of documents is based not solely on the documents' probability of relevance but also on their interference with the previously ranked documents. Our research shows that the application of quantum theory to problems within information retrieval can lead to consistently better retrieval effectiveness, while still being simple, elegant and tractable.