93 results for Probabilities
Abstract:
We develop a recursion-relation approach for calculating the failure probabilities of a fiber bundle with local load sharing. This recursion relation is exact, so it provides a way to test the validity of the various approximate methods. Applying the exact calculation to uniform and Weibull threshold distributions, we find that the most probable failure load coincides with the average strength as the size of the system N --> infinity.
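The exact recursion itself is not reproduced in the abstract, but the quantity it computes is easy to illustrate. Below is a minimal Monte Carlo sketch (not the authors' recursion relation) of a one-dimensional fiber bundle with local load sharing: thresholds are uniform on (0, 1), a broken fiber's load passes in halves to its nearest surviving neighbours, and boundaries are open. All parameter values are illustrative.

```python
import numpy as np

def bundle_fails(load_per_fiber, n_fibers, rng):
    """Simulate one local-load-sharing bundle; return True if it collapses.

    Thresholds are i.i.d. uniform on (0, 1); when a fiber breaks, half its
    load goes to the nearest surviving neighbour on each side (load falling
    off an open boundary is lost).
    """
    thresholds = rng.random(n_fibers)
    loads = np.full(n_fibers, load_per_fiber)
    alive = np.ones(n_fibers, dtype=bool)
    while True:
        breaking = alive & (loads > thresholds)
        if not breaking.any():
            return not alive.any()  # stable: failed only if nothing survives
        for i in np.flatnonzero(breaking):
            alive[i] = False
            left = i - 1
            while left >= 0 and not alive[left]:
                left -= 1
            right = i + 1
            while right < n_fibers and not alive[right]:
                right += 1
            for j in (left, right):
                if 0 <= j < n_fibers:
                    loads[j] += loads[i] / 2.0
            loads[i] = 0.0

rng = np.random.default_rng(0)
for f in (0.10, 0.20, 0.30):
    p = np.mean([bundle_fails(f, 64, rng) for _ in range(500)])
    print(f"load/fiber = {f:.2f}  estimated failure probability = {p:.2f}")
```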
Abstract:
The photomineralisation of 4-chlorophenol (4-CP) sensitised by Degussa P25 TiO2 in O2-saturated solution represents a possible standard test system in semiconductor-sensitised photomineralisation studies. As part of a detailed examination of this photosystem, the results of the temporal variations in the concentrations of 4-CP, CO2, Cl- and the major organic intermediates, namely 4-chlorocatechol (4-CC), hydroquinone (HQ), benzoquinone and 4-chlororesorcinol, are reported. The observed variations in [4-CP], [4-CC], [HQ] and [CO2] fit those predicted by a kinetic model which utilises kinetic equations with a Langmuir-Hinshelwood form and assumes that there are three major routes by which the photogenerated hydroxyl radicals can react with 4-CP, i.e. 4-CP --> 4-CC, 4-CP --> HQ and 4-CP --> (unstable intermediate) --> CO2, and that these routes occur with probabilities of 48%, 10% and 42%, respectively.
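A toy version of such a kinetic scheme can be written down directly. The sketch below integrates Langmuir-Hinshelwood rate laws for the three routes with the reported branching fractions (48%, 10%, 42%); the rate and adsorption constants are invented rather than the paper's fitted values, and "[CO2]" here tracks converted 4-CP equivalents, ignoring the true carbon stoichiometry.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Branching fractions reported in the abstract
F_4CC, F_HQ, F_CO2 = 0.48, 0.10, 0.42

# Illustrative rate and adsorption constants (not fitted values)
k_cp, K_cp = 1.0, 0.5   # disappearance of 4-CP
k_cc, K_cc = 0.6, 0.5   # further oxidation of 4-CC
k_hq, K_hq = 0.6, 0.5   # further oxidation of HQ

def lh(c, k, K):
    """Langmuir-Hinshelwood rate k*K*c / (1 + K*c)."""
    return k * K * c / (1.0 + K * c)

def rhs(t, y):
    cp, cc, hq, co2 = y
    r_cp, r_cc, r_hq = lh(cp, k_cp, K_cp), lh(cc, k_cc, K_cc), lh(hq, k_hq, K_hq)
    return [
        -r_cp,                       # 4-CP consumed by all three routes
        F_4CC * r_cp - r_cc,         # 4-CC formed (48%) then oxidised
        F_HQ * r_cp - r_hq,          # HQ formed (10%) then oxidised
        F_CO2 * r_cp + r_cc + r_hq,  # direct route (42%) plus intermediates
    ]

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0, 0.0], dense_output=True)
for t in (0, 5, 10, 20):
    cp, cc, hq, co2 = sol.sol(t)
    print(f"t={t:>2}: [4-CP]={cp:.3f} [4-CC]={cc:.3f} [HQ]={hq:.3f} [CO2]={co2:.3f}")
```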
Abstract:
This paper introduces the discrete choice model-paradigm of Random Regret Minimization (RRM) to the field of environmental and resource economics. The RRM approach has very recently been developed in the context of travel demand modelling and presents a tractable, regret-based alternative to the dominant choice-modelling paradigm based on Random Utility Maximization theory (RUM theory). We highlight how RRM-based models provide closed-form, logit-type formulations for choice probabilities that capture semi-compensatory behaviour and choice-set composition effects while remaining as parsimonious as their utilitarian counterparts. Using data from a Stated Choice experiment aimed at identifying valuations of characteristics of nature parks, we compare RRM-based and RUM-based models in terms of parameter estimates, goodness of fit, elasticities and consequent policy implications.
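For readers unfamiliar with the RRM form, the commonly used logit-type formulation computes an alternative's regret by comparing it attribute-by-attribute against every competitor, then applies a logit over the negative regrets. A minimal sketch, with invented attribute values and taste parameters:

```python
import numpy as np

def rrm_probabilities(X, beta):
    """Random Regret Minimization choice probabilities.

    X    : (n_alternatives, n_attributes) attribute matrix
    beta : (n_attributes,) taste parameters

    The regret of alternative i sums, over competitors j and attributes m,
    ln(1 + exp(beta_m * (x_jm - x_im))); probabilities are then a logit
    over the negatives of the regrets.
    """
    n = X.shape[0]
    regret = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if j != i:
                regret[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))
    expneg = np.exp(-regret)
    return expneg / expneg.sum()

# Three hypothetical nature parks: (biodiversity score, travel cost)
X = np.array([[3.0, 10.0],
              [5.0, 25.0],
              [4.0, 15.0]])
beta = np.array([0.8, -0.1])        # illustrative, not estimated values
print(rrm_probabilities(X, beta))   # three probabilities summing to 1
```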
Abstract:
The electronic redistribution of an ion or atom induced by a sudden recoil of the nucleus occurring during the emission or capture of a neutral particle is theoretically investigated. For one-electron systems, analytical expressions are derived for the electronic transition probabilities to bound and continuum states. The quality of a B-spline basis set approach is evaluated from a detailed comparison with the analytical results. This numerical approach is then used to study the dynamics of two-electron systems (neutral He and Ne) using correlated wavefunctions for both the target and daughter ions. The total transition probabilities to discrete states, autoionizing states and direct single- and double-ionization probabilities are calculated from the pseudospectra. Sum rules for transition probabilities involving an initial bound state and a complete final series are discussed.
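For the simplest one-electron case, the sudden-recoil transition probability is just the squared overlap <f|exp(iq.r)|i>. The sketch below checks a direct quadrature against the closed-form hydrogen 1s --> 1s result in atomic units; it stands in for the paper's analytical-versus-B-spline comparison rather than reproducing it.

```python
import numpy as np
from scipy.integrate import quad

def p_1s_1s(q):
    """Probability that a hydrogen 1s electron (Z=1, atomic units)
    remains in 1s after a sudden nuclear recoil of momentum q.

    Amplitude = <1s| exp(i q.r) |1s>; the angular integration leaves a
    sin(qr)/(qr) factor under the radial integral.
    """
    integrand = lambda r: 4.0 * r**2 * np.exp(-2.0 * r) * np.sinc(q * r / np.pi)
    amp, _ = quad(integrand, 0.0, 50.0)
    return amp**2

for q in (0.5, 1.0, 2.0):
    analytic = (1.0 + q**2 / 4.0) ** -4  # closed-form 1s -> 1s probability
    print(f"q={q}: numeric={p_1s_1s(q):.6f}  analytic={analytic:.6f}")
```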
Abstract:
A new model to explain animal spacing, based on a trade-off between foraging efficiency and predation risk, is derived from biological principles. The model is able to explain not only the general tendency for animal groups to form, but some of the attributes of real groups. These include the independence of mean animal spacing from group population, the observed variation of animal spacing with resource availability and also with the probability of predation, and the decline in group stability with group size. The appearance of "neutral zones" within which animals are not motivated to adjust their relative positions is also explained. The model assumes that animals try to minimize a cost potential combining the loss of intake rate due to foraging interference and the risk from exposure to predators. The cost potential describes a hypothetical field giving rise to apparent attractive and repulsive forces between animals. Biologically based functions are given for the decline in interference cost and increase in the cost of predation risk with increasing animal separation. Predation risk is calculated from the probabilities of predator attack and predator detection as they vary with distance. Using example functions for these probabilities and foraging interference, we calculate the minimum cost potential for regular lattice arrangements of animals before generalizing to finite-sized groups and random arrangements of animals, showing optimal geometries in each case and describing how potentials vary with animal spacing. (C) 1999 Academic Press.
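The minimization the model performs can be illustrated with stand-in cost functions. The sketch below uses simple exponentials that only mimic the qualitative shapes described (interference cost falling, predation-risk cost rising with separation) and finds the spacing that minimizes the combined potential; the paper's biologically based functions are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative cost components; parameter values are invented.
def interference_cost(d, a=1.0, s=1.0):
    """Foraging-interference cost, declining with separation d."""
    return a * np.exp(-d / s)

def predation_cost(d, b=0.2, r=5.0):
    """Predation-risk cost, rising with separation (an approaching
    predator is less likely to be detected by a distant neighbour)."""
    return b * (1.0 - np.exp(-d / r))

def potential(d):
    return interference_cost(d) + predation_cost(d)

res = minimize_scalar(potential, bounds=(0.01, 50.0), method="bounded")
print(f"optimal spacing ~ {res.x:.2f}, minimum cost potential ~ {res.fun:.3f}")
```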
Abstract:
An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations where aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology from elicitation of knowledge about parameters to decision. It proposes an elicitation methodology where the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte Carlo simulation and interval analysis techniques. Nevertheless, results provided by these techniques, often in terms of probability intervals, may be too complex for a decision-maker to interpret and we therefore propose to compute a unique indicator of the likelihood of risk, called the confidence index. It explicitly accounts for the decision-maker's attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
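A stripped-down version of such hybrid propagation, for a toy monotone risk model with one aleatory and one epistemic input, might look as follows. The resulting probability interval and the Hurwicz-style pessimism weighting used for the final number are illustrative stand-ins in the spirit of the paper's confidence index, not its actual definition.

```python
import numpy as np

# Toy risk model: dose = release * exposure_factor; failure if dose > 1.
# 'release' is aleatory (lognormal); 'exposure_factor' is epistemic,
# known only to lie in an interval. All numbers are illustrative.
rng = np.random.default_rng(42)
release = rng.lognormal(mean=-1.0, sigma=0.8, size=100_000)  # aleatory sample
exposure_lo, exposure_hi = 1.0, 3.0                          # epistemic interval

# Propagate the interval through the (monotone) model for each MC draw:
dose_lo = release * exposure_lo
dose_hi = release * exposure_hi

# Bounds on the probability of exceeding the threshold (a probability interval):
p_lower = np.mean(dose_lo > 1.0)
p_upper = np.mean(dose_hi > 1.0)
print(f"P(dose > 1) lies in [{p_lower:.3f}, {p_upper:.3f}]")

# Collapse to a single number with a pessimism weight alpha reflecting the
# decision-maker's attitude to ambiguity (Hurwicz-style; an assumption here).
alpha = 0.7
confidence_index = alpha * p_upper + (1 - alpha) * p_lower
print(f"confidence index (alpha={alpha}): {confidence_index:.3f}")
```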
Abstract:
This paper studies the impact of belief elicitation on informational efficiency and individual behavior in experimental parimutuel betting markets. In one treatment, groups of eight participants, who possess a private signal about the eventual outcome, play a sequential betting game. The second treatment is identical, except that bettors are observed by eight other participants who submit incentivized beliefs about the winning probabilities of each outcome. In the third treatment, the same individuals make bets and assess the winning probabilities of the outcomes. Market probabilities more accurately reflect objective probabilities in the third than in the other two treatments. Submitting beliefs reduces the favorite-longshot bias and making bets improves the accuracy of elicited beliefs. A level-k framework provides some insights about why belief elicitation improves the capacity of betting markets to aggregate information. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
The common prior assumption justifies private beliefs as posterior probabilities derived from updating a common prior with individual information. We dispense with the common prior assumption for a homogeneous oligopoly market with uncertain costs and firms entertaining arbitrary priors about other firms' cost types. We show that true prior beliefs cannot be evolutionarily stable when truly expected profit measures (reproductive) success.
Abstract:
The greatest common threat to birds in Madagascar has historically been anthropogenic deforestation. During recent decades, global climate change has also come to be regarded as a significant threat to biodiversity. This study uses Maximum Entropy species distribution modeling to explore how potential climate change could affect the distribution of 17 threatened forest endemic bird species, using a range of climate variables from the Hadley Centre's HadCM3 climate change model, for IPCC scenario B2a, for 2050. We explore the importance of forest cover as a modeling variable and we test the use of pseudo-presences drawn from extent-of-occurrence distributions. Inclusion of the forest cover variable improves the models, and models derived from real-presence data with the forest layer are better predictors than those from pseudo-presence data. Using real-presence data, we analyzed the impacts of climate change on the distribution of nine species; we could not predict the impact of climate change on the remaining eight species because of low numbers of occurrences. All nine species were predicted to experience reductions in their total range areas and in their maximum modeled probabilities of occurrence. In general, range and altitudinal contractions follow the declining trend of the maximum presence probability. Only two species (Tyto soumagnei and Newtonia fanovanae) are expected to expand their altitude range. These results indicate that the future availability of suitable habitat at different elevations is likely to be critical for species persistence through climate change. Five species (Eutriorchis astur, Neodrepanis hypoxantha, Mesitornis unicolor, Euryceros prevostii, and Oriolia bernieri) are probably the most vulnerable to climate change. Four of them (E. astur, M. unicolor, E. prevostii, and O. bernieri) were found to be vulnerable to forest fragmentation in previous research. A combination of these two threats could drastically affect these species in the future. Climate change is expected to act differently on each species, and it is important to incorporate complex ecological variables into species distribution models.
Abstract:
We present results for a suite of 14 three-dimensional, high-resolution hydrodynamical simulations of delayed-detonation models of Type Ia supernova (SN Ia) explosions. This model suite comprises the first set of three-dimensional SN Ia simulations with detailed isotopic yield information. As such, it may serve as a data base for Chandrasekhar-mass delayed-detonation model nucleosynthetic yields and for deriving synthetic observables such as spectra and light curves. We employ a physically motivated, stochastic model based on turbulent velocity fluctuations and fuel density to calculate in situ the deflagration-to-detonation transition probabilities. To obtain different strengths of the deflagration phase and thereby different degrees of pre-expansion, we have chosen a sequence of initial models with 1, 3, 5, 10, 20, 40, 100, 150, 200, 300 and 1600 (two different realizations) ignition kernels in a hydrostatic white dwarf with a central density of 2.9 × 10^9 g cm^-3, as well as one high central density (5.5 × 10^9 g cm^-3) and one low central density (1.0 × 10^9 g cm^-3) rendition of the 100 ignition kernel configuration. For each simulation, we determined detailed nucleosynthetic yields by postprocessing 10^6 tracer particles with a 384 nuclide reaction network. All delayed-detonation models result in explosions unbinding the white dwarf, producing a range of 56Ni masses from 0.32 to 1.11 M⊙. As a general trend, the models predict that the stable neutron-rich iron-group isotopes are not found at the lowest velocities, but rather at intermediate velocities (~3000-10 000 km s^-1) in a shell surrounding a Ni-rich core. The models further predict relatively low-velocity oxygen and carbon, with typical minimum velocities around 4000 and 10 000 km s^-1, respectively. © 2012 The Authors. Published by Oxford University Press on behalf of the Royal Astronomical Society.
Abstract:
Aim: To determine whether internal limiting membrane (ILM) peeling is cost-effective compared with no peeling for patients with an idiopathic stage 2 or 3 full-thickness macular hole. Methods: A cost-effectiveness analysis was performed alongside a randomised controlled trial. 141 participants were randomly allocated to receive macular-hole surgery, with either ILM peeling or no peeling. Health-service resource use, costs and quality of life were calculated for each participant. The incremental cost per quality-adjusted life year (QALY) gained was calculated at 6 months. Results: At 6 months, the total costs were on average higher (£424, 95% CI -182 to 1045) in the No Peel arm, primarily owing to the higher reoperation rate in the No Peel arm. The mean additional QALYs from ILM peel at 6 months were 0.002 (95% CI -0.01 to 0.013), adjusting for baseline EQ-5D and other minimisation factors. A mean incremental cost per QALY was not computed, as Peeling was on average less costly and slightly more effective. A stochastic analysis suggested that there was more than a 90% probability that Peeling would be cost-effective at a willingness-to-pay threshold of £20 000 per QALY. Conclusion: Although there is no evidence of a statistically significant difference in either costs or QALYs between macular hole surgery with or without ILM peeling, the balance of probabilities is that ILM Peeling is likely to be a cost-effective option for the treatment of macular holes. Further long-term follow-up data are needed to confirm these findings.
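The "probability of being cost-effective" in such a stochastic analysis is the share of simulated cost/QALY differences with positive net monetary benefit at the chosen threshold. The sketch below centres a toy simulation on the reported point estimates; the standard errors are invented for illustration, whereas the trial's analysis used patient-level data.

```python
import numpy as np

# Reported point estimates: Peel is on average ~GBP 424 cheaper and yields
# ~0.002 more QALYs. The spreads below are assumptions, not trial values.
rng = np.random.default_rng(1)
n = 10_000
delta_cost = rng.normal(-424.0, 310.0, n)   # Peel minus No Peel, GBP
delta_qaly = rng.normal(0.002, 0.006, n)    # Peel minus No Peel, QALYs

wtp = 20_000.0                               # willingness to pay per QALY
net_benefit = wtp * delta_qaly - delta_cost  # incremental net monetary benefit

p_cost_effective = np.mean(net_benefit > 0)
print(f"P(Peel cost-effective at GBP {wtp:,.0f}/QALY) = {p_cost_effective:.2f}")
```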
Abstract:
Aim-To develop an expert system model for the diagnosis of fine needle aspiration cytology (FNAC) of the breast.
Methods-Knowledge and uncertainty were represented in the form of a Bayesian belief network which permitted the combination of diagnostic evidence in a cumulative manner and provided a final probability for the possible diagnostic outcomes. The network comprised 10 cytological features (evidence nodes), each independently linked to the diagnosis (decision node) by a conditional probability matrix. The system was designed to be interactive in that the cytopathologist entered evidence into the network in the form of likelihood ratios for the outcomes at each evidence node (a sketch of this evidence-combination scheme follows the abstract).
Results-The efficiency of the network was tested on a series of 40 breast FNAC specimens. The highest diagnostic probability provided by the network agreed with the cytopathologists' diagnosis in 100% of cases for the assessment of discrete benign and malignant aspirates. Atypical, probably benign cases were given probabilities in favour of a benign diagnosis. Suspicious cases tended to have similar probabilities for both diagnostic outcomes and so, correctly, could not be assigned as benign or malignant. A closer examination of cumulative belief graphs for the diagnostic sequence of each case provided insight into the diagnostic process, and quantitative data which improved the identification of suspicious cases.
Conclusion-The further development of such a system will have three important roles in breast cytodiagnosis: (1) to aid the cytologist in making a more consistent and objective diagnosis; (2) to provide a teaching tool on breast cytological diagnosis for the non-expert; and (3) to serve as the first stage in the development of a system capable of automated diagnosis through the use of expert system machine vision.
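With independent evidence nodes, the cumulative updating described in the Methods reduces to multiplying the prior odds by successive likelihood ratios. A minimal sketch of that scheme, with invented likelihood ratios (the paper's conditional probability matrices are not reproduced):

```python
def update_belief(prior_malignant, likelihood_ratios):
    """Sequentially combine independent cytological evidence.

    Each likelihood ratio is P(feature | malignant) / P(feature | benign);
    posterior odds = prior odds * product of ratios. Returns the running
    probability of malignancy after each feature, i.e. the 'cumulative
    belief graph' of the abstract.
    """
    odds = prior_malignant / (1.0 - prior_malignant)
    trajectory = []
    for lr in likelihood_ratios:
        odds *= lr
        trajectory.append(odds / (1.0 + odds))
    return trajectory

# Hypothetical likelihood ratios for a handful of the 10 features
# (e.g. nuclear pleomorphism, cell dissociation, ...); values invented.
lrs = [4.0, 2.5, 0.8, 6.0, 1.2]
for step, p in enumerate(update_belief(0.5, lrs), start=1):
    print(f"after feature {step}: P(malignant) = {p:.3f}")
```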
Abstract:
This article examines changes in attitudes to gender roles in contemporary Britain by using a first-order Markov process in which cumulative transition probabilities are logistic functions of a set of personal and socioeconomic characteristics of respondents. The data are taken from the British Household Panel Study (BHPS). The attitudinal responses examined take the form of ordinal responses concerning gender roles in 1991 and 2003. The likelihood function is partitioned to make possible the use of existing software for estimating model parameters. For the BHPS data, it was found that, depending on the value of the response in 1991, a variety of factors were important determinants of attitudes to gender roles by 2003.
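A cumulative-logit transition model of this kind can be sketched compactly: for a given 1991 response, the probability that the 2003 response falls at or below category k is a logistic function of covariates, and adjacent cumulative probabilities are differenced to obtain category probabilities. All cut-points and coefficients below are invented for illustration.

```python
import numpy as np

def transition_probabilities(x, alphas, beta):
    """Cumulative-logit transition probabilities for one respondent.

    P(2003 response <= k | covariates x, 1991 response) is modelled as
    logistic(alpha_k - x.beta); differencing adjacent cumulative
    probabilities gives the probability of each ordinal category. The
    alphas are increasing cut-points and beta the covariate effects; a
    separate set would be estimated for each 1991 response value.
    """
    cum = 1.0 / (1.0 + np.exp(-(np.asarray(alphas) - x @ beta)))
    cum = np.concatenate(([0.0], cum, [1.0]))
    return np.diff(cum)

# Illustrative numbers only: two covariates (e.g. age, years of education)
x = np.array([45.0, 12.0])
alphas = [30.0, 33.0, 36.0]   # cut-points for four ordinal categories
beta = np.array([0.3, 1.5])   # invented coefficients

probs = transition_probabilities(x, alphas, beta)
print(probs, probs.sum())     # four probabilities summing to 1
```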
Abstract:
The equiprobability bias is a tendency for individuals to think of probabilistic events as 'equiprobable' by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based on a misunderstanding of the concept of randomness.
Abstract:
With the growing interest in the topic of attribute non-attendance, there is now widespread use of latent class (LC) structures aimed at capturing such behaviour, across a number of different fields. Specifically, these studies rely on a confirmatory LC model, using two separate values for each coefficient, one of which is fixed to zero while the other is estimated, and then use the obtained class probabilities as an indication of the degree of attribute non-attendance. In the present paper, we argue that this approach is in fact misguided, and that the results are likely to be affected by confounding with regular taste heterogeneity. We contrast the confirmatory model with an exploratory LC structure in which the values in both classes are estimated. We also put forward a combined latent class mixed logit model (LC-MMNL) which allows jointly for attribute non-attendance and for continuous taste heterogeneity. Across three separate case studies, the exploratory LC model clearly rejects the confirmatory LC approach and suggests that rates of non-attendance may be much lower than what is suggested by the standard model, or even zero. The combined LC-MMNL model similarly produces significant improvements in model fit, along with substantial reductions in the implied rate of attribute non-attendance, in some cases even eliminating the phenomenon across the sample population. Our results thus call for a reappraisal of the large body of recent work that has implied high rates of attribute non-attendance for some attributes. Finally, we also highlight a number of general issues with attribute non-attendance, in particular relating to the computation of willingness-to-pay measures.
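The confirmatory structure the paper criticises is simple to state: a two-class latent class logit in which one class fixes a coefficient (here, cost) to zero, with the membership probability of that class read as the non-attendance rate. A minimal sketch of the resulting choice probability, with invented names and values; the exploratory variant would instead estimate both classes' coefficients freely.

```python
import numpy as np

def logit_probs(V):
    """Multinomial logit probabilities from a vector of utilities."""
    e = np.exp(V - V.max())
    return e / e.sum()

def confirmatory_lc_prob(X, chosen, beta, beta_cost, pi_attend):
    """Choice probability under a 2-class confirmatory LC structure.

    Class 1 ('attends to cost') uses beta_cost; class 2 fixes the cost
    coefficient to zero, which the literature reads as non-attendance.
    pi_attend is the class-membership probability. X has columns
    (quality, cost); all names and values here are illustrative.
    """
    v_attend = X[:, 0] * beta + X[:, 1] * beta_cost
    v_ignore = X[:, 0] * beta              # cost coefficient fixed at zero
    p1 = logit_probs(v_attend)[chosen]
    p2 = logit_probs(v_ignore)[chosen]
    return pi_attend * p1 + (1.0 - pi_attend) * p2

X = np.array([[3.0, 10.0],
              [5.0, 25.0]])
print(confirmatory_lc_prob(X, chosen=1, beta=0.9, beta_cost=-0.08, pi_attend=0.7))
```

As the paper argues, a low pi_attend in this structure need not mean genuine non-attendance: respondents with small but non-zero cost sensitivity can be absorbed into the zero-coefficient class, confounding non-attendance with ordinary taste heterogeneity.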