951 results for exceedance probabilities
Abstract:
This paper considers the problem of inducing low-risk individuals of all ages to buy private health insurance in Australia. Our proposed subsidy scheme improves upon the age-based penalty scheme under the current "Australian Lifetime Cover" (LTC) scheme. We generate an alternative subsidy profile that obviates adverse selection in private health insurance markets with mandated, age-based, community rating. Our proposal is novel in that we generate subsidies that are both risk- and age-specific, based upon actual risk probabilities. The approach we take may prove useful in other jurisdictions where the extant law mandates community rating in private health insurance markets. Furthermore, our approach is useful in jurisdictions that seek to maintain private insurance to complement existing universal public systems.
Abstract:
Many populations have a negative impact on their habitat or upon other species in the environment if their numbers become too large. For this reason they are often subjected to some form of control. One common control regime is the reduction regime: when the population reaches a certain threshold it is controlled (for example culled) until it falls below a lower predefined level. The natural model for such a controlled population is a birth-death process with two phases, the phase determining which of two distinct sets of birth and death rates governs the process. We present formulae for the probability of extinction and the expected time to extinction, and discuss several applications. (c) 2006 Elsevier Inc. All rights reserved.
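As a rough illustration of the model described above (not the authors' analytical formulae), a two-phase birth-death process under a reduction regime can be simulated directly, with the extinction probability estimated by Monte Carlo. All parameter values below are hypothetical:

```python
import random

def simulate(b, d, bc, dc, upper, lower, n0, t_max=50.0, seed=None):
    """Simulate one path of a two-phase birth-death process.

    The free phase uses per-capita rates (b, d); once the population
    reaches `upper`, the control phase with rates (bc, dc) takes over
    and persists until the population falls below `lower`.
    Returns True if the population went extinct before t_max.
    """
    rng = random.Random(seed)
    n, t, controlled = n0, 0.0, False
    while n > 0 and t < t_max:
        if not controlled and n >= upper:
            controlled = True
        elif controlled and n < lower:
            controlled = False
        birth, death = (bc, dc) if controlled else (b, d)
        t += rng.expovariate(n * (birth + death))   # time to next event
        n += 1 if rng.random() < birth / (birth + death) else -1

    return n == 0

# Monte Carlo estimate of the extinction probability (hypothetical rates:
# supercritical when free, strongly subcritical when controlled).
trials = 400
p_ext = sum(simulate(1.0, 0.8, 0.2, 1.5, upper=50, lower=20, n0=10, seed=i)
            for i in range(trials)) / trials
```

The simulation makes the two-phase structure explicit: the `controlled` flag is exactly the "phase" variable that determines which set of birth and death rates governs the process.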
Abstract:
The first step in conservation planning is to identify objectives. Most stated objectives for conservation, such as to maximize biodiversity outcomes, are too vague to be useful within a decision-making framework. One way to clarify the issue is to define objectives in terms of the risk of extinction for multiple species. Although the assessment of extinction risk for single species is common, few researchers have formulated an objective function that combines the extinction risks of multiple species. We sought to translate the broad goal of maximizing the viability of species into explicit objectives for use in a decision-theoretic approach to conservation planning. We formulated several objective functions based on extinction risk across many species and illustrated the differences between these objectives with simple examples. Each objective function was the mathematical representation of an approach to conservation and emphasized different levels of threat. Our objectives included minimizing the joint probability of one or more extinctions, minimizing the expected number of extinctions, and minimizing the increase in risk of extinction from the best-case scenario. With objective functions based on joint probabilities of extinction across species, any correlations in extinction probabilities had to be known or the resultant decisions were potentially misleading. Additive objectives, such as the expected number of extinctions, did not produce the same anomalies. We demonstrated that the choice of objective function is central to the decision-making process because alternative objective functions can lead to a different ranking of management options. Therefore, decision makers need to think carefully in selecting and defining their conservation goals.
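Two of the objective functions named above can be written down directly, and a small hypothetical example (not taken from the paper) shows how they can rank the same management options differently. Under independence, the joint probability of at least one extinction is 1 − Π(1 − p_i), while the expected number of extinctions is simply Σ p_i:

```python
from math import prod

def p_any_extinction(ps):
    """P(at least one species goes extinct), assuming independent risks."""
    return 1.0 - prod(1.0 - p for p in ps)

def expected_extinctions(ps):
    """Expected number of extinctions: the sum of per-species risks."""
    return sum(ps)

# Hypothetical extinction risks for three species under two management
# options. Option A concentrates risk on one species; option B spreads it.
option_a = [0.50, 0.01, 0.01]
option_b = [0.20, 0.20, 0.20]

# The joint objective prefers B (0.488 < 0.510), but the additive
# objective prefers A (0.52 < 0.60): the ranking flips.
joint_a, joint_b = p_any_extinction(option_a), p_any_extinction(option_b)
exp_a, exp_b = expected_extinctions(option_a), expected_extinctions(option_b)
```

This is the anomaly the abstract points to: the joint objective rewards sacrificing one species to protect the rest, whereas the additive objective does not. It also shows why correlations matter for the joint objective — the product form above is only valid when risks are independent.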
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter.
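The trade-off described above can be sketched in a deliberately simplified form (this is not the paper's stochastic dynamic program): with a prior probability p0 that the species is still present and an imperfect per-survey detection probability d, declaring eradication after n consecutive absent surveys incurs the survey cost plus the expected damage from escape. All parameter values are hypothetical:

```python
def net_expected_cost(n, p0, d, c_survey, damage):
    """Cost of n annual surveys plus expected escape damage if
    eradication is declared after n consecutive absent surveys.

    p0       -- prior probability the species is still present
    d        -- per-survey detection probability when present
    c_survey -- cost of one survey
    damage   -- cost incurred if eradication is wrongly declared
    """
    p_escape = p0 * (1.0 - d) ** n   # present, yet missed n times in a row
    return n * c_survey + p_escape * damage

def optimal_stopping_time(p0, d, c_survey, damage, n_max=50):
    """Number of absent surveys minimizing the net expected cost."""
    return min(range(n_max + 1),
               key=lambda n: net_expected_cost(n, p0, d, c_survey, damage))

# Larger potential damage justifies surveying longer before declaring.
n_small = optimal_stopping_time(0.5, 0.5, c_survey=1.0, damage=100.0)
n_large = optimal_stopping_time(0.5, 0.5, c_survey=1.0, damage=10000.0)
```

Even this toy version reproduces the qualitative result: the optimal stopping time grows as the cost of premature declaration grows, and it is never worth surveying once the marginal survey cost exceeds the marginal reduction in expected damage.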
Abstract:
In this Letter we numerically investigate the fault-tolerant threshold for optical cluster-state quantum computing. We allow both photon loss noise and depolarizing noise (as a general proxy for all local noise), and obtain a threshold region of allowed pairs of values for the two types of noise. Roughly speaking, our results show that scalable optical quantum computing is possible for photon loss probabilities < 3×10⁻³, and for depolarization probabilities < 10⁻⁴.
Abstract:
Background: The structure of proteins may change as a result of the inherent flexibility of some protein regions. We develop and explore probabilistic machine learning methods for predicting a continuum secondary structure, i.e. assigning probabilities to the conformational states of a residue. We train our methods using data derived from high-quality NMR models. Results: Several probabilistic models not only successfully estimate the continuum secondary structure, but also provide a categorical output on par with models directly trained on categorical data. Importantly, models trained on the continuum secondary structure are also better than their categorical counterparts at identifying the conformational state for structurally ambivalent residues. Conclusion: Cascaded probabilistic neural networks trained on the continuum secondary structure exhibit better accuracy in structurally ambivalent regions of proteins, while sustaining an overall classification accuracy on par with standard, categorical prediction methods.
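The continuum representation itself is just a per-residue probability distribution over conformational states. A minimal sketch of deriving such a target from an ensemble of models, using hypothetical counts rather than the authors' NMR-derived data:

```python
def continuum_target(state_counts):
    """Per-residue probabilities over conformational states,
    estimated as relative frequencies across an ensemble of models."""
    total = sum(state_counts.values())
    return {state: count / total for state, count in state_counts.items()}

# Hypothetical residue observed across 20 NMR models:
# helix (H) in 12, strand (E) in 2, coil (C) in 6.
probs = continuum_target({"H": 12, "E": 2, "C": 6})

# A categorical assignment is just the argmax of the continuum output,
# which is how a model trained on probabilities can still be scored
# against categorical benchmarks.
category = max(probs, key=probs.get)
```

A structurally ambivalent residue is one whose distribution is close to uniform; the abstract's point is that models trained against such soft targets handle those residues better than models trained on hard labels.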