951 results for Probabilities


Relevance:

10.00%

Publisher:

Abstract:

The leatherback turtle Dermochelys coriacea is considered to be at serious risk of global extinction, despite ongoing conservation efforts. Intensive long-term monitoring of a leatherback nesting population on Sandy Point (St. Croix, US Virgin Islands) offers a unique opportunity to quantify basic population parameters and evaluate the effectiveness of nesting beach conservation practices. We report a significant increase in the number of females nesting annually, from ca. 18-30 in the 1980s to 186 in 2001, with a corresponding increase in annual hatchling production from ca. 2000 to over 49,000. We then analyzed resighting data from 1991 to 2001 with an open robust-design capture-mark-recapture model to estimate annual nester survival and adult abundance for this population. The expected annual survival probability was estimated at ca. 0.893 (95% CL 0.87-0.92), and the population was estimated to have been increasing at ca. 13% per annum since the early 1990s. Taken together with DNA fingerprinting that identifies mother-daughter relationships, our findings suggest that the increase in the size of the nesting population since 1991 was probably due to an aggressive program of beach protection and egg relocation initiated more than 20 years ago. Beach protection and egg relocation provide a simple and effective conservation strategy for this Northern Caribbean nesting population as long as adult survival at sea remains relatively high. (c) 2005 Elsevier Ltd. All rights reserved.
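As a rough back-of-envelope illustration (not from the paper; only the 186 count and the ca. 13% rate come from the abstract, the rest is arithmetic), compounding the reported growth rate over the 1991-2001 window shows what the 2001 count implies for the early-1990s cohort:

```python
# Back-of-envelope check (illustrative only): compound the reported ~13% per
# annum growth over the 1991-2001 window and invert it from the 2001 count.
years = 2001 - 1991
factor = 1.13 ** years
print(f"Ten years at 13% p.a. multiplies abundance by {factor:.2f}")  # ~3.39
print(f"Implied nesters in 1991: {186 / factor:.0f}")                 # ~55
```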

Relevance:

10.00%

Publisher:

Abstract:

Many long-lived marine species exhibit life history traits that make them more vulnerable to overexploitation. Accurate population trend analysis is essential for the development and assessment of management plans for these species. However, because many of these species disperse over large geographic areas, have life stages inaccessible to human surveyors, and/or undergo complex developmental migrations, data on trends in abundance are often available for only one stage of the population, usually breeding adults. The green turtle (Chelonia mydas) is one of these long-lived species for which population trends are based almost exclusively on either numbers of females that emerge to nest or numbers of nests deposited each year on geographically restricted beaches. In this study, we generated estimates of annual abundance for juvenile green turtles at two foraging grounds in the Bahamas based on long-term capture-mark-recapture (CMR) studies at Union Creek (24 years) and Conception Creek (13 years), using a two-stage approach. First, we estimated recapture probabilities from CMR data using Cormack-Jolly-Seber models in the software program MARK; second, we estimated annual abundance of green turtles at both study sites using the recapture probabilities in a Horvitz-Thompson type estimation procedure. Green turtle abundance did not change significantly in Conception Creek, but, in Union Creek, green turtle abundance had successive phases of significant increase, significant decrease, and stability. These changes in abundance resulted from changes in immigration, not survival or emigration. The trends in abundance on the foraging grounds did not conform to the significantly increasing trend for the major nesting population at Tortuguero, Costa Rica. This disparity highlights the challenges of assessing population-wide trends of green turtles and other long-lived species. The best approach for monitoring population trends may be a combination of (1) extensive surveys to provide data for large-scale trends in relative population abundance, and (2) intensive surveys, using CMR techniques, to estimate absolute abundance and evaluate the demographic processes driving the trends.
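The second stage is easy to sketch: a Horvitz-Thompson type estimator scales each year's catch by that year's estimated capture probability, so each captured animal stands in for 1/p animals. A minimal sketch with invented counts and probabilities (in the study these would come from the CJS models fitted in MARK):

```python
import numpy as np

# Horvitz-Thompson type abundance estimate: N_hat = n / p, where n is the
# number caught and p the estimated capture probability for that occasion.
# All numbers below are invented for illustration.
captures  = np.array([112, 98, 130, 121])       # turtles caught per year
p_capture = np.array([0.42, 0.38, 0.45, 0.40])  # estimated capture probability

for year, n_hat in zip(range(1998, 2002), captures / p_capture):
    print(f"{year}: estimated abundance ~ {n_hat:.0f}")
```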

Relevance:

10.00%

Publisher:

Abstract:

Multiplication and comultiplication of beliefs represent a generalisation of multiplication and comultiplication of probabilities as well as of binary logic AND and OR. Our approach follows that of subjective logic, where belief functions are expressed as opinions that are interpreted as being equivalent to beta probability distributions. We compare different types of opinion product and coproduct, and show that they represent very good approximations of the analytical product and coproduct of beta probability distributions. We also define division and codivision of opinions, and compare our framework with other logic frameworks for combining uncertain propositions. (C) 2004 Elsevier Inc. All rights reserved.
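The underlying approximation problem can be illustrated generically: the product of two independent beta-distributed probabilities is not itself beta-distributed, but its first two moments are available in closed form and can be checked against simulation. A sketch with invented parameters (this shows the beta-product problem, not the paper's opinion operators):

```python
import numpy as np

rng = np.random.default_rng(0)

# Product of two independent Beta-distributed probabilities (parameters are
# invented). For independent X, Y: E[XY] = E[X]E[Y], E[(XY)^2] = E[X^2]E[Y^2].
a1, b1 = 8.0, 2.0   # Beta(8, 2): a fairly confident opinion
a2, b2 = 3.0, 3.0   # Beta(3, 3): a maximally uncertain opinion

samples = rng.beta(a1, b1, 100_000) * rng.beta(a2, b2, 100_000)

mean = (a1 / (a1 + b1)) * (a2 / (a2 + b2))
ex2 = a1 * (a1 + 1) / ((a1 + b1) * (a1 + b1 + 1))
ey2 = a2 * (a2 + 1) / ((a2 + b2) * (a2 + b2 + 1))
var = ex2 * ey2 - mean**2

print(f"analytic:  mean={mean:.4f} var={var:.4f}")
print(f"simulated: mean={samples.mean():.4f} var={samples.var():.4f}")
```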

Relevance:

10.00%

Publisher:

Abstract:

The argument from fine-tuning is supposed to establish the existence of God from the fact that the evolution of carbon-based life requires the laws of physics and the boundary conditions of the universe to be more or less as they are. We demonstrate that this argument fails. In particular, we focus on problems associated with the role probabilities play in the argument. We show that, even granting the fine-tuning of the universe, it does not follow that the universe is improbable, and thus no explanation of the fine-tuning, theistic or otherwise, is required.

Relevance:

10.00%

Publisher:

Abstract:

Networked information and communication technologies are rapidly advancing the capacities of governments to target and separately manage specific sub-populations, groups and individuals. Targeting uses data profiling to calculate the differential probabilities of outcomes associated with various personal characteristics. This knowledge is used to classify and sort people for differentiated levels of treatment. Targeting is often used to direct government resources efficiently and effectively to the most disadvantaged. Although it has many benefits, targeting raises several policy and ethical issues. This paper discusses these issues and the policy responses governments may take to maximise the benefits of targeting while ameliorating the negative aspects.

Relevance:

10.00%

Publisher:

Abstract:

This paper considers the problem of inducing low-risk individuals of all ages to buy private health insurance in Australia. Our proposed subsidy scheme improves upon the age-based penalties of the current "Australian Lifetime Cover" (LTC) scheme. We generate an alternative subsidy profile that obviates adverse selection in private health insurance markets with mandated, age-based community rating. Our proposal is novel in that we generate subsidies that are both risk- and age-specific, based upon actual risk probabilities. The approach we take may prove useful in other jurisdictions where the extant law mandates community rating in private health insurance markets. Furthermore, our approach is useful in jurisdictions that seek to maintain private insurance to complement existing universal public systems.
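The adverse-selection logic behind such a subsidy can be shown with a toy calculation (all numbers invented; the paper derives subsidies from actual risk probabilities): under community rating everyone pays the pooled expected cost, so a risk- and age-specific subsidy is what brings low-risk individuals back to an actuarially fair price.

```python
# Toy illustration of risk- and age-specific subsidies under community rating
# (all numbers invented). Each group's actuarially fair premium is its claim
# probability times expected claim cost; the subsidy closes the gap between
# the pooled community premium and that fair price.
risk_prob = {"young, low-risk": 0.02, "young, high-risk": 0.10,
             "older, low-risk": 0.06, "older, high-risk": 0.25}
claim_cost = 20_000  # expected cost of a claim

community_premium = claim_cost * sum(risk_prob.values()) / len(risk_prob)
for group, p in risk_prob.items():
    fair = p * claim_cost
    subsidy = max(0.0, community_premium - fair)
    print(f"{group}: fair premium {fair:,.0f}, subsidy needed {subsidy:,.0f}")
```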

Relevance:

10.00%

Publisher:

Abstract:

Many populations have a negative impact on their habitat or upon other species in the environment if their numbers become too large. For this reason they are often subjected to some form of control. One common control regime is the reduction regime: when the population reaches a certain threshold it is controlled (for example, culled) until it falls below a lower predefined level. The natural model for such a controlled population is a birth-death process with two phases, where the current phase determines which of two distinct sets of birth and death rates governs the process. We present formulae for the probability of extinction and the expected time to extinction, and discuss several applications. (c) 2006 Elsevier Inc. All rights reserved.
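A Monte Carlo sketch makes the two-phase structure concrete (rates, thresholds and the simulation itself are invented for illustration; the paper derives exact formulae rather than simulating):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gillespie-style simulation of a two-phase birth-death process: normal rates
# apply until the population hits UPPER, then culling rates apply until it
# falls below LOWER. All rates and thresholds are invented for illustration.
lam, mu = 0.5, 0.4      # per-capita birth/death rates, normal phase
lam_c, mu_c = 0.5, 1.2  # culling phase: death rate boosted
LOWER, UPPER = 20, 100

def time_to_extinction(n0=50, t_max=500.0):
    n, t, culling = n0, 0.0, False
    while n > 0 and t < t_max:
        culling = True if n >= UPPER else (False if n < LOWER else culling)
        b, d = (lam_c, mu_c) if culling else (lam, mu)
        t += rng.exponential(1.0 / (n * (b + d)))     # time to next event
        n += 1 if rng.random() < b / (b + d) else -1  # birth or death
    return t if n == 0 else np.inf

times = np.array([time_to_extinction() for _ in range(200)])
extinct = np.isfinite(times)
print(f"extinct before t_max in {extinct.mean():.0%} of runs")
if extinct.any():
    print(f"mean time to extinction (extinct runs): {times[extinct].mean():.1f}")
```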

Relevance:

10.00%

Publisher:

Abstract:

The first step in conservation planning is to identify objectives. Most stated objectives for conservation, such as to maximize biodiversity outcomes, are too vague to be useful within a decision-making framework. One way to clarify the issue is to define objectives in terms of the risk of extinction for multiple species. Although the assessment of extinction risk for single species is common, few researchers have formulated an objective function that combines the extinction risks of multiple species. We sought to translate the broad goal of maximizing the viability of species into explicit objectives for use in a decision-theoretic approach to conservation planning. We formulated several objective functions based on extinction risk across many species and illustrated the differences between these objectives with simple examples. Each objective function was the mathematical representation of an approach to conservation and emphasized different levels of threat. Our objectives included minimizing the joint probability of one or more extinctions, minimizing the expected number of extinctions, and minimizing the increase in risk of extinction from the best-case scenario. With objective functions based on joint probabilities of extinction across species, any correlations in extinction probabilities had to be known or the resultant decisions were potentially misleading. Additive objectives, such as the expected number of extinctions, did not produce the same anomalies. We demonstrated that the choice of objective function is central to the decision-making process because alternative objective functions can lead to a different ranking of management options. Therefore, decision makers need to think carefully in selecting and defining their conservation goals.
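Two of these objectives are easy to write down explicitly. With per-species extinction risks p_i and independent extinction events, the expected number of extinctions is the sum of the p_i, while the probability of one or more extinctions is 1 minus the product of the (1 - p_i). A minimal sketch with invented risks:

```python
import numpy as np

# Two objective functions over per-species extinction risks (invented values),
# assuming independent extinction events.
p_ext = np.array([0.05, 0.20, 0.50])

expected_extinctions = p_ext.sum()       # additive objective
p_at_least_one = 1 - np.prod(1 - p_ext)  # joint objective

print(f"expected number of extinctions: {expected_extinctions:.3f}")  # 0.750
print(f"P(one or more extinctions):     {p_at_least_one:.3f}")        # 0.620
# Positive correlation between extinctions leaves the expected number
# unchanged but lowers P(one or more), which is why the joint objective
# needs the correlations and the additive objective does not.
```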

Relevance:

10.00%

Publisher:

Abstract:

The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter.
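The trade-off can be sketched as a Bayesian stopping rule (a deliberate simplification with invented numbers, not the paper's rule of thumb): keep surveying while the expected cost of declaring eradication too soon exceeds the cost of one more survey, updating the belief that the species persists after each survey that finds nothing.

```python
# Simplified stopping sketch (invented numbers): survey while the expected
# escape cost exceeds the cost of one more survey. Each absent survey shrinks
# the belief that the species persists via Bayes' rule.
p_detect = 0.6           # P(survey detects the weed | present)
p_present = 0.5          # prior belief the weed persists
survey_cost = 10_000     # cost of one annual survey
escape_cost = 1_000_000  # damage if eradication is declared too soon

years = 0
while p_present * escape_cost > survey_cost:
    years += 1
    # Posterior after one more survey with no detection.
    p_present = p_present * (1 - p_detect) / (
        p_present * (1 - p_detect) + (1 - p_present))

print(f"declare eradication after {years} absent surveys "
      f"(residual P(present) = {p_present:.4f})")  # 6 surveys here
```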

Relevance:

10.00%

Publisher:

Abstract:

In this Letter we numerically investigate the fault-tolerant threshold for optical cluster-state quantum computing. We allow both photon loss noise and depolarizing noise (as a general proxy for all local noise), and obtain a threshold region of allowed pairs of values for the two types of noise. Roughly speaking, our results show that scalable optical quantum computing is possible for photon loss probabilities < 3×10^-3 and for depolarization probabilities < 10^-4.

Relevance:

10.00%

Publisher:

Abstract:

Background: The structure of proteins may change as a result of the inherent flexibility of some protein regions. We develop and explore probabilistic machine learning methods for predicting a continuum secondary structure, i.e. assigning probabilities to the conformational states of a residue. We train our methods using data derived from high-quality NMR models. Results: Several probabilistic models not only successfully estimate the continuum secondary structure, but also provide a categorical output on par with models directly trained on categorical data. Importantly, models trained on the continuum secondary structure are also better than their categorical counterparts at identifying the conformational state for structurally ambivalent residues. Conclusion: Cascaded probabilistic neural networks trained on the continuum secondary structure exhibit better accuracy in structurally ambivalent regions of proteins, while sustaining an overall classification accuracy on par with standard, categorical prediction methods.
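The distinction between continuum and categorical output can be shown in miniature (invented numbers; a real predictor emits one such distribution per residue): a structurally ambivalent residue gets a flat distribution over states, even though its categorical label looks just as definite as a confident one.

```python
import numpy as np

# Continuum vs. categorical secondary-structure output (invented numbers).
# Each row is one residue's probabilities over three conformational states.
states = ["helix", "strand", "coil"]
probs = np.array([[0.91, 0.03, 0.06],   # confidently helical residue
                  [0.45, 0.15, 0.40]])  # structurally ambivalent residue

for row in probs:
    dist = "  ".join(f"{s}={p:.2f}" for s, p in zip(states, row))
    print(f"categorical: {states[row.argmax()]:6s}  continuum: {dist}")
```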

Relevance:

10.00%

Publisher:

Abstract:

Markov chain Monte Carlo (MCMC) is a methodology that is gaining widespread use in the phylogenetics community and is central to phylogenetic software packages such as MrBayes. An important issue for users of MCMC methods is how to select appropriate values for adjustable parameters such as the length of the Markov chain or chains, the sampling density, the proposal mechanism, and, if Metropolis-coupled MCMC is being used, the number of heated chains and their temperatures. Although some parameter settings have been examined in detail in the literature, others are frequently chosen with more regard to computational time or personal experience with other data sets. Such choices may lead to inadequate sampling of tree space or an inefficient use of computational resources. We performed a detailed study of convergence and mixing for 70 randomly selected, putatively orthologous protein sets with different sizes and taxonomic compositions. Replicated runs from multiple random starting points permit a more rigorous assessment of convergence, and we developed two novel statistics, delta and epsilon, for this purpose. Although likelihood values invariably stabilized quickly, adequate sampling of the posterior distribution of tree topologies took considerably longer. Our results suggest that multimodality is common for data sets with 30 or more taxa and that this results in slow convergence and mixing. However, we also found that the pragmatic approach of combining data from several short, replicated runs into a metachain to estimate bipartition posterior probabilities provided good approximations, and that such estimates were no worse in approximating a reference posterior distribution than those obtained using a single long run of the same length as the metachain. Precision appears to be best when heated Markov chains have low temperatures, whereas chains with high temperatures appear to sample trees with high posterior probabilities only rarely. [Bayesian phylogenetic inference; heating parameter; Markov chain Monte Carlo; replicated chains.]
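The metachain idea is simple to sketch: pool the tree samples from several short replicated runs and read off posterior probabilities as pooled sample frequencies. The toy trees below are placeholder strings; in practice one counts bipartitions across sampled trees rather than whole topologies.

```python
from collections import Counter

# Pool tree samples from replicated MCMC runs (a "metachain") and estimate
# posterior probabilities as pooled frequencies. Trees are placeholder
# newick strings invented for illustration.
runs = [
    ["((A,B),(C,D));", "((A,B),(C,D));", "((A,C),(B,D));"],
    ["((A,B),(C,D));", "((A,C),(B,D));", "((A,B),(C,D));"],
    ["((A,B),(C,D));", "((A,B),(C,D));", "((A,D),(B,C));"],
]

pooled = [tree for run in runs for tree in run]
for topology, count in Counter(pooled).most_common():
    print(f"{topology}  posterior ~ {count / len(pooled):.2f}")
```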

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we consider dynamic programming for election timing in a majoritarian parliamentary system such as Australia's, where the government has a constitutional right to call an early election. This right can give the government an advantage: it can remain in power for as long as possible by calling an election when its popularity is high. On the other hand, the opposition's natural objective is to gain power, and it will apply controls termed "boosts" to reduce the chance of the government being re-elected by introducing policy and economic responses. In this paper, we explore equilibrium solutions for the government and opposition strategies in a political game using stochastic dynamic programming. Results are given in terms of the expected remaining life in power and the call and boost probabilities at each time and at any level of popularity.
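A heavily simplified sketch of the government's side of such a recursion (no opposition boosts, all numbers invented): value-iterate V[t, s], the expected remaining periods in power with t periods left in the term and popularity state s, where calling an election trades the rest of the current term for a chance at a fresh one.

```python
import numpy as np

# Simplified election-timing DP (invented numbers, no opposition "boosts").
# V[t, s] = expected remaining periods in power with t periods left in the
# term and popularity state s. Calling an election wins a fresh term with
# probability win[s]; waiting banks one period and lets popularity drift.
N = 4                                 # term length in periods
win = np.array([0.2, 0.4, 0.6, 0.8])  # P(win election) by popularity state
S = len(win)

P = np.zeros((S, S))                  # lazy random walk for popularity
for s in range(S):
    P[s, s] += 0.5
    P[s, max(s - 1, 0)] += 0.25
    P[s, min(s + 1, S - 1)] += 0.25

V = np.zeros((N + 1, S))
for _ in range(500):                  # iterate the recursion to a fixed point
    newV = np.zeros_like(V)
    call = win * V[N]                 # call now: fresh term if you win
    newV[0] = call                    # t = 0: the election is forced
    for t in range(1, N + 1):
        newV[t] = np.maximum(call, 1.0 + P @ V[t - 1])
    V = newV

call = win * V[N]
for t in range(1, N + 1):
    wait = 1.0 + P @ V[t - 1]
    print(f"t={t}:", ["call" if c > w else "wait" for c, w in zip(call, wait)])
print("expected remaining life at term start:", np.round(V[N], 2))
```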

Relevance:

10.00%

Publisher:

Abstract:

Data on the occurrence of species are widely used to inform the design of reserve networks. These data contain commission errors (when a species is mistakenly thought to be present) and omission errors (when a species is mistakenly thought to be absent), and the rates of the two types of error are inversely related. Point locality data can minimize commission errors, but those obtained from museum collections are generally sparse, suffer from substantial spatial bias and contain large omission errors. Geographic ranges generate large commission errors because they assume homogeneous species distributions. Predicted distribution data make explicit inferences on species occurrence, and their commission and omission errors depend on model structure, on the omission of variables that determine species distribution and on data resolution. Omission errors lead to identifying networks of areas for conservation action that are smaller than required and centred on known species occurrences, thus affecting the comprehensiveness, representativeness and efficiency of selected areas. Commission errors lead to selecting areas not relevant to conservation, thus affecting the representativeness and adequacy of reserve networks. Conservation plans should include an estimation of commission and omission errors in underlying species data and explicitly use this information to influence conservation planning outcomes.

Relevance:

10.00%

Publisher:

Abstract:

This article applies methods of latent class analysis (LCA) to data on lifetime illicit drug use in order to determine whether qualitatively distinct classes of illicit drug users can be identified. Self-report data on lifetime illicit drug use (cannabis, stimulants, hallucinogens, sedatives, inhalants, cocaine, opioids and solvents) collected from a sample of 6265 Australian twins (average age 30 years) were analyzed using LCA. Rates of childhood sexual and physical abuse, lifetime alcohol and tobacco dependence, symptoms of illicit drug abuse/dependence and psychiatric comorbidity were compared across classes using multinomial logistic regression. LCA identified a 5-class model: Class 1 (68.5%) had low risks of the use of all drugs except cannabis; Class 2 (17.8%) had moderate risks of the use of all drugs; Class 3 (6.6%) had high rates of cocaine, other stimulant and hallucinogen use but lower risks for the use of sedatives or opioids. Conversely, Class 4 (3.0%) had relatively low risks of cocaine, other stimulant or hallucinogen use but high rates of sedative and opioid use. Finally, Class 5 (4.2%) had uniformly high probabilities for the use of all drugs. Rates of psychiatric comorbidity were highest in the polydrug class, although the sedative/opioid class had elevated rates of depression/suicidal behaviors and exposure to childhood abuse. Aggregation of population-level data may obscure important subgroup differences in patterns of illicit drug use and psychiatric comorbidity. Further exploration of a 'self-medicating' subgroup is needed.
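For readers unfamiliar with LCA, the core machinery fits in a few lines: with binary "ever used drug X" indicators, an EM loop alternates class responsibilities and parameter updates. A minimal sketch on synthetic data (two classes for brevity rather than the article's five; the class structure and probabilities below are invented, and real analyses use dedicated LCA software):

```python
import numpy as np

rng = np.random.default_rng(2)

# Bare-bones EM for latent class analysis with binary "ever used drug X"
# indicators. Data are synthetic; all probabilities below are invented.
n, n_items, K = 2000, 5, 2

# Simulate a light-use class and a polydrug class.
true_theta = np.array([[0.60, 0.10, 0.05, 0.05, 0.02],
                       [0.90, 0.80, 0.70, 0.60, 0.50]])
z = (rng.random(n) < 0.2).astype(int)          # 20% polydrug membership
X = (rng.random((n, n_items)) < true_theta[z]).astype(float)

pi = np.full(K, 1.0 / K)                       # class weights
theta = rng.uniform(0.3, 0.7, (K, n_items))    # P(item = 1 | class)
for _ in range(200):
    # E-step: responsibilities P(class | responses), computed in log space.
    log_lik = (X @ np.log(theta.T) + (1 - X) @ np.log(1 - theta.T)
               + np.log(pi))
    resp = np.exp(log_lik - log_lik.max(axis=1, keepdims=True))
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: reweighted class sizes and endorsement probabilities.
    pi = resp.mean(axis=0)
    theta = ((resp.T @ X) / resp.sum(axis=0)[:, None]).clip(1e-6, 1 - 1e-6)

print("estimated class sizes:", np.round(pi, 2))
print("estimated endorsement probabilities:\n", np.round(theta, 2))
```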