75 results for Prior Probability
Abstract:
There has been recent interest in the use of X-chromosomal loci for forensic and relatedness testing casework, with many authors developing new X-linked short tandem repeat (STR) loci suitable for forensic use. Here we present formulae for two key quantities in paternity testing, the average probability of exclusion and the paternity index, which are suitable for X-chromosomal loci in the presence of population substructure.
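The abstract's contribution is formulae that correct these quantities for population substructure; as a baseline sketch only, here is the textbook no-substructure paternity index for the simplest X-linked scenario (female child, alleged father carrying the obligate paternal allele). The function name and scenario are illustrative, not the paper's notation:

```python
def x_linked_pi(p: float) -> float:
    """Toy paternity index (PI) for an X-linked locus and a female child,
    ignoring population substructure.

    A daughter inherits her father's single X allele, so a matching
    alleged father transmits the obligate paternal allele with
    probability 1, while a random man carries it with probability p,
    its population frequency. Hence PI = 1 / p.
    """
    if not 0.0 < p <= 1.0:
        raise ValueError("allele frequency must lie in (0, 1]")
    return 1.0 / p

# Example: an obligate paternal allele with frequency 0.05 gives PI = 20.
```

The substructure-corrected versions in the paper replace the plain frequency p with a match probability that depends on a coancestry coefficient.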
Abstract:
Periods between predator detection and an escape response (escape delays) by prey upon attack by a predator often arise because animals trade off the benefits such a delay gives for assessing risk accurately against the costs of not escaping as quickly as possible. We tested whether freezing behaviour (complete immobility in a previously foraging bird) observed in chaffinches before escaping from an approaching potential threat functions as a period of risk assessment, and whether information on predator identity is gained even when the time available is very short. We flew either a model of a sparrowhawk (predator) or a woodpigeon (no threat) at single chaffinches. Escape delays were significantly shorter with the hawk, except when a model first appeared close to the chaffinch. Chaffinches were significantly more vigilant when they resumed feeding after exposure to the sparrowhawk than after the woodpigeon, showing that they were able to distinguish between threats, and this applied even when the time available for assessment was short (an average of 0.29 s). Our results show that freezing in chaffinches functions as an effective, economic risk-assessment period, and that threat information is gained even when only very short periods of time are available during an attack.
Abstract:
The aim of a phase II clinical trial is to decide whether or not to develop an experimental therapy further through phase III clinical evaluation. In this paper, we present a Bayesian approach to the phase II trial, although we assume that subsequent phase III clinical trials will have standard frequentist analyses. The decision whether to conduct the phase III trial is based on the posterior predictive probability of a significant result being obtained. This fusion of Bayesian and frequentist techniques accepts the current paradigm for expressing objective evidence of therapeutic value, while optimizing the form of the phase II investigation that leads to it. By using prior information, we can assess whether a phase II study is needed at all, and how much or what sort of evidence is required. The proposed approach is illustrated by the design of a phase II clinical trial of a multi-drug resistance modulator used in combination with standard chemotherapy in the treatment of metastatic breast cancer.
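The key quantity described above, the posterior predictive probability of a significant phase III result, has a closed form in the simplest conjugate-normal setting. This is a minimal sketch under assumed normality (normal posterior for the treatment effect, normally distributed phase III estimate, one-sided test), not the paper's design machinery; all parameter values are illustrative:

```python
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def predictive_prob_success(post_mean: float, post_var: float,
                            se_phase3: float, z_alpha: float = 1.96) -> float:
    """Posterior predictive probability that a frequentist phase III
    test is significant.

    The phase III estimate is significant when it exceeds
    z_alpha * se_phase3; marginally it is normal with mean post_mean
    and variance post_var + se_phase3**2 (posterior uncertainty plus
    sampling error), so the probability follows directly.
    """
    threshold = z_alpha * se_phase3
    return 1.0 - norm_cdf((threshold - post_mean)
                          / sqrt(post_var + se_phase3 ** 2))

# A promising posterior (mean 0.5, variance 0.04) against a phase III
# standard error of 0.15 gives a predictive success probability ~0.8.
```

If this probability is low even under an optimistic prior, the sketch illustrates how one might conclude that the phase III trial (or even the phase II study) is not worth running.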
Abstract:
The jackknife method is often used for variance estimation in sample surveys but has only been developed for a limited class of sampling designs. We propose a jackknife variance estimator which is defined for any without-replacement unequal probability sampling design. We demonstrate design consistency of this estimator for a broad class of point estimators. A Monte Carlo study shows how the proposed estimator may improve on existing estimators.
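To make the idea concrete, here is a plain delete-one jackknife applied to the Hájek (weighted) mean under unequal inclusion probabilities. This is a simplified sketch of the generic jackknife, not the paper's design-consistent estimator, which adds further design-based corrections:

```python
def hajek_mean(y, pi):
    """Hajek estimator of the population mean: inverse-probability
    weighted average, weights w_i = 1 / pi_i."""
    w = [1.0 / p for p in pi]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

def jackknife_variance(y, pi):
    """Delete-one jackknife variance of the Hajek mean."""
    n = len(y)
    theta = [hajek_mean(y[:i] + y[i + 1:], pi[:i] + pi[i + 1:])
             for i in range(n)]
    theta_bar = sum(theta) / n
    return (n - 1) / n * sum((t - theta_bar) ** 2 for t in theta)
```

With equal inclusion probabilities this reduces to the classical jackknife for the sample mean, i.e. s²/n.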
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and long-distance natural dispersal. A series of simulation experiments were run with the model, representing an experiment varying the epidemic pressure and linkage between natural vegetation and horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, to give a 2 x 2 x 2 x 2 factorial started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection. 
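The qualitative effect of inspections-with-eradication can be illustrated with a toy branching-process simulation. This is a deliberate caricature, with no landscape, trade network, or dispersal kernel, and all parameter values are illustrative rather than taken from the paper:

```python
import random

def simulate_epidemic(r=1.5, steps=10, inspect_every=None,
                      detect_prob=0.8, seed=0):
    """Toy stochastic spread: each infected plant makes 3 contacts per
    step, infecting each with probability r/3; optional periodic
    inspections remove each infected plant with probability detect_prob.
    Returns the cumulative number of plants ever infected."""
    rng = random.Random(seed)
    infected, total = 1, 1
    for t in range(1, steps + 1):
        new = sum(1 for _ in range(3 * infected) if rng.random() < r / 3)
        infected += new
        total += new
        if inspect_every and t % inspect_every == 0:
            infected = sum(1 for _ in range(infected)
                           if rng.random() >= detect_prob)
    return total

def mean_epidemic_size(reps=200, **kwargs):
    """Average final size over replicate simulations (stochastic effects
    make individual runs vary widely, as in the abstract)."""
    return sum(simulate_epidemic(seed=s, **kwargs) for s in range(reps)) / reps
```

Even this toy version reproduces the headline pattern: periodic 80%-effective inspections sharply reduce mean epidemic size, while individual replicates still vary dramatically.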
Abstract:
In this study, complementary species-level and intraspecific phylogenies were used to better circumscribe the original native range and history of translocation of the invasive tree Parkinsonia aculeata. Species-level phylogenies were reconstructed using three chloroplast gene regions, and amplified fragment length polymorphism (AFLP) markers were used to reconstruct the intraspecific phylogeny. Together, these phylogenies revealed the timescale of transcontinental lineage divergence and the likely source of recent introductions of the invasive. The sequence data showed that divergence between North American and Argentinean P. aculeata occurred at least 5.7 million years ago, refuting previous hypotheses of recent dispersal between North and South America. AFLP phylogenies revealed the most likely sources of naturalized populations. The AFLP data also identified putatively introgressed plants, underlining the importance of wide sampling of AFLPs and of comparison with uniparentally inherited marker data when investigating hybridizing groups. Although P. aculeata has generally been considered North American, these data show that the original native range of P. aculeata included South America; recent introductions to Africa and Australia are most likely to have occurred from South American populations.
Abstract:
To explore the projection efficiency of a design, Tsai et al. [2000. Projective three-level main effects designs robust to model uncertainty. Biometrika 87, 467-475] introduced the Q criterion to compare three-level main-effects designs for quantitative factors that allow the consideration of interactions in addition to main effects. In this paper, we extend their method and focus on the case in which experimenters have some prior knowledge, in advance of running the experiment, about the probabilities of effects being non-negligible. A criterion which incorporates experimenters' prior beliefs about the importance of each effect is introduced to compare orthogonal, or nearly orthogonal, main-effects designs with robustness to interactions as a secondary consideration. We show that this criterion, exploiting prior information about model uncertainty, can lead to more appropriate designs reflecting experimenters' prior beliefs.
Abstract:
Imputation is commonly used to compensate for item non-response in sample surveys. If we treat the imputed values as if they are true values, and then compute the variance estimates by using standard methods, such as the jackknife, we can seriously underestimate the true variances. We propose a modified jackknife variance estimator which is defined for any without-replacement unequal probability sampling design in the presence of imputation and non-negligible sampling fraction. Mean, ratio and random-imputation methods will be considered. The practical advantage of the method proposed is its breadth of applicability.
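The underestimation problem the abstract addresses is easy to demonstrate. The sketch below shows that mean imputation followed by a naive variance formula (treating imputed values as real observations) yields a smaller variance estimate for the mean than using the respondents alone; the data and function names are illustrative:

```python
def mean_impute(y):
    """Replace missing values (None) with the respondent mean."""
    obs = [v for v in y if v is not None]
    m = sum(obs) / len(obs)
    return [m if v is None else v for v in y]

def naive_var_of_mean(y):
    """Naive variance estimate of the sample mean: s^2 / n,
    treating every value as a genuine observation."""
    n = len(y)
    m = sum(y) / n
    return sum((v - m) ** 2 for v in y) / (n - 1) / n

# Imputed values sit exactly at the mean, contributing zero deviation
# while inflating n, so the naive estimate shrinks spuriously.
```

Modified jackknife estimators of the kind proposed in the paper correct for this by re-imputing within each jackknife replicate rather than treating imputed values as fixed.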
Abstract:
Individual identification via DNA profiling is important in molecular ecology, particularly in the case of noninvasive sampling. A key quantity in determining the number of loci required is the probability of identity (PI_ave), the probability of observing two copies of any profile in the population. Previously this has been calculated assuming no inbreeding or population structure. Here we introduce formulae that account for these factors, whilst also accounting for relatedness structure in the population. These formulae are implemented in API-CALC 1.0, which calculates PI_ave for either a specified value, or a range of values, of F_IS and F_ST.
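For reference, the baseline quantity that these formulae extend (the probability of identity under random mating, with no inbreeding or structure) is standard: at one locus, PI = Σ p_i⁴ + Σ_{i<j} (2 p_i p_j)², which simplifies to 2(Σ p_i²)² − Σ p_i⁴, and independent loci multiply. A hedged sketch of this baseline only, not the paper's F_IS/F_ST-corrected formulae:

```python
def prob_identity_locus(freqs):
    """Single-locus probability of identity under random mating:
    PI = 2 * (sum p_i^2)^2 - sum p_i^4."""
    s2 = sum(p * p for p in freqs)
    s4 = sum(p ** 4 for p in freqs)
    return 2.0 * s2 * s2 - s4

def prob_identity_multilocus(locus_freqs):
    """Multiply across independent loci; more loci drive PI toward
    zero, which is how one chooses the number of loci needed."""
    pi_total = 1.0
    for freqs in locus_freqs:
        pi_total *= prob_identity_locus(freqs)
    return pi_total
```

With two equifrequent alleles the single-locus PI is 0.375, which is why many loci are needed before two individuals are unlikely to share a profile.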
Abstract:
Most factorial experiments in industrial research form one stage in a sequence of experiments and so considerable prior knowledge is often available from earlier stages. A Bayesian A-optimality criterion is proposed for choosing designs, when each stage in experimentation consists of a small number of runs and the objective is to optimise a response. Simple formulae for the weights are developed, some examples of the use of the design criterion are given and general recommendations are made.
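In the standard formulation, Bayesian A-optimality minimises the trace of the posterior covariance matrix (X′X + R)⁻¹, where R encodes the prior precision (the weights) carried over from earlier stages. The sketch below shows that generic criterion with unit error variance; it is an illustration of the idea, not the paper's specific weight formulae:

```python
import numpy as np

def bayesian_a_value(X, prior_precision):
    """Bayesian A-optimality value for a design matrix X:
    trace((X'X + R)^{-1}), with R = diag(prior_precision).
    Smaller values indicate a better design; strong prior
    information (large precision) reduces the need for runs."""
    R = np.diag(np.asarray(prior_precision, dtype=float))
    X = np.asarray(X, dtype=float)
    return float(np.trace(np.linalg.inv(X.T @ X + R)))
```

Comparing candidate small-run designs by this value lets prior knowledge from earlier stages shrink the follow-up experiment.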
Abstract:
Fast interceptive actions, such as catching a ball, rely upon accurate and precise information from vision. Recent models rely on flexible combinations of visual angle and its rate of expansion, of which the tau parameter is a specific case. When an object approaches an observer, however, its trajectory may introduce bias into tau-like parameters that renders these computations unacceptable as the sole source of information for actions. Here we show that observers' knowledge of object size influences their action timing, and that known size combined with image expansion simplifies the computations required to make interceptive actions and provides a route for experience to influence interceptive action.
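The two optical quantities involved have simple small-angle forms: tau estimates time to contact from the visual angle θ and its expansion rate alone (τ = θ / θ′), while known physical size s converts the same angle into distance (d ≈ s / θ). A minimal sketch of these textbook relations, not the paper's model:

```python
def tau(theta: float, theta_dot: float) -> float:
    """Time to contact from optical variables only:
    tau = theta / theta' (small-angle approximation,
    constant approach speed)."""
    return theta / theta_dot

def distance_from_known_size(size: float, theta: float) -> float:
    """With known physical size, the same visual angle also yields
    distance: d ~= size / theta for small angles. Combining this with
    image expansion is the simplification the abstract describes."""
    return size / theta

# A 0.07 m ball subtending 0.01 rad is ~7 m away; if the angle grows
# at 0.005 rad/s, tau puts contact ~2 s away.
```

Consistency check: for an object of size s at distance d approaching at speed v, θ ≈ s/d and θ′ ≈ sv/d², so τ = θ/θ′ = d/v, the true time to contact.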
Abstract:
Using the classical Parzen window estimate as the target function, kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
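The two ingredients the abstract builds on are standard and easy to sketch: the full-sample Parzen window (kernel) density estimate, and a leave-one-out score for choosing among models. The sketch below uses a leave-one-out log-likelihood as a simple stand-in for the paper's LOO test-error criterion; it does not implement the orthogonal forward regression or sparsification steps:

```python
from math import exp, log, pi, sqrt

def gaussian_kernel(u: float) -> float:
    """Standard Gaussian kernel."""
    return exp(-0.5 * u * u) / sqrt(2.0 * pi)

def parzen_density(x: float, data, h: float) -> float:
    """Classical Parzen window estimate with bandwidth h:
    average of kernels centred on every data point."""
    return sum(gaussian_kernel((x - xi) / h) for xi in data) / (len(data) * h)

def loo_log_score(data, h: float) -> float:
    """Leave-one-out log-likelihood: evaluate each point under the
    density built from the remaining points. A simple stand-in for
    the LOO criterion used to guide model selection."""
    total = 0.0
    for i, xi in enumerate(data):
        rest = data[:i] + data[i + 1:]
        total += log(parzen_density(xi, rest, h))
    return total
```

The sparse method in the abstract keeps only a few of these kernels (with re-estimated nonnegative weights) while aiming to match the accuracy of the full-sample estimate above.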
Abstract:
A basic principle in data modelling is to incorporate available a priori information regarding the underlying data generating mechanism into the modelling process. We adopt this principle and consider grey-box radial basis function (RBF) modelling capable of incorporating prior knowledge. Specifically, we show how to explicitly incorporate the two types of prior knowledge: the underlying data generating mechanism exhibits known symmetric property and the underlying process obeys a set of given boundary value constraints. The class of orthogonal least squares regression algorithms can readily be applied to construct parsimonious grey-box RBF models with enhanced generalisation capability.
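The first type of prior knowledge mentioned, a known symmetry in the data-generating mechanism, can be built directly into the basis functions. The sketch below hard-wires even symmetry (f(x) = f(−x)) into a one-dimensional Gaussian RBF model by pairing each centre c with its mirror image −c; it illustrates the grey-box idea only, not the paper's orthogonal least squares construction or boundary-value constraints:

```python
from math import exp

def symmetric_rbf(x: float, centres, weights, width: float) -> float:
    """Grey-box RBF model with built-in even symmetry: each weight
    multiplies the symmetrised pair phi(x - c) + phi(x + c), so
    f(x) = f(-x) holds for any weights, i.e. the prior knowledge is
    satisfied by construction rather than learned from data."""
    def phi(u: float) -> float:
        return exp(-(u * u) / (2.0 * width ** 2))
    return sum(w * (phi(x - c) + phi(x + c))
               for w, c in zip(weights, centres))
```

Because the symmetry holds identically, the training data only has to determine the weights, which is exactly the generalisation benefit the abstract claims for grey-box models.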