888 results for "probability of precocious pregnancy"
Abstract:
Estimation of population size with a missing zero-class is an important problem encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by maximum likelihood and estimating the population size from this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) has proposed a robust estimator for unclustered data that works well across a wide class of distributions applicable to count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and then using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In search of a more robust estimator, we focused on three models that use all clusters with exactly one case, those with exactly two cases, and those with exactly three cases to estimate the probability of the zero-class, and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. The loss in efficiency associated with the gain in robustness was examined in a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice by weighing the three estimates against robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
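To make the construction concrete, here is a minimal sketch (ours, not the authors' code) of the unclustered case: under a Poisson(λ) model the zero-class probability is p0 = exp(−λ), the Horvitz-Thompson estimator of population size is N̂ = n/(1 − p̂0) with n the number of observed units, and Zelterman's robust estimate of λ uses only the singleton and doubleton frequencies, λ̂ = 2f2/f1.

```python
from collections import Counter
from math import exp

def zelterman_population_size(counts):
    """Horvitz-Thompson population-size estimate from zero-truncated
    count data (unclustered sketch; zero-count units are unobserved).

    counts: per-unit case counts, all >= 1.
    """
    n = len(counts)                    # number of observed units
    freq = Counter(counts)
    f1, f2 = freq[1], freq[2]          # singleton and doubleton frequencies
    lam_hat = 2.0 * f2 / f1            # Zelterman's (1988) robust rate estimate
    p0_hat = exp(-lam_hat)             # estimated probability of the zero-class
    return n / (1.0 - p0_hat)          # Horvitz-Thompson estimator

# Example: 60 units with one case, 25 with two, 10 with three.
print(zelterman_population_size([1] * 60 + [2] * 25 + [3] * 10))
```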
Abstract:
Recombination is thought to occur only rarely in animal mitochondrial DNA (mtDNA). However, detection of mtDNA recombination requires that cells become heteroplasmic through mutation, intramolecular recombination or 'leakage' of paternal mtDNA. Interspecific hybridization increases the probability of detecting mtDNA recombinants due to higher levels of sequence divergence and potentially higher levels of paternal leakage. During a study of historical variation in Atlantic salmon (Salmo salar) mtDNA, an individual with a recombinant haplotype containing sequence from both Atlantic salmon and brown trout (Salmo trutta) was detected. The individual was not an F1 hybrid, but it did have an unusual nuclear genotype which suggested that it was a later-generation backcross. No other similar recombinant haplotype was found in the same population or in three neighbouring Atlantic salmon populations among 717 individuals collected during 1948-2002. Interspecific recombination may increase mtDNA variability within species and can have implications for phylogenetic studies.
Abstract:
Stephens and Donnelly have introduced a simple yet powerful importance sampling (IS) scheme for computing the likelihood in population genetic models. Fundamental to the method is an approximation to the conditional probability of the allelic type of an additional gene, given those currently in the sample. As noted by Li and Stephens, the product of these conditional probabilities for a sequence of draws that gives the frequency of allelic types in a sample is an approximation to the likelihood, and can be used directly in inference. The aim of this note is to demonstrate the high level of accuracy of the "product of approximate conditionals" (PAC) likelihood when used with microsatellite data. Results obtained on simulated microsatellite data show that this strategy leads to a negligible bias over a wide range of the scaled mutation parameter theta. Furthermore, the sampling variance of the likelihood estimates as well as the computation time are lower than those obtained with importance sampling over the whole range of theta. It follows that this approach represents an efficient substitute for IS algorithms in computer-intensive (e.g. MCMC) inference methods in population genetics. (© 2006 Elsevier Inc. All rights reserved.)
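In symbols (our transcription of the construction described above), with theta the scaled mutation parameter and \hat{\pi} the approximate conditional distribution of an additional allelic type given those already sampled, the PAC likelihood for a sample a_1, ..., a_n is:

```latex
\hat{L}_{\mathrm{PAC}}(\theta)
  \;=\; \prod_{k=0}^{n-1} \hat{\pi}\left(a_{k+1} \,\middle|\, a_{1},\dots,a_{k};\,\theta\right).
```

Because \hat{\pi} is generally not exchangeable, the product depends on the ordering of the draws; in practice it is typically averaged over several random orderings of the sample.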
Abstract:
Background: We report an analysis of a protein network of functionally linked proteins, identified from a phylogenetic statistical analysis of complete eukaryotic genomes. Phylogenetic methods identify pairs of proteins that co-evolve on a phylogenetic tree, and have been shown to have a high probability of correctly identifying known functional links. Results: The eukaryotic correlated-evolution network we derive displays the familiar power-law scaling of connectivity. We introduce the use of explicit phylogenetic methods to reconstruct the ancestral presence or absence of proteins at the interior nodes of a phylogeny of eukaryote species. We find that the connectivity distribution of proteins at the point they arise on the tree and join the network follows a power law, as does the connectivity distribution of proteins at the time they are lost from the network. Proteins resident in the network acquire connections over time, but we find no evidence that 'preferential attachment' - the phenomenon of newly acquired connections in the network being more likely to be made to proteins with large numbers of connections - influences the network structure. We derive a 'variable rate of attachment' model in which proteins vary in their propensity to form network interactions independently of how many connections they have or of the total number of connections in the network, and show how this model can produce apparent power-law scaling without preferential attachment. Conclusion: A few simple rules can explain the topological structure and evolutionary changes of protein-interaction networks: most change is concentrated in satellite proteins of low connectivity and small phenotypic effect, and proteins differ in their propensity to form attachments. Given these rules of assembly, power-law scaled networks naturally emerge from simple principles of selection, yielding protein-interaction networks that retain a high degree of robustness on short time scales and evolvability on longer evolutionary time scales.
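The 'variable rate of attachment' idea lends itself to a compact simulation. The sketch below is our own illustration under stated assumptions (the heavy-tailed propensity distribution and two links per arriving node are hypothetical choices, not values from the paper): each protein draws a fixed propensity when it joins, and new links choose partners in proportion to propensity, never to current degree.

```python
import random

def variable_attachment_degrees(n_nodes=2000, links_per_node=2, seed=0):
    """Grow a network in which attachment is proportional to each node's
    intrinsic propensity rather than its degree, then return degrees."""
    rng = random.Random(seed)
    propensity, degree = [], []
    for _ in range(n_nodes):
        propensity.append(rng.paretovariate(2.0))  # hypothetical heavy tail
        degree.append(0)
        new = len(degree) - 1
        if new == 0:
            continue                               # first node has no partners
        k = min(links_per_node, new)
        targets = rng.choices(range(new), weights=propensity[:new], k=k)
        for t in set(targets):                     # ignore duplicate picks
            degree[t] += 1
            degree[new] += 1
    return degree

# Inspect the upper tail of the resulting connectivity distribution.
print(sorted(variable_attachment_degrees())[-10:])
```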
Abstract:
Natural exposure to prion disease is likely to occur through successive challenges, yet most experiments focus on single large doses of infectious material. We analyze the results from an experiment in which rodents were exposed to multiple doses of feed contaminated with the scrapie agent. We formally define hypotheses for how the doses combine in terms of statistical models. The competing hypotheses are that only the total dose of infectivity is important (the cumulative model), that doses act independently, or, as a general alternative, that successive doses interact (raising or lowering the risk of infection). We provide sample size calculations to distinguish these hypotheses. In the experiment, a fixed total dose has a significantly reduced probability of causing infection if the material is presented as multiple challenges, and the probability falls further as the time between challenges lengthens. Incubation periods are shorter and less variable if all material is consumed on one occasion. We show that the probability of infection is inconsistent with the hypothesis that each dose acts as a cumulative or independent challenge. The incubation periods are inconsistent with the independence hypothesis. Thus, although a trend exists for the risk of infection with prion disease to increase with repeated doses, it does so to a lesser degree than is expected if challenges combine independently or in a cumulative manner.
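Written out (our reading of the competing hypotheses, with p(d) the probability that a single dose d causes infection and d_1, ..., d_k the successive doses):

```latex
\text{cumulative: } P(\text{infection}) = p\left(\sum_{i=1}^{k} d_i\right),
\qquad
\text{independent: } P(\text{infection}) = 1 - \prod_{i=1}^{k}\bigl(1 - p(d_i)\bigr),
```

with the general alternative adding interaction terms between successive doses. The experimental finding, that a fixed total dose infects less often when split across challenges, falls below what both of these expressions predict.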
Abstract:
1. Demographic models are assuming an important role in management decisions for endangered species. Elasticity analysis and scope for management analysis are two such applications. Elasticity analysis determines the vital rates that have the greatest impact on population growth. Scope for management analysis examines the effects that feasible management might have on vital rates and population growth. Both methods target management in an attempt to maximize population growth. 2. The Seychelles magpie robin Copsychus sechellarum is a critically endangered island endemic, the population of which underwent significant growth in the early 1990s following the implementation of a recovery programme. We examined how the formal use of elasticity and scope for management analyses might have shaped management in the recovery programme, and assessed their effectiveness by comparison with the actual population growth achieved. 3. The magpie robin population doubled from about 25 birds in 1990 to more than 50 by 1995. A simple two-stage demographic model showed that this growth was driven primarily by a significant increase in the annual survival probability of first-year birds and an increase in the birth rate. Neither the annual survival probability of adults nor the probability of a female breeding at age 1 changed significantly over time. 4. Elasticity analysis showed that the annual survival probability of adults had the greatest impact on population growth. There was some scope to use management to increase survival, but because survival rates were already high (> 0.9) this had a negligible effect on population growth. Scope for management analysis showed that significant population growth could have been achieved by targeting management measures at the birth rate and survival probability of first-year birds, although predicted growth rates were lower than those achieved by the recovery programme when all management measures were in place (i.e. 1992-95). 5. Synthesis and applications. We argue that scope for management analysis can provide a useful basis for management but will inevitably be limited to some extent by a lack of data, as our study shows. This means that identifying perceived ecological problems and designing management to alleviate them must be an important component of endangered species management. The corollary of this is that it will not be possible or wise to consider only management options for which there is a demonstrable ecological benefit. Given these constraints, we see little role for elasticity analysis because, when data are available, a scope for management analysis will always be of greater practical value and, when data are lacking, precautionary management demands that as many perceived ecological problems as possible are tackled.
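For readers unfamiliar with the method, elasticity analysis perturbs a stage-structured projection matrix A and asks how the asymptotic growth rate lambda (the dominant eigenvalue) responds: e_ij = (a_ij/lambda) d(lambda)/d(a_ij). The sketch below is our own minimal illustration with a hypothetical two-stage matrix; the numbers are invented and are not the magpie robin estimates.

```python
import numpy as np

def elasticities(A):
    """Elasticities e_ij = (a_ij / lambda) * d(lambda)/d(a_ij) of the
    dominant eigenvalue of a projection matrix A."""
    vals, w_all = np.linalg.eig(A)
    i = np.argmax(vals.real)
    lam = vals.real[i]
    w = w_all[:, i].real                 # right eigenvector: stable stage structure
    vals_t, v_all = np.linalg.eig(A.T)
    j = np.argmax(vals_t.real)
    v = v_all[:, j].real                 # left eigenvector: reproductive values
    S = np.outer(v, w) / (v @ w)         # sensitivities d(lambda)/d(a_ij)
    return A * S / lam

# Hypothetical two-stage (first-year, adult) model; entries are invented.
A = np.array([[0.0, 0.9],   # births into the first-year class
              [0.5, 0.9]])  # first-year and adult survival probabilities
print(elasticities(A))      # the largest entries dominate population growth
```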
Communicating risk of medication side effects: an empirical evaluation of EU recommended terminology
Abstract:
Two experiments compared people's interpretation of verbal and numerical descriptions of the risk of medication side effects occurring. The verbal descriptors were selected from those recommended for use by the European Union (very common, common, uncommon, rare, very rare). Both experiments used a controlled empirical methodology, in which nearly 500 members of the general population were presented with a fictitious (but realistic) scenario about visiting the doctor and being prescribed medication, together with information about the medicine's side effects and their probability of occurrence. Experiment 1 found that, in all three age groups tested (18-40, 41-60 and over 60), participants given a verbal descriptor (very common) estimated side effect risk to be considerably higher than those given a comparable numerical description. Furthermore, the differences in interpretation were reflected in their judgements of side effect severity, risk to health, and intention to comply. Experiment 2 confirmed these findings using two different verbal descriptors (common and rare) and in scenarios which described either relatively severe or relatively mild side effects. Strikingly, only 7 out of 180 participants in this study gave a probability estimate which fell within the EU-assigned numerical range. Thus, large-scale use of the descriptors could have serious negative consequences for individual and public health. We therefore recommend that the EU and national authorities suspend their recommendations regarding these descriptors until a more substantial evidence base is available to support their appropriate use.
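For context, the numerical bands the EU guidance attaches to these five descriptors are commonly cited as follows; we quote them from general knowledge of the guideline, not from the paper, so verify against the current text before reuse:

```python
# EU verbal descriptors for side-effect frequency and their assigned
# probability bands (as commonly cited; not taken from the paper).
EU_FREQUENCY_BANDS = {
    "very common": (0.10, 1.00),      # more than 1 in 10
    "common":      (0.01, 0.10),      # 1 in 100 to 1 in 10
    "uncommon":    (0.001, 0.01),     # 1 in 1000 to 1 in 100
    "rare":        (0.0001, 0.001),   # 1 in 10000 to 1 in 1000
    "very rare":   (0.0, 0.0001),     # less than 1 in 10000
}

def estimate_within_band(descriptor: str, estimate: float) -> bool:
    """Did a participant's numerical risk estimate fall inside the EU
    band for the descriptor shown? (The study found only 7 of 180 did.)"""
    lo, hi = EU_FREQUENCY_BANDS[descriptor]
    return lo <= estimate <= hi
```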
Abstract:
Objectives: To examine doctors' (Experiment 1) and doctors' and lay people's (Experiment 2) interpretations of two sets of recommended verbal labels for conveying information about side effects incidence rates. Method: Both studies used a controlled empirical methodology in which participants were presented with a hypothetical, but realistic, scenario involving a prescribed medication that was said to be associated with either mild or severe side effects. The probability of each side effect was described using one of the five descriptors advocated by the European Union (Experiment 1) or one of the six descriptors advocated in Calman's risk scale (Experiment 2), and study participants were required to estimate (numerically) the probability of each side effect occurring. Key findings: Experiment 1 showed that the doctors significantly overestimated the risk of side effects occurring when interpreting the five EU descriptors, compared with the assigned probability ranges. Experiment 2 showed that both groups significantly overestimated risk when given the six Calman descriptors, although the degree of overestimation was not as great for the doctors as for the lay people. Conclusion: On the basis of our findings, we argue that we are still a long way from achieving a standardised language of risk for use by both professionals and the general public, although there might be more potential for use of standardised terms among professionals. In the meantime, the EU and other regulatory bodies and health professionals should be very cautious about advocating the use of particular verbal labels for describing medication side effects.
Abstract:
A study examined people's interpretation of European Commission (EC) recommended verbal descriptors for risk of medicine side effects, and the actions to take if they do occur. Members of the general public were presented with a fictitious (but realistic) scenario about suffering from a stiff neck, visiting the local pharmacy and purchasing an over-the-counter (OTC) medicine (ibuprofen). The medicine came with an information leaflet which included information about the medicine's side effects, their risk of occurrence, and recommended actions to take if adverse effects are experienced. Probability of occurrence was presented numerically (6%) or verbally, using the recommended EC descriptor (common). Results showed that, in line with the findings of our earlier work with prescribed medicines, participants significantly overestimated side effect risk. Furthermore, the differences in interpretation were reflected in their judgements of satisfaction, side effect severity, risk to health, and intention to take the medicine. Finally, we observed no significant difference between people's interpretations of the recommended action descriptors ('immediately' and 'as soon as possible'). (© 2003 Elsevier Science Ireland Ltd. All rights reserved.)
Abstract:
Background/Objectives: Prebiotics have attracted interest for their ability to positively affect the colonic microbiota composition, thus increasing resistance to infection and diarrhoeal disease. This study assessed the effectiveness of a prebiotic galacto-oligosaccharide mixture (B-GOS) on the severity and/or incidence of travellers' diarrhoea (TD) in healthy subjects. Subjects/Methods: The study was a placebo-controlled, randomized, double-blind trial of parallel design in 159 healthy volunteers who travelled for a minimum of 2 weeks to a country of low or high risk for TD. The investigational product was B-GOS and the placebo was maltodextrin. Volunteers were randomized into groups with an equal probability of receiving either the prebiotic or the placebo. The protocol comprised a 1-week pre-holiday period recording bowel habit while receiving the intervention, followed by the holiday period. Bowel habit included the number of bowel movements and the average consistency of the stools, as well as the occurrence of abdominal discomfort, flatulence, bloating or vomiting. A clinical report was completed in the case of a diarrhoeal incident. A post-study questionnaire was also completed by all subjects on their return. Results: Results showed significant differences between the B-GOS and placebo groups in the incidence (P<0.05) and duration (P<0.05) of TD. Similar findings occurred for abdominal pain (P<0.05) and the overall quality of life assessment (P<0.05). Conclusions: Consumption of the tested galacto-oligosaccharide mixture showed significant potential in preventing the incidence and symptoms of TD.
Abstract:
The differential phase (ΦDP) measured by polarimetric radars is recognized to be a very good indicator of path-integrated attenuation by rain. Moreover, if a linear relationship is assumed between the specific differential phase (KDP) and the specific attenuation (AH) and specific differential attenuation (ADP), then attenuation can easily be corrected. The coefficients of proportionality, γH and γDP, are, however, known to depend in rain upon drop temperature, drop shapes, drop size distribution, and the presence of large drops causing Mie scattering. In this paper, the authors extensively apply a physically based method, often referred to as the “Smyth and Illingworth constraint,” which uses the constraint that the value of the differential reflectivity ZDR on the far side of the storm should be low to retrieve the γDP coefficient. More than 30 convective episodes observed by the French operational C-band polarimetric Trappes radar during two summers (2005 and 2006) are used to document the variability of γDP with respect to the intrinsic three-dimensional characteristics of the attenuating cells. The Smyth and Illingworth constraint could be applied to only 20% of all attenuated rays of the 2-yr dataset, so it cannot be considered the unique solution for attenuation correction in an operational setting, but it is useful for characterizing the properties of the strongly attenuating cells. The range of variation of γDP is shown to be extremely large, with minimal, maximal, and mean values equal, respectively, to 0.01, 0.11, and 0.025 dB °−1. Coefficient γDP appears to be almost linearly correlated with the horizontal reflectivity (ZH), differential reflectivity (ZDR), specific differential phase (KDP), and correlation coefficient (ρHV) of the attenuating cells. The temperature effect is negligible with respect to that of the microphysical properties of the attenuating cells. Unusually large values of γDP, above 0.06 dB °−1, often referred to as “hot spots,” are reported for a nonnegligible 15% of the rays presenting a significant total differential phase shift (ΔϕDP > 30°). The corresponding strongly attenuating cells are shown to have extremely high ZDR (above 4 dB) and ZH (above 55 dBZ), very low ρHV (below 0.94), and high KDP (above 4° km−1). Analysis of 4 yr of observed raindrop spectra does not reproduce such low values of ρHV, suggesting that (wet) ice is likely to be present in the precipitation medium and responsible for the attenuation and high phase shifts. Furthermore, if melting ice is responsible for the high phase shifts, this suggests that KDP may not be uniquely related to rainfall rate but can also result from the presence of wet ice. This hypothesis is supported by the analysis of vertical profiles of horizontal reflectivity and the values of conventional probability-of-hail indexes.
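The linear closure invoked above can be written explicitly (standard polarimetric notation, our transcription). With one-way specific attenuations proportional to K_DP and the two-way differential phase ΦDP(r) = 2∫₀ʳ K_DP ds, the path-integrated attenuations follow directly from the measured phase:

```latex
A_H = \gamma_H K_{DP}, \qquad A_{DP} = \gamma_{DP} K_{DP}
\quad\Longrightarrow\quad
\mathrm{PIA}_H(r) = \gamma_H\,\Phi_{DP}(r), \qquad
\mathrm{PIA}_{DP}(r) = \gamma_{DP}\,\Phi_{DP}(r).
```

The Smyth and Illingworth constraint then fixes γDP by requiring that ZDR, once corrected with PIA_DP, be low on the far side of the storm.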
Abstract:
This paper analyzes the delay performance of the Enhanced relay-enabled Distributed Coordination Function (ErDCF) for wireless ad hoc networks under ideal conditions and in the presence of transmission errors. Relays are nodes capable of supporting high data rates for other, low-data-rate nodes. In an ideal channel, ErDCF achieves higher throughput and reduced energy consumption compared to the IEEE 802.11 Distributed Coordination Function (DCF). This gain is still maintained in the presence of errors. Relays are also expected to reduce delay. However, the delay behavior of ErDCF under transmission errors was not previously known. In this work, we present the impact of transmission errors on delay. It turns out that under transmission errors of sufficient magnitude to increase the number of dropped packets, packet delay is reduced. This is due to the increase in the probability of failure: as a result, the packet drop time increases, reflecting the throughput degradation.
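As a back-of-the-envelope illustration of the retry-limit mechanics invoked here (ours, not the paper's analytical model; the 802.11-style backoff parameters are hypothetical): as the per-attempt failure probability grows, more packets survive to the later, larger backoff windows and more are ultimately dropped, so the longest service times shift from the delay statistic into drop time.

```python
def dcf_retry_stats(p_fail, retry_limit=7, cw_min=32, cw_max=1024, slot=20e-6):
    """Drop probability and expected total backoff time per packet under
    binary exponential backoff with per-attempt failure probability p_fail.
    Illustrative only; parameters are generic 802.11-style values."""
    p_drop = p_fail ** (retry_limit + 1)       # every attempt fails
    expected_backoff, reach = 0.0, 1.0         # reach = P(attempt i occurs)
    for i in range(retry_limit + 1):
        cw = min(cw_min * 2 ** i, cw_max)      # contention window doubles
        expected_backoff += reach * (cw - 1) / 2 * slot
        reach *= p_fail                        # next attempt needs a failure
    return p_drop, expected_backoff

for p in (0.1, 0.3, 0.5):
    print(p, dcf_retry_stats(p))
```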
Abstract:
The intensity and distribution of daily precipitation is predicted to change under scenarios of increased greenhouse gases (GHGs). In this paper, we analyse the ability of HadCM2, a general circulation model (GCM), and a high-resolution regional climate model (RCM), both developed at the Met Office's Hadley Centre, to simulate extreme daily precipitation by reference to observations. A detailed analysis of daily precipitation is made at two UK grid boxes, where probabilities of reaching daily thresholds in the GCM and RCM are compared with observations. We find that the RCM generally overpredicts probabilities of extreme daily precipitation but that, when the GCM- and RCM-simulated values are scaled to have the same mean as the observations, the RCM captures the upper-tail distribution more realistically. To compare regional changes in daily precipitation in the GHG-forced period 2080-2100 in the GCM and the RCM, we develop two methods. The first considers the fractional changes in probability of local daily precipitation reaching or exceeding a fixed 15 mm threshold in the anomaly climate compared with the control. The second method uses the upper one-percentile of the control at each point as the threshold. Agreement between the models is better in both seasons with the latter method, which we suggest may be more useful when considering larger-scale spatial changes. On average, the probability of precipitation exceeding the 1% threshold increases by a factor of 2.5 (GCM and RCM) in winter and by 1.7 (GCM) or 1.3 (RCM) in summer.
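The two comparison methods reduce to simple exceedance ratios; the sketch below is our own illustration (array names and shapes are hypothetical, with time on the first axis):

```python
import numpy as np

def change_fixed_threshold(control, anomaly, threshold_mm=15.0):
    """Method 1: fractional change in the probability of daily
    precipitation reaching a fixed threshold (e.g. 15 mm)."""
    p_control = np.mean(control >= threshold_mm, axis=0)
    p_anomaly = np.mean(anomaly >= threshold_mm, axis=0)
    return p_anomaly / p_control

def change_percentile_threshold(control, anomaly, q=99.0):
    """Method 2: the same ratio, but with each grid point thresholded
    at the upper one-percentile of its own control-run distribution,
    so the control exceedance probability is 1% by construction."""
    threshold = np.percentile(control, q, axis=0)   # per-point threshold
    p_anomaly = np.mean(anomaly >= threshold, axis=0)
    return p_anomaly / ((100.0 - q) / 100.0)
```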
Abstract:
Empirical studies using satellite data and radiosondes have shown that precipitation increases with column water vapor (CWV) in the tropics, and that this increase is much steeper above some critical CWV value. Here, eight years of 1-min-resolution microwave radiometer and optical gauge data at Nauru Island are analyzed to better understand the relationships among CWV, column liquid water (CLW), and precipitation at small time scales. CWV is found to have large autocorrelation times compared with CLW and precipitation. Before precipitation events, CWV increases on both a synoptic-scale time period and a subsequent shorter time period consistent with mesoscale convective activity; the latter period is associated with the highest CWV levels. Probabilities of precipitation increase greatly with CWV. Given initial high CWV, this increased probability of precipitation persists at least 10–12 h. Even in periods of high CWV, however, probabilities of initial precipitation in a 5-min period remain low enough that there tends to be a lag before the start of the next precipitation event. This is consistent with precipitation occurring stochastically within environments containing high CWV, with the latter being established by a combination of synoptic-scale and mesoscale forcing.
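The conditional statistic at the heart of this analysis can be sketched in a few lines (our own illustration; variable names and the rain threshold are hypothetical):

```python
import numpy as np

def precip_probability_by_cwv(cwv, precip, bin_edges, rain_thresh=0.0):
    """P(precipitation | CWV bin): bin the column-water-vapor series
    and return the fraction of rainy observation windows in each bin.

    cwv, precip: aligned 1-D series (e.g. 5-min windows).
    """
    rainy = np.asarray(precip) > rain_thresh
    bins = np.digitize(cwv, bin_edges)
    probs = []
    for b in range(1, len(bin_edges)):
        in_bin = bins == b
        probs.append(rainy[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(probs)

# Example: rain probability for 5 mm wide CWV bins spanning 30-70 mm.
# probs = precip_probability_by_cwv(cwv_series, rain_series, np.arange(30, 75, 5))
```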
Abstract:
Summary 1. In recent decades there have been population declines of many UK bird species, which have become the focus of intense research and debate. Recently, as the populations of potential predators have increased, there is concern that increased rates of predation may be contributing to the declines. In this review, we assess the methodologies behind the current published science on the impacts of predators on avian prey in the UK. 2. We identified suitable studies, classified these according to study design (experimental/observational) and assessed the quantity and quality of the data upon which any variation in predation rates was inferred. We then explored whether the underlying study methodology had implications for study outcome. 3. We reviewed 32 published studies and found that observational studies typically monitored significantly fewer predator species comprehensively than experimental studies did. Data for a difference in predator abundance from targeted (i.e. bespoke) census techniques were available for less than half of the 32 predator species studied. 4. The probability of a study detecting an impact on prey abundance was strongly and positively related to the quality and quantity of data upon which the gradient in predation rates was inferred. 5. The findings suggest that if a study is based on good-quality abundance data for a range of predator species then it is more likely to detect an effect than if it relies on opportunistic data for a smaller number of predators. 6. We recommend that the findings from studies which use opportunistic data for a limited number of predator species should be treated with caution, and that future studies employ bespoke census techniques to monitor predator abundance for an appropriate suite of predators.