83 results for "probability of precocious pregnancy"
Abstract:
We predicted that the probability of egg occurrence of salamander Salamandrina perspicillata depended on stream features and predation by native crayfish Austropotamobius fulcisianus and the introduced trout Salmo trutta. We assessed the presence of S. perspicillata at 54 sites within a natural reserve of southern Tuscany, Italy. Generalized linear models with binomial errors were constructed using egg presence/absence and altitude, stream mean size and slope, electrical conductivity, water pH and temperature, and a predation factor, defined according to the presence/absence of crayfish and trout. Some competing models also included an autocovariate term, which estimated how much the response variable at any one sampling point reflected response values at surrounding points. The resulting models were compared using Akaike's information criterion. Model selection led to a subset of 14 models with Delta AIC(c) <7 (i.e., models ranging from substantial support to considerably less support), and all but one of these included an effect of predation. Models with the autocovariate term had considerably more support than those without the term. According to multimodel inference, the presence of trout and crayfish reduced the probability of egg occurrence from a mean level of 0.90 (SE limits: 0.98-0.55) to 0.12 (SE limits: 0.34-0.04). The presence of crayfish alone had no detectable effects (SE limits: 0.86-0.39). The results suggest that introduced trout have a detrimental effect on the reproductive output of S. perspicillata and confirm the fundamental importance of distinguishing the roles of endogenous and exogenous forces that act on population distribution.
Abstract:
In conditional probabilistic logic programming, the two most common forms of answer to a query are a probability interval or a precise probability obtained using the maximum entropy principle. The former can be uninformative (e.g., the interval [0, 1]), and the reliability of the latter is questionable when the prior knowledge is imprecise. To address this problem, in this paper, we propose methods to quantitatively measure whether a probability interval or a single probability is sufficient for answering a query. We first propose an approach to measuring the ignorance of a probabilistic logic program with respect to a query. The measure of ignorance (w.r.t. a query) reflects how reliable a precise probability for the query can be, and a high value of ignorance suggests that a single probability is not suitable for the query. We then propose a method to measure the probability that the exact probability of a query falls in a given interval, i.e., a second-order probability. We call it the degree of satisfaction. If the degree of satisfaction is high enough w.r.t. the query, then the given interval can be accepted as the answer to the query. We also prove that our measures satisfy many desirable properties, and we use a case study to demonstrate their significance. © Springer Science+Business Media B.V. 2012
Abstract:
Despite the increased application of composite materials in aerospace, owing to their exceptional physical and mechanical properties, the machining of composites remains a challenge. Fibre-reinforced laminated composites are prone to various kinds of damage during machining, such as delamination, fibre pull-out, microcracks, and thermal damage. Optimization of the drilling process parameters can reduce the probability of such damage. In the current research, a 3D finite element (FE) model of the drilling process in carbon fibre reinforced composite (CFC) is developed. The FE model is used to investigate the effects of cutting speed and feed rate on thrust force, torque, and delamination in the drilling of carbon fibre reinforced laminated composite. A mesoscale FE model, taking into account the differently oriented plies and the interfaces between them, has been proposed to predict the damage modes in the plies and delamination. For validation purposes, experimental drilling tests have been performed and compared with the results of the finite element analysis. A digital image analysis code has been developed in Matlab to assess the delamination factor produced in CFC as a result of drilling. © Springer Science+Business Media B.V. 2011.
Abstract:
This paper estimates the marginal willingness-to-pay for attributes of a hypothetical HIV vaccine using discrete choice modeling. We use primary data from 326 respondents from Bangkok and Chiang Mai, Thailand, collected in 2008–2009 using purposive, venue-based sampling across two strata. Participants completed a structured questionnaire and a full-rank discrete choice modeling task administered using computer-assisted personal interviewing. The choice experiment was used to rank eight hypothetical HIV vaccine scenarios, each comprising seven attributes (including cost), each of which had two levels. The data were analyzed in two alternative specifications: (1) best-worst and (2) full-rank, using logit likelihood functions estimated with custom routines in the Gauss matrix programming language. In the full-rank specification, all vaccine attributes are significant predictors of the probability of vaccine choice. The biomedical attributes of the hypothetical HIV vaccine (efficacy, absence of VISP, absence of side effects, and duration of effect) are the most important attributes for HIV vaccine choice. On average, respondents are more than twice as likely to accept a vaccine with 99% efficacy than one with 50% efficacy. This translates to a willingness to pay US$383 more for the high-efficacy vaccine compared with the low-efficacy vaccine. Knowledge of the relative importance of determinants of HIV vaccine acceptability is important to ensure the success of future vaccination programs. Future acceptability studies of hypothetical HIV vaccines should use more finely grained biomedical attributes, and could improve the external validity of results by including more levels of the cost attribute.
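In discrete choice models of this kind, willingness-to-pay for an attribute change is conventionally computed as the difference in attribute utilities divided by the negative of the cost coefficient. A minimal sketch with hypothetical coefficients (chosen so that the figure reproduces the reported US$383; these are not the study's estimates):

```python
# Hypothetical conditional-logit coefficients (not the study's estimates):
beta_eff_high = 1.15    # utility of the 99%-efficacy level
beta_eff_low = 0.40     # utility of the 50%-efficacy level
beta_cost = -0.00196    # utility per US$ of vaccine cost

# WTP for moving from low to high efficacy = delta-utility / marginal utility of money
wtp = (beta_eff_high - beta_eff_low) / (-beta_cost)
print(f"WTP for high- vs low-efficacy vaccine: US${wtp:.0f}")
```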
Abstract:
Schizophrenia is a common disorder with high heritability and a 10-fold increase in risk to siblings of probands. Replication has been inconsistent for reports of significant genetic linkage. To assess evidence for linkage across studies, rank-based genome scan meta-analysis (GSMA) was applied to data from 20 schizophrenia genome scans. Each marker for each scan was assigned to 1 of 120 30-cM bins, with the bins ranked by linkage scores (1 = most significant) and the ranks averaged across studies (R(avg)) and then weighted for sample size (sqrt[N affected cases]). A permutation test was used to compute the probability of observing, by chance, each bin's average rank (P(AvgRnk)) or of observing it for a bin with the same place (first, second, etc.) in the order of average ranks in each permutation (P(ord)). The GSMA produced significant genomewide evidence for linkage on chromosome 2q (P(AvgRnk)
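The permutation test described above can be sketched as follows, with a hypothetical rank matrix standing in for the 20 real genome scans (and, for simplicity, ignoring the sample-size weighting and the constraint that ranks within a study form a permutation once a "linked" bin is planted):

```python
import numpy as np

rng = np.random.default_rng(1)
n_studies, n_bins = 20, 120
# Hypothetical rank matrix: each study ranks all 120 bins (1 = most significant)
ranks = np.array([rng.permutation(n_bins) + 1 for _ in range(n_studies)])
ranks[:, 7] = rng.integers(1, 15, n_studies)  # plant a "linked" bin with low ranks

avg = ranks.mean(axis=0)
obs = avg[7]  # observed average rank of the planted bin

# P(AvgRnk): chance, under random ranking, of this bin averaging a rank this low
n_perm, hits = 2000, 0
for _ in range(n_perm):
    perm = np.array([rng.permutation(n_bins) + 1 for _ in range(n_studies)])
    hits += perm.mean(axis=0)[7] <= obs
p_avgrnk = (hits + 1) / (n_perm + 1)
print(f"observed average rank {obs:.1f}, P(AvgRnk) = {p_avgrnk:.4f}")
```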
Abstract:
This paper proposes a discrete mixture model which assigns individuals, up to a probability, to either a class of random utility (RU) maximizers or a class of random regret (RR) minimizers, on the basis of their sequence of observed choices. Our proposed model advances the state of the art of RU-RR mixture models by (i) adding and simultaneously estimating a membership model which predicts the probability of belonging to the RU or RR class; (ii) adding a layer of random taste heterogeneity within each behavioural class; and (iii) deriving a welfare measure associated with the RU-RR mixture model and consistent with referendum voting, which is the adequate mechanism of provision for such local public goods. The context of our empirical application is a stated choice experiment concerning traffic calming schemes. We find that the random-parameter RU-RR mixture model not only outperforms its fixed-coefficient counterpart in terms of fit, as expected, but also in terms of the plausibility of the membership determinants of behavioural class. In line with psychological theories of regret, we find that, compared with respondents who are familiar with the choice context (i.e. the traffic calming scheme), unfamiliar respondents are more likely to be regret minimizers than utility maximizers. © 2014 Elsevier Ltd.
Abstract:
We present results for a variety of Monte Carlo annealing approaches, both classical and quantum, benchmarked against one another for the textbook optimization exercise of a simple one-dimensional double well. In classical (thermal) annealing, the dependence upon the move chosen in a Metropolis scheme is studied and correlated with the spectrum of the associated Markov transition matrix. In quantum annealing, the path integral Monte Carlo approach is found to yield nontrivial sampling difficulties associated with the tunneling between the two wells. The choice of fictitious quantum kinetic energy is also addressed. We find that a "relativistic" kinetic energy form, leading to a higher probability of long real-space jumps, can be considerably more effective than the standard nonrelativistic one.
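A minimal sketch of the classical (thermal) side of this comparison, Metropolis annealing on a 1D double well, assuming a simple asymmetric quartic potential and a geometric cooling schedule (both illustrative choices, not the paper's exact setup):

```python
import math
import random

random.seed(42)

def V(x):
    # Asymmetric quartic double well: minima near x = -1 and x = +1,
    # with the right-hand well slightly deeper (the global minimum)
    return (x * x - 1.0) ** 2 - 0.1 * x

x = -1.0     # start near the shallower (local) minimum
T = 2.0      # initial temperature
step = 0.5   # half-width of the uniform Metropolis move
for sweep in range(2000):
    T = max(1e-3, T * 0.997)                 # geometric cooling
    x_new = x + random.uniform(-step, step)  # propose a move
    dE = V(x_new) - V(x)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        x = x_new                            # Metropolis accept/reject
print(f"final x = {x:.2f}, V(x) = {V(x):.3f}")
```

The move size `step` is the kind of "move choice" whose effect on mixing the abstract relates to the spectrum of the Markov transition matrix; larger steps trade acceptance rate against inter-well tunnelling.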
Abstract:
In this paper, the impact of multiple active eavesdroppers on cooperative single-carrier systems with multiple relays and multiple destinations is examined. To achieve secrecy diversity gains in the form of opportunistic selection, a two-stage scheme is proposed for joint relay and destination selection, in which, after the selection of the relay with the minimum effective maximum signal-to-noise ratio (SNR) to a cluster of eavesdroppers, the destination that has the maximum SNR from the chosen relay is selected. In order to accurately assess the secrecy performance, exact and asymptotic expressions are obtained in closed form for several security metrics, including the secrecy outage probability, the probability of non-zero secrecy rate, and the ergodic secrecy rate in frequency-selective fading. Based on the asymptotic analysis, key design parameters such as secrecy diversity gain, secrecy array gain, secrecy multiplexing gain, and power cost are characterized, from which new insights are drawn. Moreover, it is concluded that secrecy performance limits occur when the average received power at the eavesdropper is proportional to the counterpart at the destination. Specifically, for the secrecy outage probability, it is confirmed that the secrecy diversity gain collapses to zero with an outage floor, whereas for the ergodic secrecy rate, it is confirmed that its slope collapses to zero with a capacity ceiling.
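For a single destination-eavesdropper pair under Rayleigh fading (a drastic simplification of the paper's multi-relay, frequency-selective setting), the secrecy outage probability and the probability of a non-zero secrecy rate can be estimated by Monte Carlo; the average SNRs and target rate below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
gbar_d, gbar_e = 10.0, 2.0   # average SNRs at destination and eavesdropper (hypothetical)
Rs = 1.0                     # target secrecy rate (bits/s/Hz)

# Rayleigh fading: instantaneous SNRs are exponentially distributed with these means
g_d = rng.exponential(gbar_d, n)
g_e = rng.exponential(gbar_e, n)

# Secrecy capacity, clipped at zero when the eavesdropper's channel is better
cs = np.maximum(np.log2(1 + g_d) - np.log2(1 + g_e), 0.0)
p_out = np.mean(cs < Rs)      # secrecy outage probability
p_nonzero = np.mean(cs > 0)   # probability of non-zero secrecy rate
print(f"secrecy outage prob ~ {p_out:.3f}, P(non-zero secrecy rate) ~ {p_nonzero:.3f}")
```

For this single-link Rayleigh case the estimates can be checked against the known closed forms, P(Cs >= Rs) = (gbar_d / (gbar_d + 2^Rs * gbar_e)) * exp(-(2^Rs - 1)/gbar_d) and P(Cs > 0) = gbar_d / (gbar_d + gbar_e).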
Abstract:
Background: The postpartum period is a vulnerable time for excess weight retention, particularly for the increasing number of women who are overweight at the start of their pregnancy and subsequently find it difficult to lose additional weight gained during pregnancy. Although postpartum weight management interventions play an important role in breaking this potentially vicious cycle of weight gain, the effectiveness of such interventions in breastfeeding women remains unclear. Our aim was to systematically review the literature about the effectiveness of weight management interventions in breastfeeding women.
Methods: Seven electronic databases were searched for eligible papers. Intervention studies included were carried out exclusively in breastfeeding mothers, ≤2 years postpartum and with a body mass index greater than 18.5 kg/m2, with an outcome measure of change in weight and/or body composition.
Results: Six studies met the selection criteria, and were stratified according to the type of intervention and outcome measures. Despite considerable heterogeneity among studies, the dietary-based intervention studies appeared to be the most efficacious in promoting weight loss; however, few studies were tailored toward the needs of breastfeeding women.
Conclusions: Weight management interventions which include an energy-restricted diet may play a key role in successful postpartum weight loss for breastfeeding mothers.
Abstract:
Objective
To indirectly compare aflibercept, bevacizumab, dexamethasone, ranibizumab and triamcinolone for treatment of macular oedema secondary to central retinal vein occlusion using a network meta-analysis (NMA).
Design
NMA.
Data sources
The following databases were searched from January 2005 to March 2013: MEDLINE, MEDLINE In-process, EMBASE; CDSR, DARE, HTA, NHSEED, CENTRAL; Science Citation Index and Conference Proceedings Citation Index-Science.
Eligibility criteria for selecting studies
Only randomised controlled trials assessing patients with macular oedema secondary to central retinal vein occlusion were included. Studies had to report either proportions of patients gaining ≥3 lines, losing ≥3 lines, or the mean change in best corrected visual acuity. Two authors screened titles and abstracts, extracted data and undertook risk of bias assessment. Bayesian NMA was used to compare the different interventions.
Results
Seven studies, assessing five drugs, were judged to be sufficiently comparable for inclusion in the NMA. For the proportions of patients gaining ≥3 lines, triamcinolone 4 mg, ranibizumab 0.5 mg, bevacizumab 1.25 mg and aflibercept 2 mg had a higher probability of being more effective than sham and dexamethasone. A smaller proportion of patients treated with triamcinolone 4 mg, ranibizumab 0.5 mg or aflibercept 2 mg lost ≥3 lines of vision compared to those treated with sham. Patients treated with triamcinolone 4 mg, ranibizumab 0.5 mg, bevacizumab 1.25 mg and aflibercept 2 mg had a higher probability of improvement in the mean best corrected visual acuity compared to those treated with sham injections.
Conclusions
We found no evidence of differences between ranibizumab, aflibercept, bevacizumab and triamcinolone for improving vision. The antivascular endothelial growth factors (VEGFs) are likely to be favoured because they are not associated with steroid-induced cataract formation. Aflibercept may be preferred by clinicians because it might require fewer injections.
Abstract:
Background: Excessive use of empirical antibiotics is common in critically ill patients. Rapid biomarker-based exclusion of infection may improve antibiotic stewardship in ventilator-acquired pneumonia (VAP). However, successful validation of the usefulness of potential markers in this setting is exceptionally rare.
Objectives: We sought to validate the capacity for specific host inflammatory mediators to exclude pneumonia in patients with suspected VAP.
Methods: A prospective, multicentre, validation study of patients with suspected VAP was conducted in 12 intensive care units. VAP was confirmed following bronchoscopy by culture of a potential pathogen in bronchoalveolar lavage fluid (BALF) at >10⁴ colony-forming units per millilitre (cfu/mL). Interleukin-1 beta (IL-1β), IL-8, matrix metalloproteinase-8 (MMP-8), MMP-9 and human neutrophil elastase (HNE) were quantified in BALF. Diagnostic utility was determined for biomarkers individually and in combination.
Results: Paired BALF culture and biomarker results were available for 150 patients. 53 patients (35%) had VAP and 97 (65%) patients formed the non-VAP group. All biomarkers were significantly higher in the VAP group (p<0.001). The area under the receiver operator characteristic curve for IL-1β was 0.81; IL-8, 0.74; MMP-8, 0.76; MMP-9, 0.79 and HNE, 0.78. A combination of IL-1β and IL-8, at the optimal cut-point, excluded VAP with a sensitivity of 100%, a specificity of 44.3% and a post-test probability of 0% (95% CI 0% to 9.2%).
Conclusions: Low BALF IL-1β in combination with IL-8 confidently excludes VAP and could form a rapid biomarker-based rule-out test, with the potential to improve antibiotic stewardship.
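The rule-out logic rests on simple confusion-matrix arithmetic: at a cut-point with 100% sensitivity there are no false negatives, so the post-test probability of VAP after a negative biomarker result is zero. A sketch with counts reconstructed from the reported figures (150 patients, 53 with VAP, specificity 44.3%):

```python
# Counts reconstructed from the reported figures (not the raw study data):
tp, fn = 53, 0            # sensitivity 100% -> no false negatives among 53 VAP patients
tn = round(0.443 * 97)    # specificity 44.3% of the 97 non-VAP patients
fp = 97 - tn

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
# Post-test probability of VAP given a negative (rule-out) test result
post_test = fn / (fn + tn)
print(sensitivity, round(specificity, 3), post_test)
```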
Abstract:
Overwintering diving ducks at Lough Neagh have declined dramatically in recent years, but it has been suggested that on-to-offshore redistribution may have led to an underestimate of numbers. Most species feed nocturnally and their distribution at night is unknown. We used radar and visual observations from on board commercial sand barges to determine the diurnal distribution of diving duck flocks, in an effort to assess the feasibility of using standard boat-mounted radar to describe their nocturnal feeding distribution. Sand barge radar performed poorly at identifying flocks compared with independent visual observations, as it was sensitive to interference from waves during windy conditions. However, visual observations were useful in describing diurnal distribution. Sand barges were on average 1.5 km from shore when a flock of diving ducks was observed, and the probability of detection declined with distance from shore. This supports the reliability of shore-based counts in monitoring and surveillance. Given the poor performance of commercially available boat-mounted radar systems, we recommend the use of specialised terrestrial Bird Detecting Radar to determine the movements of diving ducks at Lough Neagh.
Abstract:
We characterize the planetary system Kepler-101 by performing a combined differential evolution Markov chain Monte Carlo analysis of Kepler data and forty radial velocities obtained with the HARPS-N spectrograph. This system was previously validated and is composed of a hot super-Neptune, Kepler-101b, and an Earth-sized planet, Kepler-101c. These two planets orbit the slightly evolved and metal-rich G-type star in 3.49 and 6.03 days, respectively. With mass Mp = 51.1 (+5.1/−4.7) M⊕, radius Rp = 5.77 (+0.85/−0.79) R⊕, and density ρp = 1.45 (+0.83/−0.48) g cm⁻³, Kepler-101b is the first fully characterized super-Neptune, and its density suggests that heavy elements make up a significant fraction of its interior: more than 60% of its total mass. Kepler-101c has a radius of 1.25 (+0.19/−0.17) R⊕, which implies the absence of any H/He envelope, but its mass could not be determined because of the relative faintness of the parent star for highly precise radial-velocity measurements (Kp = 13.8) and the limited number of radial velocities. The 1σ upper limit, Mp < 3.8 M⊕, excludes a pure iron composition with a probability of 68.3%. The architecture of the planetary system Kepler-101, containing a close-in giant planet and an outer Earth-sized planet with a period ratio slightly larger than the 3:2 resonance, is certainly of interest for scenarios of planet formation and evolution. This system does not follow the previously reported trend that the larger planet has the longer period in the majority of Kepler systems of planet pairs with at least one Neptune-sized or larger planet.
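The quoted density of Kepler-101b follows directly from the mass and radius in Earth units; a quick consistency check (using a round value for Earth's mean density, so the result differs slightly from the quoted central value):

```python
# Bulk density from mass and radius in Earth units:
# rho = rho_earth * (M / M_earth) / (R / R_earth)^3
rho_earth = 5.51    # Earth's mean density, g cm^-3 (rounded)
m, r = 51.1, 5.77   # central values quoted in the abstract (Earth units)
rho = rho_earth * m / r**3
print(f"rho ~ {rho:.2f} g cm^-3")  # close to the quoted 1.45 g cm^-3
```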
Abstract:
BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) due to abnormal blood vessels developing that leak fluid and blood at the macula.
OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.
DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).
REVIEW METHODS: Types of studies: direct/indirect studies reporting diagnostic outcomes.
INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).
COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using quality assessment of diagnostic accuracy studies, version 2. Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%), with nine strategies for diagnosis and/or monitoring, and cost-utility analysis conducted. NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted (3.5%). Deterministic and probabilistic sensitivity analyses were performed.
RESULTS: In pooled estimates of diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] was 88% (46% to 98%) and 78% (64% to 88%) respectively. For monitoring, the pooled sensitivity and specificity (95% CI) was 85% (72% to 93%) and 48% (30% to 67%) respectively. The FFA for diagnosis and nurse-technician-led monitoring strategy had the lowest cost (£39,769; QALYs 10.473) and dominated all others except FFA for diagnosis and ophthalmologist-led monitoring (£44,649; QALYs 10.575; incremental cost-effectiveness ratio £47,768). The least costly strategy had a 46.4% probability of being cost-effective at £30,000 willingness-to-pay threshold.
LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.
CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.
STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.
FUNDING: The National Institute for Health Research Health Technology Assessment programme.
Abstract:
We examined a remnant host plant (Primula veris L.) habitat network that was last inhabited by the rare butterfly Hamearis lucina L. in north Wales in 1943, to assess the relative contribution of several spatial parameters to its regional extinction. We first examined relationships between P. veris characteristics and H. lucina eggs in surviving H. lucina populations, and used these to predict the suitability and potential carrying capacity of the habitat network in north Wales. This resulted in an estimate of roughly 4500 eggs (ca. 227 adults). We developed a discrete-space, discrete-time metapopulation model to evaluate the relative contribution of dispersal distance, habitat and environmental stochasticity as possible causes of extinction. We simulated the potential persistence of the butterfly in the current network as well as in three artificial (historical and present) habitat networks that differed in the quantity (current and ×3) and fragmentation (current and aggregated) of the habitat. We identified that reduced habitat quantity and increased isolation would have increased the probability of regional extinction, in conjunction with environmental stochasticity and H. lucina's dispersal distance. This general trend did not change qualitatively when we modified the ability of dispersing females to stay in, and find, suitable habitats (by changing the size of the grid cells used in the model). Contrary to most metapopulation model predictions, system persistence declined with increasing migration rate, suggesting that the mortality of migrating individuals in fragmented landscapes may pose significant risks to system-wide persistence. Based on model predictions for the present landscape, we argue that a major programme of habitat restoration would be required for a re-established metapopulation to persist for >100 years.
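The authors' discrete-space, discrete-time model is not specified in the abstract; as a loose illustration of the same ingredients (distance-dependent colonisation, area-dependent extinction, environmental stochasticity), a generic stochastic patch-occupancy simulation with made-up parameters might look like:

```python
import numpy as np

rng = np.random.default_rng(3)
n_patches, years = 40, 100
xy = rng.uniform(0, 20, (n_patches, 2))    # patch coordinates (km), hypothetical
area = rng.lognormal(0, 1, n_patches)      # patch size / carrying-capacity proxy
d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
np.fill_diagonal(d, np.inf)                # no self-colonisation

alpha = 1.0        # inverse mean dispersal distance (km^-1)
c0, e0 = 0.3, 0.1  # colonisation and extinction scale parameters

occ = rng.random(n_patches) < 0.5          # initial occupancy
for t in range(years):
    env = rng.lognormal(0, 0.5)                              # environmental stochasticity
    conn = (np.exp(-alpha * d) * (occ * area)).sum(axis=1)   # connectivity to occupied patches
    col = 1 - np.exp(-c0 * conn)                             # colonisation probability
    ext = np.minimum(1.0, e0 * env / area)                   # small patches go extinct more often
    u = rng.random(n_patches)
    occ = np.where(occ, u > ext, u < col)                    # survive or be colonised
print(f"occupied patches after {years} years: {occ.sum()} / {n_patches}")
```

Raising `alpha` (shorter dispersal) or thinning the network lowers connectivity and hence persistence, the qualitative effect the abstract describes.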