996 results for prediction intervals


Relevance: 30.00%

Publisher:

Abstract:

Several studies have suggested that men with raised plasma triglycerides (TGs) in combination with adverse levels of other lipids may be at special risk of subsequent ischemic heart disease (IHD). We examined the independent and combined effects of plasma lipids at 10 years of follow-up. We measured fasting TGs, total cholesterol (TC), and high density lipoprotein cholesterol (HDLC) in 4362 men (aged 45 to 63 years) from 2 study populations and reexamined them at intervals during a 10-year follow-up. Major IHD events (death from IHD, clinical myocardial infarction, or ECG-defined myocardial infarction) were recorded. Five hundred thirty-three major IHD events occurred. All 3 lipids were strongly and independently predictive of IHD after 10 years of follow-up. Subjects were then divided into 27 groups (i.e., 3³) by the tertiles of TGs, TC, and HDLC. The number of events observed in each group was compared with that predicted by a logistic regression model, which included terms for the 3 lipids (without interactions) and potential confounding variables. The incidence of IHD was 22.6% in the group with the lipid risk factor combination with the highest expected risk (high TGs, high TC, and low HDLC) and 4.7% in the group with the lowest expected risk (P

Relevance: 30.00%

Publisher:

Abstract:

Despite the importance of larval abundance in determining the recruitment of benthic marine invertebrates and as a major factor in marine benthic community structure, relating planktonic larval abundance to post-settlement post-larvae and juveniles in the benthos is difficult. It is hampered by several methodological difficulties, including sampling frequency, the ability to follow larval and post-larval or juvenile cohorts, and the ability to calculate growth and mortality rates. In our work, an intensive sampling strategy was used. Larvae in the plankton were collected at weekly intervals, while post-larvae that settled into collectors were analysed fortnightly. Planktonic larval and benthic post-larval/juvenile cohorts were determined, and growth and mortality rates calculated. Integration of all equations allowed the development of a theoretical formulation that, based on abundance and planktonic larval duration, permits an estimation of the future abundance of post-larvae/juveniles during the first year of benthic life. The model can be applied to a sample in which only larval length needs to be measured.

Relevance: 30.00%

Publisher:

Abstract:

This paper describes a methodology for providing multiprobability predictions for proteomic mass spectrometry data. The methodology is based on a newly developed machine learning framework called Venn machines, which outputs a valid probability interval, and is designed for mass spectrometry data. For demonstration, we applied this methodology to MALDI-TOF data sets in order to predict the diagnosis of heart disease and early diagnoses of ovarian cancer and breast cancer. The experiments showed that the probability intervals are narrow; that is, the output of the multiprobability predictor is similar to a single probability distribution. In addition, the probability intervals produced for the heart disease and ovarian cancer data were more accurate than the output of the corresponding probability predictor. When Venn machines were forced to make point predictions, the accuracy of those predictions was, for most data sets, better than that of the underlying algorithm that outputs a single probability distribution over labels. Application of this methodology to MALDI-TOF data sets empirically demonstrates its validity. The accuracy of the proposed method on the ovarian cancer data rises from 66.7% eleven months in advance of the moment of diagnosis to 90.2% at the moment of diagnosis. The same approach was applied to the heart disease data without time dependency, although the achieved accuracy was not as high (up to 69.9%). The methodology also allowed us to confirm mass spectrometry peaks previously identified as carrying statistically significant information for discriminating between controls and cases.
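The Venn-machine construction can be illustrated with a minimal sketch. Here the taxonomy is the label of an example's nearest neighbour, a common textbook choice rather than the underlying algorithm used in the paper, and the data are synthetic:

```python
import numpy as np

def venn_interval(X_train, y_train, x_new, labels=(0, 1)):
    """Minimal Venn predictor sketch: for each hypothetical label of x_new,
    categorise every example by its nearest neighbour's label (the 'taxonomy')
    and record the frequency of label 1 inside x_new's category. The min/max
    over hypotheses is the multiprobability interval."""
    freqs = []
    for y_hyp in labels:
        X = np.vstack([X_train, x_new])
        y = np.append(y_train, y_hyp)
        cats = np.empty(len(y), dtype=int)
        for i in range(len(y)):
            d = np.linalg.norm(X - X[i], axis=1)
            d[i] = np.inf                      # exclude the self-match
            cats[i] = y[int(np.argmin(d))]
        group = y[cats == cats[-1]]            # examples sharing x_new's category
        freqs.append(float(np.mean(group == 1)))
    return min(freqs), max(freqs)
```

On well-separated synthetic clusters the interval comes out narrow, mirroring the paper's observation that Venn outputs resemble a single probability distribution.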

Relevance: 30.00%

Publisher:

Abstract:

Background
Automated candidate gene prediction systems allow geneticists to home in on disease genes more rapidly by identifying the most probable candidate genes linked to the disease phenotypes under investigation. Here we assessed the ability of eight different candidate gene prediction systems to predict disease genes in intervals previously associated with type 2 diabetes by benchmarking their performance against genes implicated by recent genome-wide association studies.

Results

Using a search space of 9556 genes, all but one of the systems pruned the genome in favour of genes associated with moderate to highly significant SNPs. Of the 11 genes associated with highly significant SNPs identified by the genome-wide association studies, eight were flagged as likely candidates by at least one of the prediction systems. A list of candidates produced by a previous consensus approach did not match any of the genes implicated by 706 moderate to highly significant SNPs flagged by the genome-wide association studies. We prioritized genes associated with medium significance SNPs.

Conclusion
The study appraises the relative success of several candidate gene prediction systems against independent genetic data. Even when confronted with challengingly large intervals, the candidate gene prediction systems can successfully select likely disease genes. Furthermore, they can be used to filter statistically less-well-supported genetic data to select more likely candidates. We suggest consensus approaches fail because they penalize novel predictions made from independent underlying databases. To realize their full potential, further work needs to be done on the prioritization and annotation of genes.

Relevance: 30.00%

Publisher:

Abstract:

This paper investigates the efficacy and reliability of Artificial Neural Networks (ANNs) as an intelligent decision support tool for pharmaceutical product formulation. Two case studies were employed to evaluate the capabilities of the Multilayer Perceptron network in predicting drug dissolution/release profiles. Performance of the network was evaluated using the similarity factor (f2), an index recommended by the United States Food and Drug Administration for profile comparison in pharmaceutical research. In addition, the bootstrap method was applied to assess the reliability of the network predictions by estimating confidence intervals associated with the results. The Multilayer Perceptron network also demonstrated superior performance in comparison with multiple regression models. The results reveal that the ANN system has the potential to be a decision support tool for profile prediction in pharmaceutical experimentation, and that the bootstrap method can be used to assess the reliability of the network predictions.
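The f2 criterion and a bootstrap interval around it can be sketched in a few lines; the dissolution values below are invented, and resampling over time points is one simple bootstrap scheme among several:

```python
import numpy as np

def f2(ref, test):
    """FDA similarity factor: f2 = 50*log10(100 / sqrt(1 + mean squared
    difference)); f2 >= 50 is conventionally read as 'similar' profiles."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    msd = np.mean((ref - test) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

def bootstrap_f2_ci(ref, test, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for f2, resampling time points."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(ref), len(ref))   # resample with replacement
        stats.append(f2(ref[idx], test[idx]))
    return tuple(np.percentile(stats, [100 * alpha / 2, 100 - 100 * alpha / 2]))
```

Identical profiles give f2 = 100, and a uniform 10-point difference at every time point lands just under the conventional f2 = 50 similarity boundary.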

Relevance: 30.00%

Publisher:

Abstract:

Background: Although a wealth of studies have tested the link between negative mood states and likelihood of a subsequent binge eating episode, the assumption that this relationship follows a typical linear dose–response pattern (i.e., that risk of a binge episode increases in proportion to level of negative mood) has not been challenged. The present study demonstrates the applicability of an alternative, non-linear conceptualization of this relationship, in which the strength of association between negative mood and probability of a binge episode increases above a threshold value for the mood variable relative to the slope below this threshold value (threshold dose response model).

Methods: A sample of 93 women aged 18 to 40 completed an online survey at random intervals seven times per day for a period of one week. Participants self-reported their current mood state and whether they had recently engaged in an eating episode symptomatic of a binge.

Results: As hypothesized, the threshold approach was a better predictor of the likelihood of a binge episode than linear dose–response modeling. The superiority of the threshold approach held even at low levels of negative mood (3 out of 10, with higher scores reflecting more negative mood). Additionally, severity of negative mood beyond this threshold value appears to be useful for predicting time to onset of a binge episode.

Conclusions: Present findings suggest that simple dose–response formulations for the association between negative mood and onset of binge episodes miss vital aspects of this relationship. Most notably, the impact of mood on binge eating appears to depend on whether a threshold value of negative mood has been breached, and elevation in mood beyond this point may be useful for clinicians and researchers to identify time to onset.
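The threshold (hinge) formulation can be sketched as a logistic model with a change in slope at the threshold; the parameter names and values below are illustrative, not the study's fitted estimates:

```python
import numpy as np

def binge_probability(mood, tau, b0, b_below, b_above):
    """Logistic model with a hinge at mood = tau: below the threshold the
    log-odds rise with slope b_below; above it, an extra slope b_above kicks in."""
    mood = np.asarray(mood, float)
    excess = np.maximum(mood - tau, 0.0)   # zero until the threshold is crossed
    logit = b0 + b_below * mood + b_above * excess
    return 1.0 / (1.0 + np.exp(-logit))
```

With b_below = 0 the model is flat below tau, the limiting case that contrasts most sharply with a straight dose–response line.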

Relevance: 30.00%

Publisher:

Abstract:

This study investigated the relationship between the Big 5, measured at factor and facet levels, and dimensions of both psychological and subjective well-being. Three hundred and thirty-seven participants completed the 30 Facet International Personality Item Pool Scale, Satisfaction with Life Scale, Positive and Negative Affectivity Schedule, and Ryff’s Scales of Psychological Well-Being. Cross-correlation decomposition presented a parsimonious picture of how well-being is related to personality factors. Incremental facet prediction was examined using double-adjusted r2 confidence intervals and semi-partial correlations. Incremental prediction by facets over factors ranged from almost nothing to a third more variance explained, suggesting a more modest incremental prediction than presented in the literature previously. Examination of semi-partial correlations controlling for factors revealed a small number of important facet-well-being correlations. All data and R analysis scripts are made available in an online repository.

Relevance: 30.00%

Publisher:

Abstract:

Treatments for cancer cause severe side effects called toxicities, and reducing such effects is crucial in cancer care. To impact care, we need to predict toxicities at fortnightly intervals. This toxicity data differs from traditional time series data in that a toxicity can be caused by a single treatment on a given day, so it is necessary to consider the effect of the singular data vector causing it. We model the data before each prediction point using multiple instance learning, where each bag is composed of multiple instances associated with daily treatments and patient-specific attributes, such as chemotherapy, radiotherapy, age, and cancer type. We then formulate a Bayesian multi-task framework to enhance toxicity prediction at each prediction point; the use of the prior allows factors to be shared across task predictors. Our proposed method simultaneously captures the heterogeneity of daily treatments and performs toxicity prediction at different prediction points. The method was evaluated on a real-world dataset of more than 2000 cancer patients and achieved better prediction accuracy, in terms of AUC, than state-of-the-art baselines.
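The bag structure can be illustrated with a drastically simplified baseline: pool each bag of daily-treatment instance vectors into one fixed-length vector and feed it to any classifier. This is a standard multiple-instance reduction, not the paper's Bayesian multi-task model, and the feature layout is invented:

```python
import numpy as np

def pool_bags(bags):
    """Max-pool every bag (a list of daily instance vectors, e.g. chemo dose,
    radio dose, age) into one fixed-length feature vector per patient-period."""
    return np.vstack([np.asarray(b, float).max(axis=0) for b in bags])
```

Max-pooling keeps the strongest daily exposure in each feature, a crude stand-in for the paper's point that a single day's treatment vector can drive a toxicity.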

Relevance: 30.00%

Publisher:

Abstract:

Graduate program in Genetics and Animal Breeding - FCAV

Relevance: 30.00%

Publisher:

Abstract:

Suppose that we are interested in establishing simple but reliable rules for predicting future t-year survivors via censored regression models. In this article, we present inference procedures for evaluating such binary classification rules based on various prediction precision measures quantified by the overall misclassification rate, sensitivity and specificity, and positive and negative predictive values. Specifically, under various working models we derive consistent estimators for the above measures via substitution and cross-validation estimation procedures. Furthermore, we provide large sample approximations to the distributions of these nonsmooth estimators without assuming that the working model is correctly specified. Confidence intervals, for example, for the difference of the precision measures between two competing rules can then be constructed. All the proposals are illustrated with two real examples and their finite sample properties are evaluated via a simulation study.
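Ignoring censoring entirely, the five precision measures reduce to simple counts; this toy version omits the substitution/cross-validation machinery and the censored-data corrections that the article actually develops:

```python
def precision_measures(pred, truth):
    """pred/truth are 0-1 sequences: predicted and actual t-year survival."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    tn = sum(1 for p, t in zip(pred, truth) if not p and not t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if not p and t)
    return {
        "misclassification": (fp + fn) / len(pred),
        "sensitivity": tp / (tp + fn),   # true positives among actual survivors
        "specificity": tn / (tn + fp),   # true negatives among non-survivors
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }
```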

Relevance: 30.00%

Publisher:

Abstract:

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"; that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
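The first test, comparing the actual to the predicted number of earthquakes, can be sketched under a Poisson assumption on the count; this is the usual "number test" framing, and the rate value in the example is invented:

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson-distributed count with rate lam."""
    return sum(exp(-lam) * lam ** i / factorial(i) for i in range(k + 1))

def number_test(n_observed, predicted_rate):
    """Two tail probabilities: delta1 = P(N <= n_observed) flags too few
    quakes, delta2 = P(N >= n_observed) flags too many. A small value of
    either is evidence against the forecast's rate."""
    delta1 = poisson_cdf(n_observed, predicted_rate)
    delta2 = 1.0 - (poisson_cdf(n_observed - 1, predicted_rate)
                    if n_observed > 0 else 0.0)
    return delta1, delta2
```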

Relevance: 30.00%

Publisher:

Abstract:

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
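The rate- and state-dependent formulation referred to here takes, in its standard Dieterich–Ruina form, μ = μ0 + a ln(V/V0) + b ln(V0 θ/Dc) with state evolution dθ/dt = 1 − Vθ/Dc. A small sketch with illustrative (not paper-specific) parameter values:

```python
from math import log

def friction(V, theta, mu0=0.6, a=0.010, b=0.015, V0=1e-6, Dc=1e-5):
    """Dieterich-Ruina friction coefficient at slip rate V and state theta."""
    return mu0 + a * log(V / V0) + b * log(V0 * theta / Dc)

def steady_state_theta(V, Dc=1e-5):
    """Setting dtheta/dt = 1 - V*theta/Dc to zero gives theta_ss = Dc / V."""
    return Dc / V
```

At steady state this collapses to μ_ss = μ0 + (a − b) ln(V/V0), so with a < b (as in the defaults above) friction weakens with increasing slip rate, the condition behind the accelerating nucleation slip described in the abstract.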

Relevance: 30.00%

Publisher:

Abstract:

Clinical evaluation of arterial patency in acute ST-elevation myocardial infarction (STEMI) is unreliable. We sought to identify infarction and predict infarct-related artery patency measured by the Thrombolysis In Myocardial Infarction (TIMI) flow grade with qualitative and quantitative intravenous myocardial contrast echocardiography (MCE). Thirty-four patients with suspected STEMI underwent MCE before emergency angiography and planned angioplasty. MCE was performed with harmonic imaging and variable triggering intervals during intravenous administration of Optison. Myocardial perfusion was quantified offline, fitting an exponential function to contrast intensity at various pulsing intervals. Plateau myocardial contrast intensity (A), rate of rise (beta), and myocardial flow (Q = A x beta) were assessed in 6 segments. Qualitative assessment of perfusion defects was sensitive for the diagnosis of infarction (sensitivity 93%) and did not differ between anterior and inferior infarctions. However, qualitative assessment had only moderate specificity (50%), and perfusion defects were unrelated to TIMI flow. In patients with STEMI, quantitatively derived myocardial blood flow Q (A x beta) was significantly lower in territories subtended by an artery with impaired (TIMI 0 to 2) flow than in territories supplied by a reperfused artery with TIMI 3 flow (10.2 +/- 9.1 vs 44.3 +/- 50.4, p = 0.03). Quantitative flow was also lower in segments with impaired flow in the subtending artery compared with normal patients with TIMI 3 flow (42.8 +/- 36.6, p = 0.006) and all segments with TIMI 3 flow (35.3 +/- 32.9, p = 0.018). A receiver-operating characteristic curve derived cut-off Q value of
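The quantification step fits the standard contrast-replenishment curve y(t) = A(1 − e^(−βt)) to intensity at each pulsing interval and reports flow as Q = A × β. A crude grid-search sketch with synthetic data (the data values and search grids are invented, and clinical software would use a proper nonlinear least-squares fit):

```python
import numpy as np

def replenishment(t, A, beta):
    """Myocardial contrast intensity t seconds after bubble destruction."""
    return A * (1.0 - np.exp(-beta * t))

def fit_flow(t, y):
    """Grid-search least squares for (A, beta); returns A, beta and Q = A*beta."""
    best = (np.inf, None, None)
    for A in np.linspace(y.max(), 2.0 * y.max(), 60):
        for beta in np.linspace(0.05, 5.0, 100):
            err = float(np.sum((replenishment(t, A, beta) - y) ** 2))
            if err < best[0]:
                best = (err, A, beta)
    _, A, beta = best
    return A, beta, A * beta
```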

Relevance: 30.00%

Publisher:

Abstract:

We present a novel method for predicting the onset of a spontaneous (paroxysmal) atrial fibrillation episode by representing the electrocardiogram (ECG) output as two time series corresponding to the interbeat intervals and the lengths of the atrial component of the ECG. We then show how different entropy measures can be calculated from both of these series and combined in a neural network, trained using the Bayesian evidence procedure, to form an effective predictive classifier.
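One entropy measure of the kind computable from an interbeat-interval series is sample entropy; a compact version is sketched below (the paper combines several such measures in a Bayesian-trained network, which is not reproduced here):

```python
import numpy as np

def sample_entropy(series, m=2, r_frac=0.2):
    """SampEn: -log of the ratio of (m+1)-length to m-length template matches
    within tolerance r; low values indicate a regular, predictable series."""
    x = np.asarray(series, float)
    r = r_frac * np.std(x)
    def matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templ)):
            dist = np.max(np.abs(templ - templ[i]), axis=1)
            count += int(np.sum(dist <= r)) - 1   # drop the self-match
        return count
    B, A = matches(m), matches(m + 1)
    return float("inf") if A == 0 or B == 0 else -np.log(A / B)
```

A strictly periodic interbeat series scores near zero, while an irregular series (as preceding fibrillation onset) scores higher.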

Relevance: 30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62E16, 65C05, 65C20.