67 results for prior probabilities
Abstract:
PURPOSE: To evaluate consecutive treatment results regarding pterygium recurrence and the efficacy of exclusive strontium-/yttrium-90 beta-irradiation for primary and recurrent pterygia, and to analyze the functional outcome. PATIENTS AND METHODS: Between October 1974 and December 2005, 58 primary and 21 recurrent pterygia were treated exclusively with strontium-/yttrium-90 beta-irradiation at doses ranging from 3,600 to 5,500 cGy. The follow-up time was 46.6 ± 26.7 months, with a median of 46.5 months. RESULTS: The treatment led to a size reduction in all pterygia (p < 0.0001). Neither recurrences nor side effects were observed during therapy and follow-up in this study. Best-corrected visual acuity increased (p = 0.0064). Corneal astigmatism was reduced in recurrent pterygia (p = 0.009). CONCLUSION: Exclusive strontium-/yttrium-90 beta-irradiation of pterygia is a highly effective and well-tolerated treatment, with remarkable aesthetic and rehabilitative results compared to conventional treatments, especially for recurrent lesions that have undergone prior surgical excision.
A global historical ozone data set and prominent features of stratospheric variability prior to 1979
Abstract:
We present a vertically resolved, zonal mean, monthly mean global ozone data set spanning the period 1901 to 2007, called HISTOZ.1.0. It is based on a new approach that combines information from an ensemble of chemistry climate model (CCM) simulations with historical total column ozone information. The CCM simulations incorporate important external drivers of stratospheric chemistry and dynamics (in particular solar and volcanic effects, greenhouse gases and ozone depleting substances, sea surface temperatures, and the quasi-biennial oscillation). The historical total column ozone observations include ground-based measurements from the 1920s onward and satellite observations from 1970 to 1976. An off-line data assimilation approach is used to combine the model simulations, the observations, and information on the observation error. The period from 1979 onward was used for validation against existing ozone data sets, and therefore only ground-based measurements were assimilated. Results demonstrate considerable skill from the CCM simulations alone. Assimilating observations provides additional skill for total column ozone. With respect to the vertical ozone distribution, assimilating observations on average increases the correlation with a reference data set, but does not decrease the mean squared error. Analyses of HISTOZ.1.0 with respect to the effects of the El Niño–Southern Oscillation (ENSO) and of the 11 yr solar cycle on stratospheric ozone from 1934 to 1979 qualitatively confirm previous studies that focused on the post-1979 period. The ENSO signature exhibits a much clearer imprint of a change in the strength of the Brewer–Dobson circulation than in the post-1979 period. The imprint of the 11 yr solar cycle is slightly weaker in the earlier period. Furthermore, the total column ozone increase from the 1950s to around 1970 at northern mid-latitudes is briefly discussed. Indications of contributions from a tropospheric ozone increase, greenhouse gases, and changes in atmospheric circulation are found. Finally, the paper points to several possible future improvements of HISTOZ.1.0.
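The off-line assimilation step described above amounts to an error-variance-weighted blend of a model background with an observation. Below is a minimal optimal-interpolation sketch in Python, assuming a scalar total-column-ozone state; the function name and all numbers are illustrative assumptions, not values from HISTOZ.1.0.

```python
# Minimal scalar optimal-interpolation (OI) analysis step, as a sketch of
# the kind of off-line assimilation described above. All names and numbers
# are illustrative assumptions, not taken from HISTOZ.1.0.

def oi_update(background, obs, b_var, r_var):
    """Blend a model background with an observation, weighted by their
    error variances; returns the analysis and its error variance."""
    gain = b_var / (b_var + r_var)        # Kalman/OI gain
    analysis = background + gain * (obs - background)
    a_var = (1.0 - gain) * b_var          # analysis error variance
    return analysis, a_var

# Example: CCM-ensemble total column ozone (DU) vs. a ground-based
# measurement, each with an assumed error variance.
analysis, a_var = oi_update(background=300.0, obs=312.0, b_var=64.0, r_var=36.0)
print(f"analysis = {analysis:.1f} DU, error variance = {a_var:.1f}")
```

Because the gain grows with the background error variance, the analysis is drawn toward whichever of model and observation is more trustworthy; with no observations (gain 0) the analysis falls back on the CCM ensemble alone.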
Abstract:
Exposure to urinary catheters is considered the most important risk factor for healthcare-associated urinary tract infection (UTI) and is associated with significant morbidity and substantial extra costs. In this study, we assessed the impact of urinary catheterisation (UC) on symptomatic healthcare-associated UTI among hospitalized patients.
Abstract:
Information theory-based metrics such as mutual information (MI) are widely used as similarity measures for multimodal registration. Nevertheless, such metrics may lead to matching ambiguity in non-rigid registration. Moreover, maximization of MI alone does not necessarily produce an optimal solution. In this paper, we propose a segmentation-assisted similarity metric based on point-wise mutual information (PMI). This similarity metric, termed SPMI, enhances registration accuracy by considering tissue classification probabilities as prior information, generated by an expectation maximization (EM) algorithm. Diffeomorphic demons is then adopted as the registration model and is optimized in a hierarchical framework (H-SPMI) that uses different levels of anatomical structure as prior knowledge. The proposed method is evaluated using BrainWeb synthetic data and clinical fMRI images. Both qualitative and quantitative assessments were performed, as well as a sensitivity analysis with respect to segmentation error. Compared to pure intensity-based approaches that only maximize mutual information, we show that the proposed algorithm provides significantly better accuracy on both synthetic and clinical data.
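The basic ingredient of SPMI is the point-wise mutual information of each voxel's intensity pair. Here is a minimal sketch of plain PMI used as a similarity score, assuming simple uniform histogram binning; the published metric additionally weights PMI with the EM-derived tissue-class probabilities, which this sketch omits, and `pmi_similarity` is a hypothetical helper name.

```python
import numpy as np

def pmi_similarity(fixed, moving, bins=32):
    """Sum of PMI(f(x), m(x)) over voxels; larger means better aligned.
    Plain PMI only -- no tissue-class weighting as in SPMI."""
    # Quantize intensities into bin indices 0..bins-1.
    f = np.clip(((fixed - fixed.min()) / np.ptp(fixed) * bins).astype(int),
                0, bins - 1).ravel()
    m = np.clip(((moving - moving.min()) / np.ptp(moving) * bins).astype(int),
                0, bins - 1).ravel()
    joint = np.zeros((bins, bins))
    np.add.at(joint, (f, m), 1)                  # joint intensity histogram
    p_joint = joint / joint.sum()
    p_f = p_joint.sum(axis=1, keepdims=True)     # marginal of fixed image
    p_m = p_joint.sum(axis=0, keepdims=True)     # marginal of moving image
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(p_joint / (p_f * p_m))      # PMI per intensity-bin pair
    pmi[~np.isfinite(pmi)] = 0.0                 # empty bins contribute zero
    # Look up each voxel's PMI and sum, giving a point-wise similarity score.
    return pmi[f, m].sum()

# Example: an image should score higher against itself than against noise.
rng = np.random.default_rng(0)
a = rng.normal(size=(64, 64))
print(pmi_similarity(a, a), pmi_similarity(a, rng.normal(size=(64, 64))))
```

Unlike global MI, the per-voxel lookup makes the contribution of each location explicit, which is what lets a segmentation prior reweight individual voxels in the full SPMI formulation.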
Abstract:
A microbiopsy system for fast excision and transfer of biological specimens from donor to high-pressure freezer was developed. With a modified, commercially available Promag 1.2 biopsy gun, tissue samples can be excised at a size small enough (0.6 mm × 1.2 mm × 0.3 mm) to be easily transferred into a newly designed specimen platelet. A custom-built transfer unit allows fast transfer of the specimen from the needle into the specimen platelet. The platelet is then fixed in a commercially available specimen holder of a high-pressure freezing machine (EM PACT, Leica Microsystems, Vienna, Austria) and frozen therein. The time required by a well-instructed (but not experienced) person to execute all steps is about half a minute. This period is considered short enough to maintain the excised tissue pieces close to their native state. We show that a range of animal tissues (liver, brain, kidney and muscle) are well preserved. To demonstrate the quality of freezing achieved with the system, we show vitrified ivy leaves high-pressure frozen in the new specimen platelet.
Abstract:
Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies, viz. a globalist one, as suggested by Loewer, and a localist one, as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, whereas ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.
Abstract:
The talk starts out with a short introduction to the philosophy of probability. I highlight the need to interpret probabilities in the sciences and motivate objectivist accounts of probabilities. Very roughly, according to such accounts, ascriptions of probabilities have truth-conditions that are independent of personal interests and needs. But objectivist accounts are pointless if they do not provide an objectivist epistemology, i.e., if they do not determine well-defined methods to support or falsify claims about probabilities. In the rest of the talk I examine recent philosophical proposals for an objectivist methodology. Most of them take up ideas well-known from statistics. I nevertheless find some proposals incompatible with objectivist aspirations.
Abstract:
How do probabilistic models represent their targets and how do they allow us to learn about them? The answer to this question depends on a number of details, in particular on the meaning of the probabilities involved. To classify the options, a minimalist conception of representation (Suárez 2004) is adopted: Modelers devise substitutes ("sources") of their targets and investigate them to infer something about the target. Probabilistic models allow us to infer probabilities about the target from probabilities about the source. This leads to a framework in which we can systematically distinguish between different models of probabilistic modeling. I develop a fully Bayesian view of probabilistic modeling, but I argue that, as an alternative, Bayesian degrees of belief about the target may be derived from ontic probabilities about the source. Remarkably, some accounts of ontic probabilities can avoid problems if they are supposed to apply to sources only.
Abstract:
When tilted sideways, participants misperceive the visual vertical, assessed by means of a luminous line in otherwise complete darkness. A recent modeling approach (De Vrijer et al., 2009) claimed that these typical patterns of errors (known as A- and E-effects) could be explained by assuming that participants behave in a Bayes-optimal manner. In this study, we experimentally manipulate participants' prior information about body-in-space orientation and measure the effect of this manipulation on the subjective visual vertical (SVV). Specifically, we explore the effects of veridical and misleading instructions about body tilt orientations on the SVV. We used a psychophysical 2AFC SVV task at roll tilt angles of 0 degrees, 16 degrees, and 4 degrees CW and CCW. Participants were tilted to 4 degrees under different instruction conditions: in one condition, participants received veridical instructions as to their tilt angle, whereas in another condition, participants received the misleading instruction that their body position was perfectly upright. Our results indicate systematic differences between the instruction conditions at 4 degrees CW and CCW. Participants did not simply use an ego-centric reference frame in the misleading condition; instead, participants' estimates of the SVV seem to lie between their head's Z-axis and the estimate of the SVV as measured in the veridical condition. All participants displayed A-effects at roll tilt angles of 16 degrees CW and CCW. We discuss our results in the context of the Bayesian model by De Vrijer et al. (2009), and claim that this pattern of results is consistent with a manipulation of the precision of a prior distribution over body-in-space orientations. Furthermore, we introduce a Bayesian generalized linear model for estimating the parameters of participants' psychometric functions, which allows us to jointly estimate group-level and individual-level parameters under all experimental conditions simultaneously, rather than relying on the traditional two-step approach to obtaining group-level parameter estimates.
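The Bayesian account being tested here reduces, in its simplest form, to Gaussian fusion: a noisy tilt signal is shrunk toward a prior centered on upright, and a tighter prior (as a misleading "you are upright" instruction plausibly induces) shrinks the estimate more, producing a larger A-effect. A minimal sketch of this idea follows; the variances are illustrative assumptions, not fitted model parameters.

```python
# Minimal Gaussian sketch of the Bayesian tilt-estimation idea in
# De Vrijer et al. (2009): a noisy body-tilt signal is fused with a
# prior that the body is upright. All variances below are illustrative.

def posterior_tilt(true_tilt, sigma_signal, sigma_prior):
    """Posterior-mean tilt estimate; prior is N(0, sigma_prior**2)."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_signal**2)
    return w * true_tilt            # shrinkage toward upright (0 deg)

# Wide prior (veridical instruction) vs. tightened prior (misleading
# "perfectly upright" instruction), at a 16-degree roll tilt.
for sigma_prior in (20.0, 5.0):
    est = posterior_tilt(true_tilt=16.0, sigma_signal=8.0, sigma_prior=sigma_prior)
    print(f"sigma_prior = {sigma_prior:4.0f} deg: "
          f"estimated tilt {est:4.1f} deg, A-effect {16.0 - est:4.1f} deg")
```

On these illustrative numbers, tightening the prior from 20 to 5 degrees moves the tilt estimate from about 13.8 to about 4.5 degrees, i.e., toward the head's Z-axis, which is the qualitative signature the instruction manipulation is designed to reveal.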
Abstract:
OBJECTIVES This study sought to determine the effect of rotational atherectomy (RA) on drug-eluting stent (DES) effectiveness. BACKGROUND DES are frequently used in complex lesions, including calcified stenoses, which may challenge DES delivery, expansion, and effectiveness. RA can adequately modify calcified plaques and facilitate stent delivery and expansion. Its impact on DES effectiveness is widely unknown. METHODS The ROTAXUS (Rotational Atherectomy Prior to TAXUS Stent Treatment for Complex Native Coronary Artery Disease) study randomly assigned 240 patients with complex calcified native coronary lesions to RA followed by stenting (n = 120) or stenting without RA (n = 120, standard therapy group). Stenting was performed using a polymer-based slow-release paclitaxel-eluting stent. The primary endpoint was in-stent late lumen loss at 9 months. Secondary endpoints included angiographic and strategy success, binary restenosis, definite stent thrombosis, and major adverse cardiac events at 9 months. RESULTS Despite similar baseline characteristics, significantly more patients in the standard therapy group were crossed over (12.5% vs. 4.2%, p = 0.02), resulting in higher strategy success in the rotablation group (92.5% vs. 83.3%, p = 0.03). At 9 months, in-stent late lumen loss was higher in the rotablation group (0.44 ± 0.58 mm vs. 0.31 ± 0.52 mm, p = 0.04), despite an initially higher acute lumen gain (1.56 ± 0.43 mm vs. 1.44 ± 0.49 mm, p = 0.01). In-stent binary restenosis (11.4% vs. 10.6%, p = 0.71), target lesion revascularization (11.7% vs. 12.5%, p = 0.84), definite stent thrombosis (0.8% vs. 0%, p = 1.0), and major adverse cardiac events (24.2% vs. 28.3%, p = 0.46) were similar in both groups. CONCLUSIONS Routine lesion preparation using RA did not reduce late lumen loss of DES at 9 months. Balloon dilation with only provisional rotablation remains the default strategy for complex calcified lesions before DES implantation.
Abstract:
BACKGROUND Patients with prior coronary artery bypass graft surgery (CABG) who present with an acute coronary syndrome have a high risk for recurrent events. Whether intensive antiplatelet therapy with ticagrelor might be beneficial compared with clopidogrel is unknown. In this substudy of the PLATO trial, we studied the effects of randomized treatment dependent on history of CABG. METHODS Patients participating in PLATO were classified according to whether they had undergone prior CABG. The trial's primary and secondary end points were compared using Cox proportional hazards regression. RESULTS Of the 18,613 study patients, 1,133 (6.1%) had prior CABG. Prior-CABG patients had more high-risk characteristics at study entry and a 2-fold increase in clinical events during follow-up, but less major bleeding. The primary end point (composite of cardiovascular death, myocardial infarction, and stroke) was reduced to a similar extent by ticagrelor among patients with (19.6% vs 21.4%; adjusted hazard ratio [HR], 0.91 [0.67, 1.24]) and without (9.2% vs 11.0%; adjusted HR, 0.86 [0.77, 0.96]; P(interaction) = .73) prior CABG. Major bleeding was similar with ticagrelor versus clopidogrel among patients with (8.1% vs 8.7%; adjusted HR, 0.89 [0.55, 1.47]) and without (11.8% vs 11.4%; HR, 1.08 [0.98, 1.20]; P(interaction) = .46) prior CABG. CONCLUSIONS Prior-CABG patients presenting with acute coronary syndrome are a high-risk cohort for death and recurrent cardiovascular events but have a lower risk for major bleeding. Similar to the results in no-prior-CABG patients, ticagrelor was associated with a reduction in ischemic events without an increase in major bleeding.
Abstract:
This article proposes computing sensitivities of upper tail probabilities of random sums by the saddlepoint approximation. The sensitivity considered is the derivative of the upper tail probability with respect to the parameter of the summation-index distribution. Random sums with Poisson- or geometric-distributed summation indices and gamma- or Weibull-distributed summands are considered. The score method with importance sampling is considered as an alternative approximation. Numerical studies show that both the saddlepoint approximation and the score method with importance sampling are very accurate, but the saddlepoint approximation is substantially faster. The suggested saddlepoint approximation can therefore be used conveniently in various scientific problems.
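As a concrete instance of the underlying technique, here is a sketch of the Lugannani-Rice saddlepoint approximation to P(S > x) for one of the compound distributions named above (Poisson-distributed count, gamma-distributed summands). The parameter values and root-bracketing interval are illustrative assumptions, and the paper's actual sensitivity computation (the derivative with respect to the Poisson parameter) is not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

# Saddlepoint (Lugannani-Rice) tail approximation for a random sum
# S = X_1 + ... + X_N with N ~ Poisson(lam) and X_i ~ Gamma(alpha, rate beta).
# The cumulant generating function of S is K(s) = lam * (M_X(s) - 1),
# with M_X(s) = (1 - s/beta)^(-alpha) defined for s < beta.

def K(s, lam, alpha, beta):
    return lam * ((1.0 - s / beta) ** (-alpha) - 1.0)

def K1(s, lam, alpha, beta):       # first derivative K'(s)
    return lam * alpha / beta * (1.0 - s / beta) ** (-alpha - 1.0)

def K2(s, lam, alpha, beta):       # second derivative K''(s)
    return lam * alpha * (alpha + 1.0) / beta**2 * (1.0 - s / beta) ** (-alpha - 2.0)

def tail_prob(x, lam, alpha, beta):
    """Lugannani-Rice approximation of P(S > x), for x away from E[S]."""
    # Solve the saddlepoint equation K'(s_hat) = x on s < beta; the
    # bracket below is an illustrative choice, not a general guarantee.
    s_hat = brentq(lambda s: K1(s, lam, alpha, beta) - x,
                   -50.0, beta * (1.0 - 1e-9))
    w = np.sign(s_hat) * np.sqrt(2.0 * (s_hat * x - K(s_hat, lam, alpha, beta)))
    u = s_hat * np.sqrt(K2(s_hat, lam, alpha, beta))
    return 1.0 - norm.cdf(w) + norm.pdf(w) * (1.0 / u - 1.0 / w)

# Example: E[S] = lam * alpha / beta = 10; approximate the tail beyond 20.
print(tail_prob(x=20.0, lam=5.0, alpha=2.0, beta=1.0))
```

A sensitivity of the kind the article studies could then be approximated by differencing `tail_prob` in `lam`, although the article's analytic saddlepoint derivative would be both faster and more accurate than such a finite-difference stand-in.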