929 results for "weak informative prior"


Relevance: 20.00%

Abstract:

In this paper, we study panel count data with informative observation times. We assume nonparametric and semiparametric proportional rate models for the underlying recurrent event process, where the form of the baseline rate function is left unspecified and a subject-specific frailty variable inflates or deflates the rate function multiplicatively. The proposed models allow the recurrent event processes and observation times to be correlated through their connections with the unobserved frailty; moreover, the distributions of both the frailty variable and observation times are considered as nuisance parameters. The baseline rate function and the regression parameters are estimated by maximizing a conditional likelihood function of observed event counts and solving estimation equations. Large sample properties of the proposed estimators are studied. Numerical studies demonstrate that the proposed estimation procedures perform well for moderate sample sizes. An application to a bladder tumor study is presented to illustrate the use of the proposed methods.
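
A minimal sketch of this kind of frailty-based proportional rate model, written in LaTeX (the exact specification in the paper may differ; the covariate vector Z_i, frailty gamma_i and baseline rate lambda_0 are generic placeholders):

    \mathrm{E}\bigl[\,dN_i(t)\mid Z_i,\gamma_i\,\bigr]
      = \gamma_i\,\lambda_0(t)\,\exp\bigl(\beta^{\top} Z_i\bigr)\,dt ,
    \qquad
    \Lambda_0(t) = \int_0^t \lambda_0(s)\,ds ,

where the subject-specific frailty gamma_i multiplicatively inflates or deflates the rate and induces the correlation between the recurrent event process N_i and the observation times, while its distribution, like that of the observation times, is treated as a nuisance parameter.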

Relevance: 20.00%

Abstract:

A recent article in this journal (Ioannidis JP (2005) Why most published research findings are false. PLoS Med 2: e124) argued that more than half of published research findings in the medical literature are false. In this commentary, we examine the structure of that argument and show that it has three basic components: 1) an assumption that the prior probability of most hypotheses explored in medical research is below 50%; 2) dichotomization of P-values at the 0.05 level and introduction of a “bias” factor (produced by significance-seeking), the combination of which severely weakens the evidence provided by every design; and 3) use of Bayes' theorem to show that, in the face of weak evidence, hypotheses with low prior probabilities cannot have posterior probabilities over 50%. Thus, the claim rests on a priori assumptions that most tested hypotheses are likely to be false, and the inferential model used then makes it impossible for evidence from any study to overcome this handicap. We focus largely on component 2), explaining how the combination of dichotomization and “bias” dilutes experimental evidence, and showing how this dilution leads inevitably to the stated conclusion. We also demonstrate a fallacy in another important component of the argument: that papers in “hot” fields are more likely to produce false findings. We agree with the paper's conclusions and recommendations that many medical research findings are less definitive than readers suspect, that P-values are widely misinterpreted, that bias of various forms is widespread, that multiple approaches are needed to prevent the literature from being systematically biased, and that more data are needed on the prevalence of false claims. But calculating the unreliability of the medical research literature, in whole or in part, requires more empirical evidence and different inferential models than were used. The claim that “most research findings are false for most research designs and for most fields” must be considered as yet unproven.
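
As a rough illustration of component 3), a plain Bayes'-theorem calculation of the posterior probability of a hypothesis after a nominally significant result (a generic textbook sketch, not the exact formula with the bias factor used in the original article; the prior, alpha and power values are hypothetical):

    # Posterior probability that a hypothesis is true given a "significant" result,
    # via Bayes' theorem with results dichotomized at P < alpha.
    def posterior_given_significant(prior, alpha=0.05, power=0.80):
        return power * prior / (power * prior + alpha * (1.0 - prior))

    for prior in (0.01, 0.10, 0.50):
        print(f"prior={prior:.2f} -> posterior={posterior_given_significant(prior):.2f}")
    # With a low prior (e.g. 0.01), even a significant result leaves the posterior far
    # below 50%; further diluting the evidence (lower power, an added "bias" factor)
    # pushes it lower still, which is the mechanism the commentary examines.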

Relevance: 20.00%

Abstract:

Numerous time series studies have provided strong evidence of an association between increased levels of ambient air pollution and increased levels of hospital admissions, typically at 0, 1, or 2 days after an air pollution episode. An important research aim is to extend existing statistical models so that a more detailed understanding of the time course of hospitalization after exposure to air pollution can be obtained. Information about this time course, combined with prior knowledge about biological mechanisms, could provide the basis for hypotheses concerning the mechanism by which air pollution causes disease. Previous studies have identified two important methodological questions: (1) How can we estimate the shape of the distributed lag between increased air pollution exposure and increased mortality or morbidity? and (2) How should we estimate the cumulative population health risk from short-term exposure to air pollution? Distributed lag models are appropriate tools for estimating air pollution health effects that may be spread over several days. However, estimation for distributed lag models in air pollution and health applications is hampered by the substantial noise in the data and the inherently weak signal that is the target of investigation. We introduce an hierarchical Bayesian distributed lag model that incorporates prior information about the time course of pollution effects and combines information across multiple locations. The model has a connection to penalized spline smoothing using a special type of penalty matrix. We apply the model to estimating the distributed lag between exposure to particulate matter air pollution and hospitalization for cardiovascular and respiratory disease using data from a large United States air pollution and hospitalization database of Medicare enrollees in 94 counties covering the years 1999-2002.
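
A minimal sketch of the underlying idea of smoothing distributed lag coefficients with a roughness penalty (simple penalized least squares on simulated data, not the hierarchical Bayesian multi-county model of the paper; the lag length, penalty weight and simulated series are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n, L = 500, 14                                        # days of data, maximum lag
    x = rng.poisson(30, n).astype(float)                  # simulated daily pollution series
    true_theta = 0.05 * np.exp(-np.arange(L + 1) / 3.0)   # smoothly decaying lag effects
    X = np.column_stack([np.roll(x, k) for k in range(L + 1)])[L:]   # lagged design matrix
    y = X @ true_theta + rng.normal(0.0, 1.0, n - L)      # simulated outcome series

    # Second-difference penalty: discourages rough (wiggly) lag curves, in the same
    # spirit as penalized spline smoothing with a special penalty matrix.
    D = np.diff(np.eye(L + 1), n=2, axis=0)
    lam = 10.0                                            # penalty weight (would be tuned)
    theta_hat = np.linalg.solve(X.T @ X + lam * D.T @ D, X.T @ y)

    print("estimated cumulative effect:", round(float(theta_hat.sum()), 3))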

Relevance: 20.00%

Abstract:

The radiation dose delivered by the SCANORA radiography unit in cross-sectional mode for dentotangential projections was determined. With regard to oral implantology, the patient situations of an edentulous maxilla and mandible, as well as a single tooth gap in regions 16 and 46, were simulated. Radiation doses to organs and tissues in the head and neck region ranged from 0.2 to 22.5 mGy when the complete maxilla or mandible was examined. When a single tooth gap was examined, generally only 8% to 40% of that dose was observed. Based on these results, the mortality risk was estimated according to a calculation model recommended by the Committee on the Biological Effects of Ionizing Radiations (BEIR). For cross-sectional imaging of the complete maxilla, the estimated mortality risk ranged from 31.4 x 10^-6 for 20-year-old men to 4.8 x 10^-6 for 65-year-old women. The values decreased by 70% when a single tooth gap in the molar region of the maxilla was radiographed. The mortality risk figures for examinations of the complete mandible were similar to those for the complete maxilla, but decreased by 80% when only a single tooth gap in the molar region of the mandible was examined. Comparative calculations according to the International Commission on Radiological Protection did not show the decrease in mortality risk with age and yielded a higher risk value than the BEIR-based calculation for the group of 35-year-old individuals.

Relevance: 20.00%

Abstract:

According to Bell's theorem a large class of hidden-variable models obeying Bell's notion of local causality (LC) conflict with the predictions of quantum mechanics. Recently, a Bell-type theorem has been proven using a weaker notion of LC, yet assuming the existence of perfectly correlated event types. Here we present a similar Bell-type theorem without this latter assumption. The derived inequality differs from the Clauser-Horne inequality by some small correction terms, which render it less constraining.
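
For reference, the standard Clauser-Horne inequality that the derived bound is compared with (the small correction terms mentioned above are not given in the abstract, so only the familiar form is reproduced here):

    -1 \;\le\; p(A_1 B_1) + p(A_1 B_2) + p(A_2 B_1) - p(A_2 B_2) - p(A_1) - p(B_1) \;\le\; 0

where p(A_i B_j) is the joint detection probability for measurement settings i and j on the two wings, and p(A_1), p(B_1) are the corresponding single-wing detection probabilities.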

Relevance: 20.00%

Abstract:

PURPOSE: To retrospectively assess whether CAD systems can detect breast cancer on prior mammograms in which the lesion was not detected by the radiologist. METHODS: All patients from 1992 to 2005 with a histologically verified malignant breast lesion and a mammogram at our department were analyzed retrospectively, focusing on the time of detection of the malignant lesion. All prior mammograms were analyzed by CAD (CADx, USA). The resulting CAD printout was matched with the cancer-containing images that led to the radiological diagnosis of breast cancer. CAD performance and sensitivity, as well as the association between CAD and radiological features, were analyzed. RESULTS: 278 mammograms fulfilled the inclusion criteria. 111 cases showed a retrospectively visible lesion (71 masses, 23 single microcalcification clusters, 16 masses with microcalcifications, and in one case two microcalcification clusters). CAD detected 54/87 masses and 34/41 microcalcification clusters. Detection rates varied from 9/20 (ACR 1) to 5/7 (ACR 4) (45% vs. 71%). The detection of microcalcifications was not influenced by breast tissue density. CONCLUSION: CAD might be useful for earlier detection of subtle breast cancers that might otherwise remain undetected.

Relevance: 20.00%

Abstract:

BACKGROUND: We investigated clinical predictors of appropriate prophylaxis prior to the onset of venous thromboembolism (VTE). METHODS: In 14 Swiss hospitals, 567 consecutive patients (306 medical, 261 surgical) with acute VTE and hospitalization < 30 days prior to the VTE event were enrolled. RESULTS: Prophylaxis was used in 329 (58%) patients within 30 days prior to the VTE event. Among the medical patients, 146 (48%) received prophylaxis, and among the surgical patients, 183 (70%) received prophylaxis (P < 0.001). The indication for prophylaxis was present in 262 (86%) medical patients and in 217 (83%) surgical patients. Among the patients with an indication for prophylaxis, 135 (52%) of the medical patients and 165 (76%) of the surgical patients received prophylaxis (P < 0.001). Admission to the intensive care unit [odds ratio (OR) 3.28, 95% confidence interval (CI) 1.94-5.57], recent surgery (OR 2.28, 95% CI 1.51-3.44), bed rest > 3 days (OR 2.12, 95% CI 1.45-3.09), obesity (OR 2.01, 95% CI 1.03-3.90), prior deep vein thrombosis (OR 1.71, 95% CI 1.31-2.24) and prior pulmonary embolism (OR 1.54, 95% CI 1.05-2.26) were independent predictors of prophylaxis. In contrast, cancer (OR 1.06, 95% CI 0.89-1.25), age (OR 0.99, 95% CI 0.98-1.01), acute heart failure (OR 1.13, 95% CI 0.79-1.63) and acute respiratory failure (OR 1.19, 95% CI 0.89-1.59) were not predictive of prophylaxis. CONCLUSIONS: Although an indication for prophylaxis was present in most patients who suffered acute VTE, almost half did not receive any form of prophylaxis. Future efforts should focus on the improvement of prophylaxis for hospitalized patients, particularly in patients with cancer, acute heart or respiratory failure, and in the elderly.

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: There are only limited data on whether prior statin use and/or cholesterol levels are associated with intracranial hemorrhage (ICH) and outcome after intra-arterial thrombolysis. The purpose of this study was to evaluate the association of statin pretreatment and cholesterol levels with the overall frequency of ICH, the frequency of symptomatic ICH, and clinical outcome at 3 months. METHODS: We analyzed 311 consecutive patients (mean age, 63 years; 43% women) who received intra-arterial thrombolysis. RESULTS: Statin pretreatment was present in 18%. The frequency of any ICH was 20.6% and of symptomatic ICH 4.8%. Patients with any ICH were more often taking statins (30% versus 15%, P=0.005), more often had atrial fibrillation (45% versus 30%, P=0.016), had more severe strokes (mean National Institutes of Health Stroke Scale score 16.5 versus 14.7, P=0.022), and less often had good collaterals (16% versus 24%, P=0.001). Patients with symptomatic ICH were more often taking statins (40% versus 15%, P=0.009) and less often had good collaterals (0% versus 24%, P<0.001). Neither any ICH nor symptomatic ICH was associated with cholesterol levels. After multivariate analysis, the frequency of any ICH remained independently associated with previous statin use (OR, 3.1; 95% CI, 1.53 to 6.39; P=0.004), atrial fibrillation (OR, 2.5; CI, 1.35 to 4.75; P=0.004), National Institutes of Health Stroke Scale score (OR, 1.1; CI, 1.00 to 1.10; P=0.037), and worse collaterals (OR, 1.7; CI, 1.19 to 2.42; P=0.004). There was no association of outcome with prior statin use, total cholesterol level, or low-density lipoprotein cholesterol level. CONCLUSIONS: Prior statin use, but not cholesterol level on admission, is associated with a higher frequency of any ICH after intra-arterial thrombolysis, without an impact on outcome.

Relevance: 20.00%

Abstract:

Increasing survival rates in young cancer patients, new reproductive techniques and the growing interest in quality of life after gonadotoxic cancer therapies have made fertility preservation an important issue for oncologists, fertility specialists and patients. Several techniques are now available for fertility preservation in these patients. A promising new method is the cryopreservation and transplantation of ovarian cortex. Ovarian tissue can be extracted by laparoscopy without significantly delaying gonadotoxic therapy. The tissue can be cryopreserved by specialised centres of reproductive medicine and transplanted back if the woman experiences premature ovarian failure (POF). This review summarises the European expertise on cryopreservation and transplantation of ovarian tissue: around 30 transplantations have been reported worldwide, resulting in six live births and several ongoing pregnancies. It emphasises that fertility preservation by the cryopreservation of ovarian tissue is a new but already successful clinical option that can be considered for selected cancer patients.

Relevance: 20.00%

Abstract:

In a statistical inference scenario, a target signal or its parameters are estimated by processing data from informative measurements. Estimation performance can be enhanced by choosing measurements according to criteria that direct the sensing resources toward measurements that are more informative about the parameter to be estimated. When multiple measurements are taken, they can be chosen online so that more information is extracted from the data in each measurement step. This approach fits naturally within the Bayesian inference framework, which produces successive posterior distributions of the parameter of interest. We explore the sensor array processing scenario for adaptive sensing of a target parameter. The measurement choice is described by a measurement matrix that multiplies the data vector normally associated with array signal processing. Adaptive sensing of both static and dynamic system models is carried out by online selection of a suitable measurement matrix over time. For the dynamic system model, the target is assumed to move according to some distribution, so the prior distribution changes at each time step, and the information gained through adaptive sensing of the moving target is lost due to the relative shift of the target. The adaptive sensing paradigm has many similarities with compressive sensing. We attempt to reconcile the two approaches by modifying the observation model of adaptive sensing to match the compressive sensing model for the estimation of a sparse vector.
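
A minimal sketch of one version of such online measurement selection for a linear-Gaussian model (a generic illustration of the idea rather than the array-processing or compressive-sensing models discussed above; the dimensions, noise level and candidate measurement vectors are invented):

    import numpy as np

    rng = np.random.default_rng(1)
    d, sigma2 = 5, 0.1                        # parameter dimension, measurement noise variance
    x_true = rng.normal(size=d)               # unknown target parameter
    mu, Sigma = np.zeros(d), np.eye(d)        # Gaussian prior, updated online
    candidates = [rng.normal(size=d) for _ in range(20)]   # candidate measurement vectors

    for step in range(10):
        # Choose the measurement that minimizes the posterior log-volume
        # (i.e. maximizes the expected information gain about the parameter).
        def posterior_logdet(a):
            S = np.linalg.inv(np.linalg.inv(Sigma) + np.outer(a, a) / sigma2)
            return np.linalg.slogdet(S)[1]
        a = min(candidates, key=posterior_logdet)

        y = a @ x_true + rng.normal(scale=np.sqrt(sigma2))   # take the chosen measurement
        # Standard Gaussian posterior update for a scalar linear observation.
        K = Sigma @ a / (a @ Sigma @ a + sigma2)
        mu = mu + K * (y - a @ mu)
        Sigma = Sigma - np.outer(K, a) @ Sigma

    print("estimation error:", float(np.linalg.norm(mu - x_true)))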