109 results for Competing-risk analyses
Abstract:
To assess the prevalence of tooth wear on buccal/facial and lingual/palatal tooth surfaces and identify related risk factors in a sample of young European adults aged 18-35 years. Calibrated and trained examiners measured tooth wear, using the Basic Erosive Wear Examination (BEWE), in 3187 patients in seven European countries and assessed the impact of risk factors with a previously validated questionnaire. Each individual was characterized by the highest BEWE score recorded for any scoreable surface. Bivariate analyses examined the proportion of participants who scored 2 or 3 in relation to a range of demographic, dietary and oral care variables. The highest tooth wear BEWE score was 0 for 1368 patients (42.9%), 1 for 883 (27.7%), 2 for 831 (26.1%) and 3 for 105 (3.3%). There were large differences between countries, with the highest levels of tooth wear observed in the UK. Important risk factors for tooth wear included heartburn or acid reflux, repeated vomiting, residence in rural areas, electric tooth brushing and snoring. We found no evidence that waiting after breakfast before tooth brushing has any effect on the degree of tooth wear (p=0.088). Fresh fruit and juice intake was positively associated with tooth wear. In this adult sample, 29% had signs of tooth wear, making it a common presenting feature in European adults.
Abstract:
Systematic reviews and meta-analyses of randomized trials that include patient-reported outcomes (PROs) often provide crucial information for patients, clinicians and policy-makers facing challenging health care decisions. Based on emerging methods, guidance on improving the interpretability of meta-analysis of patient-reported outcomes, typically continuous in nature, is likely to enhance decision-making. The objective of this paper is to summarize approaches to enhancing the interpretability of pooled estimates of PROs in meta-analyses. When differences in PROs between groups are statistically significant, decision-makers must be able to interpret the magnitude of effect. This is challenging when, as is often the case, clinical trial investigators use different measurement instruments for the same construct within and between individual randomized trials. For such cases, in addition to pooling results as a standardized mean difference, we recommend that systematic review authors use other methods to present results, such as relative (relative risk, odds ratio) or absolute (risk difference) dichotomized treatment effects, complemented by presentation in either: natural units (e.g. overall depression reduced by 2.4 points when measured on a 50-point Hamilton Rating Scale for Depression); minimal important difference units (e.g. where 1.0 unit represents the smallest difference in depression that patients, on average, perceive as important, the depression score was 0.38 (95% CI 0.30 to 0.47) units less than the control group); or a ratio of means (e.g. where the mean in the treatment group is divided by the mean in the control group, the ratio of means is 1.27, representing a 27% relative reduction in the mean depression score).
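The re-expression approaches this abstract recommends can be sketched numerically. A minimal illustration follows; the helper functions and every number in it are invented assumptions, not values from any specific review:

```python
# Sketch of re-expressing a pooled standardized mean difference (SMD) in more
# interpretable units, per the approaches above. All numbers are illustrative.

def smd_to_natural_units(smd, sd_reference):
    # Multiplying the SMD by a reference SD (e.g. a pooled control-group SD on
    # the instrument of interest) recovers a difference in instrument points.
    return smd * sd_reference

def smd_to_mid_units(smd, sd_reference, mid):
    # Express the pooled difference as a multiple of the minimal important
    # difference (MID) on the same instrument.
    return smd * sd_reference / mid

def ratio_of_means(mean_treatment, mean_control):
    # A ratio of means of 0.73 would represent a 27% relative reduction.
    return mean_treatment / mean_control

# Hypothetical pooled SMD of -0.30, reference SD of 8 points on a 50-point
# depression scale, and an MID of 2.5 points:
print(smd_to_natural_units(-0.30, 8.0))   # difference in scale points
print(smd_to_mid_units(-0.30, 8.0, 2.5))  # difference in MID units
```

The point of the sketch is that all three presentations are simple rescalings of the same pooled estimate; the choice among them is about interpretability, not statistics.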
Abstract:
BACKGROUND Marfan syndrome (MFS) is a variable, autosomal-dominant disorder of the connective tissue. In MFS, serious ventricular arrhythmias and sudden cardiac death (SCD) can occur. The aim of this prospective study was to reveal underlying risk factors and to prospectively investigate the association between MFS and SCD in a long-term follow-up. METHODS 77 patients with MFS were included. At baseline, serum N-terminal pro-brain natriuretic peptide (NT-proBNP), transthoracic echocardiogram, 12-lead resting ECG, signal-averaged ECG (SAECG) and a 24-h Holter ECG with time- and frequency-domain analyses were performed. The primary composite endpoint was defined as SCD, ventricular tachycardia (VT), ventricular fibrillation (VF) or arrhythmogenic syncope. RESULTS The median follow-up (FU) time was 868 days. Among all risk stratification parameters, NT-proBNP remained the exclusive predictor (hazard ratio [HR]: 2.34, 95% confidence interval [CI]: 1.1 to 4.62, p=0.01) for the composite endpoint. With an optimal cut-off point at 214.3 pg/ml, NT-proBNP predicted the composite primary endpoint accurately (AUC 0.936, p=0.00046, sensitivity 100%, specificity 79.0%). During FU, seven patients of Group 2 (NT-proBNP ≥ 214.3 pg/ml) reached the composite endpoint and 2 of these patients died due to SCD. In five patients, sustained VT was documented. All patients with an NT-proBNP<214.3 pg/ml (Group 1) experienced no events. Group 2 patients had a significantly higher risk of experiencing the composite endpoint (log-rank test, p<0.001). CONCLUSIONS In contrast to non-invasive electrocardiographic parameters, NT-proBNP independently predicts adverse arrhythmogenic events in patients with MFS.
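An "optimal cut-off" of the kind reported for NT-proBNP is typically derived from an ROC analysis, for example by maximizing Youden's J statistic. The study does not publish its code, so the sketch below only illustrates the principle, on invented toy data:

```python
# Sketch: choose a biomarker cut-off maximizing Youden's J = sens + spec - 1,
# the usual way an "optimal" ROC threshold is defined. Toy data are invented.

def youden_cutoff(values, labels):
    # labels: 1 = event (e.g. reached the composite endpoint), 0 = no event.
    # A value >= cut-off is classified as test-positive.
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Hypothetical biomarker values (pg/ml) and outcomes; perfectly separable here,
# so the chosen cut-off achieves J = 1.0.
values = [90, 120, 150, 180, 220, 260, 300, 400]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(youden_cutoff(values, labels))  # → (220, 1.0)
```

The reported 100% sensitivity at the chosen threshold corresponds to a cut-off below which no events occurred, exactly as in the toy example.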
Abstract:
BACKGROUND Obesity is a growing problem in western societies. The aim of this retrospective cohort study was to determine the association between overweight and obesity and pneumonia after injury in polytrauma patients. METHODS A total of 628 patients aged 16 years or older with an Injury Severity Score (ISS) of 16 or greater were included in this retrospective study. The sample was subdivided into three groups as follows: body mass index (BMI) of less than 25 kg/m2; BMI of 25 kg/m2 to 30 kg/m2; and BMI of more than 30 kg/m2. The Murray score was assessed at admission and at its maximum during hospitalization to determine pulmonary problems. Pneumonia was defined as bacteriologically positive sputum with appropriate radiologic and laboratory changes (C-reactive protein and interleukin 6). Data are given as mean ± SEM. One-way analysis of variance and the Kruskal-Wallis test were used for the analyses, and the significance level was set at p < 0.05; the Bonferroni-Dunn test was performed as post hoc analysis. RESULTS The Abbreviated Injury Scale (AIS) score for the thorax was 3.2 ± 0.1 in the group with a BMI of less than 25 kg/m2, 3.3 ± 0.1 in the group with a BMI of 25 kg/m2 to 30 kg/m2, and 2.8 ± 0.2 in the group with a BMI of more than 30 kg/m2 (p = 0.044). The Murray score at admission was elevated with increasing BMI (0.8 ± 0.8 for BMI < 25 kg/m2, 0.9 ± 0.9 for BMI 25–30 kg/m2, and 1.0 ± 0.8 for BMI > 30 kg/m2; p = 0.137); the maximum Murray score during hospitalization revealed significant differences (1.2 ± 0.9 for BMI < 25 kg/m2, 1.6 ± 1.0 for BMI 25–30 kg/m2, and 1.5 ± 0.9 for BMI > 30 kg/m2; p < 0.001). The incidence of pneumonia also increased with increasing BMI (1.6% for BMI < 25 kg/m2, 2.0% for BMI 25–30 kg/m2, and 3.1% for BMI > 30 kg/m2; p = 0.044). CONCLUSION Obesity leads to an increased incidence of pneumonia in a polytrauma situation. LEVEL OF EVIDENCE Prognostic/epidemiologic study, level IV.
Abstract:
Over the last couple of decades, the UK experienced a substantial increase in the incidence and geographical spread of bovine tuberculosis (TB), in particular since the epidemic of foot-and-mouth disease (FMD) in 2001. The initiation of the Randomized Badger Culling Trial (RBCT) in 1998 in south-west England provided an opportunity for an in-depth collection of questionnaire data (covering farming practices, herd management and husbandry, trading and wildlife activity) from herds having experienced a TB breakdown between 1998 and early 2006 and randomly selected control herds, both within and outside the RBCT (the so-called TB99 and CCS2005 case-control studies). The data collated were split into four separate and comparable substudies related to either the pre-FMD or post-FMD period, which are brought together and discussed here for the first time. The findings suggest that the risk factors associated with TB breakdowns may have changed. Higher Mycobacterium bovis prevalence in badgers following the FMD epidemic may have contributed to the identification of the presence of badgers on a farm as a prominent TB risk factor only post-FMD. The strong emergence of contact/trading TB risk factors post-FMD suggests that the purchasing and movement of cattle, which took place to restock FMD-affected areas after 2001, may have exacerbated the TB problem. Post-FMD analyses also highlighted the potential impact of environmental factors on TB risk. Although no unique and universal solution exists to reduce the transmission of TB to and among British cattle, there is evidence to suggest that applying the broad principles of biosecurity on farms reduces the risk of infection. However, with trading remaining an important route of local and long-distance TB transmission, improvements in the detection of infected animals during pre- and post-movement testing should further reduce the geographical spread of the disease.
Abstract:
BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions have been inconsistent about the association of study size with the risk of bias items. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and indicators for the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09, 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
Abstract:
OBJECTIVE To explore the risk of endometrial cancer in relation to metformin and other antidiabetic drugs. METHODS We conducted a case-control analysis to explore the association between use of metformin and other antidiabetic drugs and the risk of endometrial cancer using the UK-based General Practice Research Database (GPRD). Cases were women with an incident diagnosis of endometrial cancer, and up to 6 controls per case were matched in age, sex, calendar time, general practice, and number of years of active history in the GPRD prior to the index date. Odds ratios (ORs) with 95% confidence intervals (95% CI) were calculated, and results were adjusted by multivariate logistic regression analyses for BMI, smoking, a recorded diagnosis of diabetes mellitus, and diabetes duration. RESULTS A total of 2554 cases with incident endometrial cancer and 15,324 matched controls were identified. Ever use of metformin compared to never use of metformin was not associated with an altered risk of endometrial cancer (adj. OR 0.86, 95% CI 0.63-1.18). Stratified by exposure duration, neither long-term (≥25 prescriptions) use of metformin (adj. OR 0.79, 95% CI 0.54-1.17), nor long-term use of sulfonylureas (adj. OR 0.96, 95% CI 0.65-1.44), thiazolidinediones (≥15 prescriptions; adj. OR 1.22, 95% CI 0.67-2.21), or insulin (adj. OR 1.05, 95% CI 0.79-1.82) was associated with the risk of endometrial cancer. CONCLUSION Use of metformin and other antidiabetic drugs was not associated with an altered risk of endometrial cancer.
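The ORs above come from matched, multivariate-adjusted logistic regression; the underlying crude calculation from a 2x2 exposure table is much simpler. A sketch of that unadjusted version follows, with invented counts (not the study's data) and the standard Woolf log-based confidence interval:

```python
import math

# Sketch: crude odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table.
# A matched case-control study would instead use conditional logistic
# regression; this shows only the unadjusted arithmetic. Counts are invented.

def odds_ratio_ci(a, b, c, d, z=1.96):
    # a = exposed cases, b = unexposed cases,
    # c = exposed controls, d = unexposed controls
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical: 200 of 2554 cases and 1400 of 15324 controls ever used the
# drug of interest.
or_, lo, hi = odds_ratio_ci(200, 2354, 1400, 13924)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI spanning 1.0, as in the study's estimates, is what "not associated with an altered risk" refers to.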
Abstract:
Recent evidence suggests that transition risks from initial clinical high risk (CHR) status to psychosis are decreasing. The role played by remission in this context is mostly unknown. The present study addresses this issue by means of a meta-analysis including eight relevant studies published up to January 2012 that reported remission rates from an initial CHR status. The primary effect size measure was the longitudinal proportion of remissions compared to non-remission in subjects with a baseline CHR state. Random effect models were employed to address the high heterogeneity across the studies included. To assess the robustness of the results, we performed sensitivity analyses by sequentially removing each study and rerunning the analysis. Of 773 subjects who met initial CHR criteria, 73% did not convert to psychosis over a 2-year follow-up. Of these, about 46% fully remitted from the baseline attenuated psychotic symptoms, as evaluated on the psychometric measures usually employed by prodromal services. The corresponding clinical remission was estimated at as high as 35% of the baseline CHR sample. The CHR state is associated with a significant proportion of remitting subjects, which may be accounted for by the effective treatments received, a lead-time bias, a dilution effect, or a comorbid effect of other psychiatric diagnoses.
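The random-effects pooling of proportions described above is commonly done on the logit scale with DerSimonian-Laird estimation of between-study variance. A minimal sketch of that approach follows; the study counts in it are invented for illustration, not taken from the meta-analysis:

```python
import math

# Sketch: DerSimonian-Laird random-effects pooling of proportions (e.g.
# remission rates) on the logit scale. Study data below are invented.

def pooled_proportion(events, totals):
    y, w = [], []
    for e, n in zip(events, totals):
        y.append(math.log(e / (n - e)))           # logit of study proportion
        w.append(1.0 / (1.0 / e + 1.0 / (n - e)))  # inverse within-study var
    # fixed-effect estimate and Cochran's Q
    mu_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, y))
    k = len(y)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)             # between-study variance
    # random-effects weights incorporate tau2
    w_re = [1.0 / (1.0 / wi + tau2) for wi in w]
    mu_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1.0 / (1.0 + math.exp(-mu_re))          # back-transform to proportion

# Hypothetical remission counts from four studies:
print(round(pooled_proportion([30, 45, 22, 50], [60, 120, 40, 90]), 3))
```

The back-transformed pooled value always lies within the range of the individual study proportions; the tau2 term is what the abstract means by a "random effect model" addressing heterogeneity.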
Abstract:
Objective: Section III of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) lists attenuated psychosis syndrome as a condition for further study. One important question is its prevalence and clinical significance in the general population. Method: Analyses involved 1229 participants (age 16-40 years) from the general population of Canton Bern, Switzerland, enrolled from June 2011 to July 2012. "Symptom," "onset/worsening," "frequency," and "distress/disability" criteria of attenuated psychosis syndrome were assessed using the Structured Interview for Psychosis-Risk Syndromes. Furthermore, help-seeking, psychosocial functioning, and current nonpsychotic axis I disorders were surveyed. Well-trained psychologists performed assessments using the computer-assisted telephone interviewing technique. Results: The symptom criterion was met by 12.9% of participants, onset/worsening by 1.1%, frequency by 3.8%, and distress/disability by 7.0%. Symptom, frequency, and distress/disability criteria were jointly met by 3.2%. Excluding trait-like attenuated psychotic symptoms (APS) decreased the prevalence to 2.6%, while adding onset/worsening reduced it to 0.3%. APS were associated with functional impairments, current mental disorders, and help-seeking, although they were not a reason for help-seeking. These associations were weaker for attenuated psychosis syndrome. Conclusions: At the population level, only 0.3% met current attenuated psychosis syndrome criteria. In particular, the onset/worsening criterion, originally included to increase the likelihood of progression to psychosis, lowered its prevalence. Because progression is not required for a self-contained syndrome, a revision of the restrictive onset criterion is proposed to avoid the exclusion of 2.3% of persons who experience and are distressed by APS from mental health care. Secondary analyses suggest that a revised syndrome would also possess higher clinical significance than the current syndrome.
Abstract:
BACKGROUND Conventional factors do not fully explain the distribution of cardiovascular outcomes. Biomarkers are known to participate in well-established pathways associated with cardiovascular disease, and may therefore provide further information over and above conventional risk factors. This study sought to determine whether individual and/or combined assessment of 9 biomarkers improved discrimination, calibration and reclassification of cardiovascular mortality. METHODS 3267 patients (2283 men), aged 18-95 years, at intermediate-to-high risk of cardiovascular disease were followed in this prospective cohort study. Conventional risk factors and biomarkers were included based on forward and backward Cox proportional stepwise selection models. RESULTS During 10 years of follow-up, 546 fatal cardiovascular events occurred. Four biomarkers (interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D) were retained during stepwise selection procedures for subsequent analyses. Simultaneous inclusion of these biomarkers significantly improved discrimination as measured by the C-index (0.78, P = 0.0001) and integrated discrimination improvement (0.0219, P<0.0001). Collectively, these biomarkers improved net reclassification for cardiovascular death by 10.6% (P<0.0001) when added to the conventional risk model. CONCLUSIONS In terms of adverse cardiovascular prognosis, a biomarker panel consisting of interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D offered significant incremental value beyond that conveyed by simple conventional risk factors.
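The net reclassification improvement (NRI) quoted above has a simple definition for categorical risk groups: correct upward moves among events plus correct downward moves among non-events. A sketch with invented risk categories and outcomes (not the study's data) might look like this:

```python
# Sketch: two-category net reclassification improvement (NRI), the measure
# used above to quantify the added value of a biomarker panel. Risk categories
# (0 = low, 1 = high) and outcomes below are invented for illustration.

def net_reclassification_improvement(old_cat, new_cat, event):
    up_event = down_event = up_nonevent = down_nonevent = 0
    n_event = sum(event)
    n_nonevent = len(event) - n_event
    for old, new, had_event in zip(old_cat, new_cat, event):
        if new > old:        # moved to a higher risk category
            if had_event:
                up_event += 1
            else:
                up_nonevent += 1
        elif new < old:      # moved to a lower risk category
            if had_event:
                down_event += 1
            else:
                down_nonevent += 1
    # net correct moves among events plus net correct moves among non-events
    return ((up_event - down_event) / n_event
            + (down_nonevent - up_nonevent) / n_nonevent)

old = [0, 0, 1, 0, 1, 0]      # categories under the conventional model
new = [1, 0, 1, 0, 0, 0]      # categories after adding the biomarker panel
outcome = [1, 1, 1, 0, 0, 0]  # 1 = cardiovascular death
print(net_reclassification_improvement(old, new, outcome))
```

A positive NRI, like the study's 10.6%, means the new model moves events up and non-events down more often than the reverse.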
Abstract:
Background: Erythropoiesis-stimulating agents (ESAs) reduce the need for red blood cell transfusions; however, they increase the risk of thromboembolic events and mortality. The impact of ESAs on quality of life (QoL) is controversial and led to different recommendations of medical societies and authorities in the USA and Europe. We aimed to critically evaluate and quantify the effects of ESAs on QoL in cancer patients. Methods: We included data from randomised controlled trials (RCTs) on the effects of ESAs on QoL in cancer patients. Randomised controlled trials were identified by searching electronic databases and other sources up to January 2011. To reduce publication and outcome reporting biases, we included unreported results from clinical study reports. We conducted meta-analyses on fatigue- and anaemia-related symptoms measured with the Functional Assessment of Cancer Therapy-Fatigue (FACT-F) and FACT-Anaemia (FACT-An) subscales (primary outcomes) or other validated instruments. Results: We identified 58 eligible RCTs. Clinical study reports were available for 27% (4 out of 15) of the investigator-initiated trials and 95% (41 out of 43) of the industry-initiated trials. We excluded 21 RCTs as we could not use their QoL data for meta-analyses, either because of incomplete reporting (17 RCTs) or because of premature closure of the trial (4 RCTs). We included 37 RCTs with 10 581 patients; 21 RCTs were placebo controlled. Chemotherapy was given in 27 of the 37 RCTs. The median baseline haemoglobin (Hb) level was 10.1 g dl(-1); in 8 studies ESAs were stopped at Hb levels below 13 g dl(-1) and in 27 above 13 g dl(-1). For FACT-F, the mean difference (MD) was 2.41 (95% confidence interval (95% CI) 1.39-3.43; P<0.0001; 23 studies, n=6108) in all cancer patients and 2.81 (95% CI 1.73-3.90; P<0.0001; 19 RCTs, n=4697) in patients receiving chemotherapy, which was below the threshold (⩾3) for a clinically important difference (CID).
Erythropoiesis-stimulating agents had a positive effect on anaemia-related symptoms (MD 4.09; 95% CI 2.37-5.80; P=0.001; 14 studies, n=2765) in all cancer patients and 4.50 (95% CI 2.55-6.45; P<0.0001; 11 RCTs, n=2436) in patients receiving chemotherapy, which was above the threshold (⩾4) for a CID. Of note, this effect persisted when we restricted the analysis to placebo-controlled RCTs in patients receiving chemotherapy. There was some evidence that the MDs for FACT-F were above the threshold for a CID in RCTs including cancer patients receiving chemotherapy with Hb levels below 12 g dl(-1) at baseline and in RCTs stopping ESAs at Hb levels above 13 g dl(-1). However, these findings for FACT-F were not confirmed when we restricted the analysis to placebo-controlled RCTs in patients receiving chemotherapy. Conclusions: In cancer patients, particularly those receiving chemotherapy, we found that ESAs provide a small but clinically important improvement in anaemia-related symptoms (FACT-An). For fatigue-related symptoms (FACT-F), the overall effect did not reach the threshold for a CID. British Journal of Cancer advance online publication, 17 April 2014; doi:10.1038/bjc.2014.171 www.bjcancer.com.
Abstract:
QUESTIONS UNDER STUDY We sought to identify reasons for late human immunodeficiency virus (HIV) testing or late presentation for care. METHODS A structured chart review was performed to obtain data on test- and health-seeking behaviour of patients presenting late with CD4 cell counts below 350 cells/µl or with acquired immunodeficiency syndrome (AIDS), at the Zurich centre of the Swiss HIV Cohort Study between January 2009 and December 2011. Logistic regression analyses were used to compare demographic characteristics of persons presenting late with those of non-late presenters. RESULTS Of 281 patients, 45% presented late, 48% were chronically HIV-infected non-late presenters, and an additional 7% fulfilled the <350 CD4 cells/µl criterion for late presentation, but a chart review revealed that lymphopenia was caused by acute HIV infection. Among the late presenters, 60% were first tested HIV positive in a private practice. More than half of the tests (60%) were suggested by a physician, and only 7% followed a specific risk situation. The majority (88%) of patients entered medical care within 1 month of testing HIV positive. Risk factors for late presentation were older age (odds ratio [OR] for ≥50 vs <30 years: 3.16, p = 0.017) and Asian versus Caucasian ethnicity (OR 3.5, p = 0.021). Compared with men who have sex with men (MSM) without stable partnership, MSM in a stable partnership appeared less likely to present late (OR 0.50, p = 0.034), whereas heterosexual men in a stable partnership had 2.72-fold higher odds of presenting late (p = 0.049). CONCLUSIONS The frequency of late testing could be reduced by promoting awareness, particularly among older individuals and heterosexual men in stable partnerships.
Abstract:
Background Vitamin D insufficiency has been associated with the occurrence of various types of cancer, but causal relationships remain elusive. We therefore aimed to determine the relationship between genetic determinants of vitamin D serum levels and the risk of developing hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC). Methodology/Principal Findings Associations between CYP2R1, GC, and DHCR7 genotypes that are determinants of reduced 25-hydroxyvitamin D (25[OH]D3) serum levels and the risk of HCV-related HCC development were investigated for 1279 chronic hepatitis C patients with HCC and 4325 without HCC. The well-known associations between CYP2R1 (rs1993116, rs10741657), GC (rs2282679), and DHCR7 (rs7944926, rs12785878) genotypes and 25(OH)D3 serum levels were also apparent in patients with chronic hepatitis C. The same genotypes of these single nucleotide polymorphisms (SNPs) that are associated with reduced 25(OH)D3 serum levels were found to be associated with HCV-related HCC (P = 0.07 [OR = 1.13, 95% CI = 0.99–1.28] for CYP2R1, P = 0.007 [OR = 1.56, 95% CI = 1.12–2.15] for GC, P = 0.003 [OR = 1.42, 95% CI = 1.13–1.78] for DHCR7; ORs for risk genotypes). In contrast, no association between these genetic variations and liver fibrosis progression rate (P>0.2 for each SNP) or outcome of standard therapy with pegylated interferon-α and ribavirin (P>0.2 for each SNP) was observed, suggesting a specific influence of the genetic determinants of 25(OH)D3 serum levels on hepatocarcinogenesis. Conclusions/Significance Our data suggest a relatively weak but functionally relevant role for vitamin D in the prevention of HCV-related hepatocarcinogenesis.
Abstract:
Investigating the new product portfolio innovativeness of family firms connects two important topics that have recently received considerable attention in innovation and family firm research. First, new product portfolio innovativeness has been identified as a critical determinant of firm performance. Second, research on family firms has focused on the questions of whether and why family firms are more or less innovative than other organizational forms. Research investigating the innovativeness of family firms has often applied a risk-oriented perspective by identifying socioemotional wealth (SEW) as the main reference that determines firm behavior. Thus, prior research has mainly focused on the organizational context to predict innovation-related family firm behavior and neglected the impact of the preferences and behavior of the chief executive officer (CEO), which have both been shown to affect firm outcomes. Hence, this study aims to extend the previous research by introducing the CEO's disposition to organizational context variables to explain the new product portfolio innovativeness of small and medium-sized family firms. Specifically, this study explores how the organizational context (i.e., ownership by top management team [TMT] family members and generation in charge of the family firm) of family firms interacts with CEO risk-taking propensity to affect new product portfolio innovativeness. Using a sample of 114 German CEOs of small and medium-sized family firms operating in manufacturing industries, the results show that CEO risk-taking propensity has a positive effect on new product portfolio innovativeness. Moreover, the analyses show that the organizational context of family firms impacts the relationship between CEO risk-taking propensity and new product portfolio innovativeness.
Specifically, the relationship between CEO risk-taking propensity and new product portfolio innovativeness is weaker if levels of ownership by TMT family members are high (high SEW). Additionally, the effect of CEO risk-taking propensity on new product portfolio innovativeness is stronger in family firms at earlier generational stages (high SEW). This result suggests that if SEW is a strong reference, family firm-specific characteristics can affect individual dispositions and, in turn, the behaviors of executives. Therefore, this study helps extend the knowledge on the determinants of new product portfolio innovativeness of family firms by considering an individual CEO preference and the organizational context variables of family firms simultaneously.
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects; thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may render timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake for a defined period of time. Then a dentist can determine the erosive potential of the diet. A table with common beverages and foodstuffs is presented for judging the erosive potential. In particular, patients with more than 4 dietary acid intakes have a higher risk for erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion which has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patients. It may comprise dietary advice, use of calcium-enriched beverages, optimization of prophylactic regimes, stimulation of salivary flow rate, use of buffering medicaments and particular motivation for nondestructive toothbrushing habits with an erosion-protecting toothpaste as well as rinsing solutions. Since erosion and abrasion often occur simultaneously, all of the causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for an individual should be communicated to the patient.