979 results for Hazard Risk
Abstract:
BACKGROUND: This collaboration of seven observational clinical cohorts investigated risk factors for treatment-limiting toxicities in both antiretroviral-naive and antiretroviral-experienced patients starting nevirapine-based combination antiretroviral therapy (NVPc). METHODS: Patients starting NVPc after 1 January 1998 were included. CD4 cell count at the start of NVPc was classified as high (>400/microl and >250/microl for men and women, respectively) or low. Cox models were used to investigate risk factors for discontinuation due to hypersensitivity reactions (HSR, n = 6547) and discontinuation of NVPc due to treatment-limiting toxicities and/or patient/physician choice (TOXPC, n = 10,186). Patients were classified according to prior antiretroviral treatment experience and CD4 cell count/viral load at the start of NVPc. Models were stratified by cohort and adjusted for age, sex, nadir CD4 cell count, calendar year of starting NVPc and mode of transmission. RESULTS: Median times from starting NVPc to TOXPC and HSR were 162 days [interquartile range (IQR) 31-737] and 30 days (IQR 17-60), respectively. In adjusted Cox analyses, compared with naive patients with a low CD4 cell count, treatment-experienced patients with a high CD4 cell count and a viral load above 400 had a significantly increased risk of HSR [hazard ratio 1.45, confidence interval (CI) 1.03-2.03] and of TOXPC within 18 weeks (hazard ratio 1.34, CI 1.08-1.67). In contrast, treatment-experienced patients with a high CD4 cell count and a viral load below 400 had no increased risk of HSR (hazard ratio 1.10, CI 0.82-1.46) or of TOXPC within 18 weeks (hazard ratio 0.94, CI 0.78-1.13). CONCLUSION: Our results suggest that initiating NVPc may be relatively well tolerated in antiretroviral-experienced patients with high CD4 cell counts provided there is no detectable viremia.
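Several abstracts in this list, including the one above, report hazard ratios from Cox models. As a minimal, illustrative sketch (not the study's analysis), the following pure-Python snippet fits a one-covariate Cox model to synthetic survival data by Newton's method on the partial likelihood; all numbers are invented for illustration:

```python
import math

def cox_hr(data, iters=30):
    """Fit a one-covariate Cox proportional hazards model by Newton's
    method on the partial likelihood and return the hazard ratio.
    `data` is a list of (time, event_indicator, covariate) tuples;
    event times are assumed distinct, so no tie handling is needed."""
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0
        for t_i, event, x_i in data:
            if not event:
                continue
            # risk set: everyone still under observation at time t_i
            s0 = s1 = s2 = 0.0
            for t_j, _, x_j in data:
                if t_j >= t_i:
                    w = math.exp(beta * x_j)
                    s0 += w
                    s1 += x_j * w
                    s2 += x_j * x_j * w
            score += x_i - s1 / s0             # gradient contribution
            info += s2 / s0 - (s1 / s0) ** 2   # observed information
        beta += score / info                   # Newton step
    return math.exp(beta)

# Purely synthetic data: (time, event indicator, exposed?)
data = [(2, 1, 1), (4, 1, 0), (5, 1, 1), (6, 0, 1), (7, 1, 0),
        (8, 0, 0), (9, 1, 1), (11, 1, 0), (12, 0, 0)]
print(round(cox_hr(data), 2))  # ≈ 2.82: exposure raises the hazard
```

A real analysis like the one above would additionally stratify by cohort, adjust for covariates, and handle tied event times, for which established implementations (e.g. the `lifelines` package or R's `survival::coxph`) are the practical choice.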
Abstract:
Although growth opportunities fade and profitability declines as firms mature, older firms are no more likely to be acquired than young firms are. This article documents and explains that phenomenon. We argue that, because mature organizations are rationally less flexible, they are more costly to integrate and therefore comparatively unattractive acquisition candidates. The evidence supports this explanation of the negative age dependence of takeover hazard. The evidence also shows that negative exogenous shocks to merger benefits further reduce the takeover hazard of mature firms. We test many alternative explanations and find no evidence that they can explain the hazard decline.
Abstract:
Background: In contrast with established evidence linking high doses of ionizing radiation with childhood cancer, research on low-dose ionizing radiation and childhood cancer has produced inconsistent results. Objective: We investigated the association between domestic radon exposure and childhood cancers, particularly leukemia and central nervous system (CNS) tumors. Methods: We conducted a nationwide census-based cohort study including all children < 16 years of age living in Switzerland on 5 December 2000, the date of the 2000 census. Follow-up lasted until the date of diagnosis, death, emigration, a child’s 16th birthday, or 31 December 2008. Domestic radon levels were estimated for each individual home address using a model developed and validated based on approximately 45,000 measurements taken throughout Switzerland. Data were analyzed with Cox proportional hazard models adjusted for child age, child sex, birth order, parents’ socioeconomic status, environmental gamma radiation, and period effects. Results: In total, 997 childhood cancer cases were included in the study. Compared with children exposed to a radon concentration below the median (< 77.7 Bq/m3), adjusted hazard ratios for children with exposure ≥ the 90th percentile (≥ 139.9 Bq/m3) were 0.93 (95% CI: 0.74, 1.16) for all cancers, 0.95 (95% CI: 0.63, 1.43) for all leukemias, 0.90 (95% CI: 0.56, 1.43) for acute lymphoblastic leukemia, and 1.05 (95% CI: 0.68, 1.61) for CNS tumors. Conclusions: We did not find evidence that domestic radon exposure is associated with childhood cancer, despite relatively high radon levels in Switzerland.
Abstract:
The indications for screening and the TSH threshold for treatment of subclinical hypothyroidism have remained clinically controversial for over 20 years. Subclinical thyroid dysfunction is a common finding in the growing population of older adults, occurring in 10–15% of those aged 65 and older, and may contribute to multiple common problems of older age, including cardiovascular disease, muscular impairment, mood problems, and cognitive dysfunction (1). In 2004, both the U.S. Preventive Services Task Force (2) and a clinical consensus group of experts (3) concluded that the existing evidence on the association between subclinical hypothyroidism and cardiovascular risk, drawn primarily from cross-sectional or case-control studies (4), was insufficient. For example, a frequently cited analysis from the Rotterdam study found a cross-sectional association between subclinical hypothyroidism and atherosclerosis, as measured by abdominal aortic calcification (odds ratio, 1.7; 95% confidence interval [CI], 1.1–2.6), and prevalent myocardial infarction (MI) (odds ratio, 2.3; 95% CI, 1.3–4.0) (5). However, the prospective part of that study included only 16 incident MIs; the hazard ratio (HR) for subclinical hypothyroidism was 2.50, with a broad 95% CI (0.70–9.10). Potential mechanisms for the association with cardiovascular disease include elevated cholesterol levels, inflammatory markers, raised homocysteine, increased oxidative stress, insulin resistance, increased systemic vascular resistance, arterial stiffness, altered endothelial function, and activation of thrombosis and hypercoagulability, all of which have been reported in association with subclinical hypothyroidism (1, 6).
Abstract:
BACKGROUND Homicide-suicides are rare but catastrophic events. This study examined the epidemiology of homicide-suicide in Switzerland. METHODS The study identified homicide-suicide events from 1991 to 2008 among persons from the same household in the Swiss National Cohort, which links census and mortality records. The analysis examined the association of the risk of dying in a homicide-suicide event with individual-level socio-demographic variables, household composition variables and area-level variables. Proportional hazards regression models were calculated for male perpetrators and female victims. Results are presented as age-adjusted hazard ratios (HR) with 95% confidence intervals (95%CI). RESULTS The study identified 158 deaths from homicide-suicide events, including 85 murder victims (62 women, 4 men, 19 children and adolescents) and 68 male and 5 female perpetrators. The incidence was 3 events per million households per year. Firearms were the most common method for both homicides and suicides. The risk of perpetrating homicide-suicide was higher in divorced than in married men (HR 3.64; 95%CI 1.56-8.49), in foreigners without permanent residency compared with Swiss citizens (HR 3.95; 1.52-10.2), in men without religious affiliation compared with Catholics (HR 2.23; 1.14-4.36) and in crowded households (HR 4.85; 1.72-13.6 comparing ≥2 with <1 persons/room). There was no association with education, occupation or nationality, the number of children, the language region or degree of urbanicity. Associations were similar for female victims. CONCLUSIONS This national longitudinal study shows that living conditions associated with psychological stress and lower levels of social support are associated with homicide-suicide events in Switzerland.
Abstract:
INTRODUCTION Current literature suggesting a higher bleeding risk during combination therapy compared to oral anticoagulation alone is primarily based on retrospective studies or specific populations. We aimed to prospectively evaluate whether unselected medical patients on oral anticoagulation have an increased risk of bleeding when on concomitant antiplatelet therapy. MATERIAL AND METHODS We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants between 01/2008 and 03/2009 from a Swiss university hospital. The primary outcome was the time to a first major bleed on oral anticoagulation within 12 months, adjusted for age, international normalized ratio target, number of medications, and history of myocardial infarction and major bleeding. RESULTS Among the 515 included anticoagulated patients, the incidence rate of a first major bleed was 8.2 per 100 patient-years. Overall, 161 patients (31.3%) were on both anticoagulant and antiplatelet therapy, and these patients had a similar incidence rate of major bleeding compared to patients on oral anticoagulation alone (7.6 vs. 8.4 per 100 patient-years, P=0.81). In a multivariate analysis, the association of concomitant antiplatelet therapy with the risk of major bleeding was not statistically significant (hazard ratio 0.89, 95% confidence interval, 0.37-2.10). CONCLUSIONS The risk of bleeding in patients receiving oral anticoagulants combined with antiplatelet therapy was similar to patients receiving oral anticoagulants alone, suggesting that the incremental bleeding risk of combination therapy might not be clinically significant.
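The incidence rates above are expressed per 100 patient-years; the arithmetic is simply the number of events divided by the accumulated follow-up time. A small sketch with hypothetical counts (not the study's raw data):

```python
def incidence_rate_per_100py(events, patient_years):
    """Crude incidence rate expressed per 100 patient-years of follow-up."""
    return 100.0 * events / patient_years

# Hypothetical illustration: 20 first major bleeds observed over
# 250 patient-years of follow-up (numbers invented for the example)
print(incidence_rate_per_100py(20, 250.0))  # 8.0 per 100 patient-years
```

Note that each patient contributes follow-up time only until the first event or censoring, which is why the denominator is patient-years rather than a simple patient count.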
Abstract:
BACKGROUND Marfan syndrome (MFS) is a variable, autosomal-dominant disorder of the connective tissue. In MFS, serious ventricular arrhythmias and sudden cardiac death (SCD) can occur. The aim of this prospective study was to reveal underlying risk factors and to prospectively investigate the association between MFS and SCD over long-term follow-up. METHODS A total of 77 patients with MFS were included. At baseline, serum N-terminal pro-brain natriuretic peptide (NT-proBNP) was measured, and a transthoracic echocardiogram, a 12-lead resting ECG, a signal-averaged ECG (SAECG) and a 24-h Holter ECG with time- and frequency-domain analyses were performed. The primary composite endpoint was defined as SCD, ventricular tachycardia (VT), ventricular fibrillation (VF) or arrhythmogenic syncope. RESULTS The median follow-up (FU) time was 868 days. Among all risk stratification parameters, NT-proBNP remained the sole predictor (hazard ratio [HR]: 2.34, 95% confidence interval [CI]: 1.1 to 4.62, p=0.01) of the composite endpoint. With an optimal cut-off at 214.3 pg/ml, NT-proBNP predicted the composite primary endpoint accurately (AUC 0.936, p=0.00046, sensitivity 100%, specificity 79.0%). During FU, seven patients in Group 2 (NT-proBNP ≥ 214.3 pg/ml) reached the composite endpoint, and 2 of these patients died of SCD. In five patients, sustained VT was documented. No patient with NT-proBNP < 214.3 pg/ml (Group 1) experienced an event. Group 2 patients had a significantly higher risk of experiencing the composite endpoint (log-rank test, p<0.001). CONCLUSIONS In contrast to non-invasive electrocardiographic parameters, NT-proBNP independently predicts adverse arrhythmogenic events in patients with MFS.
Abstract:
The paper develops a general, integrative and holistic framework for systematizing and assessing vulnerability, risk and adaptation. The framework is a thinking tool, meant as a heuristic that outlines the key factors and dimensions to be addressed when assessing vulnerability in the context of natural hazards and climate change. The approach holds that the key factors of such a common framework are the exposure of a society or system to a hazard or stressor, the susceptibility of the system or community exposed, and its resilience and adaptive capacity. It also underlines the need to consider multiple thematic dimensions when assessing vulnerability in the context of natural and socio-natural hazards, and in doing so illustrates the strong linkages between the concepts used in disaster risk management (DRM) and climate change adaptation (CCA) research. The framework is also a tool for communicating complexity, and it stresses the need for societal change in order to reduce risk and promote adaptation. Its policy relevance and the first results of its application are outlined. Overall, the framework enhances the discussion on how to frame and link the concepts of vulnerability, disaster risk, risk management and adaptation.
Abstract:
OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), and to develop a weighted health care index (HCI) score based on commonly used interventions and compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions (Central/Northern Europe, Southern Europe, Eastern Europe and Argentina) between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was constructed from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0, to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcome for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
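The abstract does not give the individual weights of the HCI score, so the sketch below uses placeholder weights purely to illustrate how a weighted index over binary care indicators with a 0–5 range might be assembled; the weights and the function itself are assumptions, not taken from the study:

```python
def hci_score(dst_done, rifamycin_inh_pza_regimen, early_cart,
              w_dst=1, w_regimen=2, w_cart=2):
    """Hypothetical weighted health-care index over three binary
    care indicators. The weights (1, 2, 2) are illustrative
    placeholders chosen only so the score spans 0-5; they are not
    the weights derived in the study's Cox models."""
    return (w_dst * int(dst_done)
            + w_regimen * int(rifamycin_inh_pza_regimen)
            + w_cart * int(early_cart))

print(hci_score(True, True, True))     # 5: all interventions delivered
print(hci_score(False, False, False))  # 0: none delivered
```

In the study itself, the weights would come from the coefficients of the multivariable Cox models, so a real implementation would derive them from the fitted model rather than hard-code them.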
Abstract:
OBJECTIVE To examine the degree to which use of β blockers, statins, and diuretics in patients with impaired glucose tolerance and other cardiovascular risk factors is associated with new-onset diabetes. DESIGN Reanalysis of data from the Nateglinide and Valsartan in Impaired Glucose Tolerance Outcomes Research (NAVIGATOR) trial. SETTING NAVIGATOR trial. PARTICIPANTS Patients who at baseline (enrolment) were treatment naïve to β blockers (n=5640), diuretics (n=6346), statins (n=6146), and calcium channel blockers (n=6294). Calcium channel blockers served as a metabolically neutral control. MAIN OUTCOME MEASURES Development of new-onset diabetes diagnosed by standard plasma glucose levels in all participants and confirmed with glucose tolerance testing within 12 weeks after the increased glucose value was recorded. The relation between each treatment and new-onset diabetes was evaluated using marginal structural models for causal inference, to account for time-dependent confounding in treatment assignment. RESULTS During the median five years of follow-up, β blockers were started in 915 patients (16.2%), diuretics in 1316 (20.7%), statins in 1353 (22.0%), and calcium channel blockers in 1171 (18.6%). After adjusting for baseline characteristics and time-varying confounders, diuretics and statins were both associated with an increased risk of new-onset diabetes (hazard ratio 1.23, 95% confidence interval 1.06 to 1.44, and 1.32, 1.14 to 1.48, respectively), whereas β blockers and calcium channel blockers were not (1.10, 0.92 to 1.31, and 0.95, 0.79 to 1.13, respectively). CONCLUSIONS Among people with impaired glucose tolerance and other cardiovascular risk factors and with serial glucose measurements, diuretics and statins were associated with an increased risk of new-onset diabetes, whereas the effect of β blockers was non-significant.
Abstract:
Background: Prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) decrease in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increases in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple-nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
Abstract:
We investigated the association between exposure to radio-frequency electromagnetic fields (RF-EMFs) from broadcast transmitters and childhood cancer. First, we conducted a time-to-event analysis including children under age 16 years living in Switzerland on December 5, 2000. Follow-up lasted until December 31, 2008. Second, all children living in Switzerland for some time between 1985 and 2008 were included in an incidence density cohort. RF-EMF exposure from broadcast transmitters was modeled. Based on 997 cancer cases, adjusted hazard ratios in the time-to-event analysis for the highest exposure category (>0.2 V/m) as compared with the reference category (<0.05 V/m) were 1.03 (95% confidence interval (CI): 0.74, 1.43) for all cancers, 0.55 (95% CI: 0.26, 1.19) for childhood leukemia, and 1.68 (95% CI: 0.98, 2.91) for childhood central nervous system (CNS) tumors. Results of the incidence density analysis, based on 4,246 cancer cases, were similar for all types of cancer and leukemia but did not indicate a CNS tumor risk (incidence rate ratio = 1.03, 95% CI: 0.73, 1.46). This large census-based cohort study did not suggest an association between predicted RF-EMF exposure from broadcasting and childhood leukemia. Results for CNS tumors were less consistent, but the most comprehensive analysis did not suggest an association.
Abstract:
The relationship between MMAC/PTEN, DMBT1 and the progression and prognosis of glioma, and the association between alterations of MMAC/PTEN, p53, p16 and Rb and several cancer risk factors (smoking, exposure to radiation, family cancer history and previous cancer history), were assessed in 4 studies. By allelic deletion analysis, the MMAC/PTEN locus was shown to be frequently lost in glioblastoma multiforme (GM) but maintained in most lower-grade astrocytic tumors. The DMBT1 locus, however, was frequently lost in all grades of gliomas examined. The potential biological significance of these two regions was further assessed by examining microcell hybrids that contained various fragments of 10q. Somatic cell hybrid clones that retained the MMAC/PTEN locus had less transformed phenotypes, exhibiting an inability to grow in soft agarose. In contrast, the presence or absence of DMBT1 did not correlate with any in vitro phenotype assessed in our model system. Further, Cox proportional hazards regression analysis, adjusted for age at surgery and histologic grade (GM and non-GM), showed that patients without LOH at the MMAC/PTEN locus had a significantly better prognosis than patients with LOH at MMAC/PTEN (hazard ratio = 0.5; 95% CI = 0.28–0.89; P = 0.018). Furthermore, LOH status at MMAC/PTEN was significantly associated with age, whereas that for DMBT1 was not. These results suggest that DMBT1 may be involved early in the oncogenesis of gliomas, whereas alterations in MMAC/PTEN may be a late event related to glioma progression and provide a significant prognostic marker for patient survival.
The associations between 4 cancer risk factors and 4 tumor suppressor genes were assessed. Expression of p16 was associated with current smoking (adjusted OR = 1.9, 95% CI = 1.02–3.6) but not with former smoking (adjusted OR = 1.1, 95% CI = 0.5–3.5). Expression of p53 was associated with family cancer history (OR = 3.5, 95% CI = 1.07–11 for patients with a first-degree family history of cancer). MMAC/PTEN was associated with histologic grade (OR = 2.8, 95% CI = 1.2–6.6) and age (P = 0.035). The OR for LOH around MMAC/PTEN was also elevated in patients with a family history of cancer (OR = 1.9, 95% CI = 0.8–4.6 for patients with a first-degree family history of cancer). The associations between smoking and p16, and between family history of cancer and p53 and MMAC/PTEN, provide suggestive evidence that these exposures are related to the development of gliomas.
Abstract:
The Food and Drug Administration (FDA) is responsible for risk assessment and risk management in the post-market surveillance of the U.S. medical device industry. One of the FDA's regulatory mechanisms, the Medical Device Reporting System (MDR), is an adverse event reporting system intended to provide the FDA with advance warning of device problems. It includes voluntary reporting for individuals and mandatory reporting for device manufacturers. In a study of alleged breast implant safety problems, this research examines the organizational processes by which the FDA gathers data on adverse events and uses adverse event reporting systems to assess and manage risk. The research reviews the literature on problem recognition, risk perception and organizational learning to understand the influence highly publicized events may have on adverse event reporting. Understanding the influence of an environmental factor such as publicity on adverse event reporting can provide insight into whether the FDA's adverse event reporting system operates as an early warning system for medical device problems. The research focuses on two main questions. The first addresses the relationship between publicity and the voluntary and mandatory reporting of adverse events. The second examines whether government agencies make use of these adverse event reports. Using quantitative and qualitative methods, a longitudinal study was conducted of the number and content of adverse event reports regarding breast implants filed with the FDA's medical device reporting system during 1985–1991. To assess variation in publicity over time, the print media were analyzed to identify articles related to breast implant failures. The exploratory findings suggest that an increase in media activity is related to an increase in voluntary reporting, especially following periods of intense media coverage of the FDA.
However, a similar relationship was not found between media activity and manufacturers' mandatory adverse event reporting. A review of government committee and agency reports on the FDA published during 1976–1996 produced little evidence that publicity or MDR information contributed to problem recognition, agenda setting, or the formulation of policy recommendations. The findings suggest that the reporting of breast implant problems to the FDA may reflect the perceptions and concerns of the reporting groups, serving as a barometer of the volume and content of media attention.
Abstract:
BACKGROUND Prediction studies in subjects at clinical high risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects who later convert to psychosis. METHODS This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequency (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS Cox regression yielded a model including frontal theta (HR=1.82; 95% CI 1.00-3.32) and delta (HR=2.60; 95% CI 1.30-5.20) power, and occipital-parietal APF (HR=0.52; 95% CI 0.35-0.80) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rates 0.057 to 0.81). CONCLUSIONS Power in the theta and delta ranges and the APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with other clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
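As an illustration of how a prognostic index can be derived from reported hazard ratios, the sketch below combines the three log-hazard-ratio coefficients from the abstract into a linear predictor of the kind a Cox model produces. The use of standardized (z-scored) inputs is an assumption for illustration, since the abstract does not give the predictors' scaling, and no risk-class thresholds are reproduced:

```python
import math

# Hazard ratios reported in the abstract: frontal theta power,
# frontal delta power, occipital-parietal alpha peak frequency (APF)
HRS = {"theta": 1.82, "delta": 2.60, "apf": 0.52}
# Cox coefficients are the natural logs of the hazard ratios
BETAS = {k: math.log(v) for k, v in HRS.items()}

def prognostic_index(theta, delta, apf):
    """Linear predictor beta'x of the Cox model. Inputs are assumed
    to be standardized (z-scored) values -- an illustrative assumption,
    not the study's actual scaling."""
    return (BETAS["theta"] * theta
            + BETAS["delta"] * delta
            + BETAS["apf"] * apf)

# Higher theta/delta power and lower APF should raise the index,
# matching the direction of the reported hazard ratios
low_risk = prognostic_index(-1.0, -1.0, 1.0)
high_risk = prognostic_index(1.0, 1.0, -1.0)
print(low_risk < high_risk)  # True
```

In practice the index would then be cut into risk classes (three in the study) at thresholds chosen from the fitted model, which the abstract does not report.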