103 results for Risk-factor Profile
Abstract:
The blaESBL and blaAmpC genes in Enterobacteriaceae are spread by plasmid-mediated integrons, insertion sequences, and transposons, some of which are homologous in bacteria from food animals, foods, and humans. These genes have been frequently identified in Escherichia coli and Salmonella from food animals, the most common being blaCTX-M-1, blaCTX-M-14, and blaCMY-2. Identification of risk factors for their occurrence in food animals is complex. In addition to generic antimicrobial use, cephalosporin usage is an important risk factor for the selection and spread of these genes. Extensive international trade of animals is a further risk factor. There are no data on the effectiveness of individual control options in reducing public health risks. A highly effective option would be to stop or restrict cephalosporin usage in food animals. Decreasing total antimicrobial use is also a high priority. Implementation of measures to limit strain dissemination (increasing farm biosecurity, controls in animal trade, and other general postharvest controls) is also important.
Abstract:
BACKGROUND & AIMS Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about the progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years). Stricture severity was graded based on the degree of difficulty encountered in passing a standard adult endoscope. RESULTS The median diagnostic delay was 6 years (interquartile range, 2-12 years). With increasing diagnostic delay, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with the duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio = 1.08; 95% confidence interval: 1.040-1.122; P < .001). CONCLUSIONS The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize delay in the diagnosis of EoE.
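The per-year odds ratio reported above is a logistic-regression coefficient, so it compounds multiplicatively on the odds scale. A minimal sketch of that arithmetic, using only the point estimate of 1.08 (the 95% CI of 1.040-1.122 makes the true compounded range much wider):

```python
# Per-year odds ratio for strictures from the abstract (point estimate only).
OR_PER_YEAR = 1.08

def compounded_or(years: float, or_per_year: float = OR_PER_YEAR) -> float:
    """Odds ratio implied by `years` of diagnostic delay, assuming the
    per-year effect is constant on the log-odds scale (the standard
    logistic-regression interpretation)."""
    return or_per_year ** years

# A 20-year delay multiplies the odds of strictures roughly 4.7-fold
# relative to diagnosis at symptom onset.
print(f"{compounded_or(20):.2f}")  # → 4.66
```

This illustrates why the stricture prevalence in the >20-year delay group (70.8%) is so much higher than in the 0-2-year group (17.2%), though the published prevalences come from the raw data, not from this extrapolation.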
Abstract:
BACKGROUND AND AIMS Hypoxia can induce inflammation in the gastrointestinal tract. However, the impact of hypoxia on the course of inflammatory bowel disease (IBD) is poorly understood. We aimed to evaluate whether flights and/or journeys to regions lying at an altitude of >2000 m above sea level are associated with flare-ups within 4 weeks of the trip. METHODS IBD patients with at least one flare-up during a 12-month observation period were compared to a group of patients in remission. Both groups completed a questionnaire. RESULTS A total of 103 IBD patients were included (43 with Crohn's disease (CD): mean age 39.3 ± 14.6 years; 60 with ulcerative colitis (UC): mean age 40.4 ± 15.1 years). Fifty-two patients with flare-ups were matched to 51 patients in remission. IBD patients experiencing flare-ups had more frequently undertaken flights and/or journeys to regions >2000 m above sea level within four weeks of the flare-up compared to patients in remission (21/52 [40.4%] vs. 8/51 [15.7%], p = 0.005). CONCLUSIONS Journeys to high-altitude regions and/or flights are a risk factor for IBD flare-ups occurring within 4 weeks of travel.
Abstract:
Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) decrease in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg and −0.89 (−1.05 to −0.73) mm Hg per year, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure (hazard ratio 1.18, 95% CI: 1.06 to 1.32, per 10 mm Hg increase), total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
Abstract:
Over the last couple of decades, the UK experienced a substantial increase in the incidence and geographical spread of bovine tuberculosis (TB), in particular since the epidemic of foot-and-mouth disease (FMD) in 2001. The initiation of the Randomized Badger Culling Trial (RBCT) in 1998 in south-west England provided an opportunity for an in-depth collection of questionnaire data (covering farming practices, herd management and husbandry, trading and wildlife activity) from herds having experienced a TB breakdown between 1998 and early 2006 and randomly selected control herds, both within and outside the RBCT (the so-called TB99 and CCS2005 case-control studies). The data collated were split into four separate and comparable substudies related to either the pre-FMD or post-FMD period, which are brought together and discussed here for the first time. The findings suggest that the risk factors associated with TB breakdowns may have changed. Higher Mycobacterium bovis prevalence in badgers following the FMD epidemic may have contributed to the identification of the presence of badgers on a farm as a prominent TB risk factor only post-FMD. The strong emergence of contact/trading TB risk factors post-FMD suggests that the purchasing and movement of cattle, which took place to restock FMD-affected areas after 2001, may have exacerbated the TB problem. Post-FMD analyses also highlighted the potential impact of environmental factors on TB risk. Although no unique and universal solution exists to reduce the transmission of TB to and among British cattle, there is evidence to suggest that applying the broad principles of biosecurity on farms reduces the risk of infection. However, with trading remaining an important route of local and long-distance TB transmission, improvements in the detection of infected animals during pre- and post-movement testing should further reduce the geographical spread of the disease.
Abstract:
BACKGROUND: Depressed mood following an acute coronary syndrome (ACS) is a risk factor for future cardiac morbidity. Hypothalamic-pituitary-adrenal (HPA) axis dysregulation is associated with depression, and may be a process through which depressive symptoms influence later cardiac health. Additionally, a history of depression predicts depressive symptoms in the weeks following ACS. The purpose of this study was to determine whether a history of depression and/or current depression are associated with HPA axis dysregulation following ACS. METHOD: A total of 152 cardiac patients completed a structured diagnostic interview, a standardized depression questionnaire and a cortisol profile over the day, 3 weeks after an ACS. Cortisol was analysed using the cortisol awakening response (CAR), total cortisol output estimated with the area under the curve method, and the slope of cortisol decline over the day. RESULTS: Total cortisol output was positively associated with history of depression, after adjustment for age, gender, marital status, ethnicity, smoking status, body mass index (BMI), Global Registry of Acute Cardiac Events (GRACE) risk score, days in hospital, medication with statins and antiplatelet compounds, and current depression score. Men with clinically diagnosed depression after ACS showed a blunted CAR, but the CAR was not related to a history of depression. CONCLUSIONS: Patients with a history of depression showed increased total cortisol output, but this is unlikely to be responsible for associations between depression after ACS and later cardiac morbidity. However, the blunted CAR in patients with severe depression following ACS indicates that HPA dysregulation is present.
Abstract:
OBJECTIVES This study sought to determine whether high intestinal cholesterol absorption represents a cardiovascular risk factor and to link ABCG8 and ABO variants to cardiovascular disease (CVD). BACKGROUND Plant sterol-enriched functional foods are widely used for cholesterol lowering. Their regular intake yields a 2-fold increase in circulating plant sterol levels, which equally represent markers of cholesterol absorption. Variants in ABCG8 and ABO have been associated with circulating plant sterol levels and CVD, thereby suggesting atherogenic effects of plant sterols or of cholesterol uptake. METHODS The cholestanol-to-cholesterol ratio (CR) was used as an estimate of cholesterol absorption because it is independent of plant sterols. First, we investigated the associations of 6 single nucleotide polymorphisms in ABCG8 and ABO with CR in the LURIC (LUdwigshafen RIsk and Cardiovascular Health study) and the YFS (Young Finns Study) cohorts. Second, we conducted a systematic review and meta-analysis to investigate whether CR might be related to CVD. RESULTS In LURIC, the minor alleles of rs4245791 and rs4299376 and the major alleles of rs41360247, rs6576629, and rs4953023 of the ABCG8 gene and the minor allele of rs657152 of the ABO gene were significantly associated with higher CR. Consistent results were obtained for rs4245791, rs4299376, rs6576629, and rs4953023 in YFS. The meta-analysis, including 6 studies and 4,362 individuals, found that CR was significantly increased in individuals with CVD. CONCLUSIONS High cholesterol absorption is associated with risk alleles in ABCG8 and ABO and with CVD. Harm caused by elevated cholesterol absorption rather than by plant sterols may therefore mediate the relationships of ABCG8 and ABO variants with CVD.
Abstract:
AIM To investigate risk factors for the loss of multi-rooted teeth (MRT) in subjects treated for periodontitis and enrolled in supportive periodontal therapy (SPT). MATERIAL AND METHODS A total of 172 subjects were examined before (T0) and after active periodontal therapy (APT) (T1) and following a mean of 11.5 ± 5.2 (SD) years of SPT (T2). The association of risk factors with loss of MRT was analysed with multilevel logistic regression. The tooth was the unit of analysis. RESULTS Furcation involvement (FI) = 1 before APT was not a risk factor for tooth loss compared with FI = 0 (p = 0.37). Between T0 and T2, MRT with FI = 2 (OR: 2.92, 95% CI: 1.68, 5.06, p = 0.0001) and FI = 3 (OR: 6.85, 95% CI: 3.40, 13.83, p < 0.0001) were at a significantly higher risk of being lost compared with those with FI = 0. During SPT, smokers lost significantly more MRT than non-smokers (OR: 2.37, 95% CI: 1.05, 5.35, p = 0.04). Non-smoking and compliant subjects with FI = 0/1 at T1 lost significantly fewer MRT during SPT than non-compliant smokers with FI = 2 (OR: 10.11, 95% CI: 2.91, 35.11, p < 0.0001) and FI = 3 (OR: 17.18, 95% CI: 4.98, 59.28, p < 0.0001), respectively. CONCLUSIONS FI = 1 was not a risk factor for tooth loss compared with FI = 0. FI = 2/3, smoking and lack of compliance with regular SPT represented risk factors for the loss of MRT in subjects treated for periodontitis.
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects; they can therefore only be detected clinically. Consequently, erosion that is not diagnosed at an early stage may make timely preventive measures difficult. To assess the risk factors, patients should record their dietary intake for a defined period of time; a dentist can then determine the erosive potential of the diet. A table with common beverages and foodstuffs is presented for judging erosive potential. In particular, patients with more than 4 dietary acid intakes have a higher risk of erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion that has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patient. It may comprise dietary advice, use of calcium-enriched beverages, optimization of prophylactic regimens, stimulation of salivary flow rate, use of buffering medicaments, and particular motivation for non-destructive toothbrushing habits with an erosion-protecting toothpaste as well as rinsing solutions. Since erosion and abrasion often occur simultaneously, all causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for an individual should be communicated to the patient.
Abstract:
BACKGROUND Anemia has been shown to be a risk factor for coronary artery disease (CAD) and mortality. The involvement of body iron stores in the development of CAD remains controversial. To date, no studies have examined hemoglobin and parameters of iron metabolism simultaneously. METHODS AND RESULTS Hemoglobin and iron status were determined in 1480 patients with stable angiographic CAD and in 682 individuals in whom CAD had been ruled out by angiography. The multivariate adjusted odds ratios (OR) for CAD in the lowest quartiles of hemoglobin and iron were 1.62 (95% CI: 1.22-2.16) and 2.05 (95% CI: 1.51-2.78), respectively, compared to the highest gender-specific quartiles. The fully adjusted ORs for CAD in the lowest quartiles of transferrin saturation, ferritin (F), and the soluble transferrin receptor (sTfR)/log10F index were 1.69 (95% CI: 1.25-2.27), 1.98 (95% CI: 1.48-2.65), and 1.64 (95% CI: 1.23-2.18), respectively, compared to the highest gender-specific quartiles. When additionally adjusting for iron and ferritin, the OR for CAD in the lowest quartile of hemoglobin was still 1.40 (95% CI: 1.04-1.90) compared to the highest gender-specific quartile. Thus, the associations between either iron status or low hemoglobin and CAD appeared independent of each other. The sTfR was only marginally associated with angiographic CAD. CONCLUSIONS Both low hemoglobin and iron depletion are independently associated with angiographic CAD.
Abstract:
BACKGROUND High-risk prostate cancer (PCa) is an extremely heterogeneous disease. A clear definition of prognostic subgroups is mandatory. OBJECTIVE To develop a pretreatment prognostic model for PCa-specific survival (PCSS) in high-risk PCa based on combinations of unfavorable risk factors. DESIGN, SETTING, AND PARTICIPANTS We conducted a retrospective multicenter cohort study including 1360 consecutive patients with high-risk PCa treated at eight European high-volume centers. INTERVENTION Retropubic radical prostatectomy with pelvic lymphadenectomy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Two Cox multivariable regression models were constructed to predict PCSS as a function of dichotomized clinical stage (<cT3 vs cT3-4), Gleason score (GS) (2-7 vs 8-10), and prostate-specific antigen (PSA; ≤20 ng/ml vs >20 ng/ml). The first "extended" model includes all seven possible combinations; the second "simplified" model includes three subgroups: a good prognosis subgroup (one single high-risk factor); an intermediate prognosis subgroup (PSA >20 ng/ml and stage cT3-4); and a poor prognosis subgroup (GS 8-10 in combination with at least one other high-risk factor). The predictive accuracy of the models was summarized and compared. Survival estimates and clinical and pathologic outcomes were compared between the three subgroups. RESULTS AND LIMITATIONS The simplified model yielded an R² of 33% with a 5-yr area under the curve (AUC) of 0.70, with no significant loss of predictive accuracy compared with the extended model (R²: 34%; AUC: 0.71). The 5- and 10-yr PCSS rates were 98.7% and 95.4%, 96.5% and 88.3%, and 88.8% and 79.7% for the good, intermediate, and poor prognosis subgroups, respectively (p = 0.0003). Overall survival, clinical progression-free survival, and histopathologic outcomes worsened significantly in a stepwise fashion from the good to the poor prognosis subgroup. Limitations of the study are the retrospective design and the long study period. CONCLUSIONS This study presents an intuitive and easy-to-use stratification of high-risk PCa into three prognostic subgroups. The model is useful for counseling and decision making in the pretreatment setting.
Abstract:
Background: According to the World Health Organization, stroke is the 'incoming epidemic of the 21st century'. In light of recent data suggesting that 85% of all strokes may be preventable, strategies for prevention are moving to the forefront in stroke management. Summary: This review discusses the risk factors and provides evidence on effective medical interventions and lifestyle modifications for optimal stroke prevention. Key Messages: Stroke risk can be substantially reduced using the medical measures that have been proven in many randomized trials, in combination with effective lifestyle modifications. The global modification of health and lifestyle is more beneficial than the treatment of individual risk factors. Clinical Implications: Hypertension is the most important modifiable risk factor for stroke. Efficacious reduction of blood pressure is essential for stroke prevention, even more so than the choice of antihypertensive drugs. Indications for the use of antihypertensive drugs depend on blood pressure values and vascular risk profile; thus, treatment should be initiated earlier in patients with diabetes mellitus or in those with a high vascular risk profile. Treatment of dyslipidemia with statins, anticoagulation therapy in atrial fibrillation, and carotid endarterectomy in symptomatic high-grade carotid stenosis are also effective for stroke prevention. Lifestyle factors that have been proven to reduce stroke risk include reducing salt intake, eliminating smoking, performing regular physical activity, and maintaining a normal body weight.
Abstract:
BACKGROUND Exposure to medium or high doses of ionizing radiation is a known risk factor for cancer in children. The extent to which low dose radiation from natural sources contributes to the risk of childhood cancer remains unclear. OBJECTIVES In a nationwide census-based cohort study, we investigated whether the incidence of childhood cancer was associated with background radiation from terrestrial gamma and cosmic rays. METHODS Children aged <16 years in the Swiss National Censuses in 1990 and 2000 were included. The follow-up period lasted until 2008 and incident cancer cases were identified from the Swiss Childhood Cancer Registry. A radiation model was used to predict dose rates from terrestrial and cosmic radiation at locations of residence. Cox regression models were used to assess associations between cancer risk and dose rates and cumulative dose since birth. RESULTS Among 2,093,660 children included at census, 1,782 incident cases of cancer were identified including 530 with leukemia, 328 with lymphoma, and 423 with a tumor of the central nervous system (CNS). Hazard ratios for each mSv increase in cumulative dose of external radiation were 1.03 (95% CI: 1.01, 1.05) for any cancer, 1.04 (1.00, 1.08) for leukemia, 1.01 (0.96, 1.05) for lymphoma, and 1.04 (1.00, 1.08) for CNS tumors. Adjustment for a range of potential confounders had little effect on the results. CONCLUSIONS Our study suggests that background radiation may contribute to the risk of cancer in children including leukemia and CNS tumors.
Abstract:
Rolandic epilepsy (RE) is the most common idiopathic focal childhood epilepsy. Its molecular basis is largely unknown and a complex genetic etiology is assumed in the majority of affected individuals. The present study tested whether six large recurrent copy number variants at 1q21, 15q11.2, 15q13.3, 16p11.2, 16p13.11 and 22q11.2 previously associated with neurodevelopmental disorders also increase risk of RE. Our association analyses revealed a significant excess of the 600 kb genomic duplication at the 16p11.2 locus (chr16: 29.5-30.1 Mb) in 393 unrelated patients with typical (n = 339) and atypical (ARE; n = 54) RE compared with the prevalence in 65,046 European population controls (5/393 cases versus 32/65,046 controls; Fisher's exact test P = 2.83 × 10(-6), odds ratio = 26.2, 95% confidence interval: 7.9-68.2). In contrast, the 16p11.2 duplication was not detected in 1738 European epilepsy patients with either temporal lobe epilepsy (n = 330) or genetic generalized epilepsies (n = 1408), suggesting a selective enrichment of the 16p11.2 duplication in idiopathic focal childhood epilepsies (Fisher's exact test P = 2.1 × 10(-4)). In a subsequent screen among children carrying the 16p11.2 600 kb rearrangement, we identified three patients with RE-spectrum epilepsies among 117 duplication carriers (2.6%) but none among 202 carriers of the reciprocal deletion. Our results suggest that the 16p11.2 duplication represents a significant genetic risk factor for typical and atypical RE.
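The headline association can be checked directly from the raw counts in the abstract (5 carriers among 393 cases vs 32 among 65,046 controls). A standard-library sketch of the sample odds ratio and a two-sided Fisher's exact test follows; the two-sided p-value uses the common minimum-likelihood convention, so it may differ slightly from the published 2.83 × 10(-6) depending on the software the authors used:

```python
from math import comb

# 2x2 table from the abstract: 16p11.2 duplication carriers
# among RE cases vs. European population controls.
cases_carrier, cases_total = 5, 393
ctrl_carrier, ctrl_total = 32, 65046

# Sample odds ratio (cross-product of the 2x2 table).
odds_ratio = (cases_carrier * (ctrl_total - ctrl_carrier)) / (
    (cases_total - cases_carrier) * ctrl_carrier
)

# Two-sided Fisher's exact test: sum the hypergeometric probabilities
# of all tables no more likely than the observed one.
N = cases_total + ctrl_total          # 65,439 subjects overall
K = cases_carrier + ctrl_carrier      # 37 duplication carriers overall
n = cases_total                       # 393 cases

def hyper_p(a: int) -> float:
    """P(exactly a carriers among the cases) under the null."""
    return comb(K, a) * comb(N - K, n - a) / comb(N, n)

p_obs = hyper_p(cases_carrier)
p_two_sided = sum(hyper_p(a) for a in range(min(K, n) + 1)
                  if hyper_p(a) <= p_obs * (1 + 1e-9))

print(f"OR = {odds_ratio:.1f}, p = {p_two_sided:.2e}")
```

The cross-product reproduces the reported odds ratio of 26.2, and the exact test confirms that the enrichment is far beyond chance.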
Abstract:
OBJECTIVES The aim of this study was to identify common risk factors for patient-reported medical errors across countries. In country-level analyses, differences in risks associated with error between health care systems were investigated. The joint effects of risks on error-reporting probability were modelled for hypothetical patients with different health care utilization patterns. DESIGN Data from the Commonwealth Fund's 2010 International Survey of the General Public's Views of their Health Care System's Performance in 11 Countries. SETTING Representative population samples of 11 countries were surveyed (total sample = 19,738 adults). Utilization of health care, coordination of care problems and reported errors were assessed. Regression analyses were conducted to identify risk factors for patients' reports of medical, medication and laboratory errors across countries and in country-specific models. RESULTS Error was reported by 11.2% of patients, but with marked differences between countries (range: 5.4-17.0%). Poor coordination of care was reported by 27.3%. The risk of patient-reported error was determined mainly by health care utilization: emergency care (OR = 1.7, P < 0.001), hospitalization (OR = 1.6, P < 0.001) and the number of providers involved (OR = 2.0 for three doctors, P < 0.001) were important predictors. Poor care coordination was the single most important risk factor for reporting error (OR = 3.9, P < 0.001). Country-specific models yielded common and country-specific predictors for self-reported error. For high utilizers of care, the probability that errors are reported rises to 0.68. CONCLUSIONS Safety remains a global challenge affecting many patients throughout the world. Large variability exists in the frequency of patient-reported error across countries. Learning from others' errors is not only essential within countries but may also prove a promising strategy internationally.