939 results for Risk-based maintenance
Abstract:
Porcine reproductive and respiratory syndrome virus (PRRSV) is widespread in pig populations globally. In many regions of Europe with intensive pig production and high herd densities, the virus is endemic and can cause disease and production losses. This fuels discussion about the feasibility and sustainability of virus elimination from larger geographic regions. The implementation of a program aiming at virus elimination for areas with high pig density is unprecedented and its potential success is unknown. The objective of this work was to approach pig population data with a simple method that could support assessing the feasibility of a sustainable regional PRRSV elimination. Based on known risk factors such as pig herd structure and neighborhood conditions, an index characterizing individual herds' potential for endemic virus circulation and reinfection was designed. This index was subsequently used to compare data of all pig herds in two regions with different pig and herd densities in Lower Saxony (North-West Germany), where PRRSV is endemic. The distribution of the indexed herds was displayed using GIS. Clusters of high herd index densities forming potential risk hot spots were identified, which could represent key target areas for surveillance and biosecurity measures under a control program aimed at virus elimination. In an additional step, for the study region with the higher pig density (2463 pigs/km² farmland), the potential distribution of PRRSV-free and non-free herds during the implementation of a national control program aiming at virus elimination was modeled. Complex herd and trade network structures suggest that PRRSV elimination in regions with intensive pig farming, such as those of central Europe, would have to involve legal regulation and be accompanied by substantial trade and animal movement restrictions. The proposed methodology of risk index mapping could be adapted to areas varying in size, herd structure and density. Interpreted in the regional context, this could help to classify the density of risk and to target resources and elimination measures accordingly.
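The abstract does not give the index formula, so as a rough illustration of risk index construction, here is a minimal Python sketch that scores herds as a weighted sum of normalized risk-factor values; the factor names and weights are hypothetical, not the study's actual index.

```python
# Illustrative herd risk index: weighted sum of risk-factor scores that
# have each been pre-scaled to [0, 1]. Factors and weights are made up
# for demonstration and do not reproduce the study's index.
HYPOTHETICAL_WEIGHTS = {
    "herd_size": 0.3,             # larger herds can sustain circulation
    "sow_herd": 0.3,              # breeding herds supply susceptible piglets
    "neighbors_within_1km": 0.4,  # dense neighborhoods raise reinfection risk
}

def risk_index(herd: dict) -> float:
    """Return the weighted sum of the herd's normalized factor scores."""
    return sum(w * herd[f] for f, w in HYPOTHETICAL_WEIGHTS.items())

herds = [
    {"id": "A", "herd_size": 0.9, "sow_herd": 1.0, "neighbors_within_1km": 0.8},
    {"id": "B", "herd_size": 0.2, "sow_herd": 0.0, "neighbors_within_1km": 0.1},
]
for h in herds:
    print(h["id"], round(risk_index(h), 2))  # A (0.89) would map as a hot spot
```

Indexed herds could then be placed on a map and smoothed into density surfaces, which is essentially what the GIS display of risk clusters does.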
Abstract:
BACKGROUND Exposure to medium or high doses of ionizing radiation is a known risk factor for cancer in children. The extent to which low dose radiation from natural sources contributes to the risk of childhood cancer remains unclear. OBJECTIVES In a nationwide census-based cohort study, we investigated whether the incidence of childhood cancer was associated with background radiation from terrestrial gamma and cosmic rays. METHODS Children aged <16 years in the Swiss National Censuses in 1990 and 2000 were included. The follow-up period lasted until 2008 and incident cancer cases were identified from the Swiss Childhood Cancer Registry. A radiation model was used to predict dose rates from terrestrial and cosmic radiation at locations of residence. Cox regression models were used to assess associations between cancer risk and dose rates and cumulative dose since birth. RESULTS Among 2,093,660 children included at census, 1,782 incident cases of cancer were identified including 530 with leukemia, 328 with lymphoma, and 423 with a tumor of the central nervous system (CNS). Hazard ratios for each mSv increase in cumulative dose of external radiation were 1.03 (95% CI: 1.01, 1.05) for any cancer, 1.04 (1.00, 1.08) for leukemia, 1.01 (0.96, 1.05) for lymphoma, and 1.04 (1.00, 1.08) for CNS tumors. Adjustment for a range of potential confounders had little effect on the results. CONCLUSIONS Our study suggests that background radiation may contribute to the risk of cancer in children including leukemia and CNS tumors.
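Since the hazard ratios are reported per mSv of cumulative dose, a log-linear Cox model implies they compound multiplicatively with dose. A quick check of what the headline estimate of 1.03/mSv implies at higher cumulative doses:

```python
import math

# HR = 1.03 per mSv of cumulative external dose (any cancer). Under a
# log-linear Cox model, HR(d) = exp(beta * d) with beta = ln(1.03).
beta = math.log(1.03)

for dose_msv in (1, 5, 10):
    hr = math.exp(beta * dose_msv)
    print(f"{dose_msv:>2} mSv -> HR {hr:.2f}")
# 1 mSv -> HR 1.03; 5 mSv -> HR 1.16; 10 mSv -> HR 1.34
```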
Abstract:
OBJECTIVE Use of diuretics has been associated with an increased risk of gout. Data on different types of diuretics are scarce. We undertook this study to investigate the association between use of loop diuretics, thiazide or thiazide-like diuretics, and potassium-sparing agents and the risk of developing incident gout. METHODS We conducted a retrospective population-based case-control analysis using the General Practice Research Database established in the UK. We identified case patients who were diagnosed as having incident gout between 1990 and 2010. One control patient was matched to each case patient for age, sex, general practice, calendar time, and years of active history in the database. We used conditional logistic regression to calculate odds ratios (ORs) and 95% confidence intervals (95% CIs), and we adjusted for potential confounders. RESULTS We identified 91,530 incident cases of gout and the same number of matched controls. Compared to past use of diuretics from each respective drug class, adjusted ORs for current use of loop diuretics, thiazide diuretics, thiazide-like diuretics, and potassium-sparing diuretics were 2.64 (95% CI 2.47-2.83), 1.70 (95% CI 1.62-1.79), 2.30 (95% CI 1.95-2.70), and 1.06 (95% CI 0.91-1.23), respectively. Combined use of loop diuretics and thiazide diuretics was associated with the highest relative risk estimates of gout (adjusted OR 4.65 [95% CI 3.51-6.16]). Current use of calcium channel blockers or losartan slightly attenuated the risk of gout in patients who took diuretics. CONCLUSION Use of loop diuretics, thiazide diuretics, and thiazide-like diuretics was associated with an increased risk of incident gout, although use of potassium-sparing agents was not.
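Because ORs from conditional logistic regression live on the log scale, a reported 95% CI pins down the standard error of the log-OR. A small helper, verified against the loop-diuretic estimate quoted above:

```python
import math

def se_from_ci(lower: float, upper: float, z: float = 1.96) -> float:
    """Standard error of the log-OR implied by a reported 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def ci_from_or(odds_ratio: float, se: float, z: float = 1.96):
    """Reconstruct the 95% CI from an OR and its log-scale SE."""
    log_or = math.log(odds_ratio)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Loop diuretics: adjusted OR 2.64 (95% CI 2.47-2.83), as reported above.
se = se_from_ci(2.47, 2.83)
print(f"SE(log OR) = {se:.4f}")                          # ~0.035
print(tuple(round(x, 2) for x in ci_from_or(2.64, se)))  # (2.47, 2.83)
```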
Abstract:
OBJECTIVE The aim of this study was to explore the risk of incident gout in patients with type 2 diabetes mellitus (T2DM) in association with diabetes duration, diabetes severity and antidiabetic drug treatment. METHODS We conducted a case-control study in patients with T2DM using the UK-based Clinical Practice Research Datalink (CPRD). We identified case patients aged ≥18 years with an incident diagnosis of gout between 1990 and 2012. We matched one gout-free control patient to each case patient. We used conditional logistic regression analysis to calculate adjusted ORs (adj. ORs) with 95% CIs and adjusted our analyses for important potential confounders. RESULTS The study encompassed 7536 T2DM cases with a first-time diagnosis of gout. Compared to a diabetes duration <1 year, prolonged diabetes duration (1-3, 3-6, 7-9 and ≥10 years) was associated with decreased adj. ORs of 0.91 (95% CI 0.79 to 1.04), 0.76 (95% CI 0.67 to 0.86), 0.70 (95% CI 0.61 to 0.86), and 0.58 (95% CI 0.51 to 0.66), respectively. Compared to a reference A1C level of <7%, the risk estimates for increasing A1C levels (7.0-7.9, 8.0-8.9 and ≥9%) steadily decreased, with adj. ORs of 0.79 (95% CI 0.72 to 0.86), 0.63 (95% CI 0.55 to 0.72), and 0.46 (95% CI 0.40 to 0.53), respectively. Use of insulin, metformin, or sulfonylureas was not associated with an altered risk of incident gout. CONCLUSIONS Increased A1C levels, but not use of antidiabetic drugs, were associated with a decreased risk of incident gout among patients with T2DM.
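The matched design pairs each case with one control drawn from patients who share the matching variables. A minimal sketch of that selection step, simplified to exact matching on age and sex with hypothetical records (the actual CPRD matching also used practice, calendar time, and years of history):

```python
import random

# Simplified 1:1 matched control selection on age and sex only.
# Records are hypothetical; real matching uses more variables.
random.seed(0)

cases = [{"id": 1, "age": 64, "sex": "M"}, {"id": 2, "age": 58, "sex": "F"}]
pool = [
    {"id": 10, "age": 64, "sex": "M"}, {"id": 11, "age": 64, "sex": "M"},
    {"id": 12, "age": 58, "sex": "F"}, {"id": 13, "age": 70, "sex": "F"},
]

used = set()
for case in cases:
    eligible = [c for c in pool
                if c["age"] == case["age"] and c["sex"] == case["sex"]
                and c["id"] not in used]
    control = random.choice(eligible)  # one matched control per case,
    used.add(control["id"])            # sampled without replacement
    print(f"case {case['id']} -> control {control['id']}")
```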
Abstract:
BACKGROUND The association between combination antiretroviral therapy (cART) and cancer risk, especially regimens containing protease inhibitors (PIs) or nonnucleoside reverse transcriptase inhibitors (NNRTIs), is unclear. METHODS Participants were followed from the latest of D:A:D study entry or January 1, 2004, until the earliest of a first cancer diagnosis, February 1, 2012, death, or 6 months after the last visit. Multivariable Poisson regression models assessed associations between cumulative (per year) use of either any cART or PI/NNRTI, and the incidence of any cancer, non-AIDS-defining cancers (NADC), AIDS-defining cancers (ADC), and the most frequently occurring ADC (Kaposi sarcoma, non-Hodgkin lymphoma) and NADC (lung, invasive anal, head/neck cancers, and Hodgkin lymphoma). RESULTS A total of 41,762 persons contributed 241,556 person-years (PY). A total of 1832 cancers were diagnosed [incidence rate: 0.76/100 PY (95% confidence interval: 0.72 to 0.79)], 718 ADC [0.30/100 PY (0.28-0.32)], and 1114 NADC [0.46/100 PY (0.43-0.49)]. Longer exposure to cART was associated with a lower ADC risk [adjusted rate ratio: 0.88/year (0.85-0.92)] but a higher NADC risk [1.02/year (1.00-1.03)]. Both PI and NNRTI use were associated with a lower ADC risk [PI: 0.96/year (0.92-1.00); NNRTI: 0.86/year (0.81-0.91)]. PI use was associated with a higher NADC risk [1.03/year (1.01-1.05)]. Although this was largely driven by an association with anal cancer [1.08/year (1.04-1.13)], the association remained after excluding anal cancers from the end point [1.02/year (1.01-1.04)]. No association was seen between NNRTI use and NADC [1.00/year (0.98-1.02)]. CONCLUSIONS Cumulative use of PIs may be associated with a higher risk of anal cancer and possibly other NADC. Further investigation of biological mechanisms is warranted.
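The incidence rates here are simple event counts over person-years with Poisson confidence limits, so the headline figure can be checked directly from the reported counts:

```python
import math

# 1832 cancers over 241,556 person-years. The 95% CI below uses the
# normal approximation to the Poisson count, which is essentially exact
# at this sample size.
events, py = 1832, 241_556

rate = events / py * 100                    # per 100 person-years
half = 1.96 * math.sqrt(events) / py * 100  # approximate CI half-width
print(f"{rate:.2f}/100 PY (95% CI {rate - half:.2f} to {rate + half:.2f})")
# -> 0.76/100 PY (95% CI 0.72 to 0.79), matching the abstract
```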
Abstract:
BACKGROUND & AIMS Pegylated interferon-based treatment is still the backbone of current hepatitis C therapy and is associated with bone marrow suppression and an increased risk of infections. The aim of this retrospective cohort study was to assess the risk of infections during interferon-based treatment among patients with chronic HCV infection and advanced hepatic fibrosis, and its relation to treatment-induced neutropenia. METHODS This cohort study included all consecutive patients with chronic HCV infection and biopsy-proven bridging fibrosis or cirrhosis (Ishak 4-6) who started treatment between 1990 and 2003 in five large hepatology units in Europe and Canada. Neutrophil counts between 500 and 749/μL were considered moderate neutropenia, and counts below 500/μL severe neutropenia. RESULTS This study included 723 interferon-based treatments, administered to 490 patients. In total, 113 infections were reported during 88 (12%) treatments, of which 24 (21%) were considered severe. Only one patient was found to have moderate neutropenia and three patients were found to have severe neutropenia at the visit before the infection. Three hundred and twelve (99.7%) visits with moderate neutropenia and 44 (93.6%) visits with severe neutropenia were not followed by an infection. Multivariable analysis showed that cirrhosis (OR 2.85, 95%CI 1.38-5.90, p=0.005) and severe neutropenia at the previous visit (OR 5.42, 95%CI 1.34-22.0, p=0.018) were associated with the occurrence of infection, while moderate neutropenia was not. In the subgroup of patients treated with PegIFN, severe neutropenia was not significantly associated with infection (OR 1.63, 95%CI 0.19-14.2, p=0.660). CONCLUSIONS In this large cohort of patients with bridging fibrosis and cirrhosis, infections during interferon-based therapy were generally mild. Severe interferon-induced neutropenia rarely occurred, but was associated with on-treatment infection. Moderate neutropenia was not associated with infection, suggesting that current dose reduction guidelines might be too strict.
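The neutropenia grades used in the analysis reduce to two thresholds on the neutrophil count, which a small helper makes explicit:

```python
def neutropenia_grade(neutrophils_per_ul: float) -> str:
    """Grade neutropenia by the study's cutoffs:
    500-749/uL moderate, below 500/uL severe."""
    if neutrophils_per_ul < 500:
        return "severe"
    if neutrophils_per_ul < 750:
        return "moderate"
    return "none"

for count in (1500, 600, 350):
    print(count, "->", neutropenia_grade(count))
# 1500 -> none; 600 -> moderate; 350 -> severe
```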
Abstract:
Childhood leukaemia (CL) may have an infectious cause and population mixing may therefore increase the risk of CL. We aimed to determine whether CL was associated with population mixing in Switzerland. We followed children aged <16 years in the Swiss National Cohort 1990-2008 and linked CL cases from the Swiss Childhood Cancer Registry to the cohort. We calculated adjusted hazard ratios (HRs) for all CL, CL at age <5 years and acute lymphoblastic leukaemia (ALL) for three measures of population mixing (population growth, in-migration and diversity of origin), stratified by degree of urbanisation. Measures of population mixing were calculated for all municipalities for the 5-year period preceding the 1990 and 2000 censuses. Analyses were based on 2,128,012 children of whom 536 developed CL. HRs comparing highest with lowest quintile of population growth were 1.11 [95 % confidence interval (CI) 0.65-1.89] in rural and 0.59 (95 % CI 0.43-0.81) in urban municipalities (interaction: p = 0.271). Results were similar for ALL and for CL at age <5 years. For level of in-migration there was evidence of a negative association with ALL. HRs comparing highest with lowest quintile were 0.60 (95 % CI 0.41-0.87) in urban and 0.61 (95 % CI 0.30-1.21) in rural settings. There was little evidence of an association with diversity of origin. This nationwide cohort study of the association between CL and population growth, in-migration and diversity of origin provides little support for the population mixing hypothesis.
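The exposure contrasts compare the highest with the lowest quintile of each population-mixing measure. A sketch of that quintile coding with pandas, on simulated municipality data rather than the Swiss census values:

```python
import numpy as np
import pandas as pd

# Rank municipalities into quintiles of 5-year population growth; the
# hazard ratios then contrast Q5 against Q1. Data here are simulated.
rng = np.random.default_rng(1)
muni = pd.DataFrame({
    "municipality": range(100),
    "pop_growth_pct": rng.normal(loc=3.0, scale=2.0, size=100),
})

muni["growth_q"] = pd.qcut(muni["pop_growth_pct"], q=5,
                           labels=["Q1", "Q2", "Q3", "Q4", "Q5"])
print(muni["growth_q"].value_counts().sort_index())  # 20 per quintile
```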
Abstract:
BACKGROUND The copy number variation (CNV) in beta-defensin genes (DEFB) on human chromosome 8p23 has been proposed to contribute to phenotypic differences in inflammatory diseases. However, determination of the exact DEFB copy number (CN) is a major challenge in association studies. Quantitative real-time PCR (qPCR), paralog ratio tests (PRT) and multiplex ligation-dependent probe amplification (MLPA) have been used extensively to determine DEFB CN in different laboratories, but inter-method inconsistencies have been observed frequently. In this study we asked which of the three methods is superior for DEFB CN determination. RESULTS We developed a clustering approach for MLPA and PRT to statistically correlate data from a single experiment. We then compared qPCR, a newly designed PRT and MLPA for DEFB CN determination in 285 DNA samples. We found that MLPA had the best convergence and clustering of the raw data and the highest call rate. In addition, the concordance rates between MLPA or PRT and qPCR (32.12% and 37.99%, respectively) were unacceptably low, with qPCR underestimating CN. The concordance rate between MLPA and PRT (90.52%) was high, but PRT systematically underestimated CN by one in a subset of samples. In these samples, a sequence variant causing complete PCR dropout of the respective DEFB cluster copies was found in one primer binding site of one of the targeted paralogous pseudogenes. CONCLUSION MLPA is superior to PRT, and even more so to qPCR, for DEFB CN determination. Although the applied PRT provides reliable results in most cases, such a test is particularly sensitive to low-frequency sequence variants, which preferentially accumulate in loci such as pseudogenes that are most likely not under selective pressure. In light of the superior performance of multiplex assays, the drawbacks of single PRTs could be overcome by combining more test markers.
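Copy-number calling from continuous assay signals is, at heart, an assignment of each measured ratio to the nearest integer cluster. A naive sketch of that idea (the paper's approach clusters the raw data statistically; this version just rounds and flags ambiguous samples, which also illustrates why call rates differ between methods):

```python
# Naive integer copy-number calling from a continuous signal, assuming
# the assay yields a ratio roughly proportional to CN. Hypothetical data.

def call_cn(ratio: float, max_dist: float = 0.3):
    """Return the nearest integer CN, or None when the ratio sits too
    far from any integer to call confidently (a 'no call')."""
    cn = round(ratio)
    return cn if abs(ratio - cn) <= max_dist else None

measurements = [3.9, 4.1, 5.05, 3.48, 6.2]  # hypothetical DEFB ratios
print([call_cn(r) for r in measurements])
# [4, 4, 5, None, 6] -- 3.48 is ambiguous and lowers the call rate
```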
Alcoholic Cirrhosis Increases Risk for Autoimmune Diseases: A Nationwide Registry-Based Cohort Study
Abstract:
BACKGROUND & AIMS Alcoholic cirrhosis is associated with hyperactivation and dysregulation of the immune system. In addition to increasing the risk for infections, it may also increase the risk for autoimmune diseases. We studied the incidence of autoimmune diseases among patients with alcoholic cirrhosis vs controls in Denmark. METHODS We collected data from nationwide health care registries to identify and follow up all citizens of Denmark diagnosed with alcoholic cirrhosis from 1977 through 2010. Each patient was matched with 5 random individuals from the population (controls) of the same sex and age. The incidence rates of various autoimmune diseases were compared between patients with cirrhosis and controls and adjusted for the number of hospitalizations in the previous year (a marker for the frequency of clinical examination). RESULTS Of the 24,679 patients diagnosed with alcoholic cirrhosis, 532 developed an autoimmune disease, yielding an overall increased adjusted incidence rate ratio (aIRR) of 1.36 (95% confidence interval [CI], 1.24-1.50). The strongest associations were with Addison's disease (aIRR, 2.47; 95% CI, 1.04-5.85), inflammatory bowel disease (aIRR, 1.56; 95% CI, 1.26-1.92), celiac disease (aIRR, 5.12; 95% CI, 2.58-10.16), pernicious anemia (aIRR, 2.35; 95% CI, 1.50-3.68), and psoriasis (aIRR, 4.06; 95% CI, 3.32-4.97). There was no increase in the incidence rate of rheumatoid arthritis (aIRR, 0.89; 95% CI, 0.69-1.15), and the incidence rate of polymyalgia rheumatica was decreased in patients with alcoholic cirrhosis compared with controls (aIRR, 0.47; 95% CI, 0.33-0.67). CONCLUSIONS Based on a nationwide cohort study of patients in Denmark, alcoholic cirrhosis is a risk factor for several autoimmune diseases.
Abstract:
Astronauts performing extravehicular activities (EVA) are at risk for occupational hazards due to the hypobaric environment, in particular decompression sickness (DCS). DCS results from nitrogen gas bubble formation in body tissues and venous blood. Denitrogenation achieved through lengthy staged decompression protocols has been the mainstay of DCS prevention in space. Due to the greater number and duration of EVAs scheduled for construction and maintenance of the International Space Station, more efficient alternatives to accomplish missions without compromising astronaut safety are desirable. This multi-center, multi-phase study (NASA Prebreathe Reduction Protocol study, or PRP) was designed to identify a shorter denitrogenation protocol that can be implemented before an EVA, based on the combination of adynamia and exercise-enhanced oxygen prebreathe. Human volunteers recruited at three sites (Texas, North Carolina and Canada) underwent three different combinations (“PRP phases”) of intense and light exercise prior to decompression in an altitude chamber. The outcome variables were detection of venous gas embolism (VGE) by precordial Doppler ultrasound and clinical manifestations of DCS. Independent variables included age, gender, body mass index, peak oxygen consumption, peak heart rate, and PRP phase. Data analysis was performed both by pooling results from all study sites and by examining each site separately. Ten percent of the subjects developed DCS and 20% showed evidence of high grade VGE. No cases of DCS occurred in the PRP phase that combined dual-cycle ergometry (10 minutes at 75% of VO2 peak) with 24 minutes of light EVA exercise (p = 0.04). No significant effects were found for the remaining independent variables on the occurrence of DCS. High grade VGE showed a strong correlation with subsequent development of DCS (sensitivity, 88.2%; specificity, 87.2%). In the presence of high grade VGE, the relative risk for DCS ranged from 7.52 to 35.0. In summary, a good safety level can be achieved with exercise-enhanced oxygen denitrogenation that can be generalized to the astronaut population. Exercise is beneficial in preventing DCS if a specific schedule is followed, with an individualized VO2 prescription that provides a safety level that can then be applied to space operations. Furthermore, VGE Doppler detection is a useful clinical tool for prediction of altitude DCS. Because of the small number of high grade VGE episodes, identifying a high-probability DCS situation based on the presence of high grade VGE seems justified in astronauts.
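Given the reported 10% DCS incidence and the sensitivity and specificity of high grade VGE, the post-test risks and the relative risk follow directly from Bayes' rule; the result lands inside the 7.52-35.0 range quoted above:

```python
# DCS risk with and without high grade VGE, from the abstract's figures:
# incidence 10%, sensitivity 88.2%, specificity 87.2%.
prev, sens, spec = 0.10, 0.882, 0.872

p_dcs_given_vge = (prev * sens) / (prev * sens + (1 - prev) * (1 - spec))
p_dcs_given_none = (prev * (1 - sens)) / (prev * (1 - sens) + (1 - prev) * spec)

print(f"P(DCS | high grade VGE) = {p_dcs_given_vge:.2f}")   # ~0.43
print(f"P(DCS | no high grade)  = {p_dcs_given_none:.3f}")  # ~0.015
print(f"relative risk           = {p_dcs_given_vge / p_dcs_given_none:.1f}")
# ~29, within the reported 7.52-35.0 range
```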
Abstract:
Genital human papillomavirus (HPV) is of public health concern because persistent infection with certain HPV types can cause cervical cancer. In response to a nationwide push for cervical cancer legislation, Texas Governor Rick Perry bypassed the traditional legislative process and issued an executive order mandating compulsory HPV vaccination for all female public school students prior to their entrance into the sixth grade. By bypassing the legislative process, Governor Perry did not effectively mitigate the risk perception issues that arose around the need for and usefulness of the vaccine mandate. This policy paper uses a social policy paradigm to identify perception as the key intervening factor in how the public responds to risk information. To demonstrate how the HPV mandate failed, it analyzes four factors (economics, politics, knowledge and culture) that shape perception and influence the public's response. By understanding the factors that influence the public's perception, public health practitioners and policy makers can more effectively create preventive health policy at the state level.
Abstract:
Approximately one-third of US adults have metabolic syndrome, the clustering of cardiovascular risk factors that includes hypertension, abdominal adiposity, elevated fasting glucose, low high-density lipoprotein (HDL) cholesterol and elevated triglyceride levels. While the definition of metabolic syndrome continues to be much debated among leading health research organizations, individuals with metabolic syndrome have an increased risk of developing cardiovascular disease and/or type 2 diabetes. A recent report by the Henry J. Kaiser Family Foundation found that the US spent $2.2 trillion (16.2% of the Gross Domestic Product) on healthcare in 2007 and cited chronic diseases, including type 2 diabetes and cardiovascular disease, among the large contributors to this growing national expenditure. Employers, the leading providers of health insurance, bear a substantial portion of this cost. In light of this, many employers have begun implementing health promotion efforts to counteract these rising costs. However, evidence-based practices, uniform guidelines and policy do not exist for this setting with regard to the prevention of metabolic syndrome risk factors as defined by the National Cholesterol Education Program (NCEP) Adult Treatment Panel III (ATP III). Therefore, the aim of this review was to determine the effects of worksite-based behavior change programs on reducing the risk factors for metabolic syndrome in adults. Using relevant search terms, OVID MEDLINE was searched for peer-reviewed literature published since 1998, resulting in 23 articles meeting the inclusion criteria. The American Dietetic Association's Evidence Analysis Process was used to abstract data from selected articles, assess the quality of each study, compile the evidence, develop a summarized conclusion, and assign a grade based upon the strength of supporting evidence. The results revealed that participating in a worksite-based behavior change program may be associated with improvement in one or more metabolic syndrome risk factors. Programs that delivered a higher dose (>22 hours) over a shorter duration (<2 years) using two or more behavior-change strategies were associated with more metabolic risk factors being positively impacted. A Conclusion Grade of III was obtained for the evidence, indicating that studies were of weak design or that results were inconclusive due to inadequate sample sizes, bias and lack of generalizability. These results provide some support for the continued use of worksite-based health promotion, and further research is needed to determine whether multi-strategy, intensive behavior change programs targeting multiple risk factors can sustain health improvements in the long term.
Abstract:
Background. Understanding the association between diet and risk of pancreatic cancer is important for clarifying the etiology of the disease. Objectives. Describe the dietary patterns of cases of adenocarcinoma of the pancreas and non-cancer controls, and evaluate the odds of having a healthy eating pattern among cases and non-cancer controls. Design and Methods. An ongoing hospital-based case-control study was conducted in Houston, Texas from 2000-2008 with 678 pancreatic adenocarcinoma cases and 724 controls. Participants completed a food frequency questionnaire and a risk factor questionnaire. Dietary patterns were derived by principal component analysis, and associations between dietary patterns and pancreatic cancer risk were assessed using unconditional logistic regression. Results. Two dietary patterns were derived: fruit-vegetable and high fat-meat. There were no statistically significant associations between the fruit-vegetable pattern and pancreatic cancer. An inverse association was seen between the high fat-meat pattern and pancreatic cancer risk when comparing those in the upper intake quintile to those in the lowest quintile after adjusting for demographic and risk factor variables (OR=0.67, p=0.03). In sex-stratified analysis adjusted for demographic and risk factor variables, females in the upper intake quintile of the fruit-vegetable pattern had a 49% lower risk of pancreatic cancer compared to females in the lowest quintile (OR=0.51, p=0.03). An inverse relationship was also seen for the high fat-meat pattern when comparing females in the upper intake quintile to females in the lowest quintile (OR=0.50, p=0.03). In males, neither dietary pattern was significantly associated with pancreatic cancer. Conclusions. The current findings for the fruit-vegetable pattern are similar to those of previous studies and support the hypothesis of an inverse association between a “healthy” diet (comprised of fruits, vegetables, and whole grains) and risk of pancreatic cancer (in females only). However, the inverse relationship between the high fat-meat pattern and risk of pancreatic cancer is contrary to other results. Further research on dietary patterns and pancreatic cancer risk may lead to a better understanding of the etiology of pancreatic cancer.
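Deriving dietary patterns by principal component analysis amounts to extracting the leading components of the food-frequency matrix and scoring each participant on them. A minimal scikit-learn sketch on simulated FFQ data; the food groups, counts and resulting loadings are illustrative only:

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated food-frequency data: rows are participants, columns are
# weekly servings of food groups. Real analyses would also rotate the
# components and name each pattern from its item loadings.
rng = np.random.default_rng(0)
food_groups = ["fruit", "vegetables", "whole_grains", "red_meat", "fried_food"]
ffq = rng.poisson(lam=[7, 10, 5, 3, 2], size=(500, 5)).astype(float)

pca = PCA(n_components=2)
scores = pca.fit_transform((ffq - ffq.mean(0)) / ffq.std(0))  # standardize first

print("explained variance:", np.round(pca.explained_variance_ratio_, 2))
print("PC1 loadings:", dict(zip(food_groups, np.round(pca.components_[0], 2))))
# Participant scores on each component would then be split into quintiles
# and entered into unconditional logistic regression against case status.
```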