103 results for Risk-factor Profile


Relevance: 90.00%

Abstract:

OBJECTIVE: To examine the duration of methicillin-resistant Staphylococcus aureus (MRSA) carriage and its determinants and the influence of eradication regimens. DESIGN: Retrospective cohort study. SETTING: A 1,033-bed tertiary care university hospital in Bern, Switzerland, in which the prevalence of methicillin resistance among S. aureus isolates is less than 5%. PATIENTS: A total of 116 patients with first-time MRSA detection identified at University Hospital Bern between January 1, 2000, and December 31, 2003, were followed up for a mean duration of 16.2 months. RESULTS: Sixty-eight patients (58.6%) cleared colonization, with a median time to clearance of 7.4 months. Independent determinants of shorter carriage duration were the absence of any modifiable risk factor (receipt of antibiotics, use of an indwelling device, or presence of a skin lesion) (hazard ratio [HR], 0.20 [95% confidence interval {CI}, 0.09-0.42]), absence of immunosuppressive therapy (HR, 0.49 [95% CI, 0.23-1.02]), and hemodialysis (HR, 0.08 [95% CI, 0.01-0.66]) at the time MRSA was first detected, and the administration of a decolonization regimen in the absence of a modifiable risk factor (HR, 2.22 [95% CI, 1.36-3.64]). Failure of decolonization treatment was associated with the presence of risk factors at the time of treatment (P=.01). Intermittent screenings that were negative for MRSA were frequent (26% of patients), occurred early after first detection of MRSA (median, 31.5 days), and were associated with a lower probability of clearing colonization (HR, 0.34 [95% CI, 0.17-0.67]) and an increased risk of MRSA infection during follow-up. CONCLUSIONS: Risk factors for MRSA acquisition should be carefully assessed in all MRSA carriers and should inform infection control policies, such as the timing of decolonization treatment, the definition of MRSA clearance, and the decision of when to suspend isolation measures.
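
The hazard ratios above come from a time-to-event analysis of carriage duration. A minimal sketch of how such an analysis could be set up with a Cox proportional-hazards model is shown below; the lifelines package, the simulated data, and the column names are illustrative assumptions, not the study's actual methods or records.

```python
# Sketch: time-to-clearance analysis with a Cox proportional-hazards model.
# Simulated data; covariates mirror the determinants named in the abstract.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 116  # cohort size, used here only to size the simulated data
df = pd.DataFrame({
    "modifiable_risk":   rng.integers(0, 2, n),  # antibiotics, device, or skin lesion
    "immunosuppression": rng.integers(0, 2, n),
    "decolonized":       rng.integers(0, 2, n),
})
# months until clearance (simulated to be longer when a modifiable risk factor is present)
df["months"] = rng.exponential(8, n) * np.exp(0.8 * df["modifiable_risk"])
df["cleared"] = rng.integers(0, 2, n)            # 1 = clearance observed, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="cleared")
cph.print_summary()   # the exp(coef) column gives hazard ratios with 95% CIs
```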

Relevance: 90.00%

Abstract:

A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects. Thus, they can only be detected clinically. Consequently, erosion not diagnosed at an early stage may render timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake for a defined period of time. A dentist can then determine the erosive potential of the diet. In particular, patients with more than four dietary acid intakes have a higher risk for erosion when other risk factors (such as holding the drink in the mouth) are present. Regurgitation of gastric acids (reflux, vomiting, alcohol abuse, etc.) is a further important risk factor for the development of erosion which has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patient. It may comprise dietary advice, optimization of fluoride regimens, stimulation of salivary flow rate, use of buffering medicaments, and particular motivation for nondestructive toothbrushing habits with a low-abrasive toothpaste. The frequent use of fluoride gel and fluoride solution in addition to fluoride toothpaste offers the opportunity to somewhat reduce abrasion of tooth substance. It is also advisable to avoid abrasive tooth cleaning and whitening products, since they may remove the pellicle and render teeth more susceptible to erosion. Since erosion, attrition, and abrasion often occur simultaneously, all causative components must be taken into consideration when planning preventive strategies.

Relevance: 90.00%

Abstract:

OBJECTIVES: To estimate changes in coronary risk factors and their implications for coronary heart disease (CHD) rates in men starting highly active antiretroviral therapy (HAART). METHODS: Men participating in the Swiss HIV Cohort Study with measurements of coronary risk factors both before and up to 3 years after starting HAART were identified. Fractional polynomial regression was used to graph associations between risk factors and time on HAART. Mean risk factor changes associated with starting HAART were estimated using multilevel models. A prognostic model was used to predict corresponding CHD rate ratios. RESULTS: Of 556 eligible men, 259 (47%) started a nonnucleoside reverse transcriptase inhibitor (NNRTI)-based and 297 a protease inhibitor (PI)-based regimen. Levels of most risk factors increased sharply during the first 3 months on HAART, then more slowly. Increases were greater with PI- than with NNRTI-based HAART for total cholesterol (1.18 vs. 0.98 mmol L(-1)), systolic blood pressure (3.6 vs. 0 mmHg), and BMI (1.04 vs. 0.55 kg m(-2)), but not for HDL cholesterol (0.24 vs. 0.32 mmol L(-1)) or glucose (1.02 vs. 1.03 mmol L(-1)). Predicted CHD rate ratios were 1.40 (95% CI 1.13-1.75) and 1.17 (0.95-1.47) for PI- and NNRTI-based HAART, respectively. CONCLUSIONS: Coronary heart disease rates will increase in a majority of patients starting HAART; however, the increases corresponding to typical changes in risk factors are relatively modest and could be offset by lifestyle changes.
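
The predicted CHD rate ratios follow from applying a prognostic model to the mean risk-factor changes. As a rough illustration of the mechanics only: under a log-linear prognostic model, the rate ratio for a set of changes is the exponential of the weighted sum of those changes. The coefficients in the sketch below are hypothetical placeholders, not those of the model actually used in the study.

```python
# Sketch: how a log-linear prognostic model turns mean risk-factor changes
# into a predicted CHD rate ratio. The beta values are hypothetical placeholders.
import math

beta = {              # log-rate per unit of each risk factor (hypothetical)
    "total_chol_mmol": 0.25,
    "hdl_chol_mmol":  -0.50,
    "sbp_mmHg":        0.02,
    "glucose_mmol":    0.05,
}
delta_pi = {          # mean changes reported for PI-based HAART in the abstract
    "total_chol_mmol": 1.18,
    "hdl_chol_mmol":   0.24,
    "sbp_mmHg":        3.6,
    "glucose_mmol":    1.02,
}

rate_ratio = math.exp(sum(beta[k] * delta_pi[k] for k in beta))
print(f"predicted CHD rate ratio: {rate_ratio:.2f}")
```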

Relevance: 90.00%

Abstract:

BACKGROUND: Mannose-binding lectin-associated serine protease-2 (MASP-2) is an essential component of the lectin pathway of complement activation. MASP-2 deficiency is common because of genetic polymorphisms, but its impact on susceptibility to infection is largely unknown. The aim of the present study was to determine whether children with cancer and MASP-2 deficiency develop more frequent or more severe episodes of fever and severe chemotherapy-induced neutropenia (FN). METHODS: Serum MASP-2 was measured by enzyme-linked immunosorbent assay at the time of diagnosis in children treated with chemotherapy for cancer. Association of FN episodes with MASP-2 concentration was analyzed using Poisson regression accounting for chemotherapy intensity and duration. RESULTS: Median MASP-2 in 94 children was 527 ng/mL (interquartile range, 367-686). Nine (10%) children had MASP-2 deficiency (<200 ng/mL). During a cumulative chemotherapy exposure time of 82 years, 177 FN episodes were recorded. MASP-2 deficient children had a significantly increased risk of developing FN (multivariate risk ratio, 2.08; 95% confidence interval, 1.31-3.21; P = 0.002), translating into significantly prolonged cumulative duration of hospitalization and of intravenous antimicrobial therapy. They experienced significantly more episodes of FN without a microbiologically defined etiology, and there was a trend toward more frequent episodes of FN with bacteremia. CONCLUSION: In this study, MASP-2 deficiency was associated with an increased risk of FN in children treated with chemotherapy for cancer. MASP-2 deficiency represents a novel risk factor for chemotherapy-related infections.
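
One way to analyze episode counts while "accounting for chemotherapy intensity and duration" is a Poisson regression with the log of exposure time as an offset. The sketch below illustrates that setup with simulated data and illustrative variable names; the study's actual model specification is not reproduced here.

```python
# Sketch: Poisson regression of FN episode counts with log(exposure time) as an offset.
# Data and variable names are simulated/illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 94
df = pd.DataFrame({
    "masp2_deficient": rng.integers(0, 2, n),        # serum MASP-2 < 200 ng/mL
    "intensive_chemo": rng.integers(0, 2, n),
    "exposure_years":  rng.uniform(0.2, 2.0, n),
})
df["fn_episodes"] = rng.poisson(1.5 * df["exposure_years"] * np.exp(0.7 * df["masp2_deficient"]))

fit = smf.glm("fn_episodes ~ masp2_deficient + intensive_chemo",
              data=df, family=sm.families.Poisson(),
              offset=np.log(df["exposure_years"])).fit()
print(np.exp(fit.params))      # rate ratios
print(np.exp(fit.conf_int()))  # 95% CIs on the rate-ratio scale
```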

Relevance: 90.00%

Abstract:

BACKGROUND: Respiratory syncytial virus (RSV) infection is an important cause of viral respiratory tract infection in children. In contrast to other confirmed risk factors that predispose to higher morbidity and mortality, the particular risk of a preexisting neuromuscular impairment (NMI) in hospitalized children with RSV infection has not been prospectively studied in a multicenter trial. METHODS: The DMS RSV Paed database was designed for the prospective multicenter documentation and analysis of all clinically relevant aspects of the management of inpatients with RSV infection. Patients with clinically relevant NMI were identified according to the specific comments of the attending physicians and compared with those without NMI. RESULTS: This study covers 6 consecutive seasons; the surveillance took place in 14 pediatric hospitals in Germany from 1999 to 2005. In total, 1568 RSV infections were prospectively documented in 1541 pediatric patients. Of these, 73 (4.7%) patients displayed a clinically relevant NMI; 41 (56%) NMI patients had at least 1 additional risk factor for a severe course of the infection (multiple risk factors in some patients; prematurity in 30, congenital heart disease in 19, chronic lung disease in 6, and immunodeficiency in 8). Median age at diagnosis was higher in NMI patients (14 vs. 5 months); NMI patients had a greater risk of seizures (15.1% vs. 1.6%), and a higher proportion in the NMI group had to be mechanically ventilated (9.6% vs. 1.9%). Finally, the attributable mortality was significantly higher in the NMI group (5.5% vs. 0.2%; P < 0.001 for all). Multivariate logistic regression confirmed that NMI was independently associated with pediatric intensive care unit (PICU) admission (OR, 4.94; 95% CI, 2.69-8.94; P < 0.001) and mechanical ventilation (OR, 3.85; 95% CI, 1.28-10.22; P = 0.017). CONCLUSION: This is the first prospective multicenter study confirming the hypothesis that children with clinically relevant NMI face an increased risk of severe RSV disease. It seems reasonable to include NMI as a cofactor in the decision algorithm for passive immunization.
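
A minimal sketch of the kind of multivariable logistic regression reported for PICU admission, with simulated data and illustrative covariates (not the study's actual variable set or records):

```python
# Sketch: multivariable logistic regression for PICU admission with NMI and
# other risk factors as covariates. Simulated, illustrative data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1541
df = pd.DataFrame({
    "nmi":           rng.integers(0, 2, n),
    "prematurity":   rng.integers(0, 2, n),
    "heart_disease": rng.integers(0, 2, n),
})
logit_p = -3 + 1.5 * df["nmi"] + 0.8 * df["prematurity"]
df["picu"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("picu ~ nmi + prematurity + heart_disease", data=df).fit()
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs
```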

Relevance: 90.00%

Abstract:

BACKGROUND: Many epidemiological studies indicate a positive correlation between cataract surgery and the subsequent progression of age-related macular degeneration (AMD). Such a correlation would have far-reaching consequences. However, in epidemiological studies it is difficult to determine the significance of a single risk factor, such as cataract surgery. PATIENTS AND METHODS: We performed a retrospective case-control study of patients with new-onset exudative age-related macular degeneration to determine whether cataract surgery was a predisposing factor. A total of 1496 eyes were included in the study: 984 cases with new onset of exudative AMD and 512 control eyes with early signs of age-related maculopathy. Lens status (phakic or pseudophakic) was determined for each eye. RESULTS: There was no significant difference in lens status between the study and control groups (227/984 [23.1%] vs. 112/512 [21.8%] pseudophakic, p = 0.6487; OR = 1.071; 95% CI = 0.8284-1.384). In cases with bilateral pseudophakia (n = 64), there was no statistically significant difference in the interval between cataract surgery in either eye and the onset of exudative AMD in the study eye (225.9 +/- 170.4 vs. 209.9 +/- 158.2 weeks, p = 0.27). CONCLUSIONS: Our results provide evidence that cataract surgery is not a major risk factor for the development of exudative AMD.
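
The reported odds ratio and confidence interval can be reproduced directly from the 2x2 table of lens status. The sketch below uses the standard large-sample (Woolf) interval and recovers the published figures to within rounding.

```python
# Worked check of the reported odds ratio from the 2x2 table
# (227/984 pseudophakic AMD cases vs. 112/512 pseudophakic controls),
# with the standard large-sample (Woolf) confidence interval.
import math

a, b = 227, 984 - 227   # cases:    pseudophakic, phakic
c, d = 112, 512 - 112   # controls: pseudophakic, phakic

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = (math.exp(math.log(odds_ratio) + z * se_log_or) for z in (-1.96, 1.96))

print(f"OR = {odds_ratio:.3f}, 95% CI {lo:.4f}-{hi:.3f}")
# -> OR = 1.071, 95% CI roughly 0.8285-1.384, matching the reported values to within rounding
```

The Woolf interval is only one of several valid ways to compute a confidence interval for an odds ratio; exact or profile-likelihood intervals would differ slightly.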

Relevance: 90.00%

Abstract:

Mass screening for osteoporosis using DXA measurements at the spine and hip is presently not recommended by health authorities. Instead, risk factor questionnaires and peripheral bone measurements may facilitate the selection of women eligible for axial bone densitometry. The aim of this study was to validate a case-finding strategy for postmenopausal women who would benefit most from subsequent DXA measurement by using phalangeal radiographic absorptiometry (RA) alone or in combination with risk factors in a general practice setting. The sensitivity and specificity of this strategy in detecting osteoporosis (T-score < or = -2.5 SD at the spine and/or the hip) were compared with those of the current reimbursement criteria for DXA measurements in Switzerland. Four hundred and twenty-three postmenopausal women with one or more risk factors for osteoporosis were recruited by 90 primary care physicians, who also performed the phalangeal RA measurements. All women underwent subsequent DXA measurement of the spine and the hip at the Osteoporosis Policlinic of the University Hospital of Berne. They were allocated to one of two groups depending on whether or not they met the Swiss reimbursement conditions for DXA measurement. Logistic regression models were used to predict the likelihood of osteoporosis versus "no osteoporosis" and to derive ROC curves for the various strategies. Differences in the areas under the ROC curves (AUC) were tested for significance. In women lacking reimbursement criteria, RA achieved a significantly larger AUC (0.81; 95% CI 0.72-0.89) than the risk factors associated with patients' age, height, and weight (0.71; 95% CI 0.62-0.80). Furthermore, in this study, RA provided better sensitivity and specificity in identifying women with underlying osteoporosis than the currently accepted criteria for reimbursement of DXA measurement. In the Swiss setting, RA is a valid case-finding tool for patients with risk factors for osteoporosis, especially for those who do not qualify for DXA reimbursement.
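
The comparison of strategies rests on logistic models and the areas under their ROC curves against the DXA reference standard. A minimal sketch of that comparison with simulated data is below; scikit-learn, the variable names, and the simulated effect sizes are illustrative assumptions, not the study's analysis.

```python
# Sketch: comparing case-finding strategies by AUC against a DXA-defined
# reference standard (T-score <= -2.5). Simulated data for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 423
osteoporosis = rng.integers(0, 2, n)                          # DXA reference standard
ra_tscore = -1.0 - 1.2 * osteoporosis + rng.normal(0, 1, n)   # phalangeal RA T-score
risk_factors = np.column_stack([
    60 + 8 * rng.standard_normal(n),                          # age
    68 - 3 * osteoporosis + 10 * rng.standard_normal(n),      # weight
])

auc_ra = roc_auc_score(osteoporosis, -ra_tscore)              # lower T-score = higher risk
rf_model = LogisticRegression().fit(risk_factors, osteoporosis)
auc_rf = roc_auc_score(osteoporosis, rf_model.predict_proba(risk_factors)[:, 1])
print(f"AUC, RA alone: {auc_ra:.2f}; AUC, risk-factor model: {auc_rf:.2f}")
```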

Relevance: 90.00%

Abstract:

BACKGROUND: Extracapsular tumor spread (ECS) has been identified as a possible risk factor for breast cancer recurrence, but controversy exists regarding its role in decision making for regional radiotherapy. This study evaluates ECS as a predictor of local, axillary, and supraclavicular recurrence. PATIENTS AND METHODS: International Breast Cancer Study Group Trial VI accrued 1475 eligible pre- and perimenopausal women with node-positive breast cancer who were randomly assigned to receive three to nine courses of classical combination chemotherapy with cyclophosphamide, methotrexate, and fluorouracil. ECS status was determined retrospectively in 933 patients based on review of pathology reports. Cumulative incidence and hazard ratios (HRs) were estimated using methods for competing risks analysis. Adjustment factors included treatment group and baseline patient and tumor characteristics. The median follow-up was 14 years. RESULTS: In univariable analysis, ECS was significantly associated with supraclavicular recurrence (HR = 1.96; 95% confidence interval 1.23-3.13; P = 0.005). HRs for local and axillary recurrence were 1.38 (P = 0.06) and 1.81 (P = 0.11), respectively. Following adjustment for number of lymph node metastases and other baseline prognostic factors, ECS was not significantly associated with any of the three recurrence types studied. CONCLUSIONS: Our results indicate that the decision for additional regional radiotherapy should not be based solely on the presence of ECS.
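
Cumulative incidence under competing risks is usually estimated with an Aalen-Johansen-type estimator rather than 1 minus Kaplan-Meier. A minimal sketch with simulated data follows; lifelines' AalenJohansenFitter and the event coding are assumptions for illustration, not the study's software or data.

```python
# Sketch: cumulative incidence of one recurrence type, treating the other
# recurrence types as competing events. Simulated data.
# Event coding: 0 = censored, 1 = supraclavicular recurrence, 2 = competing event.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(4)
n = 933
durations = rng.exponential(8, n)                      # years to first event or censoring
events = rng.choice([0, 1, 2], size=n, p=[0.5, 0.1, 0.4])

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
print(ajf.cumulative_density_.tail())                  # cumulative incidence over time
```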

Relevance: 90.00%

Abstract:

BACKGROUND: This study analyzed the impact of the weight reduction method and of preoperative and intraoperative variables on the outcome of reconstructive body contouring surgery following massive weight reduction. METHODS: All patients presenting with a maximal BMI >/= 35 kg/m(2) before weight reduction who underwent body contouring surgery of the trunk following massive weight loss (excess body mass index loss (EBMIL) >/= 30%) between January 2002 and June 2007 were retrospectively analyzed. Patients with incomplete records or follow-up were excluded. Statistical analysis focused on the weight reduction method and pre-, intra-, and postoperative risk factors. The outcome was compared with current literature results. RESULTS: A total of 104 patients were included (87 female and 17 male; mean age 47.9 years). Massive weight reduction was achieved through bariatric surgery in 62 patients (59.6%) and dietetically in 42 patients (40.4%). Dietetically achieved EBMIL (94.20%) was higher in this cohort than surgically induced EBMIL (80.80%; p < 0.01). Bariatric surgery did not increase the risk of complications in the secondary body contouring procedures. The observed complications (26.9%) were analyzed for risk factors. Total tissue resection weight was a significant risk factor (p < 0.05). Preoperative BMI had an impact on infections (p < 0.05). No impact on the postoperative outcome was detected for EBMIL, maximal BMI, smoking, hemoglobin, blood loss, body contouring technique, or operation time. Corrective procedures were performed in 11 patients (10.6%). The results were compared with recent data. CONCLUSION: Bariatric surgery does not increase the risk of complications in subsequent body contouring procedures when compared with massive dietetic weight reduction.
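
EBMIL expresses weight loss as the share of excess BMI (above a reference value, conventionally 25 kg/m(2)) that has been lost. The abstract does not spell out the formula, so the conventional definition used below is an assumption; the worked figures are illustrative only.

```python
# Sketch: excess BMI loss (EBMIL), using the commonly used reference BMI of 25 kg/m^2.
# This definition is assumed here, not quoted from the study.
def ebmil(bmi_max: float, bmi_current: float, bmi_ref: float = 25.0) -> float:
    """Percentage of excess BMI (above bmi_ref) that has been lost."""
    return 100.0 * (bmi_max - bmi_current) / (bmi_max - bmi_ref)

# Illustrative example: a patient starting at BMI 42 who reaches BMI 28
print(f"EBMIL = {ebmil(42, 28):.1f}%")   # -> 82.4%, above the >= 30% inclusion threshold
```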

Relevance: 90.00%

Abstract:

Circulating cholesterol levels are elevated in most patients with primary biliary cirrhosis. This review questions whether hypercholesterolaemia represents a cardiovascular risk in primary biliary cirrhosis and whether it should be treated. The published evidence indicates that hypercholesterolaemia in patients with primary biliary cirrhosis should be considered a cardiovascular risk factor only when other risk factors are present. Ursodeoxycholic acid, the standard treatment for primary biliary cirrhosis, improves the cholestasis and thereby lowers circulating cholesterol levels. Primary biliary cirrhosis is not a contraindication to prescribing statins or fibrates to these patients. Interestingly, these two classes of drugs have been shown to improve not only the lipid profile but also the liver tests. In particular, fibrates have been found to normalize liver tests in patients responding incompletely to ursodeoxycholic acid. Statins as well as fibrates possess specific anti-inflammatory properties which may be beneficial in primary biliary cirrhosis. In conclusion, hypercholesterolaemia in the absence of other cardiovascular risk factors does not require specific therapeutic intervention in patients with primary biliary cirrhosis. However, statins as well as fibrates seem to have beneficial effects on primary biliary cirrhosis itself and deserve formal testing in clinical trials.

Relevance: 90.00%

Abstract:

OBJECTIVE: To investigate HIV-related immunodeficiency as a risk factor for hepatocellular carcinoma (HCC) among persons infected with HIV, while controlling for the effect of frequent coinfection with hepatitis C and B viruses. DESIGN: A case-control study nested in the Swiss HIV Cohort Study. METHODS: Twenty-six HCC patients were identified in the Swiss HIV Cohort Study or through linkage with Swiss Cancer Registries, and were individually matched to 251 controls according to Swiss HIV Cohort Study centre, sex, HIV-transmission category, age and year at enrollment. Odds ratios and corresponding confidence intervals were estimated by conditional logistic regression. RESULTS: All HCC patients were positive for hepatitis B surface antigen or antibodies against hepatitis C virus. HCC patients included 14 injection drug users (three positive for hepatitis B surface antigen and 13 for antibodies against hepatitis C virus) and 12 men having sex with men/heterosexual/other (11 positive for hepatitis B surface antigen, three for antibodies against hepatitis C virus), revealing a strong relationship between HIV transmission route and hepatitis viral type. Latest CD4+ cell count [Odds ratio (OR) per 100 cells/µl decrease = 1.33, 95% confidence interval (CI) 1.06-1.68] and CD4+ cell count percentage (OR per 10% decrease = 1.65, 95% CI 1.01-2.71) were significantly associated with HCC. The effects of CD4+ cell count were concentrated among men having sex with men/heterosexual/other rather than injecting drug users. Highly active antiretroviral therapy use was not significantly associated with HCC risk (OR for ever versus never = 0.59, 95% confidence interval 0.18-1.91). CONCLUSION: Lower CD4+ cell counts increased the risk for HCC among persons infected with HIV, an effect that was particularly evident for hepatitis B virus-related HCC arising in non-injecting drug users.
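
The "OR per 100 cells/µl decrease" is a rescaling of the underlying conditional-logistic coefficient. The sketch below shows the conversion and how the same coefficient would translate to a different-sized CD4+ drop; the 250-cell example is purely illustrative and not from the study.

```python
# Sketch: converting the reported OR per 100 cells/uL decrease back to the
# per-cell log-odds coefficient, and rescaling it to another decrement.
import math

or_per_100_decrease = 1.33                              # reported for latest CD4+ cell count
beta_per_cell = -math.log(or_per_100_decrease) / 100    # log-odds per +1 cell/uL

or_per_250_decrease = math.exp(-250 * beta_per_cell)    # illustrative rescaling only
print(f"beta = {beta_per_cell:.5f} per cell/uL; OR per 250-cell decrease = {or_per_250_decrease:.2f}")
```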

Relevance: 90.00%

Abstract:

Context: In the Health Outcomes and Reduced Incidence with Zoledronic Acid Once Yearly - Pivotal Fracture Trial (HORIZON-PFT), zoledronic acid (ZOL) 5 mg significantly reduced fracture risk. Objective: To identify factors associated with greater efficacy during ZOL 5 mg treatment. Design, Setting and Patients: Subgroup analysis (preplanned and post hoc) of a multicenter, double-blind, placebo-controlled, 36-month trial in 7765 women with postmenopausal osteoporosis. Intervention: A single infusion of ZOL 5 mg or placebo at baseline, 12, and 24 months. Main Outcome Measures: Primary endpoints: new vertebral fracture and hip fracture. Secondary endpoints: non-vertebral fracture and change in femoral neck bone mineral density (BMD). Baseline risk factor subgroups: age, BMD T-score and vertebral fracture status, total hip BMD, race, weight, geographical region, smoking, height loss, history of falls, physical activity, prior bisphosphonates, creatinine clearance, body mass index (BMI), and concomitant osteoporosis medications. Results: ZOL-induced effects on vertebral fracture risk were greater with younger age (treatment-by-subgroup interaction P=0.05), normal creatinine clearance (P=0.04), and BMI >/= 25 kg/m(2) (P=0.02). There were no significant treatment-factor interactions for hip or non-vertebral fracture or for change in BMD. Conclusions: ZOL appeared more effective in preventing vertebral fracture in younger women, overweight/obese women, and women with normal renal function. ZOL had similar effects irrespective of fracture risk factors or femoral neck BMD.
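
The subgroup findings hinge on treatment-by-subgroup interaction terms. A minimal sketch of such a test in a logistic model for new vertebral fracture follows, with simulated data and an illustrative BMI subgroup; it only shows how an interaction term is specified and tested, not the trial's actual analysis model.

```python
# Sketch: testing a treatment-by-subgroup interaction (treatment x BMI category)
# in a logistic model for new vertebral fracture. Simulated, illustrative data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 7765
df = pd.DataFrame({
    "zol":       rng.integers(0, 2, n),          # ZOL 5 mg vs. placebo
    "bmi_ge_25": rng.integers(0, 2, n),          # BMI >= 25 kg/m^2
})
logit_p = -2.0 - 0.6 * df["zol"] - 0.4 * df["zol"] * df["bmi_ge_25"]
df["vert_fracture"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("vert_fracture ~ zol * bmi_ge_25", data=df).fit()
print(fit.pvalues["zol:bmi_ge_25"])   # the treatment-by-subgroup interaction P value
```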

Relevance: 90.00%

Abstract:

Early-onset neonatal sepsis due to Group B streptococci (GBS) is responsible for severe morbidity and mortality in newborns. While different preventive strategies to identify women at risk are recommended, the optimal strategy depends on the incidence of GBS sepsis and on the prevalence of anogenital GBS colonization. We therefore aimed to assess the prevalence of GBS colonization and its consequences for different prevention strategies. We analyzed 1316 pregnant women between March 2005 and September 2006 at our institution. The prevalence of GBS colonization was determined by selective cultures of anogenital smears. The presence of risk factors was analyzed. In addition, the direct costs of screening and intrapartum antibiotic prophylaxis were estimated for different preventive strategies. The prevalence of GBS colonization was 21%. Any maternal intrapartum risk factor was present in 37%. The direct costs of the different prevention strategies were estimated as follows: risk-based, 18,500 CHF/1000 live births; screening-based, 50,110 CHF/1000 live births; combined screening- and risk-based, 43,495 CHF/1000 live births. Strategies to prevent GBS sepsis in newborns are necessary. Given our colonization prevalence of 21% and the intrapartum risk profile of the women, the screening-based approach seems superior to a risk-based approach.
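
The strategy costs scale with the colonization and risk-factor prevalences and with the unit costs of culturing and prophylaxis. The sketch below shows that structure using hypothetical unit costs; only the 21% and 37% prevalences come from the abstract, the resulting figures are not the study's estimates, and the combined strategy is omitted because its exact decision rule is not given here.

```python
# Sketch of how per-1000-births strategy costs are built from prevalences and unit costs.
# The unit costs are hypothetical placeholders, not the study's costing.
culture_cost     = 30.0   # hypothetical CHF per anogenital screening culture
prophylaxis_cost = 50.0   # hypothetical CHF per course of intrapartum antibiotics
p_colonized, p_risk_factor = 0.21, 0.37   # prevalences reported in the abstract

risk_based      = 1000 * p_risk_factor * prophylaxis_cost
screening_based = 1000 * (culture_cost + p_colonized * prophylaxis_cost)
# note: with these placeholder costs the risk-based figure happens to equal the reported
# 18,500 CHF; that follows from the chosen unit cost and is not a reproduction of the study.
print(f"risk-based: {risk_based:,.0f} CHF; screening-based: {screening_based:,.0f} CHF per 1000 live births")
```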

Relevance: 90.00%

Abstract:

Hypertension is a known risk factor for cardiovascular disease. Hypertensive individuals show exaggerated norepinephrine (NE) reactivity to stress. Norepinephrine is a known lipolytic factor. It is unclear whether, in hypertensive individuals, stress-induced increases in NE are linked with elevations in stress-induced circulating lipid levels. Such a mechanism could have implications for atherosclerotic plaque formation. In a cross-sectional, quasi-experimentally controlled study, 22 hypertensive and 23 normotensive men (mean +/- SEM age, 45 +/- 3 years) underwent an acute standardized psychosocial stress task combining public speaking and mental arithmetic in front of an audience. We measured plasma NE and the plasma lipid profile (total cholesterol [TC], low-density-lipoprotein cholesterol [LDL-C], high-density-lipoprotein cholesterol [HDL-C], and triglycerides) immediately before and after stress and at 20 and 60 minutes of recovery. All lipid levels were corrected for stress hemoconcentration. Compared with normotensives, hypertensives had greater TC (P = .030) and LDL-C (P = .037) stress responses. Independent of each other, mean arterial pressure (MAP) at screening and the immediate increase in NE predicted the immediate stress change in TC (MAP: beta = .41, P = .003; NE: beta = .35, P = .010) and LDL-C (MAP: beta = .32, P = .024; NE: beta = .38, P = .008). Mean arterial pressure alone predicted the triglyceride stress change (beta = .32, P = .043) independent of NE stress change, age, and BMI. The MAP-by-NE interaction independently predicted the immediate stress change of HDL-C (beta = -.58, P < .001) and of LDL-C (beta = -.25, P < .08). We conclude that MAP and NE stress reactivity may elicit proatherogenic changes in plasma lipids in response to acute psychosocial stress, providing one mechanism by which stress might increase cardiovascular risk in hypertension.
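
The independent and interactive effects reported correspond to regressing each stress-induced lipid change on MAP, the NE change, and their product, adjusted for age and BMI. A minimal sketch with simulated data follows; the study reports standardized betas, which would additionally require standardizing the variables before fitting.

```python
# Sketch: regressing the stress-induced change in a lipid on MAP, the NE change,
# and their interaction, adjusting for age and BMI. Simulated, illustrative data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 45
df = pd.DataFrame({
    "map_screen": rng.normal(100, 12, n),    # mean arterial pressure at screening
    "delta_ne":   rng.normal(1.5, 0.6, n),   # stress-induced NE increase
    "age":        rng.normal(45, 10, n),
    "bmi":        rng.normal(26, 3, n),
})
df["delta_ldl"] = (0.002 * df["map_screen"] + 0.05 * df["delta_ne"]
                   - 0.001 * df["map_screen"] * df["delta_ne"]
                   + rng.normal(0, 0.1, n))

fit = smf.ols("delta_ldl ~ map_screen * delta_ne + age + bmi", data=df).fit()
print(fit.summary().tables[1])   # coefficients, including the map_screen:delta_ne interaction
```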

Relevance: 90.00%

Abstract:

The highly pathogenic avian influenza (HPAI) H5N1 virus that emerged in southern China in the mid-1990s has in recent years evolved into the first HPAI panzootic. In many countries where it was detected, the virus was successfully controlled, whereas other countries face periodic recurrence despite significant control efforts. A central question is to understand the factors favoring the continuing recurrence of the virus. The abundance of domestic ducks, in particular free-grazing ducks feeding in intensive rice cropping areas, has been identified as one such risk factor based on separate studies carried out in Thailand and Vietnam. In addition, extensive progress has recently been made in the spatial prediction of rice cropping intensity obtained through satellite imagery processing. This article analyses the statistical association between recorded HPAI H5N1 virus presence and a set of five key environmental variables comprising elevation, human population, chicken numbers, duck numbers, and rice cropping intensity for three synchronous epidemic waves in Thailand and Vietnam. A consistent pattern emerges suggesting risk to be associated with duck abundance, human population, and rice cropping intensity, in contrast to a relatively low association with chicken numbers. A statistical risk model based on the second epidemic wave data in Thailand is found to maintain its predictive power when extrapolated to Vietnam, which supports its application to other countries with similar agro-ecological conditions such as Laos or Cambodia. The model’s potential application to mapping HPAI H5N1 disease risk in Indonesia is discussed.
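
The key validation step is fitting the risk model on one country's epidemic-wave data and testing its predictive power on the other's. A minimal sketch of that train-and-extrapolate check with simulated data is below; the logistic model, scikit-learn, and all variable scales are illustrative assumptions rather than the article's actual methodology.

```python
# Sketch: fit a presence/absence risk model on one country's data and check how
# well it extrapolates to another country. Simulated, illustrative data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

def simulate(n):
    X = np.column_stack([
        rng.normal(50, 30, n),      # elevation (arbitrary scale)
        rng.lognormal(4, 1, n),     # human population
        rng.lognormal(3, 1, n),     # chicken numbers
        rng.lognormal(2, 1, n),     # duck numbers
        rng.integers(0, 3, n),      # rice cropping intensity (crops per year)
    ])
    logit = -4 + 0.002 * X[:, 3] + 1.0 * X[:, 4] + 0.001 * X[:, 1]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    return X, y

X_train, y_train = simulate(2000)   # stands in for the Thailand second-wave data
X_test,  y_test  = simulate(2000)   # stands in for the Vietnam data

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("extrapolation AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```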