995 results for Confidence interval
Abstract:
BACKGROUND: The strength of the association between intensive care unit (ICU)-acquired nosocomial infections (NIs) and mortality might differ according to the methodological approach taken. OBJECTIVE: To assess the association between ICU-acquired NIs and mortality using the concept of population-attributable fraction (PAF) for patient deaths caused by ICU-acquired NIs in a large cohort of critically ill patients. SETTING: Eleven ICUs of a French university hospital. DESIGN: We analyzed surveillance data on ICU-acquired NIs collected prospectively during the period from 1995 through 2003. The primary outcome was mortality from ICU-acquired NI stratified by site of infection. A matched-pair, case-control study was performed. Each patient who died before ICU discharge was defined as a case patient, and each patient who survived to ICU discharge was defined as a control patient. The PAF was calculated after adjustment for confounders by use of conditional logistic regression analysis. RESULTS: Among 8,068 ICU patients, a total of 1,725 deceased patients were successfully matched with 1,725 control patients. The adjusted PAF due to ICU-acquired NI for patients who died before ICU discharge was 14.6% (95% confidence interval [CI], 14.4%-14.8%). Stratified by the type of infection, the PAF was 6.1% (95% CI, 5.7%-6.5%) for pulmonary infection, 3.2% (95% CI, 2.8%-3.5%) for central venous catheter infection, 1.7% (95% CI, 0.9%-2.5%) for bloodstream infection, and 0.0% (95% CI, -0.4% to 0.4%) for urinary tract infection. CONCLUSIONS: ICU-acquired NI had an important effect on mortality. However, the statistical association between ICU-acquired NI and mortality tended to be less pronounced in findings based on the PAF than in study findings based on estimates of relative risk. Therefore, the choice of methods does matter when the burden of NI needs to be assessed.
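The abstract does not spell out the formula, but a common way to obtain a PAF from a matched case-control analysis is Miettinen's case-based formula, PAF = p_c(OR − 1)/OR, where p_c is the exposure prevalence among cases and OR is the adjusted odds ratio. The sketch below only illustrates that arithmetic; the inputs are invented, not the study's values.

```python
# Hedged sketch of Miettinen's case-based attributable fraction.
# p_cases_exposed: fraction of deceased patients with an ICU-acquired NI (assumed value)
# adjusted_or: confounder-adjusted odds ratio from conditional logistic regression (assumed value)

def paf_miettinen(p_cases_exposed: float, adjusted_or: float) -> float:
    """PAF = p_c * (OR - 1) / OR."""
    return p_cases_exposed * (adjusted_or - 1.0) / adjusted_or

print(round(paf_miettinen(p_cases_exposed=0.30, adjusted_or=2.0), 3))  # 0.15 with these invented inputs
```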
Abstract:
BACKGROUND: Mental disorders in primary care patients are frequently associated with physical complaints that can mask the disorder. There is insufficient knowledge concerning the role of anxiety, depression, and somatoform disorders in patients presenting with physical symptoms. Our primary objective was to determine the prevalence of depression, anxiety, and somatoform disorders among primary care patients with a physical complaint. We also investigated the relationship between cumulated psychosocial stressors and mental disorders. METHODS: We conducted a multicentre cross-sectional study in twenty-one private practices and in one academic primary care centre in Western Switzerland. Randomly selected patients presenting with a spontaneous physical complaint were asked to complete the self-administered Patient Health Questionnaire (PHQ) between November 2004 and July 2005. The validated French version of the PHQ allowed the diagnosis of mental disorders (DSM-IV criteria) and the analysis of exposure to psychosocial stressors. RESULTS: A total of 917 patients exhibiting at least one physical symptom were included. The rate of depression, anxiety, and somatoform disorders was 20.0% (95% confidence interval [CI] = 17.4% to 22.7%), 15.5% (95% CI = 13.2% to 18.0%), and 15.1% (95% CI = 12.8% to 17.5%), respectively. Psychosocial stressors were significantly associated with mental disorders. Patients with an accumulation of psychosocial stressors were more likely to present anxiety, depression, or somatoform disorders, with a 2.2-fold increase in odds (95% CI = 2.0 to 2.5) for each additional stressor. CONCLUSIONS: The investigation of mental disorders and psychosocial stressors among patients with physical complaints is relevant in primary care. Psychosocial stressors should be explored as potential epidemiological causes of mental disorders.
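The 2.2-fold figure per additional stressor reads like the exponentiated coefficient of a logistic model with the stressor count entered linearly; on that (assumed) reading, the odds scale multiplicatively with each extra stressor, as this small sketch shows.

```python
import math

# Assumed interpretation: exp(beta) = 2.2 is the odds ratio for one additional stressor,
# so k additional stressors multiply the odds of a mental disorder by 2.2 ** k.
or_per_stressor = 2.2
beta = math.log(or_per_stressor)
for k in range(4):
    print(k, round(math.exp(beta * k), 2))  # 1.0, 2.2, 4.84, 10.65
```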
Abstract:
OBJECTIVE: To assess whether lambda waves are elicited by watching television (TV) and their association with demographic and EEG features. METHODS: We retrospectively compared lambda wave occurrence in prolonged EEG monitoring recordings of outpatients who were allowed to watch TV and in standard EEGs recorded in TV-free rooms. All EEGs were interpreted by the same two electroencephalographers. RESULTS: Of 2,072 standard EEG reports, 36 (1.7%) mentioned lambda waves versus 46 (32.2%) of 143 prolonged EEG monitoring reports (P < 0.001). Multivariable comparison of prolonged EEG monitoring recordings and standard EEGs disclosed that recordings performed in rooms with a TV (odds ratio, 20.6; 95% confidence interval, 4.8-88.0) and normal EEGs (odds ratio, 3.03; 95% confidence interval, 1.5-6.25) were independently associated with lambda waves. In the prolonged EEG monitoring group, all recordings with lambda waves also had positive occipital sharp transients of sleep. CONCLUSIONS: Watching TV likely represents a powerful and previously unrecognized stimulus for lambda waves. Furthermore, this study confirms the benign nature of this EEG variant and its strong association with positive occipital sharp transients of sleep.
Abstract:
OBJECTIVES: This study sought to assess outcomes in patients with ST-segment elevation myocardial infarction undergoing primary percutaneous coronary intervention (PCI) for unprotected left main (LM) disease. BACKGROUND: Limited data are available on outcomes in patients with ST-segment elevation myocardial infarction undergoing LM PCI. METHODS: Of 9,075 patients with ST-segment elevation myocardial infarction enrolled in the AMIS (Acute Myocardial Infarction in Switzerland) Plus registry between 2005 and June 30, 2010, 6,666 underwent primary PCI. Of them, 348 (5.2%; mean age: 63.5 ± 12.6 years) underwent LM PCI, either isolated (n = 208) or concomitant to PCI for other vessel segments (n = 140). They were compared with 6,318 patients (94.8%; mean age: 61.9 ± 12.5 years) undergoing PCI of non-LM vessel segments only. RESULTS: The LM patients had higher rates of cardiogenic shock (12.2% vs. 3.5%; p < 0.001), cardiac arrest (10.6% vs. 6.3%; p < 0.01), in-hospital mortality (10.9% vs. 3.8%; p < 0.001), and major adverse cardiac and cerebrovascular events (12.4% vs. 5.0%; p < 0.001) than non-LM PCI. Rates of mortality and major adverse cardiac and cerebrovascular events were highest for concurrent LM and non-LM PCI (17.9% and 18.6%, respectively), intermediate for isolated LM PCI (6.3% and 8.3%, respectively), and lowest for non-LM PCI (3.8% and 5.0%, respectively). Rates of mortality and major adverse cardiac and cerebrovascular events for LM PCI were higher than for non-LM multivessel PCI (10.9% vs. 4.9%, p < 0.001, and 12.4% vs. 6.4%, p < 0.001, respectively). LM disease independently predicted in-hospital death (odds ratio: 2.36; 95% confidence interval: 1.34 to 4.17; p = 0.003). CONCLUSIONS: Emergent LM PCI in the context of acute myocardial infarction, even including 12% cardiogenic shock, appears to have a remarkably high (89%) in-hospital survival. Concurrent LM and non-LM PCI has worse outcomes than isolated LM PCI.
Abstract:
Objective: To compute the burden of cancer attributable to current and former alcohol consumption in eight European countries based on direct relative risk estimates from a cohort study. Design: Combination of prospective cohort study with representative population based data on alcohol exposure. Setting: Eight countries (France, Italy, Spain, United Kingdom, the Netherlands, Greece, Germany, Denmark) participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Participants: 109 118 men and 254 870 women, mainly aged 37-70. Main outcome measures: Hazard rate ratios expressing the relative risk of cancer incidence for former and current alcohol consumption among EPIC participants. Hazard rate ratios combined with representative information on alcohol consumption to calculate alcohol attributable fractions of causally related cancers by country and sex. Partial alcohol attributable fractions for consumption higher than the recommended upper limit (two drinks a day for men with about 24 g alcohol, one for women with about 12 g alcohol) and the estimated total annual number of cases of alcohol attributable cancer. Results: If we assume causality, among men and women, 10% (95% confidence interval 7 to 13%) and 3% (1 to 5%) of the incidence of total cancer was attributable to former and current alcohol consumption in the selected European countries. For selected cancers the figures were 44% (31 to 56%) and 25% (5 to 46%) for upper aerodigestive tract, 33% (11 to 54%) and 18% (−3 to 38%) for liver, 17% (10 to 25%) and 4% (−1 to 10%) for colorectal cancer for men and women, respectively, and 5.0% (2 to 8%) for female breast cancer. A substantial part of the alcohol attributable fraction in 2008 was associated with alcohol consumption higher than the recommended upper limit: 33 037 of 178 578 alcohol related cancer cases in men and 17 470 of 397 043 alcohol related cases in women. Conclusions: In western Europe, an important proportion of cases of cancer can be attributed to alcohol consumption, especially consumption higher than the recommended upper limits. These data support current political efforts to reduce or to abstain from alcohol consumption to reduce the incidence of cancer.
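Combining hazard rate ratios with population consumption data, as described, is usually done with Levin's attributable-fraction formula over exposure categories, AF = Σ p_i(RR_i − 1) / [1 + Σ p_i(RR_i − 1)]. The prevalences and relative risks below are invented purely to show the arithmetic, not the EPIC estimates.

```python
# Hedged sketch of Levin's attributable fraction over exposure categories.

def attributable_fraction(prevalence_rr_pairs):
    """AF = sum(p_i*(RR_i - 1)) / (1 + sum(p_i*(RR_i - 1)))."""
    excess = sum(p * (rr - 1.0) for p, rr in prevalence_rr_pairs)
    return excess / (1.0 + excess)

# Three hypothetical drinking categories as (population prevalence, relative risk):
print(round(attributable_fraction([(0.30, 1.1), (0.15, 1.4), (0.05, 2.0)]), 3))  # 0.123
```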
Abstract:
OBJECTIVE: To estimate the impact of a national primary care pay for performance scheme, the Quality and Outcomes Framework in England, on emergency hospital admissions for ambulatory care sensitive conditions (ACSCs). DESIGN: Controlled longitudinal study. SETTING: English National Health Service between 1998/99 and 2010/11. PARTICIPANTS: Populations registered with each of 6975 family practices in England. MAIN OUTCOME MEASURES: Year specific differences between trend adjusted emergency hospital admission rates for incentivised ACSCs before and after the introduction of the Quality and Outcomes Framework scheme and two comparators: non-incentivised ACSCs and non-ACSCs. RESULTS: Incentivised ACSC admissions showed a relative reduction of 2.7% (95% confidence interval 1.6% to 3.8%) in the first year of the Quality and Outcomes Framework compared with ACSCs that were not incentivised. This increased to a relative reduction of 8.0% (6.9% to 9.1%) in 2010/11. Compared with conditions that are not regarded as being influenced by the quality of ambulatory care (non-ACSCs), incentivised ACSCs also showed a relative reduction in rates of emergency admissions of 2.8% (2.0% to 3.6%) in the first year increasing to 10.9% (10.1% to 11.7%) by 2010/11. CONCLUSIONS: The introduction of a major national pay for performance scheme for primary care in England was associated with a decrease in emergency admissions for incentivised conditions compared with conditions that were not incentivised. Contemporaneous health service changes seem unlikely to have caused the sharp change in the trajectory of incentivised ACSC admissions immediately after the introduction of the Quality and Outcomes Framework. The decrease seems larger than would be expected from the changes in the process measures that were incentivised, suggesting that the pay for performance scheme may have had impacts on quality of care beyond the directly incentivised activities.
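A "relative reduction" of this kind is typically read off a difference-in-differences style term comparing the incentivised and comparator admission series on the log scale; the conversion from such a coefficient to a percentage is shown below with an invented value, since the abstract does not give the model output itself.

```python
import math

# Assumed form: if b is the trend-adjusted interaction coefficient on log admission
# rates (incentivised vs. comparator, post vs. pre), the relative change is exp(b) - 1.
b = -0.0274  # invented coefficient for illustration
print(f"{math.exp(b) - 1.0:+.1%}")  # roughly -2.7% with this invented value
```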
Abstract:
Objective: To examine the association between pre-diagnostic circulating vitamin D concentration, dietary intake of vitamin D and calcium, and the risk of colorectal cancer in European populations. Design: Nested case-control study. Setting: The study was conducted within the EPIC study, a cohort of more than 520 000 participants from 10 western European countries. Participants: 1248 cases of incident colorectal cancer, which developed after enrolment into the cohort, were matched to 1248 controls. Main outcome measures: Circulating vitamin D concentration (25-hydroxyvitamin D, 25-(OH)D) was measured by enzyme immunoassay. Dietary and lifestyle data were obtained from questionnaires. Incidence rate ratios and 95% confidence intervals for the risk of colorectal cancer by 25-(OH)D concentration and levels of dietary calcium and vitamin D intake were estimated from multivariate conditional logistic regression models, with adjustment for potential dietary and other confounders. Results: 25-(OH)D concentration showed a strong inverse linear dose-response association with risk of colorectal cancer (P for trend <0.001). Compared with a pre-defined mid-level concentration of 25-(OH)D (50.0-75.0 nmol/l), lower levels were associated with higher colorectal cancer risk (<25.0 nmol/l: incidence rate ratio 1.32 (95% confidence interval 0.87 to 2.01); 25.0-49.9 nmol/l: 1.28 (1.05 to 1.56)), and higher concentrations associated with lower risk (75.0-99.9 nmol/l: 0.88 (0.68 to 1.13); ≥100.0 nmol/l: 0.77 (0.56 to 1.06)). In analyses by quintile of 25-(OH)D concentration, patients in the highest quintile had a 40% lower risk of colorectal cancer than did those in the lowest quintile (P<0.001). Subgroup analyses showed a strong association for colon but not rectal cancer (P for heterogeneity=0.048). Greater dietary intake of calcium was associated with a lower colorectal cancer risk. Dietary vitamin D was not associated with disease risk. Findings did not vary by sex and were not altered by corrections for season or month of blood donation. Conclusions: The results of this large observational study indicate a strong inverse association between levels of pre-diagnostic 25-(OH)D concentration and risk of colorectal cancer in western European populations. Further randomised trials are needed to assess whether increases in circulating 25-(OH)D concentration can effectively decrease the risk of colorectal cancer.
Abstract:
Background. The study of the severity of occupational injuries is very important for the establishment of prevention plans. The aim of this paper is to analyze the distribution of occupational injuries by (a) individual factors, (b) workplace characteristics, and (c) working conditions, and to analyze the severity of occupational injuries by these characteristics in men and women in Andalusia. Methods. Injury data came from the accident registry of the Ministry of Labor and Social Issues for 2003. The dependent variable was the severity of the injury (slight, serious, very serious, or fatal); the independent variables were the characteristics of the worker, the company, and the accident itself. Bivariate and multivariate analyses were performed to estimate the probability of serious, very serious, and fatal injury in relation to the other variables, expressed as odds ratios (OR) with 95% confidence intervals (95% CI). Results. Of the records, 82.4% corresponded to men and 17.6% to women; 78.1% of the women were unskilled manual workers, compared with 44.9% of the men. Men belonging to social class I had a higher probability of more severe injuries (OR = 1.67, 95% CI = 1.17-2.38). Conclusions. The severity of the injury is associated with sex, age, and type of injury. In men it is also related to professional status, the place where the accident happened, performing an unusual job, the size and characteristics of the company, and social class; in women it is also related to the sector.
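For reference, the usual Wald interval for an odds ratio from a 2×2 table is exp(ln OR ± 1.96·SE) with SE = sqrt(1/a + 1/b + 1/c + 1/d). The cell counts below are invented for illustration; the study's ORs were additionally adjusted in multivariate models.

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Crude OR = (a*d)/(b*c) with a Wald 95% CI on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return round(or_, 2), round(lo, 2), round(hi, 2)

# Hypothetical counts: severe vs. slight injuries in class I vs. other classes.
print(odds_ratio_wald_ci(40, 60, 240, 600))  # (1.67, 1.09, 2.55) with these invented counts
```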
Abstract:
Untreated acute toxoplasmosis among pregnant women can lead to serious sequelae among newborns, including neurological impairment and blindness. In Brazil, the risk of congenital toxoplasmosis (CTox) has not been fully evaluated. Our aim was to evaluate trends in acute toxoplasmosis prevalence from 1998-2005, the incidence of CTox and the rate of mother-to-child transmission (MTCT). A cross-sectional study was undertaken to identify patients who fit the criteria for acute toxoplasmosis during pregnancy. Exposed newborns were included in a historical cohort, with a median follow-up time of 11 months, to establish a definite diagnosis of CTox. Diagnoses for acute infection in pregnancy and CTox were based on European Research Network on Congenital Toxoplasmosis criteria. In 41,112 pregnant women, the prevalence of acute toxoplasmosis was 4.8/1,000 women. The birth prevalence of CTox was 0.6/1,000 newborns [95% confidence interval (CI): 0.4-0.9]. During the follow-up study, 12 additional cases were detected, increasing the CTox rate to 0.9/1,000 newborns (95% CI: 0.6-1.3). Among the 200 newborns exposed to Toxoplasma gondii, there were 37 babies meeting the diagnostic criteria for CTox, leading to an MTCT rate of 18.5% (95% CI: 13.4-24.6%). The additional cases identified during follow-up reinforce the need for serological monitoring during the first year of life, even in the absence of evidence of congenital infection at birth.
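The 18.5% transmission rate corresponds to 37 of 200 exposed newborns; an exact (Clopper-Pearson) binomial interval, computed below from the beta distribution, gives bounds close to the quoted 13.4-24.6%, although the paper does not say which interval method was actually used.

```python
from scipy.stats import beta

x, n = 37, 200                            # CTox cases among exposed newborns
lower = beta.ppf(0.025, x, n - x + 1)     # Clopper-Pearson lower bound
upper = beta.ppf(0.975, x + 1, n - x)     # Clopper-Pearson upper bound
print(f"{x / n:.1%} (95% CI {lower:.1%} to {upper:.1%})")
```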
Abstract:
BACKGROUND: Decreasing exposure to airborne particulates was previously associated with reduced age-related decline in lung function. However, whether the benefit from improved air quality depends on genetic background is not known. Recent evidence points to the involvement of the genes p53 and p21 and of the cell cycle control gene cyclin D1 (CCND1) in the response of bronchial cells to air pollution. OBJECTIVE: We determined in 4,326 participants of the Swiss Cohort Study on Air Pollution and Lung and Heart Diseases in Adults (SAPALDIA) whether four single-nucleotide polymorphisms in three genes [CCND1 (rs9344 [P242P], rs667515), p53 (rs1042522 [R72P]), and p21 (rs1801270 [S31R])] modified the previously observed attenuation of the decline in the forced expiratory flow between 25% and 75% of the forced vital capacity (FEF25-75) associated with improved air quality. METHODS: Subjects of the prospective population-based SAPALDIA cohort were assessed in 1991 and 2002 by spirometry, questionnaires, and biological sample collection for genotyping. We assigned spatially resolved concentrations of particulate matter with aerodynamic diameter ≤10 μm (PM10) to each participant's residential history 12 months before the baseline and follow-up assessments. RESULTS: The effect of diminishing PM10 exposure on FEF25-75 decline appeared to be modified by p53 R72P, CCND1 P242P, and CCND1 rs667515. For example, a 10-μg/m³ decline in average PM10 exposure over an 11-year period attenuated the average annual decline in FEF25-75 by 21.33 mL/year (95% confidence interval, 10.57-32.08) among participants homozygous for the CCND1 (P242P) GG genotype, by 13.72 mL/year (5.38-22.06) among GA genotypes, and by 6.00 mL/year (-4.54 to 16.54) among AA genotypes. CONCLUSIONS: Our results suggest that cell cycle control genes may modify the degree to which improved air quality may benefit respiratory function in adults.
Abstract:
Despite medical advances, mortality in infective endocarditis (IE) is still very high. Previous studies on prognosis in IE have reported conflicting results. The aim of this study was to identify predictors of in-hospital mortality in a large multicenter cohort of left-sided IE. Methods. An observational multicenter study was conducted from January 1984 to December 2006 in seven hospitals in Andalusia, Spain. Seven hundred and five left-sided IE patients were included. The main outcome measure was in-hospital mortality. Several prognostic factors were analysed by univariate tests and then by a multivariable logistic regression model. Results. The overall mortality was 29.5% (25.5% from 1984 to 1995 and 31.9% from 1996 to 2006; odds ratio 1.25; 95% confidence interval: 0.97-1.60; p = 0.07). In univariate analysis, age, comorbidity (especially chronic liver disease), prosthetic valve, virulent microorganisms such as Staphylococcus aureus, Streptococcus agalactiae and fungi, and complications (septic shock, severe heart failure, renal insufficiency, neurologic manifestations and perivalvular extension) were related to higher mortality. Independent factors for mortality in multivariate analysis were: Charlson comorbidity score (OR: 1.2; 95% CI: 1.1-1.3), prosthetic endocarditis (OR: 1.9; CI: 1.2-3.1), Staphylococcus aureus etiology (OR: 2.1; CI: 1.3-3.5), severe heart failure (OR: 5.4; CI: 3.3-8.8), neurologic manifestations (OR: 1.9; CI: 1.2-2.9), septic shock (OR: 4.2; CI: 2.3-7.7), perivalvular extension (OR: 2.4; CI: 1.3-4.5) and acute renal failure (OR: 1.69; CI: 1.0-2.6). Conversely, Streptococcus viridans group etiology (OR: 0.4; CI: 0.2-0.7) and surgical treatment (OR: 0.5; CI: 0.3-0.8) were protective factors. Conclusions. Several characteristics of left-sided endocarditis enable selection of a patient group at higher risk of mortality. This group may benefit from more specialised attention in referral centers, and these factors should help to identify those patients who might benefit from more aggressive diagnostic and/or therapeutic procedures.
Abstract:
Objective: To determine the values of, and study the relationships among, central corneal thickness (CCT), intraocular pressure (IOP), and degree of myopia (DM) in an adult myopic population aged 20 to 40 years in Almeria (southeast Spain). To our knowledge this is the first study of this kind in this region. Methods: An observational, descriptive, cross-sectional study was done in which a sample of 310 myopic patients (620 eyes) aged 20 to 40 years was selected by gender- and age-stratified sampling, with allocation proportional to the size of the population strata, assuming a 20% prevalence of myopia, a 5% precision (epsilon), and a 95% confidence level. We studied IOP, CCT, and DM and their relationships by calculating the mean, standard deviation, 95% confidence interval for the mean, median, Fisher's asymmetry coefficient, and range (maximum, minimum), and by applying the Brown-Forsythe robust test for each variable (IOP, CCT, and DM). Results: In the adult myopic population of Almeria aged 20 to 40 years (mean age, 29.8 years), the mean overall CCT was 550.12 μm. The corneas of men were thicker than those of women (P = 0.014). CCT was stable, as no significant differences were seen in CCT values across the 20- to 40-year-old subjects. The mean overall IOP was 13.60 mmHg. Men had a higher IOP than women (P = 0.002). Subjects over 30 years had a higher IOP (13.83 mmHg) than those under 30 (13.38 mmHg) (P = 0.04). The mean overall DM was −4.18 diopters. Men had less myopia than women (P < 0.001). Myopia was stable in the 20- to 40-year-old study population (P = 0.089). A linear relationship was found between CCT and IOP (R2 = 0.152, P ≤ 0.001); CCT accounted for 15.2% of the variability in IOP. However, no linear relationship between DM and IOP, or between CCT and DM, was found. Conclusions: CCT was found to be similar to that reported in other studies in different populations. IOP tends to increase after the age of 30 and is not accounted for by alterations in CCT values.
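The sampling description (20% assumed prevalence, 5% precision, 95% confidence) matches the standard sample-size formula for estimating a proportion, n = z²·p·(1 − p)/e², which gives about 246; how the authors arrived at the 310 patients actually sampled is not stated, so this is only a plausible reconstruction.

```python
import math

def sample_size_for_proportion(p, e, z=1.96):
    """Minimum n to estimate a proportion p with absolute precision e at ~95% confidence."""
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

print(sample_size_for_proportion(p=0.20, e=0.05))  # 246
```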
Abstract:
Goal: To learn more about the social support available to patients participating in a prison methadone maintenance program (PMM). Methodology: Descriptive, with controls. Setting: A penitentiary in Albolote (Granada). Population sample: The total prison population was 1,579; 364 patients were included in the PMM, 35 female and 329 male. Sixty patients, 7 women and 53 men, were used as cases. Thirty non-drug-dependent prisoners, 3 women and 27 men, with no history of drug addiction problems, formed the control group. Interventions: Cases and controls were interviewed about their addiction history, family structure, and socio-economic level, and an interviewer-administered MOS social support questionnaire was completed. The percentages for each social support variable were obtained and compared using the chi-squared test. Results: Overall support received was low in 38 cases (74.5%) and in 9 controls (30%): p = 0.0001; OR 0.1466, 95% confidence interval 0.0538-0.3989. Support received was normal in 13 cases (25%) and 21 controls (70%): p = 0.0007; OR 0.69, 95% confidence interval 0.44-0.93. All of the variables differed significantly in favour of the non-drug-dependent group, except for emotional support, which was the same in both groups. Conclusion: The perception of inmates participating in the methadone maintenance program was that they received less social support than the non-drug-dependent inmates.
Abstract:
OBJECTIVE: To evaluate the long-term impact of successive interventions on rates of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection and MRSA bacteremia in an endemic hospital-wide situation. DESIGN: Quasi-experimental, interrupted time-series analysis. The impact of the interventions was analyzed by use of segmented regression. Representative MRSA isolates were typed by use of pulsed-field gel electrophoresis. SETTING: A 950-bed teaching hospital in Seville, Spain. PATIENTS: All patients admitted to the hospital during the period from 1995 through 2008. METHODS: Three successive interventions were studied: (1) contact precautions, with no active surveillance for MRSA; (2) targeted active surveillance for MRSA in patients and healthcare workers in specific wards, prioritized according to clinical epidemiology data; and (3) targeted active surveillance for MRSA in patients admitted from other medical centers. RESULTS: Neither the preintervention rate of MRSA colonization or infection (0.56 cases per 1,000 patient-days [95% confidence interval {CI}, 0.49-0.62 cases per 1,000 patient-days]) nor the slope for the rate of MRSA colonization or infection changed significantly after the first intervention. The rate decreased significantly to 0.28 cases per 1,000 patient-days (95% CI, 0.17-0.40 cases per 1,000 patient-days) after the second intervention and to 0.07 cases per 1,000 patient-days (95% CI, 0.06-0.08 cases per 1,000 patient-days) after the third intervention, and the rate remained at a similar level for 8 years. The MRSA bacteremia rate decreased by 80%, whereas the rate of bacteremia due to methicillin-susceptible S. aureus did not change. Eighty-three percent of the MRSA isolates identified were clonally related. All MRSA isolates obtained from healthcare workers were clonally related to those recovered from patients who were in their care. CONCLUSION: Our data indicate that long-term control of endemic MRSA is feasible in tertiary care centers. The use of targeted active surveillance for MRSA in patients and healthcare workers in specific wards (identified by means of analysis of clinical epidemiology data) and the use of decolonization were key to the success of the program.
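Segmented regression of an interrupted time series, as named in the design, is usually specified as a baseline level and trend plus a level change and trend change at each intervention. The sketch below fits a single change point to simulated monthly rates with statsmodels; the variable names, the single change point, and the OLS-on-rates specification are illustrative assumptions, not the authors' model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(120)                               # simulated 10-year monthly series
step = (months >= 60).astype(float)                   # level change at the intervention
time_after = np.where(months >= 60, months - 60, 0)   # trend change after the intervention

# Simulated MRSA rate per 1,000 patient-days with a drop at month 60:
rate = 0.56 - 0.001 * months - 0.20 * step - 0.001 * time_after
rate = rate + rng.normal(0.0, 0.02, months.size)

X = sm.add_constant(np.column_stack([months, step, time_after]))
fit = sm.OLS(rate, X).fit()
print(fit.params)  # [baseline level, baseline trend, level change, trend change]
```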
Abstract:
The variation with latitude of incidence and mortality for cutaneous malignant melanoma (CMM) in the non-Maori population of New Zealand was assessed. For those aged 20 to 74 years, the effects of age, time period, birth cohort, gender, and region (latitude), and some interactions between them, were evaluated by log-linear regression methods. Age-standardized incidence and mortality rates increased with increasing proximity to the equator for both men and women. These latitude gradients were greater for males than for females. The relative risk of melanoma in the most southern part of New Zealand (latitude 44 degrees S) compared with the most northern region (latitude 36 degrees S) was 0.63 (95 percent confidence interval [CI] = 0.60-0.67) for incidence and 0.76 (CI = 0.68-0.86) for mortality, both genders combined. The mean percentage change in CMM rates per degree of latitude for males was greater than that reported in other published studies. Differences between men and women in melanoma risk with latitude suggest that regional sun-behavior patterns or other risk factors may contribute to the latitude gradient observed.