981 results for Occupational Mortality
Abstract:
In occupational accidents involving health professionals handling potentially contaminated material, the decision to start or continue prophylactic medication against infection by the Human Immunodeficiency Virus (HIV) has been based on the ELISA test applied to a blood sample from the source patient. In order to rationalize the prophylactic use of antiretroviral agents, a rapid serologic test for HIV infection based on the enzyme immunoadsorption method (SUDS HIV 1+2, MUREX®) was evaluated and compared with conventional ELISA (Abbott HIV-1/HIV-2 3rd Generation plus EIA®). A total of 592 occupational accidents were recorded at the University Hospital of Ribeirão Preto from July 1998 to April 1999. Of these, 109 were evaluated simultaneously by the rapid test and by HIV ELISA. The rapid test was positive in three cases, all confirmed by ELISA; in one case the result was inconclusive and later found to be negative by ELISA. In the 106 accidents in which the rapid test was negative, no prophylactic medication was instituted, with an estimated cost reduction of US$ 2,889.35. In addition to this advantage, the good correlation of the rapid test with ELISA, the shorter duration of stress, and the absence of exposure of the health worker to the adverse effects of antiretroviral agents support the adoption of this test in Programs of Attention to Accidents with Potentially Contaminated Material.
Abstract:
The use of appropriate acceptance criteria in the risk assessment process for occupational accidents is an important issue that is often overlooked in the literature, particularly when new risk assessment methods are proposed and discussed. In most cases, there is no information on how or by whom they were defined, or even how companies can adapt them to their own circumstances. Bearing this in mind, this study analysed the problem of defining risk acceptance criteria for occupational settings, defining quantitative acceptance criteria for the specific case study of the Portuguese furniture industrial sector. The key steps to be considered in formulating acceptance criteria were analysed in the literature review. By applying the identified steps, the acceptance criteria for the furniture industrial sector were then defined. The Cumulative Distribution Function (CDF) for the injury statistics of the industrial sector was identified as the maximum tolerable risk level. The acceptable threshold was defined by adjusting the CDF to the Occupational Safety & Health (OSH) practitioners' risk acceptance judgement. Adjustments of acceptance criteria to companies' safety cultures were exemplified by adjusting the Burr distribution parameters. An example of a risk matrix was also used to demonstrate the integration of the defined acceptance criteria into a risk metric. This work has provided substantial contributions to the issue of acceptance criteria for occupational accidents, which may be useful in overcoming the practical difficulties faced by authorities, companies and experts.
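The idea of taking a fitted Burr distribution's CDF as the tolerable-risk boundary, and shifting its parameters to reflect a company's safety culture, can be illustrated with a short sketch. This is not the study's implementation: the three-parameter Burr XII form, the shape/scale values, and the 95th-percentile cut-off below are all illustrative assumptions.

```python
import math

def burr12_cdf(x, c, k, scale=1.0):
    """CDF of the Burr XII distribution: F(x) = 1 - (1 + (x/scale)^c)^(-k)."""
    if x <= 0:
        return 0.0
    return 1.0 - (1.0 + (x / scale) ** c) ** (-k)

def burr12_ppf(p, c, k, scale=1.0, lo=0.0, hi=1e6, tol=1e-9):
    """Inverse CDF by bisection: the severity value at cumulative probability p."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if burr12_cdf(mid, c, k, scale) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical parameters standing in for a fit to a sector's injury statistics.
c, k, scale = 2.0, 1.0, 1.0

# Maximum tolerable risk level: the severity at a chosen percentile of the CDF.
threshold = burr12_ppf(0.95, c, k, scale)
```

Tightening the acceptance criteria for a company with a stronger safety culture would then correspond to adjusting `c`, `k`, or `scale` so that the same percentile maps to a lower severity threshold.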
Abstract:
Leptospirosis, brucellosis and toxoplasmosis are widely distributed zoonoses, with humans as accidental participants in their epidemiological chains. The aim of this paper was to carry out a seroepidemiological survey and identify occupational and environmental variables related to these illnesses in 150 workers of a slaughterhouse in the northern region of Paraná. For the diagnosis of leptospirosis, the microscopic agglutination test was applied; for brucellosis, the buffered acidified antigen test and the 2-mercaptoethanol test were used; and for toxoplasmosis, the indirect immunofluorescence reaction test. For each employee, an epidemiological questionnaire was completed, investigating occupational and environmental variables that could be associated with these infections. Positive results were found in 4.00% of the samples for leptospirosis, 0.66% for brucellosis and 70.00% for toxoplasmosis. Of the three diseases investigated, only the results for leptospirosis suggest occupational infection.
Abstract:
Burn mortality statistics may be misleading unless they properly account for the many factors that can influence outcome. Such estimates are useful for patients and others making medical and financial decisions concerning their care. This study aimed to define the clinical, microbiological and laboratory predictors of mortality with a view to improving burn care. Data were collected on independent variables, which were analyzed sequentially and cumulatively, employing univariate statistics and a pooled, cross-sectional, multivariate logistic regression to establish which variables best predict the probability of mortality. Survivors and non-survivors among burn patients were compared to define the predictive factors of mortality. The mortality rate was 5.0%. Older age, larger burn area, presence of fungi in the wound, shorter length of stay and the presence of multi-resistant bacteria in the wound significantly predicted increased mortality. The authors conclude that the patients most likely to die are those aged > 50 years, those with limited skin donor sites and those with multi-resistant bacteria and fungi in the wound.
Abstract:
A case-control study, involving patients with positive blood cultures for extended-spectrum-β-lactamase (ESBL)-producing Klebsiella pneumoniae (KP) or Escherichia coli (EC) and controls with positive blood cultures for non-ESBL KP or EC, was performed to assess risk factors for ESBL production in nosocomial bloodstream infections (BSIs). Mortality among patients with BSIs was also assessed. The study included 145 patients (81, 55.9% with K. pneumoniae and 64, 44.1% with E. coli BSI); 51 (35.2%) isolates were ESBL producers and 94 (64.8%) non-producers. Forty-five (55.6%) K. pneumoniae isolates were ESBL producers, while only six (9.4%) E. coli isolates produced the enzyme. Multivariate analysis showed that recent exposure to piperacillin-tazobactam (adjusted odds ratio [aOR] 6.2; 95%CI 1.1-34.7) was a risk factor for ESBL BSI. K. pneumoniae was significantly more likely to be an ESBL-producing isolate than E. coli (aOR 6.7; 95%CI 2.3-20.2). No cephalosporin class was independently associated with ESBL BSI; however, in a secondary model considering all oxyimino-cephalosporins as a single variable, a significant association was demonstrated (aOR 3.7; 95%CI 1.3-10.8). Overall 60-day mortality was significantly higher among patients infected with ESBL-producing organisms. The finding that piperacillin-tazobactam use is a risk factor for ESBL production in KP or EC BSIs requires attention, since this drug is often recommended to limit the use of third-generation cephalosporins.
Abstract:
In the present study, the frequencies of immunity against hepatitis B (HB) and of potentially contaminating accidents among medical students of a Brazilian public university were evaluated. Of the 400 students who should have been immunized, 303 (75.7%), 66.3% of whom were women, answered an anonymous, self-administered questionnaire. Serum anti-HBs titers were determined in 205 of them, and titers > 10 IU/L were considered protective. A total of 86.8% of the students had received three doses of HB vaccine. The frequency of immunity among women (96.4%) was higher (p = 0.04) than among men (87.7%). Among those without immunity, 12/13 (92.3%) had been vaccinated before entering medical school. Only 11% of the students with complete vaccination had previously verified their serological response to the vaccine. A total of 23.6% reported having somehow been exposed to blood or secretions. Among final-year students, this frequency was 45.0%, and it was similar among men (47.8%) and women (43.2%). Of all these accidents, 57.7% were due to body fluids coming into contact with mucosa and 42.3% to cut and puncture injuries. The results of this study show that: 1) the frequency of immunity against HB is high among the evaluated medical students, although verification of the response to vaccination is not a concern for them; 2) anti-HBs titers should be verified after complete vaccination and on a regular basis, especially by men; and 3) the frequency of potentially contaminating accidents is high.
Abstract:
OBJECTIVES: Mortality after ICU discharge accounts for approximately 20-30% of deaths. We examined whether post-ICU discharge mortality is associated with the presence and severity of organ dysfunction/failure just before ICU discharge. PATIENTS AND METHODS: The study used the database of the EURICUS-II study, with a total of 4,621 patients, including 2,958 discharged alive to the general wards (post-ICU mortality 8.6%). Over a 4-month period we collected clinical and demographic characteristics, including the Simplified Acute Physiology Score (SAPS II), Nine Equivalents of Nursing Manpower Use Score, and Sequential Organ Failure Assessment (SOFA) score. RESULTS: Those who died in the hospital after ICU discharge had a higher SAPS II score, were more frequently nonoperative, were more often admitted from the ward, and had stayed longer in the ICU. Their degree of organ dysfunction/failure was higher (admission, maximum, and delta SOFA scores), and they required more nursing workload resources while in the ICU. Both the amount of organ dysfunction/failure (especially cardiovascular, neurological, renal, and respiratory) and the amount of nursing workload required on the day before discharge were higher. Residual CNS and renal dysfunction/failure were especially important prognostic factors at ICU discharge. Multivariate analysis showed only predischarge organ dysfunction/failure to be important; thus, the increased use of nursing workload resources before discharge probably reflects only the underlying organ dysfunction/failure. CONCLUSIONS: It is better to delay the discharge of a patient with organ dysfunction/failure from the ICU unless adequate monitoring and therapeutic resources are available in the ward.
Abstract:
OBJECTIVE: To empirically test, based on a large multicenter, multinational database, whether a modified PIRO (predisposition, insult, response, and organ dysfunction) concept could be applied to predict mortality in patients with infection and sepsis. DESIGN: Substudy of a multicenter multinational cohort study (SAPS 3). PATIENTS: A total of 2,628 patients with signs of infection or sepsis who stayed in the ICU for >48 h. Three boxes of variables were defined, according to the PIRO concept. Box 1 (Predisposition) contained information about the patient's condition before ICU admission. Box 2 (Injury) contained information about the infection at ICU admission. Box 3 (Response) was defined as the response to the infection, expressed as a Sequential Organ Failure Assessment score after 48 h. INTERVENTIONS: None. MAIN MEASUREMENTS AND RESULTS: Most of the infections were community acquired (59.6%); 32.5% were hospital acquired. The median age of the patients was 65 (50-75) years, and 41.1% were female. About 22% (n=576) of the patients presented with infection only, 36.3% (n=953) with signs of sepsis, 23.6% (n=619) with severe sepsis, and 18.3% (n=480) with septic shock. Hospital mortality was 40.6% overall, greater in those with septic shock (52.5%) than in those with infection (34.7%). Several factors related to predisposition, infection and response were associated with hospital mortality. CONCLUSION: The proposed three-level system, by using objectively defined criteria for risk of mortality in sepsis, could be used by physicians to stratify patients at ICU admission or shortly thereafter, contributing to a better selection of management according to the risk of death.
Abstract:
A cross-sectional study was conducted to assess the frequencies and characteristics of occupational exposures among medical and nursing students at a Brazilian public university, as well as their prevention and post-exposure behavior. During the second semester of 2010, a self-administered semi-structured questionnaire was completed by 253/320 (79.1%) medical students of the clinical course and 149/200 (74.5%) nursing students already performing practical activities. Among medical students, 53 (20.9%) suffered 73 injuries, which occurred mainly while performing extra-curricular activities (32.9%), with cutting and piercing objects (56.2%), in the emergency room (39.7%), and as a result of lack of technical preparation or distraction (54.8%). Among nursing students, 27 (18.1%) suffered 37 injuries, which occurred mainly with hollow needles (67.6%), in the operating room or wards (72.2%), and as a result of lack of technical preparation or distraction (62.1%). Among medical and nursing students, respectively, 96.4% and 48% were dissatisfied with the instructions on exposure prevention they had previously received; 48% and 18% did not always use personal protective equipment; 67.6% and 16.8% recapped used needles; 49.3% and 35.1% did not bother to find out the source patient's serological results post-exposure; and 1.4% and 18.9% officially reported their injuries. In conclusion, this study found high frequencies of exposures among the assessed students, inadequate prevention and post-exposure practices, and, consequently, the need for training in “standard precautions” to prevent such exposures.
Abstract:
Introduction: Measuring the burden of disease involves aggregating morbidity and mortality components into a single indicator, the disability-adjusted life year (DALY), to measure how much, and how, people live with and suffer the impact of a disease. Objective: To estimate the burden of disease due to AIDS in a municipality of southern Brazil. Methods: An ecological study was conducted in 2009 to examine AIDS incidence and AIDS-related deaths among the population residing in the city of Tubarao, Santa Catarina State, Brazil. Data from the Mortality Information System of the National Health System were used to calculate the years of life lost (YLL) due to premature mortality. The calculation was based on the difference between a standardized life expectancy and age at death, with a discount rate of 3% per year. Data from the Information System for Notifiable Diseases were used to calculate the years lived with disability (YLD). The DALY was estimated as the sum of YLL and YLD. Rates were estimated per 100,000 inhabitants, distributed by age and gender. Results: A total of 131 records were examined, and 572.5 DALYs were estimated, yielding a rate of 593.1 DALYs/100,000 inhabitants. The rate among men amounted to 780.7 DALYs/100,000, whereas among women it was 417.1 DALYs/100,000. The most affected age groups were 30-44 years for men and 60-69 years for women. Conclusion: The burden of disease due to AIDS in the city of Tubarao was relatively high compared with the global trend. The mortality component accounted for more than 90% of the burden of disease.
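The YLL computation described above (the gap between a standardized life expectancy and age at death, discounted at 3% per year) can be sketched as follows. This is a hedged illustration, not the study's code: the continuous-discounting formula is the one commonly used in burden-of-disease work, and the ages in the usage note are made up.

```python
import math

DISCOUNT = 0.03  # 3% annual discount rate, as stated in the study

def yll_per_death(age_at_death, standard_life_expectancy):
    """Discounted years of life lost for one death.

    With continuous discounting at rate r, L remaining years contribute
    (1/r) * (1 - exp(-r * L)) rather than L undiscounted years.
    """
    lost = max(0.0, standard_life_expectancy - age_at_death)
    return (1.0 / DISCOUNT) * (1.0 - math.exp(-DISCOUNT * lost))

def daly(total_yll, total_yld):
    """The DALY is the sum of the mortality (YLL) and morbidity (YLD) components."""
    return total_yll + total_yld

def rate_per_100k(dalys, population):
    """Express a DALY total as a rate per 100,000 inhabitants."""
    return dalys / population * 100_000
```

For example, a death at age 30 against a standard life expectancy of 80 contributes about 25.9 discounted YLLs rather than 50 undiscounted years, which is why discounting noticeably shrinks the mortality component.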
Abstract:
AIDS-related cryptococcal meningitis continues to cause a substantial burden of death in low- and middle-income countries. Diagnostic tests for the detection of cryptococcal capsular polysaccharide antigen (CrAg) in serum and cerebrospinal fluid, by latex agglutination (CrAg-latex) or enzyme-linked immunoassay (EIA), have been available for decades. Better diagnostics in the asymptomatic and symptomatic phases of cryptococcosis are key components in reducing mortality. Recently, the cryptococcal antigen lateral flow assay (CrAg LFA) was added to the diagnostic armamentarium. Unlike the other tests, the CrAg LFA is a dipstick immunochromatographic assay, in a format similar to the home pregnancy test, and requires little or no laboratory infrastructure. The test meets all of the World Health Organization ASSURED criteria (Affordable, Sensitive, Specific, User-friendly, Rapid/robust, Equipment-free, and Delivered). CrAg LFA in serum, plasma, whole blood, or cerebrospinal fluid is useful for the diagnosis of disease caused by Cryptococcus species. The CrAg LFA has better analytical sensitivity for C. gattii than CrAg-latex or EIA. Prevention of cryptococcal disease is a new application of the CrAg LFA, via screening of blood for subclinical infection in asymptomatic HIV-infected persons with CD4 counts < 100 cells/µL who are not receiving effective antiretroviral therapy. CrAg screening of leftover plasma specimens after CD4 testing can identify persons with asymptomatic infection who urgently require pre-emptive fluconazole and who would otherwise progress to symptomatic infection and/or die.
Abstract:
Background: Brain natriuretic peptide is a predictor of mortality in multiple cardiovascular diseases, but its value in patients with chronic kidney disease is still a matter of debate. Patients and methods: We studied 48 haemodialysis patients with a mean age of 70.0±13.9 years, 62.5% female, 43.8% diabetic, with a mean haemodialysis time of 38.1±29.3 months. To evaluate the role of brain natriuretic peptide as a prognostic factor in this population, we performed a two-session evaluation of plasma brain natriuretic peptide concentrations before and after mid-week haemodialysis and correlated them with hospitalisation and with overall and cardiovascular mortality over a two-year period. Results: There were no significant variations between pre- and post-haemodialysis plasma brain natriuretic peptide concentrations. Pre- and post-haemodialysis brain natriuretic peptide concentrations were significantly greater in patients who died from all causes (p=0.034 and p=0.001, respectively) and from cardiovascular causes (p=0.043 and p=0.001, respectively). Patients who were hospitalised during the two-year study period also presented greater pre- and post-haemodialysis brain natriuretic peptide concentrations (p=0.03 and p=0.036, respectively). Patients with mean brain natriuretic peptide concentrations ≥ 390 pg/mL showed significantly lower survival at the end of the two-year study period. Conclusion: Brain natriuretic peptide was a good predictor of morbidity and mortality (overall and cardiovascular) in our population.
Abstract:
Although protease inhibitors have revolutionized the therapy of chronic hepatitis C (CHC), the concomitant use of pegylated interferon (PEG-IFN) and ribavirin (RBV) is associated with a high rate of adverse effects. In this study, we evaluated the consequences of PEG-IFN and RBV therapy and their relationship with mortality in patients with cirrhosis. METHODS: Medical records of CHC patients who underwent treatment with PEG-IFN and RBV in a public hospital in Brazil were evaluated. All patients with cirrhosis were selected, and their clinical and laboratory characteristics, response to treatment, side effects and mortality were evaluated. RESULTS: Of the 1,059 patients with CHC, 257 cirrhotic patients were evaluated. Of these, 45 (17.5%) achieved a sustained viral response (SVR). Early discontinuation of therapy occurred in 105 (40.8%) patients, of which 39 (15.2%) were due to serious adverse effects. The mortality rate among the 257 cirrhotic patients was 4.3%, occurring in 6/242 (2.4%) of the Child-A and in 5/15 (33.3%) of the Child-B patients. In conclusion, the treatment of patients with HCV-related cirrhosis with PEG-IFN and RBV shows a low SVR rate and high mortality, especially in patients with liver dysfunction.
Abstract:
OBJECTIVE: Statins are among the most prescribed drugs worldwide, and their recently discovered anti-inflammatory effect seems to play an important role in inhibiting proinflammatory cytokine production and chemokine expression and in counteracting the harmful effects of sepsis on the coagulation system. We performed a meta-analysis of all randomized controlled trials ever published on statin therapy in septic patients to evaluate its effect on survival and length of hospital stay. DATA SOURCES AND STUDY SELECTION: Articles were assessed by four trained investigators, with divergences resolved by consensus. BioMedCentral, PubMed, Embase and the Cochrane Central Register of clinical trials were searched for pertinent studies. Inclusion criteria were random allocation to treatment and comparison of statins versus any comparator in septic patients. DATA EXTRACTION AND SYNTHESIS: Data from 650 patients in 5 randomized controlled studies were analyzed. No difference in mortality between patients receiving statins and controls (44/322 [14%] in the statin group vs 50/328 [15%] in the control arm, RR = 0.90 [95% CI 0.65 to 1.26], p = 0.6) was observed. No difference in hospital stay (p = 0.7) was found. CONCLUSIONS: Published data show that statin therapy has no effect on mortality in the overall population of adult septic patients. Scientific evidence on the role of statins in septic patients is still limited, and larger randomized trials should be performed on this topic.
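The reported point estimate can be approximated directly from the raw counts. The sketch below uses the standard log-RR normal approximation for the confidence interval, which is not the meta-analytic pooling actually used across the five trials, so the interval it produces differs slightly from the published one.

```python
import math

def risk_ratio(events_treat, n_treat, events_ctrl, n_ctrl, z=1.96):
    """Risk ratio with a 95% CI via the log-RR normal approximation.

    SE(log RR) = sqrt(1/a - 1/n1 + 1/b - 1/n2) for a/n1 vs b/n2.
    """
    rr = (events_treat / n_treat) / (events_ctrl / n_ctrl)
    se_log_rr = math.sqrt(
        1 / events_treat - 1 / n_treat + 1 / events_ctrl - 1 / n_ctrl
    )
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# Counts from the meta-analysis: 44/322 deaths with statins vs 50/328 controls.
rr, lo, hi = risk_ratio(44, 322, 50, 328)
```

This crude 2x2 calculation reproduces the reported RR of about 0.90; the published interval (0.65 to 1.26) reflects trial-level pooling rather than this single aggregated table.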
Abstract:
OBJECTIVE: Although evidence has shown that ischemic heart disease (IHD) in vascular surgery patients has a negative impact on the prognosis after surgery, it is unclear whether directed treatment of IHD may influence cause-specific and overall mortality. The objective of this study was to determine the prognostic implication of coronary revascularization (CR) on overall and cause-specific mortality in vascular surgery patients. METHODS: Patients undergoing surgery for abdominal aortic aneurysm, carotid artery stenosis, or peripheral artery disease in a university hospital in The Netherlands between January 2003 and December 2011 were retrospectively included. Survival estimates were obtained by Kaplan-Meier and Cox regression analysis. RESULTS: A total of 1104 patients were included. Adjusted survival analyses showed that IHD significantly increased the risk of overall mortality (hazard ratio [HR], 1.50; 95% confidence interval, 1.21-1.87) and cardiovascular death (HR, 1.93; 95% confidence interval, 1.35-2.76). Compared with those without CR, patients previously undergoing CR had similar overall mortality (HR, 1.38 vs 1.62; P = .274) and cardiovascular mortality (HR, 1.83 vs 2.02; P = .656). Nonrevascularized IHD patients were more likely to die of IHD (6.9% vs 35.7%), whereas revascularized IHD patients more frequently died of cardiovascular causes unrelated to IHD (39.1% vs 64.3%; P = .018). CONCLUSIONS: This study confirms the significance of IHD for postoperative survival of vascular surgery patients. CR was associated with lower IHD-related death rates. However, it failed to provide an overall survival benefit because of an increased rate of cardiovascular mortality unrelated to IHD. Intensification of secondary prevention regimens may be required to prevent this shift toward non-IHD-related death and thereby improve life expectancy.