14 results for "Injury risk"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Background: Recurrent nerve injury is one of the most important complications of thyroidectomy. During the last decade, nerve monitoring has gained increasing acceptance in several centers as a method to predict and to document nerve function at the end of the operation. We evaluated the efficacy of a nerve monitoring system in a series of patients who underwent thyroidectomy and critically analyzed the negative predictive value (NPV) and positive predictive value (PPV) of the method. Methods: NIM System efficacy was prospectively analyzed in 447 patients who underwent thyroidectomy between 2001 and 2008 (366 female/81 male; 420 white/47 nonwhite; 11 to 82 years of age; median, 43 years old). There were 421 total thyroidectomies and 26 partial thyroidectomies, leading to 868 nerves at risk. The gold standard to evaluate inferior laryngeal nerve function was early postoperative videolaryngoscopy, which was repeated after 4 to 6 months in all patients with abnormal endoscopic findings. Results: At the early evaluation, 858 nerves (98.8%) presented normal videolaryngoscopic features after surgery. Ten paretic/paralyzed nerves (1.2%) were detected (2 unexpected unilateral pareses, 2 unexpected bilateral pareses, 1 unexpected unilateral paralysis, 1 unexpected bilateral paralysis, and 1 expected unilateral paralysis). At the late videolaryngoscopy, only 2 permanent nerve paralyses were noted (0.2%), for an ultimate result of 99.8% functioning nerves. Nerve monitoring showed absent or markedly reduced electrical activity at the end of the operation in 25/868 nerves (2.9%), including all 10 endoscopically compromised nerves, with 15 false-positive results. There were no false-negative results. Therefore, the PPV was 40.0%, and the NPV was 100%. Conclusions: In the present series, nerve monitoring had a very high NPV but a low PPV for the detection of recurrent nerve injury. (C) 2011 Wiley Periodicals, Inc. Head Neck 34: 175-179, 2012
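As a quick check of the predictive values reported above, the arithmetic can be reproduced directly from the stated counts (868 nerves at risk, 25 positive monitoring results of which 10 were true positives and 15 false positives, no false negatives). A minimal sketch, with a helper function of our own naming:

```python
# Sketch: predictive values implied by the counts reported in the abstract.
def predictive_values(tp, fp, tn, fn):
    ppv = tp / (tp + fp)  # positive predictive value
    npv = tn / (tn + fn)  # negative predictive value
    return ppv, npv

# 868 nerves at risk, 25 positive monitoring results (10 true, 15 false positives),
# and no false negatives, so the remaining 843 nerves are true negatives.
ppv, npv = predictive_values(tp=10, fp=15, tn=868 - 25, fn=0)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # PPV = 40.0%, NPV = 100.0%
```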
Abstract:
Judo competitions are divided into weight classes. However, most athletes reduce their body weight in the few days before competition in order to obtain a competitive advantage over lighter opponents. To achieve fast weight reduction, athletes use a number of aggressive nutritional strategies, and many of them thereby place their health at high risk. In collegiate wrestling, a similar problem has been observed, and three wrestlers died in 1997 as a result of rapid weight loss regimens. After these deaths, the National Collegiate Athletic Association implemented a successful weight management program that was shown to improve weight management behavior. No similar program has ever been discussed by judo federations, even though judo competitors show a comparably inappropriate pattern of weight control. In view of this, the basis for a weight control program is provided in this manuscript, as follows: competition should begin within 1 hour after weigh-in, at the latest; each athlete is allowed to be weighed in only once; rapid weight loss as well as artificial rehydration (i.e., saline infusion) methods are prohibited during the entire competition day; athletes should pass a hydration test to have their weigh-in validated; an individual minimum competitive weight (male athletes competing at no less than 7% and females at no less than 12% body fat) should be determined at the beginning of each season; athletes are not allowed to compete in any weight class that requires weight reductions greater than 1.5% of body weight per week. In parallel, educational programs should aim at increasing athletes', coaches' and parents' awareness of the risks of aggressive nutritional strategies as well as of healthier ways to manage body weight properly.
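The minimum-competitive-weight rule above lends itself to a short worked example. The sketch below assumes fat-free mass is preserved during weight loss and applies the 7%/12% body-fat floors from the abstract; the function and the sample numbers are our illustration, not part of the proposed program.

```python
# Sketch: minimum competitive weight, assuming fat-free mass is preserved.
# Body-fat floors from the abstract: 7% for male athletes, 12% for female athletes.
def minimum_competitive_weight(body_weight_kg, body_fat_fraction, sex):
    floor = 0.07 if sex == "male" else 0.12
    fat_free_mass = body_weight_kg * (1.0 - body_fat_fraction)
    return fat_free_mass / (1.0 - floor)

# Example: an 80 kg male judoka measured at 12% body fat could not be certified
# below roughly 75.7 kg under this rule.
print(round(minimum_competitive_weight(80.0, 0.12, "male"), 1))
```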
Abstract:
de Araujo CC, Silva JD, Samary CS, Guimaraes IH, Marques PS, Oliveira GP, do Carmo LGRR, Goldenberg RC, Bakker-Abreu I, Diaz BL, Rocha NN, Capelozzi VL, Pelosi P, Rocco PRM. Regular and moderate exercise before experimental sepsis reduces the risk of lung and distal organ injury. J Appl Physiol 112: 1206-1214, 2012. First published January 19, 2012; doi:10.1152/japplphysiol.01061.2011. Physical activity modulates inflammation and immune response in both normal and pathologic conditions. We investigated whether regular and moderate exercise before the induction of experimental sepsis reduces the risk of lung and distal organ injury and improves survival. One hundred twenty-four BALB/c mice were randomly assigned to two groups: sedentary (S) and trained (T). Animals in the T group ran on a motorized treadmill at moderate intensity, 5% grade, 30 min/day, 3 times a week for 8 wk. Cardiac adaptation to exercise was evaluated using echocardiography. Systolic volume and left ventricular mass were increased in the T compared with the S group. Both T and S groups were further randomized either to sepsis induced by cecal ligation and puncture surgery (CLP) or to sham operation (control). After 24 h, lung mechanics and histology; the degree of cell apoptosis in lung, heart, kidney, liver, and small intestine villi; and interleukin (IL)-6, KC (the murine functional homolog of IL-8), IL-1 beta, IL-10, and the number of cells in bronchoalveolar lavage (BALF) and peritoneal lavage (PLF) fluids as well as plasma were measured. In CLP, T compared with S groups showed: 1) improved survival; 2) reduced lung static elastance, alveolar collapse, collagen and elastic fiber content, number of neutrophils in BALF, PLF, and plasma, as well as lung and distal organ cell apoptosis; and 3) increased IL-10 in BALF and plasma, with reduced IL-6, KC, and IL-1 beta in PLF. In conclusion, regular and moderate exercise before the induction of sepsis reduced the risk of lung and distal organ damage, thus increasing survival.
Abstract:
Increased reactive oxygen species (ROS) promote matrix metalloproteinase (MMP) activities and may underlie cardiomyocyte injury and the degradation of cardiac troponin I (cTI) during acute pulmonary thromboembolism (APT). We examined whether pretreatment or therapy with tempol (a ROS scavenger) prevents the MMP activation and cardiomyocyte injury of APT. Anesthetized sheep received a tempol infusion (1.0 mg/kg/min, i.v.) or saline starting 30 min before or 30 min after APT (autologous blood clots). Control animals received saline. Hemodynamic measurements were performed. MMPs were studied in the right ventricle (RV) by gelatin zymography, a fluorimetric activity assay, and in situ zymography. ROS levels were determined in the RV, and cTI was measured in serum samples. APT increased the pulmonary arterial pressure and pulmonary vascular resistance by 146% and 164%, respectively. Pretreatment or therapy with tempol attenuated these increases. While APT increased RV +dP/dt(max), tempol infusions had no effects. APT increased RV MMP-9 (but not MMP-2) levels. In line with these findings, APT increased RV MMP activities, and this finding was confirmed by in situ zymography. APT increased RV ROS levels, and tempol infusion, before or after APT, blunted the APT-induced increases in MMP-9 levels, MMP activities, in situ MMP activities, and ROS levels in the RV. cTI concentrations increased after APT, and tempol attenuated these increases. RV oxidative stress after APT increases RV MMP activities, leading to the degradation of sarcomeric proteins, including cTI. Antioxidant treatment may prevent MMP activation and protect against cardiomyocyte injury after APT.
Abstract:
The occupational exposure limits for different risk factors in the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are still relatively unknown. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare the neural network-based model proposed by Zurada, Karwowski & Marras (1997) with a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results showed that the discriminant analysis-based model is as effective as the neural network-based model; moreover, it proved more advantageous in terms of cost and time savings for future data gathering.
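For readers unfamiliar with the two approaches being compared, a minimal sketch of such a comparison is shown below using scikit-learn. The feature matrix and labels are synthetic placeholders standing in for job-level risk-factor measurements; this is not the authors' dataset or model configuration.

```python
# Sketch: comparing linear discriminant analysis with a small feed-forward network
# for classifying jobs as high or low LBD risk. X and y are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # e.g., lift rate, load moment, trunk kinematics
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic high/low risk labels

lda = LinearDiscriminantAnalysis()
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)

print("LDA mean CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())
print("MLP mean CV accuracy:", cross_val_score(mlp, X, y, cv=5).mean())
```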
Abstract:
Background and Objectives: Patients who survive acute kidney injury (AKI), especially those with partial renal recovery, present a higher long-term mortality risk. However, there is no consensus on the best time to assess renal function after an episode of acute kidney injury, nor agreement on the definition of renal recovery. In addition, only limited data regarding predictors of recovery are available. Design, Setting, Participants, & Measurements: From 1984 to 2009, 84 adult survivors of acute kidney injury were followed by the same nephrologist (RCRMA) for a median time of 4.1 years. Patients were seen at least once each year after discharge until end-stage renal disease (ESRD) or death. At each consultation serum creatinine was measured and the glomerular filtration rate estimated. Renal recovery was defined as a glomerular filtration rate >= 60 mL/min/1.73 m2. A multiple logistic regression was performed to evaluate factors independently associated with renal recovery. Results: The median length of follow-up was 50 months (30-90 months). All patients had stabilized their glomerular filtration rates by 18 months, and 83% of them stabilized earlier, by 12 months. Renal recovery occurred in 16 patients (19%) at discharge and in 54 (64%) by 18 months. Six patients died and four patients progressed to ESRD during the follow-up period. Age (OR 1.09, p < 0.0001) and serum creatinine at hospital discharge (OR 2.48, p = 0.007) were independent factors associated with non-recovery of renal function. AKI severity, evaluated by peak serum creatinine and the need for dialysis, was not associated with non-recovery. Conclusions: Renal recovery must be evaluated no earlier than one year after an acute kidney injury episode. Nephrology referral should be considered mainly for older patients and those with elevated serum creatinine at hospital discharge.
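The odds ratios quoted above come from exponentiating the coefficients of the multiple logistic regression. A minimal sketch of that step with statsmodels, using synthetic placeholder data rather than the study cohort:

```python
# Sketch: odds ratios for non-recovery as exponentiated logistic-regression coefficients.
# The data frame below is synthetic; it only illustrates the mechanics.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "age": rng.normal(60, 15, 300),
    "discharge_creatinine": rng.normal(2.0, 0.8, 300),
})
logit = 0.08 * (df["age"] - 60) + 0.9 * (df["discharge_creatinine"] - 2.0)
df["non_recovery"] = (rng.random(300) < 1 / (1 + np.exp(-logit))).astype(int)

model = sm.Logit(df["non_recovery"],
                 sm.add_constant(df[["age", "discharge_creatinine"]])).fit(disp=0)
print(np.exp(model.params))  # odds ratio per unit increase in each predictor
```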
Abstract:
Background: The role of an impaired estimated glomerular filtration rate (eGFR) at hospital admission in the outcome of acute kidney injury (AKI) after acute myocardial infarction (AMI) has been underreported. The aim of this study was to assess the influence of an admission eGFR < 60 mL/min/1.73 m2 on the incidence and the early and late mortality of AMI-associated AKI. Methods: A prospective study of 828 AMI patients was performed. AKI was defined as a serum creatinine increase of >= 50% from the time of admission (RIFLE criteria) in the first 7 days of hospitalization. Patients were divided into subgroups according to their eGFR upon hospital admission (MDRD formula, mL/min/1.73 m2) and the development of AKI: eGFR >= 60 without AKI, eGFR < 60 without AKI, eGFR >= 60 with AKI and eGFR < 60 with AKI. Results: Overall, 14.6% of the patients in this study developed AKI. The admission eGFR had no impact on the incidence of AKI. However, the admission eGFR was associated with the outcome of AMI-associated AKI. The adjusted hazard ratios (AHR, Cox multivariate analysis) for 30-day mortality were 2.00 (95% CI 1.11-3.61) for eGFR < 60 without AKI, 4.76 (95% CI 2.45-9.26) for eGFR >= 60 with AKI and 6.27 (95% CI 3.20-12.29) for eGFR < 60 with AKI. Only an admission eGFR < 60 with AKI was significantly associated with a 30-day to 1-year mortality hazard (AHR 3.05, 95% CI 1.50-6.19). Conclusions: AKI development was associated with an increased early mortality hazard in AMI patients with either preserved or impaired admission eGFR. Only the combination of impaired admission eGFR and AKI was associated with an increased hazard for late mortality among these patients.
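Two definitions used above are compact enough to sketch: admission eGFR from the 4-variable MDRD equation, and the AKI flag from a >= 50% creatinine rise over the admission value. The MDRD constants below are the commonly cited IDMS-traceable form, written here as an assumption rather than taken from the paper, so treat the sketch as illustrative only.

```python
# Sketch: admission eGFR (4-variable MDRD, IDMS-traceable constants as commonly cited)
# and an AKI flag per the >= 50% creatinine-rise criterion used in the abstract.
def mdrd_egfr(scr_mg_dl, age_years, female, black):
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr  # mL/min/1.73 m2

def aki_by_creatinine_rise(admission_scr, peak_scr_first_7_days):
    return peak_scr_first_7_days >= 1.5 * admission_scr  # >= 50% increase from admission

egfr = mdrd_egfr(scr_mg_dl=1.4, age_years=68, female=False, black=False)
group = ("eGFR < 60" if egfr < 60 else "eGFR >= 60") + \
        (" with AKI" if aki_by_creatinine_rise(1.4, 2.3) else " without AKI")
print(round(egfr, 1), group)
```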
Abstract:
Arterial hypertension is a major risk factor for ischemic stroke. However, the management of preexisting hypertension is still controversial in the treatment of acute stroke in hypertensive patients. The present study evaluates the influence of preserving hypertension during focal cerebral ischemia on stroke outcome in a rat model of chronic hypertension, the spontaneously hypertensive rat (SHR). Focal cerebral ischemia was induced by transient (1 h) occlusion of the middle cerebral artery, during which mean arterial blood pressure was maintained at normotension (110-120 mm Hg, group 1, n=6) or hypertension (160-170 mm Hg, group 2, n=6) using phenylephrine. T2-, diffusion- and perfusion-weighted MRI were performed serially at five time points: before and during ischemia, and at 1, 4 and 7 days after ischemia. Lesion volume and brain edema were estimated from apparent diffusion coefficient maps and T2-weighted images. Regional cerebral blood flow (rCBF) was measured within and outside the perfusion-deficient lesion and in the corresponding regions of the contralesional hemisphere. Neurological deficits were evaluated after reperfusion. Infarct volume, edema, and neurological deficits were significantly reduced in group 2 vs. group 1. In addition, higher values and more rapid restoration of rCBF were observed in group 2, whereas rCBF in both hemispheres was significantly decreased in group 1. Maintaining preexisting hypertension alleviates ischemic brain injury in SHR by increasing collateral circulation to the ischemic region and allowing rapid restoration of rCBF. The data suggest that maintaining preexisting hypertension is a valuable approach to managing hypertensive patients suffering from acute ischemic stroke. Published by Elsevier B.V.
Abstract:
Background and Purpose: The pattern of antenatal brain injury varies with gestational age at the time of insult. Deep brain nuclei are often injured at older gestational ages. Having previously shown postnatal hypertonia after preterm fetal rabbit hypoxia-ischemia, the objective of this study was to investigate the causal relationship between the dynamic regional pattern of brain injury on MRI and the evolution of muscle tone in the near-term rabbit fetus. Methods: Serial MRI was performed on New Zealand white rabbit fetuses to determine equipotency of fetal hypoxia-ischemia during uterine ischemia, comparing 29 days gestation (E29, 92% gestation) with E22 and E25. E29 postnatal kits at 4, 24, and 72 hours after hypoxia-ischemia underwent T2- and diffusion-weighted imaging. Quantitative assessments of tone were made serially using a torque apparatus in addition to clinical assessments. Results: Based on the brain apparent diffusion coefficient, 32 minutes of uterine ischemia was selected for E29 fetuses. At E30, 58% of the survivors manifested hind limb hypotonia. By E32, 71% of the hypotonic kits developed dystonic hypertonia. Marked and persistent apparent diffusion coefficient reduction in the basal ganglia, thalamus, and brain stem was predictive of these motor deficits. Conclusions: MRI observation of deep brain injury 6 to 24 hours after near-term hypoxia-ischemia predicts dystonic hypertonia postnatally. Torque-displacement measurements indicate that motor deficits in rabbits progressed from initial hypotonia to hypertonia, similar to human cerebral palsy, but in a compressed timeframe. The presence of deep brain injury and the quantitative shift from hypo- to hypertonia may identify patients at risk for developing cerebral palsy. (Stroke. 2012;43:2757-2763.)
Abstract:
The objective of this study was to identify, among motorcyclists involved in traffic incidents, the factors associated with the risk of injury. In 2004, a total of 2,362 motorcyclists were involved in traffic incidents in the city of Maringa-PR, according to records from the local Military Police. Multivariate analysis was applied to identify the factors associated with the presence of injury. A significantly higher probability of injury was observed among motorcyclists involved in collisions (odds ratio = 11.19) and falls (odds ratio = 3.81); the estimated odds ratio for females was close to four, and those involved in incidents with up to two vehicles were 2.63 times more likely to be injured. Women involved in motorcycle falls and in collisions involving up to two vehicles stood out as a high-risk group for injuries.
Abstract:
Background: Lung transplantation has become a standard procedure for some end-stage lung diseases, but primary graft dysfunction (PGD) is an inherent problem that impacts early and late outcomes. The aim of this study was to define the incidence, risk factors, and impact of mechanical ventilation time on mortality rates in a retrospective cohort of lung transplantations performed at a single institution. Methods: We performed a retrospective study of 118 lung transplantations performed between January 2003 and July 2010. The most severe form of PGD (grade III), as defined at 48 and 72 hours, was examined for risk factors by multivariable logistic regression models using donor, recipient, and transplant variables. Results: The overall incidence of PGD was 19.8% at 48 hours and 15.4% at 72 hours. According to the multivariate analysis, the risk factors associated with PGD were donor smoking history at 48 hours (adjusted odds ratio [OR], 4.83; 95% confidence interval [CI], 1.236-18.896; P = .022) and older donor age at 72 hours (adjusted OR, 1.046; 95% CI, 0.997-1.098; P = .022). Operative mortality was 52.9% among patients with PGD versus 20.3% at 48 hours (P = .012). At 72 hours, the mortality rate was 58.3% versus 21.2% (P = .013). The 90-day mortality was also higher among patients with PGD. Mechanical ventilation time was longer in patients with PGD III at 48 hours, namely a mean of 72 versus 24 hours (P = .001). When PGD was defined at 72 hours, the mean ventilation time was even longer, namely 151 versus 24 hours (P < .001). The mean overall survival for patients who developed PGD at 48 hours was 490.9 versus 1665.5 days for subjects without PGD (P = .001). Considering PGD only at 72 hours, the mean survival was 177.7 days for the PGD group and 1628.9 days for the other patients (P < .001). Conclusion: PGD had an important impact on operative and 90-day mortality rates, mechanical ventilation time, and overall survival among lung transplant patients. PGD at 72 hours was a better predictor of lung transplant outcomes than PGD at 48 hours. Donor smoking history and advanced donor age were risk factors for the development of PGD.
Abstract:
Background: The causes of death contributing to long-term mortality after acute kidney injury (AKI) have not been well studied. The purpose of this study was to evaluate the role of comorbidities and the causes of death in long-term mortality after AKI. Methodology/Principal Findings: We retrospectively studied 507 patients who experienced AKI in 2005-2006 and were discharged free from dialysis. In June 2008 (median: 21 months after AKI), we found that 193 (38%) patients had died. This mortality is much higher than that of the population of Sao Paulo City, even after adjustment for age. A multiple survival analysis was performed using a Cox proportional hazards regression model and showed that death was associated with a Khan's index indicating high risk [adjusted hazard ratio 2.54 (1.38-4.66)], chronic liver disease [1.93 (1.15-3.22)], admission to a non-surgical ward [1.85 (1.30-2.61)] and a second AKI episode during the same hospitalization [1.74 (1.12-2.71)]. AKI severity, evaluated either by the worst stage reached during AKI (P=0.20) or by the need for dialysis (P=0.12), was not associated with death. The cause of death was identified by a death certificate in 85% of the non-survivors. Among those who died from circulatory system diseases (the main cause of death), 59% had already suffered from hypertension, 34% from diabetes, 47% from heart failure, 38% from coronary disease, and 66% had had a glomerular filtration rate < 60 mL/min/1.73 m2 prior to the AKI episode. Among those who died from neoplasms, 79% already had the disease previously. Conclusions: Among AKI survivors who were discharged free from dialysis, the increased long-term mortality was associated with their pre-existing chronic conditions and not with the severity of the AKI episode. These findings suggest that these survivors should have medical follow-up after hospital discharge and that all efforts should be made to control their comorbidities.
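The survival analysis described above is a standard Cox proportional hazards fit; the adjusted hazard ratios are the exponentiated coefficients. A minimal sketch of the mechanics with the lifelines package, run on lifelines' bundled example dataset because the cohort data are not available here:

```python
# Sketch: Cox proportional hazards regression with lifelines. The bundled Rossi
# recidivism dataset stands in for the AKI cohort purely to show the mechanics.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()          # 'week' = follow-up time, 'arrest' = event indicator
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
print(cph.hazard_ratios_)  # exponentiated coefficients, i.e. adjusted hazard ratios
```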
Abstract:
It has recently been suggested that regular exercise reduces lung function decline and risk of chronic obstructive pulmonary disease (COPD) among active smokers; however, the mechanisms involved in this effect remain poorly understood. The present study evaluated the effects of regular exercise training in an experimental mouse model of chronic cigarette smoke exposure. Male C57BL/6 mice were divided into four groups (control, exercise, smoke and smoke+exercise). For 24 weeks, we measured respiratory mechanics, mean linear intercept, inflammatory cells and reactive oxygen species (ROS) in bronchoalveolar lavage (BAL) fluid, collagen deposition in alveolar walls, and the expression of antioxidant enzymes, matrix metalloproteinase 9, tissue inhibitor of metalloproteinase (TIMP) 1, interleukin (IL)-10 and 8-isoprostane in alveolar walls. Exercise attenuated the decrease in pulmonary elastance (p<0.01) and the increase in mean linear intercept (p=0.003) induced by cigarette smoke exposure. Exercise substantially inhibited the increase in ROS in BAL fluid and 8-isoprostane expression in lung tissue induced by cigarette smoke. In addition, exercise significantly inhibited the decreases in IL-10, TIMP1 and CuZn superoxide dismutase induced by exposure to cigarette smoke. Exercise also increased the number of cells expressing glutathione peroxidase. Our results suggest that regular aerobic physical training of moderate intensity attenuates the development of pulmonary disease induced by cigarette smoke exposure.
Abstract:
Coffee intake has been inversely related to the incidence of liver diseases, although there is controversy over whether these beneficial effects on human health are due to caffeine or to other specific components of this popular beverage. Thus, this study evaluated the protective effects of coffee or caffeine intake on liver injury induced by repeated thioacetamide (TAA) administration in male Wistar rats. Rats were randomized into five groups: one untreated group (G1) and four groups (G2-G5) treated with the hepatotoxicant TAA (200 mg/kg b.w., i.p.) twice a week for 8 weeks. Concomitantly, rats received tap water (G1 and G2), conventional coffee (G3), decaffeinated coffee (G4) or 0.1% caffeine (G5). After 8 weeks of treatment, rats were killed and blood and liver samples were collected. Conventional and decaffeinated coffee and caffeine intake significantly reduced serum levels of alanine aminotransferase (ALT) (p < 0.001) and oxidized glutathione (p < 0.05), fibrosis/inflammation scores (p < 0.001), collagen volume fraction (p < 0.01) and transforming growth factor beta-1 (TGF-beta 1) protein expression (p = 0.001) in the liver of TAA-treated groups. In addition, conventional coffee and caffeine intake significantly reduced proliferating cell nuclear antigen (PCNA) S-phase indexes (p < 0.001), but only conventional coffee reduced cleaved caspase-3 indexes (p < 0.001), active metalloproteinase 2 (p = 0.004) and the number of glutathione S-transferase placental form (GST-P)-positive preneoplastic lesions (p < 0.05) in the liver of TAA-treated groups. In conclusion, conventional coffee and 0.1% caffeine intake had greater beneficial effects than decaffeinated coffee against liver injury induced by TAA in male Wistar rats.