Abstract:
AIMS No standardized local thrombolysis regimen exists for the treatment of pulmonary embolism (PE). We retrospectively investigated the efficacy and safety of fixed low-dose ultrasound-assisted catheter-directed thrombolysis (USAT) for intermediate- and high-risk PE. METHODS AND RESULTS Fifty-two patients (65 ± 14 years), of whom 14 had high-risk PE (troponin positive in all) and 38 intermediate-risk PE (troponin positive in 91%), were treated with intravenous unfractionated heparin and USAT using 10 mg of recombinant tissue plasminogen activator per device over the course of 15 h. Bilateral USAT was performed in 83% of patients. During 3-month follow-up, two [3.8%; 95% confidence interval (CI) 0.5–13%] patients died (one from cardiogenic shock and one from recurrent PE). Major non-fatal bleeding occurred in two (3.8%; 95% CI, 0.5–13%) patients: one intrathoracic bleeding after cardiopulmonary resuscitation requiring transfusion, and one intrapulmonary bleeding requiring lobectomy. Mean pulmonary artery pressure decreased from 37 ± 9 mmHg at baseline to 25 ± 8 mmHg at 15 h (P < 0.001), and cardiac index increased from 2.0 ± 0.7 to 2.7 ± 0.9 L/min/m² (P < 0.001). Echocardiographic right-to-left ventricular end-diastolic dimension ratio decreased from 1.42 ± 0.21 at baseline to 1.06 ± 0.23 at 24 h (n = 21; P < 0.001). The greatest haemodynamic benefit from USAT was found in patients with high-risk PE and in those with symptom duration < 14 days. CONCLUSION A standardized catheter intervention approach using fixed low-dose USAT for the treatment of intermediate- and high-risk PE was associated with rapid improvement in haemodynamic parameters and low rates of bleeding complications and mortality.
Abstract:
Polycyclic aromatic compounds (PACs) in air particulate matter contribute considerably to the health risk of air pollution. The objectives of this study were to assess the occurrence and variation in concentrations and sources of PM2.5-bound PACs [oxygenated PAHs (OPAHs), nitro-PAHs and parent-PAHs] sampled from the atmosphere of a typical Chinese megacity (Xi'an), to study the influence of meteorological conditions on PACs, and to estimate the lifetime excess cancer risk to the residents of Xi'an from inhalation of PM2.5-bound PACs. To achieve these objectives, we sampled 24-h PM2.5 aerosols (once every 6 days, from 5 July 2008 to 8 August 2009) from the atmosphere of Xi'an and measured the concentrations of PACs in them. The PM2.5-bound concentrations of Σcarbonyl-OPAHs, Σhydroxyl + carboxyl-OPAHs, Σnitro-PAHs and Σalkyl + parent-PAHs were 5–22, 0.2–13, 0.3–7, and 7–387 ng m⁻³, respectively, markedly higher than in most western cities. This represented 0.01–0.4% of the mass of organic C in PM2.5 and 0.002–0.06% of the total mass of PM2.5. The sums of the concentrations of each compound group had winter-to-summer ratios ranging from 3 to 8, and most individual OPAHs and nitro-PAHs had higher concentrations in winter than in summer, suggesting a dominant influence of emissions from household heating and of winter meteorological conditions. Ambient temperature, air pressure, and wind speed explained a large part of the temporal variation in PAC concentrations. The lifetime excess cancer risk from inhalation (attributable to selected PAHs and nitro-PAHs) was sixfold higher in winter (averaging 1450 persons per million residents of Xi'an) than in summer. Our results call for the development of emission control measures.
Abstract:
Definitions of shock and resuscitation endpoints traditionally focus on blood pressures and cardiac output. This carries a high risk of overemphasizing systemic hemodynamics at the cost of tissue perfusion. In line with novel shock definitions and evidence of the lack of a correlation between macro- and microcirculation in shock, we recommend that macrocirculatory resuscitation endpoints, particularly arterial and central venous pressure as well as cardiac output, be reconsidered. In this viewpoint article, we propose a three-step approach to resuscitation endpoints in shock of all origins. This approach targets only a minimum individual and context-sensitive mean arterial blood pressure (for example, 45 to 50 mm Hg) to preserve heart and brain perfusion. Further resuscitation is guided exclusively by endpoints of tissue perfusion, irrespective of the presence of arterial hypotension ('permissive hypotension'). Finally, optimization of individual tissue (for example, renal) perfusion is targeted. Prospective clinical studies are necessary to confirm the postulated benefits of targeting these resuscitation endpoints.
Abstract:
Extracorporeal shock waves are defined as a sequence of sonic pulses characterized by a high peak pressure of over 100 MPa, a fast pressure rise, and a short life cycle. In the 1980s, extracorporeal shock wave lithotripsy (ESWL) was first used for the treatment of urolithiasis. Orthopedic surgeons use extracorporeal shock wave therapy (ESWT) to treat non-union fractures, tendinopathies and osteonecrosis. The first application of ESWT in dermatology was for recalcitrant skin ulcers. Several studies in the last 10 years have shown that ESWT promotes angiogenesis, increases perfusion in ischemic tissues, decreases inflammation, enhances cell differentiation and accelerates wound healing. We successfully treated a non-healing chronic venous leg ulcer with ESWT. Furthermore, we observed an improvement of lymphatic drainage after application of ESWT. We are confident that ESWT is a non-invasive, practical, safe and efficient physical treatment modality for recalcitrant leg ulcers.
Abstract:
Canine acute gastric dilatation-volvulus (GDV) is a life-threatening condition of multifactorial origin. The risk of developing GDV is influenced by a variety of factors, including breed, age, gender, temperament, diet and management. A relationship between seasonal variations and the frequency of GDV has been previously documented, although no association was found with any specific climatic event. Variables in weather conditions within a defined geographic region were investigated in a retrospective study of 287 client-owned dogs diagnosed with GDV between 1992 and 1999. Monthly incidences were evaluated, and differences in atmospheric temperature, humidity and pressure between days on which GDV cases were observed and days on which no case was presented were examined. Although temperature was significantly associated with the occurrence of GDV, the difference in temperatures between days with and days without GDV cases was so small that it is unlikely to be of clinical relevance. Moreover, no significant association was found between GDV occurrence and atmospheric pressure or humidity, and a seasonal variation in GDV incidence was not observed.
Abstract:
BACKGROUND AND PURPOSE Visit-to-visit variability in systolic blood pressure (SBP) is associated with an increased risk of stroke and was reduced in randomized trials by calcium channel blockers and diuretics but not by renin-angiotensin system inhibitors. However, time of day effects could not be determined. Day-to-day variability on home BP readings predicts stroke risk and potentially offers a practical method of monitoring response to variability-directed treatment. METHODS SBP mean, maximum, and variability (coefficient of variation=SD/mean) were determined in 500 consecutive transient ischemic attack or minor stroke patients on 1-month home BP monitoring (3 BPs, 3× daily). Hypertension was treated to a standard protocol. Differences in SBP variability from 3 to 10 days before to 8 to 15 days after starting or increasing calcium channel blockers/diuretics versus renin-angiotensin system inhibitors versus both were compared by general linear models, adjusted for risk factors and baseline BP. RESULTS Among 288 eligible interventions, variability in SBP was reduced after increased treatment with calcium channel blockers/diuretics versus both versus renin-angiotensin system inhibitors (-4.0 versus 6.9 versus 7.8%; P=0.015), primarily because of effects on maximum SBP (-4.6 versus -1.0 versus -1.0%; P=0.001), with no differences in effect on mean SBP. Class differences were greatest for early-morning SBP variability (3.6 versus 17.0 versus 38.3; P=0.002) and maximum (-4.8 versus -2.0 versus -0.7; P=0.001), with no effect on midmorning (P=0.29), evening (P=0.65), or diurnal variability (P=0.92). CONCLUSIONS After transient ischemic attack or minor stroke, calcium channel blockers and diuretics reduced variability and maximum home SBP, primarily because of effects on morning readings. Home BP readings enable monitoring of response to SBP variability-directed treatment in patients with recent cerebrovascular events.
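The variability measure used in this abstract is the coefficient of variation (SD/mean) of repeated home SBP readings. A minimal sketch of that computation, using hypothetical readings rather than study data:

```python
import statistics

def sbp_variability(readings):
    """Coefficient of variation (CV = SD / mean) of systolic BP readings."""
    return statistics.stdev(readings) / statistics.mean(readings)

# Hypothetical week of morning home SBP readings (mmHg)
readings = [148, 152, 139, 160, 145, 155, 150, 142]
print(f"CV = {sbp_variability(readings):.3f}")
```

A higher CV flags a more variable blood pressure independently of its mean level, which is why the metric can separate variability-directed treatment effects from effects on mean SBP.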
Abstract:
Sodium is the most abundant extracellular cation and therefore pivotal in determining fluid balance. At the beginning of life, a positive sodium balance is needed to grow. Newborns and preterm infants tend to lose sodium via their kidneys and therefore need adequate sodium intake. Among older children and adults, however, excessive salt intake leads to volume expansion and arterial hypertension. Children who are overweight, born preterm, or small for gestational age, and African American children, are at increased risk of developing high blood pressure from a high salt intake because they are more likely to be salt sensitive. In the developed world, salt intake is generally above the recommended level, even among children. Although a positive sodium balance is needed for growth during the first year of life, in older children a sodium-poor diet seems to have the same cardiovascular protective effects as among adults. This is relevant, since: (1) a blood pressure tracking phenomenon has been recognized; (2) the development of taste preferences is important during childhood; and (3) salt intake is often associated with the consumption of sugar-sweetened beverages (predisposing children to weight gain).
Abstract:
Although employees are encouraged to take exercise after work to keep physically fit, they should not suffer injury. Some sports injuries that occur after work appear to be work-related and preventable. This study investigated whether cognitive failure mediates the influence of mental work demands and conscientiousness on risk-taking and risky and unaware behaviour during after-work sports activities. Participants were 129 employees (36% female) who regularly took part in team sports after work. A structural equation model showed that work-related cognitive failure significantly mediated the influence of mental work demands on risky behaviour during sports (p < .05) and also mediated the directional link between conscientiousness and risky behaviour during sports (p < .05). A path from risky behaviour during sports to sports injuries in the last four weeks was also significant (p < .05). Performance constraints, time pressure, and task uncertainty are likely to increase cognitive load and thereby boost cognitive failures both during work and sports activities after work. Some sports injuries after work could be prevented by addressing the issue of work redesign.
Abstract:
Background. In the field of information technology (IT), time pressure is common. Working together on the same task under tight deadlines increases the risk of social stressors, that is, tensions and conflicts at work. Purpose. This field study tested the associations of both time pressure and social stressors with blood pressure during work. Method. Seven employees – staff of a small IT enterprise – participated in repeated ambulatory blood pressure measurements over the course of one week. Time pressure and social stressors at work were assessed by questionnaire at the beginning of the study. Results. Multilevel regression analyses of 138 samples revealed that higher levels of time pressure were related to marginally significant increases in mean arterial blood pressure at noon and in the afternoon. In addition, higher levels of social stressors at work were significantly associated with elevated mean arterial pressure in the afternoon. Conclusion. Findings support the view that threats to the social self play an important role in occupational health.
Abstract:
PURPOSE Rapid assessment and intervention are important for the prognosis of acutely ill patients admitted to the emergency department (ED). The aim of this study was to prospectively develop and validate a model predicting the risk of in-hospital death based on all information available at the time of ED admission, and to compare its discriminative performance with a non-systematic risk estimate by the triaging first health-care provider. METHODS Prospective cohort analysis based on a multivariable logistic regression for the probability of death. RESULTS A total of 8,607 consecutive admissions of 7,680 patients admitted to the ED of a tertiary care hospital were analysed. The most frequent APACHE II diagnostic categories at the time of admission were neurological (2,052, 24 %), trauma (1,522, 18 %), infection categories [1,328, 15 %; including sepsis (357, 4.1 %), severe sepsis (249, 2.9 %), septic shock (27, 0.3 %)], cardiovascular (1,022, 12 %), gastrointestinal (848, 10 %) and respiratory (449, 5 %). The predictors of the final model were age, prolonged capillary refill time, blood pressure, mechanical ventilation, oxygen saturation index, Glasgow coma score and APACHE II diagnostic category. The model showed good discriminative ability, with an area under the receiver operating characteristic curve of 0.92, and good internal validity. The model performed significantly better than non-systematic triaging of the patient. CONCLUSIONS The use of the prediction model can facilitate the identification of ED patients with higher mortality risk. The model performs better than a non-systematic assessment and may facilitate more rapid identification and commencement of treatment of patients at risk of an unfavourable outcome.
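A multivariable logistic model of the kind described maps predictor values to a death probability via the logistic function. The sketch below is illustrative only: the coefficients and the reduced predictor set are hypothetical placeholders, not the published model's weights.

```python
import math

# Hypothetical coefficients for illustration -- the abstract does not
# report the fitted weights of the published model.
COEFS = {
    "intercept": -6.0,
    "age_per_decade": 0.35,
    "prolonged_cap_refill": 1.2,
    "mechanical_ventilation": 1.8,
    "gcs_below_9": 1.5,
}

def mortality_probability(age_years, prolonged_cap_refill, ventilated, gcs):
    """Logistic regression: p = 1 / (1 + exp(-linear_predictor))."""
    lp = (COEFS["intercept"]
          + COEFS["age_per_decade"] * age_years / 10
          + COEFS["prolonged_cap_refill"] * prolonged_cap_refill
          + COEFS["mechanical_ventilation"] * ventilated
          + COEFS["gcs_below_9"] * (gcs < 9))
    return 1 / (1 + math.exp(-lp))

# 75-year-old with prolonged capillary refill, not ventilated, GCS 14
print(f"{mortality_probability(75, True, False, 14):.3f}")
```

In the real model, discriminative ability would then be assessed by the area under the ROC curve, as reported (0.92) in the abstract.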
Abstract:
BACKGROUND The copy number variation (CNV) in beta-defensin genes (DEFB) on human chromosome 8p23 has been proposed to contribute to phenotypic differences in inflammatory diseases. However, determination of the exact DEFB copy number (CN) is a major challenge in association studies. Quantitative real-time PCR (qPCR), paralog ratio tests (PRT) and multiplex ligation-dependent probe amplification (MLPA) have been used extensively to determine DEFB CN in different laboratories, but inter-method inconsistencies have been observed frequently. In this study we asked which of the three methods is superior for DEFB CN determination. RESULTS We developed a clustering approach for MLPA and PRT to statistically correlate data from a single experiment. We then compared qPCR, a newly designed PRT and MLPA for DEFB CN determination in 285 DNA samples. We found that MLPA had the best convergence and clustering of the raw data and the highest call rate. In addition, the concordance rates between MLPA or PRT and qPCR (32.12% and 37.99%, respectively) were unacceptably low, with qPCR underestimating CN. The concordance rate between MLPA and PRT (90.52%) was high, but PRT systematically underestimated CN by one in a subset of samples. In these samples, a sequence variant causing complete PCR dropout of the respective DEFB cluster copies was found in one primer binding site of one of the targeted paralogous pseudogenes. CONCLUSION MLPA is superior to PRT, and even more so to qPCR, for DEFB CN determination. Although the applied PRT provides reliable results in most cases, such a test is particularly sensitive to low-frequency sequence variants, which preferably accumulate in loci such as pseudogenes that are most likely not under selective pressure. In light of the superior performance of multiplex assays, the drawbacks of single PRTs could be overcome by combining more test markers.
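The inter-method agreement reported above (e.g. 90.52% between MLPA and PRT) is a simple concordance rate: the fraction of samples for which two methods return the same copy-number call. A minimal sketch over hypothetical calls:

```python
def concordance_rate(calls_a, calls_b):
    """Fraction of samples for which two CN-typing methods agree."""
    if len(calls_a) != len(calls_b):
        raise ValueError("call lists must be the same length")
    agree = sum(a == b for a, b in zip(calls_a, calls_b))
    return agree / len(calls_a)

# Hypothetical DEFB CN calls from two methods on four samples
mlpa = [4, 5, 3, 6]
prt  = [4, 5, 3, 5]   # one call underestimated by one
print(concordance_rate(mlpa, prt))  # → 0.75
```

A systematic underestimation by one copy, as the abstract describes for a subset of PRT samples, depresses this rate even though the two methods remain perfectly correlated.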
Abstract:
In recent decades, a number of global frameworks have been developed for disaster risk reduction (DRR). The Hyogo Framework for Action 2005–2015 and its successor document, the Sendai Framework for Disaster Risk Reduction, adopted in Japan in March 2015, provide general guidance for reducing risks from natural hazards. This is particularly important for mountainous areas, but DRR for mountain areas and sustainable mountain development have received little attention in the recent policy debate. The question remains whether the Hyogo and Sendai frameworks can provide guidance for sustainable mountain development. This article evaluates the 2 frameworks in light of the special challenges of DRR in mountain areas and argues that, while the frameworks offer valuable guidance, they need to be further adapted for local contexts—particularly for mountain areas, which require special attention because of changing risk patterns such as the effects of climate change and high land-use pressure.
Abstract:
Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Due to the catastrophic events of recent years, the public is more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards have led to a changing perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk, as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings) and vulnerability), between 1950 and 2000 are quantified. An additional focus is put on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. Since the focus of the study was on a multi-temporal risk assessment, the models used could be replaced with other snow avalanche models providing the same functionality. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of the buildings is understood as a degree of loss to a given element within the area affected by natural hazards. A vulnerability function for different construction types of buildings, depending on avalanche pressure, was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Due to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk.
This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be implemented in strategies for sustainable development in Alpine settlements.
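The risk framing in this abstract combines event probability, monetary value at risk, and a pressure-dependent vulnerability (degree of loss). A minimal sketch with assumed numbers and a hypothetical linear vulnerability curve — the study's construction-type-specific vulnerability functions are not reproduced here:

```python
def vulnerability(pressure_kpa):
    """Hypothetical degree of loss (0..1), rising linearly with avalanche
    impact pressure and saturating at total loss above 30 kPa."""
    return min(1.0, max(0.0, pressure_kpa / 30.0))

def building_risk(event_probability, building_value, pressure_kpa):
    """Risk = event probability x value at risk x degree of loss."""
    return event_probability * building_value * vulnerability(pressure_kpa)

# A 1-in-100-per-year avalanche hitting a building worth 500,000
# (monetary units) at an impact pressure of 12 kPa
print(building_risk(0.01, 500_000, 12.0))
```

Because risk is a product of several factors, a small change in any one of them (process behaviour, value, or vulnerability) shifts the result multiplicatively, which is consistent with the abstract's observation that small changes in single influencing factors cause considerable differences in the resulting risk.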
Abstract:
OBJECTIVE Telomere length is a marker of biological aging that has been linked to cardiovascular disease risk. The black South African population is witnessing a tremendous increase in the prevalence of cardiovascular disease, part of which might be explained through urbanization. We compared telomere length between black South Africans and white South Africans and examined which biological and psychosocial variables played a role in ethnic difference in telomere length. METHODS We measured leukocyte telomere length in 161 black South African teachers and 180 white South African teachers aged 23 to 66 years without a history of atherothrombotic vascular disease. Age, sex, years having lived in the area, human immunodeficiency virus (HIV) infection, hypertension, body mass index, dyslipidemia, hemoglobin A1c, C-reactive protein, smoking, physical activity, alcohol abuse, depressive symptoms, psychological distress, and work stress were considered as covariates. RESULTS Black participants had shorter (median, interquartile range) relative telomere length (0.79, 0.70-0.95) than did white participants (1.06, 0.87-1.21; p < .001), and this difference changed very little after adjusting for covariates. In fully adjusted models, age (p < .001), male sex (p = .011), and HIV positive status (p = .023) were associated with shorter telomere length. Ethnicity did not significantly interact with any covariates in determining telomere length, including psychosocial characteristics. CONCLUSIONS Black South Africans showed markedly shorter telomeres than did white South African counterparts. Age, male sex, and HIV status were associated with shorter telomere length. No interactions between ethnicity and biomedical or psychosocial factors were found. Ethnic difference in telomere length might primarily be explained by genetic factors.
Abstract:
Recent work identified a high prevalence of modifiable risk factors for cardiovascular disease (CVD) among urban black South Africans. The aim was to track the progression of CVD risk factors in a multi-ethnic sample of South Africans. Participants were 173 black teachers (aged 47.5 ± 7.8 yrs) and 186 white teachers (aged 49.6 ± 9.9 yrs) who were examined at baseline and at 3-year follow-up. Blacks demonstrated a substantially higher prevalence of composite CVD burden (defined as a history of physician-diagnosed heart disease or use of anti-hypertensive, anti-diabetic, or statin medications at either time point) compared to whites (49.1 vs. 32.0%, p = 0.012). After controlling for baseline values, the black participants demonstrated greater increases in 24-h systolic and diastolic blood pressure, total cholesterol, fasting glucose, fibrinogen, D-dimer, and waist circumference in comparison with whites. In summary, an adverse progression of CVD risk factors was observed in the whole sample, although to a larger degree in black participants. Aggressive treatment strategies for controlling risk factors in black Africans are needed to reduce the increasing burden of CVD in South Africa.