74 results for "Fluid loss control"

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

Body height decreases throughout the day due to fluid loss from the intervertebral disk. This study investigated whether spinal shrinkage was greater during workdays compared with nonwork days, whether daily work stressors were positively related to spinal shrinkage, and whether job control was negatively related to spinal shrinkage. In a consecutive 2-week ambulatory field study, including 39 office employees and 512 days of observation, spinal shrinkage was measured by a stadiometer and calculated as body height in the morning minus body height in the evening. Physical activity was monitored throughout the 14 days by accelerometry. Daily work stressors, daily job control, biomechanical workload, and recreational activities after work were measured with daily surveys. Multilevel regression analyses showed that spinal disks shrank more during workdays than during nonwork days. After adjustment for sex, age, body weight, smoking status, biomechanical work strain, and time spent on physical and low-effort activities during the day, lower levels of daily job control significantly predicted increased spinal shrinkage. These findings add to knowledge of how work redesign that increases job control may help preserve intervertebral disk function and prevent occupational back pain.
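To make the analysis concrete, here is a minimal, hypothetical sketch of the kind of multilevel model described above: day-level spinal shrinkage regressed on daily job control with random intercepts per employee, fitted with statsmodels. The input file and all column names are assumptions for illustration, not the authors' materials.

```python
# Hypothetical sketch (assumed file and column names): a multilevel model of
# daily spinal shrinkage with days (level 1) nested in employees (level 2).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("diary_data.csv")  # one row per employee-day (assumed)

# Outcome: shrinkage in mm from the two stadiometer readings per day.
df["shrinkage_mm"] = df["height_morning_mm"] - df["height_evening_mm"]

# Random-intercept model, adjusted for a subset of the covariates named in
# the abstract. A negative job_control coefficient would correspond to the
# reported finding.
model = smf.mixedlm(
    "shrinkage_mm ~ job_control + work_stressors + workday"
    " + age + body_weight_kg + C(sex) + C(smoker)",
    data=df,
    groups=df["employee_id"],
)
print(model.fit().summary())
```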

Relevance: 80.00%

Abstract:

Spiders, as all other arthropods, have an open circulatory system, and their body fluid, the hemolymph, moves freely between lymphatic vessels and the body cavities (see Wirkner and Huckstorf 2013). The hemolymph can be considered a multifunctional organ, central for locomotion (Kropf 2013), respiration (Burmester 2013) and nutrition, and it amounts to approximately 20% of a spider's body weight. Any injury involves not only immediate hemolymph loss but also pathogen attack and subsequent infection. Spiders therefore have to react to injuries in a combined manner, stopping fluid loss while defending against microbial invaders. This is achieved by an innate immune system which involves several host defence systems such as hemolymph coagulation and the production of a variety of defensive substances (Fukuzawa et al. 2008). In spiders, the immune system is localised in hemocytes, which are derived from the myocardial cells of the heart wall, where they are produced as prohemocytes and from where they are released as different cell types into the hemolymph (Seitz 1972). They contribute to the defence against pathogens by phagocytosis, nodulation and encapsulation of invaders. The humoral response includes mechanisms which induce melanin production to destroy pathogens, a clotting cascade to stop hemolymph loss, and the constitutive production of several types of antimicrobial peptides, which are stored in hemocyte granules and released into the hemolymph (Fukuzawa et al. 2008) (Fig. 7.1). The immune system of spiders is an innate immune system: it is hemolymph-based and characterised by broad but rather unspecific reactivity. Its advantage is a fast response, within minutes to a few hours. This is in contrast to the adaptive immune system of vertebrates, which can react to very specific pathogens, resulting in much more specific responses, and which creates an immunological memory during the lifetime of the individual. The disadvantage is that it needs more time to react with antibody production, usually many hours to a few days, and needs to be built up during early ontogenesis.

Relevance: 80.00%

Abstract:

Leptospiral pulmonary haemorrhage syndrome (LPHS) is a severe form of leptospirosis whose pathogenic mechanisms are poorly understood. Lung tissues from 26 dogs with LPHS, 5 dogs with pulmonary haemorrhage due to other causes and 6 healthy control lungs were labelled for IgG (n=26), IgM (n=25) and leptospiral antigens (n=26). Three general staining patterns for IgG/IgM were observed in the lungs of dogs with LPHS, with most tissues showing more than one pattern: (1) alveolar septal wall staining, (2) staining favouring alveolar surfaces and (3) staining of intra-alveolar fluid. Healthy control lung showed no staining, whereas haemorrhagic lung from dogs not infected with Leptospira showed staining of intra-alveolar fluid and occasionally of alveolar septa. Leptospiral antigens were not detected. We conclude that deposition of IgG/IgM is demonstrable in the majority of canine lungs with naturally occurring LPHS, similar to what has been described in other species. Our findings suggest involvement of host humoral immunity in the pathogenesis of LPHS and provide further evidence to support the dog as a natural disease model for human LPHS.

Relevance: 80.00%

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training program dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

Parameter                   | Class I          | Class II       | Class III             | Class IV
Blood loss (ml)             | up to 750        | 750–1500       | 1500–2000             | >2000
Blood loss (% blood volume) | up to 15%        | 15–30%         | 30–40%                | >40%
Pulse rate (bpm)            | <100             | 100–120        | 120–140               | >140
Systolic blood pressure     | normal           | normal         | decreased             | decreased
Pulse pressure              | normal or raised | decreased      | decreased             | decreased
Respiratory rate            | 14–20            | 20–30          | 30–40                 | >35
Urine output (ml/h)         | >30              | 20–30          | 5–15                  | negligible
CNS/mental status           | slightly anxious | mildly anxious | anxious, confused     | confused, lethargic
Initial fluid replacement   | crystalloid      | crystalloid    | crystalloid and blood | crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries of nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as the ATLS shock classification suggests [4] and [5]. In all these studies, the injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient.

A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss into the peritoneum, retroperitoneum or pelvic cavity on CT scan, or the requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of Class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.

As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that the allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Grade                   | Vitals                          | Response to fluid bolus (1000 ml)   | Estimated blood loss (ml)
Normal (no haemorrhage) | normal                          | not applicable                      | none
Class I (mild)          | normal                          | yes, no further fluid required      | up to 750
Class II (moderate)     | HR >100 with SBP >90 mmHg       | yes, no further fluid required      | 750–1500
Class III (severe)      | SBP <90 mmHg                    | requires repeated fluid boluses     | 1500–2000
Class IV (moribund)     | SBP <90 mmHg or imminent arrest | declining SBP despite fluid boluses | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage (advanced age, athletes, pregnancy, medications and pacemakers) and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with the clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.

The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
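For teaching purposes, the blood-loss bands of Table 1 can be restated as a small lookup function. The sketch below is only an executable paraphrase of the table above, not a clinical tool, and it deliberately sidesteps the editorial's central caveat that individual patients often do not fit these bands.

```python
# Toy restatement of the ATLS Table 1 blood-loss bands (9th edition, as
# reproduced above). For illustration only; not a clinical decision aid.
def atls_class_from_blood_loss(loss_ml: float) -> str:
    """Map an estimated blood loss in ml to an ATLS shock class."""
    if loss_ml <= 750:
        return "Class I"
    if loss_ml <= 1500:
        return "Class II"
    if loss_ml <= 2000:
        return "Class III"
    return "Class IV"

for loss in (500, 1200, 1800, 2500):
    print(f"{loss} ml -> {atls_class_from_blood_loss(loss)}")
```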

Relevance: 40.00%

Abstract:

BACKGROUND One aspect of a multidimensional approach to understanding asthma as a complex dynamic disease is to study how lung function varies with time. Variability measures of lung function have been shown to predict response to beta2-agonist treatment. An investigation was conducted to determine whether the mean, coefficient of variation (CV) or autocorrelation (a measure of short-term memory) of peak expiratory flow (PEF) could predict loss of asthma control following withdrawal of regular inhaled corticosteroid (ICS) treatment, using data from a previous study. METHODS 87 adult patients with mild to moderate asthma who had been taking ICS at a constant dose for at least 6 months were monitored for 2-4 weeks. ICS was then withdrawn and monitoring continued until loss of control occurred as per predefined criteria. Twice-daily PEF was recorded during monitoring. Associations between loss of control and the mean, CV and autocorrelation of morning PEF within 2 weeks pre- and post-ICS withdrawal were assessed using Cox regression analysis. Predictive utility was assessed using receiver operating characteristic analysis. RESULTS 53 of 87 patients had sufficient PEF data over the required analysis period. The mean (389 vs 370 l/min, p<0.0001) and CV (4.5% vs 5.6%, p=0.007), but not the autocorrelation, of PEF changed significantly from prewithdrawal to postwithdrawal in subjects who subsequently lost control, and were unaltered in those who did not. These changes were related to time to loss of control. CV was the most consistent predictor, with sensitivity and specificity similar to those of exhaled nitric oxide. CONCLUSION A simple, easily obtained variability measure of daily lung function such as the CV may predict loss of asthma control within the first 2 weeks of ICS withdrawal.
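To illustrate the three candidate predictors, the sketch below computes the mean, CV and lag-1 autocorrelation from a short synthetic series of morning PEF readings; the numbers are invented and the study's exact computation windows are not reproduced.

```python
# Synthetic illustration of the three PEF-based predictors discussed above.
import numpy as np

pef = np.array([388, 392, 385, 390, 395, 382, 389, 391, 387, 393], float)  # l/min, invented

mean_pef = pef.mean()
cv = pef.std(ddof=1) / mean_pef * 100               # coefficient of variation, %
autocorr = np.corrcoef(pef[:-1], pef[1:])[0, 1]     # lag-1 autocorrelation ("short-term memory")

print(f"mean = {mean_pef:.1f} l/min, CV = {cv:.1f}%, lag-1 autocorrelation = {autocorr:.2f}")
```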

Relevance: 40.00%

Abstract:

The objective of this study was to compare the effects of 3 different fluid types for resuscitation after experimentally induced hemorrhagic shock in anesthetized chickens, and to evaluate partial pressures of carbon dioxide measured in arterial blood (Paco2), with a transcutaneous monitor (TcPco2), with a gastric intraluminal monitor (GiPco2), and by end-tidal measurements (Etco2) under stable conditions and after induced hemorrhagic shock. Hemorrhagic shock was induced in 40 white leghorn chickens by removing 50% of blood volume by phlebotomy under general anesthesia. Birds were divided into 4 groups: untreated (control group) and treated with intravenous hetastarch (haes group), with a hemoglobin-based oxygen carrier (hemospan group), or by autotransfusion (blood group). Respiratory rates, heart rates, and systolic arterial blood pressure (SAP) were compared at 8 time points (baseline [T0]; at the loss of 10% [T10%], 20% [T20%], 30% [T30%], 40% [T40%], and 50% [T50%] of blood volume; at the end of resuscitation [RES]; and at the end of anesthesia [END]). Packed cell volume (PCV) and blood hemoglobin content were compared at 6 time points (T0, T50%, RES, and 1, 3, and 7 days after induced hemorrhagic shock). Measurements of Paco2, TcPco2, GiPco2, and Etco2 were evaluated at 2 time points (T0 and T50%), and venous lactic acid concentrations were evaluated at 3 time points (T0, T50%, and END). No significant differences were found in mortality, respiratory rate, heart rate, PCV, or hemoglobin values among the 4 groups. Birds given fluid resuscitation had significantly higher SAPs after fluid administration than did birds in the control group. In all groups, PCV and hemoglobin concentrations began to rise by day 3 after phlebotomy, and baseline values were reached 7 days after blood removal. At T0, TcPco2 did not differ significantly from Paco2, but GiPco2 and Etco2 did. After hemorrhagic shock, both GiPco2 and TcPco2 differed significantly from Paco2. Neither TcPco2 nor GiPco2 values differed significantly at any time point between birds that survived and birds that died, within or across groups. These results showed no difference in mortality between leghorn chickens treated and not treated with fluid resuscitation after hemorrhagic shock, and showed that PCV and hemoglobin concentrations increased by 3 days after acute hemorrhage with or without treatment. The different CO2 measurements document changes in CO2 values consistent with poor perfusion and may prove useful for serial evaluation of responses to shock and shock treatment.
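As a concrete, purely illustrative restatement of this kind of method comparison, the sketch below runs a paired test of two surrogate CO2 readings against the arterial reference at a single time point; all values are invented, and the abstract does not state which statistical test the authors actually used.

```python
# Invented paired comparison of surrogate CO2 readings against arterial Paco2.
import numpy as np
from scipy import stats

paco2  = np.array([32.0, 30.5, 33.1, 31.2, 29.8, 34.0])  # arterial reference, mmHg
tcpco2 = np.array([33.1, 31.0, 33.8, 32.0, 30.2, 35.1])  # transcutaneous
gipco2 = np.array([38.4, 36.9, 40.1, 37.5, 36.0, 41.2])  # gastric intraluminal

for name, surrogate in (("TcPco2", tcpco2), ("GiPco2", gipco2)):
    t_stat, p_value = stats.ttest_rel(surrogate, paco2)
    bias = np.mean(surrogate - paco2)
    print(f"{name}: mean bias = {bias:+.1f} mmHg, p = {p_value:.3f}")
```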

Relevance: 40.00%

Abstract:

PURPOSE Blood loss and blood substitution are associated with higher morbidity after major abdominal surgery. During major liver resection, low local venous pressure has been shown to reduce blood loss. Ambiguity persists concerning the impact of local venous pressure on blood loss during open radical cystectomy. We aimed to determine the association between intraoperative blood loss and pelvic venous pressure (PVP) and to identify factors affecting PVP. MATERIAL AND METHODS Within a single-center, double-blind, randomized trial, PVP was measured in 82 patients from a norepinephrine/low-volume group and in 81 from a control group with liberal hydration. For this secondary analysis, patients from each arm were stratified into subgroups with PVP <5 mmHg or ≥5 mmHg measured after cystectomy (the optimal cut-off value for discrimination of patients with relevant blood loss according to Youden's index). RESULTS Median blood loss was 800 ml [range: 300-1600] in 55/163 patients (34%) with PVP <5 mmHg and 1200 ml [400-3000] in 108/163 patients (66%) with PVP ≥5 mmHg (P<0.0001). A PVP <5 mmHg was measured in 42/82 patients (51%) in the norepinephrine/low-volume group and in 13/81 (16%) in the control group (P<0.0001). PVP dropped significantly after removal of abdominal packing and abdominal lifting in both groups at all time points (at the beginning and end of pelvic lymph node dissection and at the end of cystectomy) (P<0.0001). No correlation between PVP and central venous pressure could be detected. CONCLUSIONS Blood loss was significantly reduced in patients with low PVP. Factors affecting PVP were fluid management and abdominal packing.
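The 5 mmHg threshold above was selected with Youden's index (J = sensitivity + specificity − 1). The sketch below shows how such a cut-off is typically derived from an ROC curve; the PVP values and outcome labels are invented (and chosen so the toy optimum lands at 5 mmHg), not taken from the trial.

```python
# Deriving an optimal cut-off with Youden's index on invented data.
import numpy as np
from sklearn.metrics import roc_curve

pvp = np.array([3, 4, 4, 5, 6, 6, 7, 8, 9, 10], float)      # PVP in mmHg (invented)
relevant_bleed = np.array([0, 0, 0, 1, 1, 1, 1, 1, 1, 1])   # 1 = relevant blood loss (invented)

fpr, tpr, thresholds = roc_curve(relevant_bleed, pvp)
j = tpr - fpr                                  # Youden's J at each candidate threshold
best_cutoff = thresholds[np.argmax(j)]
print(f"optimal PVP cut-off by Youden's index: {best_cutoff:.0f} mmHg")
```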

Relevance: 40.00%

Abstract:

In the current study we investigated whether ego depletion negatively affects attention regulation under pressure in sports by assessing participants' dart throwing performance and accompanying gaze behavior. According to the strength model of self-control, the most important aspect of self-control is attention regulation. Because higher levels of state anxiety are associated with impaired attention regulation, we chose a mixed design with ego depletion (yes vs. no) as between-subjects and anxiety level (high vs. low) as within-subjects factor. Participants performed a perceptual-motor task requiring selective attention, namely, dart throwing. In line with our expectations, depleted participants in the high-anxiety condition performed worse and displayed a shorter final fixation on the bull's eye, demonstrating that when one's self-control strength is depleted, attention regulation under pressure cannot be maintained. This is the first study that directly supports the general assumption that ego depletion is a major factor influencing attention regulation under pressure.

Relevance: 40.00%

Abstract:

Cystectomy and urinary diversion carry high morbidity, and strategies to reduce complications are of utmost importance. Epidural analgesia and optimized fluid management are considered key factors contributing to successful enhanced recovery after surgery. In colorectal surgery, there is strong evidence that intraoperative fluid management aiming for a postoperative zero fluid balance results in lower morbidity, including a faster return of bowel function. Recently, a randomized clinical trial focusing on radical cystectomy (RC) demonstrated that restrictive intraoperative hydration combined with concomitant administration of norepinephrine reduced intraoperative blood loss, the need for blood transfusion, and morbidity. The purpose of this review is to highlight specific anesthesiological aspects which have been shown to improve outcome after RC with urinary diversion.

Relevance: 40.00%

Abstract:

In the present study we investigated whether ego depletion negatively affects attention regulation under pressure in sports by assessing participants' dart throwing performance and accompanying gaze behavior. According to the strength model of self-control, the most important aspect of self-control is attention regulation (Schmeichel & Baumeister, 2010). As higher levels of state anxiety are associated with impaired attention regulation (Nieuwenhuys & Oudejans, 2012), we chose a mixed design with ego depletion (yes vs. no) as between-subjects and anxiety level (high vs. low) as within-subjects factor. A total of 28 right-handed students participated in our study (mean age = 23.4 years, SD = 2.5; 10 female; no professional dart experience). Participants performed a perceptual-motor task requiring selective attention, namely, dart throwing. The task was performed while participants were positioned high and low on a climbing wall (i.e., with high and low levels of anxiety). In line with our expectations, a mixed-design ANOVA revealed that depleted participants in the high-anxiety condition performed worse (p < .001) and displayed a shorter final fixation on the bull's eye (p < .01) than in the low-anxiety condition, demonstrating that when one is depleted, attention regulation under pressure cannot be maintained. This is the first study that directly supports the general assumption that ego depletion is a major factor influencing attention regulation under pressure.
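For readers who want the analysis skeleton, a mixed-design ANOVA of this kind can be run as below with pingouin; the long-format layout, file name and column names are assumptions for illustration, not the study's materials.

```python
# Hypothetical sketch: 2 (depletion, between) x 2 (anxiety, within) mixed ANOVA.
import pandas as pd
import pingouin as pg

# Assumed long format: one row per participant x anxiety condition.
df = pd.read_csv("dart_throwing.csv")

aov = pg.mixed_anova(
    data=df,
    dv="throwing_score",     # assumed performance measure
    within="anxiety",        # high vs. low (climbing-wall position)
    between="depletion",     # depleted vs. non-depleted
    subject="participant",
)
print(aov.round(3))
```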

Relevance: 40.00%

Abstract:

OBJECTIVES A dissociation between behavioural (in-control) and physiological parameters (indicating loss-of-control) is associated with cardiovascular risk in defensive coping (DefS) Africans. We evaluated relationships between DefS, sub-clinical atherosclerosis, low-grade inflammation and hypercoagulation in a bi-ethnic sex cohort. METHODS Black Africans and white Africans (Caucasians) (n = 375; aged 44.6 ± 9.7 years) were included. Ambulatory BP, vascular structure (left carotid cross-sectional wall area (L-CSWA) and plaque counts), and markers of coagulation and inflammation were quantified. An ethnicity/coping-style interaction was revealed only in DefS participants. RESULTS A hypertensive state, less plaque, low-grade inflammation, and hypercoagulation were more prevalent in DefS Africans (27–84%) than in DefS Caucasians (18–41%). Regression analyses demonstrated associations between L-CSWA and 24-hour systolic BP (R² = 0.38; β = 0.78; p < 0.05) in DefS African men but not in DefS African women or Caucasians. No associations between L-CSWA and coagulation markers were evident. CONCLUSION Novel findings revealed hypercoagulation, low-grade inflammation and hyperkinetic BP (physiological loss-of-control responses) in DefS African men. Coupled with a self-reported in-control DefS behavioural profile, this reflects a dissociation between behaviour and physiology. It may explain changes in vascular structure, increasing cerebrovascular disease risk in a state of hyper-vigilant coping.

Relevance: 30.00%

Abstract:

OBJECTIVES: The aim of this study was to compare the long-term outcomes of implants placed in patients treated for periodontitis (periodontally compromised patients; PCP) and in periodontally healthy patients (PHP), in relation to adhesion to supportive periodontal therapy (SPT). MATERIAL AND METHODS: One hundred and twelve partially edentulous patients were consecutively enrolled in a private specialist practice and divided into three groups according to their initial periodontal condition: PHP, moderate PCP and severe PCP. Periodontal and implant treatment was carried out as needed. Solid screws (S), hollow screws (HS) and hollow cylinders (HC) were installed to support fixed prostheses after successful completion of initial periodontal therapy (full-mouth plaque score <25% and full-mouth bleeding score <25%). At the end of treatment, patients were asked to follow an individualized SPT program. At 10 years, clinical measures and radiographic bone changes were recorded by two calibrated operators, blinded to the initial patient classification. RESULTS: Eleven patients were lost to follow-up. During the period of observation, 18 implants were removed because of biological complications. The implant survival rate was 96.6%, 92.8% and 90% for all implants, and 98%, 94.2% and 90% for S-implants only, for PHP, moderate PCP and severe PCP, respectively. The mean bone loss was 0.75 (± 0.88) mm in PHP, 1.14 (± 1.11) mm in moderate PCP and 0.98 (± 1.22) mm in severe PCP, without any statistically significant difference. The percentage of sites with bone loss ≥3 mm was 4.7% for PHP, 11.2% for moderate PCP and 15.1% for severe PCP, with a statistically significant difference between PHP and severe PCP (P<0.05). Lack of adhesion to SPT was correlated with a higher incidence of bone loss and implant loss. CONCLUSION: Patients with a history of periodontitis presented a lower survival rate and a statistically significantly higher number of sites with peri-implant bone loss. Furthermore, PCP who did not completely adhere to the SPT were found to present a higher implant failure rate. This underlines the value of SPT in enhancing the long-term outcomes of implant therapy, particularly in subjects affected by periodontitis, in order to control reinfection and limit biological complications.
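The survival rates above are reported as crude percentages at 10 years; a time-to-event restatement is sketched below with a per-group Kaplan-Meier fit. The implant-level durations and failure flags are invented stand-ins, not the study's records.

```python
# Invented implant-level data: years in function, failure flag, and group.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "years":  [10, 10, 4.5, 10, 7.2, 10, 10, 3.1, 10, 10],
    "failed": [0,  0,  1,   0,  1,   0,  0,  1,   0,  0],
    "group":  ["PHP", "PHP", "moderate PCP", "moderate PCP", "severe PCP",
               "PHP", "severe PCP", "severe PCP", "moderate PCP", "PHP"],
})

kmf = KaplanMeierFitter()
for group, sub in df.groupby("group"):
    kmf.fit(sub["years"], event_observed=sub["failed"], label=group)
    surv_10y = float(kmf.survival_function_.iloc[-1, 0])
    print(f"{group}: estimated 10-year implant survival = {surv_10y:.2f}")
```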

Relevance: 30.00%

Abstract:

Background The goal when resuscitating trauma patients is to achieve adequate tissue perfusion. One parameter of tissue perfusion is tissue oxygen saturation (StO2), as measured by near-infrared spectroscopy. Using a commercially available device, we investigated whether a clinically relevant blood loss of 500 ml in healthy volunteers can be detected by changes in StO2 after a standardized ischemic event. Methods We performed occlusion of the brachial artery for 3 minutes in 20 healthy female blood donors before and after blood donation. StO2 and total oxygenated tissue hemoglobin (O2Hb) were measured continuously at the thenar eminence. 10 healthy volunteers were assessed in the same way to examine whether repeated vascular occlusion without blood donation exhibits time-dependent effects. Results Blood donation caused a substantial decrease in systolic blood pressure but did not affect resting StO2 and O2Hb values. No changes in the reaction to the vascular occlusion test were measured in the blood donor group, but in the control group there was an increase in the O2Hb rate of recovery during the reperfusion phase. Conclusion StO2 measured at the thenar eminence seems to be insensitive to a blood loss of 500 ml in this setting. Blood loss greater than this might lead to detectable changes that could guide the treating physician. The exact cut-off for detectable changes and the effect of time on repeated vascular occlusion tests should be explored further; to date, no such data exist.

Relevance: 30.00%

Abstract:

Objectives: To evaluate the extent of bone fill over 3 years following the surgical treatment of peri-implantitis with bone grafting with or without a membrane. Material and Methods: In a non-submerged wound-healing mode, 15 subjects with 27 implants were treated with a bone substitute (Algipore®) alone and 17 subjects with 29 implants were treated with the bone substitute and a resorbable membrane (Osseoquest®). Implants with radiographic bone loss ≥1.8 mm after the first year in function and with bleeding and/or pus on probing were included. Following surgery, subjects were given systemic antibiotics (10 days) and rinsed with chlorhexidine. After initial healing, the subjects were enrolled in a strict maintenance programme. Results: Statistical analysis failed to demonstrate changes in bone fill between 1 and 3 years, both between and within procedure groups. The mean defect fill at 3 years was 1.3 ± 1.3 (SD) mm if treated with the bone substitute alone and 1.6 ± 1.2 (SD) mm if treated with an adjunct resorbable membrane (p=0.40). The plaque index decreased from approximately 40% to 10%, remaining stable during the following 2 years. Conclusion: Defect fill using a bone substitute with or without a membrane technique in the treatment of peri-implantitis can be maintained over 3 years.