948 results for Cardiac Output, Low


Relevance: 90.00%

Abstract:

Cyclical recruitment of atelectasis with each breath is thought to contribute to ventilator-associated lung injury. Extrinsic positive end-expiratory pressure (PEEPe) can maintain alveolar recruitment at end exhalation, but PEEPe depresses cardiac output and increases overdistension. Short exhalation times can also maintain end-expiratory recruitment, but if the mechanism of this recruitment is generation of intrinsic PEEP (PEEPi), there would be little advantage compared with PEEPe. In seven New Zealand White rabbits, we compared recruitment from increased respiratory rate (RR) to recruitment from increased PEEPe after saline lavage. Rabbits were ventilated in pressure control mode with a fraction of inspired O2 (FiO2) of 1.0, an inspiratory-to-expiratory ratio of 2:1, and a plateau pressure of 28 cmH2O, and either 1) high RR (24) and low PEEPe (3.5) or 2) low RR (7) and high PEEPe (14). We assessed cyclical lung recruitment with a fast arterial PO2 probe, and we assessed average recruitment with blood gas data. We measured PEEPi, cardiac output, and mixed venous saturation at each ventilator setting. Recruitment achieved by increased RR and short exhalation time was nearly equivalent to that achieved by increased PEEPe. The short exhalation time at increased RR, however, did not generate PEEPi. Cardiac output was on average 13% higher in the high-RR group than in the high-PEEPe group (P < 0.001), and mixed venous saturation was consistently greater in the high-RR group (P < 0.001). Prevention of end-expiratory derecruitment without increased end-expiratory pressure suggests that another mechanism, distinct from intrinsic PEEP, plays a role in the dynamic behavior of atelectasis.

Relevance: 90.00%

Abstract:

BACKGROUND: Exertional oscillatory ventilation (EOV) in heart failure may potentiate the negative effects of low cardiac output and high ventilation on exercise performance. We hypothesized that the presence of EOV might, per se, influence exercise capacity as evaluated by maximal cardiopulmonary exercise testing. METHODS AND RESULTS: We identified 78 pairs of severe chronic heart failure patients with and without EOV, matched for sex, age and peak oxygen consumption (VO2). Patients with EOV showed, for the same peak VO2, a lower workload (WL) at peak (ΔWatts = 5.8±23.0, P=0.027), less efficient ventilation (higher VE/VCO2 slope: 38.0±8.3 vs. 32.8±6.3, P<0.001), lower peak exercise tidal volume (1.49±0.36 L vs. 1.61±0.46 L, P=0.015) and higher peak respiratory rate (34±7/min vs. 31±6/min, P=0.002). In 33 patients, EOV disappeared during exercise, whereas in 45 patients EOV persisted. Fifty percent of the patients in whom EOV disappeared had an increase in the VO2/WL relationship after EOV regression, consistent with more efficient oxygen delivery to the muscles. No cardiopulmonary exercise test parameter was associated with the different behaviour of VO2/WL. CONCLUSION: The presence of EOV negatively influences the exercise performance of chronic heart failure patients, likely because of an increased cost of breathing. EOV disappearance during exercise is associated with more efficient oxygen delivery in several cases.

Relevance: 90.00%

Abstract:

OBJECTIVE: The primary objective of this nationwide survey, carried out in departments of cardiac anesthesia in Germany, was to identify current practice with regard to neuromonitoring and neuroprotection. METHODOLOGY: The data are based on a questionnaire sent to all departments of cardiac anesthesia in Germany between October 2007 and January 2008. The anonymized questionnaire contained 26 questions about the practice of preoperative evaluation of cerebral vessels, intraoperative use of neuromonitoring, the nature and application of cerebral protective measures, perfusion management during cardiopulmonary bypass, postoperative evaluation of neurological status, and training in the field of cerebral monitoring. RESULTS: Of the 80 mailed questionnaires, 55% were returned, and 90% of departments evaluated cerebral vessels preoperatively with duplex ultrasound. The methods used for intraoperative neuromonitoring were electroencephalography (EEG, 60%), used for type A dissections (38.1%), for elective surgery on the thoracic and thoraco-abdominal aorta (34.1% and 31.6%, respectively) and in carotid surgery (43.2%); near-infrared spectroscopy (40%); evoked potentials (30%); and transcranial Doppler sonography (17.5%), with some centers using combined methods. In most departments the central nervous system is not monitored during bypass surgery, heart valve surgery, or minimally invasive surgery. Cerebral protective measures comprise patient cooling on cardiopulmonary bypass (CPB, 100%), extracorporeal cooling of the head (65%) and the administration of corticosteroids (58%), barbiturates (50%) and antiepileptic drugs (10%). Neuroprotective anesthesia consists of administering inhalation anesthetics (32.5%; sevoflurane 76.5%) and intravenous anesthesia (20%; propofol and barbiturates each accounting for 46.2%). Of the departments, 72.5% cool patients as a standard procedure for surgery involving circulatory arrest and 37.5% during all surgery using CPB.
In 84.6% of departments CPB flow equals the calculated cardiac output (CO) under normothermia, while the target mean arterial pressure (MAP) varies between 60 and 70 mmHg (43.9%) or between 50 and 60 mmHg (41.5%). At body temperatures below 18°C, CPB flow is reduced below the calculated CO in 70% of departments, while 27% use normothermic flow rates. The preferred MAP under hypothermia is between 50 and 60 mmHg (59%). The results of intraoperative neuromonitoring are documented on the anesthesia record (77%). In 42.5% of the departments postoperative neurological function is assessed by the anesthesiologist. Continuing education sessions on neuromonitoring are organized on a regular basis in 32.5% of the departments, and in 37.5% individual physicians are responsible for their own neuromonitoring education. CONCLUSION: The present survey data indicate that neuromonitoring and neuroprotective therapy during CPB are not standardized in cardiac anesthesiology departments in Germany. The systematic use of available methods to implement multimodal neuromonitoring would be desirable.
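The survey reports that most departments set normothermic CPB flow to the patient's calculated cardiac output. As an illustrative sketch only (the survey does not specify how departments compute this), a common convention is to multiply body surface area, from the Du Bois formula, by a target cardiac index of about 2.4 L/min/m²; the patient values below are hypothetical:

```python
def du_bois_bsa(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the Du Bois formula."""
    return 0.007184 * height_cm**0.725 * weight_kg**0.425

def cpb_target_flow(height_cm: float, weight_kg: float,
                    cardiac_index: float = 2.4) -> float:
    """Target pump flow (L/min) = BSA (m^2) * cardiac index (L/min/m^2)."""
    return du_bois_bsa(height_cm, weight_kg) * cardiac_index

# Hypothetical 175 cm, 80 kg adult: BSA ~1.96 m^2, flow ~4.7 L/min
flow = cpb_target_flow(175, 80)
```

Under hypothermia, as the survey notes, most departments would reduce this calculated target rather than maintain the normothermic rate.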

Relevance: 90.00%

Abstract:

BACKGROUND: Acute cardiogenic shock after myocardial infarction is associated with high in-hospital mortality attributable to persisting low cardiac output. The Impella-EUROSHOCK registry evaluates the safety and efficacy of the Impella 2.5 percutaneous left-ventricular assist device in patients with cardiogenic shock after acute myocardial infarction. METHODS AND RESULTS: This multicenter registry retrospectively included 120 patients (63.6±12.2 years; 81.7% male) with cardiogenic shock from acute myocardial infarction receiving temporary circulatory support with the Impella 2.5 percutaneous left-ventricular assist device. The primary end point was mortality at 30 days. The secondary end points were the change in plasma lactate after the institution of hemodynamic support, the rate of early major adverse cardiac and cerebrovascular events, and long-term survival. Thirty-day mortality was 64.2% in the study population. After Impella 2.5 implantation, lactate levels decreased from 5.8±5.0 mmol/L to 4.7±5.4 mmol/L (P=0.28) and 2.5±2.6 mmol/L (P=0.023) at 24 and 48 hours, respectively. Early major adverse cardiac and cerebrovascular events were reported in 18 (15%) patients. Major bleeding at the vascular access site, hemolysis, and pericardial tamponade occurred in 34 (28.6%), 9 (7.5%), and 2 (1.7%) patients, respectively. Age >65 years and lactate level >3.8 mmol/L at admission were identified as predictors of 30-day mortality. After 317±526 days of follow-up, survival was 28.3%. CONCLUSIONS: In patients with acute cardiogenic shock from acute myocardial infarction, Impella 2.5 treatment is feasible and results in a reduction of lactate levels, suggesting improved organ perfusion. However, 30-day mortality remains high in these patients.
This likely reflects the last-resort character of Impella 2.5 use in selected patients with a poor hemodynamic profile and a greater imminent risk of death. Carefully conducted randomized controlled trials are necessary to evaluate the efficacy of Impella 2.5 support in this high-risk patient group.

Relevance: 90.00%

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                              Class I           Class II         Class III            Class IV
Blood loss (ml)               Up to 750         750–1500         1500–2000            >2000
Blood loss (% blood volume)   Up to 15%         15–30%           30–40%               >40%
Pulse rate (bpm)              <100              100–120          120–140              >140
Systolic blood pressure       Normal            Normal           Decreased            Decreased
Pulse pressure                Normal or ↑       Decreased        Decreased            Decreased
Respiratory rate              14–20             20–30            30–40                >35
Urine output (ml/h)           >30               20–30            5–15                 Negligible
CNS/mental status             Slightly anxious  Mildly anxious   Anxious, confused    Confused, lethargic
Initial fluid replacement     Crystalloid       Crystalloid      Crystalloid + blood  Crystalloid + blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140/min postulated by ATLS.
Moreover, deterioration of the different parameters does not necessarily proceed in parallel as suggested by the ATLS shock classification [4] and [5]. In all these studies injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined. As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of “clinical estimation of blood loss” and the requirement of fluid substitution.
This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation of the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

                                   Normal (No haemorrhage)  Class I (Mild)                  Class II (Moderate)             Class III (Severe)               Class IV (Moribund)
Vitals                             Normal                   Normal                          HR >100 with SBP >90 mmHg       SBP <90 mmHg                     SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml)  NA                       Yes, no further fluid required  Yes, no further fluid required  Requires repeated fluid boluses  Declining SBP despite fluid boluses
Estimated blood loss (ml)          None                     Up to 750                       750–1500                        1500–2000                        >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern but will continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension).
Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why in clinical reality patients often do not present with the “typical” pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers; it explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
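The ranges of Table 1 amount to a simple lookup. The sketch below expresses only the blood-loss row of the table as code, purely to illustrate the scheme under discussion (it is not a clinical tool, and as the editorial argues, real patients rarely fit these bins):

```python
def atls_class(blood_loss_fraction: float) -> str:
    """Map estimated blood loss (as a fraction of blood volume) to the
    ATLS shock class of Table 1 (9th edition ranges)."""
    if blood_loss_fraction <= 0.15:
        return "I"
    elif blood_loss_fraction <= 0.30:
        return "II"
    elif blood_loss_fraction <= 0.40:
        return "III"
    else:
        return "IV"

# A loss of ~1500 ml in a 70 kg adult (~5 L blood volume) is ~30%,
# i.e. the boundary between Class II and Class III.
```

The validation problem described above is precisely that the other rows (heart rate, blood pressure, mental status) do not track these boundaries in real patient data.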

Relevance: 90.00%

Abstract:

Studies of thermal tolerance in marine ectotherms are key in understanding climate effects on ecosystems; however, tolerance of their larval stages has rarely been analyzed. Larval stages are expected to be particularly sensitive. Thermal stress may affect their potential for dispersal and zoogeographical distribution. A mismatch between oxygen demand and the limited capacity of oxygen supply to tissues has been hypothesized to be the first mechanism restricting survival at thermal extremes. Therefore, thermal tolerance of stage zoea I larvae was examined in two populations of the Chilean kelp crab Taliepus dentatus, which are separated by latitude and the thermal regime. We measured temperature-dependent activity, oxygen consumption, cardiac performance, body mass and the carbon (C) and nitrogen (N) composition in order to: (1) examine thermal effects from organismal to cellular levels, and (2) compare the thermal tolerance of larvae from two environmental temperature regimes. We found that larval performance is affected at thermal extremes indicated by decreases in activity, mainly in maxilliped beat rates, followed by decreases in oxygen consumption rates. Cardiac stroke volume was almost temperature-independent. Through changes in heart rate, cardiac output supported oxygen demand within the thermal window whereas at low and high temperature extremes heart rate declined. The comparison between southern and central populations suggests the adaptation of southern larvae to a colder temperature regime, with higher cardiac outputs due to increased cardiac stroke volumes, larger body sizes but similar body composition as indicated by similar C:N ratios. This limited but clear differentiation of thermal windows between populations allows the species to widen its biogeographical range.

Relevance: 90.00%

Abstract:

Background: Several changes occur in the maternal cardiovascular system during pregnancy. These changes place considerable stress on this system, especially during the third trimester, and can be accentuated in the presence of certain risk factors. The aims of this study were to assess the maternal cardiovascular adaptations produced by a specific physical exercise program; its safety for the maternal cardiovascular system and pregnancy outcomes; and its effectiveness in controlling cardiovascular risk factors. Material and methods: A randomized controlled trial was designed. 151 healthy pregnant women were assessed by echocardiography and electrocardiography at 20 and 34 weeks of gestation. A total of 89 pregnant women participated in a physical exercise program (EG) from the first to the third trimester of pregnancy, consisting mainly of 25-30 minutes of aerobic conditioning (55-60% of heart rate reserve), general and specific strength work, and pelvic floor muscle training, delivered 3 times per week in sessions of 55-60 minutes. Pregnant women randomly allocated to the control group (CG; n=62) remained sedentary during pregnancy. The study was approved by the Clinical Research Ethics Committee of Hospital Universitario de Fuenlabrada. Results: Baseline characteristics were similar between groups. Unlike the CG, pregnant women in the EG avoided the significant decrease in indexed cardiac output between the 2nd and 3rd trimesters of pregnancy and preserved the normal left ventricular geometric pattern, whereas in the CG it shifted to a concentric remodelling pattern. At 20 weeks, women in the EG had a significantly lower heart rate (CG: 79.56±10.76 vs. EG: 76.05±9.34; p=0.04), systolic blood pressure (CG: 110.19±10.23 vs. EG: 106.04±12.06; p=0.03), diastolic blood pressure (CG: 64.56±7.88 vs. EG: 61.81±7.15; p=0.03) and isovolumetric relaxation time (CG: 72.94±14.71 vs. EG: 67.05±16.48; p=0.04), and a longer deceleration time of the E wave (CG: 142.09±39.11 vs. EG: 162.10±48.59; p=0.01). At 34 weeks, the EG had a significantly higher stroke volume (CG: 51.13±11.85 vs. EG: 56.21±12.79; p=0.04), early left ventricular filling (E) (CG: 78.38±14.07 vs. EG: 85.30±16.62; p=0.02) and deceleration time of the E wave (CG: 130.35±37.11 vs. EG: 146.61±43.40; p=0.04). Conclusion: A regular physical exercise program during pregnancy may produce positive adaptations in the maternal cardiovascular system during the third trimester, in addition to helping control cardiovascular risk factors, without compromising maternal or fetal health.

Relevance: 90.00%

Abstract:

Endurance exercise is widely assumed to improve cardiac function in humans. This project determined cardiac function following endurance exercise for 6 (n = 30) or 12 (n = 25) weeks in male Wistar rats (8 weeks old). The exercise protocol was 30 min/day at 0.8 km/h for 5 days/week, with an endurance test on the 6th day by running at 1.2 km/h until exhaustion. Exercise endurance increased by 318% after 6 weeks and 609% after 12 weeks. Heart weight/kg body weight increased by 10.2% after 6 weeks and 24.1% after 12 weeks. Echocardiography after 12 weeks showed increases in left ventricular internal diameter in diastole (6.39±0.32 to 7.90±0.17 mm), systolic volume (49±7 to 83±11 μl) and cardiac output (75±3 to 107±8 ml/min) but not in left wall thickness in diastole (1.74±0.07 to 1.80±0.06 mm). Isolated Langendorff hearts from trained rats displayed decreased left ventricular myocardial stiffness (22±1.1 to 19.1±0.3) and reduced purine efflux during pacing-induced workload increases. ³¹P-NMR spectroscopy in isolated hearts from trained rats showed decreased PCr and PCr/ATP ratios with increased creatine, AMP and ADP concentrations. Thus, this endurance exercise protocol resulted in physiological hypertrophy while maintaining or improving cardiac function.

Relevance: 90.00%

Abstract:

Cardiovascular diseases (CVD) contribute to almost 30% of worldwide mortality, with heart failure being one class of CVD. One popular and widely available treatment for heart failure is the intra-aortic balloon pump (IABP). This heart assist device is used in counterpulsation to improve myocardial function by increasing coronary perfusion and decreasing aortic end-diastolic pressure (i.e. the resistance to blood ejection from the heart). However, this device can only be used acutely, and patients are bedridden. The subject of this research is a novel heart assist treatment called Chronic Intermittent Mechanical Support (CIMS), which was conceived to offer the advantages of the IABP chronically whilst overcoming its disadvantages. The CIMS device comprises an implantable balloon pump, a percutaneous drive line, and a wearable driver console. The research here aims to determine the haemodynamic effect of balloon pump activation under in vitro conditions. A human mock circulatory loop (MCL) with systemic and coronary perfusion was constructed, capable of simulating various degrees of heart failure. Two prototypes of the CIMS balloon pump were made with varying stiffness. Several experimental factors (balloon inflation/deflation timing, helium gas volume, arterial compliance, balloon pump stiffness and heart valve type) formed the factorial design of the experiments. A simple modification to the MCL allowed flow visualisation experiments using video recording. Suitable statistical tests were used to analyse the data obtained from all experiments. Balloon inflation and deflation in the ascending aorta of the MCL yielded favourable results. Sudden balloon deflation caused the heart valve to open earlier, lengthening the valve opening duration within a cardiac cycle. Pressure augmentation in diastole was also significantly correlated with increased cardiac output and coronary flow rate.
With an optimum combination (low arterial compliance and low balloon pump stiffness), systemic and coronary perfusions were increased by 18% and 21% respectively, while the aortic end-diastolic pressure (forward flow resistance) decreased by 17%. Consequently, the ratio of oxygen supply and demand to myocardium (endocardial viability ratio, EVR) increased between 33% and 75%. The increase was mostly attributed to diastolic augmentation rather than systolic unloading.
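The endocardial viability ratio (EVR) reported above relates myocardial oxygen supply to demand. A standard formulation (assumed here; the thesis may compute it differently) is the diastolic pressure-time index (DPTI) divided by the systolic tension-time index (TTI). The numbers below are hypothetical, chosen only to show how counterpulsation moves the ratio:

```python
def evr(dpti: float, tti: float) -> float:
    """Endocardial viability ratio = DPTI / TTI, where DPTI integrates
    (aortic - ventricular) pressure over diastole and TTI integrates
    ventricular pressure over systole (both in mmHg*s per beat)."""
    return dpti / tti

# Counterpulsation raises DPTI (diastolic augmentation) and lowers TTI
# (systolic unloading), so EVR improves from both directions.
baseline = evr(30.0, 30.0)   # hypothetical unassisted beat -> 1.0
augmented = evr(39.0, 26.0)  # hypothetical assisted beat   -> 1.5
```

This is consistent with the thesis finding that the 33-75% EVR increase came mostly from the numerator (diastolic augmentation) rather than the denominator (systolic unloading).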

Relevance: 90.00%

Abstract:

Objective - The purpose of this study was to assess cardiac function and cell damage in intrauterine growth-restricted (IUGR) fetuses across clinical Doppler stages of deterioration. Study Design - One hundred twenty appropriate-for-gestational-age and 81 IUGR fetuses were classified in stages 1/2/3 according to present/absent/reversed end-diastolic blood flow in the umbilical artery, respectively. Cardiac function was assessed by the modified myocardial performance index, early-to-late diastolic filling ratios, cardiac output, and cord blood B-type natriuretic peptide; myocardial cell damage was assessed by heart fatty acid-binding protein, troponin-I, and high-sensitivity C-reactive protein. Results - The modified myocardial performance index, B-type natriuretic peptide, and early-to-late diastolic filling ratios were increased in a stage-dependent manner in IUGR fetuses, compared with appropriate-for-gestational-age fetuses. Heart fatty acid-binding protein levels were higher in IUGR fetuses at stage 3, compared with control fetuses. Cardiac output, troponin-I, and high-sensitivity C-reactive protein did not increase in IUGR fetuses at any stage. Conclusion - IUGR fetuses showed signs of cardiac dysfunction from early stages. Cardiac dysfunction deteriorates further with the progression of fetal compromise, together with the appearance of biochemical signs of cell damage.
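The myocardial performance index used above is, in its standard (Tei) form, the sum of the isovolumetric contraction and relaxation times divided by the ejection time; the study's modified variant may differ in how intervals are measured. A minimal sketch with hypothetical interval values:

```python
def mpi(ict_ms: float, irt_ms: float, et_ms: float) -> float:
    """Myocardial performance (Tei) index = (ICT + IRT) / ET.
    Higher values indicate worse combined systolic/diastolic function."""
    return (ict_ms + irt_ms) / et_ms

# Hypothetical interval values (ms), not from the study:
normal_fetus = mpi(30, 40, 170)  # shorter isovolumetric phases
iugr_fetus = mpi(40, 55, 160)    # prolonged isovolumetric phases -> higher MPI
```

The index is attractive in fetal work because it is a ratio of time intervals, so it does not depend on absolute volume measurements.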

Relevance: 90.00%

Abstract:

Background: It is well known that sprint interval training (SIT) induces significant increases in peak oxygen uptake (VO2peak) at the group level. However, only a few studies have addressed the variability of the VO2peak response following SIT, and the precise mechanism(s) that may explain the individual magnitude of response are unknown. Purpose: Therefore, the purpose of this thesis was: 1) to examine the inter-individual variability of the VO2peak response following SIT, 2) to inspect the relationship between changes in both central and peripheral measures and changes in VO2peak, and 3) to assess whether peripheral or central adaptations play a role in whether an individual is a high or low responder with respect to VO2peak. Subjects: Twenty-two young, recreationally active males (age: 20.4±1.7 years; weight: 78.4±10.2 kg; VO2peak: 3.7±0.62 L/min). Methods: VO2peak (L/min), peak cardiac output (Qpeak [L/min]), and peak deoxygenated hemoglobin (HHbpeak [mM]) were measured before and after 16 sessions of SIT (Tabata protocol) over four weeks. Peak a-vO2diff was calculated using a derivation of the Fick equation. Results: Due to a systematic error, HHbpeak could not be used to differentiate between individual responses. There was a large range of VO2peak response from pre to post testing (-4.75 to 32.18% change), and there was a significant difference between the Low Response Group (LRG) (n=8) and the High Response Group (HRG) (n=8) [F(1, 14) = 64.27, p<0.001]. Furthermore, there was no correlation between delta (Δ) VO2peak and ΔQpeak (r=-0.18, p=0.46) for all participants, nor was there an interaction effect between the Low and High Response Groups [F(1, 11) = 0.572, p=0.47]. Lastly, there was a significant correlation between ΔVO2peak and Δpeak a-vO2diff [r=0.692, p<0.001], and a significant interaction effect with peak a-vO2diff [F(1, 14) = 13.27, p<0.004] when comparing the HRG to the LRG.
Conclusions: There was inter-individual variability in the VO2peak response following 4 weeks of SIT, but central adaptations did not influence this variation. This suggests that peripheral adaptations may be responsible for the VO2peak adaptation.
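The a-vO2diff derivation mentioned in the methods rearranges the Fick equation, VO2 = Q × a-vO2diff. A minimal sketch with illustrative numbers (the thesis's actual peak values per subject are not given here):

```python
def av_o2_diff(vo2_l_min: float, q_l_min: float) -> float:
    """Arteriovenous O2 difference (ml O2 per 100 ml blood) from the
    Fick equation VO2 = Q * a-vO2diff, rearranged to a-vO2diff = VO2 / Q."""
    return vo2_l_min / q_l_min * 100  # L O2 per L blood -> ml per 100 ml

# Illustrative: VO2peak 3.7 L/min at an assumed Qpeak of 25 L/min
diff = av_o2_diff(3.7, 25.0)  # ~14.8 ml O2 / 100 ml blood
```

Because a-vO2diff is computed rather than measured directly, any error in Qpeak propagates into it; this is one reason the central-versus-peripheral attribution in such studies must be interpreted cautiously.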

Relevance: 90.00%

Abstract:

This laboratory session provides hands-on experience for students to visualize the beating human heart with ultrasound imaging. Simple views are obtained from which students can directly measure important cardiac dimensions in systole and diastole. This allows students to derive, from first principles, important measures of cardiac function, such as stroke volume, ejection fraction, and cardiac output. By repeating the measurements on a subject after a brief exercise period, increases in stroke volume and ejection fraction are easily demonstrable, potentially with or without an increase in left ventricular end-diastolic volume (which indicates preload). Thus, factors that affect cardiac performance can readily be discussed. This activity may be performed as a practical demonstration and visualized using an overhead projector or networked computers, concentrating on using the ultrasound images to teach basic physiological principles. It has proved to be highly popular with students, who reported a significant improvement in their understanding of the Frank-Starling law of the heart.
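The derivation the students perform can be sketched as follows. Left ventricular volumes are estimated here from internal diameters with the Teichholz formula, one common convention for M-mode measurements (the laboratory described above may use a different volume method); the diameters and heart rate below are illustrative resting values:

```python
def teichholz_volume(d_cm: float) -> float:
    """LV volume (ml) from internal diameter (cm), Teichholz formula."""
    return 7.0 / (2.4 + d_cm) * d_cm**3

def cardiac_function(lvidd_cm: float, lvids_cm: float, hr_bpm: float):
    """Derive stroke volume, ejection fraction and cardiac output from
    diastolic/systolic LV internal diameters and heart rate."""
    edv = teichholz_volume(lvidd_cm)  # end-diastolic volume (ml)
    esv = teichholz_volume(lvids_cm)  # end-systolic volume (ml)
    sv = edv - esv                    # stroke volume (ml)
    ef = sv / edv                     # ejection fraction (fraction)
    co = sv * hr_bpm / 1000           # cardiac output (L/min)
    return sv, ef, co

# Illustrative resting values: LVIDd 4.8 cm, LVIDs 3.2 cm, HR 70 bpm
sv, ef, co = cardiac_function(4.8, 3.2, 70)
```

Rerunning the same calculation with post-exercise diameters and heart rate reproduces the in-class demonstration: a smaller end-systolic diameter and a higher heart rate both raise the computed cardiac output.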

Relevance: 90.00%

Abstract:

Despite advances in its therapeutic management, severe haemorrhage remains the main cause of morbidity and mortality in animals that have suffered trauma or undergone surgery. The resulting injuries, or consequent death, are due to the deficit in intravascular fluid volume and the subsequent development of hypovolaemia. In physiological terms, the most devastating consequence of this condition is an absolute or relative decrease in cardiac preload, resulting in low cardiac output, inadequate tissue perfusion and a reduced oxygen supply to the tissues, which unequivocally compromises cellular function. Control of hypovolaemia requires resolution of the haemorrhage and correction of the resulting intravascular volume deficit, and necessarily involves the administration of intravenous fluids. Choosing the most appropriate type of fluid for intravenous therapy in each case is a task that demands reflection and judgement. The selection of appropriate fluids is the responsibility of the veterinarian; it is nevertheless essential that the veterinary nurse have a basic understanding of the differences between the fluids available for fluid therapy. The aim of this project was to determine which type of fluid best helps preserve hepatic integrity and function in situations of hypoperfusion, and thus to help standardise its choice when fluid therapy is indicated. To this end, a porcine model was used to recreate hypoperfusion and subsequently evaluate the effects of two different fluids administered for volume replacement: Ringer's lactate and hydroxyethyl starch 130/0.4. The animals were subjected to controlled haemorrhage, after which volaemia was restored with the respective fluids.
After this volume replacement the animals were euthanised and samples were obtained from several organs, including the liver, the object of the present study, which was examined with several histopathological techniques: routine histopathology with haematoxylin and eosin, and several methods for the detection of apoptotic events, including cytochrome c, TUNEL and M30. After exhaustive evaluation of the results obtained with these techniques, it was concluded that Ringer's lactate confers greater protection against reperfusion injury than hydroxyethyl starch 130/0.4.

Relevance: 90.00%

Abstract:

Thesis (Master, Kinesiology & Health Studies), Queen's University, 2016-09-27.

Relevance: 80.00%

Abstract:

To investigate the hemodynamic and ventilatory changes associated with the creation of an experimental bronchopleural fistula (BPF) treated by mechanical ventilation and thoracic drainage with or without a water seal, six Large White pigs weighing 25 kg each underwent, after general anesthesia, endotracheal intubation (6 mm) and mechanical ventilation. Through a left thoracotomy, a resection of the lingula was performed in order to create a BPF with an output exceeding 50% of the inspired volume. The chest cavity was closed and drained into the water-sealed system for initial observation of the high-output BPF. A significant reduction in BPF output and PaCO2 was observed after insertion of a water-sealed thoracic drain (p < 0.05). Insertion of a water-sealed thoracic drain resulted in reduced bronchopleural fistula output and better CO2 clearance without any drop in cardiac output or significant changes in mean arterial pressure.