Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).
Parameter | Class I | Class II | Class III | Class IV
Blood loss (ml) | Up to 750 | 750–1500 | 1500–2000 | >2000
Blood loss (% blood volume) | Up to 15% | 15–30% | 30–40% | >40%
Pulse rate (bpm) | <100 | 100–120 | 120–140 | >140
Systolic blood pressure | Normal | Normal | Decreased | Decreased
Pulse pressure | Normal or ↑ | Decreased | Decreased | Decreased
Respiratory rate | 14–20 | 20–30 | 30–40 | >35
Urine output (ml/h) | >30 | 20–30 | 5–15 | Negligible
CNS/mental status | Slightly anxious | Mildly anxious | Anxious, confused | Confused, lethargic
Initial fluid replacement | Crystalloid | Crystalloid | Crystalloid and blood | Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries in nearly 165,000 adult trauma patients and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as the ATLS shock classification suggests [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].
Grade | Vitals | Response to fluid bolus (1000 ml) | Estimated blood loss (ml)
Normal (no haemorrhage) | Normal | NA | None
Class I (mild) | Normal | Yes, no further fluid required | Up to 750
Class II (moderate) | HR >100 with SBP >90 mmHg | Yes, no further fluid required | 750–1500
Class III (severe) | SBP <90 mmHg | Requires repeated fluid boluses | 1500–2000
Class IV (moribund) | SBP <90 mmHg or imminent arrest | Declining SBP despite fluid boluses | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers; it explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based on vital signs alone; it also takes into account the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. Member of the Swiss national ATLS core faculty.
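To make concrete why Table 1 above is difficult to apply literally to an individual patient, the sketch below encodes its vital-sign bands as a naive classifier. Everything here is an illustrative assumption rather than an ATLS rule: the function name, the handling of band boundaries, and in particular the decision to grade by the most deranged single parameter, since the manual does not specify how to combine discordant findings.

```python
def atls_shock_class(pulse_bpm: float, sbp_normal: bool,
                     resp_rate: float, urine_ml_h: float) -> int:
    """Illustrative mapping of the Table 1 vital-sign bands to classes 1-4.

    Grades by the single most deranged parameter, which is an assumption:
    the table itself gives no rule for combining discordant signs.
    """
    cls = 1
    # Pulse rate bands: <100 | 100-120 | 120-140 | >140
    if pulse_bpm > 140:
        cls = max(cls, 4)
    elif pulse_bpm > 120:
        cls = max(cls, 3)
    elif pulse_bpm >= 100:
        cls = max(cls, 2)
    # Systolic blood pressure: decreased only from class III onwards
    if not sbp_normal:
        cls = max(cls, 3)
    # Respiratory rate bands: 14-20 | 20-30 | 30-40 | >35 (note the overlap in the table)
    if resp_rate > 35:
        cls = max(cls, 4)
    elif resp_rate > 30:
        cls = max(cls, 3)
    elif resp_rate > 20:
        cls = max(cls, 2)
    # Urine output bands (ml/h): >30 | 20-30 | 5-15 | negligible
    if urine_ml_h < 5:
        cls = max(cls, 4)
    elif urine_ml_h < 20:
        cls = max(cls, 3)
    elif urine_ml_h <= 30:
        cls = max(cls, 2)
    return cls

# A patient like those in the TARN analysis: large blood loss but HR < 140.
print(atls_shock_class(pulse_bpm=118, sbp_normal=False, resp_rate=24, urine_ml_h=10))
```

Running such a rule on the patterns described above (for example a heavily injured patient whose heart rate stays below 140 min⁻¹) immediately shows how often the individual parameters disagree, which is exactly the validation problem discussed in the studies cited.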
Abstract:
Standardization is a common method for adjusting for confounding factors when comparing two or more exposure categories to assess excess risk. An arbitrary choice of standard population in standardization introduces selection bias due to the healthy worker effect. Small samples in specific groups also pose problems for estimating relative risk and assessing its statistical significance. As an alternative, statistical models have been proposed to overcome such limitations and obtain adjusted rates. In this dissertation, a multiplicative model is considered to address the issues related to standardized indices, namely the Standardized Mortality Ratio (SMR) and the Comparative Mortality Factor (CMF). The model provides an alternative to conventional standardization techniques. Maximum likelihood estimates of the model parameters are used to construct an index similar to the SMR for estimating the relative risk of the exposure groups under comparison. A parametric bootstrap resampling method is used to evaluate the goodness of fit of the model, the behavior of the estimated parameters and the variability in relative risk on generated samples. The model provides an alternative to both the direct and indirect standardization methods.
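The abstract describes the SMR and a parametric bootstrap without giving the multiplicative model itself, so the sketch below only illustrates the classical index and a Poisson parametric-bootstrap interval for it; the function names and the example counts are hypothetical, not taken from the dissertation.

```python
import numpy as np

def smr(observed: int, expected: float) -> float:
    """Standardized Mortality Ratio: observed deaths / expected deaths,
    where the expected count comes from the standard population's rates."""
    return observed / expected

def smr_parametric_bootstrap_ci(observed: int, expected: float,
                                n_boot: int = 10_000, alpha: float = 0.05,
                                seed: int = 0):
    """Parametric bootstrap interval for the SMR, assuming the observed
    death count is Poisson with mean equal to the observed count."""
    rng = np.random.default_rng(seed)
    simulated = rng.poisson(lam=observed, size=n_boot)
    ratios = simulated / expected
    lo, hi = np.quantile(ratios, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Hypothetical example: 27 observed deaths against 18.4 expected deaths.
print(smr(27, 18.4))
print(smr_parametric_bootstrap_ci(27, 18.4))
```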
Abstract:
This work presents the first application of total-reflection X-ray fluorescence (TXRF) spectrometry, a new and powerful alternative analytical method, to the evaluation of the bioaccumulation kinetics of gold nanorods (GNRs) in various tissues upon intravenous administration in mice. The analytical parameters of the developed TXRF methodology were evaluated by means of the parallel analysis of bovine liver certified reference material samples (BCR-185R) doped with 10 μg/g gold. The average values (n = 5) achieved for gold measurements in lyophilized tissue were as follows: recovery 99.7%, expanded uncertainty (k = 2) 7%, repeatability 1.7%, detection limit 112 ng/g, and quantification limit 370 ng/g. The GNR bioaccumulation kinetics was analyzed in several vital mammalian organs, such as liver, spleen, brain, and lung, at different times. Additionally, urine samples were analyzed to study the kinetics of elimination of the GNRs by this excretion route. The main achievement was clearly differentiating two kinds of behavior: GNRs were quickly bioaccumulated by highly vascular filtration organs such as the liver and spleen, whereas GNRs did not show bioaccumulation in the brain and lung over the period investigated. In parallel, urine also showed a lack of GNR accumulation. TXRF has proven to be a powerful, versatile, and precise analytical technique for the evaluation of GNR content in biological systems and, in a more general way, for any kind of metallic nanoparticles.
Abstract:
The distribution of optimal local alignment scores of random sequences plays a vital role in evaluating the statistical significance of sequence alignments. These scores can be well described by an extreme-value distribution. The distribution’s parameters depend upon the scoring system employed and the random letter frequencies; in general they cannot be derived analytically, but must be estimated by curve fitting. For obtaining accurate parameter estimates, a form of the recently described ‘island’ method has several advantages. We describe this method in detail, and use it to investigate the functional dependence of these parameters on finite-length edge effects.
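As a minimal illustration of the curve-fitting step the abstract refers to, the sketch below fits the two extreme-value (Gumbel) parameters to a set of scores by plain maximum likelihood using SciPy. This is not the island method the authors describe, and the simulated scores merely stand in for optimal local alignment scores of random sequences; the numbers are arbitrary.

```python
import numpy as np
from scipy.stats import gumbel_r

# Stand-in data: in practice these would be optimal local alignment scores
# of many independent pairs of random (shuffled) sequences.
rng = np.random.default_rng(42)
scores = gumbel_r.rvs(loc=35.0, scale=6.0, size=5000, random_state=rng)

# Maximum-likelihood fit of the extreme-value (Gumbel) location and scale.
mu_hat, scale_hat = gumbel_r.fit(scores)
lambda_hat = 1.0 / scale_hat   # decay rate of the exponential tail

# Tail probability for an observed alignment score x under the fitted law.
x = 60.0
p_value = gumbel_r.sf(x, loc=mu_hat, scale=scale_hat)
print(f"mu = {mu_hat:.2f}, lambda = {lambda_hat:.3f}, P(S >= {x}) = {p_value:.2e}")
```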
Abstract:
Background: The aging process involves a decline in immune functioning that renders elderly people more vulnerable to disease. In residential programs for the aged, it is vital to diminish their risk of disease, promote their independence, and augment their psychological well-being and quality of life. Methods: We performed a randomized controlled study evaluating the ability of a relaxation technique based on Benson's relaxation response to enhance psychological well-being and modulate the immune parameters of elderly people living in a geriatric residence, compared to a waitlist control group. The study included a 2-week intervention period and a 3-month follow-up period. The main outcome variables were psychological well-being and quality of life, biomedical variables, and immune changes from the pre-treatment to the post-treatment and follow-up periods. Results: Our findings reveal significant differences between the experimental and control groups in CD19, CD71, CD97, CD134, and CD137 lymphocyte subpopulations at the end of treatment. Furthermore, there was a decrease in negative affect, psychological discomfort, and symptom perception in the treatment group, which increased participants' quality of life scores at the three-month follow-up. Conclusions: This study represents a first approach to the application of a passive relaxation technique in residential programs for the elderly. The method appears to be effective in enhancing psychological well-being and modulating immune activity in a group of elderly people. This relaxation technique could be considered an option for achieving health benefits at a low cost for residential programs, but further studies using this technique in larger samples of older people are needed to confirm the trends observed in the present study. Trial registration: International Standard Randomised Controlled Trial Number Register ISRCTN85410212.
Abstract:
In aquatic ecosystems, hydrological fluctuation may generate a gradient of life-history responses associated with marsh drying. This study was conducted in the Florida Everglades to document spatial and temporal variability in growth and survivorship of the bluefin killifish (Lucania goodei) from six populations along a hydroperiod gradient. Otolith-microstructure analysis of field-collected fish was used to estimate growth rate, and those data were combined with field-density estimates for survivorship analysis. Otolith analysis revealed that L. goodei is extremely short-lived, with no variation in growth rates and very little spatial or temporal variation in survivorship. These results suggest that bluefin killifish populations experience similar life histories across a diversity of hydroperiods, either through well-mixed populations homogenizing these vital rates or, more likely, because a multitude of factors force L. goodei to respond to these "stressors" in a similar fashion across hydroperiod gradients.
Abstract:
Hypertensive patients exhibit higher cardiovascular risk and reduced lung function compared with the general population. Whether this association stems from the coexistence of two highly prevalent diseases or from direct or indirect links between pathophysiological mechanisms is presently unclear. This study investigated the association between lung function and carotid features in non-smoking hypertensive subjects with presumably normal lung function. Hypertensive patients (n = 67) were cross-sectionally evaluated by clinical, hemodynamic, laboratory, and carotid ultrasound analysis. Forced vital capacity, forced expiratory volume in 1 second and in 6 seconds, and lung age were estimated by spirometry. Subjects with ventilatory abnormalities according to current guidelines were excluded. Regression analysis adjusted for age and prior smoking history showed that lung age and the percentages of predicted spirometric parameters were associated with common carotid intima-media thickness, diameter, and stiffness. Further analyses, adjusted for additional potential confounders, revealed that lung age was the spirometric parameter exhibiting the most significant regression coefficients with carotid features. Conversely, plasma C-reactive protein and matrix metalloproteinase-2/9 levels did not influence this relationship. The present findings point toward lung age as a potential marker of vascular remodeling and indicate that lung and vascular remodeling might share common pathophysiological mechanisms in hypertensive subjects.
Abstract:
To investigate the pulmonary response to exercise of non-morbidly obese adolescents, considering gender. A prospective cross-sectional study was conducted with 92 adolescents (47 obese and 45 eutrophic), divided into four groups according to obesity and gender. Anthropometric parameters, pulmonary function (spirometry and oxygen saturation [SatO2]), heart rate (HR), blood pressure (BP), respiratory rate (RR), and respiratory muscle strength were measured. Pulmonary function parameters were measured before, during, and after the exercise test. BP and HR were higher in obese individuals during the exercise test (p = 0.0001). SatO2 values decreased during exercise in obese adolescents (p = 0.0001). Obese males had higher maximum inspiratory and expiratory pressures (p = 0.0002) than obese and eutrophic females. Obese males showed lower values of maximum voluntary ventilation, forced vital capacity, and forced expiratory volume in the first second when compared to eutrophic males, before and after exercise (p = 0.0005). Obese females had greater inspiratory capacity than eutrophic females (p = 0.0001). Expiratory reserve volume was lower in obese subjects than in controls (p ≤ 0.05). Obese adolescents presented changes in pulmonary function at rest, and these changes remained present during exercise. The spirometric and cardiorespiratory values differed among the four study groups. The present data demonstrate that, in spite of differences in lung growth, the pattern of fat distribution alters pulmonary function differently in obese female and male adolescents.
Abstract:
Frailty and anemia in the elderly appear to share a common pathophysiology associated with chronic inflammatory processes. This study uses an analytical, cross-sectional, population-based methodology to investigate the probable relationships between frailty, red blood cell parameters and inflammatory markers in 255 community-dwelling elders aged 65 years or older. The frailty phenotype was assessed by non-intentional weight loss, fatigue, low grip strength, low energy expenditure and reduced gait speed. Blood sample analyses were performed to determine hemoglobin level, hematocrit and reticulocyte count, as well as the inflammatory variables IL-6, IL-1ra and hsCRP. In the first multivariate analysis (model I), considering only the erythroid parameters, Hb concentration was a significant variable for both general frailty status and weight loss: a 1.0 g/dL drop in serum Hb concentration represented a 2.02-fold increase (CI 1.12-3.63) in an individual's chance of being frail. In the second analysis (model II), which also included inflammatory cytokine levels, hsCRP was independently selected as a significant variable. Each additional year of age represented a 1.21-fold increase in the chance of being frail, and each 1-unit increase in serum hsCRP represented a 3.64-fold increase in the chance of having the frailty phenotype. In model II, reticulocyte counts were associated with the weight loss and reduced metabolic expenditure criteria. Our findings suggest that reduced Hb concentration, reduced absolute reticulocyte (RetAbs) count and elevated serum hsCRP levels should be considered components of frailty, which in turn is correlated with sarcopenia, as evidenced by weight loss.
Abstract:
Bariatric surgery is considered an effective method for sustained weight loss, but may cause various nutritional complications. The aim of this study was to evaluate the nutritional status of minerals and vitamins and food consumption, and to monitor physiologic parameters, in patients with obesity before and 6 months after Roux-en-Y gastric bypass surgery (RYGB). Thirty-six patients who underwent RYGB were prospectively evaluated before and 6 months after surgery. At each phase, their weight, height, body mass index (BMI), Electro Sensor Complex (ES Complex) data, food consumption, and serum levels of total protein, albumin, prealbumin, parathyroid hormone (PTH), zinc (Zn), vitamin B12 (VitB12), iron (Fe), ferritin, copper (Cu), ionic calcium (CaI), magnesium (Mg), and folic acid were assessed. The mean weight loss from baseline to 6 months after surgery was 35.34±4.82%. Markers of autonomic nervous system balance (P<.01), stiffness index (P<.01), standard deviation of normal-to-normal R-R intervals (SDNN) (P<.01), and insulin resistance (P<.001) also improved. With regard to the micronutrients measured, 34 patients demonstrated some kind of deficiency. There was a high percentage of Zn deficiency both pre- (55.55%) and postoperatively (61.11%), and 33.33% of the patients were deficient in prealbumin postoperatively. The protein intake 6 months after surgery was below the recommended intake (<70 g/d) for 88.88% of the patients. Laboratory analyses demonstrated an average decrease in total protein (P<.05), prealbumin (P = .002), and PTH (P = .008) between pre- and postsurgery, and a decrease in the percentage of deficiencies for Mg (P<.05), CaI (P<.05), and Fe (P = .021). Despite improvements in autonomic nervous system balance, stiffness index markers and insulin resistance, we found a high prevalence of hypozincemia at 6 months post-RYGB. Furthermore, protein supplements were needed to maintain an adequate protein intake up to 6 months postsurgery.
Abstract:
Crotamine is one of the main constituents of the venom of the South American rattlesnake Crotalus durissus terrificus. Here we sought to investigate the inflammatory and toxicological effects induced by intrahippocampal administration of crotamine isolated from Crotalus whole venom. Adult rats received an intrahippocampal infusion of crotamine or vehicle and were euthanized 24 h or 21 days after infusion. Plasma and brain tissue were collected for biochemical analysis. Complete blood count, creatinine, urea, glutamic oxaloacetic transaminase (GOT), glutamic pyruvic transaminase (GPT), creatine kinase (CK), creatine kinase-MB (CK-MB) and oxidative parameters (assessed by DNA damage and micronucleus frequency in leukocytes, and by lipid peroxidation and protein carbonyls in plasma and brain) were quantified. Unpaired and paired t-tests were used for comparisons between the saline and crotamine groups, and within groups (24 h vs. 21 days), respectively. At 24 h, crotamine infusion promoted an increase in urea, GOT, GPT, CK, and platelet values (p ≤ 0.01), while red blood cell, hematocrit and leukocyte values decreased (p ≤ 0.01). Additionally, 21 days after infusion the crotamine group showed increased creatinine, leukocytes, TBARS (plasma and brain), protein carbonyls (plasma and brain) and micronucleus frequency compared to the saline group (p ≤ 0.01). Our findings show that crotamine infusion alters hematological parameters and cardiac markers, as well as oxidative parameters, not only in the brain but also in the blood, indicating systemic pro-inflammatory and toxicological activity. Further efforts aimed at preserving the beneficial activity of crotamine while limiting its toxicity are required.
Abstract:
The purpose of this study was to evaluate the effectiveness of mature red cell and reticulocyte parameters under three conditions: iron deficiency anemia, anemia of chronic disease, and anemia of chronic disease associated with absolute iron deficiency. Peripheral blood cells from 117 adult patients with anemia were classified, according to iron status, inflammatory activity, and the results of a hemoglobinopathy investigation, as: iron deficiency anemia (n=42), anemia of chronic disease (n=28), anemia of chronic disease associated with iron deficiency anemia (n=22), and heterozygous β thalassemia (n=25). The percentage of microcytic red cells, the percentage of hypochromic red cells, and the hemoglobin content of both reticulocytes and mature red cells were determined. Receiver operating characteristic (ROC) analysis was used to evaluate the accuracy of the parameters in differentiating between the types of anemia. There was no significant difference between the iron deficiency anemia group and the group with anemia of chronic disease associated with absolute iron deficiency with respect to any parameter. The percentage of hypochromic red cells was the best parameter for discriminating anemia of chronic disease with and without absolute iron deficiency (area under curve=0.785; 95% confidence interval: 0.661-0.909; sensitivity 72.7%, specificity 70.4%; cut-off value 1.8%). The formula microcytic red cells minus hypochromic red cells was very accurate in differentiating iron deficiency anemia from heterozygous β thalassemia (area under curve=0.977; 95% confidence interval: 0.950-1.005; sensitivity 96.2%, specificity 92.7%; cut-off value 13.8). The indices related to red cells and reticulocytes have a moderate performance in identifying absolute iron deficiency in patients with anemia of chronic disease.
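As an illustration of the ROC-based cut-off selection reported above, the sketch below computes an AUC and an operating point on simulated data. The group sizes and score distributions are invented, and the Youden index used here is only one common criterion for choosing a cut-off; the paper does not state which criterion it used.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Simulated stand-in data (not the study's): 1 = ACD with absolute iron
# deficiency, 0 = ACD alone; hypo_pct = percentage of hypochromic red cells.
rng = np.random.default_rng(0)
y = np.concatenate([np.ones(22, dtype=int), np.zeros(28, dtype=int)])
hypo_pct = np.concatenate([rng.normal(3.0, 1.5, 22), rng.normal(1.2, 0.8, 28)])

auc = roc_auc_score(y, hypo_pct)
fpr, tpr, thresholds = roc_curve(y, hypo_pct)

# One common way to pick a cut-off: maximize the Youden index J = Se + Sp - 1.
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.3f}, cut-off = {thresholds[best]:.2f}%, "
      f"sensitivity = {tpr[best]:.1%}, specificity = {1 - fpr[best]:.1%}")
```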
Abstract:
The main aim of this investigation was to verify the relationship of the variables measured during a 3-minute all-out test with aerobic parameters (i.e., peak oxygen uptake [VO2peak] and the intensity corresponding to the lactate minimum [LMI]) and anaerobic parameters (i.e., anaerobic work) measured during a 400-m maximal performance. To measure force continuously and to avoid the possible influences caused by turns, the 3-minute all-out effort was performed in tethered swimming. Thirty swimmers performed the following tests: (a) a 3-minute all-out tethered swimming test to determine the final force (equivalent to critical force: CF3-MIN) and the work performed above CF3-MIN (W'3-MIN), (b) an LMI protocol to determine the LMI during front crawl swimming, and (c) a 400-m maximal test to determine VO2peak and the total anaerobic contribution (WANA). Correlations between the variables were tested using Pearson's correlation test (p ≤ 0.05). CF3-MIN (73.9 ± 13.2 N) presented a high correlation with the LMI (1.33 ± 0.08 m·s⁻¹; p = 0.01) and VO2peak (4.5 ± 1.2 L·min⁻¹; p = 0.01). However, W'3-MIN (1,943.2 ± 719.2 N·s) was only moderately correlated with the LMI (p = 0.02) and VO2peak (p = 0.01). In summary, CF3-MIN determined during the 3-minute all-out effort is associated with oxidative metabolism and can be used to estimate the aerobic capacity of swimmers. In contrast, the anaerobic component of this model (W'3-MIN) is not correlated with WANA.
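A minimal sketch of how CF3-MIN and W'3-MIN could be extracted from a 3-minute all-out force recording is shown below. The 30-s end window and the synthetic force profile are assumptions made for illustration; the abstract does not give the analysis details used by the authors.

```python
import numpy as np

def critical_force_and_w_prime(time_s: np.ndarray, force_n: np.ndarray,
                               end_window_s: float = 30.0):
    """Illustrative analysis of a 3-minute all-out tethered-swimming force trace.

    CF3-MIN is taken here as the mean force over the final `end_window_s`
    seconds (an assumed averaging window), and W'3-MIN as the impulse
    (force-time integral) above CF3-MIN, in N·s as in the paper.
    """
    mask = time_s >= (time_s[-1] - end_window_s)
    cf = force_n[mask].mean()                       # N
    excess = np.clip(force_n - cf, 0.0, None)       # only force above CF counts
    # Trapezoidal integration of the excess force over time.
    w_prime = float(np.sum(0.5 * (excess[:-1] + excess[1:]) * np.diff(time_s)))
    return cf, w_prime

# Synthetic, exponentially decaying force profile sampled at 10 Hz for 180 s.
t = np.linspace(0.0, 180.0, 1801)
f = 70.0 + 120.0 * np.exp(-t / 40.0)
print(critical_force_and_w_prime(t, f))
```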
Abstract:
Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker of cardiovascular disease risk. We established reference values for mean HDL size and volume in an asymptomatic, representative Brazilian population sample (n=590) and examined their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethyleneglycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C in the total population; inversely with non-white ethnicity and CETP in females; and with HDL-C and PLTP mass in males. On the other hand, HDL volume was determined only by HDL-C (in the total population and in both genders) and by PLTP mass (in males). The reference values for mean HDL size and volume using the DLS technique were thus established in an asymptomatic and representative Brazilian population sample, together with their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and males.
Abstract:
The aim of this study was to evaluate, by clinical and laboratory parameters, how cystic fibrosis (CF) affects the growth and nutritional status of children who were undergoing CF treatment but did not receive newborn screening. In a historical cohort study, 52 CF patients younger than 10 years of age were followed at a reference center in Campinas, Southeast Brazil. Anthropometric measurements were abstracted from medical records until March 2010, when the neonatal screening program was implemented. Between September 2009 and March 2010, the parental heights of the 52 CF patients were also measured. Regarding nutritional status, four patients had Z-scores ≤ -2 for height/age (H/A) and body mass index/age (BMI/A). The following variables were associated with an improved H/A ratio: fewer hospitalizations, longer time from first appointment to diagnosis, longer time from birth to diagnosis and later onset of respiratory disease. Forced vital capacity [FVC(%)], forced expiratory flow between 25% and 75% of FVC [FEF25-75(%)], forced expiratory volume in the first second [FEV1(%)], gestational age, birth weight and early respiratory symptoms were associated with BMI/A. A greater number of hospitalizations, delayed diagnosis and early onset of respiratory disease had a negative impact on growth. Lower spirometric values, lower gestational age, lower birth weight, and early onset of respiratory symptoms had a negative impact on nutritional status. Malnutrition was observed in 7.7% of cases, but 23% of the children were at nutritional risk.