6 results for Vital parameters

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

60.00%

Publisher:

Abstract:

REASONS FOR PERFORMING STUDY: Efficacy of medications for recurrent airway obstruction is typically tested using clinical, cytological and lung function examinations of severely affected animals. These trials are technically challenging and may not adequately reflect the spectrum of disease and owner complaints encountered in clinical practice. OBJECTIVE: To determine if owners of horses with chronic airway disease are better able to detect drug efficacy than a veterinarian who clinically examines horses infrequently. METHOD: In a double-blinded randomised controlled trial, owners and a veterinarian compared the efficacy of dexamethasone (0.1 mg/kg bwt per os, q. 24 h, for 3 weeks; n = 9) to placebo (n = 8) in horses with chronic airway disease. Before and after treatment, owners scored performance, breathing effort, coughing and nasal discharge using a visual analogue scale (VAS). The clinician recorded vital parameters, respiratory distress, auscultation findings, cough and nasal discharge, airway mucus score, bronchoalveolar lavage fluid (BALF) cytology and arterial blood gases. RESULTS: The VAS score improved significantly in dexamethasone- but not placebo-treated horses. In contrast, the clinician failed to differentiate between dexamethasone- and placebo-treated animals based on clinical observations, BALF cytology or endoscopic mucus score. Respiratory rate (RR) and arterial oxygen pressure (PaO(2)) improved with dexamethasone but not placebo. CONCLUSIONS AND CLINICAL RELEVANCE: In the design of clinical trials of airway disease treatments, more emphasis should be placed on owner-assessed VAS than on clinical, cytological and endoscopic observations made during brief examinations by a veterinarian. Quantifiable indicators reflecting lung function such as RR and PaO(2) provide a good assessment of drug efficacy.

Relevance:

60.00%

Publisher:

Abstract:

In her book 'Living on Light', Jasmuheen tries to encourage people worldwide to follow her drastic nutrition rules in order to boost their quality of life. Several deaths have been reported as a consequence. A doctor of chemistry who credibly claimed to have been 'living on light' for 2 years, except for a daily intake of up to 1.5 l of fluid containing no or almost no calories, was interested in a scientific study of this phenomenon. PARTICIPANT AND METHODS: The 54-year-old man was subjected to a rigorous 10-day isolation study with complete absence of nutrition. During the study he received an unlimited amount of tea and mineral water but had no caloric intake. Parameters monitoring his metabolic and psychological state, as well as his vital parameters, were measured regularly, and the safety of the individual was ensured throughout the study. The subject agreed to these terms and the study was approved by the local ethics committee.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND Complex pelvic traumas, i.e., pelvic fractures accompanied by pelvic soft tissue injuries, still have an unacceptably high mortality rate of about 18 %. PATIENTS AND METHODS We retrospectively evaluated the intersection of data from the TraumaRegister DGU® and the German Pelvic Injury Register for 2004-2009. Patients with complex and noncomplex pelvic traumas were compared regarding their vital parameters, emergency management, ICU stay, and outcome. RESULTS Of a total of 344 patients with pelvic injuries, 21 % had a complex and 79 % a noncomplex trauma. Mortality was significantly higher in patients with complex traumas (16.7 % vs. 5.9 %). Whereas vital parameters and emergency treatment in the preclinical setting did not differ substantially, patients with complex traumas were more often in shock and showed acute traumatic coagulopathy on hospital arrival, and they received larger fluid volumes and more transfusions than patients with noncomplex traumas. Furthermore, patients with complex traumas had more complications and longer ICU stays. CONCLUSION Prevention of exsanguination and of complications such as multiple organ dysfunction syndrome still poses a major challenge in the management of complex pelvic traumas.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND Treatment of retinopathy of prematurity (ROP) stage 3 plus with bevacizumab is still very controversial. We report the outcome of 6 eyes of 4 premature infants with ROP stage 3 plus disease treated with ranibizumab monotherapy. METHODS Six eyes of 4 premature infants with threshold ROP 3 plus disease in zone II were treated with one intravitreal injection of 0.03 ml ranibizumab. No prior laser or other intravitreal therapy was performed. Fundus examination was performed prior to the intervention and at each follow-up visit. Changes in various mean vital parameters in the week after the intervention compared with the week before were assessed. RESULTS The gestational age (GA) of patients 1, 2, 3, and 4 at birth was 24 5/7, 24 5/7, 24 4/7, and 26 1/7 weeks, respectively. The birth weight was 500 grams, 450 grams, 665 grams, and 745 grams, respectively. The GA at the date of treatment ranged from 34 3/7 to 38 6/7 weeks. In one infant, an upper airway infection was observed 2 days after injection of the second eye. Three eyes required paracentesis to reduce the intraocular pressure after injection and to restore central artery perfusion. After six months, all eyes showed complete retinal vascularisation without any signs of disease recurrence. CONCLUSIONS Treatment of ROP 3 plus disease with intravitreal ranibizumab was effective in all cases and should be considered for treatment. One infant developed an upper airway infection suggestive of nasopharyngitis, which may be a side effect of ranibizumab. Another frequent complication was a rise in intraocular pressure after injection. More patients and a longer follow-up period are needed to confirm the safety and efficacy of this treatment. TRIAL REGISTRATION NUMBER NCT02164604; Date of registration: 13.06.2014.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND There is confusion over the definition of the term "viability state(s)" of microorganisms. "Viability staining" or "vital staining" techniques are used to distinguish live from dead bacteria. These stainings, first established on planktonic bacteria, may have serious shortcomings when applied to multispecies biofilms. Results of staining techniques should be compared with appropriate microbiological data. DISCUSSION Many terms describe the "vitality states" of microorganisms; however, several of them are misleading. Authors define "viable" as "capable of growing". Accordingly, staining methods are substitutes, since no staining can prove viability. The reliability of a commercial "viability" staining assay (Molecular Probes) is discussed based on the corresponding product information sheet: (I) Staining principle; (II) Concentrations of bacteria; (III) Calculation of live/dead proportions in vitro. Results of the "viability" kit depend on the concentration of the stains and on their relation to the number of bacteria in the test. In general, this staining system is not suitable for multispecies biofilms, and incorrect statements have therefore been published by users of this technique. To compare the results of the staining with bacterial parameters, appropriate techniques should be selected. The assessment of Colony Forming Units is insufficient; rather, the calculation of the Plating Efficiency is necessary. Vital fluorescence staining with Fluorescein Diacetate and Ethidium Bromide seems to be the best proven and most suitable method in biofilm research. Regarding the mutagenicity of staining components, users should be aware that not only Ethidium Bromide may be harmful, but also a variety of other substances whose toxicity and mutagenicity have not been reported. SUMMARY - The nomenclature regarding "viability" and "vitality" should be used carefully. - The manual of the commercial "viability" kit itself points out that the kit is not suitable for natural multispecies biofilm research, as supported by an array of literature. - Results obtained with various stains are influenced by the relationship between bacterial counts and the amount of stain used in the test; corresponding vitality data are prone to artificial shifting. - As a microbiological parameter, the Plating Efficiency should be used for comparison. - Ethidium Bromide is mutagenic; researchers should be aware that alternative staining compounds may also be, or indeed are, mutagenic.
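As a concrete illustration of the difference between a bare CFU count and the Plating Efficiency recommended above, the following minimal Python sketch applies the commonly used definition (colonies formed relative to the number of cells plated, expressed as a percentage); the function name and the example figures are illustrative assumptions and are not taken from the abstract.

    def plating_efficiency_percent(colonies_counted: int, cells_plated: int) -> float:
        """Plating Efficiency (%): colonies formed relative to the total number of cells plated."""
        if cells_plated <= 0:
            raise ValueError("cells_plated must be positive")
        return 100.0 * colonies_counted / cells_plated

    # Example: 150 colonies from 1,000 plated cells gives 15% Plating Efficiency,
    # whereas the bare CFU count of 150 says nothing about the non-culturable fraction.
    print(plating_efficiency_percent(colonies_counted=150, cells_plated=1000))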

Relevance:

30.00%

Publisher:

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).
Class I: blood loss up to 750 ml (up to 15% of blood volume); pulse rate <100 bpm; systolic blood pressure normal; pulse pressure normal or increased; respiratory rate 14–20; urine output >30 ml/h; CNS/mental status slightly anxious; initial fluid replacement crystalloid.
Class II: blood loss 750–1500 ml (15–30%); pulse rate 100–120 bpm; systolic blood pressure normal; pulse pressure decreased; respiratory rate 20–30; urine output 20–30 ml/h; CNS/mental status mildly anxious; initial fluid replacement crystalloid.
Class III: blood loss 1500–2000 ml (30–40%); pulse rate 120–140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate 30–40; urine output 5–15 ml/h; CNS/mental status anxious, confused; initial fluid replacement crystalloid and blood.
Class IV: blood loss >2000 ml (>40%); pulse rate >140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate >35; urine output negligible; CNS/mental status confused, lethargic; initial fluid replacement crystalloid and blood.

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries of nearly 165,000 adult trauma patients and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140/min postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as the ATLS shock classification suggests [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient.
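To make the validation problem concrete, the following minimal Python sketch encodes the pulse-rate and systolic-blood-pressure rows of Table 1 above and shows how a single patient can fall into different classes depending on which parameter is used; the helper functions and the threshold encoding are an illustrative reading of the table, not an ATLS algorithm or part of the cited studies.

    def atls_class_by_pulse(pulse_bpm: float) -> str:
        """Map pulse rate to the ATLS class range it falls into (Table 1 thresholds)."""
        if pulse_bpm < 100:
            return "I"
        if pulse_bpm <= 120:
            return "II"
        if pulse_bpm <= 140:
            return "III"
        return "IV"

    def atls_class_by_systolic_bp(sbp_decreased: bool) -> str:
        """Table 1 only separates a normal (class I-II) from a decreased (class III-IV) systolic BP."""
        return "III-IV" if sbp_decreased else "I-II"

    # A patient with a pulse of 95/min but a clearly decreased systolic blood pressure
    # matches class I by heart rate and class III-IV by blood pressure: the parameters
    # do not deteriorate in parallel, which is exactly the criticism discussed above.
    print(atls_class_by_pulse(95), atls_class_by_systolic_bp(sbp_decreased=True))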
A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint of the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or the requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, the statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.

As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that the allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].
Normal (no haemorrhage): vitals normal; response to a 1000 ml fluid bolus not applicable; estimated blood loss none.
Class I (mild): vitals normal; responds to the bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to the bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first, whereas hypotension and bradycardia occur only later. An increasing heart rate, an increasing diastolic blood pressure or a decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to the neurogenic shock of acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage (advanced age, athletes, pregnancy, medications and pacemakers) and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with the clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values.
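Returning to Table 2 above, the short Python sketch below illustrates how the Lawton grades hinge on systolic blood pressure and the response to a 1000 ml fluid bolus rather than on absolute blood-loss figures; the function, its boolean inputs and the returned labels are illustrative assumptions and do not reproduce the study's actual assessment procedure.

    def lawton_grade(sbp_mmhg: float, hr_bpm: float,
                     needs_repeated_boluses: bool, sbp_declining_despite_boluses: bool) -> str:
        """Illustrative reading of Table 2 (Lawton, 2014); not the study's algorithm."""
        if sbp_mmhg < 90:
            return "Class IV (moribund)" if sbp_declining_despite_boluses else "Class III (severe)"
        if needs_repeated_boluses:
            return "Class III (severe)"
        if hr_bpm > 100:
            return "Class II (moderate)"
        return "Normal or Class I (depends on whether any fluid bolus was required)"

    # Example: a normotensive but tachycardic patient who settles after a single bolus.
    print(lawton_grade(sbp_mmhg=110, hr_bpm=112,
                       needs_repeated_boluses=False, sbp_declining_despite_boluses=False))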
In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.