57 results for Vital signs
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

Class I: blood loss up to 750 ml (up to 15% of blood volume); pulse rate <100 bpm; systolic blood pressure normal; pulse pressure normal or increased; respiratory rate 14–20; urine output >30 ml/h; slightly anxious; initial fluid replacement crystalloid.
Class II: blood loss 750–1500 ml (15–30%); pulse rate 100–120 bpm; systolic blood pressure normal; pulse pressure decreased; respiratory rate 20–30; urine output 20–30 ml/h; mildly anxious; initial fluid replacement crystalloid.
Class III: blood loss 1500–2000 ml (30–40%); pulse rate 120–140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate 30–40; urine output 5–15 ml/h; anxious, confused; initial fluid replacement crystalloid and blood.
Class IV: blood loss >2000 ml (>40%); pulse rate >140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate >35; urine output negligible; confused, lethargic; initial fluid replacement crystalloid and blood.

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS.
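The class boundaries of Table 1 amount to a simple threshold lookup. The sketch below is illustrative only (the function name is invented; thresholds are taken from Table 1, and, as this editorial stresses, real patients often do not fit these ranges):

```python
def atls_shock_class(blood_loss_pct: float) -> int:
    """Map estimated blood loss (% of blood volume) to an ATLS shock
    class using the cut-offs of Table 1 (ATLS 9th edition).
    Illustrative sketch, not a clinical decision tool."""
    if blood_loss_pct <= 15:
        return 1          # Class I: up to 15%
    elif blood_loss_pct <= 30:
        return 2          # Class II: 15-30%
    elif blood_loss_pct <= 40:
        return 3          # Class III: 30-40%
    else:
        return 4          # Class IV: >40%

# Example: an estimated 35% blood volume loss falls into Class III
print(atls_shock_class(35))  # -> 3
```

The rigidity of exactly this kind of lookup, in which one heart-rate or blood-loss band maps to one class, is what the validation studies discussed below call into question.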
Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined. As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution.
This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation based on the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Normal (no haemorrhage): vitals normal; response to fluid bolus (1000 ml) not applicable; no estimated blood loss.
Class I (mild): vitals normal; responds to fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the operating room. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension).
Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification related to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs, but includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: None declared. Member of the Swiss national ATLS core faculty.
Abstract:
BACKGROUND: Physiologic data display is essential to decision making in critical care. Current displays echo first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment. Tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step. OBJECTIVES: The purposes of the study were to (1) describe the SSSI (single sensor single indicator) paradigm that is currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype to construct such "extended SSSI displays" from raw data. RESULTS: We present the Multi Wave Animator (MWA) framework, a set of open-source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts aimed at creating dynamic visualizations (e.g., video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
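The core of such an animation is replaying an archived trace window by window, the way a bedside monitor sweeps across the screen. The following is not MWA's API (MWA is a set of MATLAB scripts that render video files); it is a minimal Python analogue of the underlying idea, with all names invented for illustration:

```python
import math

def sweep_frames(samples, window=10, step=2):
    """Yield successive display windows of a recorded vital-sign trace,
    mimicking a bedside monitor's sweeping redraw: each frame shows the
    most recent `window` samples and advances by `step` samples.
    Illustrative sketch only; a renderer would draw each frame to video."""
    for end in range(window, len(samples) + 1, step):
        yield samples[end - window:end]

# A toy "waveform": one sinusoid cycle per 20 samples
trace = [math.sin(2 * math.pi * i / 20) for i in range(60)]
frames = list(sweep_frames(trace, window=20, step=5))
print(len(frames))  # number of animation frames generated
```

Rendering each yielded window as one video frame reproduces the familiar scrolling appearance from archived data, which is exactly what is needed to compare a conventional display against a redesigned one on identical input.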
Abstract:
A 2-year-old Red Holstein cow was presented with uterine torsion at 235 days of pregnancy. The fetus extracted by cesarean section had weak vital signs and marked abdominal distention. An edematous pouch that contained tubular structures with peristaltic activity was associated with the umbilical cord. Because of poor prognosis, both dam and fetus were euthanized. At necropsy, the fetus had severe distention of the forestomachs, abomasum, and proximal small intestine; absence of distal small intestine, cecum, and proximal colon; atresia of the 2 blind ends of the intestine; and atrophy of distal colon and rectum. The tubular structures associated with the umbilical cord were identified as the segments of intestine that were absent in the fetus. Intestinal atresia combined with ectopia may be caused by local ischemia during temporary herniation and rotation of the fetal gut into the extraembryonic coelom. The close connection between ectopic intestine and amniotic sheath of the umbilical cord in this case may have facilitated vascularization and allowed development and viability of the ectopic intestine.
Abstract:
OBJECTIVES: Residual airspace following thoracic resections is a common clinical problem. Persistent air leak, prolonged drainage time, and reduced hemostasis extend hospital stay and morbidity. We report a trial of pharmacologically induced diaphragmatic paralysis through continuous paraphrenic injection of lidocaine to reduce residual airspace. The objectives were confirmation of diaphragmatic paralysis and of possible procedure-related complications. METHODS: Six eligible patients undergoing resectional surgery (lobectomy or bilobectomy) were included. Inclusion criteria consisted of: postoperative predicted FEV1 greater than 1300 ml, right-sided resection, absence of parenchymal lung disease, no class III antiarrhythmic therapy, absence of hypersensitivity reactions to lidocaine, no signs of infection, and informed consent. Upon completion of the resection, an epidural catheter was attached to the periphrenic tissue on the proximal pericardial surface, externalized through a separate parasternal incision, and connected to a perfusion system injecting lidocaine 1% at a rate of 3 ml/h (30 mg/h). Postoperative ICU surveillance for 24 h and daily measurement of vital signs, drainage output, and bedside spirometry were performed. Within 48 h, fluoroscopic confirmation of diaphragmatic paralysis was obtained. Catheter removal coincided with chest tube removal when no procedure-related complications occurred. RESULTS: None of the patients reported respiratory impairment. Diaphragmatic paralysis was documented in all patients. Upon removal of the catheter or discontinuation of lidocaine, prompt return of diaphragmatic motility was noticed. Two patients showed postoperative, hemodynamically irrelevant atrial fibrillation. CONCLUSION: Postoperative paraphrenic catheter administration of lidocaine to ensure reversible diaphragmatic paralysis is safe and reproducible.
Further studies have to assess the benefit in terms of reduction in morbidity, drainage time, and hospital stay, and identify the patients who will benefit.
Abstract:
The child who presents with acute coma runs a high risk of cardiopulmonary insufficiency, direct brain injury or even cerebral herniation. The management of such a child requires a coma-specific emergent evaluation and immediate treatment of any hypoxic-ischemic insults and of the underlying cause. The coma-specific examination includes performance of a child-adapted Glasgow Coma Score (GCS), evaluation of brain stem functions such as the pupillary response to light and the cough and gag reflexes, and determination of all vital signs including body temperature. Treatment of hypoxic-ischemic insults includes control of airways and ventilation in patients with coma defined as GCS <8; liberal treatment of impaired cardiovascular states with isotonic fluids such as 0.9% sodium chloride; and treatment of cerebral herniation with head elevation, mannitol, hypertonic sodium chloride fluids, steroids and hyperventilation. Immediately treatable causes are hypoglycemia, meningitis/encephalitis, opioid overdose and status epilepticus. Exclusion of rapidly progressive intracranial lesions almost always requires referral to a tertiary centre with head CT facilities. Finally, an extensive etiologic search in the stable comatose patient is performed by looking for disease or trauma of the brain, for metabolic causes, for intoxications and for cardiopulmonary problems.
Abstract:
BACKGROUND: The Prevention of cerebrovascular and cardiovascular Events of ischemic origin with teRutroban in patients with a history oF ischemic strOke or tRansient ischeMic attack (PERFORM) study is an international double-blind, randomized controlled trial designed to investigate the superiority of the specific TP receptor antagonist terutroban (30 mg/day) over aspirin (100 mg/day), in reducing cerebrovascular and cardiovascular events in patients with a recent history of ischemic stroke or transient ischemic attack. Here we describe the baseline characteristics of the population. METHODS AND RESULTS: Parameters recorded at baseline included vital signs, risk factors, medical history, and concomitant treatments, as well as stroke subtype, stroke-associated disability on the modified Rankin scale, and scores on scales for cognitive function and dependency. Eight hundred and two centers in 46 countries recruited a total of 19,119 patients between February 2006 and April 2008. The population is evenly distributed and is not dominated by any one country or region. The mean +/- SD age was 67.2 +/- 7.9 years, 63% were male, and 83% Caucasian; 83% had hypertension, and about half the population smoked or had quit smoking. Ninety percent of the qualifying events were ischemic stroke, 67% of which were classified as atherothrombotic or likely atherothrombotic (pure or coexisting with another cause). Modified Rankin scale scores showed slight or no disability in 83% of the population, while the scores on the Mini-Mental State Examination, Isaacs' Set Test, Zazzo's Cancellation Test, and the instrumental activities of daily living scale showed a good level of cognitive function and autonomy. CONCLUSIONS: The PERFORM study population is homogeneous in terms of demographic and disease characteristics. 
With 19,119 patients, the PERFORM study is powered to test the superiority of terutroban over aspirin in the secondary prevention of cerebrovascular and cardiovascular events in patients with a recent history of ischemic stroke or transient ischemic attack.
Abstract:
Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment, and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; and (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded, and leftover blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, if involved, the social workers' perspective.
To assess outcomes, data from electronic medical records will be used, and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints were defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of the patient to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system to estimate initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and optimize resource allocation to the sickest patients from admission to discharge. The algorithms derived in this study will be compared in a later randomized controlled trial against a usual-care control group in terms of resource use, length of hospital stay, overall costs and patients' outcomes in terms of mortality, re-hospitalization, quality of life and satisfaction with care.
Abstract:
BACKGROUND Clinical trials yielded conflicting data about the benefit of adding systemic corticosteroids for treatment of community-acquired pneumonia. We assessed whether short-term corticosteroid treatment reduces time to clinical stability in patients admitted to hospital for community-acquired pneumonia. METHODS In this double-blind, multicentre, randomised, placebo-controlled trial, we recruited patients aged 18 years or older with community-acquired pneumonia from seven tertiary care hospitals in Switzerland within 24 h of presentation. Patients were randomly assigned (1:1 ratio) to receive either prednisone 50 mg daily for 7 days or placebo. The computer-generated randomisation was done with variable block sizes of four to six and stratified by study centre. The primary endpoint was time to clinical stability defined as time (days) until stable vital signs for at least 24 h, and analysed by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00973154. FINDINGS From Dec 1, 2009, to May 21, 2014, of 2911 patients assessed for eligibility, 785 patients were randomly assigned to either the prednisone group (n=392) or the placebo group (n=393). Median time to clinical stability was shorter in the prednisone group (3·0 days, IQR 2·5-3·4) than in the placebo group (4·4 days, 4·0-5·0; hazard ratio [HR] 1·33, 95% CI 1·15-1·50, p<0·0001). Pneumonia-associated complications until day 30 did not differ between groups (11 [3%] in the prednisone group and 22 [6%] in the placebo group; odds ratio [OR] 0·49 [95% CI 0·23-1·02]; p=0·056). The prednisone group had a higher incidence of in-hospital hyperglycaemia needing insulin treatment (76 [19%] vs 43 [11%]; OR 1·96, 95% CI 1·31-2·93, p=0·0010). Other adverse events compatible with corticosteroid use were rare and similar in both groups. 
INTERPRETATION Prednisone treatment for 7 days in patients with community-acquired pneumonia admitted to hospital shortens time to clinical stability without an increase in complications. This finding is relevant from a patient perspective and an important determinant of hospital costs and efficiency. FUNDING Swiss National Science Foundation, Viollier AG, Nora van Meeuwen Haefliger Stiftung, Julia und Gottfried Bangerter-Rhyner Stiftung.
Abstract:
The aim of this article is to describe the characteristics of postmortem forensic imaging, give an overview of several possible findings in postmortem imaging that are uncommon or new to clinical radiologists, and discuss possible pitfalls. Unspecific postmortem signs are listed, and specific signs typical of a particular cause of death are presented. Unspecific signs: Livor mortis may be seen not only from the outside but also inside the body, in the lungs: on chest CT, internal livor mortis appears as ground-glass opacity in the dependent lower lobes. The aortic wall is often hyperdense on postmortem CT due to wall contraction and loss of luminal pressure. Gas bubbles are very common postmortem due to systemic gas embolism after major open trauma, artificial respiration or initial decomposition; in particular, putrefaction produces gas bubbles throughout the body. Specific signs: Intracranial bleeding is hyperattenuating both in clinical radiology and in postmortem imaging. Signs of strangulation are hemorrhages in the soft tissue of the neck, such as the skin, subcutaneous tissue, platysma muscle and lymph nodes. The "vanishing" aorta is indicative of exsanguination. Fluid in the airways with mosaic lung densities and emphysema aquosum is typical of fresh-water drowning.
Abstract:
Blood aspiration is a significant forensic finding. In this study, we examined the value of postmortem computed tomography (CT) imaging in evaluating findings of blood aspiration. We selected 37 cases with autopsy evidence of blood in the lungs and/or in the airways previously submitted to total-body CT scanning. The CT images were retrospectively analyzed. In one case with pulmonary blood aspiration, biopsy specimens were obtained under CT guidance for histological examination. In six cases, CT detected pulmonary abnormalities suggestive of blood aspiration that were not mentioned in the autopsy reports. CT reconstructions provided additional data about the distribution and extent of aspiration. In the one needle-biopsied case, the pulmonary specimens showed blood in the alveoli. We suggest the use of CT imaging as a tool complementary to traditional techniques in cases of blood aspiration to avoid misdiagnosis, to guide the investigation of lung tissue, and to allow for more evidence-based inferences on the cause of death.
Abstract:
PURPOSE: This pilot study evaluated the wound healing and tissue response after placement of two different skin substitutes in subgingival mucosal pouches in rabbits. MATERIALS AND METHODS: Four rabbits were selected to receive a commercially available skin substitute consisting of a collagen matrix with fibroblasts and an epithelial layer (test membrane 1) and a prototype device consisting of a collagen matrix with fibroblasts only (test membrane 2). In each rabbit, two horizontal incisions were made in the buccal alveolar mucosa of the maxilla bilaterally to create submucosal pouches. Three pouches in each animal were filled with either the test 1 or test 2 membranes, and one pouch was left without a membrane (sham-operated control). All rabbits were sacrificed after a healing period of 4 weeks, and histologic samples were prepared and examined. RESULTS: After a healing period of 1 month, both tested membranes were still visible in the sections. Test membrane 1 was still bilayered, contained inflammatory cells in its center, and was encapsulated by a thick fibrous tissue. Numerous ectopic calcifications were evident in the collagenous part of the membrane and in association with some basal epithelial cells. Test membrane 2 was also encapsulated in fibrous tissue, with inflammatory cells present only between the fibrous encapsulation and the remnants of the membrane. For test membrane 2, no calcifications were visible. CONCLUSIONS: Test membrane 1 seemed to be more resistant to degradation, but it also provoked a more pronounced inflammatory reaction than test membrane 2, especially in the vicinity of the keratinocytes. The significance of the ectopic calcifications, along with that of the resorption or degradation processes of both tested membranes, must be evaluated in future experimental studies examining different time points after implantation.
Abstract:
Paul Grice distinguishes between natural meaning and non-natural meaning, where the first notion is especially connected to something’s being a natural sign and the second to communication. It is argued that some of the arguments against the distinction being exhaustive are based on a misinterpretation of Grice, but also that the distinction cannot be exhaustive if one takes into account both the criterion of factivity and the connection to communication. If one makes a distinction between natural and non-natural communication, then there are different types of natural communication to be distinguished: goal-directed communication, intentional communication and open intentional communication. Given the empirical evidence, the behavior of chimpanzees and of human infants may be described as goal-directed communication, but there are also important differences between the communicative behavior of the two.
Abstract:
Considerable efforts have been directed toward the identification of small-ruminant prion diseases, i.e., classical and atypical scrapie as well as bovine spongiform encephalopathy (BSE). Here we report the in-depth molecular analysis of the proteinase K-resistant prion protein core fragment (PrP(res)) in a highly scrapie-affected goat flock in Greece. The PrP(res) profile by Western immunoblotting in most animals was that of classical scrapie in sheep. However, in a series of clinically healthy goats we identified a unique C- and N-terminally truncated PrP(res) fragment, which is akin but not identical to that observed for atypical scrapie. These findings reveal novel aspects of the nature and diversity of the molecular PrP(res) phenotypes in goats and suggest that these animals display a previously unrecognized prion protein disorder.
Abstract:
A 7 mo old female English springer spaniel was presented with diarrhea, vomiting, apathy, and hyperthermia. Further examinations revealed generalized lymphadenomegaly consistent with sterile neutrophilic-macrophagic lymphadenitis and pulmonary involvement. Subcutaneous nodules developed one day after presentation. Histology was consistent with sterile idiopathic nodular panniculitis and vasculitis. No infectious organism was isolated. The dog responded to prednisolone, but relapsed during medication tapering. Cyclosporine had to be added to control the disease. No further relapse had occurred 98 wk after the first presentation. This is an unusual presentation of a systemic sterile neutrophilic-macrophagic lymphadenitis with nodular panniculitis and vasculitis associated with gastrointestinal and pulmonary signs.
Abstract:
OBJECTIVE: To determine interobserver and intraobserver agreement for results of low-field magnetic resonance imaging (MRI) in dogs with and without disk-associated wobbler syndrome (DAWS). DESIGN: Validation study. ANIMALS: 21 dogs with and 23 dogs without clinical signs of DAWS. PROCEDURES: For each dog, MRI of the cervical vertebral column was performed. The MRI studies were presented in a randomized sequence to 4 board-certified radiologists blinded to clinical status. Observers assessed degree of disk degeneration, disk-associated and dorsal compression, alterations in intraspinal signal intensity (ISI), vertebral body abnormalities, and new bone formation and categorized each study as originating from a clinically affected or clinically normal dog. Interobserver agreement was calculated for 44 initial measurements for each observer. Intraobserver agreement was calculated for 11 replicate measurements for each observer. RESULTS: There was good interobserver agreement for ratings of disk degeneration and vertebral body abnormalities and moderate interobserver agreement for ratings of disk-associated compression, dorsal compression, alterations in ISI, new bone formation, and suspected clinical status. There was very good intraobserver agreement for ratings of disk degeneration, disk-associated compression, alterations in ISI, vertebral body abnormalities, and suspected clinical status. There was good intraobserver agreement for ratings of dorsal compression and new bone formation. Two of 21 clinically affected dogs were erroneously categorized as clinically normal, and 4 of 23 clinically normal dogs were erroneously categorized as clinically affected. CONCLUSIONS AND CLINICAL RELEVANCE: Results suggested that variability exists among observers with regard to results of MRI in dogs with DAWS and that MRI could lead to false-positive and false-negative assessments.