922 results for REFLEX MODIFICATION
Abstract:
BACKGROUND General practitioners (GPs) are in the best position to suspect dementia. The Mini-Mental State Examination (MMSE) and the Clock Drawing Test (CDT) are widely used, and additional neurological tests may increase the accuracy of diagnosis. We aimed to evaluate how well a Short Smell Test (SST) and the Palmo-Mental Reflex (PMR) detect dementia in patients whose MMSE and CDT are normal but who show signs of cognitive dysfunction. METHODS This was a 3.5-year cross-sectional observational study in the Memory Clinic of the University Department of Geriatrics in Bern, Switzerland. Participating patients had normal MMSE (>26 points) and CDT (>5 points) scores and had been referred by GPs who suspected dementia. All were examined according to a standardized protocol. Diagnosis of dementia was based on DSM-IV TR criteria. We then determined whether SST and PMR accurately detected dementia. RESULTS In our cohort, 154 patients suspected of dementia had normal MMSE and CDT test results. Of these, 17 (11%) were demented. If SST or PMR was abnormal, sensitivity was 71% (95% CI 44–90%) and specificity 64% (95% CI 55–72%) for detecting dementia. If both tests were abnormal, sensitivity was 24% (95% CI 7–50%), but specificity increased to 93% (95% CI 88–97%). CONCLUSION Patients suspected of dementia but with normal MMSE and CDT results may benefit if SST and PMR are added as diagnostic tools. If both SST and PMR are abnormal, this is a red flag to investigate these patients further, despite their negative neuropsychological screening results.
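As a plausibility check, the reported accuracy figures can be reproduced from an approximate 2×2 table; a minimal sketch in Python, with cell counts back-calculated from the published percentages (approximations, not the authors' raw data):

```python
# Back-of-the-envelope check of the reported accuracy figures.
# Cell counts are back-calculated from the published percentages
# (154 patients, 17 with dementia) and are approximations, not source data.

def diagnostic_stats(tp, fn, tn, fp):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table."""
    return (tp / (tp + fn),   # sensitivity
            tn / (tn + fp),   # specificity
            tp / (tp + fp),   # positive predictive value
            tn / (tn + fn))   # negative predictive value

# "SST or PMR abnormal": ~12/17 cases detected, ~88/137 controls negative
print(diagnostic_stats(tp=12, fn=5, tn=88, fp=49))   # ~(0.71, 0.64, 0.20, 0.95)

# "SST and PMR abnormal": ~4/17 detected, ~127/137 negative
print(diagnostic_stats(tp=4, fn=13, tn=127, fp=10))  # ~(0.24, 0.93, 0.29, 0.91)
```

The back-calculated predictive values also illustrate the authors' point: at 11% prevalence, the both-tests-abnormal rule trades sensitivity for a much lower false-positive burden.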
Abstract:
River bedload surveyed at 50 sites in Westland is dominated by Alpine Schist or Torlesse Greywacke from the Alpine Fault hanging wall, with subordinate Pounamu Ultramafics or footwall-derived Western Province rocks. Tumbling experiments found the ultramafics to have the lowest attrition rates, compared with greywacke sandstone and granite (which abrade to produce silt to medium sand) or incompetent schist (which fragments). The Arahura catchment has greater total concentrations (10³–10⁵ t/km²) and proportions (5–40%) of ultramafic bedload than the Hokitika and Taramakau catchments (10¹–10⁴ t/km², mostly <10%), matching the relative areas of mapped Pounamu Ultramafic bedrock but enriched relative to the absolute areal proportions. Western Province rocks downthrown by the Alpine Fault are under-represented in the bedload. Enriched concentrations of ultramafic bedload decrease rapidly with distance downstream from source-rock outcrops, changing near prominent ice-limit moraines. Bedload evolution with transport involves both downstream fining and dilution from tributaries, in a sediment supply regime strongly influenced by tectonics and the imprint of past glaciation. Treasured New Zealand pounamu (jade) is associated with ultramafic rocks. The chances of discovery vary between catchments, increase near glacial moraines, and are highest near source-rock outcrops in remote mountain headwaters.
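The rapid downstream decrease described here is the kind of pattern often summarized with a Sternberg-type exponential attrition model; the following is a minimal illustrative sketch under that assumption, with a decay coefficient chosen arbitrarily rather than fitted to these catchments:

```python
import math

def downstream_fraction(c0, k, x_km):
    """Sternberg-style exponential decay of a lithology's bedload share with
    transport distance, lumping attrition and tributary dilution into a single
    coefficient k (1/km). Illustrative only: k below is an assumed value,
    not one fitted to the Westland survey data."""
    return c0 * math.exp(-k * x_km)

# Example: 40% ultramafic bedload at the source outcrop, assumed k = 0.15/km
for x_km in (0, 5, 10, 20):
    print(x_km, "km:", round(downstream_fraction(40.0, 0.15, x_km), 1), "%")
```

Under these made-up parameters the ultramafic share falls from 40% to about 2% within 20 km, which is qualitatively the behaviour the survey reports near source outcrops.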
Raum – Perspektive – Medium 2: Wahrnehmung im Blick. reflex: Tübinger Kunstgeschichte zum Bildwissen
Abstract:
At the second workshop on "Raum – Perspektive – Medium", held in February 2009 at the Kunsthistorisches Institut of the University of Tübingen, two central theses crystallized, which the present volume examines from an interdisciplinary perspective: first, the question of the viewer's share in the image process, and second, an associated reassessment of the concept of the medium. reflex 2 now collects the texts that emerged from the workshop "Raum – Perspektive – Medium 2".
Abstract:
Video-oculography devices are now used to quantify the vestibulo-ocular reflex (VOR) at the bedside using the head impulse test (HIT). Little is known about the impact of disruptive phenomena (e.g. corrective saccades, nystagmus, fixation losses, eye-blink artifacts) on quantitative VOR assessment in acute vertigo. This study systematically characterized the frequency, nature, and impact of artifacts on HIT VOR measures. In a prospective study of 26 patients with acute vestibular syndrome (16 vestibular neuritis, 10 stroke), we classified findings using a structured coding manual. Of 1,358 individual HIT traces, 72% had abnormal disruptive saccades, 44% had at least one artifact, and 42% were uninterpretable. Physicians using quantitative recording devices to measure head impulse VOR responses for clinical diagnosis should be aware of the potential impact of disruptive eye movements and measurement artifacts.
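For readers unfamiliar with quantitative HIT metrics, VOR gain is conventionally the ratio of eye velocity to head velocity over the impulse; the sketch below shows one simplified way to compute it and to flag a grossly artifactual trace (the threshold and method are assumptions, not the study's coding manual):

```python
import numpy as np

def vor_gain(head_vel, eye_vel, blink_thresh=700.0):
    """VOR gain as the ratio of areas under the eye- and head-velocity traces.
    Sketch only: the blink threshold is an illustrative assumption, not a rule
    from the study's coding manual, and real pipelines must also de-saccade."""
    head_vel = np.asarray(head_vel, dtype=float)
    eye_vel = np.asarray(eye_vel, dtype=float)
    if np.max(np.abs(eye_vel)) > blink_thresh:  # crude artifact flag
        return None                              # treat trace as uninterpretable
    return np.trapz(np.abs(eye_vel)) / np.trapz(np.abs(head_vel))

# Synthetic example: compensatory eye movement at ~60% of head velocity
t = np.linspace(0.0, 0.15, 150)
head = 250.0 * np.sin(np.pi * t / 0.15)   # one impulse-like velocity profile
print(vor_gain(head, -0.6 * head))        # eye counter-rotates; gain ~0.6
```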
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training program dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation failed [1]. The main problem is that distinct ranges of heart rate are tied to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

Parameter | Class I | Class II | Class III | Class IV
Blood loss (ml) | up to 750 | 750–1500 | 1500–2000 | >2000
Blood loss (% blood volume) | up to 15% | 15–30% | 30–40% | >40%
Pulse rate (bpm) | <100 | 100–120 | 120–140 | >140
Systolic blood pressure | normal | normal | decreased | decreased
Pulse pressure | normal or increased | decreased | decreased | decreased
Respiratory rate | 14–20 | 20–30 | 30–40 | >35
Urine output (ml/h) | >30 | 20–30 | 5–15 | negligible
CNS/mental status | slightly anxious | mildly anxious | anxious, confused | confused, lethargic
Initial fluid replacement | crystalloid | crystalloid | crystalloid and blood | crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries of nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, the different parameters do not necessarily deteriorate in parallel, as the ATLS shock classification suggests [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient.

A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint of the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of Class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
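Read as a decision rule, Table 1 amounts to a set of vital-sign thresholds; the minimal sketch below makes that explicit before the discussion of the validation results continues (the field names and the "worst matching class wins" precedence are my assumptions — ATLS itself defines no algorithm, which is part of the validation problem discussed here):

```python
def atls_shock_class(pulse, sbp_decreased, resp_rate, urine_ml_h):
    """Rough mapping of vitals onto ATLS classes I-IV per Table 1.
    Sketch only: the field names and the 'worst matching class wins'
    precedence are assumptions -- ATLS defines no such algorithm, and the
    failed validations discussed here show real patients rarely fit one column."""
    cls = 1
    if pulse >= 100 or resp_rate > 20 or urine_ml_h <= 30:
        cls = 2
    if pulse > 120 or sbp_decreased or resp_rate > 30 or urine_ml_h < 20:
        cls = 3
    if pulse > 140 or resp_rate > 35 or urine_ml_h < 5:
        cls = 4
    return cls

# A patient with HR 130, falling systolic pressure, RR 32, oliguria -> class 3
print(atls_shock_class(pulse=130, sbp_decreased=True, resp_rate=32, urine_ml_h=10))
```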
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Parameter | Normal (no haemorrhage) | Class I (mild) | Class II (moderate) | Class III (severe) | Class IV (moribund)
Vitals | normal | normal | HR >100 with SBP >90 mmHg | SBP <90 mmHg | SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml) | n/a | yes, no further fluid required | yes, no further fluid required | requires repeated fluid boluses | declining SBP despite fluid boluses
Estimated blood loss (ml) | none | up to 750 | 750–1500 | 1500–2000 | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: decreased cardiac output, increased heart rate and decreased pulse pressure occur first; hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension), as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers, and it explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which has been used for decades and is the basis for a number of questions in the written test of the ATLS student course, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context.

Conflict of interest: None declared. The author is a member of the Swiss national ATLS core faculty.
Abstract:
New preventive approaches against dental erosion caused by acidic drinks and beverages include fortification of beverages with natural polymers. We have shown that a mixture of casein and mucin significantly improves the erosion-inhibiting properties of the human pellicle layer. This study aimed to investigate the effect of pellicle modification by casein, mucin and a casein-mucin mixture on the adhesion of early bacterial colonizers. Test specimens of human tooth enamel were prepared, covered with saliva and coated with 0.5% aqueous (aq.) casein, 0.27% aq. mucin or 0.5% aq. casein with 0.27% aq. mucin, after which the adhesion of Streptococcus gordonii, Streptococcus oralis, and Actinomyces odontolyticus was measured after incubation for 30 min and 2 h. Log10 colony-forming units (CFU) were compared by nonparametric tests. All three bacterial strains adhered in higher numbers to pellicle-coated enamel than to native enamel. All protein modifications of the pellicle decreased the counts of adhering bacteria, by up to 0.34 log10 CFU/mm², the casein-mucin mixture being the most efficient. In addition to the recently shown erosion-reducing effect of casein-mucin, this modification of the pellicle may also inhibit bacterial adherence compared with untreated human pellicle.
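For readers less used to log-scale counts, a reduction of 0.34 log10 units corresponds to roughly a 2.2-fold decrease in adherent colony-forming units, since the fold-change is 10 raised to the log difference:

```python
# Fold-change implied by a log10 reduction in adherent CFU: 10**delta
for delta_log10 in (0.1, 0.34, 1.0):
    print(delta_log10, "->", round(10 ** delta_log10, 2), "fold")
# 0.34 log10 CFU/mm^2 therefore corresponds to roughly a 2.2-fold reduction
```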
Abstract:
BACKGROUND AND OBJECTIVES: The biased interpretation of ambiguous social situations is considered a maintaining factor of Social Anxiety Disorder (SAD). Studies on the modification of interpretation bias have shown promising results in laboratory settings. The present study pilot-tests an Internet-based training that targets interpretation and judgmental bias. METHOD: Thirty-nine individuals meeting diagnostic criteria for SAD participated in an 8-week, unguided program. Participants were presented with ambiguous social situations, were asked to choose between neutral, positive, and negative interpretations, and were required to evaluate the costs of potential negative outcomes. Participants received elaborate automated feedback on their interpretations and judgments. RESULTS: There was a pre-to-post reduction of the targeted cognitive processing biases (d = 0.57-0.77) and of social anxiety symptoms (d = 0.87). Furthermore, results showed changes in depression and general psychopathology (d = 0.47-0.75). Decreases in cognitive biases and symptom changes did not correlate. The results remained stable when accounting for dropouts (26%) and over a 6-week follow-up period. Forty-five percent of the completer sample showed clinically significant change, and almost half of the participants (48%) no longer met diagnostic criteria for SAD. LIMITATIONS: As the study lacks a control group, the results lend only preliminary support to the efficacy of the intervention. Furthermore, the mechanism of change remains unclear. CONCLUSION: These first results suggest a beneficial effect of the program for SAD patients. The treatment proved feasible and acceptable. Future research should evaluate the intervention in a randomized controlled setting.
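The d values reported here are Cohen's d effect sizes; for pre-post designs one common convention divides the mean change by the pre-treatment standard deviation (the paper's exact standardizer is not stated here, so this choice is an assumption). A minimal sketch:

```python
import statistics

def cohens_d_pre_post(pre, post):
    """Pre-post Cohen's d, standardized by the pre-treatment SD
    (one common convention; the paper's exact formula is not given here)."""
    return (statistics.mean(pre) - statistics.mean(post)) / statistics.stdev(pre)

# Hypothetical symptom scores (illustrative numbers only, not study data)
pre =  [62, 58, 70, 66, 61, 73, 59, 68]
post = [51, 47, 60, 58, 50, 62, 49, 55]
print(round(cohens_d_pre_post(pre, post), 2))  # ~1.95 for these made-up values
```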