975 results for startle reflex
Abstract:
OBJECTIVE Vestibular neuritis is often mimicked by stroke (pseudoneuritis). Vestibular eye movements help discriminate the two conditions. We report vestibulo-ocular reflex (VOR) gain measures in neuritis and stroke presenting as acute vestibular syndrome (AVS). METHODS Prospective cross-sectional study of AVS (acute continuous vertigo/dizziness lasting >24 h) at two academic centers. We measured horizontal head impulse test (HIT) VOR gains in 26 AVS patients using a video HIT device (ICS Impulse). All patients were assessed within 1 week of symptom onset. Diagnoses were confirmed by clinical examinations, brain magnetic resonance imaging with diffusion-weighted images, and follow-up. Brainstem and cerebellar strokes were classified by vascular territory: posterior inferior cerebellar artery (PICA) or anterior inferior cerebellar artery (AICA). RESULTS Diagnoses were vestibular neuritis (n = 16) and posterior fossa stroke (PICA, n = 7; AICA, n = 3). Mean HIT VOR gains (ipsilesional [standard error of the mean], contralesional [standard error of the mean]) were as follows: vestibular neuritis (0.52 [0.04], 0.87 [0.04]); PICA stroke (0.94 [0.04], 0.93 [0.04]); AICA stroke (0.84 [0.10], 0.74 [0.10]). VOR gains were asymmetric in neuritis (unilateral vestibulopathy) and symmetric in PICA stroke (bilaterally normal VOR), whereas gains in AICA stroke were heterogeneous (asymmetric, bilaterally low, or normal). In vestibular neuritis, borderline gains ranged from 0.62 to 0.73. Twenty patients (12 neuritis, 6 PICA strokes, 2 AICA strokes) had at least five interpretable HIT trials for both ears, allowing an appropriate classification based on mean VOR gains per ear. Classifying AVS patients with bilateral VOR mean gains of 0.70 or more as suspected strokes yielded a total diagnostic accuracy of 90%, with stroke sensitivity of 88% and specificity of 92%. CONCLUSION Video HIT VOR gains differ between peripheral and central causes of AVS. PICA strokes were readily separated from neuritis using gain measures, but AICA strokes were at risk of being misclassified based on VOR gain alone.
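As a rough illustration of the decision rule reported in this abstract, the following minimal Python sketch (the function name and example values are assumptions of this write-up, not the authors' code) labels a patient as a suspected stroke when the mean video-HIT VOR gain is 0.70 or higher in both ears:

```python
# Minimal sketch of the bilateral VOR-gain rule described above: mean gain >= 0.70
# in both ears -> suspected stroke, otherwise suspected vestibular neuritis.
# Illustrative only; the threshold and labels follow the abstract, everything else is assumed.
def classify_avs(ipsilesional_gain: float, contralesional_gain: float) -> str:
    if min(ipsilesional_gain, contralesional_gain) >= 0.70:
        return "suspected stroke"
    return "suspected vestibular neuritis"

# Values close to the reported group means
print(classify_avs(0.52, 0.87))  # neuritis-like pattern
print(classify_avs(0.94, 0.93))  # PICA-stroke-like pattern
```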
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training program dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

Parameter | Class I | Class II | Class III | Class IV
Blood loss (ml) | Up to 750 | 750–1500 | 1500–2000 | >2000
Blood loss (% blood volume) | Up to 15% | 15–30% | 30–40% | >40%
Pulse rate (bpm) | <100 | 100–120 | 120–140 | >140
Systolic blood pressure | Normal | Normal | Decreased | Decreased
Pulse pressure | Normal or ↑ | Decreased | Decreased | Decreased
Respiratory rate | 14–20 | 20–30 | 30–40 | >35
Urine output (ml/h) | >30 | 20–30 | 5–15 | Negligible
CNS/mental status | Slightly anxious | Mildly anxious | Anxious, confused | Confused, lethargic
Initial fluid replacement | Crystalloid | Crystalloid | Crystalloid and blood | Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries of nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 bpm postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily run in parallel as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, requirement of any blood transfusion, or >2000 ml of crystalloid. Because of the low prevalence of Class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
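For illustration only, the Table 1 blood-loss ranges can be read as a simple lookup; the minimal Python sketch below is an assumption of this write-up, not part of ATLS or of the cited studies, and the editorial's point is precisely that such ranges have not held up well in validation:

```python
# Minimal sketch: map estimated blood loss (% of blood volume) to the ATLS class
# whose Table 1 range contains it. Purely illustrative.
def atls_class_from_blood_loss(percent_blood_volume_lost: float) -> str:
    if percent_blood_volume_lost <= 15:
        return "Class I"
    if percent_blood_volume_lost <= 30:
        return "Class II"
    if percent_blood_volume_lost <= 40:
        return "Class III"
    return "Class IV"

print(atls_class_from_blood_loss(20))  # -> Class II
```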
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Parameter | Normal (no haemorrhage) | Class I (mild) | Class II (moderate) | Class III (severe) | Class IV (moribund)
Vitals | Normal | Normal | HR >100 with SBP >90 mmHg | SBP <90 mmHg | SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml) | NA | Yes, no further fluid required | Yes, no further fluid required | Requires repeated fluid boluses | Declining SBP despite fluid boluses
Estimated blood loss (ml) | None | Up to 750 | 750–1500 | 1500–2000 | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: decrease of cardiac output, increase of heart rate and decrease of pulse pressure occur first, while hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern but will also continue to mention the exceptions and pitfalls at a second stage. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which has been used for decades and is the basis for a number of questions in the written test of the ATLS student course, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: None declared. Member of the Swiss national ATLS core faculty.
Abstract:
Hypersensitivity of pain pathways is considered a relevant determinant of symptoms in chronic pain patients, but data on its prevalence are very limited. To our knowledge, no data on the prevalence of spinal nociceptive hypersensitivity are available. We studied the prevalence of pain hypersensitivity and spinal nociceptive hypersensitivity in 961 consecutive patients with various chronic pain conditions. Pain threshold and nociceptive withdrawal reflex threshold to electrical stimulation were used to assess pain hypersensitivity and spinal nociceptive hypersensitivity, respectively. Using the 10th percentile cutoff of previously determined reference values, the prevalence of pain hypersensitivity and spinal nociceptive hypersensitivity (95% confidence interval) was 71.2% (68.3-74.0) and 80.0% (77.0-82.6), respectively. As a secondary aim, we analyzed demographic, psychosocial, and clinical characteristics as factors potentially associated with pain hypersensitivity and spinal nociceptive hypersensitivity using logistic regression models. Both hypersensitivity parameters were unaffected by most of the factors analyzed. Depression, catastrophizing, pain-related sleep interference, and average pain intensity were significantly associated with hypersensitivity. However, none of them was significant in both the unadjusted and adjusted analyses. Furthermore, the odds ratios were very low, indicating a modest quantitative impact. To our knowledge, this is the largest prevalence study on central hypersensitivity and the first on the prevalence of spinal nociceptive hypersensitivity in chronic pain patients. The results revealed an impressively high prevalence, supporting a high clinical relevance of this phenomenon. Electrical pain thresholds and the nociceptive withdrawal reflex explore aspects of pain processing that are mostly independent of sociodemographic, psychological, and clinical pain-related characteristics.
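For illustration, a prevalence of this kind is simply the fraction of patients whose threshold falls below the 10th-percentile reference cutoff; the sketch below shows one way to compute it (the study's exact confidence-interval method is not stated, so the Wilson interval, the cutoff and the data are assumptions):

```python
# Sketch: prevalence of hypersensitivity as the fraction of thresholds below a
# reference cutoff, with a 95% Wilson confidence interval. All values illustrative.
import math

def prevalence_with_ci(thresholds, cutoff, z=1.96):
    n = len(thresholds)
    k = sum(1 for t in thresholds if t < cutoff)   # hypersensitive: threshold below cutoff
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return p, centre - half, centre + half

print(prevalence_with_ci([2.1, 3.5, 1.8, 4.2, 2.0, 1.5], cutoff=3.0))
```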
Abstract:
PURPOSE Recent advances in optogenetics and gene therapy have led to promising new treatment strategies for blindness caused by retinal photoreceptor loss. Preclinical studies often rely on the retinal degeneration 1 (rd1 or Pde6b(rd1)) retinitis pigmentosa (RP) mouse model. The rd1 founder mutation is present in more than 100 actively used mouse lines. Since secondary genetic traits are well known to modify the phenotypic progression of photoreceptor degeneration in animal models and in human patients with RP, neglecting the genetic background in the rd1 mouse model is unwarranted. Moreover, the success of various potential therapies, including optogenetic gene therapy and prosthetic implants, depends on the progress of retinal degeneration, which might differ between rd1 lines. To examine the possibility of variable phenotypic expressivity in the rd1 mouse model, we compared the progress of retinal degeneration in two common rd1 lines, C3H/HeOu and FVB/N. METHODS We followed retinal degeneration over 24 weeks in FVB/N, C3H/HeOu, and congenic Pde6b(+) seeing mouse lines, using a range of experimental techniques including extracellular recordings from retinal ganglion cells, PCR quantification of cone opsin and Pde6b transcripts, in vivo flash electroretinogram (ERG), and behavioral optokinetic reflex (OKR) recordings. RESULTS We demonstrated a substantial difference in the speed of retinal degeneration and the accompanying loss of visual function between the two rd1 lines. Photoreceptor degeneration and loss of vision were faster, with an earlier onset, in the FVB/N mice compared to C3H/HeOu mice, whereas the performance of the Pde6b(+) mice did not differ significantly in any of the tests. By postnatal week 4, the FVB/N mice expressed significantly less cone opsin and Pde6b mRNA and had neither ERG nor OKR responses. At 12 weeks of age, the retinal ganglion cells of the FVB/N mice had lost all light responses. In contrast, 4-week-old C3H/HeOu mice still had ERG and OKR responses, and we still recorded light responses from C3H/HeOu retinal ganglion cells until the age of 24 weeks. These results show that genetic background plays an important role in rd1 pathology. CONCLUSIONS Analogous to human RP, the mouse genetic background strongly influences the rd1 phenotype. Thus, different rd1 mouse lines may follow different timelines of retinal degeneration, making exact knowledge of the genetic background imperative in all studies that use rd1 models.
Abstract:
PURPOSE Stress urinary incontinence (SUI) affects women of all ages, including young athletes, especially those involved in high-impact sports. To date, hardly any studies are available that test pelvic floor muscles (PFM) during sports activities. The aim of this study was to describe and test the reliability of six PFM electromyography (EMG) variables at three different running speeds. The secondary objective was to evaluate whether there was a speed-dependent difference between the PFM activity variables. METHODS This trial was designed as an exploratory and reliability study including ten young healthy female subjects to characterize PFM pre-activity and reflex activity during running at 7, 9 and 11 km/h. Six variables for each running speed, averaged over ten steps per subject, were presented descriptively and tested regarding their reliability (Friedman, ICC, SEM, MD) and speed difference (Friedman). RESULTS PFM EMG variables varied between 67.6 and 106.1 %EMG, showed no systematic error, and SEM and MD were low using the single-value model. Applying the average model over ten steps, ICC(3,k) values were >0.75, and SEM and MD were about 50% lower than for the single-value model. Activity was highest at 11 km/h. CONCLUSION EMG variables showed excellent ICC and very low SEM and MD. Further studies should investigate inter-session reliability and the PFM reactivity patterns of SUI patients, using the average over ten steps for each variable, as this showed very high ICC and very low SEM and MD. Subsequently, longer running distances and other high-impact sports disciplines could be studied.
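The abstract does not spell out how SEM and MD were obtained; a common convention, shown in the hedged sketch below (formulas and numbers are assumptions of this write-up, not the authors' analysis), derives both from the ICC and the sample standard deviation:

```python
# Sketch: standard error of measurement (SEM) and minimal difference (MD) derived
# from an ICC using the usual conventions SEM = SD * sqrt(1 - ICC) and
# MD = 1.96 * sqrt(2) * SEM. Values are made up for illustration.
import math
import statistics

def sem_and_md(values, icc):
    sd = statistics.stdev(values)
    sem = sd * math.sqrt(1.0 - icc)
    md = 1.96 * math.sqrt(2.0) * sem
    return sem, md

print(sem_and_md([67.6, 82.3, 95.0, 106.1, 88.4], icc=0.80))  # illustrative %EMG values
```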
Abstract:
Western honey bees (Apis mellifera) face an increasing number of challenges that in recent years have led to significant economic effects on apiculture, with attendant consequences for agriculture. Nosemosis is a fungal infection of honey bees caused by either Nosema apis or N. ceranae. The putative greater virulence of N. ceranae has spurred interest in understanding how it differs from N. apis. Little is known of effects of N. apis or N. ceranae on honey bee learning and memory. Following a Pavlovian model that relies on the proboscis extension reflex, we compared acquisition learning and long-term memory recall of uninfected (control) honey bees versus those inoculated with N. apis, N. ceranae, or both. We also tested whether spore intensity was associated with variation in learning and memory. Neither learning nor memory differed among treatments. There was no evidence of a relationship between spore intensity and learning, and only limited evidence of a negative effect on memory; this occurred only in the co-inoculation treatment. Our results suggest that if Nosema spp. are contributing to unusually high colony losses in recent years, the mechanism by which they may affect honey bees is probably not related to effects on learning or memory, at least as assessed by the proboscis extension reflex.
Abstract:
Transforming growth factor beta-1 (TGF-β1) is a cytokine and neurotrophic factor whose neuromodulatory effects in Aplysia californica were recently described. Previous results demonstrated that TGF-β1 induces long-term increases in the efficacy of sensorimotor synapses, a neural correlate of sensitization of the defensive tail withdrawal reflex. These results provided the first evidence that a neurotrophic factor regulates neuronal plasticity associated with a simple form of learning in Aplysia, and raised many questions regarding the nature of the modulation. No homologs of TGF-β had previously been identified in Aplysia, and thus, it was not known whether components of TGF-β1 signaling pathways were present in Aplysia. Furthermore, the signaling mechanisms engaged by TGF-β1 had not been identified, and it was not known whether TGF-β1 regulated other aspects of neuronal function. The present investigation into the actions of TGF-β1 was initiated by examining the distribution of the type II TGF-β1 receptor, the ligand-binding receptor. The receptor was widely distributed in the CNS, and most neurons exhibited somatic and neuritic immunoreactivity. In addition, the ability of TGF-β1 to activate the cAMP/PKA and MAPK pathways, known to regulate several important aspects of neuronal function, was examined. TGF-β1 acutely decreased cAMP levels in sensory neurons, activated MAPK and triggered translocation of MAPK to the nucleus. MAPK activation was critical for both short- and long-term regulation of neuronal function by TGF-β1. TGF-β1 acutely decreased synaptic depression induced by low-frequency stimuli in a MAPK-dependent manner. This regulation may result, at least in part, from the modulation of synapsin, a major peripheral synaptic vesicle protein. TGF-β1 stimulated MAPK-dependent phosphorylation of synapsin, a process believed to regulate synaptic vesicle mobilization from reserve to readily releasable pools of neurotransmitter. In addition to its acute effect on synaptic efficacy, TGF-β1 also induced long-term increases in sensory neuron excitability. Whereas transient exposure to TGF-β1 was not sufficient to drive short- or long-term changes in excitability, prolonged exposure to TGF-β1 induced long-term changes in excitability that depended on MAPK. The results of these studies represent significant progress toward an understanding of the role of TGF-β1 in neuronal plasticity.
Abstract:
Chronic respiratory illnesses are a significant cause of morbidity and mortality, and acute changes in respiratory function often lead to hospitalization. Air pollution is known to exacerbate asthma, but the molecular mechanisms of this are poorly understood. The current studies were aimed at clarifying the roles of nerve subtypes and purinergic receptors in respiratory reflex responses following exposure to irritants. In C57Bl/6J female mice, inspired adenosine produced sensory irritation, shown to be mediated mostly by A-delta fibers. Secondly, the response to inhaled acetic acid was found to be dually influenced by C and A-delta fibers, as indicated by the observed effects of capsaicin pretreatment, which selectively destroys TRPV1-expressing fibers (mostly C fibers), and of pretreatment with theophylline, a nonselective adenosine receptor antagonist. The responses to both adenosine and acetic acid were enhanced in the ovalbumin-allergic airway disease model, although the particular pathway altered is still unknown.
Abstract:
A study was conducted to empirically determine the degradation of survey-grade GPS horizontal position measurements due to the effects of broadleaf forest canopies. The measurements were taken using GPS/GLONASS-capable receivers measuring C/A and P-codes, and carrier phase. Fourteen survey markers were chosen in central Connecticut to serve as reference markers for the study. These markers had varying degrees of sky obstruction due to overhanging tree canopies. Sky obstruction was measured by photographing the sky with a 35 mm reflex camera fitted with a hemispherical lens. The negative was scanned and the image mapped using an equal-area projection to remove the distortion caused by the lens. The resulting digital image was thresholded to produce a black-and-white image in which the count of black pixels is a measure of sky-area obstruction. The locations of the markers were determined independently before the study. During the study, each marker was occupied for four 20-minute sessions over the period of one week in mid-July 1999. The locations of the study markers produced relatively long baselines compared with similar studies. We compared the accuracy of GPS-only vs. GPS&GLONASS positioning as a function of sky obstruction. Based on our results, GLONASS observations neither improved nor degraded the accuracy of the position measurements. There is a loss of 2 mm of accuracy per percent of sky obstruction for both GPS-only and GPS&GLONASS.
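The black-pixel count described above can be turned into an obstruction percentage with a simple threshold; the sketch below illustrates the idea together with the reported 2 mm-per-percent accuracy loss (the file name, threshold and library choice are assumptions, and the equal-area projection step is omitted):

```python
# Sketch: estimate sky obstruction as the fraction of dark pixels in a thresholded
# hemispherical sky photograph, then apply the reported ~2 mm accuracy loss per
# percent of obstruction. Illustrative only; the projection correction is omitted.
import numpy as np
from PIL import Image

def sky_obstruction_percent(path: str, threshold: int = 128) -> float:
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.uint8)
    return 100.0 * float((gray < threshold).mean())   # dark pixels = canopy

obstruction = sky_obstruction_percent("sky_photo.png")
print(f"obstruction {obstruction:.1f}%, expected accuracy loss {2.0 * obstruction:.0f} mm")
```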
Abstract:
Objective. Loud noises in neonatal intensive care units (NICUs) may impede growth and development for extremely low birthweight (ELBW, <1000 grams) newborns. The objective of this study was to measure the association between NICU sound levels and ELBW neonates' arterial blood pressure to determine whether these newborns experience noise-induced stress. Methods. Noise and arterial blood pressure recordings were collected for 9 ELBW neonates during the first week of life. Sound levels were measured inside the incubator, and each subject's arterial blood pressures were simultaneously recorded for 15 minutes (at 1 sec intervals). Time series cross-correlation functions were calculated for NICU noise and mean arterial blood pressure (MABP) recordings for each subject. The grand mean noise-MABP cross-correlation was calculated for all subjects and for lower and higher birthweight groups for comparison. Results. The grand mean noise-MABP cross-correlation for all subjects was mostly negative (through 300 sec lag time) and nearly reached significance at the 95% level at 111 sec lag (mean r = -0.062). Lower birthweight newborns (454-709 g) experienced significant decreases in blood pressure with increasing NICU noise after 145 sec lag (peak r = -0.074). Higher birthweight newborns had an immediate negative correlation with NICU sound levels (at 3 sec lag, r = -0.071), but arterial blood pressures increased to a positive correlation with noise levels at 197 sec lag (r = 0.075). Conclusions. ELBW newborns' arterial blood pressure was influenced by NICU noise levels during the first week of life. Lower birthweight newborns may have experienced an orienting reflex to NICU sounds. Higher birthweight newborns experienced an immediate orienting reflex to increasing sound levels, but arterial blood pressure increased approximately 3 minutes after increases in noise levels. Increases in arterial blood pressure following increased NICU sound levels may result from a stress response to noise.
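The lagged noise-MABP correlations reported above can be illustrated with a simple Pearson correlation of the 1 Hz series at a chosen lag; in the sketch below the data are synthetic and the names are assumptions of this write-up, not the study's recordings:

```python
# Sketch: cross-correlation of NICU sound level with mean arterial blood pressure
# (MABP) at a given lag, for series sampled once per second. Toy data only.
import numpy as np

def lagged_cross_correlation(noise, mabp, lag_sec):
    if lag_sec > 0:
        noise, mabp = noise[:-lag_sec], mabp[lag_sec:]   # correlate noise[t] with mabp[t+lag]
    return float(np.corrcoef(noise, mabp)[0, 1])

rng = np.random.default_rng(0)
noise = rng.normal(55, 5, 900)   # 15 min of sound levels (dB), 1-s samples
mabp = rng.normal(35, 3, 900)    # 15 min of MABP (mmHg), 1-s samples
print([round(lagged_cross_correlation(noise, mabp, lag), 3) for lag in (3, 111, 197)])
```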
Abstract:
The present work examines the role of cAMP in the induction of the type of long-term morphological changes that have been shown to be correlated with long-term sensitization in Aplysia. To examine this issue, cAMP was injected into individual tail sensory neurons in the pleural ganglion to mimic, at the single-cell level, the effects of behavioral training. After a 22 hr incubation period, the same cells were filled with horseradish peroxidase, and 2 hours later the tissue was fixed and processed. Morphological analysis revealed that cAMP induced an increase in two morphological features of the neurons, varicosities and branch points. These structural alterations, which are similar to those seen in siphon sensory neurons of the abdominal ganglion following long-term sensitization training of the siphon-gill withdrawal reflex, could subserve the altered behavioral response of the animal. These results expose another role played by cAMP in the induction of learning: the initiation of a structural substrate which, in concert with other correlates, underlies learning. cAMP was injected into sensory neurons in the presence of the reversible protein synthesis inhibitor anisomycin. The presence of anisomycin during and immediately following the nucleotide injection completely blocked the structural remodeling. These results indicate that the induction of morphological changes by cAMP is a process dependent on protein synthesis. To further examine the temporal requirement for protein synthesis in the induction of these changes, the time of anisomycin exposure was varied. The results indicate that the cellular processes triggered by cAMP are sensitive to the inhibition of protein synthesis for at least 7 hours after the nucleotide injection. This is a longer period of sensitivity than that for the induction of another correlate of long-term sensitization, facilitation of the sensory-to-motor neuron synaptic connection. Thus, these findings demonstrate that the period of sensitivity to protein synthesis inhibition is not identical for all correlates of learning. In addition, since the induction of the morphological changes can be blocked by anisomycin pulses administered at different times during and following the cAMP injection, this suggests that cAMP is triggering a cascade of protein synthesis, with successive rounds of synthesis being dependent on successful completion of preceding rounds. Inhibition at any time during this cascade can block the entire process and so prevent the development of the structural changes. The extent to which cAMP can mimic the structural remodeling induced by long-term training was also examined. Animals were subjected to unilateral sensitization training and the morphology of the sensory neurons was examined twenty-four hours later. Both cAMP injection and long-term training produced a twofold increase in varicosities and approximately a fifty percent increase in the number of branch points in the sensory neuron arborization within the pleural ganglion. (Abstract shortened by UMI.)