982 results for baroreceptor reflex
Abstract:
OBJECTIVE To determine a dexmedetomidine concentration, to be added to an alfaxalone-based bath solution, that will enhance the anaesthetic and analgesic effects of alfaxalone, and to compare the quality of anaesthesia and analgesia provided by immersion in either alfaxalone alone or alfaxalone with dexmedetomidine in oriental fire-bellied toads (Bombina orientalis). STUDY DESIGN Pilot study followed by a prospective, randomized, experimental trial. ANIMALS Fourteen oriental fire-bellied toads. METHODS The pilot study aimed to identify a useful dexmedetomidine concentration to be added to an anaesthetic bath containing 20 mg/100 mL alfaxalone. Thereafter, the toads were assigned to one of two groups, each comprising eight animals, to be administered either alfaxalone (group A) or alfaxalone-dexmedetomidine (group AD). After immersion for 20 minutes, the toads were removed from the anaesthetic bath, and the righting, myotactic and nociceptive reflexes, cardiopulmonary variables and von Frey filament thresholds were measured at 5-minute intervals and compared statistically between groups. Side effects and complications were noted and recorded. RESULTS In the pilot study, a dexmedetomidine concentration of 0.3 mg/100 mL added to the alfaxalone-based solution resulted in surgical anaesthesia. The toads in group AD showed higher von Frey thresholds and lower nociceptive withdrawal reflex scores than those in group A. However, in group AD, surgical anaesthesia was observed in only two of eight toads, and induction of anaesthesia was achieved in only 50% of the animals, compared with 100% of the toads in group A. CONCLUSIONS AND CLINICAL RELEVANCE The addition of dexmedetomidine to an alfaxalone-based solution for immersion anaesthesia provided some analgesia in oriental fire-bellied toads but failed to potentiate the level of unconsciousness and appeared to lighten the depth of anaesthesia. This limitation renders the combination unsuitable for anaesthetizing oriental fire-bellied toads for invasive procedures.
Abstract:
Oriental fire-bellied toads (Bombina orientalis) are small semi-aquatic anurans popular as both pets and laboratory animals. Although they are commonly anaesthetized to undergo clinical and experimental procedures, very little is known about their anaesthetic management. The aims of this prospective, randomized, cross-over experimental trial were to establish effective butorphanol and morphine concentrations to be added to alfaxalone for immersion anaesthesia (pilot study), and to compare the anaesthetic and antinociceptive effects of the two drug mixtures (alfaxalone-butorphanol and alfaxalone-morphine) in Bombina orientalis toads. For the main trial, the toads were randomly assigned to one of two treatment groups, AB and AM, with seven animals in each group, which received the alfaxalone-butorphanol and alfaxalone-morphine combinations, respectively, at the concentrations established during the pilot study. Heart rate, respiratory rate, von Frey filament threshold, and nociceptive withdrawal (NWR), righting and myotactic reflexes were measured at 5 min intervals until return of the righting reflex was observed. The investigator who carried out all the measurements was blinded to the treatment. Any undesired effects or complications were noted and recorded. The two treatments were found to be comparable in terms of onset and duration of anaesthesia and occurrence of undesired effects. However, group AM showed lower NWR scores and higher von Frey filament thresholds than group AB. It is concluded that, at the investigated concentrations and in combination with alfaxalone by immersion, morphine provides better antinociception than butorphanol in oriental fire-bellied toads.
Abstract:
Despite the close interrelation between vestibular and visual processing (e.g., the vestibulo-ocular reflex), surprisingly little is known about vestibular function in visually impaired people. In this study, we investigated thresholds of passive whole-body motion discrimination (leftward vs. rightward) in nine visually impaired participants and nine age-matched sighted controls. Participants were rotated in yaw, tilted in roll, and translated along the interaural axis at two different frequencies (0.33 and 2 Hz) by means of a motion platform. Superior performance of the visually impaired participants was found in the 0.33 Hz roll tilt condition; no differences were observed in the other motion conditions. Roll tilts stimulate the semicircular canals and otoliths simultaneously. The results could thus reflect a specific improvement in canal–otolith integration in the visually impaired and are consistent with the compensatory hypothesis, which implies that the visually impaired are able to compensate for the absence of visual input.
Abstract:
The aim of this study was to test the effects of a sustained nystagmus on the head impulse response of the vestibulo-ocular reflex (VOR) in healthy subjects. VOR gain (slow-phase eye velocity/head velocity) was measured using video head impulse test goggles. Acting as a surrogate for a spontaneous nystagmus (SN), a post-rotatory nystagmus (PRN) was elicited after a sustained, constant-velocity rotation, and head impulses were then applied. 'Raw' VOR gain, uncorrected for PRN, in response to head impulses with peak velocities in the range of 150-250°/s was significantly increased (as reflected in an increase in the slope of the gain versus head velocity relationship) after inducing PRN with high-intensity slow phases (>30°/s) in the same, but not the opposite, direction as the slow-phase response induced by the head impulses. The VOR gain values themselves, however, remained in the normal range when the slow-phase velocity of PRN was <30°/s. Finally, quick phases of PRN were suppressed during the first 20-160 ms of a head impulse; the time frame of suppression depended on the direction of PRN but not on the duration of the head impulse. Our results in normal subjects suggest that VOR gains measured using head impulses may have to be corrected for any superimposed SN when the slow-phase velocity of nystagmus is relatively high and the peak velocity of the head movements is relatively low. The suppression of quick phases during head impulses may help to improve steady fixation during rapid head movements.
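For illustration, the gain definition used above can be written out directly; the subtraction term for a superimposed nystagmus is an assumed, simplified correction for illustration only, not the authors' published method (Python):

```python
# Illustrative sketch only. VOR gain is defined above as slow-phase eye
# velocity divided by head velocity; removing a superimposed spontaneous /
# post-rotatory nystagmus slow-phase component is an assumption here,
# not the authors' published formula.

def vor_gain(eye_slow_phase_vel: float, head_vel: float,
             nystagmus_slow_phase_vel: float = 0.0) -> float:
    """Gain = (slow-phase eye velocity - nystagmus component) / head velocity."""
    return (eye_slow_phase_vel - nystagmus_slow_phase_vel) / head_vel

raw = vor_gain(180.0, 200.0)              # 'raw' gain: 0.90
corrected = vor_gain(180.0, 200.0, 35.0)  # ~0.73 after removing a 35 deg/s slow phase
```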
Abstract:
The head impulse test (HIT) can identify a deficient vestibulo-ocular reflex (VOR) by the compensatory saccade (CS) generated once the head stops moving. The inward HIT is considered safer than the outward HIT, yet it might have an oculomotor advantage, given that the subject presumably knows the direction of head rotation. Here, we compare CS latencies following inward (presumed predictable) and outward (more unpredictable) HITs after acute unilateral vestibular nerve deafferentation. Seven patients received inward and outward HITs on six consecutive postoperative days (POD) and again on POD 30. All head impulses were recorded by portable video-oculography. CS included those occurring during (covert) or after (overt) head rotation. Inward HITs yielded mean CS latencies (183.48 ± 4.47 ms SE) that were consistently shorter than those generated during outward HITs over the first six POD (p = 0.0033). Inward HITs also induced more covert saccades than outward HITs acutely. However, by POD 30 there were no longer any differences in latencies or proportions of CS by direction of head rotation. Patients with acute unilateral vestibular loss likely use predictive cues about head direction to elicit early CS that keep the image centered on the fovea. In acute vestibular hypofunction, inwardly applied HITs may risk a preponderance of covert saccades, yet this difference largely disappears within 30 days. Advantages of inwardly applied HITs are discussed and must be balanced against the risk of a false-negative HIT interpretation.
Abstract:
OBJECTIVE Vestibular neuritis is often mimicked by stroke (pseudoneuritis). Vestibular eye movements help discriminate the two conditions. We report vestibulo-ocular reflex (VOR) gain measures in neuritis and stroke presenting as acute vestibular syndrome (AVS). METHODS Prospective cross-sectional study of AVS (acute continuous vertigo/dizziness lasting >24 h) at two academic centers. We measured horizontal head impulse test (HIT) VOR gains in 26 AVS patients using a video HIT device (ICS Impulse). All patients were assessed within 1 week of symptom onset. Diagnoses were confirmed by clinical examination, brain magnetic resonance imaging with diffusion-weighted images, and follow-up. Brainstem and cerebellar strokes were classified by vascular territory: posterior inferior cerebellar artery (PICA) or anterior inferior cerebellar artery (AICA). RESULTS Diagnoses were vestibular neuritis (n = 16) and posterior fossa stroke (PICA, n = 7; AICA, n = 3). Mean HIT VOR gains (ipsilesional [standard error of the mean], contralesional [standard error of the mean]) were as follows: vestibular neuritis, 0.52 [0.04], 0.87 [0.04]; PICA stroke, 0.94 [0.04], 0.93 [0.04]; AICA stroke, 0.84 [0.10], 0.74 [0.10]. VOR gains were asymmetric in neuritis (unilateral vestibulopathy) and symmetric in PICA stroke (bilaterally normal VOR), whereas gains in AICA stroke were heterogeneous (asymmetric, bilaterally low, or normal). In vestibular neuritis, borderline gains ranged from 0.62 to 0.73. Twenty patients (12 neuritis, six PICA strokes, two AICA strokes) had at least five interpretable HIT trials (for both ears), allowing an appropriate classification based on mean VOR gains per ear. Classifying AVS patients with bilateral VOR mean gains of 0.70 or more as suspected strokes yielded a total diagnostic accuracy of 90%, with stroke sensitivity of 88% and specificity of 92%. CONCLUSION Video HIT VOR gains differ between peripheral and central causes of AVS. PICA strokes were readily separated from neuritis using gain measures, but AICA strokes were at risk of being misclassified based on VOR gain alone.
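The reported 0.70 bilateral cutoff can be expressed as a short decision rule; the sketch below encodes only the published threshold and is not the authors' software (function and variable names are illustrative):

```python
# Sketch of the decision rule reported above: flag an AVS patient as a
# suspected stroke when the mean video-HIT VOR gain is >= 0.70 in BOTH
# ears; otherwise the pattern is more consistent with vestibular neuritis.
# Note the abstract's caveat: AICA strokes may be misclassified this way.

def classify_avs(mean_gain_left_ear: float, mean_gain_right_ear: float) -> str:
    if mean_gain_left_ear >= 0.70 and mean_gain_right_ear >= 0.70:
        return "suspected stroke"           # e.g. the symmetric PICA pattern
    return "suspected vestibular neuritis"  # unilateral low gain

print(classify_avs(0.52, 0.87))  # neuritis-like asymmetry -> suspected vestibular neuritis
print(classify_avs(0.94, 0.93))  # bilaterally normal gains -> suspected stroke
```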
Abstract:
Treatment-resistant hypertension (TRH) affects between 3 and 30% of hypertensive patients, and its presence is associated with increased cardiovascular morbidity and mortality. Until recently, interest in these patients has been limited, because providing care for them is difficult and often frustrating. However, the arrival of new treatment options [i.e. catheter-based renal denervation (RDN) and baroreceptor stimulation] has revitalized interest in this topic. The very promising results of the initial uncontrolled studies on the blood pressure (BP)-lowering effect of RDN in TRH seemed to suggest that this intervention might represent an easy solution for a complex problem. Subsequently, however, data from controlled studies have tempered the enthusiasm of the medical community (and the industry). Conversely, these new studies emphasized some salient aspects of this topic: (i) the key role of 24 h ambulatory BP and arterial stiffness measurement to identify 'true' resistant patients; (ii) the high prevalence of secondary hypertension in this population; and (iii) the difficulty of identifying those patients who may profit from device-based interventions. Accordingly, for patients with documented TRH, the guidelines suggest referral to a hypertension specialist/centre for an adequate work-up and treatment strategy. The aim of this review is to provide guidance for the cardiologist on how to identify patients with TRH and elucidate the prevailing underlying pathophysiological mechanism(s), to define a strategy for identifying patients with TRH who may benefit from device-based interventions and discuss the results and limitations of these interventions, and finally to briefly summarize the different drug-based treatment strategies.
Abstract:
The ATLS program by the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                             Class I           Class II         Class III              Class IV
Blood loss (ml)              Up to 750         750–1500         1500–2000              >2000
Blood loss (% blood volume)  Up to 15%         15–30%           30–40%                 >40%
Pulse rate (bpm)             <100              100–120          120–140                >140
Systolic blood pressure      Normal            Normal           Decreased              Decreased
Pulse pressure               Normal or ↑       Decreased        Decreased              Decreased
Respiratory rate             14–20             20–30            30–40                  >35
Urine output (ml/h)          >30               20–30            5–15                   Negligible
CNS/mental status            Slightly anxious  Mildly anxious   Anxious, confused      Confused, lethargic
Initial fluid replacement    Crystalloid       Crystalloid      Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min−1 postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily occur in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation based on the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

                                   Normal (no haemorrhage)  Class I (mild)                  Class II (moderate)             Class III (severe)               Class IV (moribund)
Vitals                             Normal                   Normal                          HR >100 with SBP >90 mmHg       SBP <90 mmHg                     SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml)  NA                       Yes, no further fluid required  Yes, no further fluid required  Requires repeated fluid boluses  Declining SBP despite fluid boluses
Estimated blood loss (ml)          None                     Up to 750                       750–1500                        1500–2000                        >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the response of the organism to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension), as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers; it explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and has been used for decades, probably needs modification and cannot be literally applied in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
Abstract:
Hypersensitivity of pain pathways is considered a relevant determinant of symptoms in chronic pain patients, but data on its prevalence are very limited. To our knowledge, no data on the prevalence of spinal nociceptive hypersensitivity are available. We studied the prevalence of pain hypersensitivity and spinal nociceptive hypersensitivity in 961 consecutive patients with various chronic pain conditions. The pain threshold and the nociceptive withdrawal reflex threshold to electrical stimulation were used to assess pain hypersensitivity and spinal nociceptive hypersensitivity, respectively. Using the 10th percentile cutoff of previously determined reference values, the prevalence (95% confidence interval) of pain hypersensitivity and spinal nociceptive hypersensitivity was 71.2% (68.3-74.0) and 80.0% (77.0-82.6), respectively. As a secondary aim, we analyzed demographic, psychosocial, and clinical characteristics as factors potentially associated with pain hypersensitivity and spinal nociceptive hypersensitivity, using logistic regression models. Both hypersensitivity parameters were unaffected by most factors analyzed. Depression, catastrophizing, pain-related sleep interference, and average pain intensity were significantly associated with hypersensitivity. However, none of them was significant in both the unadjusted and adjusted analyses. Furthermore, the odds ratios were very low, indicating a modest quantitative impact. To our knowledge, this is the largest prevalence study on central hypersensitivity and the first on the prevalence of spinal nociceptive hypersensitivity in chronic pain patients. The results revealed an impressively high prevalence, supporting a high clinical relevance of this phenomenon. Electrical pain thresholds and the nociceptive withdrawal reflex explore aspects of pain processing that are mostly independent of sociodemographic, psychological, and clinical pain-related characteristics.
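As a minimal sketch of the cutoff logic described above (the abstract does not state the confidence-interval method; an exact Clopper-Pearson interval is assumed here):

```python
# Minimal sketch, assuming a Clopper-Pearson (exact binomial) interval;
# the study's actual CI method is not stated in the abstract. A patient
# counts as hypersensitive when their threshold falls below the 10th
# percentile of the reference values.
import numpy as np
from scipy import stats

def prevalence_below_cutoff(patient_thresholds, reference_values, pct=10):
    cutoff = np.percentile(reference_values, pct)             # reference 10th percentile
    k = int(np.sum(np.asarray(patient_thresholds) < cutoff))  # hypersensitive count
    n = len(patient_thresholds)
    ci = stats.binomtest(k, n).proportion_ci(confidence_level=0.95)
    return 100.0 * k / n, (100.0 * ci.low, 100.0 * ci.high)   # prevalence in %
```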
Abstract:
PURPOSE Recent advances in optogenetics and gene therapy have led to promising new treatment strategies for blindness caused by retinal photoreceptor loss. Preclinical studies often rely on the retinal degeneration 1 (rd1 or Pde6b(rd1)) retinitis pigmentosa (RP) mouse model. The rd1 founder mutation is present in more than 100 actively used mouse lines. Since secondary genetic traits are well known to modify the phenotypic progression of photoreceptor degeneration in animal models and in human patients with RP, neglecting the genetic background of the rd1 mouse model is unwarranted. Moreover, the success of various potential therapies, including optogenetic gene therapy and prosthetic implants, depends on the progress of retinal degeneration, which might differ among rd1 lines. To examine the scope for variable phenotypic expressivity in the rd1 mouse model, we compared the progress of retinal degeneration in two common rd1 lines, C3H/HeOu and FVB/N. METHODS We followed retinal degeneration over 24 weeks in FVB/N, C3H/HeOu, and congenic Pde6b(+) seeing mouse lines, using a range of experimental techniques including extracellular recordings from retinal ganglion cells, PCR quantification of cone opsin and Pde6b transcripts, in vivo flash electroretinogram (ERG), and behavioral optokinetic reflex (OKR) recordings. RESULTS We demonstrated a substantial difference in the speed of retinal degeneration and the accompanying loss of visual function between the two rd1 lines. Photoreceptor degeneration and loss of vision were faster, with an earlier onset, in the FVB/N mice compared to C3H/HeOu mice, whereas the performance of the Pde6b(+) mice did not differ significantly in any of the tests. By postnatal week 4, the FVB/N mice expressed significantly less cone opsin and Pde6b mRNA and had neither ERG nor OKR responses. At 12 weeks of age, the retinal ganglion cells of the FVB/N mice had lost all light responses. In contrast, 4-week-old C3H/HeOu mice still had ERG and OKR responses, and we still recorded light responses from C3H/HeOu retinal ganglion cells until the age of 24 weeks. These results show that genetic background plays an important role in rd1 pathology. CONCLUSIONS Analogous to human RP, the mouse genetic background strongly influences the rd1 phenotype. Thus, different rd1 mouse lines may follow different timelines of retinal degeneration, making exact knowledge of the genetic background imperative in all studies that use rd1 models.
Abstract:
PURPOSE Stress urinary incontinence (SUI) affects women of all ages, including young athletes, especially those involved in high-impact sports. To date, hardly any studies are available that test pelvic floor muscle (PFM) activity during sports activities. The aim of this study was the description and reliability testing of six PFM electromyography (EMG) variables during three different running speeds. The secondary objective was to evaluate whether there was a speed-dependent difference between the PFM activity variables. METHODS This trial was designed as an exploratory and reliability study including ten young healthy female subjects to characterize PFM pre-activity and reflex activity during running at 7, 9 and 11 km/h. Six variables for each running speed, averaged over ten steps per subject, were presented descriptively and tested for reliability (Friedman, ICC, SEM, MD) and speed differences (Friedman). RESULTS PFM EMG variables varied between 67.6 and 106.1 %EMG and showed no systematic error; SEM and MD were low using the single-value model. Applying the average model over ten steps, ICC(3,k) values were >0.75, and SEM and MD were about 50% lower than for the single-value model. Activity was found to be highest at 11 km/h. CONCLUSION EMG variables showed excellent ICC and very low SEM and MD. Further studies should investigate inter-session reliability and PFM reactivity patterns of SUI patients using the average over ten steps for each variable, as it showed very high ICC and very low SEM and MD. Subsequently, longer running distances and other high-impact sports disciplines could be studied.
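For reference, SEM and MD are commonly derived from the ICC as follows; this is a standard formulation assumed here, as the abstract does not spell out the exact variant used:

```python
# Standard reliability formulas (assumed; not spelled out in the abstract):
# SEM = SD * sqrt(1 - ICC), and the minimal difference (MD, also called
# minimal detectable change) MD = 1.96 * SEM * sqrt(2) at the 95% level.
import math

def standard_error_of_measurement(sd: float, icc: float) -> float:
    return sd * math.sqrt(1.0 - icc)

def minimal_difference(sem: float, z: float = 1.96) -> float:
    return z * sem * math.sqrt(2.0)

# Hypothetical example: between-subject SD = 12 %EMG and ICC = 0.90
sem = standard_error_of_measurement(12.0, 0.90)  # ~3.8 %EMG
print(minimal_difference(sem))                   # ~10.5 %EMG
```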
Abstract:
Western honey bees (Apis mellifera) face an increasing number of challenges that in recent years have led to significant economic effects on apiculture, with attendant consequences for agriculture. Nosemosis is a fungal infection of honey bees caused by either Nosema apis or N. ceranae. The putative greater virulence of N. ceranae has spurred interest in understanding how it differs from N. apis. Little is known about the effects of N. apis or N. ceranae on honey bee learning and memory. Following a Pavlovian model that relies on the proboscis extension reflex, we compared acquisition learning and long-term memory recall of uninfected (control) honey bees versus those inoculated with N. apis, N. ceranae, or both. We also tested whether spore intensity was associated with variation in learning and memory. Neither learning nor memory differed among treatments. There was no evidence of a relationship between spore intensity and learning, and only limited evidence of a negative effect on memory; this occurred only in the co-inoculation treatment. Our results suggest that if Nosema spp. are contributing to the unusually high colony losses of recent years, the mechanism by which they affect honey bees is probably not related to effects on learning or memory, at least as assessed by the proboscis extension reflex.
Abstract:
Transforming growth factor beta-1 (TGF-β1) is a cytokine and neurotrophic factor whose neuromodulatory effects in Aplysia californica were recently described. Previous results demonstrated that TGF-β1 induces long-term increases in the efficacy of sensorimotor synapses, a neural correlate of sensitization of the defensive tail withdrawal reflex. These results provided the first evidence that a neurotrophic factor regulates neuronal plasticity associated with a simple form of learning in Aplysia, and raised many questions regarding the nature of the modulation. No homologs of TGF-β had previously been identified in Aplysia, and thus it was not known whether components of TGF-β1 signaling pathways were present in Aplysia. Furthermore, the signaling mechanisms engaged by TGF-β1 had not been identified, and it was not known whether TGF-β1 regulated other aspects of neuronal function. The present investigation into the actions of TGF-β1 was initiated by examining the distribution of the type II TGF-β1 receptor, the ligand-binding receptor. The receptor was widely distributed in the CNS, and most neurons exhibited somatic and neuritic immunoreactivity. In addition, the ability of TGF-β1 to activate the cAMP/PKA and MAPK pathways, known to regulate several important aspects of neuronal function, was examined. TGF-β1 acutely decreased cAMP levels in sensory neurons, activated MAPK, and triggered translocation of MAPK to the nucleus. MAPK activation was critical for both short- and long-term regulation of neuronal function by TGF-β1. TGF-β1 acutely decreased synaptic depression induced by low-frequency stimuli in a MAPK-dependent manner. This regulation may result, at least in part, from the modulation of synapsin, a major peripheral synaptic vesicle protein. TGF-β1 stimulated MAPK-dependent phosphorylation of synapsin, a process believed to regulate synaptic vesicle mobilization from reserve to readily releasable pools of neurotransmitter. In addition to its acute effect on synaptic efficacy, TGF-β1 also induced long-term increases in sensory neuron excitability. Whereas transient exposure to TGF-β1 was not sufficient to drive short- or long-term changes in excitability, prolonged exposure to TGF-β1 induced long-term changes in excitability that depended on MAPK. The results of these studies represent significant progress toward an understanding of the role of TGF-β1 in neuronal plasticity.
Abstract:
Chronic respiratory illnesses are a significant cause of morbidity and mortality, and acute changes in respiratory function often lead to hospitalization. Air pollution is known to exacerbate asthma, but the molecular mechanisms of this are poorly understood. The current studies were aimed at clarifying the roles of nerve subtypes and purinergic receptors in respiratory reflex responses following exposure to irritants. In C57Bl/6J female mice, inspired adenosine produced sensory irritation, shown to be mediated mostly by A-delta fibers. Second, the response to inhaled acetic acid was found to be dually influenced by C and A-delta fibers, as indicated by the observed effects of capsaicin pretreatment, which selectively destroys TRPV1-expressing fibers (mostly C fibers), and of pretreatment with theophylline, a nonselective adenosine receptor antagonist. The responses to both adenosine and acetic acid were enhanced in the ovalbumin allergic airway disease model, although the particular pathway altered is still unknown.
Abstract:
A study was conducted to empirically determine the degradation of survey-grade GPS horizontal position measurements due to the effects of broadleaf forest canopies. The measurements were taken using GPS/GLONASS-capable receivers measuring C/A and P-codes, and carrier phase. Fourteen survey markers were chosen in central Connecticut to serve as reference markers for the study. These markers had varying degrees of sky obstruction due to overhanging tree canopies. Sky obstruction was measured by photographing the sky with a 35 mm reflex camera fitted with a hemispherical lens. The negative was scanned and the image mapped using an equal-area projection to remove the distortion caused by the lens. The resulting digital image was thresholded to produce a black-and-white image in which the count of black pixels is a measure of sky-area obstruction. The locations of the markers were determined independently before the study. During the study, each marker was occupied for four 20-minute sessions over the period of one week in mid-July 1999. The locations of the study markers produced relatively long baselines compared with similar studies. We compared the accuracy of GPS-only versus GPS&GLONASS positioning as a function of sky obstruction. Based on our results, GLONASS observations neither improved nor degraded the accuracy of the position measurements. There is a loss of 2 mm of accuracy per percent of sky obstruction for both GPS-only and GPS&GLONASS.
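The two quantitative steps described above, counting black pixels and applying the 2 mm-per-percent degradation, can be sketched as follows (the file name and threshold value are hypothetical, and the equal-area remapping of the hemispherical photo is omitted for brevity):

```python
# Sketch of the obstruction measurement and the empirical accuracy rule
# reported above. The threshold and file name are hypothetical, and the
# equal-area remapping of the hemispherical photo is omitted.
import numpy as np
from PIL import Image

def sky_obstruction_percent(path: str, threshold: int = 128) -> float:
    """Percent of 'black' (canopy) pixels in a grayscale hemispherical image."""
    img = np.asarray(Image.open(path).convert("L"))  # 8-bit grayscale
    black = np.count_nonzero(img < threshold)        # canopy-obstructed pixels
    return 100.0 * black / img.size

def expected_accuracy_loss_mm(obstruction_pct: float) -> float:
    """Empirical finding above: ~2 mm horizontal accuracy loss per % obstruction."""
    return 2.0 * obstruction_pct

pct = sky_obstruction_percent("marker07_canopy.png")  # hypothetical image file
print(f"{pct:.1f}% obstructed -> ~{expected_accuracy_loss_mm(pct):.0f} mm loss")
```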