26 results for Object recognition test
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Effects of the dihydropyridine nimodipine, an antagonist at L-type calcium channels, on the memory loss in rats caused by long-term alcohol consumption were examined. Either a single dose of nimodipine or 2 weeks of repeated administration was given prior to withdrawal from 8 months of alcohol consumption. Memory was measured by the object recognition test and the T maze. Both nimodipine treatments prevented the memory deficits when these were measured between 1 and 2 months after alcohol withdrawal. At the end of the memory testing, 2 months after cessation of chronic alcohol consumption, glucocorticoid concentrations were increased in specific regions of rat brain without changes in plasma concentrations. Both nimodipine treatment schedules substantially reduced these rises in brain glucocorticoid. The data indicate that blockade of L-type calcium channels prior to alcohol withdrawal protects against the memory deficits caused by prolonged alcohol intake. This shows that specific drug treatments, such as nimodipine, given over the acute withdrawal phase, can prevent the neuronal changes responsible for subsequent adverse effects of long-term consumption of alcohol. The results also suggest the possibility that regional brain glucocorticoid increases may be involved in the adverse effects of long-term alcohol intake on memory. Such local changes in brain glucocorticoid levels would have major effects on neuronal function. The studies indicate that L-type calcium channels and brain glucocorticoid levels could form new targets for the treatment of cognitive deficits in alcoholics.
Abstract:
BACKGROUND: Studies were carried out to test the hypothesis that administration of a glucocorticoid Type II receptor antagonist, mifepristone (RU38486), just prior to withdrawal from chronic alcohol treatment would prevent the consequences of the alcohol consumption and withdrawal in mice. MATERIALS AND METHODS: The effects of a single intraperitoneal dose of mifepristone on alcohol withdrawal hyperexcitability were examined. Memory deficits during the abstinence phase were measured using repeat exposure to the elevated plus maze, the object recognition test, and the odor habituation/discrimination test. Neurotoxicity in the hippocampus and prefrontal cortex was examined using NeuN staining. RESULTS: Mifepristone reduced, though did not prevent, the behavioral hyperexcitability seen in TO strain mice during the acute phase of alcohol withdrawal (4 to 8 hours after cessation of alcohol consumption) following chronic alcohol treatment via liquid diet. There were no alterations in anxiety-related behavior in these mice at 1 week into withdrawal, as measured using the elevated plus maze. However, changes in behavior during a second exposure to the elevated plus maze 1 week later were significantly reduced by the administration of mifepristone prior to withdrawal, indicating a reduction in the memory deficits caused by the chronic alcohol treatment and withdrawal. The object recognition test and the odor habituation and discrimination test were then used to measure memory deficits in more detail, between 1 and 2 weeks after alcohol withdrawal, in C57BL/10 strain mice given alcohol chronically via the drinking fluid. A single dose of mifepristone given at the time of alcohol withdrawal significantly reduced the memory deficits in both tests. NeuN staining showed no evidence of neuronal loss in either prefrontal cortex or hippocampus after withdrawal from chronic alcohol treatment. CONCLUSIONS: The results suggest mifepristone may be of value in the treatment of alcoholics to reduce their cognitive deficits.
Abstract:
Aviation security strongly depends on screeners' performance in the detection of threat objects in x-ray images of passenger bags. We examined for the first time the effects of stress and stress-induced cortisol increases on detection performance for hidden weapons in an x-ray baggage screening task. We randomly assigned 48 participants to either a stress or a nonstress group. The stress group was exposed to a standardized psychosocial stress test (TSST). Before and after stress/nonstress, participants had to detect threat objects in a computer-based object recognition test (X-ray ORT). We repeatedly measured salivary cortisol and X-ray ORT performance before and after stress/nonstress. Cortisol increases in reaction to psychosocial stress induction, but not to nonstress, independently impaired x-ray detection performance. Our results suggest that stress-induced cortisol increases at peak reactivity impair x-ray screening performance.
Abstract:
Autism is a chronic pervasive neurodevelopmental disorder characterized by the early onset of social and communicative impairments as well as restricted, ritualized, stereotypic behavior. The endophenotype of autism includes neuropsychological deficits, for instance a lack of "Theory of Mind" and problems recognizing facial affect. In this study, we report the development and evaluation of a computer-based program to teach and test the ability to identify basic facially expressed emotions. Ten adolescent or adult subjects with high-functioning autism or Asperger syndrome were included in the investigation. A priori, the facial affect recognition test had shown good psychometric properties in a normative sample (internal consistency: rtt=.91-.95; retest reliability: rtt=.89-.92). In a pre-post design, one half of the sample was randomly assigned to receive the computer treatment while the other half served as a control group. The training was conducted for five weeks, consisting of two hours of training a week. The trained individuals improved significantly on the affect recognition task, but not on any other measure. Results support the usefulness of the program to teach the detection of facial affect. However, the improvement found is limited to a circumscribed area of social-communicative function and generalization is not ensured.
Abstract:
BACKGROUND: Higher visual functions can be defined as cognitive processes responsible for object recognition, color and shape perception, and motion detection. People with impaired higher visual functions after unilateral brain lesion are often tested with paper-and-pencil tests, but such tests do not assess the degree of interaction between the healthy brain hemisphere and the impaired one. Hence, visual functions are not tested separately in the contralesional and ipsilesional visual hemifields. METHODS: A new measurement setup that involves real-time comparisons of the shape and size of objects, the orientation of lines, and the speed and direction of moving patterns in the right or left visual hemifield has been developed. The setup was implemented in an immersive environment, a hemisphere, to take into account the effects of peripheral and central vision and any visual field losses. Due to the non-flat screen of the hemisphere, a distortion algorithm was needed to adapt the projected images to the surface. Several approaches were studied and, based on a comparison between projected images and original ones, the best one was used for the implementation of the test. Fifty-seven healthy volunteers were then tested in a pilot study. A Satisfaction Questionnaire was used to assess the usability of the new measurement setup. RESULTS: The results of the distortion algorithm showed a structural similarity between the warped images and the original ones higher than 97%. The results of the pilot study showed an accuracy in comparing images across the two visual hemifields of 0.18 and 0.19 visual degrees for size and shape discrimination, respectively, 2.56° for line orientation, 0.33 visual degrees/s for speed perception, and 7.41° for recognition of motion direction. The outcome of the Satisfaction Questionnaire showed a high acceptance of the battery by the participants. CONCLUSIONS: A new method to measure higher visual functions in an immersive environment was presented. The study focused on the usability of the developed battery rather than performance on the visual tasks. A battery of five subtasks to study the perception of size, shape, orientation, speed, and motion direction was developed. The test setup is now ready to be tested in neurological patients.
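The distortion algorithm above is evaluated by comparing warped and original images via structural similarity (reported as higher than 97%). As a purely illustrative sketch, not the study's implementation, such a comparison could be computed with scikit-image's SSIM; the file names and grayscale conversion are assumptions.

```python
# Hedged sketch: SSIM between an original stimulus and its warped
# (projection-corrected) version. Not the cited study's code; paths are hypothetical.
import numpy as np
from skimage import io, color, img_as_float
from skimage.metrics import structural_similarity as ssim

def warp_similarity(original_path: str, warped_path: str) -> float:
    """Return the structural similarity between the original and the warped image."""
    def load_gray(path: str) -> np.ndarray:
        img = img_as_float(io.imread(path))
        # Keep only RGB channels before converting to grayscale, if needed.
        return color.rgb2gray(img[..., :3]) if img.ndim == 3 else img
    original, warped = load_gray(original_path), load_gray(warped_path)
    if original.shape != warped.shape:
        raise ValueError("images must have the same size for a pixel-wise comparison")
    return ssim(original, warped, data_range=1.0)

if __name__ == "__main__":
    print(f"SSIM = {warp_similarity('stimulus.png', 'stimulus_warped.png'):.3f}")
```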
Abstract:
Brain mechanisms associated with artistic talents or skills are still not well understood. This exploratory study investigated differences in the brain activity of artists and non-artists while drawing previously presented perspective line-drawings from memory and completing other drawing-related tasks. Electroencephalography (EEG) data were analyzed for power in the frequency domain by means of a Fast Fourier Transform (FFT). Low Resolution Brain Electromagnetic Tomography (LORETA) was applied to localize the significant effects. During drawing and related tasks, decreased power was seen in artists compared to non-artists, mainly in the upper alpha frequency range. Decreased alpha power is often associated with an increase in cognitive functioning and may reflect enhanced semantic memory performance and object recognition processes in artists. These assumptions are supported by the behavioral data assessed in this study and complement previous findings showing increased parietal activations in non-artists compared to artists while drawing. However, due to the exploratory nature of the analysis, additional confirmatory studies will be needed.
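The power analysis described above is spectral band power derived from an FFT. Below is a minimal sketch, not the study's pipeline, of estimating upper-alpha band power for a single EEG channel; the sampling rate, the 10-12 Hz band limits, and the synthetic signal are assumptions.

```python
# Hedged sketch: upper-alpha (assumed 10-12 Hz) band power via Welch's method.
import numpy as np
from scipy.signal import welch

def band_power(signal: np.ndarray, fs: float, fmin: float, fmax: float) -> float:
    """Integrate the power spectral density between fmin and fmax (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    band = (freqs >= fmin) & (freqs <= fmax)
    return float(np.trapz(psd[band], freqs[band]))

if __name__ == "__main__":
    fs = 250.0                                    # assumed sampling rate
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 11 * t) + 0.5 * np.random.randn(t.size)  # synthetic channel
    print(f"upper-alpha power: {band_power(eeg, fs, 10.0, 12.0):.3f}")
```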
Abstract:
Introduction. Erroneous answers in studies on the misinformation effect (ME) can be reduced in different ways. In some studies, the ME was reduced by source-monitoring (SM) questions, warnings, or a low credibility of the source of the post-event information (PEI). Results are inconsistent, however. Of course, a participant can deliberately decide to refrain from reporting a critical item only when the difference between the original event and the PEI is distinguishable in principle. We were interested in the question to what extent the influence of erroneous information on a central aspect of the original event can be reduced by different means applied singly or in combination. Method. With a 2 (credibility: high vs. low) x 2 (warning: present vs. absent) between-subjects design and an additional control group that received neither misinformation nor a warning (N = 116), we examined the above-mentioned factors' influence on the ME. Participants viewed a short video of a robbery. The critical item suggested in the PEI was that the victim was given a kick by the perpetrator (which he actually was not). The memory test consisted of a two-alternative forced-choice recognition test followed by an SM test. Results. To our surprise, neither a main effect of erroneous PEI nor a main effect of credibility was found. The error rates for the critical item in the control group (50%) as well as in the high (65%) and low (52%) credibility conditions without warning did not differ significantly. A warning about possibly misleading information in the PEI significantly reduced the influence of misinformation in both credibility conditions, by 32-37%. Using an SM question significantly reduced the error rate too, but only in the high-credibility, no-warning condition. Conclusion and Future Research. Our results show that, contrary to a warning or the use of an SM question, low source credibility did not reduce the ME. The most striking finding was, however, the absence of a main effect of erroneous PEI. Due to the high error rate in the control group, we suspect that the wrong answers might have been caused either by the response format (recognition test) or by autosuggestion, possibly promoted by the high schema-consistency of the critical item. First results of a post-study in which we used open-ended questions before the recognition test support the former assumption. Results of a replication of this study using open-ended questions prior to the recognition test will be available by June.
Abstract:
The goal of this study was to investigate recognition memory performance across the lifespan and to determine how estimates of recollection and familiarity contribute to performance. In each of three experiments, participants from five age groups ranging from 14 to 85 years (children, young adults, middle-aged adults, young-old adults, and old-old adults) were presented with high- and low-frequency words in a study phase and were tested immediately afterwards and/or after a one-day retention interval. The results showed that word frequency and retention interval affected recognition memory performance as well as estimates of recollection and familiarity. Across the lifespan, the trajectory of recognition memory followed an inverted U-shaped function that was affected neither by word frequency nor by retention interval. The trajectory of estimates of recollection also followed an inverted U-shaped function, and this was especially pronounced for low-frequency words. In contrast, estimates of familiarity did not differ across the lifespan. The results indicate that age differences in recognition memory are mainly due to differences in processes related to recollection, while the contribution of familiarity-based processes seems to be age-invariant.
Abstract:
One to three percent of patients exposed to intravenously injected iodinated contrast media (CM) develop delayed hypersensitivity reactions. Positive patch test reactions, immunohistological findings, and CM-specific proliferation of T cells in vitro suggest a pathogenetic role for T cells. We have previously demonstrated that CM-specific T cell clones (TCCs) show a broad range of cross-reactivity to different CM. However, the mechanism of specific CM recognition by T cell receptors (TCRs) has not been analysed so far.
Abstract:
Primate multisensory object perception involves distributed brain regions. To investigate the network character of these regions of the human brain, we applied data-driven group spatial independent component analysis (ICA) to a functional magnetic resonance imaging (fMRI) data set acquired during a passive audio-visual (AV) experiment with common object stimuli. We labeled three group-level independent component (IC) maps as auditory (A), visual (V), and AV, based on their spatial layouts and activation time courses. The overlap between these IC maps served as definition of a distributed network of multisensory candidate regions including superior temporal, ventral occipito-temporal, posterior parietal and prefrontal regions. During an independent second fMRI experiment, we explicitly tested their involvement in AV integration. Activations in nine out of these twelve regions met the max-criterion (A < AV > V) for multisensory integration. Comparison of this approach with a general linear model-based region-of-interest definition revealed its complementary value for multisensory neuroimaging. In conclusion, we estimated functional networks of uni- and multisensory functional connectivity from one dataset and validated their functional roles in an independent dataset. These findings demonstrate the particular value of ICA for multisensory neuroimaging research and using independent datasets to test hypotheses generated from a data-driven analysis.
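The max-criterion used above (the AV response exceeding both the A and the V response) is straightforward to express in code. The sketch below is illustrative only; the region names and response estimates are hypothetical, not data from the study.

```python
# Hedged sketch: flag regions whose audio-visual (AV) response exceeds both
# unisensory responses, i.e. the max-criterion for multisensory integration.
from typing import Dict, Tuple

def meets_max_criterion(responses: Tuple[float, float, float]) -> bool:
    """responses = (A, V, AV); True if AV exceeds both A and V."""
    a, v, av = responses
    return av > a and av > v

regions: Dict[str, Tuple[float, float, float]] = {
    "superior_temporal": (0.8, 0.6, 1.1),    # hypothetical response estimates
    "posterior_parietal": (0.4, 0.9, 0.7),
}

for name, responses in regions.items():
    label = "multisensory" if meets_max_criterion(responses) else "not multisensory"
    print(f"{name}: {label}")
```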
Abstract:
When reengineering legacy systems, it is crucial to assess whether the legacy behavior has been preserved or how it changed due to the reengineering effort. Ideally, if a legacy system is covered by tests, running the tests on the new version can identify potential differences or discrepancies. However, writing tests for an unknown and large system is difficult due to the lack of internal knowledge. It is especially difficult to bring the system to an appropriate state. Our solution is based on the acknowledgment that one of the few trustworthy pieces of information available when approaching a legacy system is the running system itself. Our approach reifies the execution traces and uses logic programming to express tests on them. Thereby it eliminates the need to programmatically bring the system into a particular state, and hands the test-writer a high-level abstraction mechanism for querying the trace. The resulting system, called TESTLOG, was used on several real-world case studies to validate our claims.
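To illustrate the general idea of testing against a reified execution trace (TESTLOG itself expresses such queries in logic programming, which is not reproduced here), the hedged Python sketch below records calls of a running routine as trace events and asserts a property over the trace instead of constructing program state manually; the decorator and the withdraw routine are hypothetical.

```python
# Hedged sketch: reify calls of a (hypothetical) legacy routine as trace events
# and express a test as a query over the trace, not over manually built state.
import functools
from typing import Any, Callable, Dict, List

TRACE: List[Dict[str, Any]] = []

def traced(fn: Callable) -> Callable:
    """Record every call of fn (callee, arguments, result) in the global trace."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        TRACE.append({"callee": fn.__name__, "args": args, "result": result})
        return result
    return wrapper

@traced
def withdraw(balance: int, amount: int) -> int:   # stand-in for a legacy routine
    return balance - amount

withdraw(100, 30)
withdraw(70, 70)

# Trace-level test: no recorded withdrawal may yield a negative balance.
assert all(e["result"] >= 0 for e in TRACE if e["callee"] == "withdraw")
```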
Abstract:
The early detection of subjects with probable Alzheimer's disease (AD) is crucial for the effective application of treatment strategies. Here we explored the ability of a multitude of linear and non-linear classification algorithms to discriminate between the electroencephalograms (EEGs) of patients with varying degrees of AD and their age-matched control subjects. Absolute and relative spectral power, distribution of spectral power, and measures of spatial synchronization were calculated from recordings of resting eyes-closed continuous EEGs of 45 healthy controls, 116 patients with mild AD and 81 patients with moderate AD, recruited in two different centers (Stockholm, New York). The applied classification algorithms were: principal component linear discriminant analysis (PC LDA), partial least squares LDA (PLS LDA), principal component logistic regression (PC LR), partial least squares logistic regression (PLS LR), bagging, random forest, support vector machines (SVM) and feed-forward neural network. Based on 10-fold cross-validation runs it could be demonstrated that even though modern computer-intensive classification algorithms such as random forests, SVM and neural networks show a slight superiority, more classical classification algorithms performed nearly equally well. Using random forest classification, a considerable sensitivity of up to 85% and a specificity of 78% were reached for the detection of even only mild AD patients, whereas for the comparison of moderate AD vs. controls, using SVM and neural networks, values of 89% and 88% for sensitivity and specificity were achieved. Such a remarkable performance proves the value of these classification algorithms for clinical diagnostics.
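As an illustration of the evaluation scheme described above (10-fold cross-validation of, among others, a random forest and an SVM), the sketch below uses synthetic data in place of the clinical EEG features; sample size, feature count, and hyperparameters are assumptions, and only mean sensitivity (recall) is reported.

```python
# Hedged sketch: 10-fold cross-validation of two classifiers on synthetic data
# standing in for spectral-power / synchronization features (controls vs. mild AD).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 161 samples loosely mirror 45 controls + 116 mild AD patients; purely illustrative.
X, y = make_classification(n_samples=161, n_features=40, n_informative=10,
                           weights=[0.28, 0.72], random_state=0)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
models = {
    "random forest": RandomForestClassifier(n_estimators=500, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    recall = cross_val_score(model, X, y, cv=cv, scoring="recall")
    print(f"{name}: mean sensitivity = {recall.mean():.2f}")
```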
Abstract:
Diagnosis of drug allergy involves, first, the recognition of sometimes unusual symptoms as drug allergy and, second, the identification of the eliciting drug. This is an often difficult task, as the clinical picture and underlying pathomechanisms are heterogeneous. In clinical routine, physicians frequently have to rely upon a suggestive history and possible provocation tests, both having their specific limitations. For this reason, both in vivo tests (skin tests) and in vitro tests are investigated intensively as tools to identify the disease-eliciting drug. One of the tests evaluated in drug allergy is the basophil activation test (BAT). Basophils, with their high-affinity IgE receptors, are easily accessible and can therefore be used as indicator cells for IgE-mediated reactions. Upon allergen challenge and cross-linking of membrane-bound IgE antibodies (via Fc-epsilon-RI), basophils up-regulate certain activation markers on their surface, such as CD63 and CD203c, as well as intracellular markers (e.g., phosphorylated p38MAPK). In BAT, these alterations can be detected rapidly on a single-cell basis by multicolor flow cytometry using specific monoclonal antibodies. Combining this technique with in vitro passive sensitization of donor basophils with patients' serum, one can prove the IgE dependence of a drug reaction. This article summarizes the authors' current experience with the BAT in the diagnostic management of immediate-type drug allergy mediated by drug-specific IgE antibodies.