932 results for spatial information processing theories
Abstract:
The main objective of this thesis was to obtain, through cognitive electrophysiology, indices of post mild traumatic brain injury (mTBI) functioning at different levels of information processing, namely selective attention, visuo-attentional decision processes, and processes associated with the execution of a voluntary response. The central hypothesis was that the injury mechanisms and the pathophysiology characterizing mTBI produce visuo-attentional dysfunctions, at least during the acute period following the mTBI (i.e., between 1 and 3 months post-accident), as measured with a new electrophysiological paradigm designed for this purpose. This thesis presents two articles describing the work carried out to meet these objectives and thus test the hypotheses put forward. The first article presents the approach taken to create a new visuospatial attention task yielding the electrophysiological (amplitude, latency) and behavioral (reaction time) indices related to early visual and attentional processing (P1, N1, N2-nogo, P2, Ptc), to selective visual attention (N2pc, SPCN), and to decision processes (P3b, P3a) in a group of healthy participants (i.e., without neurological impairment). The second article presents the study of the persistent effects of mTBI on visuo-attentional functions via the targeted electrophysiological indices (amplitude, latency) and behavioral data (task reaction times and neuropsychological test results) in two cohorts of symptomatic mTBI individuals, one in the subacute phase (first 3 months post-accident) and the other in the chronic phase (6 months to 1 year post-accident), compared with a group of healthy control participants.
The results of the articles presented in this thesis show that it was possible to create a simple task for studying, quickly and inexpensively, the different levels of information processing involved in the deployment of visuospatial attention. The use of this task with mTBI individuals tested in the subacute or chronic phase then made it possible to document differential patterns of impairment and recovery for each of the components studied. Indeed, while the components associated with early visual processing (P1, N1, N2) were intact, certain attentional (P2) and cognitive-attentional (P3a, P3b) components were altered, suggesting dysfunction in the spatio-temporal dynamics of attention, in attentional orienting, and in working memory, in the short and/or long term after mTBI, in the presence of neuropsychological deficits mainly in the subacute phase and of persistent post-mTBI symptomatology. This thesis underscores the importance of developing sensitive and comprehensive diagnostic tools to document the various cognitive processes and subprocesses liable to be affected after mTBI.
Abstract:
Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way.
The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
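One common way to realize an RF-kNN imputation like the one described above is to train a random forest on the reference plots and use terminal-node co-occurrence as a proximity measure for picking the k nearest references. The sketch below follows that idea with synthetic data; the variable names, forest size, and use of `scikit-learn` are illustrative assumptions, not the study's actual FIA pipeline.

```python
# Minimal RF-kNN imputation sketch: a random forest relates remote-sensing
# predictors to a field-measured attribute, and the fraction of trees in
# which two samples share a terminal node serves as their proximity.
# All data below are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(200, 5))   # spectral/geospatial predictors at field plots
y_ref = X_ref @ rng.normal(size=5) + rng.normal(scale=0.1, size=200)  # e.g. biomass

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_ref, y_ref)

def rf_knn_impute(X_target, k=5):
    """Impute y for target pixels from the k reference plots that most
    often share a terminal node with them across the forest."""
    leaves_ref = rf.apply(X_ref)        # (n_ref, n_trees) leaf indices
    leaves_tgt = rf.apply(X_target)     # (n_tgt, n_trees)
    # proximity = fraction of trees in which target and reference co-occur
    prox = (leaves_tgt[:, None, :] == leaves_ref[None, :, :]).mean(axis=2)
    nn = np.argsort(-prox, axis=1)[:, :k]   # k nearest references per target
    return y_ref[nn].mean(axis=1)           # imputed attribute

X_new = rng.normal(size=(10, 5))
print(rf_knn_impute(X_new))
```

Averaging the k neighbours' observed values (rather than the forest's own prediction) is what makes this an imputation: every mapped value traces back to real field measurements.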
Abstract:
Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for a bush or natural forest-fire event, we take the burnt area (BA*), the sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing. There are two reasons for this: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure, such as an F-test, is performed at each node's split to select the best attribute. The ensemble stream model approach improved when using complicated features with a simpler tree classifier.
The ensemble framework for data cleaning and the enhancements to quantify quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of the sensors led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
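The F-measure used above to gauge fire-event false alarms is the harmonic mean of precision and recall. A minimal sketch with made-up detection counts:

```python
# F-measure from true positives (tp), false positives (fp), and
# false negatives (fn); beta weights recall relative to precision.
def f_measure(tp, fp, fn, beta=1.0):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# e.g. 8 fires detected, 2 false alarms, 2 missed fires
print(f_measure(tp=8, fp=2, fn=2))  # -> 0.8
```

A high precision with low recall (few false alarms, many missed fires) and the reverse both drag the harmonic mean down, which is why F-measure is a stricter summary than either quantity alone.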
Abstract:
On most if not all evaluatively relevant dimensions such as the temperature level, taste intensity, and nutritional value of a meal, one range of adequate, positive states is framed by two ranges of inadequate, negative states, namely too much and too little. This distribution of positive and negative states in the information ecology results in a higher similarity of positive objects, people, and events to other positive stimuli as compared to the similarity of negative stimuli to other negative stimuli. In other words, there are fewer ways in which an object, a person, or an event can be positive as compared to negative. Oftentimes, there is only one way in which a stimulus can be positive (e.g., a good meal has to have an adequate temperature level, taste intensity, and nutritional value). In contrast, there are many different ways in which a stimulus can be negative (e.g., a bad meal can be too hot or too cold, too spicy or too bland, or too fat or too lean). This higher similarity of positive as compared to negative stimuli is important, as similarity greatly impacts speed and accuracy on virtually all levels of information processing, including attention, classification, categorization, judgment and decision making, and recognition and recall memory. Thus, if the difference in similarity between positive and negative stimuli is a general phenomenon, it predicts and may explain a variety of valence asymmetries in cognitive processing (e.g., positive as compared to negative stimuli are processed faster but less accurately). In my dissertation, I show that the similarity asymmetry is indeed a general phenomenon that is observed in thousands of words and pictures. Further, I show that the similarity asymmetry applies to social groups. 
Groups stereotyped as average on the two dimensions agency / socio-economic success (A) and conservative-progressive beliefs (B) are stereotyped as positive or high on communion (C), while groups stereotyped as extreme on A and B (e.g., managers, homeless people, punks, and religious people) are stereotyped as negative or low on C. As average groups are more similar to one another than extreme groups, according to this ABC model of group stereotypes, positive groups are mentally represented as more similar to one another than negative groups. Finally, I discuss implications of the ABC model of group stereotypes, pointing to avenues for future research on how stereotype content shapes social perception, cognition, and behavior.
Abstract:
Collecting ground truth data is an important step to be accomplished before performing a supervised classification. However, its quality depends on human, financial, and time resources. It is therefore important to apply a validation process to assess the reliability of the acquired data. In this study, agricultural information was collected in the Brazilian Amazonian state of Mato Grosso in order to map crop expansion based on MODIS EVI temporal profiles. The field work was carried out through interviews for the years 2005-2006 and 2006-2007. This work presents a methodology to validate the training data quality and determine the optimal sample to be used according to the classifier employed. The technique is based on the detection of outlier pixels for each class and is carried out by computing Mahalanobis distances for each pixel. The higher the distance, the further the pixel is from the class centre. Preliminary observations through the coefficient of variation validate the efficiency of the technique in detecting outliers. Then, various subsamples are defined by applying different thresholds to exclude outlier pixels from the classification process. The classification results prove the robustness of the Maximum Likelihood and Spectral Angle Mapper classifiers: those classifiers were insensitive to outlier exclusion. In contrast, the decision tree classifier showed better results when deleting 7.5% of the pixels in the training data. The technique managed to detect outliers for all classes. In this study, few outliers were present in the training data, so the classification quality was not strongly affected by them.
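The per-class outlier screen described above can be sketched as follows. The 7.5% trim fraction comes from the abstract; the synthetic pixel profiles and function names are illustrative assumptions.

```python
# For one class, compute each training pixel's Mahalanobis distance to the
# class centre, then exclude the most distant fraction before classification.
import numpy as np

rng = np.random.default_rng(1)
# synthetic stand-in for per-class MODIS EVI temporal profiles (500 pixels, 12 dates)
pixels = rng.normal(loc=0.4, scale=0.05, size=(500, 12))

def mahalanobis_distances(X):
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # sqrt((x - mu)^T  Sigma^{-1}  (x - mu)) for every row at once
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

def trim_outliers(X, frac=0.075):
    d = mahalanobis_distances(X)
    keep = d <= np.quantile(d, 1 - frac)   # drop the most distant pixels
    return X[keep]

print(trim_outliers(pixels).shape)
```

Varying `frac` reproduces the "different thresholds" step: each value yields a subsample whose effect on classifier accuracy can then be compared.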
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Biological systems are surprisingly flexible in processing information from the real world. Some biological organisms have a central processing unit called the brain. The human brain consists of 10^11 neurons and performs intelligent processing in both exact and subjective ways. Artificial Intelligence (AI) attempts to bring the heuristics of biological systems into the world of digital computing in various ways, but much remains to be done before this is achieved. Nevertheless, techniques such as artificial neural networks and fuzzy logic have proven effective for solving complex problems using the heuristics of biological systems. Recently, the number of applications of AI methods in animal science systems has increased significantly. The aim of this article is to explain the basic principles of problem solving using heuristics and to demonstrate how AI can be applied to build an expert system to solve problems in animal science.
Abstract:
The existence of quantum correlation (as revealed by quantum discord), other than entanglement and its role in quantum-information processing (QIP), is a current subject for discussion. In particular, it has been suggested that this nonclassical correlation may provide computational speedup for some quantum algorithms. In this regard, bulk nuclear magnetic resonance (NMR) has been successfully used as a test bench for many QIP implementations, although it has also been continuously criticized for not presenting entanglement in most of the systems used so far. In this paper, we report a theoretical and experimental study on the dynamics of quantum and classical correlations in an NMR quadrupolar system. We present a method for computing the correlations from experimental NMR deviation-density matrices and show that, given the action of the nuclear-spin environment, the relaxation produces a monotonic time decay in the correlations. Although the experimental realizations were performed in a specific quadrupolar system, the main results presented here can be applied to whichever system uses a deviation-density matrix formalism.
Abstract:
How does knowledge management (KM) by a government agency responsible for environmental impact assessment (EIA) potentially contribute to better environmental assessment and management practice? Staff members at government agencies in charge of the EIA process are knowledge workers who perform judgement-oriented tasks highly reliant on individual expertise, but also grounded in the agency's knowledge accumulated over the years. Part of an agency's knowledge can be codified and stored in an organizational memory, but it is subject to decay or loss if not properly managed. The EIA agency operating in Western Australia was used as a case study. Its KM initiatives were reviewed, knowledge repositories were identified, and staff were surveyed to gauge the utilisation and effectiveness of such repositories in enabling them to perform EIA tasks. Key elements of KM are the preparation of substantive guidance and spatial information management. It was found that the treatment of cumulative impacts on the environment is very limited and that information derived from project follow-up is not properly captured and stored, and thus not used to create new knowledge and to improve practice and effectiveness. Other opportunities for improving organizational learning include the use of after-action reviews. The learning about knowledge management in EIA practice gained from the Western Australian experience should be of value to agencies worldwide seeking to understand where best to direct their resources for their own knowledge repositories and environmental management practice.
Abstract:
We propose a review of recent developments on entanglement and nonclassical effects in collective two-atom systems and present a uniform physical picture of the many predicted phenomena. The collective effects have brought into sharp focus some of the most basic features of quantum theory, such as nonclassical states of light and entangled states of multiatom systems. The entangled states are linear superpositions of the internal states of the system which cannot be separated into product states of the individual atoms. This property is recognized as an entirely quantum-mechanical effect and has played a crucial role in many discussions of the nature of quantum measurements and, in particular, in the development of quantum communications. Much of the fundamental interest in entangled states is connected with their practical applications, ranging from quantum computation, information processing, cryptography, and interferometry to atomic spectroscopy.
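The non-separability described above can be checked numerically: reshaping a two-atom state vector into a 2x2 matrix and counting its nonzero singular values (the Schmidt rank) distinguishes product states (rank 1) from entangled ones (rank > 1). A small sketch with two illustrative states:

```python
# Schmidt-rank test for two two-level atoms: a 4-component state vector
# psi reshaped to 2x2 has one nonzero singular value iff psi is a product
# of single-atom states.
import numpy as np

def schmidt_rank(psi, tol=1e-12):
    """Number of nonzero singular values of the reshaped state vector."""
    return int(np.sum(np.linalg.svd(psi.reshape(2, 2), compute_uv=False) > tol))

product = np.kron([1, 0], [0, 1])           # |e>|g>, a separable state
bell = np.array([0, 1, 1, 0]) / np.sqrt(2)  # (|eg> + |ge>)/sqrt(2), entangled

print(schmidt_rank(product), schmidt_rank(bell))  # -> 1 2
```

The symmetric state `(|eg> + |ge>)/sqrt(2)` is exactly the kind of collective two-atom superposition the review discusses: no assignment of individual atomic states reproduces it.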
Abstract:
Using spontaneous parametric down-conversion, we produce polarization-entangled states of two photons and characterize them using two-photon tomography to measure the density matrix. A controllable decoherence is imposed on the states by passing the photons through thick, adjustable birefringent elements. When the system is subject to collective decoherence, one particular entangled state is seen to be decoherence-free, as predicted by theory. Such decoherence-free systems may have an important role for the future of quantum computation and information processing.
Abstract:
Faced with today’s ill-structured business environment of fast-paced change and rising uncertainty, organizations have been searching for management tools that will perform satisfactorily under such ambiguous conditions. In the arena of managerial decision making, one of the approaches being assessed is the use of intuition. Based on our definition of intuition as a non-sequential information-processing mode, which comprises both cognitive and affective elements and results in direct knowing without any use of conscious reasoning, we develop a testable model of integrated analytical and intuitive decision making and propose ways to measure the use of intuition.
Abstract:
We show how an initially prepared quantum state of a radiation mode in a cavity can be preserved for a long time using a feedback scheme based on the injection of appropriately prepared atoms. We present a feedback scheme both for optical cavities, which can be continuously monitored by a photodetector, and for microwave cavities, which can be monitored only indirectly via the detection of atoms that have interacted with the cavity field. We also discuss the possibility of applying these methods for decoherence control in quantum information processing.
Abstract:
This paper reports a follow-up study to an article on the sensitivity of three tests of speed of information processing to impairment after concussion (Hinton-Bayre, Geffen, & McFarland, 1997). Group analyses showed that practice effects can obscure the effects of concussion on information processing, thereby making the assessment of functional impairment and recovery after injury unreliable. A Reliable Change Index (RCI) was used to assess individual variations following concussion. It was found that 16 of the 20 concussed professional rugby league players were impaired 1-3 days following injury. It was also demonstrated that 7 players still displayed cognitive deficits at 1-2 weeks, before returning to preseason levels at 3-5 weeks. The RCI permits comparisons between different tests, players, and repeated assessments, thereby providing a quantitative basis for decisions regarding return to play.
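The abstract does not spell out the RCI formula; below is a minimal sketch in the common Jacobson-Truax form (the study itself may use a practice-adjusted variant), with purely illustrative numbers.

```python
# Reliable Change Index: change score divided by the standard error of
# the difference, where SEM = SD * sqrt(1 - test-retest reliability).
# |RCI| > 1.96 marks change unlikely to be measurement error (95% level).
import math

def rci(pre, post, sd_baseline, reliability):
    sem = sd_baseline * math.sqrt(1 - reliability)
    s_diff = math.sqrt(2 * sem ** 2)
    return (post - pre) / s_diff

# e.g. a speed-of-processing score dropping from 100 to 88 after concussion
z = rci(pre=100, post=88, sd_baseline=10, reliability=0.9)
print(round(z, 2), abs(z) > 1.96)
```

Because the index is a z-like score, it supports exactly the cross-test and cross-player comparisons the abstract highlights: any test with a known baseline SD and reliability maps onto the same scale.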
Abstract:
Recent reports have shown neurodegenerative disorders to be associated with abnormal expansions of a CAG trinucleotide repeat allele at various autosomal loci. While normal chromosomes have 14 to 44 repeats, disease chromosomes may have 60 to 84 repeats. The number of CAG repeats on mutant chromosomes correlates with increasing severity of disease or decreasing age at onset of symptoms. Since we are interested in identifying the many quantitative trait loci (QTL) influencing brain functioning, we examined the possibility that the number of CAG repeats in the normal size range at these loci is relevant to "normal" neural functioning. We used 150 pairs of adolescent (aged 16 years) twins and their parents to examine allele size at the MJD, SCA1, and DRPLA loci in heterozygous normal individuals. These are part of a large ongoing project using cognitive and physiological measures to investigate the genetic influences on cognition, and an extensive protocol of tests is employed to assess some of the key components of intellectual functioning. This study examined full-scale psychometric IQ (FSIQ), a measure of information processing (choice reaction time), and working memory (slow wave amplitude). CAG repeat size was determined on an ABI Genescan system following multiplex PCR amplification. Quantitative genetic analyses were performed to determine QTL effects of MJD, SCA1, and DRPLA on cognitive functioning. Analyses are in progress and will be discussed.