Abstract:
Report for the scientific sojourn at Stanford University from January until June 2007. Music is well known for affecting human emotional states, yet the relationship between specific musical parameters and emotional responses is still not clear. With the advent of new human-computer interaction (HCI) technologies, it is now possible to derive emotion-related information from physiological data and use it as an input to interactive music systems. Providing such implicit musical HCI will be highly relevant for a number of applications including music therapy, diagnosis, interactive gaming, and physiologically-based musical instruments. A key question in such physiology-based compositions is how sound synthesis parameters can be mapped to emotional states of valence and arousal. We used both verbal and heart rate responses to evaluate the affective power of five musical parameters. Our results show that a significant correlation exists between heart rate and the subjective evaluation of well-defined musical parameters. Brightness and loudness proved to be arousing parameters on the subjective scale, while harmonicity and the even-partial attenuation factor produced heart rate changes typically associated with valence. This demonstrates that a rational approach to designing emotion-driven music systems for our public installations and music therapy applications is possible.
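As an illustration of the kind of mapping the abstract describes, here is a minimal sketch assuming a hypothetical linear mapping from normalized valence/arousal to the four parameters named above; the ranges and function names are invented for illustration, not the study's calibrated mapping.

```python
# Hedged sketch: map normalized valence/arousal in [-1, 1] to synthesis
# parameters. The linear ranges below are illustrative assumptions, not the
# mapping calibrated in the study.

def affect_to_synthesis(valence: float, arousal: float) -> dict:
    """Arousal drives brightness and loudness; valence drives harmonicity and
    the attenuation of even partials, mirroring the correlations reported."""
    def lerp(lo, hi, t):                       # t in [-1, 1] -> value in [lo, hi]
        return lo + (hi - lo) * (t + 1) / 2

    return {
        "brightness_hz": lerp(500.0, 5000.0, arousal),  # spectral centroid target
        "loudness_db": lerp(-30.0, -6.0, arousal),
        "harmonicity": lerp(0.2, 1.0, valence),         # 1.0 = purely harmonic
        "even_partial_atten": lerp(0.9, 0.1, valence),  # lower = stronger even partials
    }

print(affect_to_synthesis(valence=0.8, arousal=-0.5))   # a calm, pleasant state
```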
Abstract:
Rats were treated postnatally (PND 5-16) with BSO (l-buthionine-(S,R)-sulfoximine) in an animal model of schizophrenia based on transient glutathione deficit. The BSO-treated rats were impaired in patrolling a maze or a homing table when adult, yet demonstrated preserved escape learning, place discrimination and reversal in a water maze task [37]. In the present work, the performance of BSO rats in the water maze was assessed under conditions controlling for the available visual cues. First, in a completely curtained environment with two salient controlled cues, BSO rats showed little accuracy compared to control rats. Second, pre-trained BSO rats were impaired in reaching the familiar spatial position when curtains partially occluded different portions of the room environment in successive sessions. The apparently preserved place learning in a classical water maze task thus appears to require the stability and richness of visual landmarks in the surrounding environment. In other words, the accuracy of BSO rats in place and reversal learning is impaired in a minimal-cue condition or when the visual panorama changes between trials. However, if the panorama remains rich and stable between trials, BSO rats are equally efficient in reaching a familiar position or in learning a new one. This suggests that the accurate performance of BSO rats in the water maze does not satisfy all the criteria for cognitive map-based navigation relying on the integration of polymodal cues. It supports the general hypothesis of a binding deficit in BSO rats.
Abstract:
This study presents a first attempt to extend the "Multi-scale integrated analysis of societal and ecosystem metabolism (MuSIASEM)" approach to a spatial dimension using GIS techniques in the metropolitan area of Barcelona. We use a combination of census and commercial databases along with a detailed land cover map to create a layer of Common Geographic Units that we populate with the local values of human time spent in different activities according to the MuSIASEM hierarchical typology. In this way, we mapped the hours of available human time against the working hours spent in different locations, highlighting the gradients in spatial density between the residential locations of workers (generating the work supply) and the places where the working hours actually take place. We found a marked trimodal pattern of clumps of areas with different combinations of values of time spent on household activities and on paid work. We also measured and mapped the spatial segregation between these two activities and put forward the conjecture that this segregation increases with higher energy throughput, as the size of the functional units must be able to cope with the flow of exosomatic energy. Finally, we discuss the effectiveness of the approach by comparing our geographic representation of exosomatic throughput to the one derived from conventional methods.
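A minimal sketch of the underlying GIS operation, assuming hypothetical geopandas layers and column names (cgu.gpkg, census_points.gpkg, household_hours, paid_work_hours); this is not the authors' actual workflow, only the general point-in-polygon aggregation it relies on.

```python
# Hedged sketch: populate Common Geographic Units (CGUs) with hours of human
# time by activity via a point-in-polygon spatial join. File and column names
# are hypothetical stand-ins, not the study's actual data model.
import geopandas as gpd

cgu = gpd.read_file("cgu.gpkg")                  # polygons: Common Geographic Units
census = gpd.read_file("census_points.gpkg")     # points: time-use records

# Assign each census point to the CGU that contains it, then sum hours per CGU
joined = gpd.sjoin(census, cgu, how="inner", predicate="within")
hours = joined.groupby("index_right")[["household_hours", "paid_work_hours"]].sum()
cgu[["household_hours", "paid_work_hours"]] = hours.reindex(cgu.index).fillna(0)

# Spatial density (hours per km^2) exposes residence/workplace gradients
cgu["paid_work_density"] = cgu["paid_work_hours"] / (cgu.geometry.area / 1e6)
cgu.to_file("cgu_human_time.gpkg")
```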
Abstract:
The molecular karyotypes of 20 reference strains of species complexes of Leishmania were determined by contour-clamped homogeneous electric field (CHEF) electrophoresis. Determination of the number/position of chromosome-sized bands and the chromosomal DNA locations of housekeeping genes were the two criteria used for differentiating and classifying the Leishmania species. We established two gel running conditions for optimal separation of chromosomes, which resolved DNA molecules as large as 2,500 kilobase pairs (kb). Chromosomes were polymorphic in the number (22-30) and size (200-2,500 kb) of bands among members of five complexes of Leishmania. Although each stock had a distinct karyotype, in general the differences found between strains and/or species within each complex were not clear enough for parasite identification. However, each group showed a specific number of size-concordant DNA molecules, which allowed distinction among the Leishmania complex parasites. Clear differences between the Old and New World groups of parasites, or among some New World Leishmania species, were also apparent in relation to the chromosomal locations of beta-tubulin genes. Based on these results as well as data from other published studies, the potential of using DNA karyotypes for identifying and classifying leishmanial field isolates is discussed.
Abstract:
Debris flow susceptibility mapping at a regional scale has been the subject of various studies. The complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. GIS-based approaches that couple automatic detection of the source areas with a simple assessment of the debris flow spreading may provide a substantial basis for a preliminary susceptibility assessment at the regional scale. The use of a 10 m resolution digital elevation model of the Canton de Vaud territory (Switzerland), together with a lithological map and a land use map, has allowed automatic identification of the potential source areas. The spreading estimates are based on basic probabilistic and energy calculations that define the maximal runout distance of a debris flow.
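A minimal sketch of an energy-line runout estimate of the kind mentioned, under the common constant angle-of-reach (Fahrböschung) simplification: the flow stops where the line descending from the source at a fixed reach angle meets the terrain. The reach angle and the terrain profile below are illustrative values, not calibrated ones.

```python
# Hedged sketch: maximal runout along a 1D terrain profile using a constant
# angle-of-reach (energy line) assumption. Reach angle and profile are
# illustrative, not calibrated, values.
import math

def max_runout(profile, reach_angle_deg=10.0):
    """profile: list of (distance_m, elevation_m) from the source outward.
    Returns the distance at which the energy line first meets the terrain."""
    x0, z0 = profile[0]                       # source location
    slope = math.tan(math.radians(reach_angle_deg))
    for x, z in profile[1:]:
        energy_line = z0 - slope * (x - x0)   # elevation of the energy line at x
        if z >= energy_line:                  # terrain rises above the line: stop
            return x - x0
    return profile[-1][0] - x0                # no intersection within the profile

profile = [(0, 1500), (500, 1380), (1000, 1300), (1500, 1260), (2000, 1250)]
print(f"max runout: {max_runout(profile):.0f} m")   # -> 1500 m
```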
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the l2 and l1TV norms of the prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
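For orientation, here is a minimal sketch of the generic l2-regularized closed-form dipole inversion that closed-form QSM methods build on; it is not the authors' MCF variant, and the field map and regularization weight are synthetic.

```python
# Hedged sketch: generic closed-form l2-regularized QSM dipole inversion,
#   chi = F^-1[ D * F(phi) / (D^2 + lambda) ],  D = 1/3 - kz^2 / |k|^2,
# not the authors' modulated closed-form (MCF) method. Data are synthetic.
import numpy as np

def dipole_kernel(shape):
    kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0                      # undefined DC term, set to zero
    return D

def qsm_l2(field_map, lam=0.05):
    D = dipole_kernel(field_map.shape)
    chi_k = D * np.fft.fftn(field_map) / (D**2 + lam)
    return np.real(np.fft.ifftn(chi_k))

phi = np.random.randn(64, 64, 64) * 0.01  # synthetic tissue field (ppm)
chi = qsm_l2(phi, lam=0.05)
print(chi.shape, chi.std())
```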
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
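As a concrete counterpart to the solution-concept discussion, here is a minimal sketch of backward induction on a perfect-information extensive-form game; the two-player tree is invented for illustration.

```python
# Hedged sketch: backward induction on a perfect-information game tree.
# Leaves carry utility tuples (u_player0, u_player1); internal nodes name
# the player to move. The example tree is illustrative only.

def backward_induction(node):
    """Return (utilities, path_of_choices) for the subgame rooted at node."""
    if "payoff" in node:                           # terminal history
        return node["payoff"], []
    i = node["player"]
    best = None
    for action, child in node["children"].items():
        utils, path = backward_induction(child)
        if best is None or utils[i] > best[0][i]:  # player i maximizes own utility
            best = (utils, [action] + path)
    return best

game = {  # player 0 moves first, then player 1
    "player": 0,
    "children": {
        "L": {"player": 1, "children": {
            "l": {"payoff": (2, 1)}, "r": {"payoff": (0, 0)}}},
        "R": {"player": 1, "children": {
            "l": {"payoff": (1, 2)}, "r": {"payoff": (3, 0)}}},
    },
}
print(backward_induction(game))   # -> ((2, 1), ['L', 'l'])
```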
Advanced mapping of environmental data: Geostatistics, Machine Learning and Bayesian Maximum Entropy
Abstract:
This book combines geostatistics and global mapping systems to present an up-to-the-minute study of environmental data. Featuring numerous case studies, the reference covers model-dependent (geostatistics) and data-driven (machine learning algorithms) analysis techniques such as risk mapping, conditional stochastic simulations, descriptions of spatial uncertainty and variability, artificial neural networks (ANN) for spatial data, Bayesian maximum entropy (BME), and more.
Abstract:
Functional imaging with intravoxel incoherent motion (IVIM) magnetic resonance imaging (MRI) is demonstrated. Images were acquired at 3 Tesla using a standard Stejskal-Tanner diffusion-weighted echo-planar imaging sequence with multiple b-values. Cerebrospinal fluid signal, which is highly incoherent, was suppressed with an inversion recovery preparation pulse. IVIM microvascular perfusion parameters were calculated according to a two-compartment (vascular and non-vascular) diffusion model. The results obtained in 8 healthy human volunteers during visual stimulation are presented. The IVIM blood-flow-related parameter fD* increased by 170% during stimulation in the visual cortex, and by 70% in the underlying white matter.
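A minimal sketch of fitting the two-compartment IVIM signal model S(b)/S0 = (1 − f)·exp(−bD) + f·exp(−b(D + D*)), from which the perfusion parameter fD* is derived; the b-values, true parameters and noise level are synthetic, not the study's protocol.

```python
# Hedged sketch: least-squares fit of the two-compartment IVIM model
#   S(b)/S0 = (1 - f) * exp(-b * D) + f * exp(-b * (D + Dstar)).
# b-values, true parameters and noise are synthetic illustrations.
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, f, D, Dstar):
    return (1 - f) * np.exp(-b * D) + f * np.exp(-b * (D + Dstar))

b = np.array([0, 10, 20, 40, 80, 160, 320, 640, 1000], float)  # s/mm^2
rng = np.random.default_rng(0)
signal = ivim(b, f=0.10, D=0.8e-3, Dstar=10e-3) + rng.normal(0, 0.005, b.size)

(f, D, Dstar), _ = curve_fit(
    ivim, b, signal,
    p0=[0.1, 1e-3, 10e-3],
    bounds=([0, 1e-4, 1e-3], [0.3, 3e-3, 100e-3]),
)
print(f"f={f:.3f}  D={D:.2e}  fD*={f * Dstar:.2e} mm^2/s")
```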
Abstract:
The success of the Human Genome Project (HGP) in 2000 brought "personalized medicine" closer to reality. The HGP's discoveries have simplified sequencing techniques to the point that nowadays anyone can obtain their complete DNA sequence. Read mapping technology stands out among these techniques and is characterized by handling very large amounts of data. Hadoop, the Apache framework for data-intensive applications under the MapReduce paradigm, is a perfect ally for this kind of technology and was the option chosen for this project. Throughout the work, the study, analysis and experimentation required to arrive at an innovative Genetic Algorithm that exploits the full potential of Hadoop are carried out.
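A minimal sketch of how one generation of a genetic algorithm can be phrased as a Hadoop Streaming map/reduce step in Python; the bit-string chromosome encoding, the toy fitness function and the script layout are illustrative assumptions, not this project's actual algorithm.

```python
# Hedged sketch: one generation of a genetic algorithm expressed as a Hadoop
# Streaming job (mapper and reducer read stdin and write stdout). Bit-string
# chromosomes and the toy fitness (number of '1' bits) are illustrative.
import random
import sys

def mapper():
    # map: chromosome -> (random partition key, fitness, chromosome)
    for line in sys.stdin:
        chrom = line.strip()
        if chrom:
            print(f"{random.randrange(4)}\t{chrom.count('1')}\t{chrom}")

def reducer():
    # reduce: rank incoming chromosomes by fitness, keep the fitter half as
    # parents, and emit a new population via crossover plus point mutation
    pool = [line.rstrip("\n").split("\t") for line in sys.stdin if line.strip()]
    pool.sort(key=lambda rec: int(rec[1]), reverse=True)
    parents = [rec[2] for rec in pool[: max(1, len(pool) // 2)]]
    for _ in range(len(pool)):                    # keep population size constant
        a, b = random.choice(parents), random.choice(parents)
        cut = random.randrange(1, len(a))         # one-point crossover
        child = (a[:cut] + b[cut:])[: len(a)]
        i = random.randrange(len(child))          # point mutation
        child = child[:i] + ("1" if child[i] == "0" else "0") + child[i + 1:]
        print(child)

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

With the standard streaming jar, such a script would be wired in roughly as `-mapper 'python ga.py map' -reducer 'python ga.py reduce'`, iterating the job once per generation.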
Abstract:
The aim of this study was to develop a polymerase chain reaction (PCR) for the detection of respiratory syncytial virus (RSV) genomes. The primers were designed from published sequences and selected from conserved regions of the genome encoding the N protein of subgroups A and B of RSV. PCR was applied to 20 specimens from children admitted to the respiratory ward of the "William Soler" Pediatric Hospital in Havana City with a clinical diagnosis of bronchiolitis. The PCR was compared with viral isolation and with an indirect immunofluorescence technique that employs monoclonal antibodies against subgroups A and B. Of 20 nasopharyngeal exudates, 10 were found positive by all three assayed methods. In only two cases were samples that yielded a positive RNA-PCR found negative by indirect immunofluorescence and cell culture. Considering viral isolation as the "gold standard" technique, RNA-PCR had 100% sensitivity and 80% specificity. RNA-PCR is a specific and sensitive technique for the detection of the RSV genome. Technical advantages are discussed.
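The reported figures follow directly from the 2×2 table implied above, with viral isolation as the gold standard; a quick check:

```python
# Quick check of the reported figures against viral isolation as gold standard:
# 10/20 samples positive by all methods, plus 2 PCR-positive/culture-negative.
tp, fn = 10, 0   # isolation-positive: all 10 detected by RNA-PCR
fp, tn = 2, 8    # isolation-negative: 2 PCR-positive among the 10

sensitivity = tp / (tp + fn)   # 10/10 = 1.00
specificity = tn / (tn + fp)   # 8/10  = 0.80
print(f"sensitivity={sensitivity:.0%}  specificity={specificity:.0%}")
```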
Abstract:
Many protozoan parasites represent an important group of human pathogens. Pulsed Field Gradient Gel Electrophoresis (PFGE) analysis has been an important tool for fundamental genetic studies of parasites like Trypanosoma, Leishmania, Giardia or the human malaria parasite Plasmodium falciparum. We present PFGE conditions allowing a high resolution separation of chromosomes ranging from 500 to 4000 kb within a two-day electrophoresis run. In addition, we present conditions for separating large chromosomes (2000-6000 kb) within 36 hr. We demonstrate that the application of the two-dimensional PFGE (2D-PFGE) technique to parasite karyotypes is a very useful method for the analysis of dispersed gene families and for comparative studies of intrachromosomal genome organization.