31 results for Processing Time

at Université de Lausanne, Switzerland


Relevance: 60.00%

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale, and it has also proved relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is left to the user, which makes the model adaptable to various applications and levels of data availability. Among the possible datasets, the DEM is the only one that is strictly required for both source area delineation and propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
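The spreading idea behind Holmgren's direction algorithm can be sketched in a few lines. This is an illustrative toy, not the Flow-R implementation and not its improved variant: the exponent value, cell size and DEM window below are invented for the example. Holmgren's weighting gives each downslope neighbour a share of the flow proportional to (tan β)^x, where β is the slope to that neighbour; x = 1 maximizes dispersion, and large x converges to single-direction (D8) flow.

```python
import numpy as np

def holmgren_weights(window, cellsize=10.0, x=4.0):
    """Distribute flow from the centre cell of a 3x3 DEM window to its
    downslope neighbours using Holmgren's weighting: p_i ∝ (tan beta_i)^x.
    All constants here are illustrative assumptions."""
    centre = window[1, 1]
    weights = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            if i == 1 and j == 1:
                continue
            # diagonal neighbours are sqrt(2) cell widths away
            dist = cellsize * (np.sqrt(2) if (i != 1 and j != 1) else 1.0)
            drop = centre - window[i, j]
            if drop > 0:  # only downslope directions receive flow
                weights[i, j] = (drop / dist) ** x
    total = weights.sum()
    return weights / total if total > 0 else weights  # a pit gets no outflow

# usage: flow from a 105 m centre cell into a tilted neighbourhood (10 m cells)
dem = np.array([[110., 108., 106.],
                [109., 105., 101.],
                [108., 104., 100.]])
w = holmgren_weights(dem, cellsize=10.0, x=4.0)
```

With x = 4 most of the flow goes to the steepest neighbour while adjacent downslope cells still receive a share, which is the behaviour that distinguishes this family of algorithms from strict single-flow-direction routing.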

Relevance: 60.00%

Abstract:

BACKGROUND: Practicing physicians face many medical decisions daily. These are influenced mainly by personal experience but should also consider patient preferences and the scientific evidence reflected by a constantly increasing number of medical publications and guidelines. With the objective of optimal medical treatment, the concept of evidence-based medicine is founded on these three aspects. Without knowledge of the methodological background, however, there is a high risk of misinterpreting evidence, which can lead to medical errors and adverse effects. OBJECTIVES: This article explains the concept of systematic error (bias) and its importance. Causes and effects, as well as methods to minimize bias, are discussed. This information should impart a deeper understanding, leading to a better assessment of studies and the implementation of their recommendations in daily medical practice. CONCLUSION: The risk of bias (RoB) tool, developed by the Cochrane Collaboration, is an instrument for assessing the potential for bias in controlled trials. Its strengths include ease of use, short processing time, transparent judgements, and an easily comprehensible graphical presentation of findings. The German translation of the RoB tool is published as an attachment to this article; this should make the tool accessible to non-experts and, moreover, support evidence-based medical decision-making.

Relevance: 60.00%

Abstract:

This study examines how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, no single optimal number of parallel jobs exists. A better solution is based on memory utilisation, but finding an optimal memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that dynamically adapts the memory threshold to the overall load. In this way, memory consumption stays stable under different workloads while achieving significantly higher throughput and energy efficiency than traditional approaches with a fixed number of jobs or a fixed memory threshold.
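The fuzzy control loop described above can be sketched as follows. This is a toy illustration of the technique, not the authors' algorithm: the membership functions, threshold units and step size are invented for the example. The controller nudges the memory threshold down as the degree of "load is high" rises, and up as the degree of "load is low" rises:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adapt_threshold(threshold_gb, load_fraction, step_gb=0.5):
    """One fuzzy-control step (illustrative constants).
    load_fraction: overall memory load in [0, 1]."""
    low  = triangular(load_fraction, -0.5, 0.0, 0.7)  # fully 'low' at zero load
    high = triangular(load_fraction,  0.5, 1.0, 1.5)  # fully 'high' at full load
    # defuzzify as a weighted net adjustment of the threshold
    return threshold_gb + step_gb * (low - high)

# usage: under sustained heavy load the threshold shrinks (fewer parallel
# jobs admitted); when load drops, it relaxes back up
t = 8.0
for load in (0.95, 0.95, 0.6, 0.2):
    t = adapt_threshold(t, load)
```

Because the adjustment is proportional to membership degrees rather than a hard if/else, the threshold changes smoothly with load, which is what lets memory consumption stay stable across heterogeneous workloads.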

Relevance: 60.00%

Abstract:

Plasma catecholamines provide a reliable biomarker of sympathetic activity. The low circulating concentrations of catecholamines and analytical interferences require tedious sample preparation and long chromatographic runs to ensure their accurate quantification by HPLC with electrochemical detection. Published or commercially available methods relying on solid-phase extraction technology lack sensitivity or require derivatization of catecholamines with hazardous reagents prior to tandem mass spectrometry (MS) analysis. Here, we manufactured a novel 96-well microplate device specifically designed to extract plasma catecholamines prior to their quantification by a new and highly sensitive ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method. Processing time, which includes sample purification on activated aluminum oxide and elution, is less than 1 h per 96-well microplate, and the UPLC-MS/MS analysis run time is 2.0 min per sample. This UPLC-MS/MS method does not require a derivatization step, reduces turnaround time 10-fold compared to conventional methods used for routine application, and allows catecholamine quantification in reduced plasma sample volumes (50-250 μL, e.g., from children and mice).

Relevance: 60.00%

Abstract:

PURPOSE: Statistical shape and appearance models play an important role in reducing the processing time of vertebra segmentation and in improving results for 3D model development. Here, we describe the steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. METHODS: The input dataset is composed of manually segmented, anonymized patient computed tomography (CT) scans. The different datasets are aligned by Procrustes alignment on surface models, and the registration is then cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which captures the variability of the C2. RESULTS: The SSM was generated using 92 CT scans and was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open-source software package for image analysis and scientific visualization), with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. CONCLUSION: The SSM represents the shape variability of the C2. Moreover, it will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
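The PCA step of a statistical shape model can be sketched as follows. This is a minimal illustration on flattened landmark vectors with synthetic data, not the Gaussian-process/Statismo pipeline described in the abstract: the landmark counts and noise level are invented for the example.

```python
import numpy as np

def build_ssm(shapes):
    """Build a PCA shape model from pre-aligned shapes.
    shapes: (n_samples, n_landmarks * 3) array of flattened coordinates."""
    mean = shapes.mean(axis=0)
    centred = shapes - mean
    # SVD of the centred data matrix yields the principal modes of variation
    _, s, vt = np.linalg.svd(centred, full_matrices=False)
    variances = s**2 / (len(shapes) - 1)   # variance explained by each mode
    return mean, vt, variances

def sample_shape(mean, modes, variances, coeffs):
    """Synthesize a shape as the mean plus a weighted sum of modes;
    coeffs are expressed in standard deviations of each mode."""
    b = np.asarray(coeffs) * np.sqrt(variances[:len(coeffs)])
    return mean + b @ modes[:len(coeffs)]

# usage with synthetic 'scans': 20 noisy copies of a 10-landmark template
rng = np.random.default_rng(0)
template = rng.normal(size=30)                      # 10 landmarks in 3D, flattened
shapes = template + 0.1 * rng.normal(size=(20, 30))
mean, modes, var = build_ssm(shapes)
new_shape = sample_shape(mean, modes, var, coeffs=[1.0, -0.5])
```

The specificity/compactness/generalization evaluation mentioned in the abstract amounts to asking how plausible such sampled shapes are, how few modes are needed, and how well held-out scans are reconstructed.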

Relevance: 60.00%

Abstract:

"How old is this fingermark?" This question is relatively often raised in trials when suspects admit that they left their fingermarks at a crime scene but allege that the contact occurred at a time different from that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the forensic community as a whole. Nevertheless, a review of past American cases showed that experts have nonetheless testified in court about the age of fingermarks, even though such testimony was mostly based on subjective and poorly documented parameters. It was relatively easy to access fully described American cases, which explains the choice of these examples; fingermark dating issues are, however, encountered worldwide, and the lack of consensus among the answers given highlights the need for research on the subject. The present work therefore aims to study the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, various attempts have already been described in the literature. This research critically reviews these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice. Nevertheless, some approaches based on the evolution over time of intrinsic compounds detected in fingermark residue appear promising. Thus, an exhaustive review of the literature was conducted to identify the compounds present in fingermark residue and the analytical techniques capable of analysing them. It was chosen to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of fingermark residue was significantly lower than the inter-variability, but that both kinds of variability could be reduced using different statistical pre-treatments inspired by the drug profiling field. It was also possible to propose an objective donor classification model allowing donors to be grouped in two main classes based on the initial lipid composition of their fingermarks. These classes correspond to what are currently, and rather subjectively, called "good" or "bad" donors. The potential of such a model is high for fingermark research, as it allows representative donors to be selected according to the compounds of interest. Using GC/MS and FTIR, an in-depth study was conducted of the effects of different influence factors on the initial composition and aging of target lipid molecules in fingermark residue. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing), but that some influence factors affected these models more than others. In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other factors tested (deposition moment, pressure, temperature and illumination) also affected the residue and its aging, but models combining different values of these factors still proved robust in well-defined situations. Furthermore, test fingermarks were analysed by GC/MS in order to be dated using some of the generated models. Correct estimations were obtained for over 60% of the dated test fingermarks, and for up to 100% when the storage conditions were known. These results are interesting, but further research is needed to evaluate whether these models could be used under uncontrolled casework conditions. From a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) to gain information about fingermark composition and aging. More precisely, the ability of this technique to highlight aging and influence-factor effects over large fingermark areas was investigated, and this information was compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use for studying natural fingermark residue for forensic purposes has to be carefully considered. In this study, the technique did not yield more information than traditional FTIR spectra and also suffered from major drawbacks, namely long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed the proposition and discussion of a formal and pragmatic framework for approaching fingermark dating questions. This framework identifies which type of information the scientist is currently able to bring to investigators and/or the courts, and describes the iterative development steps that research should follow to achieve the validation of an objective fingermark dating methodology whose capacities and limits are known and properly documented.

Relevance: 40.00%

Abstract:

Time is embedded in any sensory experience: the movements of a dance, the rhythm of a piece of music, and the words of a speaker are all examples of temporally structured sensory events. In humans, whether and how visual cortices perform temporal processing remains unclear. Here we show that both primary visual cortex (V1) and extrastriate area V5/MT are causally involved in encoding and keeping time in memory, and that this involvement is independent of low-level visual processing. Most importantly, we demonstrate that V1 and V5/MT are functionally linked and temporally synchronized during time encoding, whereas they are functionally independent and operate serially (V1 followed by V5/MT) while maintaining temporal information in working memory. These data challenge the traditional view of V1 and V5/MT as visuo-spatial feature detectors and highlight the functional contribution and temporal dynamics of these brain regions in the processing of time in the millisecond range. The present project resulted in the paper entitled "How the visual brain encodes and keeps track of time" by Paolo Salvioni, Lysiann Kalmbach, Micah Murray and Domenica Bueti, now submitted for publication to the Journal of Neuroscience.

Relevance: 40.00%

Abstract:

The Fragile X mental retardation protein (FMRP) regulates neuronal RNA metabolism, and its absence or mutation leads to Fragile X syndrome (FXS). The β-amyloid precursor protein (APP) is involved in Alzheimer's disease, plays a role in synapse formation, and is upregulated in intellectual disabilities. Here, we show that during mouse synaptogenesis and in human FXS fibroblasts, a dual dysregulation of APP and the α-secretase ADAM10 leads to the production of an excess of soluble APPα (sAPPα). In FXS, sAPPα signals through the metabotropic receptor and, by activating the MAP kinase pathway, leads to synaptic and behavioral deficits. Modulation of ADAM10 activity in FXS reduces sAPPα levels, restoring translational control, synaptic morphology, and behavioral plasticity. Thus, proper control of ADAM10-mediated APP processing during a specific postnatal developmental stage is crucial for healthy spine formation and function. Downregulation of ADAM10 activity at synapses may be an effective strategy for ameliorating FXS phenotypes.

Relevance: 30.00%

Abstract:

When speech is degraded, word report is higher for semantically coherent sentences (e.g., her new skirt was made of denim) than for anomalous sentences (e.g., her good slope was done in carrot). Such increased intelligibility is often described as resulting from "top-down" processes, reflecting an assumption that higher-level (semantic) neural processes support lower-level (perceptual) mechanisms. We used time-resolved sparse fMRI to test for top-down neural mechanisms, measuring activity while participants heard coherent and anomalous sentences presented in speech envelope/spectrum noise at varying signal-to-noise ratios (SNR). The timing of BOLD responses to more intelligible speech provides evidence of hierarchical organization, with earlier responses in peri-auditory regions of the posterior superior temporal gyrus than in more distant temporal and frontal regions. Despite Sentence content × SNR interactions in the superior temporal gyrus, prefrontal regions respond after auditory/perceptual regions. Although we cannot rule out top-down effects, this pattern is more compatible with a purely feedforward or bottom-up account, in which the results of lower-level perceptual processing are passed to inferior frontal regions. Behavioral and neural evidence that sentence content influences perception of degraded speech does not necessarily imply "top-down" neural processes.

Relevance: 30.00%

Abstract:

The human auditory system comprises specialized but interacting anatomic and functional pathways encoding object, spatial, and temporal information. We review how learning-induced plasticity manifests along these pathways and to what extent common mechanisms subserve such plasticity. A first series of experiments establishes a temporal hierarchy along which sounds of objects are discriminated along basic to fine-grained categorical boundaries and learned representations. A widespread network of temporal and (pre)frontal brain regions contributes to object discrimination via recursive processing. Learning-induced plasticity typically manifested as repetition suppression within a common set of brain regions. A second series considered how the temporal sequence of sound sources is represented. We show that lateralized responsiveness during the initial encoding phase of pairs of auditory spatial stimuli is critical for their accurate ordered perception. Finally, we consider how spatial representations are formed and modified through training-induced learning. A population-based model of spatial processing is supported, wherein temporal and parietal structures interact in the encoding of relative and absolute spatial information over the initial ∼300 ms post-stimulus onset. Collectively, these data provide insights into the functional organization of human audition and open directions for new developments in targeted diagnostic and neurorehabilitation strategies.

Relevance: 30.00%

Abstract:

Interaural intensity and time differences (IID and ITD) are two binaural auditory cues for localizing sounds in space. This study investigated the spatio-temporal brain mechanisms for processing and integrating IID and ITD cues in humans. Auditory-evoked potentials were recorded, while subjects passively listened to noise bursts lateralized with IID, ITD or both cues simultaneously, as well as a more frequent centrally presented noise. In a separate psychophysical experiment, subjects actively discriminated lateralized from centrally presented stimuli. IID and ITD cues elicited different electric field topographies starting at approximately 75 ms post-stimulus onset, indicative of the engagement of distinct cortical networks. By contrast, no performance differences were observed between IID and ITD cues during the psychophysical experiment. Subjects did, however, respond significantly faster and more accurately when both cues were presented simultaneously. This performance facilitation exceeded predictions from probability summation, suggestive of interactions in neural processing of IID and ITD cues. Supra-additive neural response interactions as well as topographic modulations were indeed observed approximately 200 ms post-stimulus for the comparison of responses to the simultaneous presentation of both cues with the mean of those to separate IID and ITD cues. Source estimations revealed differential processing of IID and ITD cues initially within superior temporal cortices and also at later stages within temporo-parietal and inferior frontal cortices. Differences were principally in terms of hemispheric lateralization. The collective psychophysical and electrophysiological results support the hypothesis that IID and ITD cues are processed by distinct, but interacting, cortical networks that can in turn facilitate auditory localization.
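The probability-summation benchmark used above can be made concrete. Under a race model with two independent cues, the predicted accuracy for the combined condition is p_IID + p_ITD − p_IID·p_ITD (the trial succeeds if either cue alone would); observed performance beyond this bound is what motivates the inference of neural interaction. The accuracies below are invented for illustration, not the study's data:

```python
def probability_summation(p_a, p_b):
    """Race-model prediction for two independent cues: the combined
    response succeeds if at least one of the single-cue processes does."""
    return p_a + p_b - p_a * p_b

# illustrative (invented) single-cue discrimination accuracies
p_iid, p_itd = 0.80, 0.80
predicted = probability_summation(p_iid, p_itd)  # 0.96
observed = 0.99          # hypothetical combined-cue accuracy
exceeds = observed > predicted   # facilitation beyond probability summation
```

The same bound applies to response-time distributions in the fuller race-model inequality; the scalar form here is the simplest version of the test.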

Relevance: 30.00%

Abstract:

Because we live in an extremely complex social environment, people require the ability to memorize hundreds or thousands of social stimuli. The aim of this study was to investigate the effect of multiple repetitions on the processing of names and faces varying in terms of pre-experimental familiarity. We measured both behavioral and electrophysiological responses to self-, famous and unknown names and faces in three phases of the experiment (in every phase, each type of stimuli was repeated a pre-determined number of times). We found that the negative brain potential in posterior scalp sites observed approximately 170 ms after the stimulus onset (N170) was insensitive to pre-experimental familiarity but showed slight enhancement with each repetition. The negative wave in the inferior-temporal regions observed at approximately 250 ms (N250) was affected by both pre-experimental (famous>unknown) and intra-experimental familiarity (the more repetitions, the larger N250). In addition, N170 and N250 for names were larger in the left inferior-temporal region, whereas right-hemispheric or bilateral patterns of activity for faces were observed. The subsequent presentations of famous and unknown names and faces were also associated with higher amplitudes of the positive waveform in the central-parietal sites analyzed in the 320-900 ms time-window (P300). In contrast, P300 remained unchanged after the subsequent presentations of self-name and self-face. Moreover, the P300 for unknown faces grew more quickly than for unknown names. The latter suggests that the process of learning faces is more effective than learning names, possibly because faces carry more semantic information.

Relevance: 30.00%

Abstract:

SOUND OBJECTS IN TIME, SPACE AND ACTIONThe term "sound object" describes an auditory experience that is associated with an acoustic event produced by a sound source. At cortical level, sound objects are represented by temporo-spatial activity patterns within distributed neural networks. This investigation concerns temporal, spatial and action aspects as assessed in normal subjects using electrical imaging or measurement of motor activity induced by transcranial magnetic stimulation (TMS).Hearing the same sound again has been shown to facilitate behavioral responses (repetition priming) and to modulate neural activity (repetition suppression). In natural settings the same source is often heard again and again, with variations in spectro-temporal and spatial characteristics. I have investigated how such repeats influence response times in a living vs. non-living categorization task and the associated spatio-temporal patterns of brain activity in humans. Dynamic analysis of distributed source estimations revealed differential sound object representations within the auditory cortex as a function of the temporal history of exposure to these objects. Often heard sounds are coded by a modulation in a bilateral network. Recently heard sounds, independently of the number of previous exposures, are coded by a modulation of a left-sided network.With sound objects which carry spatial information, I have investigated how spatial aspects of the repeats influence neural representations. Dynamics analyses of distributed source estimations revealed an ultra rapid discrimination of sound objects which are characterized by spatial cues. 
This discrimination involved two temporo-spatially distinct cortical representations, one associated with position-independent and the other with position-linked representations within the auditory ventral ("what") stream.

Action-related sounds have been shown to increase the excitability of motoneurons within the primary motor cortex, possibly via input from the mirror neuron system. The role of these motor representations remains unclear. I have investigated repetition priming-induced plasticity of the motor representations of action sounds by measuring motor activity induced by TMS pulses applied over the hand motor cortex. TMS delivered to the hand area of the primary motor cortex yielded larger motor evoked potentials (MEPs) while subjects listened to sounds associated with manual than with non-manual actions. Repetition suppression was observed at the motoneuron level: during repeated exposure to the same manual action sound, the MEPs became smaller. I discuss these results in terms of specialized neural networks involved in sound processing that are characterized by repetition-induced plasticity.

Thus, the neural networks that underlie sound object representations are characterized by modulations that keep track of the temporal and spatial history of the sound and, in the case of action-related sounds, also of the way in which the sound is produced.


Human electrophysiological studies support a model whereby sensitivity to so-called illusory contour stimuli is first seen within the lateral occipital complex. A challenge to this model posits that the lateral occipital complex is a general site for crude region-based segmentation, based on findings of equivalent hemodynamic activations in the lateral occipital complex to illusory contour and so-called salient region stimuli, a stimulus class that lacks the classic bounding contours of illusory contours. Using high-density electrical mapping of visual evoked potentials, we show that early lateral occipital cortex activity is substantially stronger to illusory contour than to salient region stimuli, whereas later lateral occipital complex activity is stronger to salient region than to illusory contour stimuli. Our results suggest that equivalent hemodynamic activity to illusory contour and salient region stimuli probably reflects temporally integrated responses, a result of the poor temporal resolution of hemodynamic imaging. The temporal precision of visual evoked potentials is critical for establishing viable models of completion processes and visual scene analysis. We propose that crude spatial segmentation analyses, which are insensitive to illusory contours, occur first within dorsal visual regions, not the lateral occipital complex, and that initial illusory contour sensitivity is a function of the lateral occipital complex.
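The core argument above, that a slow hemodynamic measure can report equivalent activations for two stimulus classes whose electrophysiological responses differ in timing, can be illustrated with a toy numerical sketch. The response shapes and amplitudes below are made up for illustration and are not the study's data: one waveform has a strong early and weak late component, the other the reverse, yet their time-integrated totals coincide.

```python
import numpy as np

t = np.linspace(0.0, 0.5, 501)  # 0-500 ms time axis (s)
dt = t[1] - t[0]

def component(t, mu, sigma, amp):
    """Gaussian-shaped evoked-response component (arbitrary units)."""
    return amp * np.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

# Hypothetical illusory-contour-like response: strong early, weak late component
ic = component(t, 0.15, 0.02, 3.0) + component(t, 0.30, 0.02, 1.0)
# Hypothetical salient-region-like response: weak early, strong late component
sr = component(t, 0.15, 0.02, 1.0) + component(t, 0.30, 0.02, 3.0)

# A temporally integrating measure sees only the summed activity over time,
# so the distinct dynamics collapse into near-identical totals.
ic_total = ic.sum() * dt
sr_total = sr.sum() * dt
print(abs(ic_total - sr_total) < 1e-6)  # the integrated responses are equivalent
```

The early/late amplitude difference that distinguishes the two waveforms is exactly the kind of information that visual evoked potentials preserve and integrated hemodynamic signals discard.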


The processing of biological motion is a critical, everyday task performed with remarkable efficiency by human sensory systems. Interest in this ability has focused largely on biological motion processing in the visual modality (see, for example, Cutting, J. E., Moore, C., & Morrison, R. (1988). Masking the motions of human gait. Perception and Psychophysics, 44(4), 339-347). In naturalistic settings, however, biological motion is often defined by input to more than one sensory modality. For this reason, in a series of experiments we investigate behavioural correlates of multisensory, in particular audiovisual, integration in the processing of biological motion cues. More specifically, using a new psychophysical paradigm we investigate the effect of suprathreshold auditory motion on the perception of visually defined biological motion. Unlike data from previous studies investigating audiovisual integration in linear motion processing [Meyer, G. F. & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. Neuroreport, 12(11), 2557-2560; Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and visual motion signals at threshold. Perception and Psychophysics, 65(8), 1188-1196; Alais, D. & Burr, D. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194], we report the existence of direction-selective effects: relative to control (stationary) auditory conditions, auditory motion in the same direction as the visually defined biological motion target increased its detectability, whereas auditory motion in the opposite direction had the inverse effect.
Our data suggest these effects do not arise through general shifts in visuo-spatial attention, but instead are a consequence of motion-sensitive, direction-tuned integration mechanisms that are, if not unique to biological visual motion, at least not common to all types of visual motion. Based on these data and evidence from neurophysiological and neuroimaging studies we discuss the neural mechanisms likely to underlie this effect.