160 results for: Collective subject discourse technique
Abstract:
Introduction: The primary somatosensory cortex (SI) contains Brodmann areas (BA) 1, 2, 3a, and 3b. Research in non-human primates showed that BAs 3b, 1, and 2 each contain one full representation of the hand, with separate representations for each finger. This research also showed that the finger representation in BA3b has a larger and clearer finger somatotopy than BA1 and BA2. Although several attempts to map finger somatotopy in SI with fMRI have been made at 1.5 and 3T, these studies yielded variable results and were not able to detect single-subject finger somatotopy, probably because of the limited spatial extent of the cortical areas representing a digit (close to the resolution of most fMRI experiments), difficulties in acquiring consistent maps for individual subjects (Schweizer et al., 2008), or inter-individual variability in sulcal anatomy impeding group studies. Here, we used 7T fMRI to investigate finger somatotopy in SI, some of its functional characteristics, and its reproducibility.
Methods: Eight right-handed male subjects were scanned on a 7T scanner (Siemens Medical, Germany) with an 8-channel Tx/Rx RF coil (Rapid Biomedical, Germany). fMRI data with 1.3x1.3x1.3 mm3 resolution were acquired using a sinusoidal readout EPI sequence (Speck et al., 2008) with FOV=210 mm, TE/TR=27 ms/2.5 s, GRAPPA=2. Each volume contained 28 transverse slices covering SI. A single EPI volume with 64 slices was acquired to aid coregistration. Anatomical data with 1x1x1 mm3 resolution were acquired using the MP2RAGE sequence (Marques et al., 2009; TE/TR/TI1,2/TRmprage=2.63 ms/7.2 ms/0.9, 3.2 s/5 s). Subjects were positioned supine in the scanner with their right arm resting comfortably against the magnet bore. An experimenter positioned at the entrance of the bore could easily reach the right hand and successively stroke the two distal phalanges of each digit. The digits were stroked in the order D1 (thumb)-D3-D5-D2-D4, alternating 20 s ON and 10 s OFF. This sequence was repeated four times per run, and two functional runs were acquired per subject. Realignment, smoothing (FWHM 2 mm), coregistration of the anatomical to the fMRI data, and calculation of t-statistics were done using SPM8. An SI mask was obtained via an F-contrast (p<0.001) over all digits. Within the mask, each voxel was labeled with the number of the digit showing the highest t-value for that voxel.
Results: For all subjects, areas corresponding to the five digits were identified in contralateral SI. BA3b showed the most consistent somatotopic finger representation (see an example in Fig. 1). The five digits were localized in consecutive order in the cortex, with D1 most anterior, inferior and lateral and D5 most posterior, superior and medial (mean distance between centres of mass of digit representations ± stderr: 4.2±0.7 mm; see Fig. 2). The analysis of average beta values within each finger representation region revealed the specificity of each somatotopic region to the tactile input of the tested finger (except digits 4 and 5). Five of these subjects also presented an orderly and consecutive representation of the five digits in BA1 and BA2.
Conclusions: Our data reveal that the increased BOLD sensitivity at 7T and the high spatial resolution used in this study allow consistent somatotopic mapping using human touch as a stimulus, and that human SI contains at least three separate regions, each containing separate representations of all contralateral fingers. Moreover, adjacent fingers were represented in adjacent cortical regions across the three SI regions. The spatial organization of SI as reflected in individual-subject topography corresponds well with previous electrophysiological data in non-human primates. The small distance between digit representations highlights the need for the high spatial resolution available at 7T.
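The winner-take-all labeling used to assign voxels to digits (each voxel in the F-contrast SI mask takes the number of the digit with the highest t-value) is simple to express in code. The sketch below is only illustrative: it assumes the five digit t-maps and the SI mask have already been computed and exported as NIfTI files, the file names are hypothetical, and this is not the authors' SPM8 script.

```python
# Minimal sketch of the winner-take-all digit labeling described in the Methods:
# within the SI mask, each voxel is assigned the digit (1-5) whose t-map is
# highest at that voxel. File names are hypothetical placeholders; the t-maps
# and F-contrast mask are assumed to have been computed beforehand (e.g. SPM8).
import numpy as np
import nibabel as nib

mask_img = nib.load("si_mask_Fcontrast.nii")   # binary SI mask (F-contrast, p<0.001)
si_mask = mask_img.get_fdata() > 0

t_maps = np.stack(
    [nib.load(f"tmap_digit{d}.nii").get_fdata() for d in range(1, 6)],
    axis=-1,                                   # shape: (x, y, z, 5)
)

labels = np.zeros(si_mask.shape, dtype=np.int16)
labels[si_mask] = np.argmax(t_maps[si_mask], axis=-1) + 1   # 1 = D1 (thumb) ... 5 = D5

nib.save(nib.Nifti1Image(labels, mask_img.affine), "digit_labels.nii")
```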
Abstract:
PURPOSE: To investigate the ability of inversion recovery ON-resonant water suppression (IRON) in conjunction with P904 (superparamagnetic nanoparticles consisting of a maghemite core coated with a low-molecular-weight amino-alcohol derivative of glucose) to perform steady-state equilibrium-phase MR angiography (MRA) over a wide dose range. MATERIALS AND METHODS: Experiments were approved by the institutional animal care committee. Rabbits (n = 12) were imaged at baseline and serially after the administration of 10 incremental doses of 0.57-5.7 mgFe/kg of P904. Conventional T1-weighted and IRON MRA were obtained on a clinical 1.5 Tesla (T) scanner to image the thoracic and abdominal aorta and the peripheral vessels. Contrast-to-noise ratios (CNR) and vessel sharpness were quantified. RESULTS: Using IRON MRA, CNR and vessel sharpness progressively increased with incremental doses of the contrast agent P904, exhibiting consistently higher contrast values than T1-weighted MRA over a very wide range of contrast agent doses (CNR of 18.8 ± 5.6 for IRON versus 11.1 ± 2.8 for T1-weighted MRA at 1.71 mgFe/kg, P = 0.02, and 19.8 ± 5.9 for IRON versus -0.8 ± 1.4 for T1-weighted MRA at 3.99 mgFe/kg, P = 0.0002). Similar results were obtained for vessel sharpness in peripheral vessels (vessel sharpness of 46.76 ± 6.48% for IRON versus 33.20 ± 3.53% for T1-weighted MRA at 1.71 mgFe/kg, P = 0.002, and 48.66 ± 5.50% for IRON versus 19.00 ± 7.41% for T1-weighted MRA at 3.99 mgFe/kg, P = 0.003). CONCLUSION: Our study suggests that quantitative CNR and vessel sharpness after the injection of P904 are consistently higher for IRON MRA than for conventional T1-weighted MRA. These findings hold over a wide range of contrast agent doses.
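The abstract quantifies CNR but does not give the formula. A common definition, used below purely as an illustration, is the difference between mean vessel and mean background signal divided by the standard deviation of the noise; the ROI values are synthetic and do not come from the study.

```python
# Illustrative contrast-to-noise ratio (CNR) calculation from ROI statistics.
# The abstract does not specify the formula; this uses the common definition
# CNR = (S_vessel - S_background) / sigma_noise, with made-up example values.
import numpy as np

def cnr(vessel_roi: np.ndarray, background_roi: np.ndarray, noise_roi: np.ndarray) -> float:
    """CNR between a vessel ROI and a background ROI, normalised by the noise SD."""
    return (vessel_roi.mean() - background_roi.mean()) / noise_roi.std()

# Synthetic signal intensities (arbitrary units):
rng = np.random.default_rng(0)
vessel = rng.normal(120.0, 5.0, size=200)       # bright vessel lumen
background = rng.normal(60.0, 5.0, size=200)    # suppressed background tissue
noise = rng.normal(0.0, 3.0, size=200)          # noise-only region (e.g. air)
print(f"CNR = {cnr(vessel, background, noise):.1f}")
```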
Abstract:
Written in response to a collective discussion paper by Jennifer L. Mnookin et al., this Commentary identifies difficulties the authors encountered in defining, or agreeing on, the subject matter "forensic science" and its perceived deficiencies. Those authors conclude that there is a need for a research culture, whereas this Commentary calls for the development of a forensic science culture, fostered by forensic science education and fed by research dedicated to forensic science issues. It is a call for a change of emphasis and, perhaps, of paradigm.
Abstract:
Few subjects have caught the attention of the entire world as much as those dealing with natural hazards. The first decade of this new millennium provides a litany of tragic examples of various hazards that turned into disasters affecting millions of individuals around the globe. The human losses (some 225,000 people) associated with the 2004 Indian Ocean earthquake and tsunami, the economic costs (approximately 200 billion USD) of the 2011 Tohoku Japan earthquake, tsunami and reactor event, and the collective social impacts of human tragedies experienced during Hurricane Katrina in 2005 all provide repetitive reminders that we humans are temporary guests occupying a very active and angry planet. Many other examples could have been cited here to stress the point that natural events on Earth may, and often do, lead to disasters and catastrophes when humans place themselves into situations of high risk. Few subjects share the true interdisciplinary dependency that characterizes the field of natural hazards. From geology and geophysics to engineering and emergency response to social psychology and economics, the study of natural hazards draws input from an impressive suite of unique and previously independent specializations. Natural hazards provide a common platform to reduce disciplinary boundaries and facilitate a beneficial synergy in the provision of timely and useful information and action on this critical subject matter. As social norms change regarding the concept of acceptable risk, and as human migration leads to an explosion in the number of megacities, coastal over-crowding and unmanaged habitation in precarious environments such as mountainous slopes, the vulnerability of people and their susceptibility to natural hazards increase dramatically. Coupled with the concerns of changing climates, escalating recovery costs, and a growing divergence between more developed and less developed countries, the subject of natural hazards remains at the forefront of issues that affect all people, nations, and environments all the time.
This treatise provides a compendium of critical, timely and very detailed information and essential facts regarding the basic attributes of natural hazards and concomitant disasters. The Encyclopedia of Natural Hazards effectively captures and integrates contributions from an international portfolio of almost 300 specialists whose range of expertise addresses over 330 topics pertinent to the field of natural hazards. Disciplinary barriers are overcome in this comprehensive treatment of the subject matter. Clear illustrations and numerous color images enhance the primary aim to communicate and educate. The inclusion of a series of unique "classic case study" events interspersed throughout the volume provides tangible examples linking concepts, issues, outcomes and solutions. These case studies illustrate different but notable recent, historic and prehistoric events that have shaped the world as we now know it. They provide excellent focal points linking the remaining terms in the volume to the primary field of study. This Encyclopedia of Natural Hazards will remain a standard reference of choice for many years.
Abstract:
The aim of this pilot study is to analyse the discourse of fathers of toddlers concerning fatherhood and the link between some particularities of this discourse and the family alliance. The sample consists of 13 Swiss first-time fathers (5 fathers of girls and 8 of boys). In order to evaluate the paternal discourse, the fathers were given a semi-structured interview, which was later analysed using the research package Alceste. The family alliance, i.e., the degree of coordination among the partners when executing a task together, was assessed through the Lausanne Trilogue Play (Fivaz-Depeursinge & Corboz-Warnery, 1999). The main results indicated an interesting link between classes of paternal discourse grouped around the themes "affective relationship", "daily routine" and "educational goals", and the family alliance (defined in two major categories: functional and problematic alliances). Finally, clinical perspectives on the links between paternal representations and family functioning at an interactive level are discussed.
Abstract:
Pulse-wave velocity (PWV) is considered the gold-standard method to assess arterial stiffness, an independent predictor of cardiovascular morbidity and mortality. Currently available devices that measure PWV need to be operated by skilled medical staff, thus reducing the potential use of PWV in the ambulatory setting. In this paper, we present a new technique allowing continuous, unsupervised measurements of pulse transit times (PTT) in central arteries by means of a chest sensor. This technique relies on measuring the propagation time of pressure pulses from their genesis in the left ventricle to their later arrival at the cutaneous vasculature on the sternum. Combined thoracic impedance cardiography and phonocardiography are used to detect the opening of the aortic valve, from which a pre-ejection period (PEP) value is estimated. Multichannel reflective photoplethysmography at the sternum is used to detect the distal pulse-arrival time (PAT). A PTT value is then calculated as PTT = PAT - PEP. After optimizing the parameters of the chest PTT calculation algorithm on a nine-subject cohort, a prospective validation study involving 31 normo- and hypertensive subjects was performed. 1/chest PTT correlated very well with the COMPLIOR carotid-to-femoral PWV (r = 0.88, p < 10^-9). Finally, an empirical method to map chest PTT values onto chest PWV values is explored.
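The timing arithmetic at the heart of the method, PTT = PAT - PEP, can be illustrated with a few beats of synthetic event timestamps. The sketch below assumes the ECG R-peak is used as the common fiducial for both PEP and PAT, which is the usual convention but is not stated explicitly in the abstract; the detection of the aortic valve opening and of the sternal pulse is not shown.

```python
# Illustration of the per-beat timing arithmetic described above: the pulse
# transit time is the pulse-arrival time at the sternum (PAT) minus the
# pre-ejection period (PEP), i.e. PTT = PAT - PEP. Event timestamps are
# hypothetical; the authors' detection algorithms (impedance cardiography,
# phonocardiography, photoplethysmography) are not reproduced here.
import numpy as np

r_peaks = np.array([0.000, 0.850, 1.700])        # ECG R-peak times (s), one per beat
valve_open = np.array([0.085, 0.940, 1.788])     # aortic valve opening (ICG + PCG)
pulse_sternum = np.array([0.155, 1.012, 1.861])  # pulse arrival at the sternum (PPG)

pep = valve_open - r_peaks       # pre-ejection period per beat
pat = pulse_sternum - r_peaks    # pulse-arrival time per beat
ptt = pat - pep                  # chest pulse transit time per beat

print("PTT per beat (ms):", np.round(ptt * 1000, 1))
print("mean 1/PTT (1/s):", round(float(np.mean(1.0 / ptt)), 2))  # quantity correlated with PWV
```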
Abstract:
Quantitative assessment of the hazards of, and exposures to, nanomaterials faces many uncertainties that will only be resolved as scientific knowledge of their properties progresses. One consequence of these uncertainties is that the occupational exposure limit values currently defined for dusts are not necessarily relevant to nanomaterials. In the absence of a quantitative reference framework, and at the request of the DGS to inform the work of AFNOR and ISO on the subject, a graduated risk-management (control banding) approach was developed within Anses. This development was carried out with the help of a group of expert rapporteurs attached to the Expert Committee on the assessment of risks related to physical agents, new technologies and major infrastructure. The proposed control-banding approach is implemented in four main steps:
1. Information gathering. This step consists of collecting the available information on the hazards of the manufactured nanomaterial under consideration, as well as on the potential exposure of people at the workstations concerned (field observation, measurements, etc.).
2. Assignment of a hazard band. The potential hazard of the manufactured nanomaterial present, whether raw or incorporated in a (liquid or solid) matrix, is assessed at this step. The assigned hazard band takes into account the hazard of the bulk product or of its analogous substance at the non-nanometric scale, the biopersistence of the material (for fibrous materials), its solubility and its possible reactivity.
3. Assignment of an exposure band. The exposure band of the manufactured nanomaterial under consideration, or of a product containing it, is defined by the emission potential of the product. It takes into account its physical form (solid, liquid, powder, aerosol), its dustiness and its volatility. The number of workers, the frequency and duration of exposure, and the quantity handled are not taken into account, unlike in a classical chemical risk assessment.
4. Derivation of a risk-control band. Crossing the previously assigned hazard and exposure bands defines the level of risk control. It indicates the technical and organisational measures to be implemented to keep the risk as low as possible. An action plan is then defined to guarantee the effectiveness of the prevention recommended by the determined control level; it takes existing prevention measures into account and reinforces them where necessary. If the measures indicated by the risk-control level are not feasible, for example for technical or budgetary reasons, an in-depth risk assessment must be carried out by an expert.
Control banding is an alternative method for carrying out a qualitative risk assessment and putting prevention measures in place without resorting to a quantitative risk assessment. Its use seems particularly well suited to the context of manufactured nanomaterials, for which the choice of reference values (occupational exposure limit values) and of appropriate measurement techniques suffers from great uncertainty.
The proposed approach relies on simple criteria, available in the scientific literature or through the technical data relating to the products used. Nevertheless, its implementation requires minimal competence in the fields of chemical risk prevention (chemistry, toxicology, etc.), nanosciences and nanotechnologies.
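Step 4 of the approach, crossing the hazard band with the exposure band to obtain a risk-control level, amounts to a lookup in a two-dimensional matrix. The sketch below illustrates that idea only; the band definitions and matrix entries are hypothetical and do not reproduce the actual Anses grid.

```python
# Generic illustration of the control-banding step described above: a hazard
# band and an emission-potential (exposure) band are crossed in a lookup matrix
# to obtain a risk-control level. Band counts and matrix values are hypothetical.

# Rows: hazard bands HB1 (lowest) .. HB5 (highest)
# Columns: emission-potential bands EP1 (lowest) .. EP4 (highest)
CONTROL_MATRIX = [
    # EP1   EP2    EP3    EP4
    ["CL1", "CL1", "CL2", "CL3"],  # HB1
    ["CL1", "CL2", "CL2", "CL3"],  # HB2
    ["CL2", "CL2", "CL3", "CL4"],  # HB3
    ["CL3", "CL3", "CL4", "CL5"],  # HB4
    ["CL4", "CL4", "CL5", "CL5"],  # HB5
]

def control_level(hazard_band: int, emission_band: int) -> str:
    """Return the risk-control level for 1-based hazard and emission bands."""
    return CONTROL_MATRIX[hazard_band - 1][emission_band - 1]

# Example: a biopersistent fibrous material (high hazard) handled as a dusty
# powder (high emission potential) lands in the most demanding control level.
print(control_level(hazard_band=4, emission_band=4))  # -> "CL5"
```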
Abstract:
Cerebral blood flow can be studied in a multislice mode with a recently proposed perfusion sequence that uses inversion of water spins as an endogenous tracer without magnetization transfer artifacts. The magnetization-transfer-insensitive labeling technique (TILT) has been used for mapping blood flow changes at a microvascular level under motor activation in a multislice mode. In TILT, perfusion mapping is achieved by subtraction of a perfusion-sensitized image from a control image. Perfusion weighting is accomplished by proximal blood labeling using two 90° radiofrequency excitation pulses. For control preparation, the labeling pulses are modified such that they have no net effect on blood water magnetization. The percentage of blood flow change, as well as its spatial extent, was studied in single- and multislice modes with varying delays between labeling and imaging. The average perfusion signal change due to activation was 36.9 +/- 9.1% in the single-slice experiments and 38.1 +/- 7.9% in the multislice experiments. The volume of activated brain areas amounted to 1.51 +/- 0.95 cm3 in the contralateral primary motor (M1) area, 0.90 +/- 0.72 cm3 in the ipsilateral M1 area, 1.27 +/- 0.39 cm3 in the contralateral and 1.42 +/- 0.75 cm3 in the ipsilateral premotor areas, and 0.71 +/- 0.19 cm3 in the supplementary motor area.
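The core of the TILT analysis described above, subtraction of the perfusion-sensitized (label) images from the control images and expression of the activation-induced change as a percentage of the baseline perfusion signal, can be sketched as follows. The arrays, noise levels and block timing are synthetic placeholders, not the study's data or processing pipeline.

```python
# Sketch of the TILT subtraction: perfusion-weighted signal = control - label,
# and the activation effect is reported as a percent change relative to rest.
# All numbers below are synthetic; a real analysis would use motion-corrected
# image pairs, appropriate post-labeling delays and region-of-interest masks.
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64, 8, 40)                         # (x, y, z, time), synthetic series
control = 100.0 + rng.normal(0.0, 1.0, shape)
label = 98.5 + rng.normal(0.0, 1.0, shape)      # labeled inflowing blood lowers the signal
label[..., 20:] -= 0.5                          # simulate increased labeled-blood delivery during activation

perfusion_weighted = control - label            # pairwise subtraction -> perfusion-weighted signal

rest_mean = perfusion_weighted[..., :20].mean()        # first half of the run: rest
activation_mean = perfusion_weighted[..., 20:].mean()  # second half: motor activation

percent_change = 100.0 * (activation_mean - rest_mean) / rest_mean
print(f"perfusion signal change due to activation: {percent_change:.1f}%")
```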
Abstract:
Calibrated BOLD fMRI is a promising alternative to the classic BOLD contrast due to its reduced venous sensitivity and greater physiological specificity. The delayed adoption of this technique for cognitive studies may stem partly from a lack of information on the reproducibility of these measures in the context of cognitive tasks. In this study we explored the applicability and reproducibility of a state-of-the-art calibrated BOLD technique using a complex functional task at 7 tesla. Reproducibility measures of BOLD, CBF, CMRO2, flow-metabolism coupling n, and the calibration parameter M were compared and interpreted for three ROIs. We found an average intra-subject variation of CMRO2 of 8% across runs and 33% across days. BOLD (46% across runs, 36% across days), CBF (33% across runs, 46% across days) and M (41% across days) showed significantly higher intra-subject variability. Inter-subject variability was found to be high for all quantities, though CMRO2 was the most consistent across brain regions. The results of this study provide evidence that calibrated BOLD may be a viable alternative for longitudinal and cognitive MRI studies.
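The abstract does not spell out the calibration model behind M, n and CMRO2; calibrated BOLD studies commonly use the Davis model, in which the fractional BOLD change equals M(1 - (CBF/CBF0)^(alpha-beta) * (CMRO2/CMRO2_0)^beta). The sketch below assumes that model, with literature-typical alpha and beta values and made-up example numbers, rather than the authors' actual parameters.

```python
# Sketch of how a relative CMRO2 change is typically derived in calibrated BOLD,
# assuming the Davis model:
#   dBOLD/BOLD0 = M * (1 - (CBF/CBF0)**(alpha - beta) * (CMRO2/CMRO2_0)**beta)
# alpha, beta and the example numbers are common literature choices, not values
# taken from this study.

def cmro2_ratio(dbold: float, cbf_ratio: float, m: float,
                alpha: float = 0.38, beta: float = 1.5) -> float:
    """Relative CMRO2 change (CMRO2/CMRO2_0) solved from the Davis model."""
    return ((1.0 - dbold / m) / cbf_ratio ** (alpha - beta)) ** (1.0 / beta)

# Example: 1% BOLD increase, 40% CBF increase, calibration parameter M = 0.08.
r = cmro2_ratio(dbold=0.01, cbf_ratio=1.40, m=0.08)
n = 0.40 / (r - 1.0)   # flow-metabolism coupling n = fractional dCBF / fractional dCMRO2
print(f"CMRO2 change: {100 * (r - 1):.1f}%,  n = {n:.1f}")
```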