987 results for Digital mapping -- Databases


Relevance:

20.00%

Publisher:

Abstract:

Severe rainfall events are thought to be occurring more frequently in semi-arid areas. In Morocco, flood hazard has become an important topic, notably as rapid economic development and high urbanization rates have increased the exposure of people and assets in hazard-prone areas. The Swiss Agency for Development and Cooperation (SDC) is active in natural hazard mitigation in Morocco. As hazard mapping for urban planning is considered a sound tool for vulnerability reduction, the SDC has financed a project aimed at adapting the Swiss approach for hazard assessment and mapping to the case of Morocco (the town of Beni Mellal, Tadla-Azilal region). In a knowledge-transfer context, the Swiss method was adapted to the semi-arid environment, the specific piedmont morphology and the socio-economic constraints particular to the study site.
Following the Swiss guidelines, a hydro-geomorphological map was established, containing all geomorphic elements related to known past floods. Next, rainfall/runoff modeling for reference events and hydraulic routing of the obtained hydrographs were carried out in order to assess hazard quantitatively. Field-collected discharge estimations and flood extents for known floods were used to verify the model results. Flood hazard intensity and probability maps were obtained. Finally, an indicative danger map, as defined within the Swiss hazard assessment terminology, was calculated using the Swiss hazard matrix, which convolves flood intensity with its recurrence probability in order to assign flood danger degrees to the concerned territory. Danger maps become effective risk mitigation tools when implemented in urban planning. We focus on how local authorities are involved in the risk management process and how knowledge about risk impacts the management. An institutional vulnerability "map" was established based on individual interviews held with the main institutional actors in flood management. Results show that flood hazard management is defined by uneven actions and relationships, is based on top-down decision-making patterns, and maintains its focus on active mitigation measures. The institutional actors embody sectorial, often disconnected risk knowledge pools, whose relationships are dictated by the institutional hierarchy. Innovation in the risk management process emerges when actors collaborate despite the established hierarchy or when they open up to outside knowledge pools (e.g. academia). Several methodological and institutional recommendations were addressed to risk management stakeholders in view of potential map implementation in planning.
Hazard assessment and mapping are essential to an integrated risk management approach: more than mitigation tools, danger maps allow communication about hazards and help establish a risk culture.
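The matrix step described above can be sketched as a simple lookup that crosses flood intensity with recurrence probability to assign a danger degree to each map unit. The class labels and degree assignments below are simplified placeholders, not the official Swiss matrix values.

```python
# Illustrative hazard-matrix lookup. Class labels and the degree assigned to
# each (intensity, probability) pair are placeholders, not the Swiss values.
MATRIX = {
    ("high", "high"): "high",
    ("high", "medium"): "high",
    ("high", "low"): "medium",
    ("medium", "high"): "high",
    ("medium", "medium"): "medium",
    ("medium", "low"): "low",
    ("low", "high"): "medium",
    ("low", "medium"): "low",
    ("low", "low"): "low",
}

def danger_degree(intensity: str, probability: str) -> str:
    """Danger degree for one map unit ("high" probability = frequent events)."""
    return MATRIX[(intensity, probability)]
```

Applied cell by cell over the intensity and probability rasters, such a lookup yields the indicative danger map.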

Relevance:

20.00%

Publisher:

Abstract:

DNA microarray technology has arguably caught the attention of the worldwide life science community and is now systematically supporting major discoveries in many fields of study. The majority of the initial technical challenges of conducting experiments are being resolved, only to be replaced with new informatics hurdles, including statistical analysis, data visualization, interpretation and storage. Two systems of databases, one containing expression data and one containing annotation data, are quickly becoming essential knowledge repositories of the research community. This paper surveys several databases that are considered "pillars" of research and important nodes in the network. It focuses on a generalized workflow scheme typical for microarray experiments, using two examples related to cancer research. The workflow is used to reference appropriate databases and tools for each step in the process of array experimentation. Additionally, benefits and drawbacks of current array databases are addressed, and suggestions are made for their improvement.

Relevance:

20.00%

Publisher:

Abstract:

The large spatial inhomogeneity in the transmit B1 field (B1+) observable in human MR images at high static magnetic fields (B0) severely impairs image quality. To overcome this effect in brain T1-weighted images, the MPRAGE sequence was modified to generate two different images at different inversion times (MP2RAGE). By combining the two images in a novel fashion, it was possible to create T1-weighted images free of proton density contrast, T2* contrast, reception bias field and, to first order, transmit field inhomogeneity. MP2RAGE sequence parameters were optimized using Bloch equations to maximize the contrast-to-noise ratio per unit time between brain tissues and to minimize the effect of B1+ variations through space. Images of high anatomical quality and excellent brain tissue differentiation, suitable for applications such as segmentation and voxel-based morphometry, were obtained at 3 and 7 T. From such T1-weighted images, acquired within 12 min, high-resolution 3D T1 maps were routinely calculated at 7 T with sub-millimeter voxel resolution (0.65-0.85 mm isotropic). The T1 maps were validated in phantom experiments. In humans, the T1 values obtained at 7 T were 1.15 +/- 0.06 s for white matter (WM) and 1.92 +/- 0.16 s for grey matter (GM), in good agreement with literature values obtained at lower spatial resolution. At 3 T, where whole-brain acquisitions with 1 mm isotropic voxels were acquired in 8 min, the T1 values obtained (0.81 +/- 0.03 s for WM and 1.35 +/- 0.05 s for GM) were once again in very good agreement with values in the literature.
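The combination step can be sketched as follows: the MP2RAGE image is the real part of the product of the first inversion-time image with the complex conjugate of the second, divided by the sum of their squared magnitudes, so multiplicative terms common to both acquisitions cancel. The array sizes and signal values below are synthetic, chosen only to make the cancellation visible.

```python
import numpy as np

def mp2rage_combine(s1: np.ndarray, s2: np.ndarray) -> np.ndarray:
    """Combine the two complex inversion-time images into a uniform
    T1-weighted image. Multiplicative factors shared by both acquisitions
    (proton density, T2*, reception bias) cancel in the ratio, and the
    result is bounded in [-0.5, 0.5]."""
    return np.real(s1 * np.conj(s2)) / (np.abs(s1) ** 2 + np.abs(s2) ** 2)

# synthetic check: a shared multiplicative bias field cancels out entirely
rng = np.random.default_rng(0)
bias = rng.uniform(0.5, 2.0, size=(4, 4))   # stands in for PD/T2*/coil bias
out = mp2rage_combine(bias * (0.3 + 0.1j), bias * (0.8 - 0.2j))
assert np.allclose(out, out.flat[0])        # spatially uniform despite bias
```

Because each voxel's value depends only on the ratio of the two signals, the combined image is directly usable for T1 fitting.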

Relevance:

20.00%

Publisher:

Abstract:

The aim of this preliminary study is to investigate cortical organization together with cortico-subcortical connectivity in healthy subjects. Cortical maps were obtained by navigated TMS, and the resulting motor points were exported for tractographic study and analysis of their connections. Precise knowledge of the location of the primary motor cortical area and of its connections is the basis for subsequent studies of cortical and subcortical reorganization in patients with stroke. This reorganization is due to neuroplasticity and can be influenced by the neuromodulatory effects of non-invasive brain stimulation.

Relevance:

20.00%

Publisher:

Abstract:

The SIB Swiss Institute of Bioinformatics (www.isb-sib.ch) was created in 1998 as an institution to foster excellence in bioinformatics. It is renowned worldwide for its databases and software tools, such as UniProtKB/Swiss-Prot, PROSITE, SWISS-MODEL and STRING, all accessible on ExPASy.org, SIB's Bioinformatics Resource Portal. This article provides an overview of the scientific and training resources SIB has consistently been offering to the life science community for more than 15 years.

Relevance:

20.00%

Publisher:

Abstract:

The unstable rock slope, Stampa, above the village of Flåm, Norway, shows signs of both active and postglacial gravitational deformation over an area of 11 km2. Detailed structural field mapping, annual differential Global Navigation Satellite System (GNSS) surveys, as well as geomorphic analysis of high-resolution digital elevation models based on airborne and terrestrial laser scanning indicate that slope deformation is complex and spatially variable. Numerical modeling was used to investigate the influence of former rockslide activity and to better understand the failure mechanism. Field observations, kinematic analysis and numerical modeling indicate a strong structural control of the unstable area. Based on the integration of the above analyses, we propose that the failure mechanism is dominated by (1) a toppling component, (2) subsiding bilinear wedge failure and (3) planar sliding along the foliation at the toe of the unstable slope. Using differential GNSS, 18 points were measured annually over a period of up to 6 years. Two of these points show an average movement of around 10 mm/year. They are located at the frontal cliff on almost completely detached blocks with volumes smaller than 300,000 m3. Large fractures indicate deep-seated gravitational deformation of volumes reaching several hundred million m3, but the movement rates in these areas are below 2 mm/year. Two different lobes of prehistoric rock slope failures were dated with terrestrial cosmogenic nuclides. While the northern lobe gave an average age of 4,300 years BP, the southern one yielded two different ages (2,400 and 12,000 years BP), which most likely represent multiple rockfall events. This reflects the currently observable deformation style: unstable blocks in the northern part between Joasete and Furekamben, and no distinct blocks but high rockfall activity around Ramnanosi in the south.
A relative susceptibility analysis concludes that small collapses of blocks along the frontal cliff will be the most frequent events. Larger collapses of free-standing blocks along the cliff with volumes > 100,000 m3, large enough to reach the fjord, cannot be ruled out. A larger collapse involving several million m3 is presently considered of very low likelihood.
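The average movement rates quoted above come from repeated annual surveys; one straightforward way to turn annual positions into a rate is a least-squares line fit. The positions below are synthetic, not the Stampa survey data.

```python
import numpy as np

# Sketch: estimating an average displacement rate (mm/yr) from annual GNSS
# positions by fitting a line. Synthetic cumulative offsets, illustration only.
years = np.array([2006.0, 2007.0, 2008.0, 2009.0, 2010.0, 2011.0])
east_mm = np.array([0.0, 9.0, 21.0, 30.0, 41.0, 50.0])     # cumulative offset

rate_mm_per_yr, intercept = np.polyfit(years, east_mm, 1)  # slope = rate
```

A least-squares fit averages out survey-to-survey noise better than differencing only the first and last epochs.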

Relevance:

20.00%

Publisher:

Abstract:

In the last decade microsatellites have become one of the most useful genetic markers in a large number of organisms, due to their abundance and high level of polymorphism. Microsatellites have been used for individual identification, paternity tests, forensic studies and population genetics. Data on microsatellite abundance come preferentially from microsatellite-enriched libraries and DNA sequence databases. We conducted a search in GenBank of more than 16,000 Schistosoma mansoni ESTs and 42,000 BAC sequences. In addition, we obtained 300 sequences from CA- and AT-microsatellite-enriched genomic libraries. The sequences were searched for simple repeats using the RepeatMasker software. Of 16,022 ESTs, we detected 481 (3%) sequences containing 622 microsatellites (434 perfect, 164 imperfect and 24 compound). Of the 481 ESTs, 194 were grouped into 63 clusters containing 2 to 15 ESTs per cluster. Polymorphisms were observed in 16 clusters. The 287 remaining ESTs were orphan sequences. Of the 42,017 BAC end sequences, 1,598 (3.8%) contained microsatellites (2,335 perfect, 287 imperfect and 79 compound). Of the 1,598 BAC end sequences, 80 were grouped into 17 clusters containing 3 to 17 BAC end sequences per cluster. Microsatellites were present in 67 of the 300 sequences from the microsatellite-enriched libraries (55 perfect, 38 imperfect and 15 compound). From all of the observed loci, 55 were selected for having the longest perfect repeats and flanking regions that allowed the design of primers for PCR amplification. Additionally, we describe two new polymorphic microsatellite loci.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and time. Most machine learning algorithms are universal, adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns real-space constraints such as geomorphology, networks and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools.
To demonstrate the application of machine learning algorithms, several case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazard risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
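The learn-from-data loop shared by these models can be illustrated with a much simpler stand-in: a logistic classifier trained by gradient descent on synthetic geo-referenced samples. The data, features and model below are assumptions for illustration only; the study's ANN and SVM models are not reproduced here.

```python
import numpy as np

# Minimal sketch of supervised learning on geo-spatial data: each sample has
# (x, y) coordinates plus one DEM-derived feature (e.g. slope), and the class
# label is learned from examples rather than specified by a physical model.
rng = np.random.default_rng(42)

X = rng.uniform(0.0, 1.0, size=(200, 3))  # columns: x, y, slope feature
y = (X[:, 2] > 0.5).astype(float)         # synthetic ground-truth labels

w = np.zeros(3)
b = 0.0
for _ in range(500):                      # batch gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted class probability
    w -= 2.0 * (X.T @ (p - y)) / len(y)
    b -= 2.0 * float(np.mean(p - y))

pred = (X @ w + b) > 0.0                  # decision boundary at p = 0.5
accuracy = float(np.mean(pred == y))
```

Replacing the logistic model with an ANN or SVM changes the hypothesis class, but the workflow (features in, labels in, iterative fitting, prediction over the map) stays the same.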

Relevance:

20.00%

Publisher:

Abstract:

Different interferometric techniques were developed in the last decade to obtain full-field, quantitative and absolute phase imaging, such as phase shifting, Fourier phase microscopy, Hilbert phase microscopy and digital holographic microscopy (DHM). Although these techniques are very similar, DHM combines several advantages. In contrast to phase shifting, DHM is capable of single-shot hologram recording, allowing real-time absolute phase imaging. On the other hand, unlike Fourier phase or Hilbert phase microscopy, DHM does not require recording in-focus images of the specimen on the digital detector (CCD or CMOS camera), because focus can be adjusted numerically by wavefront propagation. Consequently, the depth of view of high-NA microscope objectives is numerically extended. For example, two different biological cells floating at different depths in a liquid can be focalized numerically from the same digital hologram. Moreover, the numerical propagation associated with digital optics and automatic fitting procedures permits vibration-insensitive full-field phase imaging and the complete compensation of, in principle, any image distortion and/or phase aberrations introduced, for example, by imperfections of holders or a perfusion chamber. Examples of real-time full-field phase images of biological cells have been demonstrated.
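Numerical refocusing by wavefront propagation can be sketched with the angular-spectrum method (the abstract does not name the kernel it uses, so this choice is an assumption): transform the complex field, multiply by the free-space transfer function for the chosen distance, and transform back. The beam, pixel size and wavelength below are illustrative.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel, distance):
    """Propagate a complex wavefront over `distance` (metres): FFT, multiply
    by the free-space transfer function, inverse FFT. Evanescent components
    (spatial frequencies above 1/wavelength) are suppressed."""
    n, m = field.shape
    fx, fy = np.meshgrid(np.fft.fftfreq(m, d=pixel),
                         np.fft.fftfreq(n, d=pixel))
    arg = 1.0 / wavelength**2 - fx**2 - fy**2
    transfer = np.where(
        arg > 0.0,
        np.exp(2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))),
        0.0,
    )
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# refocusing round trip on a synthetic Gaussian beam: 100 um out, then back
x = np.arange(-32, 32) * 1e-6                # 64 samples, 1 um pixels
xx, yy = np.meshgrid(x, x)
field = np.exp(-(xx**2 + yy**2) / (2 * (8e-6) ** 2)).astype(complex)
forward = angular_spectrum_propagate(field, 633e-9, 1e-6, 100e-6)
refocused = angular_spectrum_propagate(forward, 633e-9, 1e-6, -100e-6)
```

Because the distance is just a parameter, each object plane in the sample volume can be brought into focus from the same recorded hologram.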

Relevance:

20.00%

Publisher:

Abstract:

The systematic collection of behavioural information is an important component of second-generation HIV surveillance. The extent of behavioural surveillance among injecting drug users (IDUs) in Europe was examined using data collected through a questionnaire sent to all 31 countries of the European Union and the European Free Trade Association as part of a European-wide behavioural surveillance mapping study on HIV and other sexually transmitted infections. The questionnaire was returned by 28 countries during August to September 2008: 16 reported behavioural surveillance studies (two provided no further details). A total of 12 countries used repeated surveys for behavioural surveillance and five used their Treatment Demand Indicator system (three used both approaches). The data collected focused on drug use, injecting practices, testing for HIV and hepatitis C virus, and access to healthcare. Eight countries had set national indicators; three indicators were each reported by five countries: sharing of any injecting equipment, uptake of HIV testing and uptake of hepatitis C virus testing. The recall periods used varied. Seven countries reported conducting one-off behavioural surveys (in one country without a repeated survey, these formed an informal surveillance structure). All countries used convenience sampling, with service-based recruitment being the most common approach. Four countries had used respondent-driven sampling. Nearly two thirds of the responding countries (18/28) reported behavioural surveillance activities among IDUs; however, harmonisation of behavioural surveillance indicators is needed.