96 results for digital projects
Abstract:
In this review, we summarize how the concept of digital optics, applied to holographic microscopy, has allowed the development of a reliable and flexible digital holographic quantitative phase microscopy (DH-QPM) technique with nanoscale sensitivity that is particularly suitable for cell imaging. Particular emphasis is placed on the original biological information provided by the quantitative phase signal. We present the most relevant DH-QPM applications in the field of cell biology, including automated cell counting, recognition, classification, three-dimensional tracking, discrimination between physiological and pathophysiological states, and the study of cell membrane fluctuations at the nanoscale. In the last part, original results show how DH-QPM can address two important issues in neurobiology, namely multiple-site optical recording of neuronal activity and noninvasive visualization of dendritic spine dynamics, obtained through a fully tomographic digital holographic microscopy approach.
Abstract:
Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d', as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to threshold calculated detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control and optimization of digital x-ray systems for screening mammography.
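The d' computation described above combines a task function, the presampling MTF, an eye filter, and the NNPS. A minimal, radially symmetric sketch is given below; the eye-filter shape and all input curves are illustrative assumptions, not the measured data or exact model of the study:

```python
import numpy as np

def _trapz(y, x):
    # trapezoidal integration (avoids NumPy-version differences)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def eye_filter(f, c=2.2):
    # assumed eye-filter shape E(f) = f * exp(-c f); the study's exact model may differ
    return f * np.exp(-c * f)

def dprime_npwe(f, task, mtf, nnps):
    """NPWE detectability index for a radially symmetric task.
    f: spatial frequency (1/mm); task: contrast-weighted signal spectrum;
    mtf: presampling MTF; nnps: normalized noise power spectrum."""
    E = eye_filter(f)
    num = _trapz((task * mtf * E) ** 2 * 2 * np.pi * f, f) ** 2
    den = _trapz((task * mtf ** 2 * E ** 2) ** 2 * nnps * 2 * np.pi * f, f)
    return float(np.sqrt(num / den))
```

Note that d' computed this way is linear in object contrast, consistent with the linear d'-to-CNR relationship reported above.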
Abstract:
PURPOSE: To assess the technical feasibility of multi-detector row computed tomographic (CT) angiography in the assessment of peripheral arterial bypass grafts and to evaluate its accuracy and reliability in the detection of graft-related complications, including graft stenosis, aneurysmal changes, and arteriovenous fistulas. MATERIALS AND METHODS: Four-channel multi-detector row CT angiography was performed in 65 consecutive patients with 85 peripheral arterial bypass grafts. Each bypass graft was divided into three segments (proximal anastomosis, course of the graft body, and distal anastomosis), resulting in 255 segments. Two readers evaluated all CT angiograms with regard to image quality and the presence of bypass graft-related abnormalities, including graft stenosis, aneurysmal changes, and arteriovenous fistulas. Results were compared by using the McNemar test with Bonferroni correction. CT attenuation values were recorded at five locations from the inflow artery to the outflow artery of the bypass graft. These findings were compared with the findings at duplex ultrasonography (US) in 65 patients and at conventional digital subtraction angiography (DSA) in 27 patients. RESULTS: Image quality was rated good or excellent in 250 (98%) and 252 (99%) of the 255 bypass segments by the two readers, respectively. Agreement was excellent both between readers and between CT angiography and duplex US in the detection of graft stenosis, aneurysmal changes, and arteriovenous fistulas (kappa = 0.86-0.99). When CT angiography and duplex US were compared with conventional DSA, there was no statistically significant difference (P > .25) in sensitivity or specificity between the two modalities for either reader in the detection of hemodynamically significant bypass stenosis or occlusion, aneurysmal changes, or arteriovenous fistulas. Mean CT attenuation values ranged from 232 HU in the inflow artery to 281 HU in the outflow artery of the bypass graft.
CONCLUSION: Multi-detector row CT angiography may be an accurate and reliable technique after duplex US in the assessment of peripheral arterial bypass grafts and detection of graft-related complications, including stenosis, aneurysmal changes, and arteriovenous fistulas.
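Inter-reader agreement of the kind reported above (kappa = 0.86-0.99) is typically quantified with Cohen's kappa, which corrects observed agreement for agreement expected by chance. A minimal sketch (the `cohens_kappa` helper and the example ratings are illustrative, not the study's data):

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two readers beyond chance."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    po = float(np.mean(a == b))                      # observed agreement
    pe = sum(float(np.mean(a == c)) * float(np.mean(b == c))
             for c in np.union1d(a, b))              # chance agreement
    return (po - pe) / (1.0 - pe)
```

Kappa equals 1 for perfect agreement and 0 when agreement is no better than chance.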
Abstract:
Turtle Mountain in Alberta, Canada, has become an important field laboratory for testing techniques for characterizing and monitoring large slope mass movements, as the stability of large portions of the mountain's eastern face remains questionable. To better quantify the potentially unstable volumes, the most probable failure mechanisms, and the potential consequences, structural analysis and runout modeling were performed. The structural features of the eastern face were investigated using a high-resolution digital elevation model (HRDEM). Based on displacement datasets and structural observations, potential failure mechanisms affecting different portions of the mountain were assessed. The volumes of the potentially unstable blocks were calculated using the Sloping Local Base Level (SLBL) method. Based on these volume estimates, two- and three-dimensional dynamic runout analyses were performed, calibrated against experience from the adjacent Frank Slide and other similar rock avalanches. The results will be used to improve the contingency plans within the hazard area.
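The SLBL method mentioned above estimates a concave basal failure surface beneath a DEM and derives the unstable volume from the difference between the two surfaces. A minimal iterative sketch, assuming a fixed border and a fixed iteration count (production implementations add convergence tests and a mask of the unstable area):

```python
import numpy as np

def slbl_surface(dem, tol=0.0, n_iter=500):
    """Iterative SLBL sketch: carve a concave basal surface under a DEM patch.
    Border cells stay fixed; each interior cell is lowered toward the mean of
    its four neighbours minus a tolerance that controls the surface curvature."""
    z = np.asarray(dem, dtype=float).copy()
    for _ in range(n_iter):
        nb = (z[:-2, 1:-1] + z[2:, 1:-1] + z[1:-1, :-2] + z[1:-1, 2:]) / 4.0
        z[1:-1, 1:-1] = np.minimum(z[1:-1, 1:-1], nb - tol)
    return z

def unstable_volume(dem, base, cell_size):
    """Volume between the topographic surface and the SLBL basal surface."""
    return float(np.sum(np.asarray(dem, float) - base) * cell_size ** 2)
```

With `tol=0` the surface stays at the DEM on a flat patch; a positive tolerance carves a bowl whose depth grows with the tolerance.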
Abstract:
This is the fourth edition of the Nanosafety Cluster compendium. It documents the status of important projects on nanomaterial toxicity and exposure monitoring, integrated risk management, research infrastructure and coordination and support activities. The compendium is not intended to be a guidance document for human health and environmental safety management of nanotechnologies, as such guidance documents already exist and are widely available. Neither is the compendium intended to be a medium for the publication of scientific papers and research results, as this task is covered by scientific conferences and the peer reviewed press. The compendium aims to bring researchers closer together and show them the potential for synergy in their work. It is a means to establish links and communication between them during the actual research phase and well before the publication of their results. It thus focuses on the communication of projects' strategic aims, extensively covers specific work objectives and the methods used in research, and documents human capacities and available laboratory infrastructure. As such, the compendium supports collaboration on common goals and the joint elaboration of future plans, whilst compromising neither the potential for scientific publication, nor intellectual property rights.
Abstract:
We present a method, based on the marker-controlled watershed algorithm, to automatically segment red blood cells (RBCs) visualized by digital holographic microscopy (DHM). Quantitative phase images of RBCs obtained with off-axis DHM provide important information about each RBC, including size, shape, volume, and hemoglobin content. The most important step in marker-controlled watershed segmentation is the accurate localization of internal and external markers. Here, we first obtain a binary image via the Otsu algorithm and apply morphological operations to it to obtain the internal markers. We then apply the distance-transform algorithm combined with the watershed algorithm to generate external markers from the internal markers. Finally, combining the internal and external markers, we modify the original gradient image and apply the watershed algorithm. By appropriately identifying the internal and external markers, both oversegmentation and undersegmentation are avoided; the same marker-controlled approach can also separate the internal and external parts of the RBC phase image. Our experimental results show that the proposed method achieves good segmentation performance and could thus be helpful when combined with automated classification of RBCs.
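The pipeline described above (Otsu binarization, morphological marker extraction, distance transform, marker-controlled watershed) can be sketched as follows. This is an illustrative NumPy/SciPy reimplementation, not the authors' code, and it floods an inverted distance map rather than a modified gradient image:

```python
import numpy as np
from scipy import ndimage as ndi

def otsu_threshold(img, nbins=256):
    """Otsu's threshold (pure NumPy): maximize between-class variance."""
    hist, edges = np.histogram(img, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist.cumsum().astype(float)          # cumulative counts
    mu = (hist * centers).cumsum()           # cumulative intensity sums
    total, mu_t = w[-1], mu[-1]
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * w - mu * total) ** 2 / (w * (total - w))
    return centers[np.argmax(np.nan_to_num(sigma_b))]

def segment_rbc(phase):
    """Marker-controlled watershed sketch for a quantitative phase image."""
    binary = phase > otsu_threshold(phase)
    inner = ndi.binary_erosion(binary, iterations=2)      # internal markers
    markers, n_cells = ndi.label(inner)
    markers = markers.astype(np.int16)
    markers[~ndi.binary_dilation(binary, iterations=2)] = -1   # external/background marker
    dist = ndi.distance_transform_edt(binary)
    relief = ((dist.max() - dist) / max(dist.max(), 1e-9) * 65535).astype(np.uint16)
    return ndi.watershed_ift(relief, markers), n_cells
```

Each connected internal marker seeds one basin, so touching cells are separated along the watershed lines of the distance map.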
High resolution digital elevation model analysis for landslide hazard assessment (Åkerneset, Norway)
Abstract:
BACKGROUND: The diagnosis of malignant hematologic diseases has become increasingly complex during the last decade. It is based on the interpretation of results from different laboratory analyses, which range from microscopy to gene expression profiling. Recently, a method for the analysis of RNA phenotypes has been developed, the nCounter technology (Nanostring® Technologies), which allows for simultaneous quantification of hundreds of RNA molecules in biological samples. We evaluated this technique in a Swiss multi-center study on eighty-six samples from acute leukemia patients. METHODS: mRNA and protein profiles were established for normal peripheral blood and bone marrow samples. Signal intensities of the various tested surface antigens were similar to those found in previously performed Affymetrix microarray analyses. Acute leukemia samples were analyzed for a set of twenty-two validated antigens, and the Pearson correlation coefficient between nCounter and flow cytometry results was calculated. RESULTS: Highly significant values between 0.40 and 0.97 were found for the twenty-two antigens tested. A second correlation analysis performed on a per-sample basis yielded concordant results between flow cytometry and nCounter in 44-100% of the antigens tested (mean = 76%), depending on the number of blasts present in a sample, the homogeneity of the blast population, and the type of leukemia (AML or ALL). CONCLUSIONS: The nCounter technology allows for fast and easy depiction of an mRNA profile from hematologic samples. This technology has the potential to become a valuable tool for the diagnosis of acute leukemias, in addition to multi-color flow cytometry.
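The per-antigen agreement between nCounter counts and flow cytometry intensities reported above rests on the Pearson correlation coefficient; a minimal sketch of that statistic (the `pearson_r` helper and example vectors are illustrative):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two measurement vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.sum(xm * ym) / np.sqrt(np.sum(xm ** 2) * np.sum(ym ** 2)))
```

Values near 1 (or -1) indicate a strong positive (or negative) linear relationship between the two assays for a given antigen.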
Abstract:
We report the case of a 37-year-old woman who developed critical upper limb ischemia caused by a cervical rib. Because the malformation was initially undiagnosed, a vascular bypass was performed, and failure occurred. Following a 6-month therapy with sildenafil, revascularization of the arm was successful and amputation was avoided. A 6-year follow-up shows a rich collateral network at the compression site and normal values of digital plethysmography. Because hand surgeons often see patients with digital ulcerations and other manifestations of peripheral vascular pathology, therapy of ischemia with sildenafil could be an effective treatment option in patients not responding to classic drugs.
Abstract:
Quantitative phase microscopy (QPM) has recently emerged as a powerful quantitative imaging technique well suited to noninvasively exploring transparent specimens with nanometric axial sensitivity. In this review, we present recent developments in quantitative phase-digital holographic microscopy (QP-DHM), an important and efficient quantitative phase method for exploring cell structure and dynamics. In the second part, the most relevant QPM applications in the field of cell biology are summarized, with particular emphasis on the original biological information that can be derived from the quantitative phase signal. In the third part, recent applications of QP-DHM in cellular neuroscience are presented, namely the possibility to optically resolve neuronal network activity and spine dynamics. Finally, we discuss potential applications of QPM to psychiatry, through the identification of new and original cell biomarkers that, combined with a range of other biomarkers, could significantly contribute to determining high-risk developmental trajectories for psychiatric disorders.
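The quantitative phase signal referred to above encodes the product of cell thickness and the refractive-index difference between the cell interior and the surrounding medium, phi = (2*pi/lambda)*(n_cell - n_medium)*d. A minimal conversion sketch (the wavelength and refractive indices are illustrative defaults, not the values of the reviewed studies, and a homogeneous intracellular index is assumed):

```python
import numpy as np

def thickness_from_phase(phi_rad, wavelength_nm=682.0, n_cell=1.39, n_medium=1.34):
    """Cell thickness (nm) from a DHM phase shift (rad), assuming a homogeneous
    intracellular refractive index: d = phi * lambda / (2*pi*(n_cell - n_medium))."""
    return phi_rad * wavelength_nm / (2.0 * np.pi * (n_cell - n_medium))
```

This coupling is why decoupling procedures (e.g. medium exchange) are needed when both thickness and intracellular refractive index are of interest.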
Abstract:
Characterizing geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to carry out investigations aimed at mapping geological contacts and building stratigraphic and fold models. Detailed 3D data such as LiDAR point clouds make it possible to study accurately both hazard processes and the structure of geological features, in particular on vertical and overhanging rock slopes. 3D geological models therefore have great potential for application to a wide range of geological investigations, in research as well as in applied projects such as mines, tunnels, and reservoirs. Recent developments in ground-based remote sensing (LiDAR, photogrammetry, and multispectral/hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. Consequently, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remotely sensed data. During the past ten years, several large rockfall events occurred along important transportation corridors used by millions of people every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. To improve the forecasting of potential future landslides, it is crucial to understand the evolution of rock slope stability.
Defining the areas theoretically most prone to rockfalls is particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land-use planning in mountainous regions. The most important questions to address when assessing rockfall hazard are: Where are the most probable sources of future rockfalls located? What are their frequencies of occurrence? I characterized the fracturing patterns in the field and in LiDAR point clouds. I then developed a model to compute failure mechanisms directly on terrestrial point clouds in order to assess rockfall susceptibility at the cliff scale. Similar procedures were previously available only for aerial digital elevation models; the new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The computed most probable rockfall source areas in granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared with inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because of its particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its relevant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas feature huge vertical and overhanging cliffs that are difficult to study with classical methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling.
In particular, I conducted a back-analysis of the large rockfall event of 2005 (265'000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern, and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed for 3D geological mapping, using the intensity of the backscattered signal. Another technique for producing vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of the unstable rock compartments. The following points summarize the main outputs of my research: The new model for computing failure mechanisms and rockfall susceptibility from 3D point clouds allows the most probable rockfall source areas to be defined accurately at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed fracture measurements into geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and to use this information to model complex geological structures. Integrating these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
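The limit-equilibrium analyses mentioned above reduce, in the simplest case of dry planar sliding on a single joint, to FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)). A minimal sketch of that balance (the input values are illustrative, not the Dru back-analysis parameters):

```python
import numpy as np

def planar_fs(weight_kN, dip_deg, cohesion_kPa, area_m2, friction_deg):
    """Factor of safety for dry planar sliding on a single joint:
    FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi))."""
    psi, phi = np.radians(dip_deg), np.radians(friction_deg)
    resisting = cohesion_kPa * area_m2 + weight_kN * np.cos(psi) * np.tan(phi)
    driving = weight_kN * np.sin(psi)
    return float(resisting / driving)
```

With zero cohesion, FS falls to exactly 1 when the joint dip equals the friction angle, which is why degrading rock bridges (the cohesive term) can bring a steep compartment to failure.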