871 results for Biomedical imaging and visualization
Abstract:
Investigation of large, destructive earthquakes is challenged by their infrequent occurrence and the remote nature of geophysical observations. This thesis sheds light on the source processes of large earthquakes from two perspectives: robust, quantitative observational constraints obtained through Bayesian inference for earthquake source models, and physical insight into the interconnections of seismic and aseismic fault behavior obtained from elastodynamic modeling of earthquake ruptures and aseismic processes.
To constrain shallow deformation during megathrust events, we develop semi-analytical and numerical Bayesian approaches that explore the maximum resolution of tsunami data, with a focus on incorporating the uncertainty of the forward modeling. These methodologies are then applied to invert for the coseismic seafloor displacement field of the 2011 Mw 9.0 Tohoku-Oki earthquake using near-field tsunami waveforms, and for coseismic fault slip models of the 2010 Mw 8.8 Maule earthquake using complementary tsunami and geodetic observations. From posterior estimates of model parameters and their uncertainties, we quantitatively constrain the near-trench profiles of seafloor displacement and fault slip. Similar characteristic patterns emerge in both events: uplift peaks near the edge of the accretionary wedge and decays toward the trench axis, with implications for fault failure and the tsunamigenic mechanisms of megathrust earthquakes.
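The statistical core of this approach can be illustrated with a minimal linear-Gaussian sketch (all names, values, and the toy Green's function below are illustrative, not the thesis's actual tsunami forward model): forward-model uncertainty enters the inversion simply by adding a prediction-error covariance to the observational noise covariance, which broadens the posterior accordingly.

```python
import numpy as np

# Minimal linear-Gaussian Bayesian inversion sketch: d = G m + noise.
# C_d is the observational noise covariance; C_p is the forward-model
# (prediction) error covariance. Adding the two is a standard way to
# account for imperfect forward modeling, e.g. approximate tsunami
# Green's functions.
rng = np.random.default_rng(0)
n_data, n_model = 40, 10

G = rng.normal(size=(n_data, n_model))           # toy Green's functions
m_true = np.sin(np.linspace(0, np.pi, n_model))  # toy slip/uplift profile
d_obs = G @ m_true + 0.05 * rng.normal(size=n_data)

C_d = 0.05**2 * np.eye(n_data)   # observational noise covariance
C_p = 0.10**2 * np.eye(n_data)   # forward-model error covariance
C_m = 1.0**2 * np.eye(n_model)   # Gaussian prior covariance on the model

# Closed-form Gaussian posterior (least-squares solution with priors)
K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_d + C_p)
m_post = K @ d_obs               # posterior mean
C_post = C_m - K @ G @ C_m       # posterior covariance

print("posterior mean :", np.round(m_post, 2))
print("posterior sigma:", np.round(np.sqrt(np.diag(C_post)), 2))
```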
To understand the behavior of earthquakes at the base of the seismogenic zone on continental strike-slip faults, we simulate the interactions of dynamic earthquake rupture, aseismic slip, and heterogeneity in rate-and-state fault models coupled with shear heating. Our study explains the long-standing enigma of seismic quiescence on major fault segments known to have hosted large earthquakes: large earthquakes penetrate deeper, below the seismogenic zone, where mature faults have well-localized creeping extensions. This conclusion is supported by the simulated relationship between seismicity and large earthquakes as well as by observations from recent large events. We also use the models to connect geodetic observables of fault locking with the behavior of seismicity, investigating how a combination of interseismic geodetic and seismological estimates could constrain the locked-creeping transition of faults and potentially their co- and post-seismic behavior.
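For readers unfamiliar with rate-and-state friction, the sketch below shows the classic single-patch velocity-step response under the Dieterich aging law (parameter values are illustrative; the thesis's coupled elastodynamic models with shear heating are far richer):

```python
import numpy as np

# Rate-and-state friction (Dieterich aging law) response to a velocity
# step, for a single fault patch driven at an imposed slip rate.
mu0, V0 = 0.6, 1e-6            # reference friction and slip rate (m/s)
a, b, Dc = 0.010, 0.015, 1e-4  # a < b: velocity-weakening (seismogenic)

def friction(V, theta):
    return mu0 + a * np.log(V / V0) + b * np.log(V0 * theta / Dc)

dt, steps = 1e-3, 200_000
theta = Dc / V0                # start at steady state for V0
for i in range(steps):
    V = V0 if i < steps // 2 else 10 * V0   # 10x velocity step halfway
    theta += dt * (1.0 - V * theta / Dc)    # aging law: dtheta/dt = 1 - V*theta/Dc
    if i % 50_000 == 0:
        print(f"t={i*dt:8.1f}s  V={V:.1e}  mu={friction(V, theta):.4f}")

# After the step, friction jumps up by ~a*ln(10), then decays by ~b*ln(10)
# to a lower steady state: the velocity-weakening behavior that permits
# stick-slip (earthquake-like) instabilities.
print("steady-state friction change per decade:", (a - b) * np.log(10))
```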
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become increasingly popular. Spatial data indexing, search, analysis, and visualization, together with the resource management of such services, are increasingly important for delivering the Quality of Service (QoS) that users expect. First, spatial indexing is typically time-consuming and not available to end users. To address this, we introduce TerraFly sksOpen, an open-source online indexing and querying system for big geospatial data. Integrated with the TerraFly geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing top-k spatial Boolean queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology for customizing map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques that predict the demand of map workloads online and optimize resource allocation, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that, compared to traditional peak-load-based resource allocation, v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% in resource usage.
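As a concrete illustration of what a top-k spatial Boolean query returns, here is a naive linear-scan sketch (names and data are made up; sksOpen's actual index-based algorithm is not reproduced here):

```python
import heapq
from math import hypot

# Illustrative top-k spatial Boolean query: return the k objects nearest
# to a query point whose keyword sets contain all required terms. A naive
# linear scan is used for clarity; a real engine would use a spatial index.
def topk_spatial_boolean(objects, query_xy, required, k):
    qx, qy = query_xy
    matches = ((hypot(x - qx, y - qy), name)
               for name, (x, y), kws in objects
               if required <= kws)            # Boolean AND over keywords
    return heapq.nsmallest(k, matches)        # k nearest matching objects

pois = [
    ("cafe_a",  (1.0, 2.0), {"coffee", "wifi"}),
    ("cafe_b",  (0.5, 0.5), {"coffee"}),
    ("library", (2.0, 1.0), {"wifi", "books"}),
    ("cafe_c",  (3.0, 3.0), {"coffee", "wifi"}),
]
print(topk_spatial_boolean(pois, (0.0, 0.0), {"coffee", "wifi"}, k=2))
```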
Abstract:
Decades of costly failures in translating drug candidates from preclinical disease models to human therapeutic use warrant reconsideration of the priority placed on animal models in biomedical research. Following an international workshop attended by experts from academia, government institutions, research funding bodies, and the corporate and nongovernmental organisation (NGO) sectors, in this consensus report, we analyse, as case studies, five disease areas with major unmet needs for new treatments. In view of the scientifically driven transition towards a human pathway-based paradigm in toxicology, a similar paradigm shift appears to be justified in biomedical research. There is a pressing need for an approach that strategically implements advanced, human biology-based models and tools to understand disease pathways at multiple biological scales. We present recommendations to help achieve this.
Abstract:
Event extraction from text aims to detect structured information such as what happened, to whom, where, and when. Event extraction and visualization are typically treated as two separate tasks. In this paper, we propose a novel approach based on probabilistic modelling to jointly extract and visualize events from tweets, where each task benefits from the other. We model each event as a joint distribution over named entities, a date, a location, and event-related keywords. Moreover, both tweets and event instances are associated with coordinates in the visualization space. The manifold assumption, that the intrinsic geometry of tweets is a low-rank, non-linear manifold within the high-dimensional space, is incorporated into the learning framework through a regularization term. Experimental results show that the proposed approach deals effectively with both event extraction and visualization and performs markedly better than both a state-of-the-art event extraction method and a pipeline approach to event extraction and visualization.
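The manifold assumption is typically enforced with a graph-Laplacian regularizer; the sketch below shows that ingredient in isolation (toy data and names, not the paper's full joint probabilistic model):

```python
import numpy as np

# Manifold regularization in a nutshell: tweets that are close in the
# high-dimensional feature space should receive nearby coordinates in the
# 2-D visualization space. The penalty is tr(Y^T L Y), where L is the
# graph Laplacian of a k-nearest-neighbor similarity graph.
def knn_laplacian(X, k=5):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    sigma2 = np.median(d2)                               # kernel bandwidth
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nbrs = np.argsort(d2[i])[1:k + 1]                # k nearest, skip self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / sigma2)       # heat-kernel weights
    W = np.maximum(W, W.T)                               # symmetrize
    return np.diag(W.sum(1)) - W                         # L = D - W

def manifold_penalty(Y, L):
    return np.trace(Y.T @ L @ Y)  # small when neighbors get nearby coords

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 100))    # toy tweet feature vectors
Y = rng.normal(size=(50, 2))      # toy visualization coordinates
print("penalty:", manifold_penalty(Y, knn_laplacian(X)))
```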
Abstract:
Background: Plant-soil interaction is central to human food production and ecosystem function. It is therefore essential not only to understand these interactions but also to develop predictive mathematical models that can be used to assess how climate and soil management practices will affect them. Scope: In this paper we review current developments in structural and chemical imaging of rhizosphere processes within the context of multiscale, mathematical, image-based modeling. We outline areas that need more research and areas that would benefit from more detailed understanding. Conclusions: We conclude that the combination of structural and chemical imaging with modeling is a powerful tool that is fundamental to understanding how plant roots interact with soil, and we emphasize the need to attract more researchers to an area so fertile for future discoveries. Finally, model building must go hand in hand with experiments. In particular, there is a real need to integrate rhizosphere structural and chemical imaging with models that explicitly account for pore-scale processes.
Abstract:
It is widely accepted that edema occurs early in the ischemic zone and persists in stable form for at least 1 week after myocardial ischemia/reperfusion. However, no longitudinal studies covering very early (minutes) to late (1 week) reperfusion stages have confirmed this phenomenon. This study sought to perform a comprehensive longitudinal imaging and histological characterization of the edematous reaction after experimental myocardial ischemia/reperfusion. The study population consisted of 25 instrumented Large White pigs (30 kg to 40 kg). Closed-chest 40-min ischemia/reperfusion was performed in 20 pigs, which were sacrificed at 120 min (n = 5), 24 h (n = 5), 4 days (n = 5), and 7 days (n = 5) after reperfusion and processed for histological quantification of myocardial water content. Cardiac magnetic resonance (CMR) scans with T2-weighted short-tau inversion recovery and T2-mapping sequences were performed at every follow-up stage until sacrifice. Five additional pigs sacrificed after baseline CMR served as controls. In all pigs, reperfusion was associated with a significant increase in T2 relaxation times in the ischemic region. On 24-h CMR, ischemic myocardium T2 times returned to normal (pre-infarction) values. Thereafter, ischemic myocardium T2 times on CMR performed on days 4 and 7 after reperfusion increased progressively and systematically. On day-7 CMR, T2 relaxation times were as high as those observed at reperfusion. Myocardial water content analysis in the ischemic region showed a parallel bimodal pattern: two high-water-content peaks, at reperfusion and at day 7, with a significant decrease at 24 h. Contrary to the accepted view, myocardial edema during the first week after ischemia/reperfusion follows a bimodal pattern: the initial wave appears abruptly upon reperfusion and dissipates by 24 h, whereas the deferred wave appears progressively in the days after ischemia/reperfusion and is maximal around day 7.
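For reference, T2 mapping boils down to a per-voxel mono-exponential fit of signal against echo time; a minimal log-linear sketch with synthetic values (the echo times and T2 values below are illustrative, not the study's protocol):

```python
import numpy as np

# T2 mapping in a nutshell: fit S(TE) = S0 * exp(-TE / T2) per voxel from
# multi-echo data. A log-linear least-squares fit is shown on noiseless
# synthetic signals for three hypothetical tissue states.
TE = np.array([10., 20., 30., 40., 50., 60.])   # echo times (ms)
t2_true = np.array([45., 60., 70.])             # toy T2 values (ms)
S = np.exp(-TE[None, :] / t2_true[:, None])     # synthetic signals

# log S = log S0 - TE / T2, so a straight-line fit recovers 1/T2
A = np.vstack([np.ones_like(TE), -TE]).T        # design matrix
coef, *_ = np.linalg.lstsq(A, np.log(S).T, rcond=None)
t2_fit = 1.0 / coef[1]
print("fitted T2 (ms):", np.round(t2_fit, 1))   # recovers t2_true
```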
Abstract:
PRISM (Polarized Radiation Imaging and Spectroscopy Mission) was proposed to ESA in May 2013 as a large-class mission, within the framework of the ESA Cosmic Vision program, for investigating a set of important scientific questions that require high-resolution, high-sensitivity, full-sky observations of the sky emission at wavelengths ranging from millimeter-wave to the far-infrared. PRISM's main objective is to explore the distant universe, probing cosmic history from very early times until now as well as the structures, distribution of matter, and velocity flows throughout our Hubble volume. PRISM will survey the full sky in a large number of frequency bands in both intensity and polarization and will measure the absolute spectrum of sky emission more than three orders of magnitude better than COBE FIRAS. The data obtained will allow us to precisely measure the absolute sky brightness and polarization of all components of the sky emission in the observed frequency range, cleanly separating the primordial and extragalactic components from the galactic and zodiacal light emissions. The aim of this Extended White Paper is to provide a more detailed overview of the highlights of the new science that PRISM will make possible, which include: (1) the ultimate galaxy cluster survey using the Sunyaev-Zeldovich (SZ) effect, detecting approximately 10^6 clusters extending to large redshift, including a characterization of the gas temperature of the brightest ones (through the relativistic corrections to the classic SZ template), as well as a peculiar velocity survey using the kinetic SZ effect that encompasses our entire Hubble volume; (2) a detailed characterization of the properties and evolution of dusty galaxies, where most of the star formation in the universe took place, the faintest population of which constitutes the diffuse CIB (Cosmic Infrared Background); (3) a characterization of the B modes from primordial gravity waves generated during inflation and from gravitational lensing, as well as the ultimate search for primordial non-Gaussianity using CMB polarization, which is less contaminated by foregrounds on small scales than the temperature anisotropies; (4) a search for distortions from a perfect blackbody spectrum, which include some nearly certain signals and others that are more speculative but more informative; and (5) a study of the role of the magnetic field in star formation and its interaction with other components of the interstellar medium of our Galaxy. These are but a few of the highlights presented here, along with a description of the proposed instrument.
Abstract:
Conference abstract: Poster presented at the 12th International Conference on Materials Chemistry (MC12), 20-23 July 2015, York, United Kingdom.
Abstract:
A direct reconstruction algorithm for complex conductivities in W^{2,∞}(Ω), where Ω is a bounded, simply connected Lipschitz domain in ℝ², is presented. The framework is based on the uniqueness proof by Francini (2000 Inverse Problems 16 107-19), but the equations relating the Dirichlet-to-Neumann map to the scattering transform and to the exponentially growing solutions are not present in that work and are derived here. The algorithm constitutes the first D-bar method for the reconstruction of conductivities and permittivities in two dimensions. Reconstructions of numerically simulated chest phantoms with discontinuities at the organ boundaries are included.
Abstract:
Optical coherence tomography (OCT) systems are becoming more commonly used in biomedical imaging and, to enable continued uptake, a reliable method of characterizing their performance and validating their operation is required. This paper outlines the use of femtosecond-laser subsurface micro-inscription techniques to fabricate an OCT test artifact for validating the resolution performance of a commercial OCT system. The key advantage of this approach is that, by exploiting nonlinear absorption, a three-dimensional grid of highly localized point and line defects can be written in clear fused-silica substrates.
Abstract:
We present a mini-review of the development and contemporary applications of diffusion-sensitive nuclear magnetic resonance (NMR) techniques in the biomedical sciences. Molecular diffusion is a fundamental physical phenomenon present in all biological systems. Because experimentally measured diffusion metrics reflect the microscopic environment sensed by the diffusing molecules, diffusion measurements can be used for characterisation of molecular size, molecular binding and association, and the morphology of biological tissues. The emergence of magnetic resonance was instrumental in the development of biomedical applications of diffusion. We discuss the fundamental physical principles of diffusion NMR spectroscopy and diffusion MR imaging. The emphasis is placed on conceptual understanding, historical evolution, and practical applications rather than complex technical details. A mathematical description of diffusion is presented to the extent required for a basic understanding of the concepts. We present a wide range of spectroscopic and imaging applications of diffusion magnetic resonance, including colloidal drug delivery vehicles; protein association; characterisation of cell morphology; neural fibre tractography; cardiac imaging; and the imaging of load-bearing connective tissues. This paper is intended as an accessible introduction to the exciting and growing field of diffusion magnetic resonance.
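The workhorse signal model behind most of these applications is the Stejskal-Tanner mono-exponential decay, S(b) = S0 * exp(-b * D); a toy two-point estimate of the apparent diffusion coefficient (ADC) is sketched below (the b-values and ADC are illustrative, typical-order numbers):

```python
import numpy as np

# Stejskal-Tanner decay: S(b) = S0 * exp(-b * D), where the b-value
# encodes diffusion-gradient strength and timing, and D is the apparent
# diffusion coefficient (ADC) of the tissue or solution.
b = np.array([0., 1000.])      # b-values (s/mm^2)
D_true = 0.8e-3                # toy ADC (mm^2/s), typical tissue order
S = np.exp(-b * D_true)        # normalized synthetic signals

# Two-point ADC estimate from the log-ratio of the signals
adc = -np.log(S[1] / S[0]) / (b[1] - b[0])
print(f"estimated ADC: {adc:.2e} mm^2/s")   # recovers D_true
```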
Abstract:
Computer-aided technologies, medical imaging, and rapid prototyping have created new possibilities in biomedical engineering. The systematic variation of scaffold architecture, as well as the mineralization inside a scaffold/bone construct, can be studied using computer imaging technology, CAD/CAM, and micro-computed tomography (micro-CT). In this paper, the potential of combining these technologies is exploited in the study of scaffolds and osteochondral repair. Porosity, surface area per unit volume, and the degree of interconnectivity were evaluated through imaging and computer-aided manipulation of the scaffold scan data. For the osteochondral model, the spatial distribution and the degree of bone regeneration were evaluated. The study also assessed the versatility of two software packages: Mimics (Materialise) and CTan with 3D realistic visualization (Skyscan).
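To make the metrics concrete, the sketch below computes porosity and a crude surface-area-per-unit-volume estimate from a binarized voxel volume (a toy random volume; this mimics the kind of numbers micro-CT analysis software reports but is not any package's actual implementation):

```python
import numpy as np

# Toy scaffold metrics from a binarized micro-CT style volume:
# porosity = void fraction; surface area is approximated by counting
# solid/void voxel-face transitions along the three axes.
rng = np.random.default_rng(0)
vol = (rng.random((64, 64, 64)) > 0.7).astype(np.uint8)  # 1 = solid voxel

porosity = 1.0 - vol.mean()            # fraction of void voxels

faces = 0
for ax in range(3):                    # count solid/void interfaces per axis
    a = np.swapaxes(vol, 0, ax)
    faces += np.count_nonzero(a[1:] != a[:-1])

voxel = 0.01                           # toy voxel edge length (mm)
surface_per_volume = faces * voxel**2 / (vol.size * voxel**3)
print(f"porosity: {porosity:.2f}, S/V: {surface_per_volume:.1f} mm^-1")
```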
Abstract:
The current gold standard for the design of orthopaedic implants is the use of 3D models of long bones obtained using computed tomography (CT). However, high-resolution CT imaging involves high radiation exposure, which limits its use in healthy human volunteers. Magnetic resonance imaging (MRI) is an attractive alternative for the scanning of healthy human volunteers for research purposes. Current limitations of MRI include difficulties of tissue segmentation within joints and long scanning times. In this work, we explore the possibility of overcoming these limitations through the use of MRI scanners operating at a higher field strength. We quantitatively compare the quality of anatomical MR images of long bones obtained at 1.5 T and 3 T and optimise the scanning protocol of 3 T MRI. FLASH images of the right leg of five human volunteers acquired at 1.5 T and 3 T were compared in terms of signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR). The comparison showed a relatively high CNR and SNR at 3 T for most regions of the femur and tibia, with the exception of the distal diaphyseal region of the femur and the mid-diaphyseal region of the tibia. This was accompanied by an ~65% increase in the longitudinal spin relaxation time (T1) of the muscle at 3 T compared to 1.5 T. The results suggest that MRI at 3 T may be able to enhance the segmentability and potentially improve the accuracy of 3D anatomical models of long bones, compared to 1.5 T. We discuss how the total imaging times at 3 T can be kept short while maximising the CNR and SNR of the images obtained.
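The SNR and CNR figures of merit used in such comparisons are simple region-of-interest (ROI) statistics; a minimal sketch with synthetic intensities follows (the paper's exact ROI placement and any noise-correction protocol are not reproduced):

```python
import numpy as np

# Common image-quality definitions: SNR = mean(tissue ROI) / sd(noise ROI),
# CNR = |mean(tissue A) - mean(tissue B)| / sd(noise ROI).
def snr(roi, noise_roi):
    return roi.mean() / noise_roi.std(ddof=1)

def cnr(roi_a, roi_b, noise_roi):
    return abs(roi_a.mean() - roi_b.mean()) / noise_roi.std(ddof=1)

rng = np.random.default_rng(0)
bone   = rng.normal(120, 5, size=500)   # toy ROI intensity samples
muscle = rng.normal(80, 5, size=500)
air    = rng.normal(0, 4, size=500)     # background (noise) ROI

print(f"SNR(bone) = {snr(bone, air):.1f}, "
      f"CNR(bone, muscle) = {cnr(bone, muscle, air):.1f}")
```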
Abstract:
Purpose: The cytomegalovirus (CMV) promoter is one of the most commonly used promoters for the expression of transgenes in mammalian cells. The aim of our study was to evaluate the role of methylation in CMV promoter activity and the promoter's upregulation by irradiation and the chemotherapeutic agent cisplatin, using non-invasive in vivo fluorescence imaging. Procedures: Murine fibrosarcoma LPB and mammary carcinoma TS/A cells were stably transfected with plasmids encoding CMV and p21 promoter-driven green fluorescent protein (GFP) genes. Solid TS/A tumors were induced by subcutaneous injection of fluorescent tumor cells, while leg muscles were transiently transfected with a plasmid encoding GFP under the control of the CMV promoter. Cells, tumors, and legs were treated with the DNA methylation inhibitor 5-azacytidine, irradiation, or cisplatin. GFP expression was determined using a fluorescence microplate reader in vitro and by non-invasive fluorescence imaging in vivo. Results: Treatment of cells, tumors, and legs with 5-azacytidine (re)activated the CMV promoter. Furthermore, treatment with irradiation or cisplatin resulted in significant upregulation of GFP expression both in vitro and in vivo. Conclusions: The observed alterations in CMV promoter activity limit the usefulness of this widely used promoter as a constitutive promoter. On the other hand, the inducibility of the CMV promoter can be used to advantage in gene therapy when combined with standard cancer treatments, such as radiotherapy and chemotherapy.
Abstract:
Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to detect effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining the data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N = 2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two: combining some cohorts by mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
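A minimal sketch of the meta-analysis side of this comparison, using fixed-effects inverse-variance weighting (the per-site numbers below are made up, not ENIGMA-DTI results; a mega-analysis would instead pool the raw FA data across sites):

```python
import numpy as np

# Fixed-effects, inverse-variance-weighted meta-analysis: each site i
# contributes an estimate e_i with standard error se_i, and the pooled
# estimate weights each site by 1 / se_i^2.
def meta_fixed(estimates, ses):
    e = np.asarray(estimates)
    w = 1.0 / np.asarray(ses) ** 2
    pooled = np.sum(w * e) / np.sum(w)
    return pooled, np.sqrt(1.0 / np.sum(w))   # pooled estimate and its SE

h2_sites = [0.55, 0.62, 0.48, 0.70, 0.58]     # toy per-site heritabilities
se_sites = [0.08, 0.10, 0.12, 0.09, 0.07]     # toy per-site standard errors
pooled, se = meta_fixed(h2_sites, se_sites)
print(f"pooled h2 = {pooled:.2f} +/- {se:.2f}")
```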