905 results for 3D interpolation
Abstract:
OBJECTIVES: To determine the accuracy of automated vessel-segmentation software for vessel-diameter measurements based on three-dimensional contrast-enhanced magnetic resonance angiography (3D-MRA). METHOD: In 10 patients with high-grade carotid stenosis, automated measurements of both carotid arteries were obtained with 3D-MRA by two independent investigators and compared with manual measurements obtained by digital subtraction angiography (DSA) and 2D maximum-intensity projection (2D-MIP) based on MRA and duplex ultrasonography (US). In 42 patients undergoing carotid endarterectomy (CEA), intraoperative measurements (IOP) were compared with postoperative 3D-MRA and US. RESULTS: Mean interoperator variability was 8% for measurements by DSA and 11% by 2D-MIP, but there was no interoperator variability with the automated 3D-MRA analysis. Good correlations were found between DSA (standard of reference), manual 2D-MIP (rP=0.6) and automated 3D-MRA (rP=0.8). Excellent correlations were found between IOP, 3D-MRA (rP=0.93) and US (rP=0.83). CONCLUSION: Automated 3D-MRA-based vessel segmentation and quantification result in accurate measurements of extracerebral-vessel dimensions.
Abstract:
Though 3D computer graphics has seen tremendous advancement in the past two decades, most available mechanisms for computer interaction in 3D are high cost and targeted for industry and virtual reality applications. Recent advances in Micro-Electro-Mechanical-System (MEMS) devices have brought forth a variety of new low-cost, low-power, miniature sensors with high accuracy, which are well suited for hand-held devices. In this work a novel design for a 3D computer game controller using inertial sensors is proposed, and a prototype device based on this design is implemented. The design incorporates MEMS accelerometers and gyroscopes from Analog Devices to measure the three components of the acceleration and angular velocity. From these sensor readings, the position and orientation of the hand-held compartment can be calculated using numerical methods. The implemented prototype utilizes a USB 2.0 compliant interface for power and communication with the host system. A Microchip dsPIC microcontroller is used in the design. This microcontroller integrates the analog-to-digital converters, the flash program memory, and the core processor on a single integrated circuit. A PC running the Microsoft Windows operating system is used as the host machine. Prototype firmware for the microcontroller is developed and tested to establish communication between the design and the host, and to perform the data acquisition and initial filtering of the sensor data. A PC front-end application with a graphical interface is developed to communicate with the device and allow real-time visualization of the acquired data.
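As an illustration of the numerical methods the abstract alludes to (integrating measured acceleration and angular rate to obtain position and orientation), here is a minimal sketch using first-order Euler integration. It is an assumption-laden toy: it ignores gravity compensation, frame rotation of the acceleration vector, and the sensor drift a real controller must correct.

```python
import numpy as np

def integrate_imu(accel, gyro, dt):
    """Dead-reckon position and orientation from raw IMU samples.

    accel : (N, 3) accelerations in the sensor frame (m/s^2)
    gyro  : (N, 3) angular velocities (rad/s)
    dt    : sample period (s)
    Returns (positions, angles), each of shape (N, 3).
    Simple Euler integration; no drift correction or gravity removal.
    """
    n = len(accel)
    vel = np.zeros(3)
    pos = np.zeros((n, 3))
    ang = np.zeros((n, 3))                # small-angle Euler integration
    for i in range(1, n):
        ang[i] = ang[i - 1] + gyro[i] * dt   # orientation from angular rate
        vel = vel + accel[i] * dt            # velocity from acceleration
        pos[i] = pos[i - 1] + vel * dt       # position from velocity
    return pos, ang
```

Because the double integration accumulates error quadratically, practical designs periodically re-zero velocity or fuse in other sensor cues.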
Abstract:
The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding-algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding-algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding-algorithms using linear and cubic interpolation functions. 
It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of x-ray CT-images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
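The core idea above, interpolating the *integrated* data and then differencing it on the new grid, can be sketched briefly. The paper's own parametrized Hermitian curve is not specified in the abstract, so this sketch substitutes SciPy's shape-preserving PCHIP Hermite spline: it conserves the overall integral and avoids negative values for positive data, but fixes the tangents automatically rather than exposing the paper's tuning parameter.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin_conservative(edges_src, values, edges_dst):
    """Re-bin histogrammed data while conserving the integral.

    Build the cumulative integral of the source histogram, interpolate it
    with a monotone Hermite (PCHIP) spline, and difference the spline on
    the destination bin edges. Monotonicity of the cumulative curve keeps
    the output non-negative for non-negative input.
    """
    widths = np.diff(edges_src)
    cum = np.concatenate(([0.0], np.cumsum(values * widths)))  # integrated data
    F = PchipInterpolator(edges_src, cum)
    new_integrals = np.diff(F(edges_dst))     # integral per destination bin
    return new_integrals / np.diff(edges_dst)  # back to bin densities
```

When the destination edges span the same range as the source, the total integral is reproduced exactly, which plain value-wise linear or cubic interpolation does not guarantee.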
Abstract:
The delivery of oxygen, nutrients, and the removal of waste are essential for cellular survival. Culture systems for 3D bone tissue engineering have addressed this issue by utilizing perfusion flow bioreactors that stimulate osteogenic activity through the delivery of oxygen and nutrients by low-shear fluid flow. It is also well established that bone responds to mechanical stimulation, but may desensitize under continuous loading. While perfusion flow and mechanical stimulation are used to increase cellular survival in vitro, 3D tissue-engineered constructs face additional limitations upon in vivo implantation. As it requires significant amounts of time for vascular infiltration by the host, implants are subject to an increased risk of necrosis. One solution is to introduce tissue-engineered bone that has been pre-vascularized through the co-culture of osteoblasts and endothelial cells on 3D constructs. It is unclear from previous studies: 1) how 3D bone tissue constructs will respond to partitioned mechanical stimulation, 2) how gene expression compares in 2D and in 3D, 3) how co-cultures will affect osteoblast activity, and 4) how perfusion flow will affect co-cultures of osteoblasts and endothelial cells. We have used an integrated approach to address these questions by utilizing mechanical stimulation, perfusion flow, and a co-culture technique to increase the success of 3D bone tissue engineering. We measured gene expression of several osteogenic and angiogenic genes in both 2D and 3D (static culture and mechanical stimulation), as well as in 3D cultures subjected to perfusion flow, mechanical stimulation and partitioned mechanical stimulation. Finally, we co-cultured osteoblasts and endothelial cells on 3D scaffolds and subjected them to long-term incubation in either static culture or under perfusion flow to determine changes in gene expression as well as histological measures of osteogenic and angiogenic activity. 
We discovered that 2D and 3D osteoblast cultures react differently to shear stress, and that partitioning mechanical stimulation does not affect gene expression in our model. Furthermore, our results suggest that perfusion flow may rescue 3D tissue-engineered constructs from hypoxic-like conditions by reducing hypoxia-specific gene expression and increasing histological indices of both osteogenic and angiogenic activity. Future research to elucidate the mechanisms behind these results may contribute to a more mature bone-like structure that integrates more quickly into host tissue, increasing the potential of bone tissue engineering.
Abstract:
The alveolated structure of the pulmonary acinus plays a vital role in gas exchange function. Three-dimensional (3D) analysis of the parenchymal region is fundamental to understanding this structure-function relationship, but only a limited number of attempts have been made in the past because of technical limitations. In this study, we developed a new image processing methodology based on finite element (FE) analysis for accurate 3D structural reconstruction of the gas exchange regions of the lung. Stereologically well characterized rat lung samples (Pediatr Res 53: 72-80, 2003) were imaged using high-resolution synchrotron radiation-based X-ray tomographic microscopy. A stack of 1,024 images (each slice: 1024 x 1024 pixels) with a resolution of 1.4 μm³ per voxel was generated. For the development of the FE algorithm, regions of interest (ROI), each containing approximately 7.5 million voxels, were further extracted as working subunits. 3D FEs were created by overlaying the voxel map using a grid-based hexahedral algorithm. A proper threshold value for appropriate segmentation was iteratively determined to match the calculated volume density of tissue to the stereologically determined value (Pediatr Res 53: 72-80, 2003). The resulting 3D FEs are ready to be used for 3D structural analysis as well as for subsequent FE computational analyses such as fluid dynamics and skeletonization.
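The iterative threshold determination described above can be sketched as a bisection on the grey-value threshold. This is only an illustration of the idea, not the study's implementation; `target_fraction` stands in for the stereologically determined tissue volume density, and the function name is hypothetical.

```python
import numpy as np

def threshold_for_volume_fraction(image, target_fraction, tol=1e-3):
    """Bisect the grey-value threshold until the segmented tissue
    fraction matches a stereologically determined target fraction."""
    lo, hi = float(image.min()), float(image.max())
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        frac = np.mean(image >= mid)    # fraction of voxels classified as tissue
        if frac > target_fraction:      # too much tissue segmented: raise threshold
            lo = mid
        else:                           # too little: lower threshold
            hi = mid
    return 0.5 * (lo + hi)
```

Because the segmented fraction decreases monotonically as the threshold rises, the bisection converges to the threshold whose tissue volume density best matches the reference value.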
Abstract:
Stereological tools are the gold standard for accurate (i.e., unbiased) and precise quantification of any microscopic sample. The past decades have provided a broad spectrum of tools to estimate a variety of parameters such as volumes, surfaces, lengths, and numbers. Some of them require pairs of parallel sections that can be produced by either physical or optical sectioning, with optical sectioning being much more efficient when applicable. Unfortunately, transmission electron microscopy could not fully profit from these riches, mainly because of its large depth of field. Hence, optical sectioning was a long-standing desire for electron microscopists. This desire was fulfilled by the development of electron tomography, which yields stacks of slices from electron microscopic sections. Now, parallel optical slices of a previously unimagined small thickness (2-5 nm axial resolution) can be produced. These optical slices minimize problems related to overprojection effects, and allow for direct stereological analysis, e.g., volume estimation with the Cavalieri principle and number estimation with the optical disector method. Here, we demonstrate that the symbiosis of stereology and electron tomography is an easy and efficient way for quantitative analysis at the electron microscopic level. We call this approach quantitative 3D electron microscopy.
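The Cavalieri principle mentioned above reduces to a one-line estimator: the volume is the slice spacing, times the area associated with each test point, times the total number of test points hitting the structure across all parallel slices. A minimal sketch (names are illustrative):

```python
def cavalieri_volume(point_counts, slice_spacing, area_per_point):
    """Cavalieri volume estimator: V = T * a(p) * sum(P_i),
    where T is the spacing between parallel slices, a(p) the area
    represented by one test point, and P_i the number of test points
    hitting the structure on slice i."""
    return slice_spacing * area_per_point * sum(point_counts)
```

For example, counts of 10, 12, and 8 points on three slices 2.0 units apart, with 0.5 area units per point, estimate a volume of 30.0 cubic units. The estimator is unbiased provided the first slice position is uniformly random within the spacing.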
Abstract:
The bridge inspection industry has yet to utilize a rapidly growing technology that shows promise to help improve the inspection process. This thesis investigates what 3D photogrammetry can provide to the bridge inspector for a number of deterioration mechanisms. The technology can provide information about the surface condition of some bridge components, primarily focusing on the surface defects of a concrete bridge, which include cracking, spalling, and scaling. Testing was completed using a Canon EOS 7D camera; the photos were then processed in AgiSoft PhotoScan to align them and develop models. Further processing of the models was done using ArcMap in the ArcGIS 10 program to view the digital elevation models of the concrete surface. Several experiments were completed to determine the ability of the technique to detect the different defects. The smallest crack resolved in this study was 1/8 inch wide, photographed from a distance of two feet above the surface. 3D photogrammetry was able to detect a depression 1 inch wide and 3/16 inch deep, which would be sufficient to measure any scaling or spalling an inspector would be required to quantify. The percentage of the surface scaled or spalled could also be calculated from the digital elevation models in ArcMap. Different camera factors, including distance from the defects, number of photos, and angle, were also investigated to see how each affected the capabilities. 3D photogrammetry showed great promise in the detection of scaling or spalling of the concrete bridge surface.
Abstract:
Purpose: Development of an interpolation algorithm for re‐sampling spatially distributed CT‐data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re‐sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT‐data has to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all of the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re‐distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over‐ or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re‐gridding a series of X‐ray CT‐images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re‐sampling algorithms using high-order polynomial interpolation functions may result in significant artifacts in the re‐sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
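The abstract does not give the exact parametrization of its Hermitian curve, so the following is only a hypothetical illustration of how a single parameter can trade smoothness against over/undershoot in cubic Hermite interpolation: a tension factor that scales central-difference nodal tangents. The function name and parameter are assumptions, not the authors' method.

```python
import numpy as np

def hermite_eval(x, y, xq, tension=1.0):
    """Piecewise cubic Hermite interpolation with one control parameter.

    tension=1 uses central-difference tangents (smooth, but can over-
    or undershoot between nodes); tension=0 zeroes the tangents, so each
    segment stays bounded by its endpoint values (no overshoot).
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    m = np.gradient(y, x) * tension              # scaled nodal tangents
    xq = np.atleast_1d(np.asarray(xq, float))
    out = np.empty_like(xq)
    for k, xv in enumerate(xq):
        i = int(np.clip(np.searchsorted(x, xv) - 1, 0, len(x) - 2))
        h = x[i + 1] - x[i]
        t = (xv - x[i]) / h
        h00 = 2 * t**3 - 3 * t**2 + 1            # Hermite basis polynomials
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        out[k] = (h00 * y[i] + h10 * h * m[i]
                  + h01 * y[i + 1] + h11 * h * m[i + 1])
    return out
```

The interpolant always passes through the nodes; the tension only changes its behavior between them, which is exactly the kind of user-controlled overshoot behavior the abstract describes.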