949 results for Computerized tomography
Abstract:
A computed tomography number to relative electron density (CT-RED) calibration is performed when commissioning a radiotherapy CT scanner by imaging a calibration phantom with inserts of specified RED and recording the CT number displayed. In this work, CT-RED calibrations were generated using several commercially available phantoms to observe the effect of phantom geometry on conversion to electron density and, ultimately, the dose calculation in a treatment planning system. Using an anthropomorphic phantom as a gold standard, the CT number of a material was found to depend strongly on the amount and type of scattering material surrounding the volume of interest, with the largest variation observed for the highest-density material tested, cortical bone. Cortical bone gave a maximum CT number difference of 1,110 when a cylindrical insert of diameter 28 mm scanned free in air was compared to that in the form of a 30 × 30 cm² slab. The effect of using each CT-RED calibration on planned dose to a patient was quantified using a commercially available treatment planning system. When all calibrations were compared to the anthropomorphic calibration, the largest percentage dose difference was 4.2%, which occurred when the CT-RED calibration curve was acquired with heterogeneity inserts removed from the phantom and scanned free in air. The maximum dose difference observed between two dedicated CT-RED phantoms was ±2.1%. A phantom that is to be used for CT-RED calibrations must have sufficient water-equivalent scattering material surrounding the heterogeneous objects that are to be used for calibration.
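In practice, the calibration described above becomes a piecewise-linear lookup from CT number to relative electron density. A minimal sketch, with invented calibration points for demonstration (not values from this study):

```python
import numpy as np

# Illustrative calibration points (CT number in HU -> relative electron
# density); these values are made up for demonstration, not from the study.
ct_numbers = np.array([-1000.0, -100.0, 0.0, 300.0, 1500.0])
rel_e_density = np.array([0.001, 0.93, 1.0, 1.15, 1.70])

def ct_to_red(hu):
    """Piecewise-linear CT-number -> relative electron density lookup."""
    return np.interp(hu, ct_numbers, rel_e_density)

# Water-equivalent tissue (0 HU) maps to RED = 1.0.
print(ct_to_red(0.0))
```

A treatment planning system applies a curve of this kind voxel by voxel, which is why shifts in the measured CT numbers of the inserts propagate directly into the dose calculation.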
Abstract:
A method for reconstruction of an object f(x), x = (x, y, z), from a limited set of cone-beam projection data has been developed. This method uses a modified form of convolution back-projection and projection onto convex sets (POCS) for handling the limited (or incomplete) data problem. In cone-beam tomography, one needs to have a complete geometry to completely reconstruct the original three-dimensional object. While complete geometries do exist, they are of little use in practical implementations. The most common trajectory used in practical scanners is circular, which is incomplete. It is, however, possible to recover some of the information of the original signal f(x) based on a priori knowledge of the nature of f(x). If this knowledge can be posed in a convex set framework, then POCS can be utilized. In this report, we utilize this a priori knowledge as convex set constraints to reconstruct f(x) using POCS. While we demonstrate the effectiveness of our algorithm for circular trajectories, it is essentially geometry independent and will be useful in any limited-view cone-beam reconstruction.
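The POCS idea can be sketched on a toy underdetermined problem: alternately project the estimate onto a data-consistency set and an a priori constraint set (here, nonnegativity). The operator, data, and constraint choices below are illustrative assumptions, not the paper's cone-beam geometry:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "limited data" problem: fewer measurements than unknowns.
A = rng.standard_normal((3, 6))          # measurement operator
x_true = np.array([0.0, 1.0, 0.0, 2.0, 0.0, 0.5])
b = A @ x_true                           # observed (incomplete) data
A_pinv = np.linalg.pinv(A)

def project_data(x):
    """Project onto the affine set {x : A x = b} (data consistency)."""
    return x - A_pinv @ (A @ x - b)

def project_nonneg(x):
    """Project onto the convex set {x : x >= 0} (a priori knowledge)."""
    return np.maximum(x, 0.0)

x = np.zeros(6)
for _ in range(500):                     # alternating projections (POCS)
    x = project_data(project_nonneg(x))
```

Because both sets are convex and their intersection is nonempty, the iterates converge to a point that is consistent with the measured data and satisfies the prior constraint.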
Abstract:
The paper presents for the first time a fully computerized method for structural synthesis of geared kinematic chains which can be used to derive epicyclic gear drives. The method has been formulated on the basis of representing these chains by their graphs, the graphs being in turn represented algebraically by their vertex-vertex incidence matrices. It has thus been possible to make advantageous use of concepts and results from graph theory to develop a method amenable for implementation on a digital computer. The computerized method has been applied to the structural synthesis of single-freedom geared kinematic chains with up to four gear pairs, and the results obtained therefrom are presented and discussed.
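The graph representation can be illustrated with a small sketch: graphs stored as vertex-vertex adjacency matrices, with a brute-force permutation test for structural equivalence (isomorphism), which is the kind of check structural synthesis needs to eliminate duplicate chains. The example graphs are invented, not chains from the paper:

```python
import numpy as np
from itertools import permutations

# Vertex-vertex adjacency matrices of two illustrative 4-vertex graphs
# (vertices = links, edges = joints); not taken from the paper's chains.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])

B = np.array([[0, 1, 1, 1],              # A with its vertices relabelled
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 0]])

def isomorphic(A, B):
    """Brute-force graph isomorphism via vertex permutations (small n only)."""
    n = len(A)
    for p in permutations(range(n)):
        P = np.eye(n, dtype=int)[list(p)]
        if np.array_equal(P @ A @ P.T, B):
            return True
    return False
```

Exhaustive permutation testing is only feasible for small graphs; practical synthesis codes rely on graph invariants to prune the comparison.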
Abstract:
Study Design Retrospective review of prospectively collected data. Objectives To analyze intervertebral (IV) fusion after thoracoscopic anterior spinal fusion (TASF) and explore the relationship between fusion scores and key clinical variables. Summary of Background Information TASF provides comparable correction with some advantages over posterior approaches but reported mechanical complications, and their relationship to non-union and graft material is unclear. Similarly, the optimal combination of graft type and implant stiffness for effecting successful radiologic union remains undetermined. Methods A subset of patients from a large single-center series who had TASF for progressive scoliosis underwent low-dose computed tomographic scans 2 years after surgery. The IV fusion mass in the disc space was assessed using the 4-point Sucato scale, where 1 indicates <50% and 4 indicates 100% bony fusion of the disc space. The effects of rod diameter, rod material, graft type, fusion level, and mechanical complications on fusion scores were assessed. Results Forty-three patients with right thoracic major curves (mean age 14.9 years) participated in the study. Mean fusion scores for patient subgroups ranged from 1.0 (IV levels with rod fractures) to 2.2 (4.5-mm rod with allograft), with scores tending to decrease with increasing rod size and stiffness. Graft type (autograft vs. allograft) did not affect fusion scores. Fusion scores were highest in the middle levels of the rod construct (mean 2.52), dropping off by 20% to 30% toward the upper and lower extremities of the rod. IV levels where a rod fractured had lower overall mean fusion scores compared to levels without a fracture. Mean total Scoliosis Research Society (SRS) questionnaire scores were 98.9 from a possible total of 120, indicating a good level of patient satisfaction. 
Conclusions Results suggest that 100% radiologic fusion of the entire disc space is not necessary for successful clinical outcomes following thoracoscopic anterior selective thoracic fusion.
Abstract:
Lateral or transaxial truncation of cone-beam data can occur either due to the field-of-view limitation of the scanning apparatus or in region-of-interest tomography. In this paper, we suggest two new methods to handle lateral truncation in helical scan CT. It is seen that reconstruction with laterally truncated projection data, assuming it to be complete, gives severe artifacts which even penetrate into the field of view. A row-by-row data completion approach using linear prediction is introduced for helical scan truncated data. An extension of this technique, known as the windowed linear prediction approach, is also introduced. The efficacy of the two techniques is shown using simulations with standard phantoms. A quantitative image quality measure of the resulting reconstructed images is used to evaluate the performance of the proposed methods against an extension of a standard existing technique.
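The row-by-row completion idea can be sketched as follows: fit autoregressive (linear prediction) coefficients to the measured part of a detector row by least squares, then extrapolate past the truncation edge. This is a minimal illustration under assumed signal behavior, not the paper's windowed variant:

```python
import numpy as np

def extrapolate_lp(row, n_missing, order=4):
    """Complete a truncated 1-D projection row by linear prediction:
    fit AR coefficients to the known samples by least squares, then
    predict samples past the truncation edge."""
    # Least-squares system: row[k] ~ sum_i a[i] * row[k-1-i]
    X = np.array([row[k - order:k][::-1] for k in range(order, len(row))])
    y = row[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    out = list(row)
    for _ in range(n_missing):
        out.append(float(np.dot(a, out[-order:][::-1])))
    return np.array(out)

# Smoothly decaying edge profile: prediction recovers the truncated tail.
full = 0.9 ** np.arange(20)
ext = extrapolate_lp(full[:16], 4)
```

For smoothly varying projection edges the predicted samples continue the measured trend, which suppresses the sharp cutoff that otherwise produces truncation artifacts inside the field of view.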
Abstract:
In an effort to develop a fully computerized approach for structural synthesis of kinematic chains the steps involved in the method of structural synthesis based on transformation of binary chains [38] have been recast in a format suitable for implementation on a digital computer. The methodology thus evolved has been combined with the algebraic procedures for structural analysis [44] to develop a unified computer program for structural synthesis and analysis of simple jointed kinematic chains with a degree of freedom 0. Applications of this program are presented in the succeeding parts of the paper.
Abstract:
The reliability of the computer program for structural synthesis and analysis of simple-jointed kinematic chains developed in Part 1 has been established by applying it to several cases for which solutions are either fully or partially available in the literature, such as 7-link, zero-freedom chains; 8- and 10-link, single-freedom chains; 12-link, single-freedom binary chains; and 9-link, two-freedom chains. In the process, some discrepancies in the results reported in previous literature have been brought to light.
Abstract:
The unified computer program for structural synthesis and analysis developed in Part 1 has been employed to derive the new and complete collection of 97 10-link, three-freedom simple-jointed kinematic chains. The program shows that of these chains, 3 have total freedom, 70 have partial freedom and the remaining 24 have fractionated freedom and that the 97 chains yield a total of 676 distinct mechanisms.
Abstract:
[Excerpt] The effects of framing on decisions has been widely studied, producing research that suggests individuals respond to framing in predictable and fairly consistent ways (Bazerman, 1984, 1990; Tversky & Kahneman, 1986; Thaler, 1980). The essential finding from this body of research is that "individuals treat risks concerning perceived gains (for example, saving jobs and plants) differently from risks concerning perceived losses (losing jobs and plants)" (Bazerman, 1990, pp. 49-50). Specifically, individuals tend to avoid risks concerning gains, and seek risks concerning losses.
Abstract:
In dentistry, basic imaging techniques such as intraoral and panoramic radiography are in most cases the only imaging techniques required for the detection of pathology. Conventional intraoral radiographs provide images with sufficient information for most dental radiographic needs. Panoramic radiography produces a single image of both jaws, giving an excellent overview of oral hard tissues. Regardless of the technique, plain radiography has only a limited capability in the evaluation of three-dimensional (3D) relationships. Technological advances in radiological imaging have moved from two-dimensional (2D) projection radiography towards digital, 3D and interactive imaging applications. This has been achieved first by the use of conventional computed tomography (CT) and more recently by cone beam CT (CBCT). CBCT is a radiographic imaging method that allows accurate 3D imaging of hard tissues. CBCT has been used for dental and maxillofacial imaging for more than ten years and its availability and use are increasing continuously. However, at present, only best practice guidelines are available for its use, and the need for evidence-based guidelines on the use of CBCT in dentistry is widely recognized. We evaluated (i) retrospectively the use of CBCT in a dental practice, (ii) the accuracy and reproducibility of pre-implant linear measurements in CBCT and multislice CT (MSCT) in a cadaver study, (iii) prospectively the clinical reliability of CBCT as a preoperative imaging method for complicated impacted lower third molars, and (iv) the tissue and effective radiation doses and image quality of dental CBCT scanners in comparison with MSCT scanners in a phantom study. Using CBCT, subjective identification of anatomy and pathology relevant in dental practice can be readily achieved, but dental restorations may cause disturbing artefacts. CBCT examination offered additional radiographic information when compared with intraoral and panoramic radiographs. 
In terms of the accuracy and reliability of linear measurements in the posterior mandible, CBCT is comparable to MSCT. CBCT is a reliable means of determining the location of the inferior alveolar canal and its relationship to the roots of the lower third molar. CBCT scanners provided adequate image quality for dental and maxillofacial imaging while delivering considerably smaller effective doses to the patient than MSCT. The observed variations in patient dose and image quality emphasize the importance of optimizing the imaging parameters in both CBCT and MSCT.
Abstract:
Thickness measurements derived from optical coherence tomography (OCT) images of the eye are a fundamental clinical and research metric, since they provide valuable information regarding the eye’s anatomical and physiological characteristics, and can assist in the diagnosis and monitoring of numerous ocular conditions. Despite the importance of these measurements, limited attention has been given to the methods used to estimate thickness in OCT images of the eye. Most current studies employing OCT use an axial thickness metric, but there is evidence that axial thickness measures may be biased by tilt and curvature of the image. In this paper, standard axial thickness calculations are compared with a variety of alternative metrics for estimating tissue thickness. These methods were tested on a data set of wide-field chorio-retinal OCT scans (field of view (FOV) 60° × 25°) to examine their performance across a wide region of interest and to demonstrate the potential effect of curvature of the posterior segment of the eye on the thickness estimates. Similarly, the effect of image tilt was systematically examined with the same range of proposed metrics. The results demonstrate that image tilt and curvature of the posterior segment can affect axial tissue thickness calculations, while alternative metrics, which are not biased by these effects, should be considered. This study demonstrates the need to consider alternative methods to calculate tissue thickness in order to avoid measurement error due to image tilt and curvature.
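The tilt bias can be illustrated with a simple synthetic example: for a layer tilted by angle θ, the axial (vertical) thickness overestimates the true thickness along the surface normal by a factor of 1/cos θ. The boundary model below is an assumption for illustration, not the paper's data:

```python
import numpy as np

# Two parallel boundaries tilted by `tilt`, separated by a true
# (perpendicular) thickness t_true; values are illustrative.
tilt = np.radians(20.0)
t_true = 300.0                           # microns
x = np.linspace(0.0, 1000.0, 50)         # lateral A-scan positions
top = np.tan(tilt) * x                   # upper boundary depth
bottom = top + t_true / np.cos(tilt)     # lower boundary depth

axial = bottom - top                     # standard axial metric, per A-scan
normal = axial * np.cos(tilt)            # thickness along the surface normal
```

Here the axial metric reads about 319 µm for a 300 µm layer at 20° of tilt, while measuring along the surface normal recovers the true value.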
Abstract:
To develop and compare a set of metrics for calculating tissue thickness in wide-field OCT data.
Abstract:
Purpose: A computationally efficient algorithm (linear iterative type) based on singular value decomposition (SVD) of the Jacobian has been developed that can be used in rapid dynamic near-infrared (NIR) diffuse optical tomography. Methods: Numerical and experimental studies have been conducted to prove the computational efficacy of this SVD-based algorithm over conventional optical image reconstruction algorithms. Results: These studies indicate that the performance of linear iterative algorithms in terms of contrast recovery (quantitation of optical images) is better compared to nonlinear iterative (conventional) algorithms, provided the initial guess is close to the actual solution. The nonlinear algorithms can provide better quality images compared to the linear iterative type algorithms. Moreover, the analytical and numerical equivalence of the SVD-based algorithm to linear iterative algorithms was also established as part of this work. It is also demonstrated that the SVD-based image reconstruction typically requires O(NN²) operations per iteration, as contrasted with linear and nonlinear iterative methods that, respectively, require O(NN³) and O(NN⁶) operations, with "NN" being the number of unknown parameters in the optical image reconstruction procedure. Conclusions: This SVD-based computationally efficient algorithm can make the integration of the image reconstruction procedure with the data acquisition feasible, in turn making rapid dynamic NIR tomography viable in the clinic to continuously monitor hemodynamic changes in the tissue pathophysiology.
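The core computational saving can be sketched as follows: the SVD of the Jacobian is computed once, after which each reconstruction reduces to matrix-vector products. A toy linear example with illustrative dimensions, not an actual NIR Jacobian:

```python
import numpy as np

rng = np.random.default_rng(1)
J = rng.standard_normal((40, 10))        # toy Jacobian: measurements x unknowns
x_true = rng.standard_normal(10)
b = J @ x_true                           # noiseless data for illustration

# One-time SVD; each subsequent frame needs only matrix-vector products.
U, s, Vt = np.linalg.svd(J, full_matrices=False)

def svd_solve(b, tol=1e-10):
    """Truncated-SVD pseudo-inverse solution of J x = b."""
    keep = s > tol * s[0]                # drop near-zero singular values
    return Vt.T[:, keep] @ ((U.T[keep] @ b) / s[keep])

x_rec = svd_solve(b)
```

Amortizing the factorization over many frames is what makes per-frame reconstruction cheap enough for dynamic imaging; truncating small singular values also provides a simple form of regularization.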
Abstract:
We propose a self-regularized pseudo-time marching scheme to solve the ill-posed, nonlinear inverse problem associated with diffuse propagation of coherent light in a tissue-like object. In particular, in the context of diffuse correlation tomography (DCT), we consider the recovery of mechanical property distributions from partial and noisy boundary measurements of light intensity autocorrelation. We prove the existence of a minimizer for the Newton algorithm after establishing the existence of weak solutions for the forward equation of light amplitude autocorrelation and its Frechet derivative and adjoint. The asymptotic stability of the solution of the ordinary differential equation obtained through the introduction of the pseudo-time is also analyzed. We show that the asymptotic solution obtained through pseudo-time marching converges to the optimal solution provided the Hessian of the forward equation is positive definite in a neighborhood of the optimal solution. The superior noise tolerance and regularization-insensitive nature of the pseudo-dynamic strategy are demonstrated through numerical simulations in the context of both DCT and diffuse optical tomography. (C) 2010 Optical Society of America.
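The pseudo-time marching idea can be sketched on a toy nonlinear inverse problem: integrate the gradient-flow ODE dx/dt = −Jᵀ(f(x) − y) with explicit Euler pseudo-time steps until the iterate settles at the minimizer. The forward model below is invented for illustration; in DCT the forward map is a PDE solve:

```python
import numpy as np

# Invented nonlinear forward model; in DCT the forward map is a PDE solve.
def forward(x):
    return np.array([x[0] ** 2 + x[1], x[1] ** 2])

def jacobian(x):
    return np.array([[2.0 * x[0], 1.0],
                     [0.0, 2.0 * x[1]]])

x_star = np.array([1.0, 2.0])            # "true" parameters
y = forward(x_star)                      # noiseless boundary data

x = np.array([0.5, 1.0])                 # initial guess near the solution
dt = 0.02                                # pseudo-time step
for _ in range(10000):                   # explicit Euler pseudo-time marching
    residual = forward(x) - y
    x = x - dt * (jacobian(x).T @ residual)
```

Consistent with the abstract's stability condition, the iteration settles at the true parameters here because the Gauss-Newton Hessian is positive definite near the solution and the pseudo-time step is small enough for the explicit scheme to remain stable.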
Abstract:
We address the issue of noise robustness of reconstruction techniques for frequency-domain optical-coherence tomography (FDOCT). We consider three reconstruction techniques: Fourier, iterative phase recovery, and cepstral techniques. We characterize the reconstructions in terms of their statistical bias and variance and obtain approximate analytical expressions under the assumption of small noise. We also perform Monte Carlo analyses and show that the experimental results are in agreement with the theoretical predictions. It turns out that the iterative and cepstral techniques yield reconstructions with a smaller bias than the Fourier method. The three techniques, however, have identical variance profiles, and their consistency increases linearly as a function of the signal-to-noise ratio.
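The Monte Carlo bias/variance analysis can be illustrated with a simple magnitude estimator under additive complex Gaussian noise, which is positively biased. The signal and noise model here are assumptions for demonstration, not the FDOCT model of the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

s = 1.0                                  # noise-free signal magnitude
sigma = 0.5                              # noise std per real/imag component
trials = 20000

# Additive complex Gaussian noise; the magnitude estimator |s + n| is
# biased upward because |.| is convex (Jensen's inequality).
n = rng.normal(0.0, sigma, trials) + 1j * rng.normal(0.0, sigma, trials)
est = np.abs(s + n)

bias = est.mean() - s                    # Monte Carlo bias estimate
variance = est.var()                     # Monte Carlo variance estimate
```

Repeating such an experiment across noise levels is how empirical bias and variance curves are compared against small-noise analytical approximations.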