989 results for Normalized image log slope


Relevance: 30.00%

Abstract:

In this paper, we present a wavelet-based approach to solving the non-linear perturbation equation encountered in optical tomography. A particularly suitable data-gathering geometry is used to collect a data set consisting of differential changes in intensity owing to the presence of the inhomogeneous regions. With this scheme, the unknown image, the data, and the weight matrix are all represented by wavelet expansions, yielding a representation of the original non-linear perturbation equation in the wavelet domain. The advantage of using the non-linear perturbation equation is that there is no need to recompute the derivatives during the reconstruction process: once the derivatives are computed, they are transformed into the wavelet domain. The motivation for working in the wavelet domain is its inherent localization and de-noising properties. Using the approximation coefficients alone, without the detail coefficients, is ideally suited to diffuse optical tomographic reconstruction, as the diffusion equation removes most of the high-frequency information and the reconstruction appears low-pass filtered. We demonstrate through numerical simulations that solving for the approximation coefficients alone reconstructs an image with the same information content as a reconstruction from the non-waveletized procedure. In addition, we demonstrate better noise tolerance and much reduced computation time with this approach.
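As a minimal sketch of the low-pass strategy described above (Haar wavelets, a 1-D image, and a random weight matrix are assumptions for illustration; the paper's actual operators are not specified here), the perturbation system can be solved for the approximation coefficients alone and the image synthesized with the detail coefficients set to zero:

```python
import numpy as np

def haar_upsample(n_coarse):
    """Synthesis operator mapping Haar approximation coefficients back to the
    fine grid, with the detail coefficients assumed zero (low-pass only)."""
    U = np.zeros((2 * n_coarse, n_coarse))
    for j in range(n_coarse):
        U[2 * j, j] = U[2 * j + 1, j] = 1.0 / np.sqrt(2.0)
    return U

def reconstruct_approx(W, d, n_coarse):
    """Solve the perturbation system d = W x for the Haar approximation
    coefficients only, then synthesize the low-pass image estimate."""
    U = haar_upsample(n_coarse)
    a, *_ = np.linalg.lstsq(W @ U, d, rcond=None)  # small coarse-scale system
    return U @ a                                   # low-pass filtered image
```

Because the coarse-scale system has half the unknowns, the least-squares solve is correspondingly cheaper, which mirrors the reduced computation time reported above.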

Relevance: 30.00%

Abstract:

The behaviour of saturated soils undergoing consolidation is very complex and may not follow Terzaghi's theory over the entire consolidation process. Different soils may fit Terzaghi's theory only over specific stages of the process (percentages of consolidation), which may be one reason for the difficulties the existing curve-fitting procedures face in obtaining the coefficient of consolidation, c(v). It has been shown that the slope of the initial linear portion of the theoretical log U-log T curve is constant over a wider range of the degree of consolidation, U, than in the other methods in use. This initial well-defined straight line in the log U-log T plot intersects the U = 100% line at T = pi/4, which corresponds to U = 88.3%. The proposed log delta-log t method is based on this observation and gives the value of c(v) through a simple graphical construction. In the proposed method, which is more versatile, identification of the characteristic straight lines is very clear, the intersection of these lines is more precise, and the method does not depend upon the initial compression for the determination of c(v).
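The geometric facts the method rests on can be checked numerically: the early-time Terzaghi solution behaves as U = 2*sqrt(T/pi), whose log U-log T slope is 1/2, and extending that straight line to U = 100% gives T = pi/4, at which the true series value is U ≈ 88.3%. A sketch of the series solution:

```python
import math

def terzaghi_U(T, terms=50):
    """Average degree of consolidation U(T) from Terzaghi's series solution:
    U = 1 - sum_m (2/M^2) exp(-M^2 T), with M = pi(2m+1)/2."""
    s = 0.0
    for m in range(terms):
        M = math.pi * (2 * m + 1) / 2.0
        s += (2.0 / M**2) * math.exp(-(M**2) * T)
    return 1.0 - s
```

Evaluating terzaghi_U(math.pi / 4) gives approximately 0.883, and the log-log slope between small time factors is close to 0.5, consistent with the construction described above.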

Relevance: 30.00%

Abstract:

In the present work, a cooling channel is employed to produce semi-solid A356 alloy slurry. To understand the transport processes involved, a 3D non-isothermal, multiphase volume-averaging model has been developed to simulate slurry generation in the cooling channel. For simulation purposes, the three phases considered (the parent melt, the nearly spherical grains, and air) are treated as separate but highly coupled interpenetrating continua. The conservation equations of mass, momentum, energy, and species are solved for each phase, and the thermal and mechanical interactions (drag force) among the phases are considered using appropriate models. The superheated liquid alloy is poured at the top of the cooling slope/channel, where a specified-velocity inlet boundary condition is used in the model, and allowed to flow under gravity along the channel. The melt loses its superheat and becomes semi-solid by the end of the cooling channel owing to the α-Al grains evolving with decreasing temperature. The air phase forms a definable air/liquid-melt interface, i.e. a free surface, due to its low density. The results obtained from the model include the volume fractions of the three phases, grain evolution, grain growth rate, and the size and distribution of the solid grains. The effects of key process variables, such as pouring temperature, slope angle of the cooling channel, and cooling-channel wall temperature, on the temperature distribution, velocity distribution, grain formation, and phase volume fractions are also studied. The simulation results are validated by microstructure study using SEM and quantitative image analysis of the semi-solid slurry obtained from the experimental set-up.
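As a hedged illustration of how solid fraction grows as the melt loses superheat along the channel (the lever rule and the nominal A356 liquidus/solidus temperatures below are assumptions for illustration; the paper's model solves the full conservation equations instead):

```python
def solid_fraction(T, T_liquidus=615.0, T_solidus=555.0):
    """Lever-rule estimate of solid fraction versus temperature (degrees C).
    The liquidus/solidus values are nominal A356 figures, not from the paper."""
    if T >= T_liquidus:
        return 0.0          # fully liquid above the liquidus
    if T <= T_solidus:
        return 1.0          # fully solid below the solidus
    return (T_liquidus - T) / (T_liquidus - T_solidus)
```

In a cooling-slope process the relevant regime is the low-solid-fraction window just below the liquidus, where the evolving grains remain nearly spherical.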

Relevance: 30.00%

Abstract:

This paper presents a GPU implementation of normalized cuts for the road-extraction problem using panchromatic satellite imagery. The roads are extracted in three stages: pre-processing, image segmentation, and post-processing. Initially, the image is pre-processed to improve tolerance by reducing clutter (mostly buildings, vegetation, and fallow regions). The road regions are then extracted using the normalized cuts algorithm, a graph-based partitioning approach that extracts the global impression (perceptual grouping) of an image rather than local features. The segmented image is post-processed using the morphological operations of erosion and dilation. Finally, the extracted road image is overlaid on the original image. A GPGPU (General-Purpose Graphics Processing Unit) approach has been adopted to implement the algorithm on the GPU for fast processing. A performance comparison of the proposed GPU implementation of normalized cuts with the earlier CPU implementation is presented. From the results, we conclude that the computational advantage of the GPU implementation grows with image size. A qualitative and quantitative assessment of the segmentation results is also presented.
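A minimal CPU sketch of the two-way normalized cut (the standard spectral relaxation via the normalized Laplacian; the paper's GPU kernels and road-specific pipeline are not reproduced here):

```python
import numpy as np

def normalized_cut_bipartition(W):
    """Two-way normalized cut of a graph with symmetric affinity matrix W:
    take the second-smallest eigenvector of the normalized Laplacian
    D^{-1/2}(D - W)D^{-1/2} and threshold it at zero."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L_sym)          # eigenvalues in ascending order
    fiedler = D_inv_sqrt @ vecs[:, 1]        # generalized eigenvector of (D - W)x = lam D x
    return fiedler >= 0                      # boolean partition labels
```

The eigensolve dominates the cost, which is why a GPU implementation pays off as the image (and hence the affinity graph) grows.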

Relevance: 30.00%

Abstract:

We present a new nonlinear integral transform relating the ocean wave spectrum to the along-track interferometric synthetic aperture radar (AT-INSAR) image spectrum. The AT-INSAR, a synthetic aperture radar (SAR) employing two antennas displaced along the platform's flight direction, is considered a better instrument for imaging ocean waves than the SAR, because it yields the phase spectrum in addition to the amplitude spectrum provided by a conventional SAR. While the SAR and AT-INSAR amplitude spectra depend strongly on the modulation of the normalized radar cross section (NRCS) by the long ocean waves, which is poorly known, the phase spectrum depends only weakly on this modulation. By measuring the phase difference between the signals received by the two antennas, AT-INSAR measures the radial component of the orbital velocity associated with the ocean waves, which is related to the ocean wave height field by a well-known transfer function. The nonlinear integral transform derived in this paper differs from the one previously derived by Bao et al. [1999] by an additional term containing the derivative of the radial component of the orbital velocity associated with the long ocean waves. By carrying out numerical simulations, we show that, in general, this additional term cannot be neglected. Furthermore, we present two new quasi-linear approximations to the nonlinear integral transform relating the ocean wave spectrum to the AT-INSAR phase spectrum.

Relevance: 30.00%

Abstract:

Considering the characteristic time and space scales of the eddies, we established a quasi-static, quasi-geostrophic model to describe their variation and movement in shelf-slope water. The analytical solution revealed the main properties of the variation, namely slow expansion and fast stagnation processes, and the law governing eddy motion under the background field. All theoretical results are supported by satellite image measurements.

Relevance: 30.00%

Abstract:

The VEGETATION (VGT) sensor on SPOT 4 has four spectral bands that are equivalent to Landsat Thematic Mapper (TM) bands (blue, red, near-infrared and mid-infrared) and provides daily images of the global land surface at a 1-km spatial resolution. We propose a new index for identifying and mapping snow/ice cover, namely the Normalized Difference Snow/Ice Index (NDSII), which uses reflectance values of the red and mid-infrared spectral bands of Landsat TM and VGT. For Landsat TM data, NDSII is calculated as NDSII_TM = (TM3 - TM5)/(TM3 + TM5); for VGT data, it is calculated as NDSII_VGT = (B2 - MIR)/(B2 + MIR). As a case study we used a Landsat TM image covering the eastern part of the Qilian mountain range in the Qinghai-Xizang (Tibetan) plateau of China. NDSII_TM gave estimates of the area and spatial distribution of snow/ice cover similar to those of the Normalized Difference Snow Index (NDSI = (TM2 - TM5)/(TM2 + TM5)) proposed by Hall et al. The results indicated that the VGT sensor may have the potential for operational monitoring and mapping of snow/ice cover at regional to global scales when using NDSII_VGT.
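The index itself is a one-line band ratio; the sketch below assumes reflectance arrays for the red and mid-infrared bands (TM3/TM5 for Landsat TM, B2/MIR for VGT):

```python
import numpy as np

def ndsii(red, mir):
    """Normalized Difference Snow/Ice Index: (red - MIR)/(red + MIR).
    Snow/ice is bright in the red band and dark in the mid-infrared,
    so NDSII is high over snow/ice cover."""
    red = np.asarray(red, dtype=float)
    mir = np.asarray(mir, dtype=float)
    return (red - mir) / (red + mir)
```

For example, a pixel with red reflectance 0.9 and mid-infrared reflectance 0.1 gives NDSII = 0.8, while bare soil with comparable reflectance in both bands gives a value near zero.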

Relevance: 30.00%

Abstract:

Log-polar image architectures, motivated by the structure of the human visual field, have long been investigated in computer vision for use in estimating motion parameters from an optical flow vector field. Practical problems with this approach have been: (i) dependence on the assumed alignment of the visual and motion axes; (ii) sensitivity to occlusion from moving and stationary objects in the central visual field, where much of the numerical sensitivity is concentrated; and (iii) inaccuracy of the log-polar architecture (which is an approximation to the central 20°) for wide-field biological vision. In the present paper, we show that an algorithm based on a generalization of the log-polar architecture, termed the log-dipolar sensor, provides a large improvement in performance relative to the usual log-polar sampling. Specifically, our algorithm: (i) is tolerant of large misalignment of the optical and motion axes; (ii) is insensitive to significant occlusion by objects of unknown motion; and (iii) represents a more correct analogy to the wide-field structure of human vision. Using the Helmholtz-Hodge decomposition to estimate the optical flow vector field on a log-dipolar sensor, we demonstrate these advantages using synthetic optical flow maps as well as natural image sequences.
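The basic log-polar mapping that the architecture builds on can be sketched as follows (the log-dipolar generalization itself is the paper's contribution and is not reproduced here; the epsilon guard is an implementation detail to avoid log(0) at the fovea):

```python
import numpy as np

def to_log_polar(x, y, eps=1e-12):
    """Map image-plane coordinates (x, y) to log-polar coordinates
    (log r, theta), the sampling scheme of foveated vision models."""
    r = np.hypot(x, y)               # eccentricity
    return np.log(r + eps), np.arctan2(y, x)
```

Under this mapping, scaling about the origin becomes a shift in log r and rotation becomes a shift in theta, which is what makes the representation convenient for motion-parameter estimation.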

Relevance: 30.00%

Abstract:

This work investigated the differences between multileaf collimator (MLC) positioning accuracy determined using either log files or electronic portal imaging devices (EPIDs), and then assessed the possibility of reducing patient-specific quality control (QC) via phantom-less methodologies. In-house software was developed, and validated, to track MLC positional accuracy with the rotational and static gantry picket fence tests using an integrated electronic portal image. This software was used to monitor daily MLC performance over a one-year period on two Varian TrueBeam linear accelerators, with the results compared directly against MLC positions determined from leaf trajectory log files. The software was validated by introducing known shifts and collimator errors. Skewness of the MLCs was found to be 0.03 ± 0.06° (mean ± 1 standard deviation (SD)) and depended on whether the collimator was rotated manually or automatically. Trajectory log files, analysed using in-house software, showed average MLC positioning errors with magnitudes of 0.004 ± 0.003 mm (rotational) and 0.004 ± 0.011 mm (static) across the two TrueBeam units over one year (mean ± 1 SD). These ranges, as indicated by the SD, were lower than the corresponding average MLC positioning errors of 0.000 ± 0.025 mm (rotational) and 0.000 ± 0.039 mm (static) obtained using the in-house EPID-based software. The range of EPID-measured MLC positional errors was larger because of the inherent uncertainties of the procedure. Over the duration of the study, multiple MLC positional errors were detected by the EPID-based software that were not detected in the trajectory log files. This work shows the importance of increasing linac-specific QC when phantom-less methodologies, such as the use of log files, are used to reduce patient-specific QC. Tolerances of 0.25 mm have been established for MLC positional errors using the EPID-based automated picket fence test.
The software allows diagnosis of any specific leaf that needs repair and gives an indication of the course of action required.
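The 0.25 mm tolerance check can be sketched as a simple per-leaf comparison (the leaf positions below are hypothetical; the actual software derives measured positions from integrated EPID images of the picket fence pattern):

```python
def mlc_leaves_within_tolerance(measured_mm, planned_mm, tol_mm=0.25):
    """Return, per leaf, whether the measured picket-fence position is within
    the tolerance of the planned position (0.25 mm in this work)."""
    return [abs(m - p) <= tol_mm for m, p in zip(measured_mm, planned_mm)]
```

A leaf flagged False would be the kind of case the software singles out for repair.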

Relevance: 30.00%

Abstract:

A photograph of a man dressed in a shirt and bow-tie cutting a log, with two other men in suits standing on either side of the log. They are surrounded by a large crowd.

Relevance: 30.00%

Abstract:

A photograph of an elderly woman standing on the porch of a log house. A handwritten note is on the reverse.

Relevance: 30.00%

Abstract:

A visual SLAM system has been implemented and optimised for real-time deployment on an AUV equipped with calibrated stereo cameras. The system incorporates a novel approach to landmark description in which landmarks are local submaps consisting of a cloud of 3D points and their associated SIFT/SURF descriptors. Landmarks are also sparsely distributed, which simplifies and accelerates data association and map updates. In addition to landmark-based localisation, the system uses visual odometry to estimate the pose of the vehicle in six degrees of freedom by identifying temporal matches between consecutive local submaps and computing the motion. Both the extended Kalman filter and the unscented Kalman filter have been considered for filtering the observations. The output of the filter is also smoothed using the Rauch-Tung-Striebel (RTS) method to obtain a better alignment of the sequence of local submaps and to deliver a large-scale 3D acquisition of the surveyed area. Synthetic experiments have been performed using a simulation environment in which ray tracing is used to generate synthetic images for the stereo system.
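A single predict/update cycle of the kind of Kalman filtering used to fuse visual-odometry motion (predict) with landmark observations (update) can be sketched as follows; this is the plain linear filter with generic matrices, not the EKF/UKF on the 6-DOF vehicle state used in the actual system:

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    """One Kalman-filter predict/update cycle.
    x, P: state estimate and covariance; z: observation;
    F, Q: motion model and its noise; H, R: observation model and its noise."""
    # Predict with the motion model (visual odometry role).
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the observation (landmark re-sighting role).
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

The RTS smoother mentioned above then runs backwards over the stored filtered estimates to refine the whole trajectory.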

Relevance: 30.00%

Abstract:

The elucidation of spatial variation in the landscape can indicate potential wildlife habitats or breeding sites for vectors, such as ticks or mosquitoes, which cause a range of diseases. Information from remotely sensed data can aid the delineation of vegetation distribution on the ground in areas where local knowledge is limited. The data from digital images are often difficult to interpret because of pixel-to-pixel variation, that is, noise, and complex variation at more than one spatial scale. Landsat Enhanced Thematic Mapper Plus (ETM+) and Satellite Pour l'Observation de la Terre (SPOT) image data were analyzed for an area close to Douna in Mali, West Africa. The variograms of the normalized difference vegetation index (NDVI) from both types of image data were nested. The parameters of the nested variogram function from the Landsat ETM+ data were used to design the sampling for a ground survey of soil and vegetation data. Variograms of the soil and vegetation data showed that their variation was anisotropic and that their scales of variation were similar to those of NDVI from the SPOT data. The short- and long-range components of variation in the SPOT data were filtered out separately by factorial kriging. The map of the short-range component appears to represent the patterns of vegetation and the associated shallow slopes and drainage channels of the tiger bush system. The map of the long-range component also appears to relate to broader patterns in the tiger bush and to gentle undulations in the topography. The results suggest that the types of image data analyzed in this study could be used to identify areas with more moisture in semiarid regions that could support wildlife and also be potential vector breeding sites.
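The empirical semivariogram underlying this kind of analysis can be sketched for a 1-D transect (the study's variograms are 2-D, nested, and anisotropic; this is only the basic estimator):

```python
import numpy as np

def semivariogram(values, max_lag):
    """Empirical semivariance gamma(h) of a regularly spaced 1-D transect
    for lags h = 1..max_lag: gamma(h) = mean((z(i+h) - z(i))^2) / 2."""
    values = np.asarray(values, dtype=float)
    return [0.5 * np.mean((values[h:] - values[:-h]) ** 2)
            for h in range(1, max_lag + 1)]
```

Fitting a nested model (e.g. a sum of two structures with different ranges) to such estimates is what allows the short- and long-range components to be separated by factorial kriging.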

Relevance: 30.00%

Abstract:

Perfusion CT imaging of the liver has the potential to improve the evaluation of tumour angiogenesis. Quantitative parameters can be obtained by applying mathematical models to the Time Attenuation Curve (TAC). However, accurate quantification of perfusion parameters remains difficult, owing, for example, to the algorithms employed, the mathematical model, the patient's weight and cardiac output, and the acquisition system. In this thesis, new parameters and alternative methodologies for liver perfusion CT are presented in order to investigate the sources of variability of this technique. First, analyses were made to assess the variability related to the mathematical model used to compute arterial Blood Flow (BFa) values. Results were obtained by implementing algorithms based on the "maximum slope method" and the "dual-input one-compartment model". Statistical analysis on simulated data demonstrated that the two methods are not interchangeable, although the slope method is always applicable in a clinical context. Next, the variability related to TAC processing in the application of the slope method was analyzed. Comparison with manual selection identified the best automatic algorithm for computing BFa. The consistency of a Standardized Perfusion Index (SPV) was evaluated and a simplified calibration procedure was proposed. Finally, the quantitative value of the perfusion map was analyzed. The ROI approach and the map approach provide consistent values of BFa, which means that the pixel-by-pixel algorithm gives reliable quantitative results; within the pixel-by-pixel approach, the slope method again gives better results. In conclusion, the development of new automatic algorithms for a consistent computation of BFa, together with the definition of a simplified technique to compute the SPV parameter, represents an improvement in the field of liver perfusion CT analysis.
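The maximum slope method can be sketched directly from its definition: BFa is the maximum slope of the tissue time-attenuation curve divided by the peak of the arterial input curve (the curves below are synthetic and purely illustrative):

```python
import numpy as np

def bfa_max_slope(tissue_tac, arterial_tac, dt):
    """Arterial blood flow by the maximum slope method:
    BFa = max d/dt(tissue TAC) / peak(arterial TAC).
    dt is the sampling interval of the TACs."""
    tissue = np.asarray(tissue_tac, dtype=float)
    max_slope = np.max(np.diff(tissue)) / dt      # steepest enhancement rate
    return max_slope / np.max(arterial_tac)       # normalize by arterial peak
```

The sensitivity of this estimate to how the TAC is smoothed and to where the slope is measured is exactly the TAC-processing variability the thesis analyzes.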

Relevance: 30.00%

Abstract:

An algorithm for the real-time registration of a retinal video sequence captured with a scanning digital ophthalmoscope (SDO) to a retinal composite image is presented. This method is designed for a computer-assisted retinal laser photocoagulation system to compensate for retinal motion and hence enhance the accuracy, speed, and patient safety of retinal laser treatments. The procedure combines intensity- and feature-based registration techniques. For the registration of an individual frame, the translational frame-to-frame motion between the preceding and current frame is detected by normalized cross correlation. Next, vessel points on the current video frame are identified and an initial transformation estimate is constructed from the calculated translation vector and the quadratic registration matrix of the previous frame. The vessel points are then iteratively matched to the segmented vessel centerline of the composite image to refine the initial transformation and register the video frame to the composite image. Criteria for image quality and algorithm convergence are introduced, which govern the exclusion of single frames from the registration process and signal a loss of tracking if necessary. The algorithm was successfully applied to ten different video sequences recorded from patients. It achieved an average accuracy of 2.47 ± 2.0 pixels (∼23.2 ± 18.8 μm) over 2764 evaluated video frames and demonstrated that it meets the clinical requirements.
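The translational frame-to-frame step can be sketched in 1-D: the shift is recovered as the peak of the normalized cross correlation between consecutive signals (the actual system works on 2-D video frames):

```python
import numpy as np

def ncc_shift_1d(reference, frame):
    """Estimate the integer translation of `frame` relative to `reference`
    as the lag at the peak of the normalized cross correlation."""
    ref = np.asarray(reference, dtype=float)
    frm = np.asarray(frame, dtype=float)
    ref = (ref - ref.mean()) / ref.std()          # zero-mean, unit-variance
    frm = (frm - frm.mean()) / frm.std()
    corr = np.correlate(frm, ref, mode="full")    # all lags
    return int(np.argmax(corr)) - (len(ref) - 1)  # convert index to shift
```

Normalizing both signals makes the peak location insensitive to global brightness changes between frames, which matters for video captured under varying illumination.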