280 results for Particle image velocimetry


Relevance:

20.00%

Publisher:

Abstract:

Digital Image Correlation and Tracking (DIC/DDIT) is an optical method that employs tracking and image registration techniques for accurate 2D and 3D measurement of changes in images. It is most often used to measure deformation, displacement, and strain, but it is widely applied across many areas of science and engineering. One very common application is measuring the motion of an optical mouse.
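
To make the core idea concrete, here is a minimal, illustrative sketch of subset-based correlation in Python (NumPy assumed): a small reference patch from one frame is located in a second frame by maximizing normalized cross-correlation, giving an integer-pixel displacement. The function names and parameters are hypothetical and not taken from any particular DIC package.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def track_subset(ref_img, def_img, top_left, size=21, search=10):
    """Return the integer (dy, dx) shift of a square subset between two frames."""
    y0, x0 = top_left
    ref = ref_img[y0:y0 + size, x0:x0 + size].astype(np.float64)
    best_score, best_shift = -1.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ys, xs = y0 + dy, x0 + dx
            if ys < 0 or xs < 0:
                continue  # candidate window would fall outside the image
            cand = def_img[ys:ys + size, xs:xs + size].astype(np.float64)
            if cand.shape != ref.shape:
                continue
            score = ncc(ref, cand)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift, best_score
```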

Relevance:

20.00%

Publisher:

Abstract:

Image fusion techniques are useful for integrating the geometric detail of a high-resolution panchromatic (PAN) image with the spectral information of a low-resolution multispectral (MSS) image. This is particularly important for understanding land-use dynamics at larger scales (1:25000 or lower), which decision makers require in order to adopt holistic approaches to regional planning. Fused images extract features from the source images and provide more information than a single MSS scene: high spectral resolution aids in identifying objects more distinctly, while high spatial resolution allows objects to be located more precisely. Geoinformatics technologies, with their ability to provide high spatial- and spectral-resolution data, help in inventorying, mapping, monitoring and sustainably managing natural resources. The fusion module in GRDSS, taking into account the limited spatial resolution of MSS data and the limited spectral resolution of PAN data, provides the high spatial- and spectral-resolution remote sensing images required for land-use mapping at regional scale. GRDSS is a freeware GIS Graphic User Interface (GUI) developed in Tcl/Tk and based on the command-line tools of GRASS (Geographic Resources Analysis Support System), with functionalities for raster analysis, vector analysis, site analysis, image processing, modelling and graphics visualization. It has the capability to capture, store, process, analyse, prioritize and display spatial and temporal data.
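
As a hedged illustration of what a PAN/MSS fusion step can look like, the sketch below applies a simple Brovey-style ratio transform in Python with NumPy. This is one common pansharpening scheme, not necessarily the algorithm implemented in the GRDSS fusion module, and it assumes the MSS bands are already resampled to the PAN grid.

```python
import numpy as np

def brovey_fusion(msi, pan, eps=1e-6):
    """Brovey-style fusion.

    msi : (bands, H, W) multispectral array, resampled to the PAN grid
    pan : (H, W) panchromatic array
    """
    msi = msi.astype(np.float64)
    pan = pan.astype(np.float64)
    intensity = msi.mean(axis=0) + eps   # crude per-pixel intensity estimate
    ratio = pan / intensity              # spatial detail to inject
    fused = msi * ratio                  # rescale each band by the ratio image
    return fused
```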

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a low-cost but high-resolution retinal image acquisition system for the human eye. The images acquired by a CMOS image sensor are communicated through a Universal Serial Bus (USB) interface to a personal computer for viewing and further processing. The image acquisition time was estimated to be 2.5 seconds. The system can also be used in telemedicine applications.
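
The acquisition hardware described in the paper is custom, but as a rough software-side analogy, a USB video-class imager can be read on the PC with OpenCV as sketched below; the device index and output filename are assumptions for illustration only.

```python
import cv2

cap = cv2.VideoCapture(0)                    # open the first USB camera (assumed index)
ok, frame = cap.read()                       # grab a single frame
cap.release()
if ok:
    cv2.imwrite("retina_frame.png", frame)   # save the frame for further processing
```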

Relevance:

20.00%

Publisher:

Abstract:

This article is concerned with an unusual effect of biomass pellet density in modern stoves based on a close-coupled gasification-combustion process. The two processes, namely flaming of the volatiles and glowing of the char, show different effects. During the flaming process the mass flux of the fuel bears a constant ratio to the gasification air flow rate and is independent of particle density, whereas the char glowing process shows a distinct density effect. The bed temperatures show similar features: during flaming they are identical, but they are distinct in the char burn (gasification) regime. For wood char and pellet char the densities are 350 and 990 kg/m³, the burn rates are 2.5 and 3.5 g/min, and the bed temperatures are 1380 and 1502 K, respectively. A number of experiments on practical stoves showed wood char combustion rates of 2.5 ± 0.5 g/min and pellet char burn rates of 3.5 ± 0.5 g/min. To resolve these differences, experimental data on single-particle combustion were obtained for forced convection and ambient temperature effects. The single-particle char combustion rate in air follows a near-d² law, and surface and core temperatures are identical for wood and pellet char. A model based on a diffusion-controlled heat release-radiation-convection balance is set up. Explaining the observed results requires including the ash build-up over the char. This model is then used to explain the observed behavior in the packed bed; the different packing densities of the biomass chars, leading to different heat release rates per unit bed volume, are deduced as the cause of the differences in burn rate and bed temperature.
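
For reference, the classical diffusion-limited single-particle burn relation (the d² law mentioned above) and one plausible way to write the bed-level heat-release scaling that the abstract alludes to can be sketched as follows; the symbols are generic and are not the authors' notation.

```latex
% d^2 law: for diffusion-controlled burning, the squared particle diameter
% decreases linearly with time (K_b is an empirical burn-rate constant).
\[
  d^2(t) = d_0^2 - K_b\, t
\]
% Heat release per unit bed volume: with char packing density \rho_{bed},
% specific burn rate \dot{m}_s (burn rate per unit char mass) and heat of
% combustion \Delta h_c, denser packing releases more heat per unit volume.
\[
  \dot{q}''' \;\approx\; \rho_{\mathrm{bed}}\, \dot{m}_s\, \Delta h_c
\]
```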

Relevance:

20.00%

Publisher:

Abstract:

We propose the design and implementation of a hardware architecture for a spatial-prediction-based image compression scheme, which consists of a prediction phase and a quantization phase. In the prediction phase, a hierarchical tree structure obtained from the test image is used to predict every central pixel of the image from its four neighboring pixels. The prediction scheme generates an error image, to which a wavelet/sub-band coding algorithm can be applied to obtain efficient compression. The software model is evaluated in terms of entropy and standard deviation. Memory and silicon area constraints play a vital role in realizing the hardware for hand-held devices. The hardware architecture constructed for the proposed scheme exploits both instruction-level and data-level parallelism. The processor consists of pipelined functional units to obtain maximum throughput and a higher speed of operation. The hardware model is analyzed for performance in terms of throughput, speed and power. The results indicate that the proposed architecture is suitable for power-constrained implementations with higher data rates.
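
A minimal Python/NumPy sketch of the prediction idea is given below: each interior pixel is predicted from its four neighbours, and the first-order entropy of the resulting error image is computed. This stands in for, and is deliberately simpler than, the hierarchical-tree predictor described in the abstract.

```python
import numpy as np

def four_neighbour_error(img):
    """Predict each interior pixel as the mean of its 4 neighbours; return the error image."""
    img = img.astype(np.float64)
    pred = np.zeros_like(img)
    pred[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:]) / 4.0
    err = img - pred
    err[0, :] = err[-1, :] = err[:, 0] = err[:, -1] = 0.0   # ignore the border
    return err

def entropy(values, bins=256):
    """First-order entropy (bits/sample) of the error image, via a histogram."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```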

Relevance:

20.00%

Publisher:

Abstract:

With the introduction of 2D flat-panel X-ray detectors, 3D image reconstruction using helical cone-beam tomography is fast replacing conventional 2D reconstruction techniques. In 3D image reconstruction, the source orbit or scanning geometry should satisfy the data sufficiency or completeness condition for exact reconstruction. The helical scan geometry satisfies this condition and hence can give exact reconstruction. The theoretically exact helical cone-beam reconstruction algorithm proposed by Katsevich is a breakthrough and has attracted interest in 3D reconstruction using helical cone-beam computed tomography. In many practical situations the available projection data are incomplete. One such case is when the detector plane does not cover the full lateral extent of the object being imaged, resulting in truncated projections. This results in artifacts that mask small features near the periphery of the ROI when reconstruction is performed with the convolution back-projection (CBP) method under the assumption that the projection data are complete. A number of techniques exist that complete the missing data before CBP reconstruction. In 2D, linear prediction (LP) extrapolation has been shown to be efficient for data completion, involving minimal assumptions on the nature of the data and producing smooth extensions of the missing projection data. In this paper we propose to extend the LP approach to extrapolating truncated helical cone-beam data. In the truncated-data situation, the projection on the multi-row flat-panel detector has missing columns towards either end in the lateral direction. The available data from each detector row are modeled using a linear predictor, extrapolated, and the completed projection data are backprojected using the Katsevich algorithm. Simulation results show the efficacy of the proposed method.
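
The sketch below illustrates linear-prediction extrapolation of a single truncated detector row in Python/NumPy: an autoregressive predictor is fitted to the available samples by least squares and then run forward over the missing columns. The model order and fitting details are illustrative choices, not the parameters used in the paper.

```python
import numpy as np

def lp_extrapolate(row, n_missing, order=8):
    """Fit an AR(order) predictor to `row` by least squares and extend it by n_missing samples."""
    x = np.asarray(row, dtype=np.float64)
    # Least-squares system: x[k] ~ sum_i a[i] * x[k - 1 - i]
    A = np.vstack([x[k - 1::-1][:order] for k in range(order, len(x))])
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    out = list(x)
    for _ in range(n_missing):
        # Predict the next (missing) sample from the last `order` samples.
        out.append(float(np.dot(a, out[-1:-order - 1:-1])))
    return np.array(out)
```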

Relevance:

20.00%

Publisher:

Abstract:

Long-term stability studies of particle storage rings cannot be carried out using conventional numerical integration algorithms; symplectic integration algorithms that are both fast and accurate are required. In this paper we study a symplectic integration method in which the symplectic map representing the Hamiltonian system is refactorized using polynomial symplectic maps. This method is used to perform long-term integration on a particle storage ring.
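
For readers unfamiliar with symplectic integration, the sketch below shows a standard leapfrog (velocity-Verlet) integrator for a toy one-dimensional Hamiltonian in Python; it illustrates why symplectic schemes keep the energy bounded over very long times, but it is not the polynomial-map refactorization studied in the paper.

```python
def leapfrog(q, p, dV, dt, steps):
    """Advance (q, p) through `steps` leapfrog steps for H = p^2/2 + V(q)."""
    for _ in range(steps):
        p -= 0.5 * dt * dV(q)   # half kick
        q += dt * p             # drift
        p -= 0.5 * dt * dV(q)   # half kick
    return q, p

# Example: harmonic oscillator, V(q) = q^2/2 so dV/dq = q. The energy error
# stays bounded even over very many steps, unlike a non-symplectic scheme.
q, p = leapfrog(1.0, 0.0, lambda q: q, dt=0.1, steps=100_000)
```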

Relevance:

20.00%

Publisher:

Abstract:

Image and video filtering is a key image-processing task in computer vision, especially in noisy environments. In most cases the noise source is unknown, which poses a major difficulty for the filtering operation. In this paper we present an error-correction-based learning approach to iterative filtering. A new FIR filter is designed in which the filter coefficients are updated according to the Widrow-Hoff rule. Unlike a standard filter, the proposed filter can remove noise without a priori knowledge of the noise. Experimental results show that the proposed filter efficiently removes noise while preserving the edges in the image. We demonstrate the capability of the proposed algorithm by testing it on standard images corrupted by Gaussian noise and on a real-time video containing inherent noise. Experimental results show that the proposed filter performs better than some existing standard filters.
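
The Widrow-Hoff (LMS) update at the heart of such a filter can be sketched in Python/NumPy as below for a one-dimensional signal with a reference (desired) sequence. The tap count and step size are illustrative choices, and the paper's image/video filter adapts without an explicit noise reference, so this shows only the core update rule, not the authors' scheme.

```python
import numpy as np

def lms_filter(noisy, desired, n_taps=9, mu=0.01):
    """Adapt FIR coefficients w so that the filtered `noisy` tracks `desired`.

    noisy, desired : 1-D NumPy arrays of equal length.
    """
    w = np.zeros(n_taps)
    out = np.zeros(len(noisy))
    for n in range(n_taps, len(noisy)):
        x = noisy[n - n_taps:n][::-1]   # most recent samples first
        y = np.dot(w, x)                # current filter output
        e = desired[n] - y              # error signal
        w += mu * e * x                 # Widrow-Hoff (LMS) coefficient update
        out[n] = y
    return out, w
```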