930 results for drought reconstruction
Abstract:
Traditional image reconstruction methods in rapid dynamic diffuse optical tomography employ ℓ2-norm-based regularization, which is known to remove the high-frequency components in the reconstructed images and make them appear smooth. The contrast recovery in these types of methods typically depends on the nature of the method employed: nonlinear iterative techniques are known to perform better than linear (noniterative) ones, with the caveat that nonlinear techniques are computationally complex. Assuming a linear dependency of the solution between successive frames reduces the reconstruction to a linear inverse problem. Combined with ℓ1-norm-based regularization, this new framework can provide better robustness to noise and better contrast recovery than conventional ℓ2-based techniques. Moreover, it is shown that the proposed ℓ1-based technique is computationally efficient compared to its ℓ2-based counterpart. The proposed framework requires a reasonably close estimate of the actual solution for the initial frame, and any suboptimal estimate leads to erroneous reconstruction results for the subsequent frames.
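A minimal sketch of the kind of ℓ1-regularized linear inverse solve described above, using generic iterative soft thresholding (ISTA); the Jacobian J, data vector y, and parameter values are illustrative placeholders rather than the paper's actual per-frame formulation:

```python
import numpy as np

def ista_l1(J, y, lam=0.05, n_iter=200):
    """Solve min_x 0.5*||J x - y||^2 + lam*||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(J, 2) ** 2                # Lipschitz constant of the gradient
    x = np.zeros(J.shape[1])
    for _ in range(n_iter):
        z = x - J.T @ (J @ x - y) / L            # gradient step on the data-fit term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

# Toy usage: an underdetermined system with a sparse ground truth
rng = np.random.default_rng(0)
J = rng.standard_normal((40, 100))               # stand-in for a sensitivity matrix
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -0.5, 0.8]
y = J @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista_l1(J, y)
```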
Abstract:
CAELinux is a Linux distribution bundled with free software packages related to Computer Aided Engineering (CAE), including software for building three-dimensional solid models, meshing geometries, carrying out Finite Element Analysis (FEA), image processing, etc. The present work has two goals: 1) to give a brief description of CAELinux, and 2) to demonstrate that CAELinux can be useful for Computer Aided Engineering, using as an example the three-dimensional reconstruction of a pig liver from a stack of CT-scan images. Reconstructing the liver with commercial software instead of CAELinux would be expensive, whereas CAELinux is a free and open-source operating system and all software packages included in it are also free. Hence CAELinux can be a very useful tool in application areas such as surgical simulation, which require three-dimensional reconstructions of biological organs, and for Computer Aided Engineering in general.
Abstract:
Drought is the most crucial environmental factor limiting the productivity of many crop plants. Exploring novel genes and gene combinations is of primary importance in plant drought tolerance research. Stress-tolerant genotypes/species are known to express novel stress-responsive genes with unique functional significance. Hence, identification and characterization of stress-responsive genes from these tolerant species might be a reliable option for engineering drought tolerance. Safflower has been found to be a relatively drought-tolerant crop and was therefore chosen for characterizing the genes expressed under drought stress. In the present study, we evaluated the differential drought tolerance of two safflower cultivars, A1 and Nira, using selected physiological marker traits and identified cultivar A1 as relatively drought tolerant. To identify the drought-responsive genes, we constructed a stress-subtracted cDNA library from cultivar A1 following subtractive hybridization. Analysis of ~1,300 cDNA clones resulted in the identification of 667 unique drought-responsive ESTs. A protein homology search revealed that 521 (78%) of the 667 ESTs showed significant similarity to known sequences in the database; the majority of these were previously identified as drought stress-related genes involved in a variety of cellular functions ranging from stress perception to cellular protection. The remaining 146 (22%) ESTs were not homologous to known sequences in the database and were therefore considered unique and novel drought-responsive genes of safflower. Since safflower is a stress-adapted oilseed crop, this observation has great relevance. In addition, to validate the differential expression of the identified genes, expression profiles of selected clones were analyzed using dot blot (reverse northern) and northern blot analysis. We showed that these clones were differentially expressed under different abiotic stress conditions. The implications of the analyzed genes in abiotic stress tolerance are discussed in our study.
Abstract:
We propose an iterative data reconstruction technique specifically designed for multi-dimensional multi-color fluorescence imaging. A Markov random field is employed to model the multi-color image field, in conjunction with the classical maximum likelihood method. It is noted that the ill-posed nature of the inverse problem associated with multi-color fluorescence imaging forces iterative data reconstruction. Reconstruction of three-dimensional (3D) two-color images (obtained from nanobeads and cultured cell samples) shows a significant reduction in the background noise (improved signal-to-noise ratio) with an impressive overall improvement in the spatial resolution (≈250 nm) of the imaging system. The proposed data reconstruction technique may find immediate application in 3D in vivo and in vitro multi-color fluorescence imaging of biological specimens. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4769058]
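For orientation, a generic maximum-likelihood (Richardson-Lucy) deconvolution loop of the kind such iterative schemes build on is sketched below; it assumes a centered, normalized, shift-invariant PSF of the same shape as the image, and omits the Markov-random-field prior that the paper couples with the ML step:

```python
import numpy as np

def richardson_lucy(image, psf, n_iter=50):
    """Poisson maximum-likelihood (Richardson-Lucy) deconvolution sketch.

    Assumes `psf` has the same shape as `image`, is centered, non-negative,
    and sums to one; no MRF prior is included here.
    """
    psf_ft = np.fft.rfftn(np.fft.ifftshift(psf), s=image.shape)

    def conv(x, ft):
        return np.fft.irfftn(np.fft.rfftn(x) * ft, s=image.shape)

    est = np.full(image.shape, image.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = np.clip(conv(est, psf_ft), 1e-12, None)    # forward model
        est *= conv(image / blurred, np.conj(psf_ft))        # multiplicative ML update
        est = np.clip(est, 0.0, None)                        # keep the estimate non-negative
    return est
```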
Abstract:
The classical approach to A/D conversion has been uniform sampling, and perfect reconstruction of bandlimited signals is obtained by satisfying the Nyquist sampling theorem. We propose a non-uniform sampling scheme based on level crossing (LC) time information. We show stable reconstruction of bandpass signals with the correct scale factor, and hence a unique reconstruction, from only the non-uniform time information. For reconstruction from the level crossings we make use of sparse-reconstruction-based optimization, constraining the bandpass signal to be sparse in its frequency content. While the literature resorts to an overdetermined system of equations, we use an underdetermined approach along with a sparse reconstruction formulation. We obtain a reconstruction SNR > 20 dB and perfect support recovery with probability close to 1 in the noiseless case, and with lower probability in the noisy case. Randomly picking level crossings from different levels, over the same limited signal duration and for the same length of information, is seen to be advantageous for reconstruction.
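As a rough illustration of recovering a frequency-sparse signal from non-uniform time samples, the sketch below builds a Fourier dictionary at hypothetical crossing instants and solves the underdetermined system greedily with orthogonal matching pursuit; the sampling setup, frequency grid, and sparsity level are toy assumptions, not the paper's experiment:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k atoms, re-fit by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.conj().T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1], dtype=complex)
    x[support] = coef
    return x

# Toy level-crossing-style setup: non-uniform sample instants of a 120 Hz tone
rng = np.random.default_rng(1)
fs, N = 1024.0, 256
freqs = (np.arange(N) - N // 2) * fs / N          # frequency grid, -fs/2 .. fs/2
t = np.sort(rng.uniform(0.0, 0.25, 60))           # stand-in for crossing instants
values = np.cos(2 * np.pi * 120.0 * t)            # signal values at those instants
A = np.exp(2j * np.pi * np.outer(t, freqs)) / np.sqrt(N)
x_hat = omp(A, values.astype(complex), k=2)       # two active bins at +/-120 Hz
```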
Abstract:
This paper considers the problem of identifying the footprints of communication of multiple transmitters in a given geographical area. To do this, a number of sensors are deployed at arbitrary but known locations in the area, and their individual decisions regarding the presence or absence of the transmitters' signal are combined at a fusion center to reconstruct the spatial spectral usage map. One straightforward scheme for constructing this map is to query each of the sensors and cluster the sensors that detect the primary's signal. However, using the fact that a typical transmitter footprint map is a sparse image, two novel compressive sensing based schemes are proposed, which require significantly fewer transmissions than the querying scheme. A key feature of the proposed schemes is that the measurement matrix is constructed from a pseudo-random binary phase shift applied to the decision of each sensor prior to transmission. The measurement matrix is thus a binary ensemble which satisfies the restricted isometry property. The number of measurements needed for accurate footprint reconstruction is determined using compressive sampling theory. The three schemes are compared through simulations in terms of a performance measure that quantifies the accuracy of the reconstructed spatial spectral usage map. It is found that the proposed sparse reconstruction technique-based schemes significantly outperform the round-robin querying scheme.
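The measurement model described above can be sketched as follows; the sizes, number of measurements, and random seed are illustrative assumptions, and any standard sparse solver (for example, the ISTA sketch shown earlier) can be used for the recovery step:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sensors, n_active, n_meas = 200, 8, 60      # hypothetical sizes; M ~ O(K log(N/K))

# 0/1 decisions: only sensors inside the transmitter footprint detect the signal
d = np.zeros(n_sensors)
d[rng.choice(n_sensors, n_active, replace=False)] = 1.0

# Pseudo-random binary phase (+1/-1) applied to each sensor's decision before
# transmission; each fusion-center measurement is a coherent sum of the
# phase-flipped decisions, so the measurement matrix is a binary ensemble.
Phi = rng.choice([-1.0, 1.0], size=(n_meas, n_sensors)) / np.sqrt(n_meas)
y = Phi @ d   # the sparse footprint map d is then recovered from these M measurements
```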
Abstract:
We address the problem of signal reconstruction from the Fourier transform magnitude spectrum. The problem arises in many real-world scenarios where magnitude-only measurements are possible, but a complex-valued signal must be constructed from those measurements. We present some new general results in this context and show that the previously known results on minimum-phase rational transfer functions, and the recoverability of minimum-phase functions from the magnitude spectrum, are special cases of the results reported in this paper. Some simulation results are also provided to demonstrate the practical feasibility of the reconstruction methodology.
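The classical special case mentioned above, recovering a minimum-phase signal from its magnitude spectrum via the real cepstrum, can be sketched as follows (a textbook homomorphic construction, not the paper's more general method):

```python
import numpy as np

def minimum_phase_from_magnitude(mag):
    """Recover the minimum-phase signal whose DFT magnitude equals `mag`.

    Classical real-cepstrum (homomorphic) construction: fold the real cepstrum
    onto the causal part and exponentiate back to the spectral domain.
    """
    n = len(mag)
    cep = np.fft.ifft(np.log(np.maximum(mag, 1e-12))).real
    w = np.zeros(n)
    w[0] = 1.0
    w[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        w[n // 2] = 1.0
    return np.fft.ifft(np.exp(np.fft.fft(w * cep))).real

# Toy check: h = [1, 0.5, 0, ...] is minimum phase (its zero lies inside the unit circle)
h = np.array([1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
h_rec = minimum_phase_from_magnitude(np.abs(np.fft.fft(h)))   # recovers h up to numerics
```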
Abstract:
General circulation models (GCMs) are routinely used to simulate future climatic conditions. However, rainfall outputs from GCMs are highly uncertain in preserving temporal correlations, frequencies, and intensity distributions, which limits their direct application in downscaling and hydrological modeling studies. To address these limitations, raw outputs of GCMs or regional climate models are often bias corrected using past observations. In this paper, a methodology is presented for using a nested bias-correction approach to predict the frequencies and occurrences of severe droughts and wet conditions across India for a 48-year period (2050-2099) centered at 2075. Specifically, monthly time series of rainfall from 17 GCMs are used to draw conclusions about extreme events. An increasing trend in the frequencies of droughts and wet events is observed. The northern part of India and the coastal regions show the maximum increase in the frequency of wet events. Drought events are expected to increase in the west central, peninsular, and central northeast regions of India. (C) 2013 American Society of Civil Engineers.
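Bias correction of GCM rainfall is often illustrated with simple empirical quantile mapping; the sketch below shows only that basic idea and is not the nested bias-correction scheme applied in the paper (which corrects statistics across nested time scales):

```python
import numpy as np

def quantile_map(gcm_hist, obs_hist, gcm_future):
    """Empirical quantile mapping: map model quantiles onto observed quantiles."""
    ranks = np.searchsorted(np.sort(gcm_hist), gcm_future, side="right") / len(gcm_hist)
    ranks = np.clip(ranks, 1e-6, 1.0 - 1e-6)
    return np.quantile(obs_hist, ranks)

# Hypothetical monthly rainfall series (mm)
rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 40.0, 600)          # observed climatology
gcm = rng.gamma(2.0, 55.0, 600)          # biased GCM output for the same period
gcm_fut = rng.gamma(2.0, 60.0, 600)      # future GCM projection
corrected = quantile_map(gcm, obs, gcm_fut)
```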
Abstract:
This study borrows the measures developed for the operation of water resources systems as a means of characterizing droughts in a given region. It is argued that the common approach of assessing drought using a univariate measure (severity or reliability) is inadequate, as decision makers need an assessment of the other facets considered here. It is proposed that the joint distribution of reliability, resilience, and vulnerability (referred to as RRV in a reservoir operation context), assessed using soil moisture data over the study region, be used to characterize droughts. Copulas are used to quantify the joint distribution between these variables. As reliability and resilience vary in a nonlinear but almost deterministic way, the joint probability distribution of only resilience and vulnerability is modeled. Recognizing the negative association between the two variables, a Plackett copula is used to formulate the joint distribution. The developed drought index, referred to as the drought management index (DMI), is able to differentiate the drought proneness of a given area relative to other areas. An assessment of the sensitivity of the DMI to the length of the data segments used in evaluation indicates that relative stability is achieved if the data segments are 5 years or longer. The proposed approach is illustrated with reference to the Malaprabha River basin in India, using four adjoining Climate Prediction Center grid cells of soil moisture data that cover an area of approximately 12,000 km². (C) 2013 American Society of Civil Engineers.
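A minimal sketch of the ingredients involved: Hashimoto-style reliability, resilience, and vulnerability computed from a soil-moisture series against a failure threshold, plus the Plackett copula CDF. The threshold choice, the fitting of the copula parameter to the observed negative association, and the DMI definition itself follow the paper and are not reproduced here:

```python
import numpy as np

def rrv(series, threshold):
    """Reliability, resilience, and vulnerability of a series w.r.t. a failure threshold."""
    fail = series < threshold
    reliability = 1.0 - fail.mean()                       # fraction of satisfactory steps
    recoveries = np.sum(fail[:-1] & ~fail[1:])            # failure -> recovery transitions
    resilience = recoveries / max(int(fail.sum()), 1)
    vulnerability = (threshold - series[fail]).mean() if fail.any() else 0.0
    return reliability, resilience, vulnerability

def plackett_cdf(u, v, theta):
    """Plackett copula C(u, v; theta), valid for theta > 0 and theta != 1."""
    s = 1.0 + (theta - 1.0) * (u + v)
    return (s - np.sqrt(s ** 2 - 4.0 * theta * (theta - 1.0) * u * v)) / (2.0 * (theta - 1.0))
```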
Abstract:
We propose and experimentally demonstrate a three-dimensional (3D) image reconstruction methodology based on Taylor series approximation (TSA) in a Bayesian image reconstruction formulation. TSA incorporates the requirement of analyticity in the image domain and acts as a finite impulse response filter. The technique is validated on images obtained from widefield microscopy, confocal laser scanning fluorescence microscopy, and two-photon excitation 4Pi (2PE-4Pi) fluorescence microscopy. Studies on simulated 3D objects, mitochondria-tagged yeast cells (labeled with MitoTracker Orange), and mitochondrial networks (tagged with green fluorescent protein) show a signal-to-background improvement of 40% and a resolution enhancement from 360 to 240 nm. This technique can easily be extended to other imaging modalities (single plane illumination microscopy (SPIM), individual molecule localization SPIM, stimulated emission depletion microscopy and its variants).
Abstract:
Imaging thick specimens at large penetration depths is a challenge in biophysics and material science. Refractive index mismatch results in spherical aberration that is responsible for streaking artifacts, while the Poissonian nature of photon emission and scattering introduces noise in the acquired three-dimensional image. To overcome these unwanted artifacts, we introduce a two-fold approach: first, point-spread function modeling with correction for spherical aberration and, second, a maximum-likelihood reconstruction technique to eliminate noise. Experimental results on fluorescent nano-beads and fluorescently coated yeast cells (encaged in agarose gel) show substantial minimization of artifacts. The noise is substantially suppressed, whereas the side lobes (generated by the streaking effect) drop by 48.6% compared to the raw data at a depth of 150 μm. The proposed imaging technique can be integrated into sophisticated fluorescence imaging techniques for rendering high resolution beyond the 150 μm mark. (C) 2013 AIP Publishing LLC.
Abstract:
Sparse recovery methods utilize ℓp-norm-based regularization in the estimation problem, with 0 ≤ p ≤ 1. These methods have better utility when the number of independent measurements is limited, which is typical of the diffuse optical tomographic image reconstruction problem. These sparse recovery methods, along with an approximation to utilize the ℓ0-norm, have been deployed for the reconstruction of diffuse optical images. Their performance was compared systematically using both numerical and gelatin phantom cases to show that these methods hold promise in improving the reconstructed image quality.
Abstract:
A novel Projection Error Propagation-based Regularization (PEPR) method is proposed to improve the image quality in Electrical Impedance Tomography (EIT). The PEPR method defines the regularization parameter as a function of the projection error arising from the difference between experimental measurements and calculated data. The regularization parameter in the reconstruction algorithm is modified automatically according to the noise level in the measured data and the ill-posedness of the Hessian matrix. Resistivity imaging of practical phantoms is carried out with PEPR in a Model Based Iterative Image Reconstruction (MoBIIR) algorithm as well as with the Electrical Impedance and Diffuse Optical Reconstruction Software (EIDORS). The effect of the PEPR method is also studied with phantoms of different configurations and with different current injection methods. All the resistivity images reconstructed with the PEPR method are compared with the single step regularization (STR) and Modified Levenberg Regularization (LMR) techniques. The results show that the PEPR technique reduces the projection error and solution error in each iteration, for both simulated and experimental data in both algorithms, and improves the reconstructed images, yielding better contrast-to-noise ratio (CNR), percentage of contrast recovery (PCR), coefficient of contrast (COC), and diametric resistivity profile (DRP). (C) 2013 Elsevier Ltd. All rights reserved.
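The core idea, a regularization weight that adapts to the projection error, can be illustrated as below; the specific functional form and the update step are hypothetical placeholders for illustration, not the PEPR definition given in the paper:

```python
import numpy as np

def adaptive_lambda(v_measured, v_calculated, lam0=1.0):
    """Hypothetical projection-error-driven regularization weight (illustrative only)."""
    proj_err = np.linalg.norm(v_measured - v_calculated)
    return lam0 * (proj_err / np.linalg.norm(v_measured)) ** 2

def regularized_update(J, dv, lam):
    """One Levenberg-style update: d_rho = (J^T J + lam I)^{-1} J^T dv."""
    H = J.T @ J
    return np.linalg.solve(H + lam * np.eye(H.shape[0]), J.T @ dv)
```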
Abstract:
Recently, it has been shown that fusing the estimates of a set of sparse recovery algorithms results in an estimate better than the best estimate in the set, especially when the number of measurements is very limited. Though these schemes provide better sparse signal recovery performance, their higher computational requirement makes them less attractive for low latency applications. To alleviate this drawback, in this paper we develop a progressive fusion based scheme for low latency applications in compressed sensing. In progressive fusion, the estimates of the participating algorithms are fused progressively according to the availability of the estimates. The availability of an estimate depends on the computational complexity of the participating algorithm and, in turn, on its latency. Unlike other fusion algorithms, the proposed progressive fusion algorithm provides quick interim results and successive refinements during the fusion process, which is highly desirable in low latency applications. We analyse the developed scheme by providing sufficient conditions for improvement of the CS reconstruction quality, and show its practical efficacy through numerical experiments using synthetic and real-world data. (C) 2013 Elsevier B.V. All rights reserved.
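One common way to fuse sparse estimates is a least-squares fit over the union of their supports; the sketch below applies that idea progressively, re-fusing each time a new estimate arrives. It is a generic illustration under those assumptions, not the paper's exact algorithm or its sufficient conditions:

```python
import numpy as np

def fuse(A, y, estimates, k):
    """Least-squares fit on the union of supports, pruned to the k largest entries."""
    support = np.unique(np.concatenate([np.flatnonzero(x) for x in estimates]))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    fused = np.zeros(A.shape[1])
    fused[support] = coef
    keep = np.argsort(np.abs(fused))[-k:]
    pruned = np.zeros_like(fused)
    pruned[keep] = fused[keep]
    return pruned

def progressive_fusion(A, y, estimate_stream, k):
    """Fuse estimates one by one as they arrive, yielding an interim result each time."""
    received = []
    for x in estimate_stream:          # faster algorithms deliver their estimates first
        received.append(x)
        yield fuse(A, y, received, k)
```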
Abstract:
Spatial resolution in photoacoustic and thermoacoustic tomography is limited by the ultrasound transducer (detector) bandwidth. For a circular scanning geometry, the axial (radial) resolution is not affected by the detector aperture, but the tangential (lateral) resolution is highly dependent on the aperture size and is also spatially varying (depending on the location relative to the scanning center). Several approaches have been reported to counter this problem, by physically attaching a negative acoustic lens in front of the nonfocused transducer or by using virtual point detectors. Here, we have implemented a modified delay-and-sum reconstruction method that takes into account the large aperture of the detector, leading to a more than fivefold improvement in the tangential resolution in photoacoustic (and thermoacoustic) tomography. Three different types of numerical phantoms were used to validate our reconstruction method. It is also shown that the shape of the reconstructed objects is preserved with the modified algorithm. (C) 2014 Optical Society of America
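A minimal 2D delay-and-sum sketch in which the large detector aperture is approximated by point-like sub-elements, each back-projecting with its own delay; the circular-scan geometry, sub-element count, and sound speed are illustrative assumptions and do not reproduce the paper's exact weighting:

```python
import numpy as np

def aperture_aware_das(signals, t, det_centers, det_len, grid_x, grid_y,
                       c=1500.0, n_sub=8):
    """Delay-and-sum with each flat detector split into n_sub point-like sub-elements.

    signals: (n_det, n_t) pressure traces; det_centers: (n_det, 2) positions on a
    circular scan around the origin; det_len: detector element length (meters).
    """
    img = np.zeros((len(grid_y), len(grid_x)))
    dt = t[1] - t[0]
    offsets = (np.arange(n_sub) - (n_sub - 1) / 2) * det_len / n_sub
    for sig, (xd, yd) in zip(signals, det_centers):
        phi = np.arctan2(yd, xd) + np.pi / 2          # tangential direction of the aperture
        xs, ys = xd + offsets * np.cos(phi), yd + offsets * np.sin(phi)
        for xe, ye in zip(xs, ys):
            dist = np.sqrt((grid_x[None, :] - xe) ** 2 + (grid_y[:, None] - ye) ** 2)
            idx = np.clip(np.round(dist / (c * dt)).astype(int), 0, len(t) - 1)
            img += sig[idx]                           # delay each sub-element, then sum
    return img / n_sub
```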