140 results for Radiotherapy, Image-Guided
Abstract:
tRNA isolated from Image Image, grown in the presence of radioactive sulfur, was analyzed for the occurrence of thionucleotides. The analysis revealed the presence of at least five thionucleotides, of which three were identified as 4-thiouridylic acid, 5-methylaminomethyl-2-thiouridylic acid and 2-thiocytidylic acid. Iodine oxidation affected the acceptor ability of several amino-acid-specific tRNAs, those for lysine and serine being affected most. The tRNA of Image Image differs from that of Image. Image both in the number and the relative proportion of thionucleotides.
Abstract:
Annulation of aromatic rings on the folded Image,Image,Image-triquinane backbone has led to the design of potential host systems Image and Image, whose crystal structures have been determined.
Abstract:
Denoising of images in the compressed wavelet domain has potential applications in transmission technologies such as mobile communication. In this paper, we present a new image denoising scheme based on restoration of the bit-planes of wavelet coefficients in the compressed domain. It exploits the fundamental property of the wavelet transform: its ability to analyze the image at different resolution levels and the edge information associated with each band. The proposed scheme relies on the fact that noise commonly manifests itself as a fine-grained structure in the image, while the wavelet transform allows the restoration strategy to adapt itself to the directional features of edges. The proposed approach shows promising results in terms of error reduction when compared with the conventional unrestored scheme, and it can adapt to situations where the noise level in the image varies. The approach has implications for the restoration of images corrupted by noisy channels. In addition to being very flexible, the scheme retains all the features of the image, including edges, and is computationally efficient.
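The abstract does not give the restoration rule itself, so the sketch below only illustrates the underlying idea (quantizing wavelet detail coefficients and clearing their least-significant bit-planes) using PyWavelets; the Haar wavelet, quantization step and number of dropped planes are illustrative assumptions, not the authors' choices.

```python
import numpy as np
import pywt

def drop_low_bitplanes(band, step=4.0, drop_planes=2):
    """Quantize a detail band, clear its lowest bit-planes, and dequantize."""
    sign = np.sign(band)
    q = np.floor(np.abs(band) / step).astype(np.int64)   # integer magnitude bit-planes
    q &= ~((1 << drop_planes) - 1)                        # zero the fine-grained planes
    return sign * q * step

def bitplane_denoise(image, wavelet="haar", level=2, step=4.0, drop_planes=2):
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    out = [coeffs[0]]                                     # leave the approximation band untouched
    for detail in coeffs[1:]:
        out.append(tuple(drop_low_bitplanes(b, step, drop_planes) for b in detail))
    return pywt.waverec2(out, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.kron(np.eye(8), np.ones((16, 16))) * 255.0   # 128x128 toy block image
    noisy = clean + rng.normal(0, 15, clean.shape)
    denoised = bitplane_denoise(noisy)
    print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
    print("denoised MSE:", np.mean((denoised - clean) ** 2))
```

Because small detail coefficients quantize to zero once their low bit-planes are cleared, fine-grained noise is suppressed while large edge coefficients survive (coarsely quantized), which is the trade-off the compressed-domain setting exploits.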
Abstract:
In positron emission tomography (PET), image reconstruction is a demanding problem. Since PET image reconstruction is an ill-posed inverse problem, new methodologies need to be developed. Although previous studies show that incorporating spatial and median priors improves image quality, artifacts such as over-smoothing and streaking remain evident in the reconstructed image. In this work, we use a simple yet powerful technique to tackle the PET image reconstruction problem. The proposed technique integrates a Bayesian approach with a finite impulse response (FIR) filter. An FIR filter is designed whose coefficients are determined from a surface diffusion model. The resulting reconstructed image is iteratively filtered and fed back to obtain the new estimate. Experiments are performed on a simulated PET system. The results show that the proposed approach outperforms the recently proposed MRP algorithm in terms of image quality and normalized mean square error.
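As a rough illustration of the filter-and-feed-back structure described here (not the authors' reconstruction code), the sketch below runs a standard MLEM update on a toy simulated system and smooths each intermediate estimate with a fixed FIR kernel; the system matrix, kernel and iteration count are arbitrary assumptions standing in for the surface-diffusion-derived filter.

```python
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(1)
side = 16
npix, ndet = side * side, 400
A = rng.random((ndet, npix)) * (rng.random((ndet, npix)) < 0.05)  # sparse toy system matrix
x_true = np.zeros((side, side)); x_true[5:11, 5:11] = 10.0        # toy emission phantom
y = rng.poisson(A @ x_true.ravel())                               # simulated Poisson counts

fir = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
fir /= fir.sum()                                                  # stand-in FIR smoothing kernel

x = np.ones(npix)                                                 # uniform initial estimate
sens = A.sum(axis=0) + 1e-12                                      # sensitivity of each pixel
for _ in range(50):
    ratio = y / (A @ x + 1e-12)
    x *= (A.T @ ratio) / sens                                     # multiplicative MLEM update
    x = convolve(x.reshape(side, side), fir, mode="nearest").ravel()  # filter and feed back
print("relative error:", np.linalg.norm(x - x_true.ravel()) / np.linalg.norm(x_true))
```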
Abstract:
Digital image forgeries are usually created by copy-pasting a portion of one image onto another image. While doing so, it is often necessary to resize the pasted portion of the image to suit the sampling grid of the host image. The resampling operation changes certain characteristics of the pasted portion, which, when detected, serve as a clue to tampering. In this paper, we present deterministic techniques to detect resampling and to localize the portion of the image that has been tampered with. Two of the techniques operate in the pixel domain and the other two in the frequency domain. We study the efficacy of our techniques against JPEG compression and subsequent resampling of the entire tampered image.
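For context, a minimal frequency-domain detector in the same spirit (not one of the four techniques from the paper): linear interpolation leaves a periodic pattern in the second differences of a resampled region, which shows up as an isolated peak in their spectrum.

```python
import numpy as np

rng = np.random.default_rng(2)
original = rng.random((128, 128))

# Upsample rows by a factor of 1.5 with linear interpolation (the "pasted" region).
src = np.arange(128)
dst = np.linspace(0, 127, 192)
resampled = np.array([np.interp(dst, src, row) for row in original])

def periodicity_spectrum(img):
    d2 = np.diff(img, n=2, axis=1)                 # second difference along rows
    signal = np.mean(np.abs(d2), axis=0)           # average magnitude per column
    return np.abs(np.fft.rfft(signal - signal.mean()))

for name, img in [("original", original), ("resampled", resampled)]:
    spec = periodicity_spectrum(img)
    # A strong isolated peak relative to the median hints at resampling.
    print(name, "peak-to-median ratio:", spec.max() / (np.median(spec) + 1e-12))
```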
Abstract:
In this paper, we present a growing and pruning radial basis function based no-reference (NR) image quality model for JPEG-coded images. The quality of the images is estimated without referring to their original images. The features for predicting perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality estimation involves computing the functional relationship between HVS features and subjective test scores. Here, the problem of quality estimation is transformed into a function approximation problem and solved using a GAP-RBF network, which uses a sequential learning algorithm to approximate the functional relationship. The computational complexity and memory requirement of the GAP-RBF algorithm are lower than those of batch learning algorithms. In addition, the GAP-RBF algorithm finds a compact image quality model and does not require retraining when new image samples are presented. Experimental results show that the GAP-RBF image quality model emulates the mean opinion score (MOS). The subjective test results of the proposed metric are compared with a JPEG no-reference image quality index as well as the full-reference structural similarity image quality index, and it is observed to outperform both.
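As a sketch of the feature-extraction side only, the snippet below computes rough stand-ins for the HVS-driven features (edge amplitude at 8x8 block boundaries, background activity, background luminance); the exact feature definitions and the GAP-RBF network that maps them to MOS are not reproduced here.

```python
import numpy as np

def jpeg_nr_features(img):
    """Crude stand-ins for blockiness, activity and luminance features."""
    img = img.astype(float)
    h_diff = np.abs(np.diff(img, axis=1))
    v_diff = np.abs(np.diff(img, axis=0))
    # Edge amplitude across 8x8 block boundaries (differences at columns/rows 7, 15, ...).
    block_edge = 0.5 * (h_diff[:, 7::8].mean() + v_diff[7::8, :].mean())
    # Background activity: mean gradient away from block boundaries.
    mask_h = np.ones(h_diff.shape[1], bool); mask_h[7::8] = False
    mask_v = np.ones(v_diff.shape[0], bool); mask_v[7::8] = False
    activity = 0.5 * (h_diff[:, mask_h].mean() + v_diff[mask_v, :].mean())
    luminance = img.mean()                      # background luminance proxy
    return np.array([block_edge, activity, luminance])

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    img = rng.integers(0, 256, (256, 256))
    print(jpeg_nr_features(img))
```

A learned regressor (here, the sequential-learning GAP-RBF network) would then map such feature vectors to subjective MOS values.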
Abstract:
Neural networks find application in many image denoising tasks because of inherent characteristics such as nonlinear mapping and self-adaptiveness. The design of filters largely depends on a priori knowledge about the type of noise; as a result, standard filters are application- and image-specific. Widely used filtering algorithms reduce noisy artifacts by smoothing, but this operation normally smooths the edges as well. On the other hand, sharpening filters enhance high-frequency details, making the image non-smooth. In this study, we propose an integrated, general approach to designing a finite impulse response filter based on a principal component neural network (PCNN) for image filtering, optimized in the sense of visual inspection and error metrics. The algorithm exploits the inter-pixel correlation by iteratively updating the filter coefficients using the PCNN, and performs optimal smoothing of the noisy image while preserving both high- and low-frequency features. Evaluation results show that the proposed filter is robust under various noise distributions. Furthermore, the number of unknown parameters is very small, and most of them are obtained adaptively from the processed image.
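The abstract does not specify the learning schedule, so the following is a minimal sketch of the idea: a single principal-component neuron trained with Oja's rule on image patches supplies the FIR coefficients, which are then normalized into a smoothing kernel; the learning rate, patch sampling and normalization are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import convolve2d

def learn_pc_filter(img, ksize=3, lr=1e-3, iters=10000, seed=0):
    """Learn FIR coefficients as the leading principal component of image patches."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=ksize * ksize)
    w /= np.linalg.norm(w)
    H, W = img.shape
    for _ in range(iters):
        i = rng.integers(0, H - ksize + 1)
        j = rng.integers(0, W - ksize + 1)
        x = img[i:i + ksize, j:j + ksize].ravel()   # a random patch
        y = w @ x
        w += lr * y * (x - y * w)                   # Oja's rule: w converges to the first PC
    return w.reshape(ksize, ksize)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # smooth toy image
    noisy = clean + rng.normal(0, 0.1, clean.shape)
    w = learn_pc_filter(noisy)
    kernel = np.abs(w) / np.abs(w).sum()                  # normalize into a smoothing FIR kernel
    filtered = convolve2d(noisy, kernel, mode="same", boundary="symm")
    print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
    print("filtered MSE:", np.mean((filtered - clean) ** 2))
```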
Abstract:
Denoising of medical images in the wavelet domain has potential applications in transmission technologies such as teleradiology. The technique becomes all the more attractive when we consider progressive transmission in a teleradiology system. The transmitted images are corrupted mainly by noisy channels. In this paper, we present a new real-time image denoising scheme based on limited restoration of the bit-planes of wavelet coefficients. The proposed scheme exploits the fundamental property of the wavelet transform: its ability to analyze the image at different resolution levels and the edge information associated with each sub-band. The desired bit-rate control is achieved by applying the restoration to a limited number of bit-planes, subject to optimal smoothing. The method adapts itself to the preference of the medical expert; a single parameter balances the preservation of (expert-dependent) relevant details against the degree of noise reduction. The scheme relies on the fact that noise commonly manifests itself as a fine-grained structure in the image, while the wavelet transform allows the restoration strategy to adapt itself to the directional features of edges. The proposed approach shows promising results in terms of error reduction when compared with the unrestored case. It can also adapt to situations where the noise level in the image varies and to the changing requirements of medical experts. The approach has implications for the restoration of medical images in teleradiology systems, and the scheme is computationally efficient.
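Only the single-knob trade-off is sketched below: a hypothetical mapping from an expert preference parameter to the number of low bit-planes processed in each sub-band; the actual mapping used in the paper is not specified in the abstract.

```python
def planes_to_restore(preference, level, max_planes=6):
    """Map an expert preference to a per-sub-band bit-plane budget (illustrative only).

    preference : float in [0, 1]; 0 = strongest smoothing, 1 = preserve all detail.
    level      : decomposition level of the sub-band (1 = finest).
    """
    budget = round((1.0 - preference) * max_planes)   # global bit-plane budget
    return max(0, budget - (level - 1))               # spend it on the finest sub-bands first

if __name__ == "__main__":
    for pref in (0.0, 0.5, 1.0):
        print(pref, [planes_to_restore(pref, lev) for lev in (1, 2, 3)])
```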
Abstract:
Image filtering techniques have potential applications in biomedical image processing, such as image restoration and image enhancement. The performance of traditional filters largely depends on a priori knowledge about the type of noise corrupting the image, which makes standard filters application-specific. For example, the well-known median filter and its variants can remove salt-and-pepper (or impulse) noise at low noise levels. Each of these methods has its own advantages and disadvantages. In this paper, we introduce a new finite impulse response (FIR) filter for image restoration in which the filter undergoes a learning procedure: the filter coefficients are adaptively updated based on correlated Hebbian learning. The algorithm exploits the inter-pixel correlation in the form of Hebbian learning and hence performs optimal smoothing of noisy images. Applying the proposed filter to images corrupted with Gaussian noise yields restorations that are better in quality than those produced by average and Wiener filters. The restored images are visually appealing and artifact-free.
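The exact Hebbian update is not given in the abstract; the sketch below is a rough stand-in that derives the FIR weights directly from the measured inter-pixel correlation at each lag (the statistic a Hebbian rule would track) rather than through the authors' learning procedure.

```python
import numpy as np
from scipy.signal import convolve2d

def correlation_fir(img, ksize=3):
    """Set FIR weights from the measured inter-pixel correlation at each lag."""
    z = img - img.mean()
    r = ksize // 2
    weights = np.empty((ksize, ksize))
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            shifted = np.roll(np.roll(z, dy, axis=0), dx, axis=1)
            weights[dy + r, dx + r] = np.mean(z * shifted)   # correlation at lag (dy, dx)
    weights = np.clip(weights, 0.0, None)                    # keep non-negative weights only
    return weights / weights.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    clean = np.tile(np.sin(np.linspace(0, 6, 64)), (64, 1))  # smooth toy image
    noisy = clean + rng.normal(0, 0.2, clean.shape)
    kernel = correlation_fir(noisy)
    restored = convolve2d(noisy, kernel, mode="same", boundary="symm")
    print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
    print("restored MSE:", np.mean((restored - clean) ** 2))
```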
Abstract:
The effect of neutralizing endogenous follicle-stimulating hormone (FSH) or luteinizing hormone (LH) with specific antisera on the in vivo and in vitro synthesis of estrogen in the ovary of the cycling hamster was studied. Neutralization of FSH or LH on proestrus resulted in a reduction in the estradiol concentration of the ovary on diestrus-2 and the next proestrus, suggesting an impairment in follicular development. Injection of FSH antiserum at 0900 h of diestrus-2 significantly reduced the ovarian estradiol concentration within 6–7 h. Further, these ovaries, on incubation with testosterone (T) in vitro at 1600 h of the same day or the next day, synthesized significantly lower amounts of estradiol compared to corresponding control ovaries. Although testosterone itself, in the absence of endogenous FSH, could stimulate estrogen synthesis to some extent, FSH had to be supplemented with T to restore estrogen synthesis to the level seen in control ovaries incubated with T. Lack of FSH thus appeared to affect the aromatization step in the estrogen biosynthetic pathway in the hamster ovary on diestrus-2. In contrast, FSH antiserum given on the morning of proestrus had no effect on the in vivo and in vitro synthesis of estrogen when examined 6–7 h later. The results suggest that there could be a difference in the need for FSH at different times of the cycle. Neutralization of LH either on diestrus-2 or proestrus resulted in a drastic reduction in the estradiol concentration of the ovary. This block was at the level of androgen synthesis, since supplementing testosterone alone in vitro could stimulate estrogen synthesis to a more or less similar extent as in the ovaries of control hamsters.
Abstract:
A generalized analysis, using the Vander Lugt operational notation, of the building-block optical system comprising a single holographic optical element (HOE) for achieving simultaneous display of the spectrum and the image of an object in a single plane has been carried out. The salient features of this analysis are: (1) it allows comprehensive characterization of the HOE, (2) it provides insight into the many possible configurations of the system, and (3) it explains the existing results in a consistent manner.
Abstract:
Background: Dengue virus, along with the other members of the family Flaviviridae, has re-emerged as a deadly human pathogen. Understanding the mechanistic details of these infections can be highly rewarding in developing effective antivirals. During maturation of the virus inside the host cell, the coat proteins E and M undergo conformational changes, altering the morphology of the viral coat. However, due to the low-resolution nature of the available 3-D structures of viral assemblies, the atomic details of these changes are still elusive. Results: In the present analysis, starting from the C-alpha positions of low-resolution cryo-electron microscopic structures, the residue-level details of the protein-protein interaction interfaces of the dengue virus coat proteins have been predicted. By comparing pre-existing structures of the virus in different phases of its life cycle, the changes taking place in these predicted protein-protein interaction interfaces were followed as a function of the maturation process. Besides changing the current notion that only homodimers are present in the mature viral coat, the analysis indicated the presence of a proline-rich motif at the protein-protein interaction interface of the coat protein. Investigating the conservation status of these seemingly functionally crucial residues across other members of the Flaviviridae family enabled dissecting common mechanisms used for infection by these viruses. Conclusions: Using a computational approach, the present analysis has provided better insights into pre-existing low-resolution structures of virus assemblies, and its findings can be used in designing effective antivirals against these deadly human pathogens.
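As a sketch of one ingredient of such an analysis (not the paper's pipeline), the snippet below flags putative interface residue pairs from C-alpha coordinates using a simple distance cutoff; the coordinates are synthetic and the 10 Å cutoff is an assumption. In practice the coordinates would come from the deposited cryo-EM models (for example, read with Bio.PDB).

```python
import numpy as np

def interface_residues(ca_a, ca_b, cutoff=10.0):
    """Return index pairs (i, j) whose C-alpha atoms lie within `cutoff` angstroms."""
    d = np.linalg.norm(ca_a[:, None, :] - ca_b[None, :, :], axis=-1)
    return np.argwhere(d < cutoff)

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    chain_e = rng.uniform(0, 60, (120, 3))     # toy C-alpha trace for one coat protein
    chain_m = rng.uniform(40, 100, (75, 3))    # toy C-alpha trace for its partner
    pairs = interface_residues(chain_e, chain_m)
    print(len(pairs), "putative contact pairs, e.g.:", pairs[:5].tolist())
```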
Abstract:
In rapid parallel magnetic resonance imaging, image reconstruction is a challenging problem. Here, a novel image reconstruction technique for data acquired along any general trajectory in a neural network framework, called "Composite Reconstruction And Unaliasing using Neural Networks" (CRAUNN), is proposed. CRAUNN is based on the observation that the nature of aliasing remains unchanged whether the undersampled acquisition contains only low frequencies or includes high frequencies as well. The transformation needed to reconstruct the alias-free image from the aliased coil images is learnt using acquisitions consisting of densely sampled low frequencies. Neural networks are used as machine learning tools to learn this transformation and thereby obtain the desired alias-free image for actual acquisitions containing sparsely sampled low as well as high frequencies. CRAUNN operates in the image domain, does not require explicit coil sensitivity estimation, and is independent of the sampling trajectory used, so it can be applied to arbitrary trajectories. As a pilot trial, the technique is first applied to Cartesian-trajectory-sampled data. Experiments performed using radial and spiral trajectories on real and synthetic data illustrate the performance of the method. The reconstruction errors depend on the acceleration factor as well as the sampling trajectory; higher acceleration factors can be obtained when radial trajectories are used. Comparisons against existing techniques show that CRAUNN performs on par with the state-of-the-art methods. Acceleration factors of up to 4, 6 and 4 are achieved in the Cartesian, radial and spiral cases, respectively.
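CRAUNN itself learns the unaliasing map with neural networks; the toy below only verifies, in 1-D, the observation the method builds on: regular Cartesian undersampling folds the image identically whether the data contain the full spectrum or only the densely sampled low frequencies, which is why a map learned on the low-frequency calibration transfers to the full acquisition. The signal, grid size and acceleration factor are arbitrary.

```python
import numpy as np

N, R = 128, 2                                         # grid size and acceleration factor
x = np.zeros(N); x[30:60] = 1.0; x[80:90] = 0.5       # toy 1-D "image"

def lowpass(img, keep=16):
    """Keep only the `keep` lowest positive and negative spatial frequencies."""
    k = np.fft.fft(img)
    mask = np.zeros(img.size, bool)
    mask[:keep] = mask[-keep:] = True
    return np.fft.ifft(k * mask).real

def alias(img):
    """Image obtained after regular k-space undersampling by R (scaled by R)."""
    k = np.fft.fft(img)
    k_us = np.zeros_like(k)
    k_us[::R] = k[::R]
    return np.fft.ifft(k_us).real * R

def fold(img):
    """Image-domain folding with period N // R, i.e. the aliasing operator itself."""
    return sum(np.roll(img, m * (N // R)) for m in range(R))

# The folding pattern is the same whether or not high frequencies are present.
print("full-band data:", np.allclose(alias(x), fold(x)))
print("low-freq data: ", np.allclose(alias(lowpass(x)), fold(lowpass(x))))
```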
Abstract:
In this paper we develop a multithreaded VLSI processor linear-array architecture to render complex environments using the radiosity approach. The processing elements are identical and multithreaded, and work in Single Program Multiple Data (SPMD) mode. A new algorithm for the radiosity computations, based on the progressive refinement approach [2], is proposed. Simulation results indicate that the architecture is latency tolerant and scalable. It is shown that a linear array of 128 uni-threaded processing elements sustains a throughput close to 0.4 million patches/sec.
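The progressive refinement approach cited as [2] is the standard shooting iteration; a compact sequential sketch of the computation the processing elements would share is given below, with random row-normalized form factors standing in for real scene geometry.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 64                                              # number of patches
F = rng.random((n, n)); np.fill_diagonal(F, 0.0)    # toy form factors (no real geometry)
F /= F.sum(axis=1, keepdims=True) * 1.25            # each row sums to 0.8: some light escapes
rho = rng.uniform(0.2, 0.8, n)                      # diffuse reflectivities
E = np.zeros(n); E[0] = 100.0                       # a single emitting patch

B = E.copy()                                        # current radiosity estimate
dB = E.copy()                                       # unshot radiosity
shots = 0
while dB.max() > 1e-3 and shots < 5000:
    i = int(np.argmax(dB))                          # shoot from the brightest unshot patch
    delta = rho * F[i] * dB[i]                      # energy gathered by every other patch
    B += delta
    dB += delta
    dB[i] = 0.0                                     # patch i has now shot all its energy
    shots += 1
print(f"{shots} shots, remaining unshot energy = {dB.sum():.4f}")
```

In the linear-array architecture, the per-shot work (the `delta` update across all patches) is what gets distributed over the multithreaded processing elements.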