921 results for anderson localization


Relevance:

20.00%

Publisher:

Abstract:

The removal of non-coding sequences, introns, is an essential part of messenger RNA processing. In most metazoan organisms, the U12-type spliceosome processes a subset of introns containing highly conserved recognition sequences. U12-type introns constitute less than 0.5% of all introns and reside preferentially in genes related to information-processing functions, as opposed to genes encoding metabolic enzymes. It has previously been shown that the excision of U12-type introns is inefficient compared to that of U2-type introns, supporting the model that these introns could provide a rate-limiting control for gene expression. The low efficiency of U12-type splicing is believed to have important consequences for gene expression by limiting the production of mature mRNAs from genes containing U12-type introns. The inefficiency of U12-type splicing has been attributed to the low abundance of the components of the U12-type spliceosome in cells, but this hypothesis has not been proven. The aim of the first part of this work was to study the effect of the abundance of the spliceosomal snRNA components on splicing. Cells with a low abundance of the U12-type spliceosome were found to process U12-type introns encoded by a transfected construct inefficiently, but the expression levels of endogenous genes were not found to be affected by the abundance of the U12-type spliceosome. However, significant levels of endogenous unspliced U12-type intron-containing pre-mRNAs were detected in cells. Together, these results support the idea that U12-type splicing may limit gene expression in some situations. The inefficiency of U12-type splicing has also promoted the idea that the U12-type spliceosome may control gene expression by limiting the mRNA levels of some U12-type intron-containing genes. While the identities of the primary target genes that contain U12-type introns are relatively well known, little has previously been known about the downstream genes and pathways potentially affected by the efficiency of U12-type intron processing. Here, the effects of U12-type splicing efficiency on a whole organism were studied in a Drosophila line with a mutation in an essential U12-type spliceosome component. Genes containing U12-type introns showed variable gene-specific responses to the splicing defect, which points to variation in the susceptibility of different genes to changes in splicing efficiency. Surprisingly, microarray screening revealed that metabolic genes were enriched among the downstream effects and that the phenotype could largely be attributed to one U12-type intron-containing mitochondrial gene. Gene expression control by the U12-type spliceosome could thus have widespread effects on metabolic functions in the organism. The subcellular localization of the U12-type spliceosome components was studied in response to a recent dispute over the localization of the U12-type spliceosome. All components studied were found to be nuclear, indicating that the processing of U12-type introns occurs within the nucleus, thus clarifying a question central to the field. The results suggest that the U12-type spliceosome can limit the expression of genes that contain U12-type introns in a gene-specific manner. Through its limiting role in pre-mRNA processing, U12-type splicing activity can affect specific genetic pathways, which in the case of Drosophila are involved in metabolic functions.

Relevance:

20.00%

Publisher:

Abstract:

Viruses are submicroscopic infectious agents that are obligate intracellular parasites. They adopt various strategies for their parasitic replication and proliferation in infected cells. The nucleic acid genome of a virus contains information that redirects the molecular machinery of the cell to the replication and production of new virions. Viruses that replicate in the cytoplasm and are unable to use the nuclear transcription machinery of the host cell have developed their own transcription and capping systems. This thesis describes the replication strategies of two distantly related viruses, hepatitis E virus (HEV) and Semliki Forest virus (SFV), which belong to the alphavirus-like superfamily of positive-strand RNA viruses. We have demonstrated that HEV and SFV share a unique cap-formation pathway specific to the alphavirus-like superfamily. The capping enzyme first acts as a methyltransferase, catalyzing the transfer of a methyl group from S-adenosylmethionine to GTP to yield m7GTP. It then transfers the methylated guanosine to the end of the viral mRNA. Both reactions are virus-specific and differ from those described for the host cell. Therefore, these capping reactions offer attractive targets for the development of antiviral drugs. Additionally, it has been shown that replication of SFV and HEV takes place in association with cellular membranes. The origin of these membranes and the intracellular localization of the components of the replication complex were studied using modern microscopy techniques. It was demonstrated that SFV replicates in cytoplasmic membranes derived from endosomes and lysosomes. According to our studies, the site of HEV replication appears to be the intermediate compartment, which mediates traffic between the endoplasmic reticulum and the Golgi complex. As a result of this work, a unique mechanism of cap formation for the hepatitis E virus replicase has been characterized. It represents a novel target for the development of specific inhibitors of viral replication.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes an approach based on Zernike moments and Delaunay triangulation for the localization of handwritten text in machine-printed documents. The Zernike moments of the image are first evaluated, and the text is classified as handwritten using a nearest-neighbor classifier. These features are independent of size, slant, orientation, translation and other variations in handwritten text. We then use Delaunay triangulation to reclassify the misclassified text regions: a Delaunay triangulation is imposed on the centroids of the connected components, features are extracted from the triangles, and the text is reclassified. Noise components are removed from the document as part of the preprocessing step, so the method works well on noisy documents. The success rate of the method is found to be 86%. For specific handwritten elements such as signatures or similar text, the accuracy is even higher, at 93%.
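
A minimal sketch of the two-stage pipeline described above, assuming Zernike moments per connected component (mahotas), a k-nearest-neighbour classifier (scikit-learn) and a Delaunay triangulation on component centroids (SciPy); the radius, moment degree and triangle statistics are illustrative choices, not values from the paper.

```python
# Hedged sketch of the two-stage classification described above; parameter
# values and the triangle-based feature are illustrative assumptions.
import numpy as np
import mahotas
from scipy.ndimage import label, find_objects, center_of_mass
from scipy.spatial import Delaunay
from sklearn.neighbors import KNeighborsClassifier

def zernike_features(binary_page, radius=21, degree=8):
    """Zernike moments and centroid of every connected component."""
    lab, n = label(binary_page)
    feats = []
    for i, sl in enumerate(find_objects(lab), start=1):
        comp = (lab[sl] == i).astype(np.uint8)
        feats.append(mahotas.features.zernike_moments(comp, radius, degree=degree))
    centroids = np.array(center_of_mass(binary_page, lab, index=range(1, n + 1)))
    return np.array(feats), centroids

def triangle_stats(centroids):
    """Mean edge length of the Delaunay triangles incident on each centroid."""
    tri = Delaunay(centroids)
    sums = np.zeros(len(centroids))
    counts = np.zeros(len(centroids))
    for a, b, c in tri.simplices:
        pts = centroids[[a, b, c]]
        edges = np.linalg.norm(pts - np.roll(pts, 1, axis=0), axis=1)
        for v in (a, b, c):
            sums[v] += edges.mean()
            counts[v] += 1
    return (sums / np.maximum(counts, 1)).reshape(-1, 1)

def classify_components(train_feats, train_labels, page):
    """Stage 1: Zernike + k-NN. Stage 2: augment with triangle statistics."""
    feats, cents = zernike_features(page)
    knn = KNeighborsClassifier(n_neighbors=3).fit(train_feats, train_labels)
    first_pass = knn.predict(feats)
    refined_input = np.hstack([feats, triangle_stats(cents)])
    return first_pass, refined_input  # refined_input feeds the reclassification step
```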

Relevance:

20.00%

Publisher:

Abstract:

Binaural experiments are described which indicate that the ability of the brain to localize a desired sound and to suppress undesired sounds coming from other directions can be traced in part to the different times of arrival of a sound at the two ears. It is suggested that the brain inserts a time delay in one of the two nerve paths associated with the ears so as to be able to compare, and thus concentrate on, those sounds arriving at the ears with this particular time-of-arrival difference. The ability to perceive weak sounds binaurally in the presence of noise is shown to be a simple function of the directions of the desired sound and of the noise. An explanation is given for the effect, reported by Koenig, that front-rear confusion is avoided by head movements.
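
As a hedged illustration of the delay-and-compare mechanism suggested above (not code from the paper), the inserted internal delay can be modelled as the lag that maximizes the correlation between the two ear signals:

```python
# Illustrative model only: estimate the interaural time difference as the
# internal delay that best aligns the left- and right-ear signals.
import numpy as np

def interaural_delay(left, right, fs, max_delay_s=1e-3):
    """Return the delay (s) maximizing sum_n left[n] * right[n + lag]."""
    n = min(len(left), len(right))
    left, right = left[:n], right[:n]
    max_lag = int(max_delay_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = [np.dot(left[max(0, -l):n - max(0, l)],
                   right[max(0, l):n - max(0, -l)]) for l in lags]
    best = lags[int(np.argmax(corr))]
    return best / fs  # positive value: the sound reached the left ear first
```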

Relevance:

20.00%

Publisher:

Abstract:

We propose two texture-based approaches, one using Gabor filters and the other log-polar wavelets, for separating text from non-text elements in a document image. Both algorithms compute local energy at information-rich points, which are marked by the Harris corner detector. The advantage of this approach is that the local energy is calculated only at the selected points rather than throughout the image, which saves considerable computation time. The algorithms have been tested on a large set of scanned text pages, and the results are better than those of existing algorithms. Of the two proposed schemes, the Gabor-filter-based scheme marginally outperforms the wavelet-based one.
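
A minimal sketch of the Gabor-filter variant, assuming scikit-image for the Harris detector and Gabor kernels and SciPy for the patch convolution; the filter bank (frequencies, orientations) and the window size are illustrative settings, not the paper's.

```python
# Hedged sketch: Gabor local energy evaluated only at Harris corner points.
import numpy as np
from scipy.signal import fftconvolve
from skimage.feature import corner_harris, corner_peaks
from skimage.filters import gabor_kernel

def gabor_energy_at_corners(gray, frequencies=(0.1, 0.2), n_theta=4, win=16):
    gray = gray.astype(float)
    corners = corner_peaks(corner_harris(gray), min_distance=5)
    kernels = [gabor_kernel(f, theta=t)
               for f in frequencies
               for t in np.linspace(0, np.pi, n_theta, endpoint=False)]
    feats = []
    for r, c in corners:
        # Only a small window around each corner point is filtered.
        patch = gray[max(0, r - win):r + win + 1, max(0, c - win):c + win + 1]
        energy = 0.0
        for k in kernels:
            resp = fftconvolve(patch, k, mode='same')
            energy += float(np.mean(np.abs(resp) ** 2))
        feats.append(energy)
    return corners, np.array(feats)  # features for a text / non-text decision
```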

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes and compares four methods of binarizing text images captured with a camera mounted on a cell phone. The advantages and disadvantages of each method over the others (image clarity and computational complexity) are demonstrated through the binarized results. The images are of VGA or lower resolution.
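
The abstract does not name the four methods; as a hedged, generic example of the kind of binarization involved, here is a Sauvola-style local threshold, which copes with the uneven illumination typical of camera-captured text (window size, k and R are illustrative parameters, not the paper's).

```python
# Generic local-threshold binarization sketch (not one of the paper's four methods).
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_binarize(gray, window=25, k=0.2, R=128.0):
    gray = gray.astype(float)
    mean = uniform_filter(gray, window)
    mean_sq = uniform_filter(gray ** 2, window)
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    threshold = mean * (1.0 + k * (std / R - 1.0))
    # Pixels above the local threshold become white background, the rest black text.
    return (gray > threshold).astype(np.uint8) * 255
```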

Relevance:

20.00%

Publisher:

Abstract:

We calculate the binding energy of a hole pair within the extended Anderson Hamiltonian for the high-Tc cuprates, including a Cu impurity and an oxygen-derived band. The results indicate that stable hole pairs can be formed for intra-atomic and interatomic Coulomb repulsion strengths larger than 6 and 3.5 eV, respectively. It is also shown that the total hybridization strength between the Cu 3d level and the oxygen p band should be less than 2.5 eV. The hole pairing takes place primarily within the oxygen-derived p band. The range of parameter values for which hole pairing occurs is also consistent with earlier photoemission results on these cuprates.
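
For orientation, a generic form of the kind of extended (impurity) Anderson Hamiltonian referred to above, with an intra-atomic repulsion U on the Cu site, an interatomic Cu-O repulsion U_pd and a d-p hybridization; the notation is assumed here, not taken from the paper.

```latex
H = \sum_{k\sigma} \varepsilon_k\, p^{\dagger}_{k\sigma} p_{k\sigma}
  + \varepsilon_d \sum_{\sigma} d^{\dagger}_{\sigma} d_{\sigma}
  + U\, n_{d\uparrow} n_{d\downarrow}
  + U_{pd}\, n_d\, n_p
  + \sum_{k\sigma} \bigl( V_k\, d^{\dagger}_{\sigma} p_{k\sigma} + \mathrm{h.c.} \bigr)
```

In this notation U and U_pd play the roles of the intra-atomic and interatomic repulsions whose thresholds (6 and 3.5 eV) are quoted above, and the V_k set the total Cu 3d - oxygen p hybridization strength.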

Relevance:

20.00%

Publisher:

Abstract:

A damage detection and imaging methodology has been developed, based on the symmetry of neighboring sensor paths and the similarity of signal patterns with respect to radial paths in a circular array of sensors. It uses information on Lamb wave propagation along with a triangulation scheme to rapidly locate and quantify the severity of damage without using all of the sensor data. In a plate-like structure, such a scheme can be employed effectively alongside full-field imaging of the wave-scattering pattern from the damage, if present in the plate. The new scheme is validated experimentally: hole- and corrosion-type damage has been detected and quantified successfully using the proposed scheme. A wavelet-based cumulative damage index has been studied, which shows monotonic sensitivity to the severity of the damage, as is most desirable in a structural health monitoring system. (C) 2010 Elsevier Ltd. All rights reserved.
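
A hedged sketch (not the authors' exact formulation) of one way a wavelet-based cumulative damage index with the stated monotonic behaviour could be computed: baseline and current sensor signals are compared in the continuous wavelet transform domain (PyWavelets; the wavelet and scales are assumptions).

```python
# Illustrative cumulative damage index: energy of the baseline-vs-current
# difference in the CWT domain, accumulated over scales and time.
import numpy as np
import pywt

def cumulative_damage_index(baseline, current, scales=np.arange(1, 64), wavelet='morl'):
    cb, _ = pywt.cwt(baseline, scales, wavelet)
    cc, _ = pywt.cwt(current, scales, wavelet)
    diff_energy = np.cumsum(np.sum((np.abs(cc) - np.abs(cb)) ** 2, axis=0))
    ref_energy = np.sum(np.abs(cb) ** 2)
    # Non-decreasing by construction; larger values indicate a stronger
    # deviation from the baseline signal.
    return diff_energy / ref_energy
```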

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to evaluate and test methods that could improve the local estimates of a general model fitted to a large area. In the first three studies, the intention was to divide the study area into sub-areas that were as homogeneous as possible with respect to the residuals of the general model; in the fourth study, the localization was based on the local neighbourhood. Under spatial autocorrelation (SA), points closer together in space are more likely to be similar than points farther apart. Local indicators of SA (LISAs) test the similarity of data clusters. A LISA was calculated for every observation in the dataset, and together with the spatial position and the residual of the global model, the data were segmented using two different methods: classification and regression trees (CART) and the multiresolution segmentation algorithm (MS) of the eCognition software. The general model was then re-fitted (localized) to the resulting sub-areas. In kriging, the SA is modelled with a variogram, and the spatial correlation is a function of the distance (and direction) between the observation and the point of calculation. The general trend is corrected with the residual information of the neighbourhood, whose size is controlled by the number of nearest neighbours; nearness is measured as Euclidean distance. With all methods, the root mean square errors (RMSEs) were lower than with the general model, but with the methods that segmented the study area, the spread of the individual localized RMSEs was wide. Therefore, an element capable of controlling the division or localization should be included in the segmentation-localization process. Kriging, on the other hand, provided stable estimates when the number of neighbours was sufficient (over 30), thus offering the best potential for further studies. CART could also be combined with kriging or with non-parametric methods such as most similar neighbours (MSN).
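
A simplified, hedged sketch of the neighbourhood-based correction described above: a global model's prediction is adjusted with a distance-weighted mean of the residuals of its k nearest observations. Inverse-distance weights stand in for proper kriging weights (which would come from a fitted variogram), and k=30 only loosely follows the "over 30 neighbours" remark; all names are illustrative.

```python
# Hedged stand-in for residual kriging: inverse-distance weighting of the
# k nearest residuals around each prediction point.
import numpy as np
from scipy.spatial import cKDTree

def localized_prediction(global_pred, xy_new, xy_obs, residuals_obs, k=30, eps=1e-9):
    """global_pred: predictions of the general model at xy_new (n,);
       xy_obs, residuals_obs: observation coordinates and model residuals."""
    tree = cKDTree(xy_obs)
    dist, idx = tree.query(xy_new, k=k)          # Euclidean nearest neighbours
    w = 1.0 / (dist + eps)                       # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)
    correction = np.sum(w * residuals_obs[idx], axis=1)
    return global_pred + correction              # locally corrected estimate
```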

Relevance:

20.00%

Publisher:

Abstract:

We report results from a first-principles calculation of spatially dependent correlation functions around a magnetic impurity in metals, described by the nondegenerate Anderson model. Our computations are based on a combination of perturbative scaling theory and numerical renormalization group methods. Results for the conduction electron charge density around the impurity and for correlation functions involving the conduction electron and impurity charge and spin densities will be presented. The behavior in various regimes, including the mixed-valent regime, will be explored.
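
For reference, the standard textbook form of the nondegenerate (single-impurity) Anderson model mentioned above, in conventional notation (not reproduced from the paper):

```latex
H = \sum_{k\sigma} \varepsilon_k\, c^{\dagger}_{k\sigma} c_{k\sigma}
  + \varepsilon_d \sum_{\sigma} n_{d\sigma}
  + U\, n_{d\uparrow} n_{d\downarrow}
  + \sum_{k\sigma} \bigl( V_k\, c^{\dagger}_{k\sigma} d_{\sigma} + \mathrm{h.c.} \bigr)
```

with n_{dσ} = d†_σ d_σ; the correlation functions discussed above involve these impurity operators d_σ together with the conduction-electron operators c_{kσ}.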

Relevance:

20.00%

Publisher:

Abstract:

We have investigated the influence of excess Fe on the electrical transport and magnetism of Fe1+yTe0.5Se0.5 (y = 0.04 and 0.09) single crystals. Both compositions exhibit resistively determined superconducting transitions (Tc) with an onset temperature of about 15 K. From the width of the superconducting transition and the magnitude of the lower critical field Hc1, it is inferred that excess Fe suppresses superconductivity. The linear and nonlinear responses of the ac susceptibility show that the superconducting state for these compositions is inhomogeneous. A possible origin of this phase separation is a magnetic coupling between excess Fe occupying interstitial sites in the chalcogen planes and the Fe atoms in the Fe square lattice. The temperature derivative of the resistivity, dρ/dT, in the temperature range Tc < T < Ta, with Ta being the temperature of a magnetic anomaly, changes from positive to negative with increasing Fe content. A log(1/T) divergence of the resistivity above Tc in the sample with the higher Fe content suggests disorder-driven electronic localization.

Relevance:

20.00%

Publisher:

Abstract:

A method of source localization in shallow water, based on the subspace concept, is described. It is shown that a vector representing the source in the image space spanned by the direction vectors of the source images is orthogonal to the noise eigenspace of the covariance matrix. Computer simulations show that a horizontal array of eight sensors can accurately localize one or more uncorrelated sources in shallow water dominated by multipath propagation.
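
A generic, hedged illustration of the subspace idea described above, in the style of a MUSIC pseudospectrum: candidate locations whose direction (steering) vectors are nearly orthogonal to the noise eigenspace produce sharp peaks. The multipath steering-vector model for the shallow-water source images is application-specific and is left as an input here; all names are illustrative.

```python
# Generic subspace (MUSIC-style) localization sketch, not the paper's exact method.
import numpy as np

def subspace_pseudospectrum(snapshots, steering, n_sources):
    """snapshots: (n_sensors, n_snapshots) complex array of array data;
       steering: dict mapping candidate location -> (n_sensors,) direction vector."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # ascending eigenvalues
    En = eigvecs[:, :-n_sources]                              # noise eigenspace
    spectrum = {}
    for loc, a in steering.items():
        a = a / np.linalg.norm(a)
        denom = np.linalg.norm(En.conj().T @ a) ** 2          # projection onto noise space
        spectrum[loc] = 1.0 / max(denom, 1e-12)               # peaks at source locations
    return spectrum
```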