942 results for Subpixel precision


Relevance:

10.00%

Publisher:

Abstract:

Motivation: The number of bacterial genomes being sequenced is increasing very rapidly; hence, it is crucial to have procedures for rapid and reliable annotation of their functional elements such as promoter regions, which control the expression of each gene or transcription unit of the genome. The present work addresses this requirement and presents a generic method applicable across organisms. Results: The relative stability of DNA double-helical sequences has been used to discriminate promoter regions from non-promoter regions. Based on the difference in stability between neighboring regions, an algorithm has been implemented to predict promoter regions on a large scale over 913 microbial genome sequences. The average free energy values for the promoter regions as well as their downstream regions are found to differ, depending on their GC content. Threshold values to identify promoter regions were derived using sequences flanking a subset of translation start sites from all microbial genomes and then used to predict promoters over the complete genome sequences. An average recall of 72% (the percentage of protein- and RNA-coding genes with predicted promoter regions assigned to them) and a precision of 56% are achieved over the 913-genome dataset.
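The stability-based discrimination described above can be sketched as a sliding-window free-energy difference. This is a minimal sketch: the dinucleotide free-energy values, window size and offset below are illustrative placeholders, not the parameters or thresholds used in the paper.

```python
# Score each position by the difference in average free energy between a
# window and its downstream neighbour; promoter-like regions are less
# stable (less negative energy) than the regions downstream of them.
DINUC_DG = {  # kcal/mol, illustrative values only
    "AA": -1.0, "AT": -0.9, "TA": -0.6, "TT": -1.0,
    "GC": -2.3, "CG": -2.1, "GG": -1.8, "CC": -1.8,
    "AC": -1.4, "CA": -1.2, "AG": -1.3, "GA": -1.3,
    "TC": -1.3, "CT": -1.3, "TG": -1.2, "GT": -1.4,
}

def window_energy(seq, start, size):
    """Average dinucleotide free energy over seq[start:start+size]."""
    dinucs = [seq[i:i + 2] for i in range(start, start + size - 1)]
    return sum(DINUC_DG[d] for d in dinucs) / len(dinucs)

def stability_scores(seq, size=50, offset=100):
    """D(n) = E1(n) - E2(n + offset): a large positive difference marks a
    window less stable than its downstream neighbour, i.e. promoter-like."""
    return [window_energy(seq, n, size) - window_energy(seq, n + offset, size)
            for n in range(len(seq) - size - offset)]
```

Positions whose score exceeds a (GC-content-dependent) threshold would then be reported as predicted promoter regions.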

According to the most prevalent view, there are 3-4 fixed "slots" in visual working memory for temporary storage. Recently this view has been challenged by a theory of dynamic resources, which are limited in their totality but can be freely allocated. The aim of this study is to clarify which of the two theories better describes performance in visual working memory tasks with contour shapes. The interest is therefore in both the number of recalled stimuli and the precision of the memory representations. Stimuli in the experiments were radial frequency patterns, constructed by sinusoidally modulating the radius of a circle. Five observers participated in the experiment, which consisted of two tasks. In the delayed discrimination task, the number of recalled stimuli was measured with a two-interval forced-choice task. The observer was shown two displays serially with a 1.5 s inter-stimulus interval (ISI). Displays contained 1-6 patterns and differed from each other by a changed amplitude in one pattern. The participant's task was to report whether the changed pattern had the higher amplitude in the first or in the second interval. The amount of amplitude change was controlled with the QUEST procedure, and the 75% discrimination threshold was measured. In the recall task, the precision of the memory representations was measured with a subjective adjustment method. First, the observer was shown 1-6 patterns; after a 1.5 s ISI, one location of the previously shown patterns was cued. The observer's task was to adjust the amplitude of a probe pattern to match the amplitude of the pattern in working memory. In the delayed discrimination task, the performance of all observers declined smoothly as the number of presented patterns was increased. This result supports the resource theory of working memory, as there was no sudden fall in performance.
The amplitude threshold for one item was 0.01-0.05, and as the number of items increased from 1 to 6 there was a 4-15-fold linear increase in the amplitude threshold (0.14-0.29). In the recall adjustment task, the precision of four observers' performance declined smoothly as the number of presented patterns was increased. This result also supports the resource theory. The standard deviation for one item was 0.03-0.05, and as the number of items increased from 1 to 6 there was a 2-3-fold linear increase in the standard deviation (0.06-0.11). These findings show that performance in a visual working memory task is described better by the theory of freely allocated resources than by the traditional slot model. In addition, the allocation of the resources depends on the properties of the individual observer and of the visual working memory task.
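The stimulus construction mentioned above (a circle whose radius is modulated sinusoidally) follows the standard radial frequency pattern definition. A minimal sketch; the parameter names and default values are illustrative, as the exact stimulus parameters are not given here:

```python
import numpy as np

def rf_contour(radius=1.0, freq=4, amplitude=0.1, phase=0.0, n=360):
    """Radial frequency (RF) pattern contour:
    r(theta) = r0 * (1 + A * sin(f * theta + phi)).
    Returns (x, y) coordinates of the deformed circle."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    r = radius * (1.0 + amplitude * np.sin(freq * theta + phase))
    return r * np.cos(theta), r * np.sin(theta)
```

Increasing `amplitude` is exactly the manipulation whose discrimination threshold and adjustment precision the two tasks measure; `amplitude=0` recovers an unmodulated circle.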

Low interlaminar strength and the consequent possibility of interlaminar failures in composite laminates demand an examination of interlaminar stresses and/or strains to ensure satisfactory performance. As a first approximation, these stresses can be obtained from thickness-wise integration of the ply equilibrium equations using in-plane stresses from the classical laminated plate theory. Implementation of this approach in finite element form requires evaluation of third- and fourth-order derivatives of the displacement functions in an element. Hence, a high-precision element developed by Jayachandrabose and Kirkhope (1985) is used here, and the required derivatives are obtained in two ways: (i) by direct differentiation of the element shape functions; and (ii) by applying a finite difference technique to the nodal strains and curvatures obtained from the finite element analysis. Numerical results obtained for a three-layered symmetric and a two-layered asymmetric laminate show that the second scheme is quite effective compared to the first, particularly for asymmetric laminates.
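Scheme (ii) amounts to numerically differentiating quantities already available at the nodes. A minimal one-dimensional sketch, assuming equally spaced nodes (the paper works on 2-D element meshes, not reproduced here):

```python
import numpy as np

def central_diff(values, h):
    """Central finite differences on equally spaced nodal values, with
    one-sided differences at the two ends -- the kind of post-processing
    applied to nodal strains/curvatures from the FE solution."""
    v = np.asarray(values, dtype=float)
    d = np.empty_like(v)
    d[1:-1] = (v[2:] - v[:-2]) / (2.0 * h)   # interior: O(h^2) accurate
    d[0] = (v[1] - v[0]) / h                  # forward difference at start
    d[-1] = (v[-1] - v[-2]) / h               # backward difference at end
    return d
```

Differencing smoothed nodal values avoids the loss of accuracy incurred by taking high-order derivatives of the shape functions directly, which is the advantage the numerical comparison highlights.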

Language software applications encounter new words, e.g., acronyms, technical terminology, names, or compounds of such words. In order to add new words to a lexicon, we need to indicate their inflectional paradigm. We present a new, generally applicable method for creating an entry generator, i.e. a paradigm guesser, for finite-state transducer lexicons. As a guesser tends to produce numerous suggestions, it is important that the correct suggestions be among the first few candidates. We prove some formal properties of the method and evaluate it on Finnish, English and Swedish full-scale transducer lexicons. We use the open-source Helsinki Finite-State Technology to create finite-state transducer lexicons from existing lexical resources and automatically derive guessers for unknown words. The method has a recall of 82-87% and a precision of 71-76% for the three test languages. The model needs no external corpus and can therefore serve as a baseline.
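The core idea of a paradigm guesser can be illustrated with a toy suffix-frequency model: rank paradigms for an unknown word by the longest word-form suffix seen in the lexicon. This is only a sketch of the general principle; the paper derives its guesser automatically from the transducer lexicon, and the lexicon entries and paradigm names below are invented.

```python
from collections import defaultdict

class ParadigmGuesser:
    def __init__(self, lexicon, max_suffix=5):
        # Index every word-form suffix (up to max_suffix chars) by the
        # paradigms it has been seen with, and how often.
        self.max_suffix = max_suffix
        self.index = defaultdict(lambda: defaultdict(int))
        for word, paradigm in lexicon:
            for k in range(1, min(max_suffix, len(word)) + 1):
                self.index[word[-k:]][paradigm] += 1

    def guess(self, word):
        """Rank paradigms so the correct one tends to be among the first
        few: longer matching suffixes (more specific evidence) win."""
        for k in range(min(self.max_suffix, len(word)), 0, -1):
            if word[-k:] in self.index:
                counts = self.index[word[-k:]]
                return sorted(counts, key=counts.get, reverse=True)
        return []

# Toy lexicon with invented paradigm labels.
lexicon = [("walked", "V-ED"), ("talked", "V-ED"), ("cats", "N-S"), ("dogs", "N-S")]
guesser = ParadigmGuesser(lexicon)
```

For example, `guesser.guess("blinked")` ranks the verb paradigm first because of the shared `-ked` suffix. Since the model is trained only on the lexicon itself, it needs no external corpus, which is what makes it usable as a baseline.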

Diffuse optical tomographic image reconstruction uses advanced numerical models that are too computationally costly to implement in real time. Graphics processing units (GPUs) offer desktop massive parallelization that can accelerate these computations. An open-source GPU-accelerated linear algebra library package is used to compute the most intensive matrix-matrix calculations and matrix decompositions involved in solving the system of linear equations. These open-source functions were integrated into existing frequency-domain diffuse optical image reconstruction algorithms to evaluate the acceleration capability of the GPUs (NVIDIA Tesla C1060) with increasing reconstruction problem sizes. These studies indicate that single-precision computations are sufficient for diffuse optical tomographic image reconstruction. The acceleration per iteration can be up to 40x compared to traditional CPUs in the case of three-dimensional reconstruction, where the reconstruction problem is more underdetermined, making GPUs attractive in clinical settings. The current limitation of these GPUs is the available onboard memory (4 GB), which restricts the reconstruction to no more than 13,377 optical parameters. (C) 2010 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3506216]
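The finding that single precision suffices can be checked in miniature: solve a small, regularized normal-equations system (a common structure in diffuse optical reconstruction) in float32 and float64 and compare. This sketch uses NumPy on the CPU purely to illustrate the precision question; the matrix sizes and regularization value are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((200, 50))     # stand-in for a sensitivity (Jacobian) matrix
lam = 1e-2                             # illustrative Tikhonov regularization
A = J.T @ J + lam * np.eye(50)         # regularized normal equations
b = J.T @ rng.standard_normal(200)

x64 = np.linalg.solve(A.astype(np.float64), b.astype(np.float64))
x32 = np.linalg.solve(A.astype(np.float32), b.astype(np.float32))

# Relative deviation of the single-precision solution from double precision.
rel_err = np.linalg.norm(x32.astype(np.float64) - x64) / np.linalg.norm(x64)
```

For a moderately conditioned system like this, `rel_err` is far below any physically meaningful level, and single precision is what made GPUs of that generation (whose float32 throughput greatly exceeded their float64 throughput) attractive.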

The constructional details of an 18-bit binary inductive voltage divider (IVD) for a.c. bridge applications are described. The simplified construction, with fewer windings and interconnection of the windings through SPDT solid-state relays instead of DPDT relays, improves the reliability of the IVD. High accuracy for most precision measurements is achieved without D/A converters. Checks for self-consistency in voltage division show that the error is less than 2 counts in 2^18.
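The quoted self-consistency figure translates directly into a relative error. A quick arithmetic check:

```python
# An 18-bit binary divider has 2^18 discrete ratio steps; an error below
# 2 counts therefore corresponds to a relative division error below 2/2^18.
steps = 2 ** 18                 # 262144 discrete ratios
resolution = 1 / steps          # smallest ratio step, ~3.8 ppm
max_error = 2 / steps           # < 2 counts, i.e. ~7.6 ppm of full scale
```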

Woolley's revolutionary proposal that quantum mechanics does not sanction the concept of "molecular structure" - which is but a "metaphor" - has fundamental implications for physical organic chemistry. On the one hand, the Uncertainty Principle limits the precision with which transition state structures may be defined; on the other, extension of the structure concept to the transition state may be unviable. Attempts to define transition states have indeed caused controversy. Consequences for molecular recognition, and a mechanistic classification, are also discussed.

One of the main disturbances in EEG signals is EMG artefacts generated by muscle movements. In this paper, the use of a linear-phase FIR digital low-pass filter with finite-wordlength-precision coefficients, designed using the compensation procedure, is proposed to minimise EMG artefacts in contaminated EEG signals. To make the filtering more effective, different structures are used, i.e. cascading, twicing and sharpening (apart from simple low-pass filtering) of the designed FIR filter. Modifications are proposed to the twicing and sharpening structures to regain the linear-phase characteristics that are lost in conventional twicing and sharpening operations. The efficacy of all these transformed filters in minimising EMG artefacts is studied, using SNR improvement as a performance measure for simulated signals. Time plots of the signals are also compared. The studies show that the modified sharpening structure is superior in performance to all the other proposed methods. These algorithms have also been applied to a real recorded EMG-contaminated EEG signal. Comparison of time plots, and also of the output SNR, shows that the proposed modified sharpened structure works better in minimising EMG artefacts than the other methods considered.
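The sharpening structure can be illustrated in miniature. The sketch below implements the classical sharpening polynomial Hs = H^2 (3 - 2H) via convolutions; it does not reproduce the paper's modified (linear-phase-restoring) structures, and the moving-average prototype is only a stand-in for the designed filter.

```python
import numpy as np

def sharpen(h):
    """Sharpening of an FIR filter h: Hs(z) = 3*H(z)^2 - 2*H(z)^3,
    realized with convolutions. The 3*H^2 branch is delayed so its group
    delay lines up with that of H^3 before the two are combined."""
    h2 = np.convolve(h, h)           # impulse response of H^2
    h3 = np.convolve(h2, h)          # impulse response of H^3
    out = -2.0 * h3
    pad = (len(h3) - len(h2)) // 2   # delay needed to align H^2 with H^3
    out[pad:pad + len(h2)] += 3.0 * h2
    return out

h = np.ones(5) / 5.0                 # simple moving-average low-pass prototype
hs = sharpen(h)
```

The polynomial maps a passband gain near 1 closer to 1 and a stopband gain near 0 closer to 0, improving both bands using only copies of the same filter; the cost is the longer impulse response and, in the conventional form, the loss of exact linear phase that the paper's modifications address.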

The physics potential of e⁺e⁻ linear colliders is summarized in this report. These machines are planned to operate in the first phase at a center-of-mass energy of 500 GeV, before being scaled up to about 1 TeV. In the second phase of operation, a final energy of about 2 TeV is expected. The machines will allow us to perform precision tests of the heavy particles in the Standard Model, the top quark and the electroweak bosons. They are ideal facilities for exploring the properties of Higgs particles, in particular in the intermediate mass range. New vector bosons and novel matter particles in extended gauge theories can be searched for and studied thoroughly. The machines provide unique opportunities for the discovery of particles in supersymmetric extensions of the Standard Model: the spectrum of Higgs particles, the supersymmetric partners of the electroweak gauge and Higgs bosons, and the matter particles. High-precision analyses of their properties and interactions will allow for extrapolations to energy scales close to the Planck scale where gravity becomes significant. In alternative scenarios, i.e. compositeness models, novel matter particles and interactions can be discovered and investigated in the energy range above the existing colliders up to the TeV scale. Whatever scenario is realized in Nature, the discovery potential of e⁺e⁻ linear colliders and the high precision with which the properties of particles and their interactions can be analyzed define an exciting physics program complementary to hadron machines. (C) 1998 Elsevier Science B.V. All rights reserved.

The cis-regulatory regions on DNA serve as binding sites for proteins such as transcription factors and RNA polymerase. The combinatorial interaction of these proteins plays a crucial role in transcription initiation, which is an important point of control in the regulation of gene expression. We present here an analysis of the performance of an in silico method for predicting cis-regulatory regions in the plant genomes of Arabidopsis (Arabidopsis thaliana) and rice (Oryza sativa) on the basis of free energy of DNA melting. For protein-coding genes, we achieve recall and precision of 96% and 42% for Arabidopsis and 97% and 31% for rice, respectively. For noncoding RNA genes, the program gives recall and precision of 94% and 75% for Arabidopsis and 95% and 90% for rice, respectively. Moreover, 96% of the false-positive predictions were located in noncoding regions of primary transcripts, out of which 20% were found in the first intron alone, indicating possible regulatory roles. The predictions for orthologous genes from the two genomes showed a good correlation with respect to prediction scores and promoter organization. Comparison of our results with an existing program for promoter prediction in plant genomes indicates that our method shows improved prediction capability.
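The recall and precision figures quoted above follow the standard definitions over true positives, false positives and false negatives. A quick sketch; the counts used below are invented solely to reproduce the Arabidopsis protein-coding figures (96% recall, 42% precision) and are not the paper's actual counts.

```python
def recall_precision(tp, fp, fn):
    """Recall: fraction of annotated genes whose regulatory region was
    predicted. Precision: fraction of predictions that hit an annotated
    region."""
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    return recall, precision

# Illustrative counts chosen to match 96% recall and ~42% precision.
r, p = recall_precision(tp=96, fp=132, fn=4)
```

The low precision on protein-coding genes is tempered by the observation that most "false positives" fall in noncoding regions of primary transcripts, where they may mark genuine regulatory elements.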

A shear-flexible 4-noded finite element formulation, having five mechanical degrees of freedom per node, is presented for modeling the dynamic as well as the static thermal response of laminated composites containing distributed piezoelectric layers. The element has been developed to have one electrical degree of freedom per piezoelectric layer. The mass, stiffness and thermo-electro-mechanical coupling effects on the actuator and sensor layers have been considered. Numerical studies have been conducted to investigate both the sensory and active responses of piezoelectric composite beam and plate structures. It is concluded that both the thermal and pyroelectric effects are important and need to be considered in the precision distributed control of intelligent structures.

We demonstrate a technique for precisely measuring hyperfine intervals in alkali atoms. The atoms form a three-level system in the presence of a strong control laser and a weak probe laser. The dressed states created by the control laser show significant linewidth reduction. We have developed a technique for Doppler-free spectroscopy that enables the separation between the dressed states to be measured with high accuracy even in room temperature atoms. The states go through an avoided crossing as the detuning of the control laser is changed from positive to negative. By studying the separation as a function of detuning, the center of the level-crossing diagram is determined with high precision, which yields the hyperfine interval. Using room temperature Rb vapor, we obtain a precision of 44 kHz. This is a significant improvement over the current precision of ~1 MHz.
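The measurement principle rests on the textbook two-level dressed-state result: for a control laser of Rabi frequency Omega at detuning Delta, the dressed states are separated by sqrt(Delta^2 + Omega^2), an avoided crossing whose minimum sits at Delta = 0. A minimal sketch with illustrative values (the paper's actual Rabi frequencies are not reproduced):

```python
import numpy as np

omega = 10.0                                  # control Rabi frequency, MHz (illustrative)
delta = np.linspace(-50.0, 50.0, 1001)        # control-laser detuning, MHz
separation = np.sqrt(delta**2 + omega**2)     # dressed-state splitting
center = delta[np.argmin(separation)]         # crossing center at Delta = 0
```

Locating the center of the level-crossing diagram from the measured separation-versus-detuning curve, rather than from a single line position, is what yields the hyperfine interval with kHz-level precision.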

Thanks to advances in sensor technology, today we have many applications (space-borne imaging, medical imaging, etc.) in which images of large sizes are generated. Straightforward application of wavelet techniques to such images involves certain difficulties. Embedded coders such as EZW and SPIHT require that the wavelet transform of the full image be buffered for coding. Since the transform coefficients also require storing at high precision, buffering requirements for large images become prohibitively high. In this paper, we first devise a technique for embedded coding of large images using zerotrees with reduced memory requirements. A 'strip buffer' capable of holding a few lines of wavelet coefficients from all the subbands belonging to the same spatial location is employed. A pipeline architecture for a line-based implementation of the above technique is then proposed. Further, an efficient algorithm to extract an encoded bitstream corresponding to a region of interest in the image has also been developed. Finally, the paper describes a strip-based non-embedded coding scheme which uses a single-pass algorithm, to handle high input data rates. (C) 2002 Elsevier Science B.V. All rights reserved.
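A back-of-envelope contrast shows why strip buffering matters. The image size, coefficient precision and strip height below are illustrative, not the paper's figures:

```python
# Full-frame buffering: the entire wavelet transform of an N x N image,
# kept at high precision, must reside in memory for embedded coding.
width = height = 16384
bytes_per_coeff = 4                                      # 32-bit coefficient
full_buffer = width * height * bytes_per_coeff           # whole transform

# Strip buffering: only a few coefficient lines, spanning all subbands of
# the same spatial location, are held at a time.
strip_lines = 64
strip_buffer = width * strip_lines * bytes_per_coeff

ratio = full_buffer / strip_buffer                       # memory saving factor
```

For these numbers the full transform needs 1 GiB while the strip buffer needs 4 MiB, a 256x reduction, at the cost of restricting the zerotree structure to coefficients within the strip.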

We analyse the Roy equations for the lowest partial waves of elastic ππ scattering. In the first part of the paper, we review the mathematical properties of these equations as well as their phenomenological applications. In particular, the experimental situation concerning the contributions from intermediate energies and the evaluation of the driving terms are discussed in detail. We then demonstrate that the two S-wave scattering lengths a₀⁰ and a₀² are the essential parameters in the low energy region: once these are known, the available experimental information determines the behaviour near threshold to within remarkably small uncertainties. An explicit numerical representation for the energy dependence of the S- and P-waves is given, and it is shown that the threshold parameters of the D- and F-waves are also fixed very sharply in terms of a₀⁰ and a₀². In agreement with earlier work, which is reviewed in some detail, we find that the Roy equations admit physically acceptable solutions only within a band of the (a₀⁰, a₀²) plane. We show that the data on the reactions e⁺e⁻→ππ and τ→ππν reduce the width of this band quite significantly. Furthermore, we discuss the relevance of the decay K→ππeν in restricting the allowed range of a₀⁰, preparing the grounds for an analysis of the forthcoming precision data on this decay and on pionic atoms. We expect these to reduce the uncertainties in the two basic low energy parameters very substantially, so that a meaningful test of the chiral perturbation theory predictions will become possible.
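For reference, the threshold parameters mentioned above are defined through the standard expansion of the real part of a partial wave near threshold; in the conventional normalization (a general reminder, not a formula specific to this paper):

```latex
\operatorname{Re}\, t^{I}_{\ell}(s)
  = q^{2\ell}\left\{ a^{I}_{\ell} + b^{I}_{\ell}\, q^{2} + O\!\left(q^{4}\right) \right\},
\qquad q^{2} = \frac{s}{4} - M_{\pi}^{2},
```

so that a₀⁰ and a₀² are the I = 0 and I = 2 S-wave scattering lengths whose allowed range the Roy equations constrain.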