987 results for redundancy analysis


Relevance:

40.00%

Publisher:

Abstract:

The xeroderma pigmentosum complementation group B (XPB) protein is involved in both DNA repair and transcription in human cells. It is a component of the transcription factor IIH (TFIIH) and is responsible for DNA helicase activity during nucleotide excision repair (NER). Its high evolutionary conservation has allowed identification of homologous proteins in different organisms, including plants. In contrast to other organisms, Arabidopsis thaliana harbors a duplication of the XPB orthologue (AtXPB1 and AtXPB2), and the proteins encoded by the duplicated genes are very similar (95% amino acid identity). Complementation assays in yeast rad25 mutant strains suggest the involvement of AtXPB2 in DNA repair, as already shown for AtXPB1, indicating that these proteins may be functionally redundant in the removal of DNA lesions in A. thaliana. Although both genes are expressed constitutively throughout the plant life cycle, Northern blot analyses suggest that light modulates the expression level of both XPB copies and that transcript levels increase during early stages of development. Considering the high similarity between AtXPB1 and AtXPB2, and that both predicted proteins may act in DNA repair, this duplication may confer greater flexibility and resistance to DNA-damaging agents in thale cress. (C) 2004 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Many older adults have difficulty using modern consumer products because of their complexity, both in functionality and in interface design. It has also been observed that older people have more trouble learning new systems. It was hypothesised that designing technological products that are more intuitive for older people to use can address this problem. An intuitive interface allows users to employ prior knowledge, thus minimising the learning needed for effective interaction. This paper discusses an experiment investigating the effectiveness of redundancy in interface design. The primary objective was to find out whether using more than one modality in a product's interface improves the speed and intuitiveness of interactions for older adults. Preliminary analysis showed a strong correlation between technology familiarity and time on task, but redundancy in interface design improved speed and accuracy of use only for participants with moderate to high technology familiarity.

Relevance:

30.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform is the principal tool used to reduce interpixel redundancy and to obtain a parsimonious representation of these images. A specific fixed decomposition structure is designed for the wavelet packet transform in order to save on computation, transmission, and storage costs. This structure is based on an analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. The fixed structure is found to provide the most suitable representation for fingerprints according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics; the decision rests on the effect of each subband on the reconstructed image under the mean-square criterion as well as on the sensitivities of human vision.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed, based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used; in this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and the scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a non-optimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed: no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. To take advantage of the properties of LVQ and its fast implementation, while accounting for the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These are some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.

For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms yield high-quality reconstructed images with better compression ratios than other available algorithms; objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification, so enhancement and feature extraction on the reconstructed images are also investigated. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local dominant ridge directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating-average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
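The shape-parameter estimation step lends itself to a short illustration. The sketch below fits a generalized Gaussian to one wavelet subband by moment matching, a simpler standard estimator than the least-squares formulation described above; the test image, wavelet choice (`bior4.4`), and search bracket are assumptions made for the example.

```python
import numpy as np
import pywt
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_abs_mean_to_std(beta):
    # For a zero-mean generalized Gaussian with shape beta:
    # E|x| / std(x) = Gamma(2/beta) / sqrt(Gamma(1/beta) * Gamma(3/beta)).
    return gamma(2.0 / beta) / np.sqrt(gamma(1.0 / beta) * gamma(3.0 / beta))

def fit_ggd_shape(coeffs):
    # Moment-matching estimator: invert the ratio above to recover beta.
    r = np.mean(np.abs(coeffs)) / np.std(coeffs)
    return brentq(lambda b: ggd_abs_mean_to_std(b) - r, 0.1, 5.0)

image = np.random.default_rng(0).random((256, 256))  # stand-in for a fingerprint
_, (cH, cV, cD) = pywt.dwt2(image, "bior4.4")        # one biorthogonal DWT level
print("estimated GGD shape of the diagonal subband:",
      round(fit_ggd_shape(cD.ravel()), 2))
```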

Relevance:

30.00%

Publisher:

Abstract:

During the past decade, a significant amount of research has been conducted internationally with the aim of developing, implementing, and verifying "advanced analysis" methods suitable for the non-linear analysis and design of steel frame structures. These methods permit a comprehensive assessment of the actual failure modes and ultimate strengths of structural systems in practical design situations, without resort to simplified elastic methods of analysis and semi-empirical specification equations. Advanced analysis has the potential to extend the creativity of structural engineers and simplify the design process, while ensuring greater economy and more uniform safety with respect to the ultimate limit state.

The application of advanced analysis methods has previously been restricted to steel frames comprising only members with compact cross-sections, which are not subject to the effects of local buckling. This precluded the use of advanced analysis in the design of steel frames comprising a significant proportion of the most commonly used Australian sections, which are non-compact and subject to local buckling. This thesis describes research conducted over the past three years to extend the scope of advanced analysis by developing methods that include the effects of local buckling in a non-linear analysis formulation suitable for the practical design of steel frames comprising non-compact sections.

Two alternative concentrated-plasticity formulations are presented: the refined plastic hinge method and the pseudo plastic zone method. Both methods implicitly account for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. The accuracy and precision of the methods for the analysis of steel frames comprising non-compact sections have been established by comparison with a comprehensive range of analytical benchmark frame solutions. Both methods are more accurate and precise than conventional individual-member design methods based on elastic analysis and specification equations. For example, the pseudo plastic zone method predicts the ultimate strength of the analytical benchmark frames with an average conservative error of less than one percent and an acceptable maximum unconservative error of less than five percent. The pseudo plastic zone model can allow the design capacity to be increased by up to 30 percent for simple frames, mainly through the consideration of inelastic redistribution. The benefits may be even more significant for complex frames with significant redundancy, which provides greater scope for inelastic redistribution.

The analytical benchmark frame solutions were obtained using a distributed-plasticity shell finite element model. A detailed description of this model and the results of all 120 benchmark analyses are provided. The model explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. Its accuracy was verified by comparison with a variety of analytical solutions and with the results of three large-scale experimental tests of steel frames comprising non-compact sections. A description of the experimental method and test results is also provided.
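Purely as an illustrative sketch of the concentrated-plasticity idea (not the thesis's formulation), the toy model below degrades a member's tangent stiffness parabolically as its force approaches the section capacity, a degradation rule often associated with refined plastic hinge analysis, and steps the load until the stiffness vanishes. All numbers are hypothetical.

```python
# Illustrative only: gradual yielding modelled as tangent-stiffness
# degradation, in the spirit of refined plastic hinge analysis.
def degradation(alpha):
    # Full stiffness up to half the section capacity, then a parabolic
    # decay to zero stiffness at full capacity (alpha = force / capacity).
    return 1.0 if alpha <= 0.5 else 4.0 * alpha * (1.0 - alpha)

k0, Fy = 50.0, 100.0   # elastic stiffness and section capacity (assumed units)
F, du = 0.0, 0.01      # current member force and displacement increment
while True:
    kt = k0 * degradation(F / Fy)   # tangent stiffness at this load level
    if kt < 1e-4 * k0:              # vanishing stiffness: ultimate state
        break
    F = min(F + kt * du, Fy)        # incremental loading step
print(f"predicted ultimate load: {F:.1f} (section capacity {Fy})")
```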

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach to promote the integrity of perception systems for outdoor unmanned ground vehicles (UGV) operating in challenging environmental conditions (presence of dust or smoke). The proposed technique automatically evaluates the consistency of the data provided by two sensing modalities: a 2D laser range finder and a millimetre-wave radar, allowing for perceptual failure mitigation. Experimental results, obtained with a UGV operating in rural environments, and an error analysis validate the approach.
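A minimal sketch of one plausible consistency test between the two modalities; the shared bearing grid, tolerance, and agreement threshold below are assumptions made for illustration, not the paper's algorithm.

```python
import numpy as np

def sensors_consistent(laser_r, radar_r, tol_m=0.5, min_agree=0.8):
    """Compare range returns on a shared bearing grid; flag a likely
    perceptual failure (e.g., dust attenuating the laser) when too few
    bearings agree. tol_m and min_agree are hypothetical tuning values."""
    valid = np.isfinite(laser_r) & np.isfinite(radar_r)
    agree = np.abs(laser_r[valid] - radar_r[valid]) < tol_m
    return valid.any() and agree.mean() >= min_agree

laser = np.array([10.2, 9.8, 2.1, 2.0, 15.0])  # laser returns off dust at ~2 m
radar = np.array([10.3, 9.9, 9.7, 9.6, 15.1])  # radar penetrates the dust
print("consistent:", sensors_consistent(laser, radar))  # False -> distrust laser
```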

Relevance:

30.00%

Publisher:

Abstract:

The conventional definition of redundancy is applicable to skeletal structural systems only, whereas the concept of redundancy has never been discussed in the context of a continuum. Structures in civil engineering generally combine skeletal and continuum segments. Hence, this paper presents a generalized definition of redundancy in terms of structural response sensitivity, applicable to both continuum and discrete structures. In contrast to the conventional definition of redundancy, which is assumed to be fixed for a given structure and independent of loading and material properties, the new definition depends on the strength and response of the structure at a given stage of its service life. The redundancy measure proposed in this paper is linked to the structural response sensitivities; thus, the structure can have different degrees of redundancy during its lifetime, depending on the response sensitivity under consideration. It is believed that this new redundancy measure will be more relevant in structural evaluation, damage assessment, and reliability analysis of structures at large.
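A toy numerical reading of a sensitivity-based redundancy measure: the two-spring system, the normalised finite-difference sensitivity, and the index definition below are illustrative assumptions rather than the paper's formulation.

```python
import numpy as np

def tip_displacement(k):
    # Two parallel springs under a unit load: u = 1 / (k1 + k2).
    return 1.0 / (k[0] + k[1])

def normalised_sensitivities(k, eps=1e-6):
    # Relative change in response per unit relative damage to each member.
    u0 = tip_displacement(k)
    sens = []
    for i in range(len(k)):
        damaged = k.copy()
        damaged[i] *= 1.0 - eps                    # slightly degrade member i
        sens.append((tip_displacement(damaged) - u0) / (u0 * eps))
    return np.array(sens)

k = np.array([100.0, 100.0])                       # hypothetical stiffnesses
s = normalised_sensitivities(k)
print("normalised sensitivities:", np.round(s, 3))
# Low sensitivity to any single member -> high redundancy (two load paths).
print("redundancy index (illustrative):", round(1.0 / np.abs(s).max(), 2))
```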

Relevance:

30.00%

Publisher:

Abstract:

The von Neumann entropy of a generic quantum state is not unique unless the state can be uniquely decomposed as a sum of extremal or pure states. Therefore one reaches the remarkable possibility that there may be many entropies for a given state. We show that this happens if the GNS representation (of the algebra of observables in some quantum state) is reducible, and some representations in the decomposition occur with non-trivial degeneracy. This ambiguity in entropy, which can occur at zero temperature, can often be traced to a gauge symmetry emergent from the non-trivial topological character of the configuration space of the underlying system. We also establish the analogue of an H-theorem for this entropy by showing that its evolution is Markovian, determined by a stochastic matrix. After demonstrating this entropy ambiguity for the simple example of the algebra of 2 x 2 matrices, we argue that the degeneracies in the GNS representation can be interpreted as an emergent broken gauge symmetry, and play an important role in the analysis of emergent entropy due to non-Abelian anomalies. We work out the simplest situation with such non-Abelian symmetry, that of an ethylene molecule.
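The decomposition non-uniqueness underlying this ambiguity can be checked numerically for the maximally mixed qubit state: the von Neumann entropy is fixed, but the mixing entropy depends on which pure-state decomposition is chosen. A minimal sketch (the particular decompositions are illustrative):

```python
import numpy as np

def von_neumann_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                 # drop numerically zero eigenvalues
    return -np.sum(w * np.log(w))

def proj(v):
    return np.outer(v, v.conj())     # rank-one projector onto a pure state

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)

rho = 0.5 * np.eye(2)                # maximally mixed state, 2 x 2 algebra
decompositions = {
    "eigenbasis": [(0.5, proj(ket0)), (0.5, proj(ket1))],
    "four states": [(0.25, proj(v)) for v in (ket0, ket1, plus, minus)],
}
for name, d in decompositions.items():
    assert np.allclose(sum(p * P for p, P in d), rho)  # both recover rho
    mixing = -sum(p * np.log(p) for p, _ in d)
    print(f"{name}: mixing entropy {mixing:.3f}")       # ln 2 vs ln 4
print(f"von Neumann entropy (decomposition-independent): "
      f"{von_neumann_entropy(rho):.3f}")
```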

Relevance:

30.00%

Publisher:

Abstract:

In current practice, the strength evaluation of a bridge system is typically based on first using elastic analysis to determine the distribution of load effects in the elements and then checking the ultimate section capacity of those elements. The ductility of the components in most bridge structures permits local yield and subsequent redistribution of the applied loads away from the most heavily loaded elements. As a result, a bridge can continue to carry additional loading even after one member has yielded, which has conventionally been adopted as the "failure criterion" in bridge strength evaluation. This means that a bridge with inherent redundancy has additional reserves of strength, such that the failure of one element does not result in the failure of the complete system. For such bridges, warning signs will appear and measures can be taken before ultimate collapse occurs. This paper proposes a rational methodology for calculating the ultimate system strength and for including in bridge evaluation the warning level afforded by redundancy. © 2004 Taylor & Francis Group, London.
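A toy calculation of the reserve that ductile redistribution provides beyond the conventional first-member-yield criterion; the parallel-member system and its numbers are hypothetical.

```python
import numpy as np

k = np.array([1.0, 1.0, 1.0])        # member stiffnesses (parallel system)
Fy = np.array([80.0, 100.0, 120.0])  # member capacities (assumed units)

# Elastic share of a unit system load is k_i / sum(k); the conventional
# "failure" load is reached when the most utilised member hits capacity.
share = k / k.sum()
first_yield = np.min(Fy / share)

# With ductile members, load redistributes until every member is at capacity.
collapse = Fy.sum()
print(f"first yield at {first_yield:.0f}, collapse at {collapse:.0f}, "
      f"redundancy reserve ratio {collapse / first_yield:.2f}")
```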

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces the application of linear multivariate statistical techniques, including partial least squares (PLS), canonical correlation analysis (CCA), and reduced rank regression (RRR), to the area of Systems Biology. This new approach aims to extract the important proteins embedded in complex signal transduction pathway models.

The analysis is performed on a model of intracellular signalling along the Janus-associated kinase/signal transducer and activator of transcription (JAK/STAT) and mitogen-activated protein kinase (MAPK) signal transduction pathways in interleukin-6 (IL6)-stimulated hepatocytes, which produce signal transducer and activator of transcription 3 (STAT3).

A region of redundancy within the MAPK pathway that does not affect STAT3 transcription was identified using CCA. This is the core finding of the analysis and cannot be obtained by inspecting the model by eye. In addition, RRR was found to isolate terms that do not significantly contribute to changes in protein concentrations, while PLS does not provide such a detailed picture by virtue of its construction.

This analysis has a similar objective to conventional model reduction techniques, with the advantage of maintaining the meaning of the states before and after the reduction process. A significant model reduction is achieved with a marginal loss in accuracy, offering a more concise model while maintaining the main factors influencing STAT3 transcription.

The findings offer a deeper understanding of the reaction terms involved, confirm the relevance of several proteins to the production of acute phase proteins, and complement existing findings regarding cross-talk between the two signalling pathways.
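A minimal sketch of the CCA step on simulated two-block data (not the hepatocyte model): directions with near-zero canonical correlation between the blocks are the kind of candidate-redundant structure the paper isolates. The data and dimensions are invented for the example.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
# Two blocks of simulated "pathway" time courses; one shared oscillation,
# the rest independent noise (a stand-in for a redundant region).
mapk = np.column_stack([np.sin(t), np.cos(t), rng.normal(size=t.size)])
stat3 = np.column_stack([np.sin(t) + 0.1 * rng.normal(size=t.size),
                         rng.normal(size=t.size)])

cca = CCA(n_components=2).fit(mapk, stat3)
U, V = cca.transform(mapk, stat3)
corrs = [np.corrcoef(U[:, i], V[:, i])[0, 1] for i in range(2)]
print("canonical correlations:", np.round(corrs, 2))
# Directions with near-zero canonical correlation do not co-vary with the
# output block: in the paper's setting, candidates for redundant terms.
```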

Relevance:

30.00%

Publisher:

Abstract:

Purpose – This paper aims to examine the antecedent influences and merits of workplace occupations as a tactical response to employer redundancy initiatives.

Design/methodology/approach – The data are based on analysis of secondary documentary material reporting on three workplace occupations in the Republic of Ireland during 2009.

Findings – Perceptions of both procedural (e.g. employer unilateral action) and substantive (e.g. pay and entitlements) justice appear to be pivotal influences. Spillover effects from other known occupations may also be influential. Workplace occupations were found to produce some modest substantive gains, such as enhanced redundancy payments. The tactic of workplace occupation was also found to transform unilateral employer action into scenarios based upon negotiated settlement supported by third-party mediation. However, the tactic of workplace occupation in response to redundancy runs the risk of judicial injunction and sanction.

Research limitations/implications – Although operationally difficult, future studies should strive to collect primary data on workplace occupations as they occur.

Originality/value – The paper identifies conditions conducive to the genesis of workplace occupations and the extent to which the tactic may be of benefit in particular circumstances to workers facing redundancy. It also contextualises the tactic in relation to both collective mobilisation and bargaining theories in employment relations.

Relevance:

30.00%

Publisher:

Abstract:

Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives, as it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely, given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, significant redundancy often exists in the wafer measurement plans of mature processes. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset. © 2012 IEEE.
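A minimal sketch of the greedy selection idea usually associated with FSCA, plus a least-squares reconstruction standing in for the VM step; the selection criterion, deflation, and synthetic data are simplifying assumptions, not the paper's exact procedure.

```python
import numpy as np

def fsca(X, n_select):
    """Greedy forward selection: repeatedly pick the column (site) that
    explains the most variance of all columns, then deflate its contribution."""
    R = X - X.mean(axis=0)
    selected = []
    for _ in range(n_select):
        gain = [((R.T @ R[:, j]) ** 2).sum() / (R[:, j] @ R[:, j] + 1e-12)
                for j in range(R.shape[1])]
        j = int(np.argmax(gain))
        selected.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(q, q @ R)   # project the chosen site out of R
    return selected

def reconstruct(X_train, sites, measured):
    # VM stand-in: linear least-squares map from selected sites to full profile.
    B, *_ = np.linalg.lstsq(X_train[:, sites], X_train, rcond=None)
    return measured @ B

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 20))  # rank-3 "wafer" data
sites = fsca(X, 3)
error = np.linalg.norm(reconstruct(X, sites, X[:, sites]) - X)
print("selected sites:", sites, "| reconstruction error:", round(error, 6))
```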

Relevance:

30.00%

Publisher:

Abstract:

Distributed Embedded Systems (DES) have been used over the last few years in many application domains, from robotics to industrial process control, avionics, and vehicular applications, and this trend is expected to continue in the coming years. Dependability is an important property in these domains, since services must be executed in a timely and predictable manner; otherwise, economic losses may occur or human life may be endangered. At design time it is impossible to foresee all fault scenarios, owing to the non-determinism of the surrounding environment, so fault-tolerance mechanisms must be included. Additionally, some of these applications require high bandwidth, which may also be used to evolve the system by adding new functionality. Flexibility is an important system property, since it allows adaptation to the surrounding conditions and requirements, and it also contributes to simpler maintenance and repair. Moreover, in embedded systems flexibility is important because it enables better use of the often scarce available resources.

An obvious way to increase both the bandwidth and the fault tolerance of distributed embedded systems is to replicate the system buses. Some existing solutions, both commercial and academic, propose bus replication either to increase bandwidth or to increase fault tolerance. However, almost invariably, the purpose is only one of the two; solutions that provide both higher bandwidth and increased fault tolerance are rare. One of these rare examples is FlexRay, with the limitation that only two buses may be used.

This thesis presents and discusses a proposal to use bus replication in a flexible way with the dual goal of increasing bandwidth and fault tolerance. The flexibility of the proposed protocols also allows dynamic management of the network topology, with the number of buses limited only by the hardware/software. The proposals of this thesis were validated using the CAN (Controller Area Network) fieldbus, chosen because of its wide market penetration. More specifically, the proposed solutions were implemented and validated using a paradigm that combines flexibility with event-triggered and time-triggered communication: FTT (Flexible Time-Triggered). A generalization to native CAN is also presented and discussed.

The inclusion of bus replication mechanisms requires changes to the previous master replication and substitution protocols, as well as the definition of new protocols for this purpose. This work takes advantage of the centralized architecture and of master node replication to support bus replication efficiently and flexibly. If a fault occurs in a bus (or buses) that could cause a system failure, the protocols and components proposed in this thesis make the system react by switching to a degraded operating mode: messages that were being transmitted on the buses where the fault occurred are rerouted to the other buses.

Master node replication is based on a leader-followers strategy, in which the leader controls the whole system while the followers serve as backup nodes. If an error occurs in the leader node, one of the follower nodes takes over control of the system transparently, maintaining the same functionality. The proposals of this thesis were also generalized to native CAN, for which two additional components were proposed. It is thus possible to have the same fault-tolerance capabilities at bus level together with dynamic management of the network topology.

All proposals in this thesis were implemented and evaluated. An initial single-bus implementation was evaluated in a real application, a robotic soccer team in which the FTT-CAN protocol was used for motion control and odometry. The multi-bus system was evaluated on a laboratory test platform; for this purpose, a fault-injection system was developed to impose faults on the buses and on the master nodes, together with a delay-measurement system intended to measure the response time after a fault occurs.
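Purely as an illustrative sketch of the failure-handling logic summarised above (not the thesis's FTT-CAN implementation; all names and structures are invented for the example): a faulty bus is dropped and its traffic rerouted, and a follower master is promoted transparently when the leader fails.

```python
from dataclasses import dataclass

@dataclass
class Master:
    name: str
    role: str = "follower"   # "leader" or "follower"
    alive: bool = True

@dataclass
class ReplicatedBusSystem:
    buses: set
    masters: list

    def on_bus_fault(self, bus):
        # Degraded mode: drop the faulty bus; its messages are rerouted
        # to the surviving replicas.
        self.buses.discard(bus)
        if not self.buses:
            raise RuntimeError("all buses lost: system failure")
        return sorted(self.buses)

    def leader(self):
        alive = [m for m in self.masters if m.alive]
        if not any(m.role == "leader" for m in alive):
            alive[0].role = "leader"   # transparent follower promotion
        return next(m for m in alive if m.role == "leader")

system = ReplicatedBusSystem({"CAN0", "CAN1", "CAN2"},
                             [Master("M0", role="leader"), Master("M1")])
print("traffic after CAN1 fault rerouted to:", system.on_bus_fault("CAN1"))
system.masters[0].alive = False             # leader failure
print("new leader:", system.leader().name)  # M1 takes over
```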

Relevance:

30.00%

Publisher:

Abstract:

Response times in a visual object recognition task decrease significantly when the targets can be distinguished on the basis of two redundant attributes. The redundancy gain for two attributes is a common finding in the literature, but a gain from three redundant attributes had previously been observed only when the three attributes came from three different modalities (tactile, auditory, and visual). The present study demonstrates that a redundancy gain for three attributes of the same modality is indeed possible. It also includes a more detailed investigation of the characteristics of the redundancy gain. Besides the decrease in response times, these include a particularly marked decrease in the minimal response times and an increase in the symmetry of the response time distribution. The study presents evidence that neither race models nor coactivation models can account for all the characteristics of the redundancy gain. In this context, we introduce a new method for assessing the triple redundancy gain based on the performance for doubly redundant targets. The cascade model is presented to explain the results of this study. This model comprises several processing channels that are triggered by a cascade of activations before a single decision criterion is satisfied, and it offers a unified approach to previous research on the redundancy gain.

The analysis of the characteristics of response time distributions (their mean, symmetry, shift, and spread) is an essential tool for this study, so it was important to find a statistical test able to reflect differences in all of these characteristics. We address the problem of analyzing response times without loss of information, as well as the inadequacy of common analysis methods in this context, such as pooling the response times of several participants (e.g., Vincentizing). Distribution tests, the best known being the Kolmogorov-Smirnov test, are a better alternative for comparing distributions, in particular those of response times. A test still unknown in psychology is introduced: the two-sample Anderson-Darling test. The two tests are compared, and we present conclusive evidence of the power of the Anderson-Darling test: when comparing distributions that differ only in (1) their shift, (2) their spread, (3) their symmetry, or (4) their tails, the Anderson-Darling test detects the differences better. Moreover, the Anderson-Darling test has a Type I error rate that corresponds exactly to alpha, whereas the Kolmogorov-Smirnov test is too conservative. Consequently, the Anderson-Darling test requires fewer data to reach sufficient statistical power.
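Both tests discussed in the abstract are available in SciPy, so the comparison is easy to sketch; the sample sizes and the heavy-tailed alternative below are illustrative assumptions. (SciPy caps the reported Anderson-Darling significance level to its tabulated range and warns when it is clipped.)

```python
import numpy as np
from scipy.stats import anderson_ksamp, ks_2samp

rng = np.random.default_rng(42)
a = rng.normal(0.0, 1.0, 300)        # reference RT-like sample
b = rng.standard_t(df=3, size=300)   # same centre, heavier tails

# The tail difference is exactly the kind of feature the abstract reports
# the Anderson-Darling test to be more sensitive to than Kolmogorov-Smirnov.
print("KS p-value:", ks_2samp(a, b).pvalue)
print("AD significance level:", anderson_ksamp([a, b]).significance_level)
```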

Relevance:

30.00%

Publisher:

Abstract:

Study Design. Reliability study. Objectives. To assess the between-acquisition reliability of new multilevel trunk cross-section measurements, in order to define what constitutes a real change when comparing two trunk surface acquisitions of the same patient, before and after surgery or during clinical monitoring. Summary of Background Data. Several cross-sectional surface measurements have been proposed in the literature for noninvasive assessment of trunk deformity in patients with adolescent idiopathic scoliosis (AIS). However, only the maximum values along the trunk are evaluated and used for monitoring progression and assessing treatment outcome. Methods. Back surface rotation (BSR), trunk rotation (TR), and coronal and sagittal trunk deviation are computed on 300 cross sections of the trunk. Each set of 300 measures is represented as a single functional datum, using a set of basis functions. To evaluate between-acquisition variability at all trunk levels, a test-retest reliability study was conducted on 35 patients with AIS. A functional correlation analysis was also carried out to evaluate any redundancy between the measurements. Results. Each set of 300 measures was successfully described using only 10 basis functions. The test-retest reliability of the functional measurements is good to very good over the whole trunk, except above shoulder level. The typical errors of measurement are between 1.2° and 2.2° for the rotational measures and between 2 and 6 mm for the deviation measures. There is a very strong correlation between BSR and TR all along the trunk, a moderate correlation between coronal trunk deviation and both BSR and TR, and no correlation between sagittal trunk deviation and any other measurement. Conclusion. This novel representation of trunk surface measurements allows a global assessment of trunk surface deformity. Multilevel trunk measurements provide a broader perspective of the trunk deformity, allow reliable multilevel monitoring during the clinical follow-up of patients with AIS, and permit a reliable assessment of the esthetic outcome after surgery.
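A minimal sketch of the basis-function representation step: a 300-point profile is summarised by 10 B-spline coefficients through a least-squares fit. The B-spline basis, knot placement, and synthetic profile are assumptions; the abstract does not state which basis was used.

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(0)
levels = np.linspace(0.0, 1.0, 300)   # normalised trunk height
# Synthetic BSR-like curve standing in for one set of 300 measures.
profile = np.sin(2 * np.pi * levels) + 0.1 * rng.normal(size=300)

n_basis, k = 10, 3                                 # 10 cubic B-splines
inner = np.linspace(0.0, 1.0, n_basis - k + 1)     # interior breakpoints
t = np.r_[[0.0] * k, inner, [1.0] * k]             # clamped knot vector
spl = make_lsq_spline(levels, profile, t, k)       # least-squares fit

print("300 values ->", spl.c.size, "coefficients;",
      "max fit error:", round(float(np.abs(spl(levels) - profile).max()), 3))
```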