Abstract:
Tom40 is the major subunit of the translocase of the outer mitochondrial membrane (the TOM complex). To study the assembly pathway of Tom40, we have followed the integration of the protein into the TOM complex in vitro and in vivo using wild-type and altered versions of the Neurospora crassa Tom40 protein. Upon import into isolated mitochondria, Tom40 precursor proteins lacking the first 20 or the first 40 amino acid residues were assembled as the wild-type protein. In contrast, a Tom40 precursor lacking residues 41 to 60, which contains a highly conserved region of the protein, was arrested at an intermediate stage of assembly. We constructed mutant versions of Tom40 affecting this region and transformed the genes into a sheltered heterokaryon containing a tom40 null nucleus. Homokaryotic strains expressing the mutant Tom40 proteins had growth rate defects and were deficient in their ability to form conidia. Analysis of the TOM complex in these strains by blue native gel electrophoresis revealed alterations in electrophoretic mobility and a tendency to lose Tom40 subunits from the complex. Thus, both in vitro and in vivo studies implicate residues 41 to 60 as containing a sequence required for proper assembly/stability of Tom40 into the TOM complex. Finally, we found that TOM complexes in the mitochondrial outer membrane were capable of exchanging subunits in vitro. A model is proposed for the integration of Tom40 subunits into the TOM complex.
Abstract:
Crystal structures and biochemical analyses of PcrA helicase provide evidence for a model for processive DNA unwinding that involves coupling of single-stranded DNA (ssDNA) tracking to a duplex destabilization activity. The DNA tracking model invokes ATP-dependent flipping of bases between several pockets on the enzyme formed by conserved aromatic amino acid residues. We have used site-directed mutagenesis to confirm the requirement of all of these residues for helicase activity. We also demonstrate that the duplex unwinding defects correlate with an inability of certain mutant proteins to translocate effectively on ssDNA. Moreover, the results define an essential triad of residues within the ssDNA binding site that comprise the ATP-driven DNA motor itself.
Abstract:
The Fc gamma receptor-associated gamma and zeta subunits contain a conserved cytoplasmic motif, termed the immunoglobulin gene tyrosine activation motif, which contains a pair of YXXL sequences. The tyrosine residues within these YXXL sequences have been shown to be required for transduction of a phagocytic signal. We have previously reported that the gamma subunit of the type IIIA Fc gamma receptor (Fc gamma RIIIA) is approximately 6 times more efficient in mediating phagocytosis than the zeta subunit of Fc gamma RIIIA. By exchanging regions of the cytoplasmic domains of the homologous gamma and zeta chains, we observed that the cytoplasmic region of the gamma chain bearing a pair of the conserved YXXL sequences is important in phagocytic signaling. Further, specificity of phagocytic signaling is largely determined by the two internal XX amino acids in the YXXL sequences. In contrast, the flanking amino acids of the YXXL sequences, including the seven intervening amino acids between the two YXXL sequences, do not significantly affect the phagocytic signal. Furthermore, the protein-tyrosine kinase Syk, but not the related kinase ZAP-70, stimulated Fc gamma RIIIA-mediated phagocytosis. ZAP-70, however, increased phagocytosis when coexpressed with the Src family kinase Fyn. These data demonstrate the importance of the two specific amino acids within the gamma subunit YXXL cytoplasmic sequences in phagocytic signaling and explain the difference in phagocytic efficiency of the gamma and zeta chains. These results indicate the importance of Syk in Fc gamma RIIIA-mediated phagocytosis and demonstrate that ZAP-70 and Syk differ in their requirement for a Src-related kinase in signal transduction.
Abstract:
This work presents an analysis of computational algorithms applied to the estimation of electrical phasors in electric power systems. Phasor measurement is performed by deploying Phasor Measurement Units in these systems and finds numerous applications in operation, control, protection, and planning. So that phasors can be applied, measurement, synchronization, and communication requirements are defined by the IEEE C37.118.1 standard. The standard specifies message formats, timetags, phasors, and the synchronization system, and defines tests to evaluate the estimation. Although it covers all of these criteria, the standard does not prescribe an estimation algorithm, leaving room for a variety of methods as long as the required accuracy is met. In this context, the present work analyzes several phasor estimation algorithms from the literature, evaluating their behavior in selected cases. The following methods were considered: the Discrete Fourier Transform, the Least Squares Method, and the Discrete Wavelet Transform, in recursive and non-recursive versions. These methods were subjected to synthetic signals in order to verify their behavior under the tests proposed by the standard, evaluating the Total Vector Error, response time, delay, and overshoot. The algorithms were also embedded in a hardware platform, the PC104, and evaluated against signals measured by the equipment at the analog output of a real-time simulator (Real Time Digital Simulator).
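As an illustration of the evaluation criterion named in the abstract above, the sketch below computes a non-recursive full-cycle DFT phasor estimate and its Total Vector Error against a reference phasor. This is a minimal example under stated assumptions (a clean, on-nominal sinusoid sampled at a whole number of samples per cycle); the function names are hypothetical and do not come from the thesis.

```python
import numpy as np

def dft_phasor(samples: np.ndarray) -> complex:
    """Estimate the fundamental phasor of one cycle of samples with the
    non-recursive full-cycle DFT, returned as an RMS phasor."""
    n = len(samples)
    k = np.arange(n)
    # Correlate with one cycle of the fundamental; sqrt(2)/n scales peak to RMS.
    return (np.sqrt(2) / n) * np.sum(samples * np.exp(-2j * np.pi * k / n))

def tve(estimated: complex, reference: complex) -> float:
    """Total Vector Error as defined in IEEE C37.118.1, as a ratio
    (multiply by 100 for percent)."""
    return abs(estimated - reference) / abs(reference)

# Synthetic nominal-frequency cosine, 16 samples per cycle, unit peak amplitude.
n = 16
t = np.arange(n) / n
signal = np.cos(2 * np.pi * t + np.pi / 6)

x_est = dft_phasor(signal)
x_ref = (1 / np.sqrt(2)) * np.exp(1j * np.pi / 6)  # RMS reference phasor
print(tve(x_est, x_ref))  # near zero for a clean, on-nominal signal
```

Off-nominal frequencies, harmonics, and steps (the cases the standard's tests exercise) make the estimate leave this ideal regime, which is what distinguishes the DFT, least-squares, and wavelet estimators compared in the work.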
Abstract:
Computational Swarms, consisting of the integration of intelligent sensors and actuators into our connected world, enable an extension of the infosphere into the physical world. We call this extended, cyber-physical infosphere the Swarm. This work proposes a vision of the Swarm in which computing devices cooperate dynamically and opportunistically, forming organic, heterogeneous networks. The thesis presents a computational architecture for the Control Plane of the Swarm Operating System, a distributed software layer embedded in every device that is part of the Swarm, responsible for managing resources, defining actors, and specifying how services and resources are described and used (how to advertise and discover them, how to carry out transactions, content adaptation, and multi-agent cooperation). The design of the architecture began with a review of the characterization of the Swarm concept, revisiting the definition of terms and establishing a terminology to be used. Requirements and challenges were identified and an operational vision was proposed. This operational vision was exercised with use cases, and the architectural elements were extracted from it and organized into an architecture. The architecture was tested against the use cases, producing revisions of the system. Each of the architectural elements required a review of the state of the art. A proof of concept of the Control Plane was implemented, and a demonstration was proposed and implemented. The selected demonstration was the Smart Jukebox, which exercises the distributed aspects and the dynamism of the proposed system. This work presents the vision of the computational Swarm and delivers a platform that is applicable in practice.
The evolution of this architecture could form the basis of a global, heterogeneous, organic network of computing-device networks, leveraging the integration of cyber-physical systems in the cloud and enabling the cooperation of scalable, flexible systems that interoperate to achieve common goals.
Abstract:
Successful HR departments should support key business objectives by establishing metrics that determine the effectiveness of their processes. Functions such as recruiting, benefits, and training are processes that should have metrics. Understanding who measures what, when, and how often is the first step in measuring how much it costs to run HR. The next step is determining which processes are most critical, and then determining the metrics that fit the business needs. Slight adjustments will need to be made as business needs change, but the process for measuring outcomes should not change. This paper will focus on multinational corporations that employ at least ten thousand employees and have a ratio of one HR professional to every hundred full-time equivalents (FTEs).
Abstract:
The aim of this study is to analyse the physical and physiological factors in soccer training across different training categories. The participants were 30 players of 8-a-side soccer in the under-10 age group (9.93±0.25 years) who participated in the Under-10 Provincial Tournament in Alicante. During training, the distance covered, heart rate, and speed (average and maximum values), as well as the methodology used and playing position, were recorded. After the statistical analysis and related discussion, it was concluded that the players do not show differences in total distance covered in relation to category. Nevertheless, there are differences with regard to speed and heart rate, which are caused by the greater physical development of the players in comparison to the under-10 age group category. Regarding the methodology employed, it is worth stressing that the coaches used, to a greater extent, the global method, followed by the mixed method.
Abstract:
Internship report presented to the Escola Superior de Artes Aplicadas of the Instituto Politécnico de Castelo Branco, in association with the Faculdade de Arquitetura da Universidade de Lisboa, in fulfillment of the requirements for the degree of Master in Graphic Design.
Abstract:
The Development Permit System has been introduced with minimal directives for establishing a decision-making process. This stands in opposition to the long-established process for minor variances and suggests that the Development Permit System does not necessarily incorporate all of Ontario's fundamental planning principles. From this premise, the study aimed to identify how minor variances are incorporated into the Development Permit System. In order to examine this topic, the research was based around the following research questions:
• How are 'minor variance' applications processed within the DPS?
• To what extent do the four tests of a minor variance influence the outcomes of lower-level applications in the DPS approval process?
A case study approach was used for this research. The single-case design employed both qualitative and quantitative research methods, including a review of academic literature, court cases, and official documents, as well as a content analysis of Class 1, 1A, and 2 Development Permit application files from the Town of Carleton Place that were decided between 2011 and 2015. Upon completion of the content analysis, it was found that minor variance issues were most commonly assigned to Class 1 applications. Planning staff generally met approval timelines and embraced their delegated approval authority, readily attaching conditions to applications in order to mitigate off-site impacts. While staff met the regulatory requirements of the DPS, 'minor variance' applications were largely decided on impact alone, demonstrating that the principles established by the four tests, the defining quality of the minor variance approval process, had not transferred to the Development Permit System. Alternatively, there was some evidence that the development community has not fully adjusted to the requirements of the new approvals process, as some applications were supported using a rationale containing the four tests.
Subsequently, a set of four recommendations was offered, reflecting the main themes established by the findings. The first two recommendations are directed towards the Province, the third to municipalities, and the fourth to developers and planning consultants:
1) Amend Ontario Regulation 608/06 so that provisions under Section 4(3)(e) fall under Section 4(2).
2) Change the rhetoric from "combining elements of minor variances" to "replacing minor variances".
3) Establish clear evaluation criteria.
4) Understand the evaluative criteria of the municipality in which you are working.
Report of the High-Performance Computing Applications Requirements Group. III/6074/94-EN, April 1994