934 results for QUANTITATIVE GENETIC-ANALYSIS
Abstract:
Recently, we identified a large number of ultraconserved (uc) sequences in noncoding regions of human, mouse, and rat genomes that appear to be essential for vertebrate and amniote ontogeny. Here, we used similar methods to identify ultraconserved genomic regions between the insect species Drosophila melanogaster and Drosophila pseudoobscura, as well as the more distantly related Anopheles gambiae. As with vertebrates, ultraconserved sequences in insects appear to occur primarily in intergenic and intronic sequences, and at intron-exon junctions. The sequences are significantly associated with genes encoding developmental regulators and transcription factors, but are less frequent and are smaller in size than in vertebrates. The longest identical, nongapped orthologous match between the three genomes was found within the homothorax (hth) gene. This sequence spans an internal exon-intron junction, with the majority located within the intron, and is predicted to form a highly stable stem-loop RNA structure. Real-time quantitative PCR analysis of different hth splice isoforms and Northern blotting showed that the conserved element is associated with a high incidence of intron retention in hth pre-mRNA, suggesting that the conserved intronic element is critically important in the post-transcriptional regulation of hth expression in Diptera.
Abstract:
The heritability of conscientiousness has been one of the least explored of the NEO PI domains. Here we focus on the facet scales of the conscientiousness domain, estimating both their heritability and their correlations with measures of IQ and academic achievement (Queensland Core Skills Test; QCST) in a sample of adolescent twins and their non-twin siblings. Our findings confirmed positive associations between IQ and the facets of Competence and Dutifulness (ranging 0.11-0.27), with academic achievement showing correlations of 0.27 and 0.15 with these same facets and 0.15 with Deliberation. All conscientiousness facets were influenced by genes (broad sense heritabilities ranging 0.18-0.49) and unique environment, but common environment was judged unimportant. A multivariate genetic analysis including Competence, Dutifulness, IQ (verbal, performance) and QCST scores showed that common variance was primarily explained by a general additive genetic factor (loadings ranging 0.15-0.84). Future multivariate genetic analysis which incorporates Openness to Experience dimensions may improve the interpretation of these findings. (c) 2005 Elsevier Ltd. All rights reserved.
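The twin-based variance decomposition behind such heritability estimates can be illustrated with Falconer's classic approximation, which splits trait variance into additive genetic, shared-environment, and unique-environment components from monozygotic (MZ) and dizygotic (DZ) twin correlations. A minimal sketch; the correlations below are hypothetical, not values from the study:

```python
# Falconer's approximation from twin correlations:
#   h^2 = 2 * (r_MZ - r_DZ)   additive genetic share
#   c^2 = 2 * r_DZ - r_MZ     common (shared) environment share
#   e^2 = 1 - r_MZ            unique environment share

def falconer(r_mz, r_dz):
    """Decompose trait variance from MZ and DZ twin correlations."""
    h2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return h2, c2, e2

# Hypothetical correlations for one conscientiousness facet.
h2, c2, e2 = falconer(r_mz=0.40, r_dz=0.20)
print(f"h2={h2:.2f} c2={c2:.2f} e2={e2:.2f}")
```

A pattern like this (c² near zero, the rest split between genes and unique environment) matches the abstract's finding that common environment was judged unimportant; full model-fitting approaches (ACE structural models) refine this crude estimator.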
Abstract:
The natural history of the development of epithelial ovarian cancer remains obscure and no effective screening test exists. In several human malignancies progression from benign to invasive tumour occurs, but this sequence has not been established for epithelial ovarian cancer. We have reviewed epidemiological, histopathological and molecular studies of benign epithelial ovarian tumours to assess the evidence for and against such a progression in ovarian cancer. These data suggest that a diagnosis of a benign ovarian cyst or tumour is associated with an increased risk of ovarian cancer later in life. Current evidence also suggests that benign serous tumours can progress to low-grade serous cancer and that benign mucinous tumours can progress to mucinous cancer. The more common high-grade serous ovarian cancers are likely to arise de novo.
Abstract:
Nodulation in legumes provides a major conduit of available nitrogen into the biosphere. The development of nitrogen-fixing nodules results from a symbiotic interaction between soil bacteria, commonly called rhizobia, and legume plants. Molecular genetic analysis in both model and agriculturally important legume species has resulted in the identification of a variety of genes that are essential for the establishment, maintenance and regulation of this symbiosis. Autoregulation of nodulation (AON) is a major internal process by which nodule numbers are controlled through prior nodulation events. Characterisation of AON-deficient mutants has revealed a novel systemic signal transduction pathway controlled by a receptor-like kinase. This review reports our present level of understanding on the short- and long-distance signalling networks controlling early nodulation events and AON.
Abstract:
The general objective of this research is to survey, analyze, quantify, and classify by competency level the professionals recruited by Petrobras in the period following the discovery of the Brazilian pre-salt layer. The research is justified by the projected growth in national oil and natural gas production estimated for the coming years, which could create a mismatch between the supply of and demand for the labor needed for its development. The methodological approach was exploratory, descriptive, and documentary research, through longitudinal qualitative and quantitative analysis. As a result, the research revealed that Petrobras does not recruit professionals for managerial positions. The results also showed that 56.8% of the openings available for recruitment are intended for professionals with secondary-level education and that 76.4% of the openings are related to the manufacturing process, showing that Petrobras uses the hiring of secondary-level professionals with technical training as its entry point. In classifying and qualifying the recruitment openings, the research identified five groups of professionals distributed across three career tracks and four salary levels which, when categorized by competency level, accounted for 69% of all recruitment openings. The two most significant groups are related to the industrial operations career track, where the upper level (O6) and the lower level (O1) accounted for 22% and 21%, respectively, of the total openings in the period. The third group in importance concerns the engineering, processes, and projects career track, where professionals categorized at the middle level (E3) on a scale of two to five accounted for 13% of the total openings. The fourth and fifth groups are related to the business management career track, categorized at competency levels three (G3) and four (G4) on a scale of one to five, accounting for 7% and 6%, respectively, of the total openings.
Abstract:
The thesis consists of two complementary stages, theoretical and empirical, both essential for fulfilling the following proposed objectives: 1) confirm a shift in advertising praxis in which material and functional elements are minimized while immaterial and intangible elements are emphasized; 2) investigate, objectively and comparatively, the content of the corpus and demonstrate which themes prevail in different social periods, noting their tangible or intangible nature; 3) present and outline the "Market of the Immaterial", the sketch of a new market segmentation model grounded in immateriality. The theoretical and conceptual groundwork, rooted in Cultural Studies and in authors who discuss postmodernity and its consumption relations, is the starting point of this research; only through it do we reach the premises that give rise to the two closely interlinked hypotheses sustaining this thesis. The empirical research uses quantitative content analysis of the corpus, composed of carefully selected advertising videos, as its methodological procedure. Through this analytical and theoretical path, it is possible to conclude that we are in a post-material period governed by the intangible, which is transforming market communication.
Abstract:
The thesis presents a comparative study of the activities of public and private organizations in Brazil and of savings banks and private companies in Spain with respect to cultural marketing policy and its corporate communication strategies, against the backdrop of funds incentivized by governments through tax-exemption laws. With two main objectives, it first sought to identify the partnership between the State and companies in promoting culture. The second and main aim was to uncover the multiple strategies of cultural visibility in the corporate world through communication and marketing. In addition, the study examined the effects of this relationship on corporate social responsibility and reputation, as well as on the cultural market. The work was based mainly on qualitative content analysis by pairing and on AIM (Media Image Auditing). In conclusion, although the exemption laws differ between the two countries, corporate behavior is similar both in investments in culture and in communication strategies. In Brazil, however, the money comes almost entirely from the State's tax waiver, whereas in Spain it comes from the organizations themselves.
Abstract:
A series of N1-benzylideneheteroarylcarboxamidrazones was prepared in an automated fashion and tested against Mycobacterium fortuitum in a rapid screen for antimycobacterial activity. Many of the compounds from this series were also tested against Mycobacterium tuberculosis, and the usefulness of M. fortuitum as a rapid, initial screen for anti-tubercular activity was evaluated. Various deletions were made to the N1-benzylideneheteroarylcarboxamidrazone structure in order to establish the minimum structural requirements for activity. The N1-benzylideneheteroarylcarboxamidrazones were then subjected to molecular modelling studies, and their activities against M. fortuitum and M. tuberculosis were analysed using quantitative structure-activity relationship (QSAR) techniques in the computational package TSAR (Oxford Molecular Ltd.). A set of equations predictive of antimycobacterial activity was thereby obtained. The series of N1-benzylideneheteroarylcarboxamidrazones was also tested against a multidrug-resistant strain of Staphylococcus aureus (MRSA) and, where activity against MRSA was observed, against a panel of Gram-positive and Gram-negative bacteria. A set of antimycobacterial N1-benzylideneheteroarylcarboxamidrazones was thereby discovered, the best of which had MICs against M. fortuitum in the range 4-8 μg ml⁻¹ and displayed 94% inhibition of M. tuberculosis at a concentration of 6.25 μg ml⁻¹. The antimycobacterial activity of these compounds appeared to be specific, since the same compounds were shown to be inactive against other classes of organisms. Compounds found to be sufficiently active in any screen were also tested for toxicity against human mononuclear leucocytes. Polyethylene glycol (PEG) was used as a soluble polymeric support for the synthesis of some fatty acid derivatives containing an isoxazoline group, which may inhibit mycolic acid synthesis in mycobacteria. Both the PEG-bound products and the cleaved, isolated products themselves were tested against M. fortuitum, and some low levels of antimycobacterial activity were observed; these compounds may serve as leads for further studies.
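The core QSAR step described above, regressing measured activity on computed molecular descriptors to obtain a predictive equation, can be sketched with ordinary least squares. The descriptor values and activities below are invented for illustration; the study itself used the TSAR package with its own descriptor set:

```python
# Minimal QSAR-style fit: activity = slope * descriptor + intercept.
# Hypothetical (descriptor, activity) pairs for a series of analogues,
# e.g. a lipophilicity-like descriptor vs. log(1/MIC).
data = [(1.0, 2.1), (1.5, 2.9), (2.0, 4.2), (2.5, 5.0), (3.0, 6.1)]

n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)

# Closed-form ordinary least squares for one descriptor.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

print(f"activity ≈ {slope:.2f} * descriptor + {intercept:.2f}")
```

Real QSAR equations typically combine several descriptors (multiple regression) and are validated on held-out compounds; this one-descriptor fit just shows the shape of the resulting predictive equation.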
Abstract:
Preface. The evolution of cognitive neuroscience has been spurred by the development of increasingly sophisticated investigative techniques to study human cognition. In Methods in Mind, experts examine the wide variety of tools available to cognitive neuroscientists, paying particular attention to the ways in which different methods can be integrated to strengthen empirical findings and how innovative uses for established techniques can be developed. The book will be a uniquely valuable resource for the researcher seeking to expand his or her repertoire of investigative techniques. Each chapter explores a different approach. These include transcranial magnetic stimulation, cognitive neuropsychiatry, lesion studies in nonhuman primates, computational modeling, psychophysiology, single neurons and primate behavior, grid computing, eye movements, fMRI, electroencephalography, imaging genetics, magnetoencephalography, neuropharmacology, and neuroendocrinology. As mandated, authors focus on convergence and innovation in their fields; chapters highlight such cross-method innovations as the use of the fMRI signal to constrain magnetoencephalography, the use of electroencephalography (EEG) to guide rapid transcranial magnetic stimulation at a specific frequency, and the successful integration of neuroimaging and genetic analysis. Computational approaches depend on increased computing power, and one chapter describes the use of distributed or grid computing to analyze massive datasets in cyberspace. Each chapter author is a leading authority in the technique discussed.
Abstract:
Understanding the structures and functions of membrane proteins is an active area of research within bioscience. Membrane proteins are key players in essential cellular processes such as the uptake of nutrients, the export of waste products, and the way in which cells communicate with their environment. It is therefore not surprising that membrane proteins are targeted by over half of all prescription drugs. Since most membrane proteins are not abundant in their native membranes, it is necessary to produce them in recombinant host cells to enable further structural and functional studies. Unfortunately, achieving the required yields of functional recombinant membrane proteins is still a bottleneck in contemporary bioscience. This has highlighted the need for defined and rational optimization strategies based upon experimental observation rather than relying on trial and error. We have published a transcriptome and subsequent genetic analysis that has identified genes implicated in high-yielding yeast cells. These results have highlighted a role for alterations to a cell's protein synthetic capacity in the production of high yields of recombinant membrane protein: paradoxically, reduced protein synthesis favors higher yields. These results highlight a potential bottleneck at the protein folding or translocation stage of protein production.
Abstract:
This paper presents the design and results of a task-based user study, based on Information Foraging Theory, on a novel user interaction framework - uInteract - for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings have not only shown how effective and easy to use the uInteract framework is, but also illustrate the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.
Abstract:
Framing plays an important role in public policy. Interest groups strategically highlight some aspects of a policy proposal while downplaying others in order to steer the policy debate in a favorable direction. Despite the importance of framing, we still know relatively little about the framing strategies of interest groups due to methodological difficulties that have prevented scholars from systematically studying interest group framing across a large number of interest groups and multiple policy debates. This article therefore provides an overview of three novel research methods that allow researchers to systematically measure interest group frames. More specifically, this article introduces a word-based quantitative text analysis technique, a manual, computer-assisted content analysis approach and face-to-face interviews designed to systematically identify interest group frames. The results generated by all three techniques are compared on the basis of a case study of interest group framing in an environmental policy debate in the European Union.
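The word-based quantitative text analysis mentioned above rests on a simple idea: the relative frequency of frame-bearing words in a group's texts signals which aspects of a policy it emphasizes. A minimal sketch; the two documents and their vocabulary are invented, not drawn from the article's case study:

```python
from collections import Counter
import re

# Hypothetical position papers from two interest groups in an
# environmental policy debate.
docs = {
    "industry_group": "jobs growth costs competitiveness jobs burden costs growth regulation",
    "green_group": "climate emissions health climate biodiversity emissions future health",
}

def word_shares(text):
    """Relative frequency of each word: a crude frame signal."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return {w: n / total for w, n in counts.items()}

top_words = {}
for group, text in docs.items():
    shares = word_shares(text)
    top_words[group] = sorted(shares, key=shares.get, reverse=True)[:3]
    print(group, "emphasizes:", top_words[group])
```

In practice such word-based measures are refined with stop-word removal, stemming, and comparison against a reference corpus, and then triangulated against manual coding and interviews, as the article's three-method comparison suggests.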
Abstract:
A detailed quantitative numerical analysis of a partially coherent quasi-CW fiber laser is performed using the example of a high-Q-cavity Raman fiber laser. The key role of the precise spectral performance of the fiber Bragg gratings forming the laser cavity is clarified. It is shown that cross-phase modulation between the pump and Stokes waves does not affect the generation. Amplitudes of different longitudinal modes fluctuate strongly, obeying a Gaussian distribution. Since the intensity statistics are noticeably non-exponential, the longitudinal modes must be correlated. © 2011 SPIE.
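The inference in the last two sentences can be checked against the uncorrelated baseline: if the longitudinal-mode amplitudes were independent complex Gaussians, the total intensity would follow exponential (negative-exponential) statistics, for which the normalized second moment ⟨I²⟩/⟨I⟩² equals 2. A minimal sketch of that baseline (mode count and sample size are arbitrary):

```python
import random
import statistics

random.seed(1)
N_MODES = 20      # number of longitudinal modes (illustrative)
N_SAMPLES = 5000  # independent realizations of the total field

intensities = []
for _ in range(N_SAMPLES):
    # Sum independent complex-Gaussian mode amplitudes (no correlations).
    re = sum(random.gauss(0, 1) for _ in range(N_MODES))
    im = sum(random.gauss(0, 1) for _ in range(N_MODES))
    intensities.append(re * re + im * im)

mean_i = statistics.fmean(intensities)
mean_i2 = statistics.fmean(i * i for i in intensities)
# Exponential intensity statistics give <I^2>/<I>^2 = 2.
print(f"normalized second moment: {mean_i2 / mean_i**2:.2f}")
```

A measured or simulated ratio significantly different from 2 is therefore evidence of inter-mode correlations, which is the logic the abstract applies.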
Abstract:
The aim of the case study is to express the delayed repair time impact on the revenues and profit in numbers with the example of the outage of power plant units. Main steps of risk assessment: • creating project plan suitable for risk assessment • identification of the risk factors for each project activities • scenario-analysis based evaluation of risk factors • selection of the critical risk factors based on the results of quantitative risk analysis • formulating risk response actions for the critical risks • running Monte-Carlo simulation [1] using the results of scenario-analysis • building up a macro which creates the connection among the results of the risk assessment, the production plan and the business plan.
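The scenario-analysis and Monte Carlo steps above can be sketched by sampling each repair activity from a triangular distribution built from its three-point (optimistic / most likely / pessimistic) estimate and summing the samples into an outage duration. All activity names, durations, and the revenue figure below are hypothetical:

```python
import random
import statistics

random.seed(7)

# Hypothetical three-point estimates, in days, for the repair activities
# of one power-plant unit: (optimistic, most likely, pessimistic).
activities = {
    "disassembly": (3, 5, 10),
    "spare-part delivery": (5, 8, 20),
    "repair work": (10, 14, 25),
    "testing and restart": (2, 3, 6),
}
REVENUE_PER_DAY = 120_000.0  # hypothetical lost revenue per outage day

def simulate_outage():
    """One Monte Carlo trial: sample every activity's duration."""
    return sum(random.triangular(lo, hi, mode)
               for lo, mode, hi in activities.values())

trials = [simulate_outage() for _ in range(10_000)]
expected_days = statistics.fmean(trials)
p90_days = sorted(trials)[int(0.9 * len(trials))]

print(f"expected outage: {expected_days:.1f} days")
print(f"90th percentile: {p90_days:.1f} days")
print(f"expected lost revenue: {expected_days * REVENUE_PER_DAY:,.0f}")
```

The spread between the expected value and the 90th percentile is what feeds the business plan: it quantifies how much revenue is at risk if the critical activities slip toward their pessimistic scenarios.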