998 results for Algorithms genetics
Abstract:
Dissertation presented in fulfillment of the requirements for the Degree of Doctor of Philosophy in Biology (Molecular Genetics) at the Instituto de Tecnologia Química e Biológica da Universidade Nova de Lisboa
Abstract:
Doctoral thesis in Anthropology, with a specialization in Biological Anthropology and Ethnoecology
Abstract:
It is often necessary to work with categorical variables, yet only a limited number of analyses address them. A good segmentation technique is the grade of membership (GoM) model, widely used in medicine, psychology, and sociology. This methodology has an appealing interpretation based on extreme profiles (segments) and degrees of membership. However, estimating the model parameters by maximum likelihood is highly complex. This work therefore proposes the use of genetic algorithms to reduce the complexity and computation time and to increase accuracy. The technique is named Genetic Algorithms grade of membership (GA-GoM). To assess its effectiveness, the model was first studied by simulation: a factorial experiment was run varying the number of segments and the number of variables. A practical case of social media engagement segmentation was then addressed. The results are superior for models of higher complexity. We conclude that the approach is useful for large databases containing categorical data.
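The abstract above describes the GA-GoM idea in words only; the sketch below is a hypothetical Python illustration of how a genetic algorithm can search the membership simplex for grade-of-membership scores by maximizing a GoM likelihood. The profile probabilities lam, the data x, and all GA settings are invented for the example and are not taken from the thesis.

```python
# Minimal sketch (illustrative, not the thesis' GA-GoM implementation):
# a genetic algorithm searching for the membership vector of one individual
# given fixed extreme-profile probabilities.
import numpy as np

rng = np.random.default_rng(0)

K = 3   # number of extreme profiles (segments)
J = 5   # number of categorical variables
L = 4   # number of levels per variable

# Hypothetical extreme-profile probabilities: lam[k, j, l] = P(x_j = l | profile k)
lam = rng.dirichlet(np.ones(L), size=(K, J))

# One individual's observed categorical responses (levels 0..L-1), also illustrative
x = rng.integers(0, L, size=J)

def fitness(g):
    """GoM log-likelihood of x given a membership vector g that sums to 1."""
    # P(x_j = observed level) = sum_k g_k * lam[k, j, x_j]
    probs = np.einsum('k,kj->j', g, lam[:, np.arange(J), x])
    return np.sum(np.log(probs + 1e-12))

# --- genetic algorithm over membership vectors on the simplex ---
pop_size, n_gen, mut_rate = 50, 200, 0.1
pop = rng.dirichlet(np.ones(K), size=pop_size)

for gen in range(n_gen):
    scores = np.array([fitness(g) for g in pop])
    # selection: keep the better half of the population
    parents = pop[np.argsort(scores)[::-1][:pop_size // 2]]
    # crossover: a convex combination of two parents stays on the simplex
    idx = rng.integers(0, len(parents), size=(pop_size, 2))
    w = rng.random((pop_size, 1))
    children = w * parents[idx[:, 0]] + (1 - w) * parents[idx[:, 1]]
    # mutation: perturb a few children and re-normalize onto the simplex
    mask = rng.random(pop_size) < mut_rate
    children[mask] += rng.normal(0, 0.1, size=(mask.sum(), K))
    children = np.clip(children, 1e-6, None)
    children /= children.sum(axis=1, keepdims=True)
    pop = children

best = pop[np.argmax([fitness(g) for g in pop])]
print("estimated membership:", np.round(best, 3))
```

In the full GA-GoM setting the extreme-profile probabilities and the memberships of all individuals would be estimated jointly, with the fitness being the likelihood over the whole data set; the sketch only shows the GA machinery (selection, simplex-preserving crossover, mutation) on the simplest sub-problem.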
Abstract:
In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT), together with appropriate techniques for maximizing FFT/IFFT execution speed, such as pipelining or parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUT) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures that best apply to various FFT/IFFT algorithms, along with their ability to exploit parallel processing with minimal data dependencies in the FFT/IFFT calculations. An interesting approach also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the ability of the FFT/IFFT hardware to support the parallelism provided by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also provide high performance when implemented on a specialized FFT/IFFT hardware architecture that can exploit the parallelism of the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in the polar coordinate system, using sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
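As a concrete illustration of that last point, the following minimal sketch (not code from the paper; the transform size and symbol values are invented for the example) shows LUT-based handling of twiddle factors: when symbols are kept in polar form with the phase stored as an index into precomputed sine/cosine tables, a complex multiplication reduces to one real product and one modular index addition.

```python
# Illustrative sketch of LUT-based polar multiplication for FFT twiddle factors.
import numpy as np

N = 64                 # FFT size (illustrative)
phase_steps = N        # table resolution: one entry per 2*pi/N
cos_lut = np.cos(2 * np.pi * np.arange(phase_steps) / phase_steps)
sin_lut = np.sin(2 * np.pi * np.arange(phase_steps) / phase_steps)

def polar_mul(mag_a, idx_a, mag_b, idx_b):
    """Multiply two symbols stored as (magnitude, phase index).

    The phase is kept as an integer index into the LUT, so the 'multiplication'
    is just a real product plus a modular index addition."""
    return mag_a * mag_b, (idx_a + idx_b) % phase_steps

def to_cartesian(mag, idx):
    """Convert back to rectangular form using the precomputed tables."""
    return mag * cos_lut[idx] + 1j * mag * sin_lut[idx]

# Example: multiply a symbol by the twiddle factor W_N^k = exp(-2j*pi*k/N)
k = 5
symbol = (2.0, 12)                      # magnitude 2, phase 12 * 2*pi/N
twiddle = (1.0, (-k) % phase_steps)     # unit magnitude, phase -2*pi*k/N
mag, idx = polar_mul(*symbol, *twiddle)

# Cross-check against ordinary complex arithmetic
expected = to_cartesian(*symbol) * np.exp(-2j * np.pi * k / N)
print(np.allclose(to_cartesian(mag, idx), expected))   # True
```

In hardware, the index addition maps to a small modular adder and the table reads to ROM look-ups, which is the kind of simplification the abstract refers to.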
Abstract:
The dispersal process, by which individuals or other dispersing agents such as gametes or seeds move from birthplace to a new settlement locality, has important consequences for the dynamics of genes, individuals, and species. Many of the questions addressed by ecology and evolutionary biology require a good understanding of species' dispersal patterns. Much effort has thus been devoted to overcoming the difficulties associated with dispersal measurement. In this context, genetic tools have long been the focus of intensive research, providing a great variety of potential solutions to measuring dispersal. This methodological diversity is reviewed here to help (molecular) ecologists find their way toward dispersal inference and interpretation and to stimulate further developments.
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning are being used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful families of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes exhibiting different patterns at different scales is learned automatically from the data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data and can therefore provide an efficient means of modelling local anomalies such as those that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity from measurements taken in the region of Briansk following the Chernobyl accident.
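The paper's multi-scale SVR is described here only in prose; as a rough illustration of the underlying idea, and not the authors' formulation, the sketch below fits a standard SVR whose kernel is a weighted sum of a short-scale and a large-scale RBF kernel, so that a local anomaly and the regional trend are modelled jointly. All length scales, weights, and the synthetic data are assumptions made for the example.

```python
# Illustrative sketch: SVR with a two-scale mixture-of-RBF kernel (not the
# paper's multi-scale SVR; parameters and data are synthetic).
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

def multi_scale_kernel(X, Y, gamma_short=10.0, gamma_large=0.1, w_short=0.5):
    """Weighted mixture of a short-scale and a large-scale RBF kernel."""
    return (w_short * rbf_kernel(X, Y, gamma=gamma_short)
            + (1.0 - w_short) * rbf_kernel(X, Y, gamma=gamma_large))

# Synthetic 1-D "spatial" data: a smooth regional trend plus a sharp local anomaly
rng = np.random.default_rng(42)
x = np.sort(rng.uniform(0, 10, 200)).reshape(-1, 1)
y = np.sin(0.5 * x).ravel() + 2.0 * np.exp(-((x.ravel() - 7.0) ** 2) / 0.05)
y += rng.normal(0, 0.05, size=y.shape)

# scikit-learn's SVR accepts a callable kernel computing the Gram matrix
model = SVR(kernel=multi_scale_kernel, C=10.0, epsilon=0.01)
model.fit(x, y)

x_grid = np.linspace(0, 10, 500).reshape(-1, 1)
y_hat = model.predict(x_grid)
print("fitted %d support vectors; prediction range [%.2f, %.2f]"
      % (len(model.support_), y_hat.min(), y_hat.max()))
```

With a single-scale RBF kernel one must choose between smoothing away the anomaly or overfitting the trend; the mixture kernel lets both components contribute, which is the intuition behind modelling several spatial scales at once.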
Abstract:
A new study shows that wood ant queens selectively pass the maternally-inherited half of their genome to their daughters and the paternally-inherited half to their sons. This system, which most likely evolved from ancestral hybridization, creates distinct genetic lineages.
Abstract:
Hypertension is a common, modifiable and heritable cardiovascular risk factor. Some rare monogenic forms of hypertension have been described, but the majority of patients suffer from "essential" hypertension, for whom the underlying pathophysiological mechanism is not clear. Essential hypertension is a complex trait, involving multiple genes and environmental factors. Recently, progress in the identification of common genetic variants associated with blood pressure and hypertension has been made thanks to large-scale international collaborative projects involving geneticists, epidemiologists, statisticians and clinicians. In this article, we review some basic genetic concepts and the main research methods used to study the genetics of hypertension, as well as selected recent findings in this field.
Abstract:
To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head-scatter fluence (>40%), which is not adequately modelled by the TPS. Based on MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. Contrary to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared with conventional techniques and may offer the optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).
Abstract:
One third of the population is affected by a sleep disorder with a major social, medical, and economic impact. Although very little is known about the genetics of normal sleep, familial and twin studies indicate an important influence of genetic factors. Most sleep disorders run in families, and in several of them the contribution of genetic factors is increasingly recognised. With recent advances in the genetics of narcolepsy and the role of the hypocretin/orexin system, the possibility that other gene defects may contribute to the pathophysiology of major sleep disorders is worth in-depth investigation.
Abstract:
Extracellular calcium participates in several key physiological functions, such as the control of blood coagulation, bone calcification or muscle contraction. Calcium homeostasis in humans is regulated in part by genetic factors, as illustrated by rare monogenic diseases characterized by hypo- or hypercalcaemia. Both serum calcium and urinary calcium excretion are heritable continuous traits in humans. Serum calcium levels are tightly regulated by two main hormonal systems, i.e. parathyroid hormone and vitamin D, which are themselves also influenced by genetic factors. Recent technological advances in molecular biology allow for the screening of the human genome at an unprecedented level of detail and using hypothesis-free approaches, such as genome-wide association studies (GWAS). GWAS identified novel loci for calcium-related phenotypes (i.e. serum calcium and 25-OH vitamin D) that shed new light on the biology of calcium in humans. The substantial overlap (i.e. CYP24A1, CASR, GATA3, CYP2R1) between genes involved in rare monogenic diseases and genes located within loci identified in GWAS suggests a genetic and phenotypic continuum between monogenic diseases of calcium homeostasis and slight disturbances of calcium homeostasis in the general population. Future studies using whole-exome and whole-genome sequencing will further advance our understanding of the genetic architecture of calcium homeostasis in humans. These findings will likely provide new insight into the complex mechanisms involved in calcium homeostasis and hopefully lead to novel preventive and therapeutic approaches. Keywords: calcium, monogenic, genome-wide association studies, genetics.