922 results for Genetic Algorithms and Simulated Annealing


Relevance: 100.00%

Abstract:

We introduce a flexible visual data mining framework which combines advanced projection algorithms from the machine learning domain and visual techniques developed in the information visualization domain. The advantage of such an interface is that the user is directly involved in the data mining process. We integrate principled projection algorithms, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates and billboarding, to provide a visual data mining framework. Results on a real-life chemoinformatics dataset using GTM are promising and have been analytically compared with the results from the traditional projection methods. It is also shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework. Copyright 2006 ACM.

Relevance: 100.00%

Abstract:

Four bar mechanisms are basic components of many important mechanical devices. The kinematic synthesis of four bar mechanisms is a difficult design problem. A novel method that combines the genetic programming and decision tree learning methods is presented. We give a structural description for the class of mechanisms that produce desired coupler curves. Constructive induction is used to find and characterize feasible regions of the design space. Decision trees constitute the learning engine, and the new features are created by genetic programming.

Relevance: 100.00%

Abstract:

Purpose: To compare the monochromatic aberrations of keratoconic eyes when uncorrected, when corrected with spherically-powered RGP (rigid gas-permeable) contact lenses, and when corrected using simulations of customised soft contact lenses for different magnitudes of rotation (up to 15°) and translation (up to 1 mm) from their ideal position. Methods: The ocular aberrations of examples of mild, moderate and severe keratoconic eyes were measured when uncorrected and when wearing their habitual RGP lenses. Residual aberrations and point-spread functions of each eye wearing an ideal, customised soft contact lens (designed to neutralise higher-order aberrations, HOA) were calculated as a function of the angle of rotation of the lens from its ideal orientation and of its horizontal and vertical translation. Results: In agreement with the results of other authors, the RGP lenses markedly reduced both lower-order aberrations and HOA for all three patients. When compared with the RGP lens corrections, the customised lens simulations only provided optical improvements if their movements were constrained within limits which appear to be difficult to achieve with current technologies. Conclusions: At the present time, customised contact lens corrections appear likely to offer, at best, only minor optical improvements over RGP lenses for patients with keratoconus. If made in soft materials, however, these lenses may be preferred by patients in terms of comfort. © 2012 The College of Optometrists.

Relevance: 100.00%

Abstract:

The aim of this work is the implementation of a distributed genetic algorithm (the so-called island model) to accelerate the search for the optimum in the solution space. A distributed genetic algorithm is also less likely to become trapped in a local optimum. The concept relies on the mutual cooperation of clients, each of which runs a separate genetic algorithm on its local machine. Java, a technology created for networked applications, was chosen as the tool for implementing the distributed genetic algorithm. Java provides a remote method invocation mechanism, Java RMI, by means of which objects can be exchanged between the clients and the RMI server.
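The island model described above can be sketched compactly. The sketch below is in Python rather than the Java RMI implementation the abstract describes: it evolves several isolated populations and periodically migrates each island's best individual to its ring neighbour. The toy objective, population sizes and migration scheme are illustrative assumptions, not the paper's implementation.

```python
import random

def fitness(x):
    # toy objective with a known optimum at x = 3
    return -(x - 3.0) ** 2

def evolve(pop, generations=30):
    """Run a simple mutation-only GA on one island."""
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]               # truncation selection
        children = [p + random.gauss(0, 0.1) for p in parents]
        pop = parents + children
    return pop

def island_ga(n_islands=4, pop_size=20, epochs=5):
    """Island model: isolated populations with periodic ring migration."""
    random.seed(1)
    islands = [[random.uniform(-10, 10) for _ in range(pop_size)]
               for _ in range(n_islands)]
    for _ in range(epochs):
        islands = [evolve(pop) for pop in islands]
        # each island sends its best individual to the next island,
        # replacing that island's worst individual
        bests = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            pop[pop.index(min(pop, key=fitness))] = bests[i - 1]
    return max((ind for pop in islands for ind in pop), key=fitness)
```

In the RMI setting, each `evolve` call would run on a separate client machine and the migration step would be the only communication through the server.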

Relevance: 100.00%

Abstract:

Factors associated with survival were studied in 84 neuropathologically documented cases of the pre-senile dementia frontotemporal lobar degeneration (FTLD) with transactive response (TAR) DNA-binding protein of 43 kDa (TDP-43) proteinopathy (FTLD-TDP). Kaplan-Meier survival analysis estimated mean survival as 7.9 years (range: 1-19 years, SD = 4.64). Familial and sporadic cases exhibited similar survival, including progranulin (GRN) gene mutation cases. No significant differences in survival were associated with sex, disease onset, Braak disease stage, or disease subtype, but higher survival was associated with lower post-mortem brain weight. Survival was significantly reduced in cases with associated motor neuron disease (FTLD-MND) but increased with Alzheimer's disease (AD) or hippocampal sclerosis (HS) co-morbidity. Cox regression analysis suggested that reduced survival was associated with increased densities of neuronal cytoplasmic inclusions (NCI), while increased survival was associated with greater densities of enlarged neurons (EN) in the frontal and temporal lobes. The data suggest that: (1) survival in FTLD-TDP is more prolonged than is typical in pre-senile dementia but shorter than in some clinical subtypes such as the semantic variant of primary progressive aphasia (svPPA), (2) MND co-morbidity predicts poor survival, and (3) NCI may develop early and EN later in the disease. The data have implications for both neuropathological characterization and subtyping of FTLD-TDP.
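The Kaplan-Meier estimate used above can be computed in a few lines. This is a generic sketch with hypothetical survival times, not the study's data; censored cases are flagged with event = 0.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up time for each case
    events : 1 = death observed at that time, 0 = censored
    Returns {time: estimated survival probability S(t)}.
    """
    at_risk = len(times)
    surv, curve = 1.0, {}
    for t in sorted(set(times)):
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e)
        surv *= (at_risk - deaths) / at_risk   # product-limit step
        curve[t] = surv
        # everyone with follow-up ending at t (death or censoring) leaves
        at_risk -= sum(1 for ti in times if ti == t)
    return curve
```

For example, with three deaths at years 1, 2 and 3, survival steps down to 2/3, 1/3 and 0; censoring a case instead leaves the curve flat at that time.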

Relevance: 100.00%

Abstract:

Climate warming is predicted to cause an increase in the growing season by as much as 30% for regions of the arctic tundra. This will have a significant effect on the physiological activity of the vascular plant species and the ecosystem as a whole. The need to understand possible physiological change within this ecosystem is confounded by the fact that research in this extreme environment has been limited to periods when conditions are most favorable, mid June to mid August. This study attempted to develop the most comprehensive understanding to date of the physiological activity of seven tundra plant species in the Alaskan Arctic under natural and lengthened growing season conditions. Four interrelated lines of research, scaling from cellular signals to ecosystem processes, set the foundation for this study.

I established an experiment examining the physiological response of arctic sedges to soil temperature stress, with emphasis on the role of the hormone abscisic acid (ABA). A manipulation was also developed in which the growing season was lengthened and soils were warmed in an attempt to determine the maximum physiological capacity of these seven vascular species. Additionally, the physiological capacities of four evergreens were tested in the subnivean environment, along with the potential role anthocyanins play in their activity. The measurements were scaled up to determine the physiological role of these evergreens in maintaining ecosystem carbon fluxes.

These studies determined that soil temperature differentials significantly affect vascular plant physiology. ABA appears to be a physiological modifier that limits stomatal processes when root temperatures are low. Photosynthetic capacity was limited by internal plant physiological mechanisms in the face of a lengthened growing season. Therefore, shifts in ecosystem carbon dynamics are driven by changes in species composition and biomass production per unit area.
These studies also found that changes in soil temperature will have a greater effect on physiological processes than would the same magnitude of change in air temperature. The subnivean environment exhibits conditions that are favorable for photosynthetic activity in evergreen species. These measurements, when scaled to the ecosystem, play a significant role in limiting the system's carbon source capacity.

Relevance: 100.00%

Abstract:

Antenna design is an iterative process in which structures are analyzed and changed to comply with certain required performance parameters. The classic approach starts with analyzing a "known" structure, obtaining the value of its performance parameter and changing this structure until the "target" value is achieved. This process relies on having an initial structure, which follows some known or "intuitive" patterns already familiar to the designer. The purpose of this research was to develop a method of designing UWB antennas. What is new in this proposal is that the design process is reversed: the designer starts with the target performance parameter and obtains a structure as the result of the design process. This method provided a new way to replicate and optimize existing performance parameters. The base of the method was the use of a Genetic Algorithm (GA) adapted to the format of the chromosome that is evaluated by the Electromagnetic (EM) solver. For the electromagnetic study we used the XFDTD™ program, based on the Finite-Difference Time-Domain technique. The programming portion of the method was created under the MatLab environment, which serves as the interface for converting chromosomes, handling file formats and transferring data between XFDTD™ and the GA. A high level of customization had to be written into the code to work with the specific files generated by the XFDTD™ program. Two types of cost functions were evaluated: the first seeking broadband performance within the UWB band, and the second searching for curve replication of a reference geometry. The performance of the method was evaluated considering the speed provided by the computer resources used. Balance between accuracy, data file size and speed of execution was achieved by defining parameters in the GA code as well as changing the internal parameters of the XFDTD™ projects.
The results showed that the GA produced geometries that were analyzed by the XFDTD™ program and changed following the search criteria until reaching the target value of the cost function. Results also showed how the parameters can change the search criteria and influence the running of the code to provide a variety of geometries.
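The GA-solver loop described above can be sketched, with the caveat that the XFDTD™/MatLab coupling is specific to the thesis. In this Python sketch, a trivial stand-in `evaluate` function plays the role of the EM solver, and the target value, chromosome length and GA parameters are all illustrative assumptions.

```python
import random

TARGET = 0.7  # hypothetical target value of the performance parameter

def evaluate(chromosome):
    """Stand-in for the EM solver (XFDTD in the thesis): here the
    'performance' is simply the fill fraction of a pixelated geometry."""
    return sum(chromosome) / len(chromosome)

def cost(chromosome):
    # distance between achieved and target performance
    return abs(evaluate(chromosome) - TARGET)

def ga(bits=32, pop_size=30, generations=60):
    """Evolve bit-string geometries toward the target cost."""
    random.seed(0)
    pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]             # elitist truncation
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, bits)          # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(bits)] ^= 1       # single-bit mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)
```

In the thesis, `evaluate` would write the chromosome out as an XFDTD™ project file, run the solver, and parse the resulting performance curve.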

Relevance: 100.00%

Abstract:

A job shop with one batch processing machine and several discrete machines is analyzed. Given a set of jobs, their process routes, processing requirements, and sizes, the objective is to schedule the jobs such that the makespan is minimized. The batch processing machine can process a batch of jobs as long as the machine capacity is not violated. The batch processing time is equal to the longest processing time among the jobs in the batch. The problem under study can be represented as Jm:batch:Cmax. If no batches were formed, the scheduling problem under study reduces to the classical job shop scheduling problem (i.e., Jm::Cmax), which is known to be NP-hard. This research extends the scheduling literature by combining Jm::Cmax with batch processing. The primary contributions are the mathematical formulation, a new network representation and several solution approaches. The problem under study is observed widely in metal working and other industries, but has received limited or no attention due to its complexity. A novel network representation of the problem using disjunctive and conjunctive arcs, and a mathematical formulation, are proposed to minimize the makespan. In addition, several algorithms, such as batch-forming heuristics, dispatching rules, Modified Shifting Bottleneck, Tabu Search (TS) and Simulated Annealing (SA), were developed and implemented. An experimental study was conducted to evaluate the proposed heuristics, and the results were compared to those from a commercial solver (i.e., CPLEX). TS and SA, with the combination of MWKR-FF as the initial solution, gave the best solutions among all the heuristics proposed. Their results were close to CPLEX; and for some larger instances, with total operations greater than 225, they were competitive in terms of solution quality and runtime. For some larger problem instances, CPLEX was unable to report a feasible solution even after running for several hours.
Between TS and SA, the experimental study indicated that SA produced a better average Cmax across all instances. The solution approaches proposed will help practitioners schedule a job shop (with both discrete and batch processing machines) more efficiently. The proposed solution approaches are easy to implement and require short run times to solve large problem instances.
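As a hedged sketch of the batch-then-anneal idea (not the dissertation's Jm:batch:Cmax implementation), the code below decodes a job permutation into batches first-fit on a single batch machine, takes each batch's processing time as the longest job in it, and improves the order by simulated annealing with swap moves. The job data and SA parameters are invented for illustration.

```python
import math
import random

CAPACITY = 10
# hypothetical jobs: (size, processing time)
JOBS = [(4, 3), (5, 6), (3, 2), (6, 5), (2, 4), (4, 7)]

def makespan(order):
    """Decode a job order into batches first-fit; a batch's processing
    time is the longest job in it, and the makespan is the batch sum."""
    batches = []  # each entry: [remaining capacity, batch processing time]
    for j in order:
        size, ptime = JOBS[j]
        for b in batches:
            if b[0] >= size:
                b[0] -= size
                b[1] = max(b[1], ptime)
                break
        else:
            batches.append([CAPACITY - size, ptime])
    return sum(b[1] for b in batches)

def anneal(temp=10.0, cooling=0.95, steps=3000):
    """Simulated annealing over job orders with swap-neighbour moves."""
    random.seed(0)
    cur = list(range(len(JOBS)))
    cur_cost = makespan(cur)
    best_cost = cur_cost
    for _ in range(steps):
        i, k = random.sample(range(len(JOBS)), 2)
        cand = cur[:]
        cand[i], cand[k] = cand[k], cand[i]
        delta = makespan(cand) - cur_cost
        # accept improvements always, worsenings with Boltzmann probability
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            cur, cur_cost = cand, cur_cost + delta
            best_cost = min(best_cost, cur_cost)
        temp = max(temp * cooling, 1e-6)
    return best_cost
```

With six jobs the optimum can be brute-forced for comparison; real Jm:batch:Cmax instances also need routing constraints and the discrete machines, which this sketch omits.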

Relevance: 100.00%

Abstract:

The importance of resource supply and herbivory in driving competitive interactions among species has been an important but contentious issue within ecology. These variables exhibit different effects on species competition when manipulated in isolation but interact when manipulated together. I tested the direct and interactive effects of nutrient addition and simulated grazing (clipping) on the competitive performance of primary producers and the community structure of a seagrass bed in South Florida. One square meter experimental plots were established in a mixed seagrass meadow from August 2007 to July 2009. The experiment was a 3 x 3 factorial design: 3 fertility treatments (control, medium (2.4 mg N day−1 and 80 µg P day−1) and high (4.8 mg N day−1 and 160 µg P day−1)) x 3 clipping intensities (0%, 25% and 50% biomass removal (G)) x 5 replicates for each treatment = 45 plots. Nutrient additions and simulated grazing were applied every two months. Fertilization and simulated grazing decreased sexual reproduction in S. filiforme. Fertilization increased competitive dominance within the primary producers, while simulated grazing counteracted this effect by removing the dominant species. Fertilization ameliorated the negative impacts of simulated grazing, while simulated grazing prevented competitive exclusion in the fertilized plots. Nutrient addition and simulated grazing both exerted strong control on plant performance and community structure. Neither bottom-up nor top-down influences were eliminated in treatments where both factors were present. The effects of fertilization on plant performance were marked under all clipping intensities, indicating that the system is regulated by nutrient availability in both the presence and absence of grazers. Clipping effects were strong under both fertilized and unfertilized conditions, indicating that the seagrass bed can simultaneously be under top-down control by grazers.

Relevance: 100.00%

Abstract:

Personalized recommender systems aim to assist users in retrieving and accessing interesting items by automatically acquiring user preferences from the historical data and matching items with the preferences. In the last decade, recommendation services have gained great attention due to the problem of information overload. However, despite recent advances of personalization techniques, several critical issues in modern recommender systems have not been well studied. These issues include: (1) understanding the accessing patterns of users (i.e., how to effectively model users' accessing behaviors); (2) understanding the relations between users and other objects (i.e., how to comprehensively assess the complex correlations between users and entities in recommender systems); and (3) understanding the interest change of users (i.e., how to adaptively capture users' preference drift over time). To meet the needs of users in modern recommender systems, it is imperative to provide solutions to address the aforementioned issues and apply the solutions to real-world applications.

The major goal of this dissertation is to provide integrated recommendation approaches to tackle the challenges of the current generation of recommender systems. In particular, three user-oriented aspects of recommendation techniques were studied, including understanding accessing patterns, understanding complex relations and understanding temporal dynamics. To this end, we made three research contributions. First, we presented various personalized user profiling algorithms to capture click behaviors of users from both coarse- and fine-grained granularities; second, we proposed graph-based recommendation models to describe the complex correlations in a recommender system; third, we studied temporal recommendation approaches in order to capture the preference changes of users, by considering both long-term and short-term user profiles. In addition, a versatile recommendation framework was proposed, in which the proposed recommendation techniques were seamlessly integrated. Different evaluation criteria were implemented in this framework for evaluating recommendation techniques in real-world recommendation applications.

In summary, the frequent changes of user interests and item repository lead to a series of user-centric challenges that are not well addressed in the current generation of recommender systems. My work proposed reasonable solutions to these challenges and provided insights on how to address these challenges using a simple yet effective recommendation framework.
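The graph-based recommendation idea can be illustrated with a minimal sketch: a two-hop walk on the user-item bipartite graph scores unseen items by how often they are reachable through the user's neighbours. The access log and scoring rule are illustrative assumptions, not the dissertation's models.

```python
from collections import defaultdict

# hypothetical access log: (user, item) click events
LOG = [("u1", "a"), ("u1", "b"), ("u2", "a"), ("u2", "c"),
       ("u3", "b"), ("u3", "c"), ("u3", "d")]

def recommend(user, log):
    """Score unseen items by two-hop reachability on the bipartite
    graph: user -> item -> neighbour user -> new item."""
    items_of = defaultdict(set)
    users_of = defaultdict(set)
    for u, i in log:
        items_of[u].add(i)
        users_of[i].add(u)
    seen = items_of[user]
    scores = defaultdict(int)
    for i in seen:                      # user -> item
        for v in users_of[i]:           # item -> neighbour user
            if v == user:
                continue
            for j in items_of[v]:       # neighbour user -> new item
                if j not in seen:
                    scores[j] += 1
    # highest score first; ties broken alphabetically
    return sorted(scores, key=lambda j: (-scores[j], j))
```

Temporal dynamics could be layered on top by weighting log entries by recency, which is the spirit (though not the detail) of the long-/short-term profiles mentioned above.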

Relevance: 100.00%

Abstract:

Family health history (FHH) in the context of risk assessment has been shown to positively impact risk perception and behavior change. The added value of genetic risk testing is less certain. The aim of this study was to determine the impact of Type 2 Diabetes (T2D) FHH and genetic risk counseling on behavior and its cognitive precursors. Subjects were non-diabetic patients randomized to counseling that included FHH +/- T2D genetic testing. Measurements included weight, BMI and fasting glucose at baseline and 12 months, and surveys of behavior and cognitive precursors (T2D risk perception and control over disease development) at baseline, 3, and 12 months. 391 subjects enrolled, of whom 312 completed the study. Behavioral and clinical outcomes did not differ across FHH or genetic risk, but cognitive precursors did. Higher FHH risk was associated with a stronger perceived T2D risk (pKendall < 0.001) and with a perception of "serious" risk (pKendall < 0.001). Genetic risk did not influence risk perception, but was correlated with an increase in perception of "serious" risk for moderate (pKendall = 0.04) and average FHH risk subjects (pKendall = 0.01), though not for the high FHH risk group. Perceived control over T2D risk was high and not affected by FHH or genetic risk. FHH appears to have a strong impact on cognitive precursors of behavior change, suggesting it could be leveraged to enhance risk counseling, particularly when lifestyle change is desirable. Genetic risk was able to alter perceptions about the seriousness of T2D risk in those with moderate and average FHH risk, suggesting that FHH could be used to selectively identify individuals who may benefit from genetic risk testing.

Relevance: 100.00%

Abstract:

Genetic decoding is not ‘frozen’ as was earlier thought, but dynamic. One facet of this is frameshifting that often results in synthesis of a C-terminal region encoded by a new frame. Ribosomal frameshifting is utilized for the synthesis of additional products, for regulatory purposes and for translational ‘correction’ of problem or ‘savior’ indels. Utilization for synthesis of additional products occurs prominently in the decoding of mobile chromosomal element and viral genomes. One class of regulatory frameshifting of stable chromosomal genes governs cellular polyamine levels from yeasts to humans. In many cases of productively utilized frameshifting, the proportion of ribosomes that frameshift at a shift-prone site is enhanced by specific nascent peptide or mRNA context features. Such mRNA signals, which can be 5′ or 3′ of the shift site or both, can act by pairing with ribosomal RNA or as stem loops or pseudoknots even with one component being 4 kb 3′ from the shift site. Transcriptional realignment at slippage-prone sequences also generates productively utilized products encoded trans-frame with respect to the genomic sequence. This too can be enhanced by nucleic acid structure. Together with dynamic codon redefinition, frameshifting is one of the forms of recoding that enriches gene expression.
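A toy sketch can make the reading-frame arithmetic concrete: a −1 frameshift repositions the ribosome one nucleotide back, so the downstream product is encoded in a new frame. The code below uses a deliberately tiny codon table and a fabricated sequence; it illustrates only the frame change, not slippage mechanics or the mRNA signals described above.

```python
# deliberately reduced toy codon table (a handful of entries only)
CODONS = {"ATG": "M", "AAA": "K", "GGG": "G",
          "AGG": "R", "GTA": "V", "TAA": "*"}

def translate(dna, shift_at=None):
    """Translate in frame 0; if shift_at is given, slip back one
    nucleotide (a -1 frameshift) once that position is reached."""
    protein, i = "", 0
    while i + 3 <= len(dna):
        if shift_at is not None and i >= shift_at:
            i -= 1                 # the ribosome slips back one base
            shift_at = None        # the shift happens only once
        codon = dna[i:i + 3]
        aa = CODONS.get(codon, "?")
        if aa == "*":              # stop codon ends translation
            break
        protein += aa
        i += 3
    return protein
```

On the fabricated sequence `ATGAAAGGGTAA`, zero-frame decoding stops at the TAA stop codon, while a slip after the second codon reads through it in the −1 frame and yields a longer, trans-frame C-terminal region.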

Relevance: 100.00%

Abstract:

The Caribbean genus Pseudophoenix (Arecaceae) has its center of taxonomic diversity in Hispaniola (Haiti and the Dominican Republic). Three species (P. ekmanii, P. lediniana, and P. vinifera) are restricted to this island. In this thesis I investigated the population genetic diversity and structure of Pseudophoenix using ten microsatellite loci. Results showed homozygote excess and high inbreeding coefficients in all populations across all polymorphic loci. Overall, there was high differentiation among populations. Results from the Bayesian and Neighbor Joining cluster analyses identified groups that were consistent with currently accepted species delimitation. We included the only known population of an undescribed morph from the Dominican Republic that has been suggested to represent a new species. Results from the cluster analyses suggested that this putative species is closely related to P. sargentii from the Turks and Caicos. Our study provided insights pertinent to the conservation genetics and management of this genus in Hispaniola.
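The homozygote excess reported above is what a per-locus inbreeding coefficient quantifies. As a generic sketch (hypothetical genotypes, not the thesis data), F_IS = 1 − Ho/He compares observed heterozygosity with the Hardy-Weinberg expectation:

```python
def inbreeding_coefficient(genotypes):
    """F_IS = 1 - Ho/He for one locus.

    genotypes: list of (allele, allele) pairs for sampled individuals.
    Positive values indicate homozygote excess (inbreeding)."""
    n = len(genotypes)
    # observed heterozygosity: fraction of heterozygous individuals
    ho = sum(1 for a, b in genotypes if a != b) / n
    alleles = [a for g in genotypes for a in g]
    freqs = {a: alleles.count(a) / len(alleles) for a in set(alleles)}
    # expected heterozygosity under Hardy-Weinberg: 1 - sum(p_i^2)
    he = 1 - sum(p * p for p in freqs.values())
    return 1 - ho / he
```

A population of only homozygotes at a two-allele locus gives F_IS = 1, while all-heterozygote samples give negative values; the thesis reports the positive, homozygote-excess pattern.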

Relevance: 100.00%

Abstract:

There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. Standard tests used for this purpose can consider neither multiple performance measures jointly nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that are able to account for multiple competing measures at the same time and to compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, since the number of studied cases is usually small in such comparisons. Data from a comparison among general purpose classifiers is used to show a practical application of our tests.
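One ingredient of the Bayesian procedure can be sketched simply: given win/tie/loss counts from pairwise comparisons, a multinomial likelihood with a Dirichlet prior yields a Dirichlet posterior, and the probability that one algorithm beats the other can be estimated by Monte Carlo. This is a generic conjugate-model sketch, not the paper's full model (which handles multiple measures and conditional independences); the counts and prior are illustrative.

```python
import random

def dirichlet_sample(alpha):
    """Sample from a Dirichlet distribution via normalized Gamma draws."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def prob_a_beats_b(wins, ties, losses, prior=1.0, draws=5000):
    """Posterior P(theta_win > theta_loss) under a multinomial likelihood
    with a symmetric Dirichlet prior over (win, tie, loss) probabilities."""
    random.seed(0)
    alpha = [wins + prior, ties + prior, losses + prior]  # conjugate update
    hits = 0
    for _ in range(draws):
        w, t, l = dirichlet_sample(alpha)
        if w > l:
            hits += 1
    return hits / draws
```

With lopsided counts (say 30 wins vs 10 losses) the posterior probability is close to 1, while symmetric counts give roughly 0.5, matching the intuition a sign test would provide.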

Relevance: 100.00%

Abstract:

Data mining can be defined as the extraction of implicit, previously unknown, and potentially useful information from data. Numerous researchers have been developing security technology and exploring new methods to detect cyber-attacks with the DARPA 1998 dataset for Intrusion Detection and the modified versions of this dataset, KDDCup99 and NSL-KDD, but until now no one has examined the performance of the Top 10 data mining algorithms selected by experts in data mining. The classification learning algorithms compared in this thesis are C4.5, CART, k-NN and Naïve Bayes. The performance of these algorithms is compared in terms of accuracy, error rate and average cost on modified versions of the NSL-KDD train and test datasets, where the instances are classified into normal and four cyber-attack categories: DoS, Probing, R2L and U2R. Additionally, the most important features for detecting cyber-attacks, both overall and within each category, are evaluated with Weka's Attribute Evaluator and ranked according to Information Gain. The results show that the classification algorithm with the best performance on the dataset is the k-NN algorithm. The most important features for detecting cyber-attacks are basic features such as the number of seconds of a network connection, the protocol used for the connection, the network service used, the normal or error status of the connection and the number of data bytes sent. The most important features for detecting DoS, Probing and R2L attacks are basic features, and the least important are content features; for U2R attacks, in contrast, content features are the most important.
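The Information Gain criterion used for the feature ranking above is the reduction in class-label entropy achieved by conditioning on a feature. A minimal, generic sketch (not tied to the NSL-KDD feature set):

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in (labels.count(v) for v in set(labels)))

def information_gain(feature, labels):
    """IG = H(labels) - H(labels | feature), the ranking criterion above."""
    total = entropy(labels)
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        cond += len(subset) / n * entropy(subset)   # weighted subset entropy
    return total - cond
```

A feature that perfectly separates normal from attack traffic has gain equal to the full label entropy, while a feature independent of the label has gain zero; ranking features by this value reproduces the kind of ordering Weka's InfoGain evaluator produces.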