902 results for: Dynamic search fireworks algorithm with covariance mutation
Abstract:
Plant diseases represent a major economic and environmental problem in agriculture and forestry. Upon infection, a plant develops symptoms that affect different parts of the plant, causing a significant agronomic impact. As many such diseases spread over time across the whole crop, a system for early disease detection can help mitigate the losses produced by plant diseases and can further prevent their spread [1]. In recent years, several mathematical search algorithms have been proposed [2,3] that could be used as non-invasive, fast, reliable and cost-effective methods to localize infectious foci in space by detecting changes in the profile of volatile organic compounds. Tracking scents and locating odour sources is a major challenge in robotics, on the one hand because odour plumes consist of non-uniform, intermittent odour patches dispersed by the wind, and on the other hand because of the lack of precise and reliable odour sensors. Notwithstanding, we have developed a simple robotic platform to study the robustness and effectiveness of different search algorithms [4] with respect to specific problems found in their application to agriculture, namely errors committed in motion and sensing and the existence of spatial constraints due to land topology or the presence of obstacles.
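The abstract does not name the specific search algorithms compared on the platform. As a minimal illustration of the class of strategies involved, the sketch below implements a biased random walk on a grid with a noisy concentration sensor, a common baseline for odour-source localization; the `sense` field and the `bias` parameter are assumptions for illustration, not part of the paper.

```python
import random

def make_noisy_sensor(src_x, src_y, noise=0.1):
    """Toy concentration field peaking at the source, with sensor noise."""
    def sense(x, y):
        dist = abs(x - src_x) + abs(y - src_y)
        return 1.0 / (1.0 + dist) + random.gauss(0.0, noise)
    return sense

def biased_random_walk(sense, start, steps=500, bias=0.8):
    """Move towards the neighbour with the highest (noisy) reading with
    probability `bias`; otherwise step randomly, which also stands in
    for motion errors. Returns the final position."""
    x, y = start
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        if random.random() < bias:
            dx, dy = max(moves, key=lambda m: sense(x + m[0], y + m[1]))
        else:
            dx, dy = random.choice(moves)
        x, y = x + dx, y + dy
    return x, y

# Example: the walker should end up near the source at (20, 15).
print(biased_random_walk(make_noisy_sensor(20, 15), start=(0, 0)))
```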
Abstract:
Finding the degree-constrained minimum spanning tree (DCMST) of a graph is a widely studied NP-hard problem. One of its most important applications is network design. Here we deal with a new variant of the DCMST problem, which consists of finding not only the degree- but also the role-constrained minimum spanning tree (DRCMST), i.e., we add constraints to restrict the role of the nodes in the tree to root, intermediate or leaf node. Furthermore, we do not limit the number of root nodes to one, thereby, in general, building a forest of DRCMSTs. The modeling of network design problems can benefit from the possibility of generating more than one tree and determining the role of the nodes in the network. We propose a novel permutation-based representation to encode the forest of DRCMSTs. In this new representation, one permutation simultaneously encodes all the trees to be built. We simulate a wide variety of DRCMST problems, which we optimize using eight different evolutionary computation algorithms encoding the individuals of the population with the proposed representation. The algorithms we use are: estimation of distribution algorithm (EDA), generational genetic algorithm (gGA), steady-state genetic algorithm (ssGA), covariance matrix adaptation evolution strategy (CMAES), differential evolution (DE), elitist evolution strategy (ElitistES), non-elitist evolution strategy (NonElitistES) and particle swarm optimization (PSO). The best results are for the estimation of distribution algorithm and both types of genetic algorithms, although the genetic algorithms are significantly faster.
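The abstract states that a single permutation encodes all the trees in the forest but does not spell out the decoding step. The sketch below is one plausible greedy decoder, written as an assumption rather than the authors' actual scheme: nodes are scanned in permutation order and each is attached to the cheapest already-placed node whose degree and role constraints still allow another child.

```python
def decode_forest(perm, roots, weight, max_degree, role):
    """Hypothetical greedy decoder for a permutation-encoded DRCMST forest.
    `role[v]` is 'root', 'intermediate' or 'leaf'; leaves take no children.
    Assumes a feasible parent always exists for each scanned node."""
    parent = {r: None for r in roots}       # each root starts its own tree
    degree = {r: 0 for r in roots}
    for v in perm:
        if v in parent:                     # roots are already placed
            continue
        candidates = [u for u in parent
                      if role[u] != 'leaf' and degree[u] < max_degree[u]]
        u = min(candidates, key=lambda u: weight[u][v])  # cheapest edge
        parent[v] = u
        degree[v] = 1                       # edge to its parent
        degree[u] += 1
    return parent                           # parent pointers define the forest
```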
Abstract:
Cognitive radio represents a promising paradigm to further increase transmission rates in wireless networks, as well as to facilitate the deployment of self-organized networks such as femtocells. Within this framework, secondary users (SU) may exploit the channel under the premise of keeping the quality of service (QoS) of primary users (PU) above a certain level. To achieve this goal, we present a noncooperative game where SU maximize their transmission rates and may also act as relays of the PU in order to hold the PU's perceived QoS above the given threshold. In the paper, we analyze the properties of the game within the theory of variational inequalities and provide an algorithm that converges to a Nash equilibrium of the game. Finally, we present some simulations and compare the algorithm with another method that does not consider SU acting as relays.
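The paper's actual algorithm comes from variational-inequality theory and is not specified in the abstract; the sketch below only illustrates the fixed-point idea behind a Nash equilibrium with a generic best-response iteration, using scalar strategies (e.g. transmit powers) as an assumption.

```python
def best_response_dynamics(init, best_response, max_iters=100, tol=1e-6):
    """Generic best-response iteration toward a Nash equilibrium.
    `best_response(i, s)` returns user i's optimal scalar strategy
    given the current strategy profile s (a dict keyed by user)."""
    s = dict(init)
    for _ in range(max_iters):
        change = 0.0
        for i in s:
            new = best_response(i, s)
            change = max(change, abs(new - s[i]))
            s[i] = new
        if change < tol:        # no user can improve unilaterally: stop
            break
    return s

# Toy two-player example (Cournot-style best responses); the unique
# fixed point is s_i = 1/3 for both users.
br = lambda i, s: (1.0 - s[1 - i]) / 2.0
print(best_response_dynamics({0: 0.9, 1: 0.1}, br))
```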
Abstract:
Due to the growing size of the data handled by many current information systems, many of the algorithms that traverse these structures lose performance when searching them. Since these data are in many cases represented by node-vertex structures (graphs), the Graph500 challenge was created in 2009. Previously, other challenges such as Top500 measured performance in terms of the computing capacity of systems, using LINPACK tests. In the case of Graph500, the measurement is performed by executing a breadth-first search (BFS) algorithm on graphs. The BFS algorithm is one of the pillars of many other graph algorithms, such as SSSP, shortest path or betweenness centrality; an improvement to it would help improve the others that rely on it. Problem analysis: the BFS algorithm used in high-performance computing (HPC) systems is usually a distributed version of the original sequential algorithm. In this distributed version, execution begins by partitioning the graph, and afterwards each of the distributed processors computes one part and distributes its results to the other systems. Because the speed gap between the processing in each of these nodes and the data transfers over the interconnection network is very large (with the interconnection network at a disadvantage), many approaches have been taken to reduce the performance lost in transfers. Regarding the initial partitioning of the graph, the traditional approach (called 1D graph partitioning) consists of assigning to each node a fixed set of vertices that it will process. To reduce data traffic, another partitioning (2D) was proposed, in which the distribution is based on the edges of the graph instead of the vertices. This partitioning reduced network traffic from a proportion of O(N×M) to O(log(N)). Although there have been other approaches to reduce transfers, such as initial reordering of the vertices to add locality in the nodes, or dynamic partitioning, the approach proposed in this work consists of applying recent compression techniques from large data systems, such as high-volume databases or internet search engines, to compress the data transferred between nodes.
---ABSTRACT---
The breadth-first search (BFS) algorithm is the foundation and building block of many higher-level graph-based operations such as spanning trees, shortest paths and betweenness centrality. The importance of this algorithm increases each day because it is a key requirement for many data structures which are becoming popular nowadays, and which turn out to be internally graph structures. When the BFS algorithm is parallelized and the data is distributed across several processors, some research shows a performance limitation introduced by the interconnection network [31]. Hence, improvements in the area of communications may benefit the global performance of this key algorithm. In this work an alternative compression mechanism is presented. It differs from existing methods in that it is aware of characteristics of the data which may benefit the compression. Apart from this, we will perform another test to see how this algorithm (in a distributed scenario) benefits from traditional instruction-based optimizations. Last, we will review the current supercomputing techniques and the related work being done in the area.
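The compression techniques the thesis borrows from databases and web search engines typically mean gap encoding of sorted integer sequences. The sketch below applies the standard delta-plus-varint scheme to a BFS frontier before it is sent over the interconnect; it illustrates the general idea, not the thesis's specific mechanism.

```python
def encode_frontier(frontier):
    """Delta + varint encoding of a sorted BFS frontier: gaps between
    consecutive sorted vertex IDs are small, so each fits in few bytes."""
    out = bytearray()
    prev = 0
    for v in sorted(frontier):
        gap = v - prev
        prev = v
        while gap >= 0x80:              # 7 data bits per byte,
            out.append((gap & 0x7F) | 0x80)  # high bit = "more follows"
            gap >>= 7
        out.append(gap)
    return bytes(out)

def decode_frontier(data):
    """Inverse of encode_frontier."""
    frontier, v, gap, shift = [], 0, 0, 0
    for b in data:
        gap |= (b & 0x7F) << shift
        if b & 0x80:
            shift += 7
        else:
            v += gap
            frontier.append(v)
            gap, shift = 0, 0
    return frontier

frontier = [3, 17, 18, 500, 501]
assert decode_frontier(encode_frontier(frontier)) == frontier
```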
Abstract:
Epidermolysis bullosa simplex (EBS) is a group of autosomal dominant skin diseases characterized by blistering, due to mechanical stress-induced degeneration of basal epidermal cells. It is now well established that the three major subtypes of EBS are genetic disorders of the basal epidermal keratins, keratin 5 (K5) and keratin 14 (K14). Here we show that a rare subtype, referred to as EBS with mottled pigmentation (MP), is also a disorder of these keratins. Affected members of two seemingly unrelated families with EBS-MP had a C to T point mutation in the second base position of codon 24 of one of the two K5 alleles, leading to a Pro→Leu mutation. This mutation was present neither in unaffected members nor in 100 alleles from normal individuals. Linkage analyses mapped the defect to this type II keratin gene (peak logarithm of odds score of 3.9 at θ = 0), which is located on chromosome 12q11-q13. This provides strong evidence that this mutation is responsible for the EBS-MP phenotype. Conserved only between K5 and K6, and not among any of the other type II keratins, Pro-24 is in the nonhelical head domain of K5, and only mildly perturbs the length of 10-nm keratin filaments assembled in vitro. However, this part of the K5 head domain is likely to protrude on the filament surface, perhaps leading to the additional aberrations in intermediate filament architecture and/or in melanosome distribution that are seen ultrastructurally in patients with the mutation.
Abstract:
Tuberculosis continues to be responsible for the deaths of millions of people, yet the virulence factors of the causative pathogens remain unknown. Genetic complementation experiments with strains of the Mycobacterium tuberculosis complex have identified a gene from a virulent strain that restores virulence to an attenuated strain. The gene, designated rpoV, has a high degree of homology with principal transcription or sigma factors from other bacteria, particularly Mycobacterium smegmatis and Streptomyces griseus. The homologous rpoV gene of the attenuated strain has a point mutation causing an arginine-->histidine change in a domain known to interact with promoters. To our knowledge, association of loss of bacterial virulence with a mutation in the principal sigma factor has not been previously reported. The results indicate either that tuberculosis organisms have an alternative principal sigma factor that promotes virulence genes or, more probably, that this particular mutant principal sigma factor is unable to promote expression of one or more genes required for virulence. Study of genes and proteins differentially regulated by the mutant transcription factor should facilitate identification of further virulence factors.
Abstract:
Tuning compilations is the process of adjusting the values of compiler options to improve some features of the final application. In this paper, a strategy based on the use of a genetic algorithm and a multi-objective scheme is proposed to deal with this task. Unlike previous works, we try to take advantage of domain knowledge to provide a problem-specific genetic operator that improves both the speed of convergence and the quality of the results. The evaluation of the strategy is carried out by means of a case study aimed at improving the performance of the well-known web server Apache. Experimental results show that an overall improvement of 7.5% can be achieved. Furthermore, the adaptive approach has shown an ability to markedly speed up the convergence of the original strategy.
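As a sketch of the general approach, the code below evolves on/off compiler-option settings with a plain generational GA. The flag set, the operators and the stand-in fitness are all assumptions for illustration; the paper's actual fitness would compile with the enabled flags and benchmark the resulting Apache binary, and its problem-specific operator is not reproduced here.

```python
import random

FLAGS = ["-O2", "-funroll-loops", "-fomit-frame-pointer", "-ftree-vectorize"]

def fitness(genome):
    """Stand-in fitness: counts enabled flags so the sketch runs.
    A real run would compile and benchmark the resulting server."""
    return sum(genome)

def evolve(pop_size=20, generations=30, mutation_rate=0.1):
    pop = [[random.random() < 0.5 for _ in FLAGS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(FLAGS))   # one-point crossover
            children.append([bool(g) ^ (random.random() < mutation_rate)
                             for g in a[:cut] + b[cut:]])  # bit-flip mutation
        pop = survivors + children
    best = max(pop, key=fitness)
    return [f for f, on in zip(FLAGS, best) if on]

print(evolve())   # e.g. the full flag list under the toy fitness
```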
Abstract:
The paper presents the analysis of an important historical building: the Saint James Theater in the city of Corfù (Greece), currently used as the Municipality House. The building, located in the center of the city, is made of carved stones and is characterized by a stocky shape and by the presence of wooden floors. The study deals with the structural identification of this structure through the analysis of its ambient vibrations, recorded by means of high-accuracy accelerometers. A full dynamic testing campaign was carried out using ambient vibrations to identify the main modal parameters and to obtain a non-destructive characterization of the building. The results of these dynamic tests are compared with the modal analysis of a complex finite element (FE) model of the structure. This analysis may present several problems and uncertainties for such a stocky building: due to the presence of wooden floors, local modes can be highly excited and, as a consequence, the evaluation of the structural modal parameters presents some difficulties.
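The abstract does not state which identification method was used on the ambient-vibration records; a common first step in operational modal analysis is peak-picking on the averaged power spectral density, sketched below with standard SciPy routines (the segment length and prominence threshold are assumptions).

```python
import numpy as np
from scipy.signal import welch, find_peaks

def pick_modal_frequencies(acc, fs, n_peaks=5):
    """Basic peak-picking on the channel-averaged PSD of ambient-vibration
    records. `acc` is an (n_channels, n_samples) array of accelerometer
    signals sampled at `fs` Hz; returns candidate natural frequencies."""
    f, psd = welch(acc, fs=fs, nperseg=4096, axis=-1)
    avg_psd = psd.mean(axis=0)            # average over channels
    peaks, _ = find_peaks(avg_psd, prominence=avg_psd.max() * 0.05)
    strongest = peaks[np.argsort(avg_psd[peaks])[::-1][:n_peaks]]
    return np.sort(f[strongest])          # candidate frequencies in Hz
```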
Abstract:
In today's internet world, web browsers are an integral part of our day-to-day activities; therefore, web browser security is a serious concern for all of us. Browsers can be breached in different ways. Because of their over-privileged access, extensions are responsible for many security issues. Browser vendors try to keep safe extensions in their official extension galleries; however, their security control measures are not always effective and adequate. The distribution of unsafe extensions through different social engineering techniques is also a very common practice. Therefore, before installation, users should thoroughly analyze the security of browser extensions. Extensions are not only available for desktop browsers: many mobile browsers, for example, Firefox for Android and UC browser for Android, are also furnished with extension features. Mobile devices have various resource constraints in terms of computational capabilities, power, network bandwidth, etc. Hence, conventional extension security analysis techniques cannot be efficiently used by end users to examine mobile browser extension security issues. To overcome the inadequacies of the existing approaches, we propose CLOUBEX, a CLOUd-based security analysis framework for both desktop and mobile Browser EXtensions. This framework uses a client-server architecture model in which compute-intensive security analysis tasks are generally executed on a high-speed computing server hosted in a cloud environment. CLOUBEX is also enriched with a number of essential features, such as client-side analysis, requirements-driven analysis, high performance and dynamic decision making. At present, the Firefox extension ecosystem is the most susceptible to different security attacks; hence, the framework is implemented for the security analysis of Firefox desktop and Firefox for Android mobile browser extensions. A static taint analysis is used to identify malicious information flows in the Firefox extensions. In CLOUBEX, there are three analysis modes, and a dynamic decision-making algorithm selects the best option based on important parameters such as the processing speed of the client device and the network connection speed. Using the best analysis mode, performance and power consumption are improved significantly. In the future, this framework can be leveraged for the security analysis of other desktop and mobile browser extensions, too.
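The abstract describes the decision step only in terms of its inputs (client speed, network speed). The sketch below shows one plausible form such a client-vs-cloud decision could take, picking whichever mode has the lower estimated completion time; the parameters and cost model are assumptions, not CLOUBEX's actual algorithm.

```python
def choose_analysis_mode(task_ops, client_ops_per_s, payload_bytes,
                         net_bytes_per_s, server_ops_per_s):
    """Offloading decision: estimate how long the security analysis
    would take locally versus uploading to the cloud server, and pick
    the faster option."""
    local_time = task_ops / client_ops_per_s
    cloud_time = payload_bytes / net_bytes_per_s + task_ops / server_ops_per_s
    return "client-side" if local_time <= cloud_time else "cloud"

# Example: a slow phone on a fast network is told to offload.
print(choose_analysis_mode(task_ops=5e9, client_ops_per_s=1e8,
                           payload_bytes=2e6, net_bytes_per_s=1e6,
                           server_ops_per_s=5e9))   # -> "cloud"
```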
Abstract:
This master's thesis discusses the analysis of changes in biological signals over time based on the dynamic time warping (DTW) algorithm. Special attention is paid to the problem of analysing tiny changes in complex non-stationary biological signals. Electrocardiographic (ECG) signals are used as an example in this study; in particular, the repolarization segments of heart beat cycles. The aim of the research is to study the possibility of applying the DTW algorithm to the analysis of small changes in the repolarization segments of heart beat cycles. The research has the following tasks:
- studying repolarization segments of heart beat cycles, and methods of their analysis;
- studying the DTW algorithm and its modifications, and finding the most appropriate modification for analysing changes in biological signals;
- developing methods for analysing the warping path (the output parameter of the DTW algorithm).
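For reference, this is the textbook DTW algorithm the thesis builds on, returning both the alignment cost and the warping path that the third task analyses; it is not the thesis's own modification.

```python
import numpy as np

def dtw(x, y):
    """Classic dynamic time warping between 1-D sequences x and y:
    fill the cumulative-cost matrix, then backtrack for the path."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Recurrence: extend the cheapest of the three predecessors.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from (n, m) to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return D[n, m], path[::-1]

cost, path = dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 0])
print(cost, path)
```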
Abstract:
Contrast-enhanced magnetic resonance imaging (CE MRI) is the most sensitive tool for screening women who are at high familial risk of breast cancer. Our aim in this study was to assess the cost-effectiveness of X-ray mammography (XRM), CE MRI or both strategies combined. In total, 649 women were enrolled in the MARIBS study and screened with both CE MRI and mammography, resulting in 1881 screens and 1-7 individual annual screening events. Women aged 35-49 years at high risk of breast cancer, either because they have a strong family history of breast cancer or are tested carriers of a BRCA1, BRCA2 or TP53 mutation or are at a 50% risk of having inherited such a mutation, were recruited from 22 centres and offered annual MRI and XRM for between 2 and 7 years. Information on the number and type of further investigations was collected, and specifically calculated unit costs were used to calculate the incremental cost per cancer detected. The number of cancers detected was 13 for mammography, 27 for CE MRI and 33 for mammography and CE MRI combined. In the subgroup of BRCA1 (BRCA2) mutation carriers or of women having a first-degree relative with a mutation in BRCA1 (BRCA2), the corresponding numbers were 3 (6), 12 (7) and 12 (11), respectively. For all women, the incremental cost per cancer detected with CE MRI and mammography combined was £28,284 compared to mammography. When only the BRCA1 or BRCA2 groups were considered, this cost was reduced to £11,731 (CE MRI vs mammography) and £15,302 (CE MRI and mammography vs mammography). Results were most sensitive to the unit cost estimate for a CE MRI screening test. Contrast-enhanced MRI might be a cost-effective screening modality for women at high risk, particularly for the BRCA1 and BRCA2 subgroups. Further work is needed to assess the impact of screening on mortality and health-related quality of life.
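The incremental cost per cancer detected used throughout the abstract is simply the extra cost of one strategy over another divided by the extra cancers it detects. A minimal worked sketch, with hypothetical totals rather than the study's figures:

```python
def incremental_cost_per_cancer(cost_new, cost_old, cancers_new, cancers_old):
    """Incremental cost-effectiveness ratio: extra cost per extra
    cancer detected by the new strategy over the comparator."""
    return (cost_new - cost_old) / (cancers_new - cancers_old)

# Hypothetical screening-programme totals for illustration only:
# 33 vs 13 cancers detected, as in the combined-vs-mammography comparison.
print(incremental_cost_per_cancer(cost_new=600_000, cost_old=200_000,
                                  cancers_new=33, cancers_old=13))  # 20000.0
```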
Abstract:
We revisit the one-unit gradient ICA algorithm derived from the kurtosis function. By carefully studying the properties of the stationary points of the discrete-time one-unit gradient ICA algorithm, convergence can be proved under a suitable condition on the learning rate. This condition helps alleviate the guesswork that accompanies the problem of choosing a suitable learning rate in practical computation. These results may be useful for extracting independent source signals on-line.
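The sketch below is the textbook form of the algorithm class analysed here: gradient ascent on the absolute kurtosis of a one-unit projection of whitened data, with the weight renormalized each step. The fixed step size `mu` is an assumption; the paper's contribution is precisely the condition such a learning rate must satisfy for convergence.

```python
import numpy as np

def one_unit_kurtosis_ica(Z, mu=0.1, iters=200, seed=0):
    """One-unit gradient ICA on whitened data Z (dims x samples):
    ascend |kurtosis| of y = w^T z and keep w on the unit sphere."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        y = w @ Z
        kurt = np.mean(y**4) - 3.0                # kurtosis (unit variance)
        grad = (Z * y**3).mean(axis=1) - 3.0 * w  # gradient of kurt w.r.t. w
        w += mu * np.sign(kurt) * grad            # ascend |kurtosis|
        w /= np.linalg.norm(w)                    # renormalize each step
    return w
```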
Abstract:
Objective: The description and evaluation of the performance of a new real-time seizure detection algorithm in the newborn infant. Methods: The algorithm includes parallel fragmentation of the EEG signal into waves; wave-feature extraction and averaging; and elementary, preliminary and final detection. The algorithm detects EEG waves with heightened regularity, using wave intervals, amplitudes and shapes. The performance of the algorithm was assessed using event-based as well as liberal and conservative time-based approaches, and compared with the performance of Gotman's and Liu's algorithms. Results: The algorithm was assessed on multi-channel EEG records of 55 neonates, including 17 with seizures. The algorithm showed sensitivities ranging from 83 to 95% with positive predictive values (PPV) of 48-77%, and 2.0 false positive detections per hour. In comparison, Gotman's algorithm (with a 30 s gap-closing procedure) displayed sensitivities of 45-88% and PPV of 29-56%, with 7.4 false positives per hour; Liu's algorithm displayed sensitivities of 96-99% and PPV of 10-25%, with 15.7 false positives per hour. Conclusions: The wave-sequence analysis based algorithm displayed higher sensitivity, higher PPV and a substantially lower level of false positives than the two previously published algorithms. Significance: The proposed algorithm provides a basis for major improvements in neonatal seizure detection and monitoring. Published by Elsevier Ireland Ltd. on behalf of the International Federation of Clinical Neurophysiology.
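As a stand-in for the wave-sequence analysis described (the published feature set and thresholds are not in the abstract), the sketch below flags a run of waves as seizure-like when both the wave intervals and amplitudes are unusually regular, measured by their coefficient of variation.

```python
import numpy as np

def regular_wave_run(intervals, amplitudes, cv_max=0.2, min_len=8):
    """Heightened-regularity test on a run of consecutive EEG waves:
    both inter-wave intervals and amplitudes must vary little.
    The thresholds here are illustrative assumptions."""
    intervals = np.asarray(intervals, dtype=float)
    amplitudes = np.asarray(amplitudes, dtype=float)
    if len(intervals) < min_len:
        return False
    cv = lambda x: x.std() / x.mean()      # coefficient of variation
    return cv(intervals) < cv_max and cv(amplitudes) < cv_max
```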
Abstract:
Deformable models are an attractive approach to recognizing objects which have considerable within-class variability such as handwritten characters. However, there are severe search problems associated with fitting the models to data which could be reduced if a better starting point for the search were available. We show that by training a neural network to predict how a deformable model should be instantiated from an input image, such improved starting points can be obtained. This method has been implemented for a system that recognizes handwritten digits using deformable models, and the results show that the search time can be significantly reduced without compromising recognition performance. © 1997 Academic Press.