943 results for Search space reduction


Relevância: 80.00%

Resumo:

Previous studies in speculative prefetching focus on building and evaluating access models for the purpose of access prediction. This paper investigates a complementary area which has been largely ignored, that of performance modelling. We use improvement in access time as the performance metric, for which we derive a formula in terms of resource parameters (time available and time required for prefetching) and speculative parameters (probabilities for next access). The performance maximization problem is expressed as a stretch knapsack problem. We develop an algorithm to maximize the improvement in access time by solving the stretch knapsack problem, using theoretically proven apparatus to reduce the search space. Integration between speculative prefetching and caching is also investigated, albeit under the assumption of equal item sizes.
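The selection of items to prefetch under a time budget can be illustrated with a plain 0/1 knapsack formulation — a simplification of the stretch knapsack problem described above. The pair representation below is an assumption for illustration: `cost` stands for the integer time required to prefetch an item, and `value` for its expected access-time improvement (access probability times time saved).

```python
def select_prefetches(candidates, budget):
    # candidates: (cost, value) pairs; cost = integer time needed to prefetch,
    # value = expected access-time improvement (access probability x time saved).
    # Classic 0/1 knapsack dynamic program over the available time budget.
    best = [0.0] * (budget + 1)
    chosen = [[] for _ in range(budget + 1)]
    for i, (cost, value) in enumerate(candidates):
        for b in range(budget, cost - 1, -1):   # iterate budget downwards
            if best[b - cost] + value > best[b]:
                best[b] = best[b - cost] + value
                chosen[b] = chosen[b - cost] + [i]
    return best[budget], chosen[budget]
```

For example, with candidates `[(2, 3.0), (3, 4.0), (4, 5.0)]` and a budget of 5, the first two items are selected for a total expected improvement of 7.0.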

Relevância: 80.00%

Resumo:

This paper describes a technique for improving the performance of parallel genetic algorithms on multi-modal numerical optimisation problems. It employs a cluster analysis algorithm to identify regions of the search space in which more than one sub-population is sampling. Overlapping clusters are merged into one sub-population, whilst a simple derating function is applied to samples in all other sub-populations to discourage them from further sampling in that region. This approach leads to a better distribution of the search effort across multiple sub-populations and helps to prevent premature convergence. On the test problems used, significant performance improvements over the traditional island-model implementation are realised.
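A derating function of the kind described can be sketched as follows. This is a minimal illustration, not the paper's exact function: the linear shape and the `radius` parameter are assumptions.

```python
import math

def derated_fitness(fitness, x, center, radius):
    # Penalise samples inside a region already claimed by another
    # sub-population: linear derating toward the cluster centre.
    d = math.dist(x, center)
    if d >= radius:
        return fitness          # outside the claimed region: untouched
    return fitness * (d / radius)
```

A sample sitting exactly on the claimed centre scores zero, one outside the radius keeps its raw fitness, and samples in between are scaled down proportionally to their distance from the centre.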

Relevância: 80.00%

Resumo:

In this paper, two evolutionary artificial neural network (EANN) models that are based on integration of two supervised adaptive resonance theory (ART)-based artificial neural networks with a hybrid genetic algorithm (HGA) are proposed. The search process of the proposed EANN models is guided by a knowledge base established by ART with respect to the training data samples. The EANN models explore the search space for “coarse” solutions, and such solutions are then refined using the local search process of the HGA. The performances of the proposed EANN models are evaluated and compared with those from other classifiers using more than ten benchmark data sets. The applicability of the EANN models to a real medical classification task is also demonstrated. The results from the experimental studies demonstrate the effectiveness and usefulness of the proposed EANN models in undertaking pattern classification problems.

Relevância: 80.00%

Resumo:

The 3S (Shrinking Search Space) multi-thresholding method, which has been used for segmenting medical images according to their intensities, has been implemented and compared with the FCM method, a benchmark in thresholding, in terms of segmentation quality and segmentation time. The results show that the 3S method produced almost the same segmentation quality as FCM, and on some occasions better quality, while its computation time is much lower than that of FCM — a further advantage of the method over others. The performance of C-means has also been compared with two other methods; this comparison shows that C-means is not a reliable clustering algorithm, as it needs several runs to produce a reliable result.
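The sensitivity of C-means to initialisation, and the usual remedy of keeping the best of several runs, can be illustrated with a toy one-dimensional k-means. This is a generic sketch, not the 3S or FCM implementation evaluated above.

```python
import random

def kmeans_1d(data, k, seed, iters=50):
    # One k-means run on 1-D data with a given random initialisation.
    rng = random.Random(seed)
    centers = rng.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            clusters[min(range(k), key=lambda i: abs(x - centers[i]))].append(x)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    inertia = sum(min((x - c) ** 2 for c in centers) for x in data)
    return centers, inertia

def best_of_runs(data, k, runs=10):
    # Remedy for unreliable single runs: keep the lowest-inertia result.
    return min((kmeans_1d(data, k, seed) for seed in range(runs)),
               key=lambda r: r[1])
```

A single run can converge to a poor partition depending on the seed; taking the lowest-inertia result over several seeded runs is the standard way to obtain a reliable clustering.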

Relevância: 80.00%

Resumo:

The thesis proposes a multi-constraint, one-to-many bilateral e-Trade negotiation framework. It deploys mobile agents in negotiation, takes into account trading competition between vendors and the size of the search space, efficiently manages the risk of losing top-utility offers that expire before the negotiation deadline, accurately evaluates offers, and maintains the security of negotiation data.

Relevância: 80.00%

Resumo:

Current single-locus-based analyses and candidate disease gene prediction methodologies used in genome-wide association studies (GWAS) do not capitalize on the wealth of the underlying genetic data, nor on the functional data available from molecular biology. Here, we analyzed GWAS data from the Wellcome Trust Case Control Consortium (WTCCC) on coronary artery disease (CAD) using Gentrepid, which takes a multiple-locus-based approach, drawing on protein pathway- or domain-based data to make predictions. Known disease genes may be used as additional information (seeded method), or predictions can be based entirely on GWAS single nucleotide polymorphisms (SNPs) (ab initio method). We looked in detail at specific predictions made by Gentrepid for CAD and compared these with known genetic data and the scientific literature. Gentrepid was able to extract known disease genes from the candidate search space and to predict plausible novel disease genes from both known and novel WTCCC-implicated loci. The disease gene candidates are consistent with known biological information. The results demonstrate that this computational approach is feasible and a valuable discovery tool for geneticists.

Relevância: 80.00%

Resumo:

Metaheuristic algorithms are among the most popular methods for solving many optimization problems. This paper presents a new hybrid approach comprising two nature-inspired metaheuristic algorithms, Cuckoo Search (CS) and Accelerated Particle Swarm Optimization (APSO), for training Artificial Neural Networks (ANN). In order to increase the probability of its egg's survival, the cuckoo bird migrates by traversing more of the search space; it can find better solutions by performing Lévy flights combined with APSO. In the proposed Hybrid Accelerated Cuckoo Particle Swarm Optimization (HACPSO) algorithm, APSO provides the communication ability for the cuckoo birds, making them capable of searching for the best nest with the better solution. Experiments are carried out on benchmark datasets, and the performance of the proposed hybrid algorithm is compared with that of the Artificial Bee Colony (ABC) algorithm and similar hybrid variants. The results show that the proposed HACPSO algorithm performs better than the other algorithms in terms of convergence and accuracy.
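The Lévy-flight step used by Cuckoo Search is commonly generated with Mantegna's algorithm; a minimal sketch follows. The exponent `beta = 1.5` is a conventional choice, not necessarily the setting used in this paper.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    # Mantegna's algorithm for drawing a Levy-stable step length.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = rng.gauss(0, sigma)   # numerator: wide normal
    v = rng.gauss(0, 1)       # denominator: standard normal
    return u / abs(v) ** (1 / beta)
```

Most steps are small, but the heavy tail occasionally produces a long jump, which is what lets the cuckoo traverse more of the search space than a plain Gaussian walk.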

Relevância: 80.00%

Resumo:

Stochastic search techniques such as evolutionary algorithms (EAs) are known to be better explorers of the search space than conventional techniques, including deterministic methods. However, in the era of big data, the suitability of evolutionary algorithms, like that of most other search methods and learning algorithms, is naturally questioned. Big data pose new computational challenges, including very high dimensionality and sparseness of the data. The superior exploration skills of evolutionary algorithms should make them promising candidates for handling optimization problems involving big data. High-dimensional problems introduce added complexity to the search space, so EAs need to be enhanced to ensure that the majority of potential winner solutions get the chance to survive and mature. In this paper we present an evolutionary algorithm with an enhanced ability to deal with the problems of high dimensionality and sparseness of data. In addition to an informed exploration of the solution space, the technique balances exploration and exploitation using a hierarchical multi-population approach. The proposed model uses informed genetic operators to introduce diversity by expanding the scope of the search process at the expense of redundant, less promising members of the population. The next phase of the algorithm deals with the problem of high dimensionality by ensuring a broader and more exhaustive search and by preventing the premature death of potential solutions. To achieve this, in addition to the exploration-controlling mechanism above, a multi-tier hierarchical architecture is employed in which, in separate layers, the less fit isolated individuals evolve in dynamic sub-populations that coexist alongside the original or main population. Evaluation of the proposed technique on well-known benchmark problems confirms its superior performance. The algorithm has also been successfully applied to a real-world problem of financial portfolio management. Although the proposed method cannot be considered big data-ready, it is certainly a move in the right direction.
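The hierarchical multi-population idea — weak individuals demoted to a lower tier that evolves with wider mutation, and that tier's best member promoted back — can be sketched in miniature. All population sizes, mutation rates and the demotion/promotion policy below are illustrative assumptions, not the paper's configuration.

```python
import random

def evolve(pop, fitness, rng, sigma):
    # One generation: binary tournament selection + gaussian mutation.
    nxt = []
    for _ in range(len(pop)):
        a, b = rng.sample(pop, 2)
        parent = a if fitness(a) > fitness(b) else b
        nxt.append([g + rng.gauss(0, sigma) for g in parent])
    return nxt

def hierarchical_ea(fitness, dim, gens=60, seed=0):
    rng = random.Random(seed)
    new = lambda: [rng.uniform(-5, 5) for _ in range(dim)]
    main = [new() for _ in range(20)]
    reserve = [new() for _ in range(10)]      # lower tier: wider mutation
    for _ in range(gens):
        main = evolve(main, fitness, rng, sigma=0.1)
        reserve = evolve(reserve, fitness, rng, sigma=0.5)
        main.sort(key=fitness, reverse=True)
        reserve.append(main.pop())            # demote the weakest
        reserve.sort(key=fitness, reverse=True)
        main.append(reserve.pop(0))           # promote the reserve's best
        reserve = reserve[:10]
    return max(main, key=fitness)
```

The reserve tier explores more aggressively (larger mutation step), so demoted individuals get a second chance to mature before their lineage is discarded — the "preventing premature death" idea in toy form.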

Relevância: 80.00%

Resumo:

Photovoltaic (PV) based microgrids have been increasingly investigated in recent years, owing to fundamental advantages such as an effectively unlimited energy source, environmental friendliness and low upkeep cost. In practice, however, they are still considered an expensive, low-output option among renewable energy resources. To extract the maximum possible power from the output of a PV system, a reliable maximum power point tracker (MPPT) is required. Numerous studies have been conducted to introduce the best MPPT techniques for different types of PV systems; however, they are mostly able to track the MPP only when the output signals (voltage and current) of each individual array are available. In this study, a meta-heuristic method based on particle swarm optimization (PSO) is used to determine the actual MPP of a PV system comprising several PV arrays, using only a single current sensor at the output terminal. The results of the proposed PSO-based technique for tracking the global MPP in a multidimensional search space are presented at the end of this paper.
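A minimal PSO tracker of the kind described, searching for the operating voltage that maximises a possibly multi-peaked power curve, might look like the sketch below. The inertia and acceleration coefficients are conventional textbook values, and `power` stands in for the measured PV output; none of this is the paper's exact implementation.

```python
import random

def pso_mppt(power, bounds, n=15, iters=40, seed=0):
    # Particle swarm search for the operating voltage maximising power(v).
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [rng.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]                      # each particle's personal best
    gbest = max(pos, key=power)         # swarm-wide best
    for _ in range(iters):
        for i in range(n):
            vel[i] = (0.7 * vel[i]
                      + 1.5 * rng.random() * (pbest[i] - pos[i])
                      + 1.5 * rng.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if power(pos[i]) > power(pbest[i]):
                pbest[i] = pos[i]
            if power(pos[i]) > power(gbest):
                gbest = pos[i]
    return gbest
```

Because the swarm samples the whole voltage range before converging, it can escape the local peaks that trap classical hill-climbing MPPT schemes under partial shading.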

Relevância: 80.00%

Resumo:

In this paper, an evolutionary algorithm is used to develop a decision support tool for multi-objective job-shop scheduling problems. A modified micro genetic algorithm (MmGA) is adopted to provide optimal solutions according to the Pareto optimality principle. MmGA operates with a very small population size to explore a wide search space with few function evaluations and to improve convergence towards the true Pareto-optimal front. To evaluate the effectiveness of the MmGA-based decision support tool, a multi-objective job-shop scheduling problem with actual information from a manufacturing company is used. The statistical bootstrap method is applied to evaluate the experimental results, which are compared with those from the enumeration method. The outcome indicates that the decision support tool is able to achieve the same optimal solutions as the enumeration method, with the added advantage of reaching them in a fraction of the time.
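The Pareto optimality principle that MmGA relies on reduces to a simple dominance test; for minimisation objectives it can be written as:

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective
    # and strictly better in at least one (minimisation).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep the points that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For example, among the objective vectors `(1, 5)`, `(2, 3)`, `(3, 4)`, `(4, 1)` and `(5, 5)`, the front consists of the first, second and fourth: `(3, 4)` is dominated by `(2, 3)`, and `(5, 5)` by `(1, 5)`.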

Relevância: 80.00%

Resumo:

Improperly conducted tillage operations cause serious soil conservation problems, notably compaction, which reduces pore space (mainly macropores) and alters the soil's physical-hydric attributes. This study aimed to assess, by means of maximum bulk density, the influence of different management systems and adoption times on a Latossolo Vermelho (Oxisol) in Jaboticabal, São Paulo State, and to correlate it with soybean yield, relative bulk density and the critical water content for compaction. The experiment followed a completely randomized split-plot design (five land-use systems and three layers) with four replicates. The five systems were: no-tillage for five years (SPD5), no-tillage for seven years (SPD7), no-tillage for nine years (SPD9), conventional tillage (SPC), and an adjacent area of native forest (MN). The soil layers evaluated were 0-0.10, 0.10-0.20 and 0.20-0.30 m, in which maximum bulk density (Ds máx), critical water content for compaction (Ugc), relative bulk density (Dsr), particle-size distribution, porosity and soil organic matter content were determined. The results showed that the soil compaction curves behaved identically in all layers under the different management systems, and that the organic matter contents did not explain the small variations in Ds máx. For this Latossolo Vermelho, mechanized operations under the management systems can be carried out at water contents of 0.13 to 0.19 kg kg-1 without causing physical degradation. The optimal Dsr and the critical water content for compaction were found to be 0.86 and 0.15 kg kg-1, respectively, although the different management systems and adoption times showed similar behaviour regarding soybean yield.

Relevância: 80.00%

Resumo:

Frequency selective surfaces (FSS) are structures consisting of periodic arrays of conductive elements, called patches, which are usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic surfaces, for applications in the microwave and millimetre-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas and microwave ovens, for example. Their main purpose is to filter frequency bands, either transmitting or rejecting them, depending on the requirements of the application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi and WiMAX, whose services are in high demand, have required the development of antennas whose main features are low cost, low profile, and reduced size and weight. In this context, the microstrip antenna is an excellent choice for today's communication systems because, in addition to intrinsically meeting these requirements, planar structures are easy to manufacture and to integrate with other microwave circuit components. Consequently, the analysis and synthesis of these devices — given the wide range of possible shapes, sizes and operating frequencies of their elements — has mainly been carried out with full-wave models such as the finite element method, the method of moments and the finite-difference time-domain method. These methods are accurate but demand great computational effort. In this context, computational intelligence (CI) has been used successfully, as a very appropriate auxiliary tool, in the design and optimization of planar microwave structures, given the complexity of the geometry of the antennas and FSS considered.
Computational intelligence is inspired by natural phenomena such as learning, perception and decision-making, and uses techniques such as artificial neural networks, fuzzy logic, fractal geometry and evolutionary computation. This work studies the application of computational intelligence, using metaheuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods, based on genetics and on Darwin's theory of natural selection, used to solve complex problems, e.g. problems whose search space grows with the size of the problem. Particle swarm optimization exploits collective intelligence and has been applied to optimization problems in many areas of research. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSS. The structures considered are a ring-type planar microstrip monopole and a cross-dipole FSS. Optimization algorithms were developed and results obtained for the optimized geometries of the antennas and FSS considered. To validate the results, several prototypes were designed, built and measured. The measured results showed excellent agreement with the simulated ones, and the results of this study also agreed very well with simulations from commercial software. Specifically, the efficiency of the CI techniques used was evidenced by simulated and measured results in optimizing the bandwidth of an antenna for wideband or UWB (Ultra Wideband) operation, using a genetic algorithm, and in optimizing the bandwidth of a pair of frequency selective surfaces, by adjusting the length of the air gap between them, using a particle swarm optimization algorithm.
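A real-coded genetic algorithm of the general kind used for antenna and FSS optimization can be sketched as follows. This is a generic toy GA, not the author's implementation; in practice `fitness` would wrap a full-wave simulation of the structure's bandwidth, but any objective function works.

```python
import random

def genetic_search(fitness, bounds, pop_size=30, gens=80, seed=0):
    # Toy real-coded GA: elitism, tournament selection,
    # blend crossover and gaussian mutation with clipping.
    rng = random.Random(seed)
    rand_ind = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(gens):
        nxt = [max(pop, key=fitness)]                      # keep the elite
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)      # tournament
            p2 = max(rng.sample(pop, 3), key=fitness)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]  # blend crossover
            child = [min(hi, max(lo, g + rng.gauss(0, 0.05 * (hi - lo))))
                     for g, (lo, hi) in zip(child, bounds)]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

Here an individual is simply a vector of geometric parameters (e.g. patch dimensions or air-gap length) constrained to lie within `bounds`; elitism guarantees the best design found so far is never lost between generations.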

Relevância: 80.00%

Resumo:

The Capacitated Centered Clustering Problem (CCCP) consists of defining a set of p groups with minimum dissimilarity on a network with n points. Demand values are associated with each point, and each group has a demand capacity. The problem is well known to be NP-hard and has many practical applications. In this paper, the hybrid method Clustering Search (CS) is implemented to solve the CCCP. This method identifies promising regions of the search space by generating solutions with a metaheuristic, such as a genetic algorithm, and grouping them into clusters that are then explored further with local search heuristics. Computational results on instances available in the literature are presented to demonstrate the efficacy of CS. (C) 2010 Elsevier Ltd. All rights reserved.
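The CS mechanism — sample with a metaheuristic, group the samples into clusters, and intensify promising clusters with local search — can be caricatured on a one-dimensional objective. In this sketch random sampling stands in for the metaheuristic, and the cluster radius, promotion threshold and hill-climbing local search are all illustrative assumptions.

```python
import random

def clustering_search(f, lo, hi, iters=300, radius=0.5, seed=0):
    # Loose sketch of Clustering Search minimising f on [lo, hi].
    rng = random.Random(seed)
    clusters = []                        # [center, hit_count] pairs
    for _ in range(iters):
        x = rng.uniform(lo, hi)
        for c in clusters:
            if abs(x - c[0]) < radius:   # falls into an existing cluster
                c[1] += 1
                if c[1] % 5 == 0:        # promising cluster: intensify
                    step = radius / 10
                    improved = True
                    while improved:      # simple hill-climbing descent
                        improved = False
                        for trial in (c[0] - step, c[0] + step):
                            if f(trial) < f(c[0]):
                                c[0], improved = trial, True
                                break
                break
        else:                            # no cluster nearby: open a new one
            clusters.append([x, 1])
    return min((c[0] for c in clusters), key=f)
```

Clusters that repeatedly attract samples mark promising regions, so the expensive local search is spent only there rather than on every generated solution — the division of labour at the heart of CS.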

Relevância: 80.00%

Resumo:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevância: 80.00%

Resumo:

This paper discusses the expression of informality in contemporary capitalism, a theme relevant to the analysis of the reality of work today and of the logic that moves capital, given its real presence in the lives of individuals. The street trading of the town of Pau dos Ferros, popularly known as the "street market", was chosen as the research space. The main objective is to apprehend and examine the articulations and the logic present in the configuration of street trading in this city, located in the state of Rio Grande do Norte, explaining the functionality of informality for capitalist accumulation, but also for the reproduction of segments of the working class. Our analysis is based on the perspective of totality, seeking to grasp the historical determinations of the phenomenon in focus. It includes the analysis of the mechanisms used by capital to reproduce itself in the current historical context, which has shaped the composition of the labour markets of different countries and the various forms of exploitation to which workers in general are subject. It also means discussing the development of capitalism in Brazil, the logic that permeates its dependence and, especially, the use of over-exploitation of labour as a lever for internal accumulation. The investigation consisted of theoretical research, to build the theoretical and methodological basis of the analysis and to outline the context in which the research object is inserted, and of field research conducted in two phases: systematic observation, which made it possible to map the traders' characteristics and the infrastructure of the commerce, and interviews with key informants. The material collected was scrutinized according to an analytical scheme inspired by content analysis.
Among the main considerations developed from the research process are the following: the street trading of Pau dos Ferros remains centred on the sale of agricultural products, which reflects the structural characteristics of the region. However, the supply of these products is no longer restricted to the surplus of small local producers: the presence of the dealer has changed, and streamlined, the distribution of the product. In parallel, business practices have developed in which the traded (industrial) goods reflect the moment of capitalist restructuring and a larger business network. The reflections also made it possible to show that street trading continues to develop on the basis of informal work, which gains functionality for the system, as it is configured as a space commonly used to drain part of the production of industries (clothing/shoes), especially if distribution is considered an essential element of the complex process that aims at the appreciation of capital. This activity has functioned as a place of employment and income generation for subjects excluded from formal employment, thus masking unemployment; moreover, it allows them to continue as consumers. Such expressions reflect the ability and the logic of capital to expand into and absorb so many realities. Under way today is a logic that has led many workers to join the project of domination of capital through the illusory chance of becoming capitalists. The aim has been to turn the subject into a consumer and the worker into an entrepreneur.