963 results for random search algorithms
Abstract:
In this paper we determine the local and global resilience of random graphs $G(n,p)$ ($p \gg n^{-1}$) with respect to the property of containing a cycle of length at least $(1-\alpha)n$. Roughly speaking, given $\alpha > 0$, we determine the smallest $r_g(G,\alpha)$ with the property that almost surely every subgraph of $G = G(n,p)$ having more than $r_g(G,\alpha)\,|E(G)|$ edges contains a cycle of length at least $(1-\alpha)n$ (global resilience). We also obtain, for $\alpha < 1/2$, the smallest $r_\ell(G,\alpha)$ such that any $H \subseteq G$ having $\deg_H(v)$ larger than $r_\ell(G,\alpha)\deg_G(v)$ for all $v \in V(G)$ contains a cycle of length at least $(1-\alpha)n$ (local resilience). The results above are in fact proved in the more general setting of pseudorandom graphs.
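For readers skimming the listing, the two resilience notions can be restated compactly. The display below is our paraphrase of the definitions as given in the abstract, not the paper's exact statement:

```latex
% Global resilience: the least edge fraction that forces a long cycle.
r_g(G,\alpha) = \min\bigl\{\, r : e(H) > r\, e(G) \;\Rightarrow\;
    H \text{ contains a cycle of length} \ge (1-\alpha)n \,\bigr\}

% Local resilience (for \alpha < 1/2): the degree-wise analogue.
r_\ell(G,\alpha) = \min\bigl\{\, r : \deg_H(v) > r \deg_G(v)
    \;\forall v \in V(G) \;\Rightarrow\;
    H \text{ contains a cycle of length} \ge (1-\alpha)n \,\bigr\}
```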
Abstract:
Efficient automatic protein classification is of central importance in genomic annotation. As an independent check on the reliability of the classification, we propose a statistical approach to test whether two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum, over the space of trees, of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved by the Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validating the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
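The abstract's key computational point is the reduction of the supremum over trees to a max-flow computation. The paper's specific graph construction is not reproduced here; the sketch below shows only a generic Edmonds-Karp variant of the Ford-Fulkerson routine that such a reduction would call, written in Python for concreteness:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: Ford-Fulkerson with BFS augmenting paths, O(V*E^2).
    capacity: dict of dicts, capacity[u][v] = capacity of edge u -> v."""
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)  # add reverse edges
    flow = 0
    while True:
        parent = {source: None}  # BFS for a shortest augmenting path
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path left: flow is maximal
        path, v = [], sink  # recover the path and its bottleneck capacity
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:  # push flow along the path
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# tiny sanity check: two disjoint s-t routes, each carrying 2 units
assert max_flow({'s': {'a': 3, 'b': 2}, 'a': {'t': 2}, 'b': {'t': 3}}, 's', 't') == 4
```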
Abstract:
We consider a Random Walk in Random Environment (RWRE) moving in an i.i.d. random field of obstacles. When the particle hits an obstacle, it disappears with a positive probability. We obtain quenched and annealed bounds on the tails of the survival time in the general d-dimensional case. We then consider a simplified one-dimensional model (where transition probabilities and obstacles are independent and the RWRE only moves to neighbour sites) and obtain finer results for the tail of the survival time. In addition, we study the "mixed" probability measures (quenched with respect to the obstacles and annealed with respect to the transition probabilities, and vice versa) and give results for the tails of the survival time with respect to these measures. Further, we apply the same methods to obtain bounds for the tails of hitting times of Branching Random Walks in Random Environment (BRWRE).
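As a toy illustration of the simplified one-dimensional model (nearest-neighbour moves, independent transition probabilities and obstacles), the survival-time tail can be explored by Monte Carlo. Every parameter below is an arbitrary choice of ours, and the walk is confined to a finite box rather than all of Z:

```python
import random

def survival_time(n_sites=1000, obstacle_density=0.1, kill_prob=0.5,
                  max_steps=10**5):
    """One quenched run: i.i.d. site biases (the random environment),
    i.i.d. obstacle placement, nearest-neighbour walk; on hitting an
    obstacle the walker is killed with probability kill_prob."""
    p_right = [random.uniform(0.1, 0.9) for _ in range(n_sites)]
    obstacle = [random.random() < obstacle_density for _ in range(n_sites)]
    x = n_sites // 2
    for t in range(1, max_steps + 1):
        x += 1 if random.random() < p_right[x] else -1
        x = max(0, min(n_sites - 1, x))  # keep the walker inside the box
        if obstacle[x] and random.random() < kill_prob:
            return t  # absorbed at time t
    return max_steps  # censored: survived the whole run

# crude tail estimate: fraction of runs surviving past a threshold
runs = [survival_time() for _ in range(200)]
print(sum(t > 10**4 for t in runs) / len(runs))
```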
Abstract:
We consider binary stochastic chains of infinite order perturbed by random noise: at each time step, the value assumed by the chain can be flipped, randomly and independently, with a small fixed probability. We show that the transition probabilities of the perturbed chain are uniformly close to the corresponding transition probabilities of the original chain. As a consequence, in the case of stochastic chains with unbounded but otherwise finite variable-length memory, we show that it is possible to recover the context tree of the original chain using a suitable version of the Context algorithm, provided that the noise is small enough.
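The noise mechanism is simple enough to state in code. The sketch below simulates a chain together with its perturbed version; for brevity it uses an order-1 chain as a stand-in for the unbounded variable-length memory chains treated in the paper, and the transition values are made up:

```python
import random

def simulate_perturbed(transition, n, eps, seed_context=(0,)):
    """Simulate a binary chain X from `transition` (a map from a context
    tuple to P(next symbol = 1)), then flip each emitted symbol
    independently with probability eps -- the noise model in the abstract."""
    x, z = list(seed_context), []
    for _ in range(n):
        p1 = transition[tuple(x[-1:])]           # order-1 context, for the demo
        sym = 1 if random.random() < p1 else 0
        x.append(sym)
        z.append(sym ^ (random.random() < eps))  # independent flip
    return x[len(seed_context):], z

clean, noisy = simulate_perturbed({(0,): 0.2, (1,): 0.7}, n=10_000, eps=0.01)
```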
Abstract:
Consider a discrete, locally finite subset $\Gamma$ of $\mathbb{R}^d$ and the complete graph $(\Gamma, E)$, with vertices $\Gamma$ and edges $E$. We consider Gibbs measures on the set of subgraphs with vertices $\Gamma$ and edges $E' \subseteq E$. The Gibbs interaction acts between open edges having a vertex in common. We study percolation properties of the Gibbs distribution of the graph ensemble. The main results concern percolation properties of the open edges in two cases: (a) when $\Gamma$ is sampled from a homogeneous Poisson process; and (b) for a fixed $\Gamma$ with sufficiently sparse points. (C) 2010 American Institute of Physics. [doi:10.1063/1.3514605]
Abstract:
We consider the problem of estimating interaction neighborhoods from the partial observation of a finite number of realizations of a random field. We introduce a model selection rule to choose estimators of conditional probabilities among natural candidates. Our main result is an oracle inequality satisfied by the resulting estimator. We then use this selection rule in a two-step procedure to evaluate the interacting neighborhoods: the selection rule picks a small prior set of possible interacting points, and a cutting step removes the irrelevant points from this prior set. We also prove that Ising models satisfy the assumptions of the main theorems, without restrictions on the temperature, on the structure of the interaction graph, or on the range of the interactions; this provides a large class of applications for our results. We give a computationally efficient procedure for these models and finally show the practical efficiency of our approach in a simulation study.
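The objects being estimated are one-site conditional probabilities of a random field. In the Ising case these have a familiar closed form, which the following Glauber-dynamics step (a standard textbook sampler, not the authors' estimation procedure) makes explicit:

```python
import math, random

def glauber_step(spins, neighbors, beta):
    """One Glauber update for an Ising field with coupling J = 1:
    resample a uniformly chosen site from its conditional distribution
    given its interaction neighborhood -- exactly the conditional
    probabilities the selection rule above estimates from data."""
    i = random.randrange(len(spins))
    local_field = sum(spins[j] for j in neighbors[i])
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * local_field))
    spins[i] = 1 if random.random() < p_plus else -1
```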
Abstract:
The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of the parameter values. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The adaptive process in motor learning was examined in terms of the effects of varying amounts of constant practice performed before random practice. Participants pressed five response keys sequentially, the last one coincident with the lighting of a final visual stimulus provided by a complex coincident-timing apparatus. Different visual stimulus speeds were used during the random practice. 33 children (M age = 11.6 yr.) were randomly assigned to one of three experimental groups: constant-random, constant-random 33%, and constant-random 66%. The constant-random group practiced constantly until reaching a criterion of performance stabilization: three consecutive trials within 50 msec. of error. The other two groups had additional constant practice of 33% and 66%, respectively, of the number of trials needed to achieve the stabilization criterion. All three groups then performed 36 trials under random practice; in the adaptation phase, they practiced at a visual stimulus speed different from the one adopted in the stabilization phase. Global performance measures were absolute, constant, and variable errors, and movement pattern was analyzed by relative timing and overall movement time. There was no group difference in the global performance measures or in overall movement time. However, differences between the groups were observed in movement pattern, as the constant-random 66% group changed its relative timing in the adaptation phase.
Abstract:
A one-step process has been developed for extracting low-lignin hemicellulose from sugarcane bagasse using an alkaline solution of hydrogen peroxide. To maximize hemicellulose yields, several extraction conditions were examined in a 2⁴ factorial design: H₂O₂ concentration from 2 to 6% (w/v), reaction time from 4 to 16 h, temperature from 20 to 60 °C, and magnesium sulfate absence or presence (0.5%, w/v). This approach allowed selection of conditions for the extraction of low and high lignin content hemicellulose. At the midpoint the yield of hemicellulose was 94.5%, with more than 88% of the lignin removed. Lignin removal is suppressed at low extraction temperatures and in the absence of magnesium sulfate. Hemicellulose in 86% yield with low lignin content (5.9%) was obtained with 6% H₂O₂ treatment for 4 h at 20 °C. This hemicellulose is much lighter in color than samples obtained at the midpoint condition and was found suitable for subsequent enzymatic hydrolysis. (C) 2009 Elsevier B.V. All rights reserved.
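The 2⁴ design amounts to 16 coded runs plus the reported midpoint. The snippet below enumerates the design matrix using the factor ranges quoted in the abstract; the dictionary layout and factor names are our own illustrative choices:

```python
from itertools import product

factors = ["H2O2_pct", "time_h", "temp_C", "MgSO4_pct"]
low_high = {"H2O2_pct": (2, 6), "time_h": (4, 16),
            "temp_C": (20, 60), "MgSO4_pct": (0.0, 0.5)}

# 16 runs: every combination of low (-1) and high (+1) factor levels
runs = [{f: low_high[f][(lvl + 1) // 2] for f, lvl in zip(factors, levels)}
        for levels in product((-1, +1), repeat=len(factors))]

# the center (midpoint) run reported in the abstract
midpoint = {f: (lo + hi) / 2 for f, (lo, hi) in low_high.items()}
```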
Abstract:
A heuristic algorithm that employs fuzzy logic is proposed for the power system transmission expansion planning problem. The algorithm is based on a divide-and-conquer strategy controlled by the fuzzy system. It provides high-quality solutions through fuzzy decision making, which uses nondeterministic criteria to guide the search. The fuzzy system provides a self-adjusting mechanism that eliminates the manual tuning of parameters for each system being solved. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Power loss reduction in distribution systems (DSs) is a nonlinear, multiobjective problem. Service restoration in DSs is computationally even harder, since it additionally requires a solution in real time. Both DS problems are computationally complex: for large-scale networks, the usual formulation has thousands of constraint equations. Node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the solution simpler. In addition, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. Combining the multiobjective EA with NDE (MEAN) yields the proposed approach for solving DS problems on large-scale networks. Simulation results show that MEAN finds adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, MEAN exhibits sublinear running time as a function of system size: tests with networks ranging from 632 to 5166 switches indicate that MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively little running time.
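Node-depth encoding stores each feeder of the radial network as a list of (node, depth) pairs in depth-first order, so every encoded individual is a tree by construction and the radiality constraint equations disappear. A minimal decoding routine, with a made-up four-bus feeder as the example:

```python
def nde_to_parents(nde):
    """Recover parent pointers from a node-depth list in DFS order:
    the parent of a node at depth d is the most recently seen node
    at depth d - 1 (None marks the feeder root)."""
    last_at_depth, parents = {}, {}
    for node, depth in nde:
        parents[node] = last_at_depth.get(depth - 1)
        last_at_depth[depth] = node
    return parents

# hypothetical feeder: bus2 and bus4 hang off bus1, bus3 off bus2
feeder = [("bus1", 0), ("bus2", 1), ("bus3", 2), ("bus4", 1)]
assert nde_to_parents(feeder)["bus3"] == "bus2"
```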
Abstract:
Voltage and current waveforms of a distribution or transmission power system are not pure sinusoids: they carry distortions that can be represented as a combination of the fundamental frequency, harmonics, and high-frequency transients. This paper presents a novel approach to identifying harmonics in distorted power system waveforms. The proposed method is based on Genetic Algorithms, an optimization technique inspired by genetics and natural evolution. GOOAL, an intelligent algorithm specially designed for optimization problems, was successfully implemented and tested. Two kinds of chromosome representation are used: binary and real. The results show that the proposed method is more precise than the traditional Fourier Transform, especially with the real representation of the chromosomes.
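GOOAL's internals are not spelled out in the abstract, so the sketch below shows only a plausible real-coded chromosome and its fitness for the harmonic-identification task: the chromosome carries amplitudes and phases of the fundamental and its harmonics, and fitness is the negated squared error against the sampled waveform. The 60 Hz fundamental and the five-component model are our assumptions:

```python
import math

def model_waveform(amps, phases, t, f0=60.0):
    """Fundamental plus harmonics: sum_k a_k * sin(2*pi*k*f0*t + phi_k)."""
    return sum(a * math.sin(2 * math.pi * (k + 1) * f0 * t + p)
               for k, (a, p) in enumerate(zip(amps, phases)))

def fitness(chromosome, times, samples, n_harm=5):
    """Real-coded chromosome = [a_1..a_n, phi_1..phi_n]; higher is better."""
    amps, phases = chromosome[:n_harm], chromosome[n_harm:]
    return -sum((model_waveform(amps, phases, t) - s) ** 2
                for t, s in zip(times, samples))
```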
Abstract:
The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives: the level of work-in-process, the intercell moves, and the total machinery investment. A genetic algorithm searches the design space in order to approximate the Pareto optimal set. The objective values of each candidate solution in a population are assigned by running a discrete-event simulation, in which the model is automatically generated according to the number of machines and their distribution among cells implied by that solution. The potential of this approach is evaluated by applying it to an illustrative example and to a case from the relevant literature. The results are analyzed, and it is concluded that the approach is capable of generating a set of alternative manufacturing cell configurations that optimize multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
The general flowshop scheduling problem is a production problem in which a set of n jobs must be processed with an identical flow pattern on m machines. In permutation flowshops the sequence of jobs is the same on all machines. A significant research effort has been devoted to sequencing jobs in a flowshop so as to minimize the makespan. This paper describes the application of a Constructive Genetic Algorithm (CGA) to makespan minimization in flowshop scheduling. The CGA was recently proposed as an alternative to traditional GA approaches, particularly for evaluating schemata directly. The population, initially formed only by schemata, evolves under recombination into a population of well-adapted structures (schemata instantiation). The implemented CGA is based on the classic NEH heuristic, and a local search heuristic is used to define the fitness functions. The parameters of the CGA are calibrated using a Design of Experiments (DOE) approach. The computational results are compared against other successful algorithms from the literature on Taillard's well-known standard benchmark. The computational experience shows that this CGA approach provides competitive results for flowshop scheduling problems. (C) 2007 Elsevier Ltd. All rights reserved.
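The NEH heuristic that seeds the CGA is easy to state: order the jobs by decreasing total processing time, then insert each job at the position of the growing partial sequence that minimizes makespan. A compact sketch (the CGA itself, its schemata, and its fitness functions are not reproduced here), with made-up processing times as the usage example:

```python
def makespan(seq, p):
    """Completion time of the last job on the last machine of a permutation
    flowshop; p[j][m] = processing time of job j on machine m."""
    n_mach = len(p[0])
    c = [0] * n_mach  # c[m]: completion time of the latest job on machine m
    for j in seq:
        c[0] += p[j][0]
        for m in range(1, n_mach):
            c[m] = max(c[m], c[m - 1]) + p[j][m]
    return c[-1]

def neh(p):
    """NEH: sort jobs by decreasing total processing time, then insert each
    at the makespan-minimizing position of the partial sequence."""
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    for j in order:
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: makespan(s, p))
    return seq

p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]  # made-up times
print(neh(p), makespan(neh(p), p))
```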
Abstract:
The aim of this article is to present the main contributions of human resource management to the development of sustainable organizations. The relationship between human resources and organizational sustainability, which rests on economic, social, and environmental performance, involves important management aspects such as innovation, cultural diversity, and the environment. Integrating items from the triple bottom line approach leads to a model based on a strategic and central role for human resource management. Based on this model, propositions and recommendations for future research on this theme are presented.