809 results for blocking algorithm
Abstract:
A simple, easy-to-handle prescription is developed that converts the task of checking the unitarity of massive, topologically massive models into a straightforward algebraic exercise. The algorithm is used to test the unitarity of both topologically massive higher-derivative electromagnetism (TMHDE) and topologically massive higher-derivative gravity (TMHDG). The novel and remarkable features of these effective field models are also discussed.
Abstract:
A non-twist Hamiltonian system perturbed by two waves with particular wave numbers can present Robust Tori (RT), barriers created by the vanishing of the perturbing Hamiltonian at certain positions. When Robust Tori exist, any trajectory in phase space passing close to them is blocked by emergent invariant curves that prevent chaotic transport. We analyze the breakup of the RT as well as the dependence of the transport on the wave numbers and on the wave amplitudes. Moreover, we report the formation of a chaotic web in phase space and how this pattern influences the transport.
Abstract:
This paper introduces an improved tabu-based vector optimal algorithm for the multiobjective optimal design of electromagnetic devices. The improvements include a division of the entire search process, a new method for fitness assignment, and a novel scheme for the generation and selection of neighborhood solutions, among others. Numerical results on a mathematical function and an engineering multiobjective design problem demonstrate that the proposed method can produce virtually the exact Pareto front, in both parameter and objective spaces, while requiring only about 70% of the iterations needed by its predecessor.
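The abstract describes refinements to tabu search without giving details; for orientation only, here is a minimal single-objective tabu-search loop in Python showing the generic pattern of neighbourhood generation, tabu filtering, and selection of the best admissible move. The sphere objective, step size, neighbourhood size, and tabu tenure are illustrative assumptions, and the paper's multiobjective fitness assignment and search-division schemes are not reproduced.

```python
import random

def objective(x):
    # Sphere function as a stand-in single objective (assumption).
    return sum(v * v for v in x)

def neighbours(x, step=0.1, k=20):
    # Random perturbations of the current point form the neighbourhood.
    return [[v + random.uniform(-step, step) for v in x] for _ in range(k)]

def tabu_search(x0, iters=200, tenure=10):
    best = current = list(x0)
    tabu = []                                   # recently visited (rounded) points
    for _ in range(iters):
        moves = [n for n in neighbours(current)
                 if tuple(round(v, 2) for v in n) not in tabu]
        if not moves:
            continue                            # every neighbour is tabu: skip
        current = min(moves, key=objective)     # best admissible neighbour
        tabu.append(tuple(round(v, 2) for v in current))
        if len(tabu) > tenure:
            tabu.pop(0)                         # fixed-length tabu list
        if objective(current) < objective(best):
            best = list(current)
    return best, objective(best)

print(tabu_search([2.0, -3.0]))
```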
Abstract:
A novel constructive heuristic algorithm for the network expansion planning problem is presented. The basic idea comes from Garver's work applied to the transportation model; the proposed algorithm, however, is for the DC model. Test results with the best-known systems in the literature are reported to show the efficiency of the method.
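As a hedged illustration of a Garver-style constructive loop (not the paper's DC-model algorithm), the sketch below repeatedly solves a relaxed transportation-model LP in which circuit additions are continuous, commits one circuit on the corridor with the largest fractional addition, and stops when the relaxation requires nothing more. The 3-bus data, the uniform circuit capacity, and the use of the largest fractional addition as the sensitivity index are invented simplifications.

```python
import numpy as np
from scipy.optimize import linprog

# Toy 3-bus system (all data are illustrative assumptions)
corridors = [(0, 1), (0, 2), (1, 2)]
cost = np.array([3.0, 2.0, 5.0])        # cost of one new circuit per corridor
cap = 40.0                              # capacity of each circuit
n0 = np.array([1.0, 1.0, 0.0])          # circuits already installed
inj = np.array([150.0, -50.0, -100.0])  # generation minus load at each bus

m, nb = len(corridors), len(inj)
A = np.zeros((nb, m))                   # bus-corridor incidence (Kirchhoff current law)
for k, (i, j) in enumerate(corridors):
    A[i, k], A[j, k] = 1.0, -1.0

def relaxed_plan(n_installed):
    """Transportation-model LP with continuous circuit additions n_k >= 0."""
    c = np.concatenate([np.zeros(m), cost])         # variables: [flows, additions]
    A_eq = np.hstack([A, np.zeros((nb, m))])
    A_ub = np.vstack([np.hstack([np.eye(m), -cap * np.eye(m)]),
                      np.hstack([-np.eye(m), -cap * np.eye(m)])])
    b_ub = np.concatenate([cap * n_installed, cap * n_installed])
    bounds = [(None, None)] * m + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=inj,
                  bounds=bounds, method="highs")
    return res.x[m:]                                # fractional additions

plan = np.zeros(m)
while True:
    frac = relaxed_plan(n0 + plan)
    k = int(np.argmax(frac))
    if frac[k] < 1e-6:
        break                                       # the relaxation needs nothing more
    plan[k] += 1.0                                  # commit one circuit on that corridor
print("circuits added per corridor:", plan, "cost:", plan @ cost)
```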
Abstract:
Objective: To develop and apply the liquid-phase blocking sandwich ELISA (BLOCKING-ELISA) for the quantification of antibodies against foot-and-mouth disease virus (FMDV) strains O-1 Campos, A(24) Cruzeiro, and C-3 Indaial.
Design: Antibody quantification.
Sample population: 158 water buffalo from various premises of São Paulo State, Brazil. The sera were collected from either systematically vaccinated or nonvaccinated animals.
Procedure: The basic reagents of the BLOCKING-ELISA (capture and detector antibodies, virus antigens, and conjugate) were prepared, and the reaction was optimized and standardized to quantify water buffalo antibodies against FMDV. An alternative procedure based on mathematical interpolation was adopted to estimate more precisely the antibody 50% competition titers in the BLOCKING-ELISA. These titers were compared with the virus-neutralization test (VNT) titers to determine the correlation between the two techniques. The percentages of agreement, cutoff points, and reproducibility were also determined.
Results: The antibody titers obtained in the BLOCKING-ELISA showed high positive correlation coefficients with the VNT, reaching values of 0.90 for O-1 Campos and C-3 Indaial, and 0.82 for A(24) Cruzeiro (P < 0.0005). The cutoff points obtained from the copositivity and conegativity curves allowed determination of high levels of agreement between BLOCKING-ELISA and VNT antibody titers against the three FMDV strains analyzed.
Conclusions: The results, characterized by high correlation coefficients, levels of agreement, and reproducibility, indicate that the BLOCKING-ELISA may replace the conventional VNT for detection and quantification of antibodies to FMDV in water buffalo sera.
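The abstract does not give the interpolation formula; as one plausible reading, the sketch below estimates the 50% competition titer by log-linear interpolation between the two dilutions that bracket 50% competition. The dilution series and competition values in the example are made up.

```python
import math

def titer_50(dilutions, competition):
    """Estimate the dilution giving 50% competition by log-linear interpolation.
    dilutions: increasing dilution factors (e.g. 10, 20, 40, ...);
    competition: percent competition observed at each dilution."""
    for i in range(len(dilutions) - 1):
        c1, c2 = competition[i], competition[i + 1]
        if (c1 - 50) * (c2 - 50) <= 0:            # 50% is crossed in this interval
            d1 = math.log10(dilutions[i])
            d2 = math.log10(dilutions[i + 1])
            frac = (c1 - 50) / (c1 - c2)          # position of 50% within the interval
            return 10 ** (d1 + frac * (d2 - d1))
    return None                                   # 50% not bracketed by the series

# Example with invented data: the 50% point lies between dilutions 40 and 80.
print(titer_50([10, 20, 40, 80, 160], [88, 74, 60, 41, 22]))
```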
Abstract:
Factorial experiments are widely used in industry to investigate the effects of process factors on quality response variables. Many food processes, for example, are not only subject to variation between days, but also between different times of the day. Removing this variation using blocking factors leads to row-column designs. In this paper, an algorithm is described for constructing factorial row-column designs when the factors are quantitative, and the data are to be analysed by fitting a polynomial model. The row-column designs are constructed using an iterative interchange search, where interchanges that result in an improvement in the weighted mean of the efficiency factors corresponding to the parameters of interest are accepted. Some examples illustrating the performance of the algorithm are given.
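A skeleton of an iterative interchange search of this kind is sketched below: two cells of a row-column layout are swapped and the swap is kept only if a design criterion improves. The criterion used here (a D-type criterion for a quadratic single-factor model adjusted for row and column effects), the 4 x 6 layout, and the three factor levels are stand-in assumptions, not the paper's weighted mean of efficiency factors.

```python
import numpy as np

rng = np.random.default_rng(1)
rows, cols = 4, 6                      # invented row-column layout
levels = np.array([-1.0, 0.0, 1.0])    # three levels of one quantitative factor

def model_matrix(x):
    # Quadratic polynomial terms; the intercept is omitted because it is
    # confounded with the row/column block effects.
    return np.column_stack([x, x ** 2])

def criterion(design):
    X = model_matrix(design.ravel())
    R = np.kron(np.eye(rows), np.ones((cols, 1)))    # row block indicators
    C = np.kron(np.ones((rows, 1)), np.eye(cols))    # column block indicators
    B = np.hstack([R, C])
    P = np.eye(rows * cols) - B @ np.linalg.pinv(B)  # project out block effects
    M = X.T @ P @ X                                  # adjusted information matrix
    return np.linalg.slogdet(M)[1]                   # D-type criterion (stand-in)

design = rng.choice(levels, size=(rows, cols))
best = criterion(design)
for _ in range(2000):
    i1, j1 = rng.integers(rows), rng.integers(cols)
    i2, j2 = rng.integers(rows), rng.integers(cols)
    design[i1, j1], design[i2, j2] = design[i2, j2], design[i1, j1]
    new = criterion(design)
    if new > best:
        best = new                                   # keep the improving interchange
    else:
        design[i1, j1], design[i2, j2] = design[i2, j2], design[i1, j1]  # undo
print(best)
print(design)
```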
Abstract:
A constructive heuristic algorithm to solve the transmission system expansion planning problem is proposed, with the aim of circumventing some critical problems of classical heuristic algorithms that employ relaxed mathematical models to calculate a sensitivity index guiding the circuit additions. The proposed heuristic is embedded in a branch-and-bound structure and can be used with any planning model, such as the transportation, DC, AC, or hybrid models. Tests of the proposed algorithm on real Brazilian systems are presented.
Abstract:
Two applications of the modified Chebyshev algorithm are considered. The first application deals with the generation of orthogonal polynomials associated with a weight function having singularities on or near the end points of the interval of orthogonality. The other application involves the generation of real Szego polynomials.
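For reference, a minimal Python implementation of the modified Chebyshev (Wheeler) recursion is sketched below: it maps 2n modified moments, taken with respect to an auxiliary monic family with known recurrence coefficients, to the n recurrence coefficients of the desired orthogonal polynomials. The demo uses ordinary monomial moments of the Legendre weight purely as a consistency check; the weight functions with endpoint singularities and the Szego-polynomial application treated in the paper are not reproduced.

```python
import numpy as np

def modified_chebyshev(mom, a, b):
    """Wheeler's modified Chebyshev algorithm (sketch).
    mom : 2n modified moments nu_l = integral of p_l(x) w(x) dx;
    a, b: recurrence coefficients of the known auxiliary monic family,
          p_{l+1}(x) = (x - a_l) p_l(x) - b_l p_{l-1}(x), length >= 2n - 1.
    Returns the n recurrence coefficients (alpha, beta) of the monic
    polynomials orthogonal with respect to w."""
    n = len(mom) // 2
    alpha, beta = np.zeros(n), np.zeros(n)
    sig_old = np.zeros(2 * n)              # sigma_{k-2, l}
    sig = np.array(mom, dtype=float)       # sigma_{k-1, l}, starting with k - 1 = 0
    alpha[0] = a[0] + mom[1] / mom[0]
    beta[0] = mom[0]
    for k in range(1, n):
        sig_new = np.zeros(2 * n)
        for l in range(k, 2 * n - k):
            sig_new[l] = (sig[l + 1] - (alpha[k - 1] - a[l]) * sig[l]
                          - beta[k - 1] * sig_old[l] + b[l] * sig[l - 1])
        alpha[k] = a[k] + sig_new[k + 1] / sig_new[k] - sig[k] / sig[k - 1]
        beta[k] = sig_new[k] / sig[k - 1]
        sig_old, sig = sig, sig_new
    return alpha, beta

# Check: monomial moments of the Legendre weight w(x) = 1 on [-1, 1]
# (a = b = 0 makes p_l the monomials); expected beta_k = k^2 / (4k^2 - 1).
n = 4
mom = [2.0 / (l + 1) if l % 2 == 0 else 0.0 for l in range(2 * n)]
print(modified_chebyshev(mom, np.zeros(2 * n), np.zeros(2 * n)))
```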
Abstract:
A method for optimal transmission network expansion planning is presented. The transmission network is modelled as a transportation network. The problem is solved using hierarchical Benders decomposition, in which the problem is decomposed into master and slave subproblems. The master subproblem models the investment decisions and is solved using a branch-and-bound algorithm. The slave subproblem models the network operation and is solved using a specialised linear program. Several alternative implementations of the branch-and-bound algorithm have been tested. Special characteristics of the transmission expansion problem have been taken into consideration in these implementations. The methods have been tested on various test systems available in the literature.
Abstract:
An algorithm is presented that finds the optimal long-term transmission plan for all cases studied, including relatively large and complex networks. Knowledge of optimal plans is becoming more important in the emerging competitive environment, in which the correct economic signals have to be sent to all participants. The paper presents a new specialised branch-and-bound algorithm for transmission network expansion planning. Optimality is obtained at a cost, however: a transportation model is used to represent the transmission network, in which only Kirchhoff's current law is taken into account (the second law being relaxed). The expansion problem then becomes an integer linear program (ILP), which is solved by the proposed branch-and-bound method without any further approximations. To control combinatorial explosion, the branch-and-bound algorithm is specialised using specific knowledge about the problem, both for the selection of candidate problems and for the selection of the next variable to be used for branching. Special constraints are also used to reduce the gap between the optimal integer solution (the ILP) and the solution obtained by relaxing the integrality constraints (the LP). Tests have been performed with small, medium, and large networks available in the literature.
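As background, the sketch below is a generic depth-first, LP-relaxation-based branch-and-bound in Python/SciPy for a small integer linear program: branch on the most fractional variable and prune by bound or infeasibility. The two-variable instance is invented, and none of the paper's specialised rules for selecting candidate problems, choosing the branching variable, or adding gap-reducing constraints are included.

```python
import math
import numpy as np
from scipy.optimize import linprog

def branch_and_bound(c, A_ub, b_ub, bounds):
    """Depth-first LP-based branch and bound for  min c.x  s.t.  A_ub x <= b_ub,
    x integer within the given bounds. Generic sketch, no specialised rules."""
    best_val, best_x = math.inf, None
    stack = [list(bounds)]
    while stack:
        bnds = stack.pop()
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bnds, method="highs")
        if not res.success or res.fun >= best_val:
            continue                                      # infeasible or bounded out
        gap, i = max((abs(v - round(v)), i) for i, v in enumerate(res.x))
        if gap < 1e-6:
            best_val, best_x = res.fun, np.round(res.x)   # new integer incumbent
            continue
        lo, hi = bnds[i]
        left, right = list(bnds), list(bnds)
        left[i] = (lo, math.floor(res.x[i]))              # branch x_i <= floor
        right[i] = (math.ceil(res.x[i]), hi)              # branch x_i >= ceil
        stack += [left, right]
    return best_val, best_x

# Invented two-variable instance:  max 5x + 4y  s.t.  6x + 4y <= 24,  x + 2y <= 6.
val, x = branch_and_bound(c=[-5.0, -4.0],
                          A_ub=[[6.0, 4.0], [1.0, 2.0]],
                          b_ub=[24.0, 6.0],
                          bounds=[(0, None), (0, None)])
print(-val, x)     # optimal integer value 20 at (4, 0)
```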
Abstract:
The transmission network planning problem is a non-linear mixed-integer programming problem (NLIMP). Most algorithms used to solve it rely on a linear programming (LP) subroutine to solve the LP subproblems generated by the planning algorithm, and the resolution of these LPs can represent a major computational effort. A particularity of these LPs is that, at the optimal solution, only a few inequality constraints are binding. This work transforms the LP into an equivalent problem with only one equality constraint (the power flow equation) and many inequality constraints, and uses a dual simplex algorithm and a relaxation strategy to solve the LPs. The optimisation process starts with only the equality constraint and, at each step, the most infeasible constraint is added. The logic is similar to a proposal for electric system operation planning. The results show a higher performance of the algorithm when compared with primal simplex methods.
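A hedged sketch of the relaxation strategy described above follows: the LP is first solved with only the equality constraint, the most violated inequality is then added, and the process repeats until no inequality is violated. The random data, the construction that guarantees feasibility, and the use of SciPy's HiGHS solver instead of a warm-started dual simplex are assumptions for illustration; the actual power-flow LP is not modelled.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 5, 200
c = rng.uniform(1.0, 2.0, n)               # positive costs (invented)
A = rng.uniform(0.0, 1.0, (m, n))          # many inequality constraints  A x >= b
A_eq = np.ones((1, n)); b_eq = [3.0]       # the single equality constraint
x_feas = np.full(n, 3.0 / n)               # a point satisfying the equality ...
b = 0.9 * A @ x_feas                       # ... and, by construction, every inequality

active = []                                 # inequalities currently enforced
for _ in range(m):
    A_ub = -A[active] if active else None   # linprog expects  A_ub x <= b_ub
    b_ub = -b[active] if active else None
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    violation = b - A @ res.x               # positive entries are violated rows
    worst = int(np.argmax(violation))
    if violation[worst] <= 1e-6:
        break                               # every relaxed inequality already holds
    active.append(worst)                    # add the most infeasible constraint
print(res.fun, "with", len(active), "of", m, "inequalities ever added")
```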
Abstract:
The multilayer perceptron has become one of the most widely used neural networks for solving a wide variety of problems. Training is supervised: inputs are presented to the network and its output is compared with a desired value. However, the algorithm exhibits convergence problems when the desired output has a small slope over the discrete time samples or is quasi-constant. This paper presents an alternative approach to this convergence problem: the desired output data set is pre-conditioned before training, and a post-conditioning step is applied when the generalization results are obtained. Simulation results are presented to validate the proposed approach.
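The abstract does not specify the conditioning method; one plausible realisation is the affine rescaling sketched below, which maps a quasi-constant target series onto [-1, 1] before training and inverts the mapping on the network outputs afterwards. The target range, the toy series, and the class interface are assumptions.

```python
import numpy as np

class TargetConditioner:
    """Affine pre-conditioning of a nearly constant, low-slope target series to a
    wider range before training, with the inverse applied to the network outputs
    afterwards. The [-1, 1] range is an illustrative choice."""
    def fit(self, y):
        self.lo, self.hi = float(np.min(y)), float(np.max(y))
        self.span = max(self.hi - self.lo, 1e-12)   # guard against a constant target
        return self
    def transform(self, y):
        return 2.0 * (np.asarray(y) - self.lo) / self.span - 1.0
    def inverse(self, y_scaled):
        return (np.asarray(y_scaled) + 1.0) * self.span / 2.0 + self.lo

# Usage sketch: condition the desired output, train on it, un-condition predictions.
y = 0.5 + 1e-3 * np.sin(np.linspace(0, 6, 200))     # quasi-constant target (toy data)
cond = TargetConditioner().fit(y)
y_train = cond.transform(y)                         # fed to the network during training
# ... train the MLP on y_train, then:
y_pred = cond.inverse(y_train)                      # post-conditioning of the output
print(np.allclose(y_pred, y))
```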
Abstract:
A low-cost computational procedure to determine the orbit of an artificial satellite from short-arc data of an onboard GPS receiver is proposed. Pseudoranges are used as measurements to estimate the orbit via a recursive least-squares method. The algorithm applies orthogonal Givens rotations to solve recursive and sequential orbit determination problems. To assess the procedure, it was applied to the TOPEX/POSEIDON satellite for data batches of one orbital period (approximately two hours), and force modelling based on the full JGM-2 gravity field model was considered. When compared with the reference Precision Orbit Ephemeris (POE) of JPL/NASA, the results indicate that a precision better than 9 m is easily obtained, even when short batches of data are used.
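As a minimal illustration of the estimation kernel only, the sketch below folds scalar measurements one at a time into a square-root information pair (R, z) using Givens rotations, so that the least-squares estimate always solves R x = z. The random linear measurements are a stand-in; pseudorange modelling, the orbital dynamics, and the JGM-2 force model are not reproduced.

```python
import numpy as np

def givens_update(R, z, h, y):
    """Fold one new measurement  y = h . x + noise  into the square-root
    information pair (R, z) using Givens rotations, so that the current
    least-squares estimate solves R x = z."""
    n = len(h)
    row = np.append(h.astype(float), y)
    Rz = np.column_stack([R, z])
    for j in range(n):
        a, b = Rz[j, j], row[j]
        r = np.hypot(a, b)
        if r == 0.0:
            continue
        c, s = a / r, b / r                  # rotation zeroing element j of the new row
        Rj = Rz[j, :].copy()
        Rz[j, :] = c * Rj + s * row
        row = -s * Rj + c * row
    return Rz[:, :n], Rz[:, n]

# Usage: recover x from noisy scalar measurements h_k . x (invented data).
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0, 0.5])
R, z = np.zeros((3, 3)), np.zeros(3)         # no prior information
for _ in range(50):
    h = rng.normal(size=3)
    y = h @ x_true + 0.01 * rng.normal()
    R, z = givens_update(R, z, h, y)
print(np.linalg.solve(R, z))                 # close to x_true
```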