11 results for Infeasible solution space search

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

100.00%

Publisher:

Abstract:

The main feature of partition of unity methods such as the generalized or extended finite element method is their ability to utilize a priori knowledge about the solution of a problem in the form of enrichment functions. However, the analytical derivation of enrichment functions with good approximation properties is mostly limited to two-dimensional linear problems. This paper presents a procedure to numerically generate proper enrichment functions for three-dimensional problems with confined plasticity where plastic evolution is gradual. The procedure involves solving boundary value problems around local regions exhibiting nonlinear behavior and enriching the global solution space with the local solutions through the partition of unity method framework. This approach can produce accurate nonlinear solutions at a reduced computational cost compared to standard finite element methods, since computationally intensive nonlinear iterations can be performed on coarse global meshes once enrichment functions properly describing the localized nonlinear behavior have been created. Several three-dimensional nonlinear problems based on rate-independent J2 plasticity theory with isotropic hardening are solved using the proposed procedure to demonstrate its robustness, accuracy and computational efficiency.
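The mechanism that makes this global-local strategy possible is the partition of unity property: because the standard shape functions sum to one everywhere, an enriched space can reproduce any enrichment function, including one obtained numerically from a local boundary value problem. A minimal 1D sketch (not the paper's implementation; the mesh and the enrichment function are illustrative stand-ins):

```python
# Sketch: 1D partition-of-unity enrichment. Hat functions phi_i sum
# to 1, so the enriched space spanned by phi_i(x) * psi(x) reproduces
# any enrichment psi exactly -- the property that lets a numerically
# generated local solution be embedded in the global approximation.
import math

nodes = [0.0, 0.25, 0.5, 0.75, 1.0]  # uniform 1D mesh (illustrative)

def hat(i, x):
    """Piecewise-linear hat function of node i."""
    h = nodes[1] - nodes[0]
    return max(0.0, 1.0 - abs(x - nodes[i]) / h)

def psi(x):
    """Stand-in for a numerically generated enrichment (e.g. a local
    nonlinear solution); here just a smooth nonlinear field."""
    return math.sin(3.0 * x) + x * x

def enriched(x):
    # Enriched approximation with unit enrichment dofs:
    # u_h(x) = sum_i phi_i(x) * psi(x) = psi(x), since sum_i phi_i = 1.
    return sum(hat(i, x) * psi(x) for i in range(len(nodes)))

for x in [0.1, 0.33, 0.5, 0.9]:
    assert abs(enriched(x) - psi(x)) < 1e-12
```

The same reproduction argument carries over to 3D, where psi would come from the local nonlinear solves described in the abstract.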

Relevance:

100.00%

Publisher:

Abstract:

Current SoC design trends are characterized by the integration of a larger number of IPs targeting a wide range of application fields. Such multi-application systems are constrained by a set of requirements. In this scenario, networks-on-chip (NoCs) are becoming more important as the on-chip communication structure. Designing an optimal NoC that satisfies the requirements of each individual application requires the specification of a large set of configuration parameters, leading to a wide solution space. It has been shown that IP mapping is one of the most critical parameters in NoC design, strongly influencing SoC performance. IP mapping has been solved for single-application systems using single- and multi-objective optimization algorithms. In this paper we propose the use of a multi-objective adaptive immune algorithm (M2AIA), an evolutionary approach, to solve the multi-application NoC mapping problem. Latency and power consumption were adopted as the target objective functions. To assess the efficiency of our approach, our results are compared with those of genetic and branch-and-bound multi-objective mapping algorithms. We tested 11 well-known benchmarks, including random and real applications, combining up to 8 applications on the same SoC. The experimental results show that M2AIA reduces power consumption and latency by, on average, 27.3% and 42.1% compared to the branch-and-bound approach and 29.3% and 36.1% compared to the genetic approach.
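To see why the mapping choice drives both objectives, a common proxy for latency and communication energy on a 2D mesh NoC is traffic volume weighted by Manhattan hop distance between the tiles hosting the communicating IPs. A small sketch (IP names, traffic volumes and mesh size are hypothetical, not taken from the benchmarks in the abstract):

```python
# Sketch: evaluating an IP-to-tile mapping on a 2x2 mesh NoC.
# Cost = sum over flows of (traffic volume * Manhattan hop count);
# a standard proxy for both latency and power in mapping objectives.
MESH_W = 2  # mesh width; tiles numbered 0..3 row-major

def hops(tile_a, tile_b):
    ax, ay = tile_a % MESH_W, tile_a // MESH_W
    bx, by = tile_b % MESH_W, tile_b // MESH_W
    return abs(ax - bx) + abs(ay - by)

def mapping_cost(mapping, traffic):
    """mapping: ip -> tile; traffic: {(src_ip, dst_ip): volume}."""
    return sum(vol * hops(mapping[s], mapping[d])
               for (s, d), vol in traffic.items())

traffic = {("cpu", "mem"): 100, ("cpu", "dsp"): 10, ("dsp", "mem"): 50}
good = {"cpu": 0, "mem": 1, "dsp": 3}  # heavy cpu<->mem pair adjacent
bad  = {"cpu": 0, "mem": 3, "dsp": 1}  # heavy pair on opposite corners

assert mapping_cost(good, traffic) < mapping_cost(bad, traffic)
```

A search algorithm such as M2AIA explores permutations of such mappings, trading this kind of cost off across all applications sharing the SoC.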

Relevance:

40.00%

Publisher:

Abstract:

Support Vector Machines (SVMs) have achieved very good performance on different learning problems. However, the success of SVMs depends on the adequate choice of the values of a number of parameters (e.g., the kernel and regularization parameters). In the current work, we propose the combination of meta-learning and search algorithms to deal with the problem of SVM parameter selection. In this combination, given a new problem to be solved, meta-learning is employed to recommend SVM parameter values based on parameter configurations that have been successfully adopted in previous similar problems. The parameter values returned by meta-learning are then used as initial search points by a search technique, which further explores the parameter space. The rationale is that the initial solutions provided by meta-learning lie in good regions of the search space (i.e., closer to optimal solutions), so the search algorithm needs to evaluate fewer candidate solutions to find an adequate one. In this work, we investigate the combination of meta-learning with two search algorithms: Particle Swarm Optimization and Tabu Search. The implemented hybrid algorithms were used to select the values of two SVM parameters in the regression domain, and were compared against the search algorithms applied without meta-learning. The experimental results on a set of 40 regression problems showed that, on average, the proposed hybrid methods obtained lower error rates than their components applied in isolation.
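The warm-start idea can be sketched in a few lines: recommend the stored configuration of the most similar past problem (by distance on meta-features), then refine it with a local search. Everything here is a toy stand-in, including the meta-base entries and the "cross-validation error" surface; a real system would train SVMs to evaluate each candidate:

```python
# Sketch (hypothetical data): meta-learning recommends a starting
# SVM configuration (log C, log gamma) from the most similar past
# problem; a simple local search then refines it. cv_error is a toy
# stand-in for an actual cross-validation run.
import math

# Meta-base: meta-features of past problems -> best (log C, log gamma).
meta_base = [
    ((0.2, 0.9), (2.0, -3.0)),
    ((0.8, 0.1), (0.0, 1.0)),
]

def recommend(meta_features):
    """Return the stored params of the nearest past problem."""
    return min(meta_base,
               key=lambda e: math.dist(e[0], meta_features))[1]

def cv_error(params):  # toy error surface with optimum at (1.8, -2.5)
    c, g = params
    return (c - 1.8) ** 2 + (g + 2.5) ** 2

def local_search(start, step=0.5, iters=50):
    best = start
    for _ in range(iters):
        neighbors = [(best[0] + dx, best[1] + dy)
                     for dx in (-step, 0, step) for dy in (-step, 0, step)]
        best = min(neighbors, key=cv_error)
    return best

start = recommend((0.25, 0.85))  # nearest past problem is the first entry
tuned = local_search(start)
assert cv_error(tuned) <= cv_error(start)
```

The paper's hybrids replace this naive hill climb with Particle Swarm Optimization or Tabu Search, but the warm-start mechanism is the same.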

Relevance:

30.00%

Publisher:

Abstract:

The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics like evolutionary algorithms (EAs) have been largely investigated for such problems. Such optimization methods simultaneously generate a large number of potential solutions to investigate the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees, or more generally, spanning forests. To efficiently explore the search space, special data structures have been developed to provide operations that manipulate a set of spanning trees (population). For a tree with n nodes, the most efficient data structures available in the literature require time O(n) to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called node-depth-degree representation (NDDR), and we demonstrate that using this encoding, generating a new spanning forest requires average time O(sqrt(n)). Experiments with an EA based on NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem have shown that the implementation adds small constants and lower order terms to the theoretical bound.

Relevance:

30.00%

Publisher:

Abstract:

An analogue of the Newton-Wigner position operator is defined for a massive neutral scalar field in de Sitter space. The one-particle subspace of the theory, consisting of positive-energy solutions of the Klein-Gordon equation selected by the Hadamard condition, is identified with an irreducible representation of the de Sitter group. Postulates of localizability analogous to those written by Wightman for fields in Minkowski space are formulated on it, and a unique solution is shown to exist. Representations in both the principal and the complementary series are considered. A simple expression for the time evolution of the Newton-Wigner operator is presented.

Relevance:

30.00%

Publisher:

Abstract:

We report the detection of CoRoT-23b, a hot Jupiter transiting in front of its host star with a period of 3.6314 +/- 0.0001 days. This planet was discovered thanks to photometric data secured with the CoRoT satellite, combined with spectroscopic radial velocity (RV) measurements. A photometric search for possible background eclipsing binaries conducted at CFHT and OGS concluded with a very low risk of false positives. The usual techniques of combining RV and transit data simultaneously were used to derive stellar and planetary parameters. The planet has a mass of M_p = 2.8 +/- 0.3 M_Jup, a radius of R_p = 1.05 +/- 0.13 R_Jup, and a density of approximately 3 g cm^-3. RV data also clearly reveal a nonzero eccentricity of e = 0.16 +/- 0.02. The planet orbits a mature G0 main sequence star of V = 15.5 mag, with a mass M_star = 1.14 +/- 0.08 M_Sun, a radius R_star = 1.61 +/- 0.18 R_Sun and quasi-solar abundances. The age of the system is evaluated to be 7 Gyr, not far from the transition to subgiant, in agreement with the rather large stellar radius. The two features of a significant orbital eccentricity and a fairly high density are uncommon for a hot Jupiter. The high density is, however, consistent with a model of contraction of a planet at this mass, given the age of the system. On the other hand, at such an age, circularization is expected to be completed. In fact, we show that for this planetary mass and orbital distance, any initial eccentricity should not totally vanish after 7 Gyr, as long as the tidal quality factor Q_p is more than a few 10^5, a value at the lower bound of the usually expected range. Even though CoRoT-23b features a density and an eccentricity that are atypical of a hot Jupiter, it is thus not an enigmatic object.
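The quoted ~3 g cm^-3 bulk density follows directly from the quoted mass and radius; a quick check (using standard Jovian reference values, not figures from the paper):

```python
# Arithmetic check: bulk density of CoRoT-23b from the quoted
# M_p = 2.8 M_Jup and R_p = 1.05 R_Jup. Constants are standard
# reference values for Jupiter's mass and equatorial radius.
import math

M_JUP = 1.898e27   # kg
R_JUP = 6.9911e7   # m

m = 2.8 * M_JUP
r = 1.05 * R_JUP
rho = m / (4.0 / 3.0 * math.pi * r ** 3)  # kg m^-3

rho_cgs = rho / 1000.0  # g cm^-3; comes out near 3.2
assert 2.5 < rho_cgs < 3.5  # consistent with the quoted ~3 g cm^-3
```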

Relevance:

30.00%

Publisher:

Abstract:

The solution of structural reliability problems by the first-order method requires optimization algorithms to find the smallest distance between a limit state function and the origin of standard Gaussian space. The Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm, developed specifically for this purpose, has been shown to be efficient but not robust, as it fails to converge for a significant number of problems. On the other hand, recent developments in general (augmented Lagrangian) optimization techniques have not been tested in application to structural reliability problems. In the present article, three new optimization algorithms for structural reliability analysis are presented. One algorithm is based on HLRF, but uses a new differentiable merit function with Wolfe conditions to select the step length in line search. It is shown in the article that, under certain assumptions, the proposed algorithm generates a sequence that converges to the local minimizer of the problem. Two new augmented Lagrangian methods are also presented, which use quadratic penalties to solve nonlinear problems with equality constraints. The performance and robustness of the new algorithms are compared to the classic augmented Lagrangian method, to HLRF and to the improved HLRF (iHLRF) algorithms, in the solution of 25 benchmark problems from the literature. The new HLRF-based algorithm is shown to be more robust than HLRF or iHLRF, and as efficient as iHLRF. The two augmented Lagrangian methods proposed herein are shown to be more robust and more efficient than the classical augmented Lagrangian method.
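For reference, the classic HLRF iteration that the article improves on can be stated in a few lines: in standard Gaussian space, repeatedly project the current point onto the linearized limit state g(u) = 0; at convergence the reliability index beta is the norm of the design point. A sketch with a linear limit state, where the exact answer is the distance from the origin to the hyperplane:

```python
# Sketch of the classic HLRF iteration for first-order reliability:
# u_{k+1} = [(grad_g . u_k - g(u_k)) / ||grad_g||^2] * grad_g.
import math

def hlrf(g, grad, u0, iters=50):
    u = list(u0)
    for _ in range(iters):
        gr = grad(u)
        n2 = sum(c * c for c in gr)
        lam = (sum(c * x for c, x in zip(gr, u)) - g(u)) / n2
        u = [lam * c for c in gr]
    return u

# Linear limit state g(u) = 5 - u1 - u2: the design point is
# (2.5, 2.5), so beta = 5 / sqrt(2).
g = lambda u: 5.0 - u[0] - u[1]
grad = lambda u: [-1.0, -1.0]

u_star = hlrf(g, grad, [0.0, 0.0])
beta = math.hypot(*u_star)
assert abs(beta - 5.0 / math.sqrt(2.0)) < 1e-9
```

On a linear limit state HLRF converges in one step; the non-convergence the article addresses arises on strongly nonlinear limit states, which is what the merit-function line search and the augmented Lagrangian variants are for.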

Relevance:

30.00%

Publisher:

Abstract:

CoRoT-21, an F8IV star of magnitude V = 16 mag, was observed by the space telescope CoRoT during the Long Run 01 (LRa01) in the first winter field (constellation Monoceros) from October 2007 to March 2008. Transits were discovered during the light curve processing. Radial velocity follow-up observations were performed mainly with the 10-m Keck telescope in January 2010. The companion CoRoT-21b is a Jupiter-like planet of 2.26 +/- 0.33 Jupiter masses and 1.30 +/- 0.14 Jupiter radii in a circular orbit with semi-major axis 0.0417 +/- 0.0011 AU and an orbital period of 2.72474 +/- 0.00014 days. The planetary bulk density is (1.36 +/- 0.48) x 10^3 kg m^-3, very similar to the bulk density of Jupiter, and follows an M^(1/3)-R relation like Jupiter. The F8IV star is a sub-giant of 1.29 +/- 0.09 solar masses and 1.95 +/- 0.2 solar radii. The star and the planet exchange extreme tidal forces that will lead to orbital decay and extreme spin-up of the stellar rotation within 800 Myr if the stellar dissipation is Q_*/k2_* <= 10^7.
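The quoted semi-major axis is consistent with the quoted stellar mass and period via Kepler's third law, a^3 = G M_star P^2 / (4 pi^2); a quick check using standard constants (not values from the paper):

```python
# Arithmetic check of CoRoT-21b's semi-major axis from Kepler's
# third law, with M_star = 1.29 M_Sun and P = 2.72474 days.
# G_MSUN is the standard nominal solar mass parameter.
import math

G_MSUN = 1.32712440018e20  # G * M_Sun, m^3 s^-2
AU = 1.495978707e11        # m

m_star = 1.29               # solar masses
period = 2.72474 * 86400.0  # s

a = (G_MSUN * m_star * period ** 2 / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)
a_au = a / AU               # comes out near 0.0416
assert abs(a_au - 0.0417) < 0.001  # matches the quoted 0.0417 AU
```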

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the m-machine no-wait flow shop problem where the set-up time of a job is separated from its processing time. The performance measure considered is the total flowtime. A new hybrid metaheuristic, Genetic Algorithm-Cluster Search, is proposed to solve the scheduling problem. The performance of the proposed method is evaluated and the results are compared with the best method reported in the literature. Experimental tests show the superiority of the new method on the test problem set in terms of solution quality.
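The objective being optimized can be computed directly: in a no-wait flow shop a job, once started, flows through all machines without waiting, so its start time is the smallest value satisfying every machine's availability. A sketch of the total-flowtime evaluation for one permutation, assuming anticipatory, sequence-independent setups (the data are hypothetical, and this is an evaluator only, not the paper's metaheuristic):

```python
# Sketch: total flowtime of a job permutation in an m-machine
# no-wait flow shop with separated (anticipatory) setup times.
def total_flowtime(order, proc, setup):
    """proc[j][m], setup[j][m]; returns the sum of completion times."""
    n_mach = len(proc[0])
    free = [0.0] * n_mach          # next free time of each machine
    flowtime = 0.0
    for j in order:
        # offset of each machine's start from the job's start on machine 0
        off, offsets = 0.0, []
        for m in range(n_mach):
            offsets.append(off)
            off += proc[j][m]
        # earliest start respecting no-wait flow plus setups everywhere
        s = max(free[m] + setup[j][m] - offsets[m] for m in range(n_mach))
        for m in range(n_mach):
            free[m] = s + offsets[m] + proc[j][m]
        flowtime += s + off        # completion on the last machine
    return flowtime

proc = [[2, 3], [1, 2]]            # 2 jobs, 2 machines (hypothetical)
setup = [[0, 0], [0, 0]]
assert total_flowtime([0, 1], proc, setup) == 12.0
```

A genetic algorithm or cluster search explores permutations, calling an evaluator like this for each candidate.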

Relevance:

30.00%

Publisher:

Abstract:

The topographical character of conical intersections (CIs), either sloped or peaked, has played a fundamental and important role in the discussion of the efficiency of CIs as photochemical "funnels." Here this perspective is employed in connection with a recent study of a model protonated Schiff base (PSB) cis to trans photoisomerization in solution [Malhado et al., J. Phys. Chem. A 115, 3720 (2011)]. In that study, the reduced photochemical quantum yield calculated for the successful production of trans product versus cis reactant in acetonitrile solvent compared to water was interpreted in terms of a dynamical solvent effect related to the dominance, in the acetonitrile case, of S1 to S0 nonadiabatic transitions prior to reaching the seam of CIs. The solvent influence on the quantum yield is here re-examined in the sloped/peaked CI topographical perspective via conversion of the model's two PSB internal coordinates and a nonequilibrium solvent coordinate into an effective branching space description, which is then used to re-analyze the generalized Langevin equation/surface hopping results. The present study supports the original interpretation and enriches it with topographical detail.

Relevance:

30.00%

Publisher:

Abstract:

The University of São Paulo has experienced a growing volume of content in electronic and digital formats, distributed by different suppliers and hosted remotely or in the cloud, and faces increasing difficulty in providing access to this digital collection while continuing to manage its traditional physical collections. A possible solution was identified in the new generation of systems called Web Scale Discovery, which allow better management, data integration and faster search. To determine whether and how such a system would meet USP's demands and expectations and, if so, to establish the criteria for analyzing such a tool, an analytical, essentially documental study was structured, based on a review of the literature and on data available on official websites and from libraries already using this kind of resource. The conceptual basis of the study was defined after identifying existing software assessment methods, producing a standard with 40 analysis criteria, ranging from a single access interface to information content, web 2.0 characteristics, an intuitive interface and faceted navigation, among others. The study examined four of the major systems currently available in this software category in detail, providing support for the decision-making of other libraries interested in such systems.
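A criteria standard like the one described is typically applied as a weighted scoring matrix: each system is rated per criterion and the weighted sums are compared. A minimal sketch (the criteria, weights and ratings below are hypothetical, not the study's 40-criteria standard):

```python
# Sketch (hypothetical data): weighted scoring of candidate
# discovery systems against an assessment-criteria standard.
criteria = {"single search interface": 3,   # criterion -> weight
            "faceted navigation": 2,
            "web 2.0 features": 1}

def score(system_ratings, criteria):
    """system_ratings: criterion -> 0..5 rating; missing counts as 0."""
    return sum(w * system_ratings.get(c, 0) for c, w in criteria.items())

system_a = {"single search interface": 5, "faceted navigation": 4,
            "web 2.0 features": 2}
system_b = {"single search interface": 3, "faceted navigation": 5,
            "web 2.0 features": 5}

assert score(system_a, criteria) == 25  # 3*5 + 2*4 + 1*2
assert score(system_a, criteria) > score(system_b, criteria)
```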