879 results for Egocentric Constraint


Relevance:

10.00%

Publisher:

Abstract:

A mixed integer continuous nonlinear model and a solution method for the problem of orthogonally packing identical rectangles within an arbitrary convex region are introduced in the present work. The convex region is assumed to be made of an isotropic material in such a way that arbitrary rotations of the items, preserving the orthogonality constraint, are allowed. The solution method is based on a combination of branch and bound and active-set strategies for bound-constrained minimization of smooth functions. Numerical results show the reliability of the presented approach. (C) 2010 Elsevier Ltd. All rights reserved.
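As a toy illustration of the geometric core (a sketch under simplifying assumptions, not the authors' model): a rectangle fits inside a convex region if and only if its four corners do, so feasibility of one rotated item inside, say, a circular region can be checked as follows.

```python
import math

# Toy feasibility check (a sketch, not the authors' model): a rectangle of
# width w and height h, centered at (cx, cy) and rotated by theta, fits a
# convex region iff all four corners do; here the region is a circle of
# radius R centered at the origin.
def corners(cx, cy, w, h, theta):
    c, s = math.cos(theta), math.sin(theta)
    half = [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]
    return [(cx + c * px - s * py, cy + s * px + c * py) for px, py in half]

def fits_in_circle(cx, cy, w, h, theta, R):
    return all(px * px + py * py <= R * R
               for px, py in corners(cx, cy, w, h, theta))
```

Since the abstract requires orthogonality among the items, a full model would share one rotation angle (mod 90 degrees) across all rectangles and add pairwise non-overlap constraints.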


A Nonlinear Programming algorithm that converges to second-order stationary points is introduced in this paper. The main tool is a second-order negative-curvature method for box-constrained minimization of a certain class of functions that do not possess continuous second derivatives. This method is used to define an Augmented Lagrangian algorithm of PHR (Powell-Hestenes-Rockafellar) type. Convergence proofs under weak constraint qualifications are given. Numerical examples showing that the new method converges to second-order stationary points in situations in which first-order methods fail are exhibited.
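A minimal sketch of a PHR augmented Lagrangian loop for equality constraints, with plain gradient descent standing in for the paper's box-constrained negative-curvature inner method (all names and tolerances here are illustrative):

```python
# Minimal sketch of a PHR (Powell-Hestenes-Rockafellar) augmented Lagrangian
# loop for  min f(x)  s.t.  h(x) = 0 , with crude gradient descent standing
# in for the paper's box-constrained inner solver (illustrative only).
def al_solve(f_grad, h, h_grad, x, lam=0.0, rho=10.0,
             outer=20, inner=2000, step=1e-2):
    for _ in range(outer):
        for _ in range(inner):  # minimize f(x) + (rho/2) * (h(x) + lam/rho)^2
            g = [fg + rho * (h(x) + lam / rho) * hg
                 for fg, hg in zip(f_grad(x), h_grad(x))]
            x = [xi - step * gi for xi, gi in zip(x, g)]
        lam += rho * h(x)       # first-order multiplier update
    return x, lam

# Example: minimize x0^2 + x1^2 subject to x0 + x1 = 1 (solution (0.5, 0.5))
x, lam = al_solve(lambda x: [2 * x[0], 2 * x[1]],
                  lambda x: x[0] + x[1] - 1.0,
                  lambda x: [1.0, 1.0],
                  [0.0, 0.0])
```

The multiplier estimate converges to the Lagrange multiplier of the constraint (here -1), which is the behavior the PHR scheme's convergence theory formalizes.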


Given an algorithm A for solving some mathematical problem based on the iterative solution of simpler subproblems, an outer trust-region (OTR) modification of A is the result of adding a trust-region constraint to each subproblem. The trust-region size is adaptively updated according to the behavior of crucial variables. The new subproblems should not be more complex than the original ones, and the convergence properties of the OTR algorithm should be the same as those of Algorithm A. In the present work, the OTR approach is exploited in connection with the "greediness phenomenon" of nonlinear programming. Convergence results for an OTR version of an augmented Lagrangian method for nonconvex constrained optimization are proved, and numerical experiments are presented.
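The OTR idea itself is simple to sketch (hypothetical names; the paper's acceptance test and update rules are more elaborate): impose an infinity-norm trust region as extra bounds on the subproblem and update the radius from the outcome.

```python
# Sketch of an outer trust-region step (illustrative, not the paper's rules):
# the existing subproblem solver is reused unchanged, only with extra
# infinity-norm bounds around the current point.
def otr_step(solve_subproblem, merit, x, radius, shrink=0.5, grow=2.0):
    lo = [xi - radius for xi in x]        # trust-region bounds added
    hi = [xi + radius for xi in x]        # to the original subproblem
    trial = solve_subproblem(x, lo, hi)
    if merit(trial) < merit(x):           # accept: enlarge the region
        return trial, radius * grow
    return x, radius * shrink             # reject: shrink the region

trial, r = otr_step(
    lambda x, lo, hi: [min(max(0.0, l), h) for l, h in zip(lo, hi)],  # toy solver: project 0 onto the box
    lambda x: sum(xi * xi for xi in x),                               # merit: sum of squares
    [1.0], 0.3)
# trial -> [0.7], radius grows to 0.6
```

Because the bounds are of the same type as box constraints the solver already handles, the subproblems are no harder than the originals, which is the point of the construction.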


Augmented Lagrangian methods for large-scale optimization usually require efficient algorithms for minimization with box constraints. On the other hand, active-set box-constraint methods employ unconstrained optimization algorithms for minimization inside the faces of the box. Several approaches may be employed for computing internal search directions in the large-scale case. In this paper a minimal-memory quasi-Newton approach with secant preconditioners is proposed, taking into account the structure of Augmented Lagrangians that come from the popular Powell-Hestenes-Rockafellar scheme. A combined algorithm, that uses the quasi-Newton formula or a truncated-Newton procedure, depending on the presence of active constraints in the penalty-Lagrangian function, is also suggested. Numerical experiments using the CUTE collection are presented.
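A small sketch of the face identification on which active-set box-constraint methods rest (illustrative, not the paper's code): variables at a bound are frozen, and the unconstrained inner algorithm, quasi-Newton or truncated Newton, acts only on the free ones.

```python
# Sketch of face identification in active-set box-constraint methods
# (illustrative): variables at (or numerically on) a bound are held fixed,
# and the inner unconstrained algorithm works on the remaining free ones.
def free_variables(x, lo, hi, tol=1e-8):
    return [i for i, (xi, l, h) in enumerate(zip(x, lo, hi))
            if xi - l > tol and h - xi > tol]
```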


The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time, when the interfacial force is linear. However, this linear system is large and dense and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, and we obtain a rigorous estimate for this approximation. This matrix is expeditiously computed by using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes.
We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then we use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.


In this work, we introduce a necessary sequential Approximate-Karush-Kuhn-Tucker (AKKT) condition for a point to be a solution of a continuous variational inequality, and we prove its relation with the Approximate Gradient Projection condition (AGP) of Garciga-Otero and Svaiter. We also prove that a slight variation of the AKKT condition is sufficient for a convex problem, either for variational inequalities or optimization. Sequential necessary conditions are more suitable to iterative methods than usual punctual conditions relying on constraint qualifications. The AKKT property holds at a solution independently of the fulfillment of a constraint qualification, but when a weak one holds, we can guarantee the validity of the KKT conditions.
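For reference, a common statement of the AKKT condition in the optimization case (notation assumed here; the paper's variational-inequality version replaces the gradient of the objective with the operator defining the inequality):

```latex
% AKKT for  min f(x)  s.t.  h(x) = 0,  g(x) <= 0  (one standard formulation):
% x* satisfies AKKT if there exist sequences x^k -> x*, mu^k, lambda^k >= 0 with
\nabla f(x^k) + \nabla h(x^k)\,\mu^k + \nabla g(x^k)\,\lambda^k \to 0,
\qquad \lambda_i^k = 0 \ \text{ whenever } g_i(x^*) < 0 .
```

No constraint qualification is needed for a local minimizer to satisfy this condition, which is what makes it a natural stopping criterion for iterative methods.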


This thesis concentrates on a very interesting problem, the Vehicle Routing Problem (VRP). In this problem, customers or cities have to be visited and packages have to be delivered to each of them, starting from a base point on the map. The goal is to solve the transportation problem: to deliver the packages on time, with enough packages for each customer, using the available resources, and, of course, as efficiently as possible. Although this problem may seem easy to solve for a small number of cities or customers, it is not. The algorithm has to cope with several constraints, for example opening hours, package delivery times, and truck capacities. This makes the problem a so-called Multi Constraint Optimization Problem (MCOP). What is more, the problem is intractable with the amount of computational power available to most of us. As the number of customers grows, the number of calculations grows exponentially, because all constraints have to be satisfied for each customer, and it should not be forgotten that the goal is to find a solution that is good enough before the time allotted for the calculation runs out. The problem is introduced in the first chapter: starting from its basis, the Traveling Salesman Problem, some theoretical and mathematical background is used to show why it is so hard to optimize and why, even though no best algorithm is known for a large number of customers, it is worth dealing with. Just think of a huge transportation company with tens of thousands of trucks and millions of customers: how much money could be saved if the optimal path for all packages were known? Although no best algorithm is known for this kind of optimization problem, an acceptable solution is sought in the second and third chapters, where two algorithms are described: the Genetic Algorithm and Simulated Annealing.
Both of them are inspired by processes of nature and materials science. These algorithms will hardly ever find the best solution to the problem, but they can give a very good solution in special cases within acceptable calculation time. In these chapters (2nd and 3rd) the Genetic Algorithm and Simulated Annealing are described in detail, from their basis in the "real world" through their terminology to a basic implementation of each. The work puts stress on the limits of these algorithms, their advantages and disadvantages, and compares them to each other. Finally, after these theories are presented, a simulation is executed in an artificial VRP environment with both Simulated Annealing and the Genetic Algorithm. They both solve the same problem in the same environment and are compared to each other. The environment and the implementation are also described, as are the test results obtained. Finally, possible improvements of these algorithms are discussed, and the work tries to answer the "big" question, "Which algorithm is better?", if that question even exists.
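A minimal simulated annealing sketch for the symmetric TSP, the basic ingredient of the VRP (illustrative code, not the thesis implementation): 2-opt neighbor moves accepted with the Metropolis criterion under a geometric cooling schedule.

```python
import math
import random

# Minimal simulated annealing sketch for the symmetric TSP (illustrative,
# not the thesis code): 2-opt segment reversals as neighbor moves, accepted
# by the Metropolis criterion under a geometric cooling schedule.
def tour_length(tour, dist):
    return sum(dist[tour[i - 1]][tour[i]] for i in range(len(tour)))

def anneal(dist, t0=10.0, cooling=0.999, steps=20000, seed=0):
    rng = random.Random(seed)
    tour = list(range(len(dist)))
    best, best_len = tour[:], tour_length(tour, dist)
    t = t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(len(tour)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt move
        delta = tour_length(cand, dist) - tour_length(tour, dist)
        if delta < 0 or rng.random() < math.exp(-delta / t):  # Metropolis
            tour = cand
            if tour_length(tour, dist) < best_len:
                best, best_len = tour[:], tour_length(tour, dist)
        t *= cooling
    return best, best_len
```

Worsening moves are accepted with probability exp(-delta/t), which lets the search escape local optima early on; as the temperature cools the process behaves more and more like pure hill climbing.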


This paper is concerned with the cost efficiency in achieving the Swedish national air quality objectives under uncertainty. To realize an ecologically sustainable society, the parliament has approved a set of interim and long-term pollution reduction targets. However, there are considerable quantification uncertainties on the effectiveness of the proposed pollution reduction measures. In this paper, we develop a multivariate stochastic control framework to deal with the cost efficiency problem with multiple pollutants. Based on the cost and technological data collected by several national authorities, we explore the implications of alternative probabilistic constraints. It is found that a composite probabilistic constraint induces considerably lower abatement cost than separable probabilistic restrictions. The trend is reinforced by the presence of positive correlations between reductions in the multiple pollutants.
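To make the distinction concrete (notation assumed here, not the paper's): with abatement plan x of cost c^T x and uncertain reduction r_i(x) of pollutant i against target b_i, the separable and composite probabilistic constraints read:

```latex
% Separable chance constraints: each pollutant target met with its own probability
\min_x \ c^\top x \quad \text{s.t.} \quad
\Pr\!\big( r_i(x) \ge b_i \big) \ge \alpha_i , \qquad i = 1, \dots, m
% Composite chance constraint: all targets met jointly with probability alpha
\min_x \ c^\top x \quad \text{s.t.} \quad
\Pr\!\big( r_i(x) \ge b_i ,\ i = 1, \dots, m \big) \ge \alpha
```

Positive correlation between the reductions makes the joint event relatively cheaper to guarantee, consistent with the reinforcement effect noted in the abstract.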


The specification of Quality of Service (QoS) constraints over software design requires measures that ensure such requirements are met by the delivered product. Achieving this goal is non-trivial, as it involves, at least, identifying how QoS constraint specifications should be checked at runtime. In this paper we present an implementation of a Model Driven Architecture (MDA) based framework for the runtime monitoring of QoS properties. We incorporate the UML2 superstructure and the UML profile for Quality of Service to provide abstract descriptions of component-and-connector systems. We then define transformations that refine the UML2 models to conform with the Distributed Management Taskforce (DMTF) Common Information Model (CIM) (Distributed Management Task Force Inc. 2006), a schema standard for management and instrumentation of hardware and software. Finally, we provide a mapping of the CIM metamodel to a .NET-based metamodel for the implementation of the monitoring infrastructure, utilising various .NET features including the Windows Management Instrumentation (WMI) interface.


The current competitive environment is imposing new competitive factors on companies. With globalization and the opening of the market, Brazilian companies are implementing quality and productivity programs with Japanese companies as their main benchmark. The resulting changes in the business environment are not, however, being consistently accompanied by evolutions in control and costing systems. The information related to the control of a company must be accurate enough to support decision-making in the current competitive environment. However, traditional control and costing practices, besides being obsolete, may constitute a restriction on the continuity of companies' improvement programs. This work shows the evolution of manufacturing systems, with particular emphasis on the Japanese Model / Toyota Production System. Special attention is given to the need for changes in companies' control systems, mainly in costing. Some characteristics of control and costing practices in Japanese companies are also presented, compared with the logic prevailing in Western companies. The work is supported by a real case of a company that has already gone through a rationalization process, under strong influence of Japanese quality and productivity concepts, and that now feels the need for greater transparency and a better understanding of cost behavior in its environment in order to continue this improvement process.


Real-time computing is one of the most challenging and technologically demanding areas today. It is directly linked to applications that involve critical levels of reliability and safety. These characteristics, inherent to this area of computing, have contributed to the increasing complexity of real-time systems and of their development. This has made mechanisms that facilitate the specification, delimitation, and solution of problems important items for such applications. This work proposes mechanisms to support the development of real-time systems, intended as a tool for detecting inconsistencies that may occur in the various models generated from the notation of the graphical modeling language for real-time systems, UML-RT (Unified Modeling Language for Real Time). These mechanisms were designed by building a metamodel of the concepts present in class, object, sequence, collaboration, and state diagrams. To build the metamodel, the class diagram notation of UML (Unified Modeling Language) is used. However, the graphical representations of the class diagram cannot describe all of the semantics present in such diagrams. Thus, rules written in OCL (Object Constraint Language) are used as an additional formalism in the metamodel. With these OCL descriptions it is possible to reduce potential ambiguities and inconsistencies, as well as to overcome the limitations imposed by the graphical character of UML. The designed metamodel is mapped to an Entity-Relationship model. From this model, the DDL (Data Definition Language) scripts that will be used to create the data dictionary in the Oracle database are generated.
The semantic descriptions written as OCL rules are mapped to triggers, which fire when the data dictionary is manipulated. The MET Editor of SiMOO-RT is the diagramming tool that populates the data dictionary. SiMOO-RT is an object-oriented tool for modeling, simulation, and automatic code generation for real-time systems.


This study deals with a participatory macroergonomic approach to identifying the ergonomic demands of urban bus drivers in the city of Joinville, using the participatory methodology of Macroergonomic Work Analysis (AMT) (GUIMARÃES, 2001c) and the tools proposed in Macroergonomic Design (DM) (FOGLIATTO AND GUIMARÃES, 1999). The case study was carried out in a private public-transport company in the city of Joinville. Applying the methodology made it possible to identify, through the appreciation phase, the priority ergonomic demands raised by the urban bus drivers of Joinville and the design items of their workstation. The ergonomic demands, as well as the design items, were compared using Fisher's Exact Test with certain characteristics of the population, revealing some significant associations between driver satisfaction and the variables composing each construct. These results made it possible to formulate recommendations that would enable, in future studies, the introduction of improvements to increase the drivers' quality of life. The studies also identified an affinity of the participatory methodology with the urban bus drivers, in which changes can occur gradually and experientially through prototypes, in the case of demands concerning the workstation and the physical environment, or through possible adaptations of the content of the driver's task, in the case of demands concerning work organization. All of this aims at meeting, in order of importance, the ergonomic demand items raised. Finally, it was concluded that, for the drivers of Joinville, some factors concerning work organization are among the main causes of the constraints to which they are exposed while performing their task, followed by physical environmental factors and the workstation.


We present explicit formulas for evaluating the difference between Markowitz weights and those from optimal portfolios with the same given return when either asymmetry or kurtosis is taken into account. We prove that, whenever the higher-moment constraint is not binding, the weights are never the same. While the difference might be negligible due to special features of the first and second moments, in many cases it will be very significant. An appealing illustration, in which the designer wants to incorporate an asset with quite heavy tails but to moderate this effect, further supports the argument.
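For concreteness (notation assumed here), the Markowitz weights solve the mean-variance program below; the portfolios considered in the paper additionally constrain the third or fourth moment:

```latex
% Baseline Markowitz program for target return r
\min_w \ w^\top \Sigma w \quad \text{s.t.} \quad w^\top \mu = r , \quad w^\top \mathbf{1} = 1
% Higher-moment variants add, e.g., a skewness floor or a kurtosis cap
\sum_{i,j,k} w_i w_j w_k \, S_{ijk} \ \ge \ s_0
\qquad \text{or} \qquad
\sum_{i,j,k,l} w_i w_j w_k w_l \, K_{ijkl} \ \le \ \kappa_0
```

where S and K denote the co-skewness and co-kurtosis tensors of the asset returns.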


Highly indebted countries, particularly the Latin American ones, presented dismal economic outcomes in the 1990s, which are the consequence of the ‘growth cum foreign savings strategy’, or the Second Washington Consensus. Coupled with liberalization of international financial flows, this strategy, which was not part of the first consensus, led the countries, in the wave of a new worldwide capital flow cycle, to high current account deficits and increased foreign debt, ignoring the solvency constraint and the debt threshold. In practical terms it involved overvalued currencies (low exchange rates) and high interest rates; in policy terms, the attempt to control the budget deficit while the current account deficit was ignored. The paradoxical consequence was the adoption by highly indebted countries of ‘exchange rate populism’, a less obvious but more dangerous form of economic populism.


The 1990s were marked by the occurrence of financial crises. This work investigates the link between recourse to foreign savings, the deterioration of solvency and liquidity constraints, and the outbreak of financial crises.