868 results for Lagrangian bounds in optimization problems


Relevance: 100.00%

Abstract:

SEXTANTE is a framework for developing algorithms dedicated to processing geographically referenced information; it currently provides more than two hundred algorithms capable of operating on vector, alphanumeric and raster data. GearScape, in turn, is a geoprocessing-oriented geographic information system that offers a declarative language for developing geoprocesses without the need for complex development tools. This language is based on the SQL standard and extended following the OGC standard for simple feature access. Because it is much simpler than imperative programming languages (Java, .NET, Python, etc.), creating geoprocesses is also simpler, easier to document and less prone to bugs, and execution is optimized automatically through the use of indexes and other techniques. The ability to describe complex chains of operations also has value as documentation: all the steps for solving a given problem can be written down, retrieved later, easily reused, communicated to another person, and so on. In short, the GearScape geoprocessing language makes it possible to "talk" about geoprocesses. The integration of SEXTANTE into GearScape has a twofold goal. On the one hand, it aims to make any of the algorithms usable through the familiar SEXTANTE interface. On the other, it aims to extend the GearScape geoprocessing language so that SEXTANTE algorithms can be invoked from it. In this way, any problem solved by combining several of these algorithms can be described in the GearScape geoprocessing language. The advantages of the GearScape language for defining geoprocesses are thus combined with the range of geoprocesses available in SEXTANTE, so the GearScape geoprocessing language lets us "speak" using SEXTANTE's vocabulary.

Relevance: 100.00%

Abstract:

Estimating the magnitude of Agulhas leakage, the volume flux of water from the Indian to the Atlantic Ocean, is difficult because of the presence of other circulation systems in the Agulhas region. Indian Ocean water in the Atlantic Ocean is vigorously mixed and diluted in the Cape Basin. Eulerian integration methods, where the velocity field perpendicular to a section is integrated to yield a flux, have to be calibrated so that only the flux due to Agulhas leakage is sampled. Two Eulerian methods for estimating the magnitude of Agulhas leakage are tested within a high-resolution two-way nested model, with the goal of devising a mooring-based measurement strategy. At the GoodHope line, a section halfway through the Cape Basin, the integrated velocity perpendicular to that line is compared to the magnitude of Agulhas leakage as determined from the transport carried by numerical Lagrangian floats. In the first method, integration is limited to the flux of water warmer and more saline than specific threshold values. These threshold values are determined by maximizing the correlation with the float-determined time series. By using the threshold values, approximately half of the leakage can be measured directly. The total amount of Agulhas leakage can be estimated using a linear regression, within a 90% confidence band of 12 Sv. In the second method, a subregion of the GoodHope line is sought so that integration over that subregion yields an Eulerian flux as close to the float-determined leakage as possible. It appears that, when integration is limited within the model to the upper 300 m of the water column within 900 km of the African coast, the time series have the smallest root-mean-square difference. This method yields a root-mean-square error of only 5.2 Sv, but the 90% confidence band of the estimate is 20 Sv. It is concluded that the optimum thermohaline threshold method leads to more accurate estimates, even though the directly measured transport is a factor of two lower than the actual magnitude of Agulhas leakage in this model.
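To make the first (thermohaline threshold) method concrete, the sketch below shows how a thresholded Eulerian transport and its regression-based calibration against a float-derived leakage series could be computed. The array names, grid spacings and use of numpy are illustrative assumptions, not code from the study.

```python
import numpy as np

def threshold_leakage_flux(v, temp, salt, dz, dx, t_thresh, s_thresh):
    """Eulerian leakage estimate across a section: integrate the velocity
    component perpendicular to the section (v, in m/s) only where the water
    is warmer than t_thresh and more saline than s_thresh.

    v, temp, salt: arrays of shape (depth, distance) on the section grid.
    dz, dx: grid spacings in metres. Returns a transport in Sverdrups.
    """
    mask = (temp > t_thresh) & (salt > s_thresh)
    transport = np.sum(v * mask * dz * dx)   # m^3/s
    return transport / 1e6                   # 1 Sv = 1e6 m^3/s

def calibrate(eulerian_series, float_series):
    """Linear regression of the float-derived leakage time series onto the
    thresholded Eulerian series, mimicking the calibration step described
    in the abstract. Returns (slope, intercept)."""
    slope, intercept = np.polyfit(eulerian_series, float_series, 1)
    return slope, intercept
```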

Relevance: 100.00%

Abstract:

We present results on the growth of damage in 29 fatigue tests of human femoral cortical bone from four individuals, aged 53–79. In these tests we examine the interdependency of stress, cycles to failure, rate of creep strain, and rate of modulus loss. The behavior of creep rates has been reported recently for the same donors as an effect of stress and cycles (Cotton, J. R., Zioupos, P., Winwood, K., and Taylor, M., 2003, "Analysis of Creep Strain During Tensile Fatigue of Cortical Bone," J. Biomech. 36, pp. 943–949). In the present paper we first examine how the evolution of damage (drop in modulus per cycle) is associated with the stress level or the "normalized stress" level (stress divided by specimen modulus), and the results show that the rate of modulus loss fits better as a function of normalized stress. However, we find here that better correlations can be established between damage rates and either cycles to failure or creep rates than between any of these three measures and normalized stress. The data indicate that damage rates can be excellent predictors of fatigue life and creep strain rates in tensile fatigue of human cortical bone, for use in practical problems and computer simulations.
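As an illustration of using damage rate as a predictor of fatigue life, the sketch below fits a power-law relation between damage rate and cycles to failure in log-log space. The power-law form, the function name and the numerical values are purely illustrative assumptions, not data or methods from the paper.

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by least squares in log-log space; return (a, b, r^2)."""
    lx, ly = np.log(x), np.log(y)
    b, log_a = np.polyfit(lx, ly, 1)
    pred = log_a + b * lx
    ss_res = np.sum((ly - pred) ** 2)
    ss_tot = np.sum((ly - ly.mean()) ** 2)
    return np.exp(log_a), b, 1.0 - ss_res / ss_tot

# Hypothetical per-specimen data: damage rate = modulus loss per cycle,
# Nf = cycles to failure (values invented for illustration only).
damage_rate = np.array([1e-5, 3e-5, 1e-4, 3e-4, 1e-3])
cycles_to_failure = np.array([2e5, 6e4, 1.5e4, 5e3, 1.2e3])
a, b, r2 = fit_power_law(damage_rate, cycles_to_failure)
print(f"Nf ~ {a:.3g} * (dD/dN)^{b:.2f}, r^2 = {r2:.3f}")
```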

Relevance: 100.00%

Abstract:

Background:  Some contend that attachment insecurity increases risk for the development of externalizing behavior problems in children. Method:  Latent-growth curve analyses were applied to data on 1,364 children from the NICHD Study of Early Child Care to evaluate the association between early attachment and teacher-rated externalizing problems across the primary-school years. Results:  Findings indicate that (a) both avoidant and disorganized attachment predict higher levels of externalizing problems but (b) that effects of disorganized attachment are moderated by family cumulative contextual risk, child gender and child age, with disorganized boys from risky social contexts manifesting increases in behavior problems over time. Conclusions:  These findings highlight the potentially conditional role of early attachment in children’s externalizing behavior problems and the need for further research evaluating causation and mediating mechanisms.

Relevance: 100.00%

Abstract:

In financial decision-making, a number of mathematical models have been developed for financial management in construction. However, optimizing both qualitative and quantitative factors, together with the semi-structured nature of construction finance optimization problems, is a key challenge in making construction finance decisions. Here, the selection of funding schemes is formulated as a modified construction loan acquisition model and solved with an adaptive genetic algorithm (AGA) approach. The basic objectives of the model are to optimize the loan and to minimize the interest payments across all projects. Multiple projects undertaken by a medium-size construction firm in Hong Kong were used as a real case study to demonstrate the application of the model to borrowing decision problems, and a compromise monthly borrowing schedule was achieved. The results indicate that the Small and Medium Enterprise (SME) Loan Guarantee Scheme (SGS) was first identified as the source of external financing. The sources of funding can then be selected so as to avoid potential financial problems in the firm, by classifying qualitative factors into external, interactive and internal types and by taking additional qualitative factors, including sovereignty, credit ability and networking, into consideration. A more accurate, objective and reliable borrowing decision can thus be provided for the decision-maker when analysing the financial options.
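The following is a minimal, generic genetic-algorithm sketch for a loan-selection problem of this flavour (binary choice of funding sources, penalised funding shortfall, interest minimisation). It is not the adaptive GA or the loan acquisition model of the study; all loan amounts, rates and parameters are hypothetical.

```python
import random

# Illustrative candidate loan schemes: (available amount, annual interest rate).
LOANS = [
    (500_000, 0.045), (300_000, 0.060), (200_000, 0.052), (400_000, 0.070),
]
REQUIRED = 900_000  # assumed total funding the projects need

def fitness(chromosome):
    """Higher is better: penalise total interest and any unmet funding need."""
    amount = sum(a for g, (a, _) in zip(chromosome, LOANS) if g)
    interest = sum(a * r for g, (a, r) in zip(chromosome, LOANS) if g)
    shortfall = max(0, REQUIRED - amount)
    return -(interest + 10 * shortfall)

def evolve(pop_size=30, generations=100, mutation=0.1):
    """Plain GA: truncation selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in LOANS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, len(LOANS))
            child = p1[:cut] + p2[cut:]
            children.append([1 - g if random.random() < mutation else g
                             for g in child])
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())  # best combination of funding sources found
```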

Relevance: 100.00%

Abstract:

In the stratosphere, chemical tracers are drawn systematically from the equator to the pole. This observed Brewer–Dobson circulation is driven by wave drag, which in the stratosphere arises mainly from the breaking and dissipation of planetary-scale Rossby waves. While the overall sense of the circulation follows from fundamental physical principles, a quantitative theoretical understanding of the connection between wave drag and Lagrangian transport is limited to linear, small-amplitude waves. However, planetary waves in the stratosphere generally grow to a large amplitude and break in a strongly nonlinear fashion. This paper addresses the connection between stratospheric wave drag and Lagrangian transport in the presence of strong nonlinearity, using a mechanistic three-dimensional primitive equations model together with offline particle advection. Attention is deliberately focused on a weak forcing regime, such that sudden warmings do not occur and a quasi-steady state is reached, in order to examine this question in the cleanest possible context. Wave drag is directly linked to the transformed Eulerian mean (TEM) circulation, which is often used as a surrogate for mean Lagrangian motion. The results show that the correspondence between the TEM and mean Lagrangian velocities is quantitatively excellent in regions of linear, nonbreaking waves (i.e., outside the surf zone), where streamlines are not closed. Within the surf zone, where streamlines are closed and meridional particle displacements are large, the agreement between the vertical components of the two velocity fields is still remarkably good, especially wherever particle paths are coherent so that diabatic dispersion is minimized. However, in this region the meridional mean Lagrangian velocity bears little relation to the meridional TEM velocity, and reflects more the kinematics of mixing within and across the edges of the surf zone. The results from the mechanistic model are compared with those from the Canadian Middle Atmosphere Model to test the robustness of the conclusions.
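For reference, the transformed Eulerian mean (TEM) residual velocities discussed above are conventionally defined, in log-pressure coordinates and following standard treatments such as Andrews, Holton and Leovy, as

```latex
\bar{v}^{\,*} = \bar{v} - \frac{1}{\rho_0}\,
  \frac{\partial}{\partial z}\!\left(\rho_0\,\frac{\overline{v'\theta'}}{\bar{\theta}_z}\right),
\qquad
\bar{w}^{\,*} = \bar{w} + \frac{1}{a\cos\phi}\,
  \frac{\partial}{\partial \phi}\!\left(\cos\phi\,\frac{\overline{v'\theta'}}{\bar{\theta}_z}\right),
```

where overbars denote zonal means, primes denote deviations from them, theta is potential temperature, rho_0(z) is the reference density profile, a the Earth's radius and phi latitude. This is the textbook definition, quoted here for orientation rather than taken from the paper itself.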

Relevance: 100.00%

Abstract:

This paper is concerned with singular perturbations in parabolic problems subjected to nonlinear Neumann boundary conditions. We consider the case for which the diffusion coefficient blows up in a subregion $\Omega_0$ which is interior to the physical domain $\Omega \subset \mathbb{R}^n$. We prove, under natural assumptions, that the associated attractors behave continuously as the diffusion coefficient blows up locally uniformly in $\Omega_0$ and converges uniformly to a continuous and positive function in $\Omega_1 = \overline{\Omega} \setminus \Omega_0$. (C) 2009 Elsevier Inc. All rights reserved.
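As a hedged illustration of the setting (the precise equations treated in the paper may differ), large-diffusion problems of this type are typically written as

```latex
\begin{aligned}
& u_t - \operatorname{div}\!\bigl(p_\varepsilon(x)\,\nabla u\bigr) + u = f(u)
    && \text{in } \Omega,\\
& p_\varepsilon(x)\,\frac{\partial u}{\partial n} = g(u)
    && \text{on } \partial\Omega,
\end{aligned}
```

with the diffusion coefficient $p_\varepsilon$ blowing up locally uniformly in $\Omega_0$ and converging uniformly to a continuous positive function $p$ in $\Omega_1 = \overline{\Omega}\setminus\Omega_0$.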

Relevance: 100.00%

Abstract:

When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs while producing the lowest error of any approximation algorithm evaluated. (C) 2011 Elsevier B.V. All rights reserved.
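To illustrate the kind of robust backup that makes MDP-IP solution expensive, here is a minimal, flat (non-factored) sketch of robust value iteration for the special case where the imprecise transition probabilities are simple intervals, so the inner optimization can be solved greedily instead of with a nonlinear solver. It is not the factored algorithm of the paper; the data layout and parameter names are assumptions.

```python
def worst_case_expectation(values, p_lo, p_hi):
    """Minimise sum_s p(s)*values[s] over probability vectors p with
    p_lo[s] <= p(s) <= p_hi[s] and sum_s p(s) = 1 (assumes the box is
    feasible). Greedy: start from the lower bounds, then hand the
    remaining mass to the lowest-value successor states first."""
    p = list(p_lo)
    remaining = 1.0 - sum(p)
    for s in sorted(range(len(values)), key=lambda s: values[s]):
        add = min(p_hi[s] - p_lo[s], remaining)
        p[s] += add
        remaining -= add
    return sum(p[s] * values[s] for s in range(len(values)))

def robust_value_iteration(n_states, actions, reward, p_lo, p_hi,
                           gamma=0.95, iters=200):
    """Flat robust value iteration for an interval MDP-IP:
    V(s) = max_a [ R(s,a) + gamma * min_p sum_s' p(s') V(s') ],
    with reward[s][a], p_lo[s][a] and p_hi[s][a] indexed by state and action."""
    V = [0.0] * n_states
    for _ in range(iters):
        V = [max(reward[s][a] + gamma *
                 worst_case_expectation(V, p_lo[s][a], p_hi[s][a])
                 for a in actions)
             for s in range(n_states)]
    return V
```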

Relevance: 100.00%

Abstract:

In the process laboratory of Metso Minerals (Sala) AB, continuous tests have been carried out with a laboratory-scale High-Rate thickener. The tests were made in order to compare three techniques for thickening suspended solids: High-Rate thickening, conventional thickening and lamella thickening. The High-Rate and conventional trials are based on a continuous method, while the lamella thickener is based on batch trials. Because the lamella thickener is run in batch mode, and because there were some optimization problems with the flocculant dosing point in the continuous trials, it was not feasible to compare the lamella thickener with the other two thickener types. On the other hand, since the optimization problems were the same for the other two methods, there was no problem comparing them. The result of the comparison between the High-Rate thickener and the conventional thickener was that the High-Rate thickener manages to work at a higher rise rate with a lower consumption of flocculant than the conventional thickener. Looking at the unit area required by each thickener, it is apparent that the conventional thickener demands a larger unit area than the High-Rate thickener to achieve the same amount of solids in the underflow. It has also been shown that the High-Rate thickener requires less flocculant than the conventional thickener for the same amount of suspended solids in the feed.

Relevance: 100.00%

Abstract:

This thesis work concentrates on a very interesting problem, the Vehicle Routing Problem (VRP). In this problem, customers or cities have to be visited and packages have to be transported to each of them, starting from a base point on the map. The goal is to solve the transportation problem: to deliver the packages on time, to deliver enough packages to each customer, to use only the available resources and, of course, to be as effective as possible.

Although this problem seems easy to solve for a small number of cities or customers, it is not. The algorithm has to deal with several constraints, for example opening hours, package delivery times and truck capacities. This makes it a so-called Multi-Constraint Optimization Problem (MCOP). What is more, the problem is intractable with the amount of computational power available to most of us: as the number of customers grows, the amount of computation grows exponentially, because all constraints have to be satisfied for each customer, and it should not be forgotten that the goal is to find a solution that is good enough before the time allowed for the calculation runs out. The first chapter introduces the problem from its basics, the Travelling Salesman Problem, and uses some theoretical and mathematical background to show why it is so hard to optimize and why, although it is so hard and no best algorithm is known for a large number of customers, it is still worth dealing with. Just think of a huge transportation company with tens of thousands of trucks and millions of customers: how much money could be saved if the optimal path for all packages were known.

Although no best algorithm is known for this kind of optimization problem, the second and third chapters attempt to give an acceptable solution using two algorithms: the Genetic Algorithm and Simulated Annealing. Both are inspired by processes in nature and materials science. These algorithms will hardly ever be able to find the best solution to the problem, but in many cases they can give a very good solution within an acceptable calculation time. In these chapters the Genetic Algorithm and Simulated Annealing are described in detail, from their basis in the "real world" through their terminology to a basic implementation. The work stresses the limits of these algorithms, their advantages and disadvantages, and compares them to each other.

Finally, after the theory has been presented, a simulation is executed on an artificial VRP environment with both Simulated Annealing and the Genetic Algorithm. Both solve the same problem in the same environment and are compared to each other. The environment and the implementation are also described, as are the test results obtained. The possible improvements of these algorithms are then discussed, and the work tries to answer the "big" question, "Which algorithm is better?", if that question even exists.
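As a small illustration of the simulated annealing approach described above, the sketch below applies it to the single-vehicle (TSP-like) core of the VRP using 2-opt segment reversals. It is a generic sketch under assumed parameters, not the implementation developed in the thesis.

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour over the distance matrix `dist`."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing(dist, t_start=100.0, t_end=1e-3, alpha=0.995):
    """Propose 2-opt segment reversals, accept worse tours with probability
    exp(-delta/T), and cool the temperature geometrically."""
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    best, best_len = tour[:], tour_length(tour, dist)
    t = t_start
    while t > t_end:
        i, j = sorted(random.sample(range(n), 2))
        candidate = tour[:]
        candidate[i:j + 1] = reversed(candidate[i:j + 1])
        delta = tour_length(candidate, dist) - tour_length(tour, dist)
        if delta < 0 or random.random() < math.exp(-delta / t):
            tour = candidate
            if tour_length(tour, dist) < best_len:
                best, best_len = tour[:], tour_length(tour, dist)
        t *= alpha
    return best, best_len

# Tiny illustrative distance matrix: a depot and four customers.
D = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]
print(simulated_annealing(D))
```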

Relevance: 100.00%

Abstract:

Genetic algorithms have been widely used in different areas of optimization problems. In this thesis the technique is combined with the renewable energy domain, in the form of a photovoltaic system. To participate in and win the solar boat race, a control program is needed, and C++ has been chosen for the programming. To implement the program, the mathematical model has been built. Besides this, the approaches used to calculate the condition-related boundaries are explained. After that, the prediction procedure and the real-time control function are presented. The program has been simulated, and the results show that the genetic algorithm is helpful for obtaining good results but does not improve them very much, given particularities of the solar-driven boat project such as the limited energy production.

Relevance: 100.00%

Abstract:

This is a book about solar collectors and the place of these artefacts in a political energy debate that has aroused strong feelings in Sweden during the last twenty-five years. It is a book about the hopes for a less polluted earth, which solar collectors have come to symbolise, and a book about the ways in which problems in utilising solar energy are culturally perceived. One main aim of this study has been to find out more about the conflicting perceptions of solar collectors as 'saviours of the world' and simultaneously as uninteresting or less credible artefacts that 'may come in the future'. Another main purpose of the study has been to describe and explain the cultural processes of modification that are taking place around solar collectors in active attempts to integrate them into established cultural structures.

Relevance: 100.00%

Abstract:

Risk assessment in child protection services has been promoted as the most reliable way to ensure that maltreatment of children is prevented and has become central to practice with children and families. However, recent research in Australia has suggested that children are being left in unsafe situations, leading to further maltreatment, by the very agencies responsible for their protection. The present article explores the reasons why risk assessment has become central to child protection practice and presents a wide-ranging critical appraisal of risk assessment and its application. It is argued that risk assessment is a flawed process and, as a central tenet of practice, is implicated in any problems that children's protective services face. Consequently, any future reconfiguration of services for children in need of protection needs to include a re-evaluation of the efficacy of risk assessment.

Relevance: 100.00%

Abstract:

This paper examines methods of pointwise construction of aggregation operators via optimal interpolation. It is shown that several types of application-specific requirements lead to interpolatory-type constraints on the aggregation function. These constraints are translated into global optimization problems, which are the focus of this paper. We present several methods for reducing the number of variables and formulate suitable numerical algorithms based on Lipschitz optimization.
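Since the resulting global optimization problems are solved with Lipschitz optimization, the following is a minimal one-dimensional Pijavskii–Shubert sketch of that technique; the test function and Lipschitz constant are illustrative assumptions, and the paper's own multivariate formulations are not reproduced here.

```python
import bisect
import math

def lipschitz_minimize(f, a, b, K, n_iter=50):
    """Pijavskii-Shubert sketch: globally minimise f on [a, b] assuming
    |f(x) - f(y)| <= K |x - y|. Each step minimises the sawtooth lower
    bound built from the sampled points and evaluates f there."""
    xs = [a, b]
    fs = [f(a), f(b)]
    for _ in range(n_iter):
        best = None
        for i in range(len(xs) - 1):
            # Minimiser and value of the lower bound on [x_i, x_{i+1}].
            x_new = 0.5 * (xs[i] + xs[i + 1]) + (fs[i] - fs[i + 1]) / (2 * K)
            bound = 0.5 * (fs[i] + fs[i + 1]) - 0.5 * K * (xs[i + 1] - xs[i])
            if best is None or bound < best[0]:
                best = (bound, x_new)
        _, x_new = best
        j = bisect.bisect(xs, x_new)   # keep the sample list sorted
        xs.insert(j, x_new)
        fs.insert(j, f(x_new))
    i = min(range(len(xs)), key=lambda i: fs[i])
    return xs[i], fs[i]

# Example: a multimodal test function on [0, 10]; K = 5 bounds its derivative.
print(lipschitz_minimize(lambda x: math.sin(x) + math.sin(10 * x / 3), 0, 10, K=5))
```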

Relevance: 100.00%

Abstract:

This paper considers the Minimum Span Frequency Assignment Problem with Interference Graph on Triangular Grid (MSFAP-TG), a special case of the Minimum Span Frequency/Channel Assignment (MSFAP) for cellular systems and optical networks. The MSFAP-TG is interesting in its own right and thus worth studying. In this paper, we propose strong integer programming formulations for the MSFAP-TG and present polyhedral results on these formulations. In solving the MSFAP-TG, we implement these integer programs to obtain exact solutions. We also develop a heuristic for obtaining feasible solutions and upper bounds for the problems. With the use of these upper bounds, and a simple lower bound, the computation time of the exact algorithm can be improved substantially. The heuristic turns out to be quite good in terms of the quality of upper bounds and is extremely efficient in computation time. Last of all, we present new concepts for tackling large scale MSFAP-TGs.
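For orientation, a generic (textbook-style) integer-programming formulation of a minimum-span frequency assignment problem can be written as follows; this big-M formulation only illustrates the problem class and is not one of the strong formulations proposed in the paper.

```latex
\begin{aligned}
\min\;& z\\
\text{s.t.}\;& f_v \le z, && v \in V,\\
& f_u - f_v \ge d_{uv} - M\,y_{uv}, && \{u,v\} \in E,\\
& f_v - f_u \ge d_{uv} - M\,(1-y_{uv}), && \{u,v\} \in E,\\
& f_v \in \mathbb{Z}_{\ge 0},\quad y_{uv} \in \{0,1\}, && v \in V,\ \{u,v\} \in E,
\end{aligned}
```

where $f_v$ is the frequency assigned to vertex $v$ of the interference graph, $d_{uv}$ the required separation on edge $\{u,v\}$, $z$ the span being minimised, and $M$ a sufficiently large constant.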