987 results for Nonlinear maximum principle
Abstract:
Some problems of the Calculus of Variations do not have solutions in the class of classical continuous and smooth arcs. This suggests the need for a relaxation or extension of the problem ensuring the existence of a solution in some enlarged class of arcs. This work aims at developing such an extension for a more general optimal control problem with nonlinear control dynamics, in which the control function takes values in a closed, but not necessarily bounded, set. To achieve this goal, we exploit the approach of R.V. Gamkrelidze based on generalized controls, but adapted to discontinuous arcs. This leads to the notion of generalized impulsive control. The proposed extension links various approaches to the extension problem found in the literature.
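For orientation, a minimal sketch of the relaxation device underlying Gamkrelidze's generalized controls (the notation below is ours, not taken from the paper): the ordinary dynamics are replaced by

\[
\dot x(t) = \int_U f\bigl(x(t),u\bigr)\, d\mu_t(u), \qquad \mu_t \in \mathcal{P}(U),
\]

where \(\mathcal{P}(U)\) denotes probability measures on the control set and an ordinary control \(u(\cdot)\) corresponds to the Dirac family \(\mu_t = \delta_{u(t)}\). The impulsive extension described in the abstract additionally allows the trajectory \(x(\cdot)\) to be discontinuous, since \(U\) is closed but unbounded.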
Abstract:
In this paper we solve a problem raised by Gutiérrez and Montanari about comparison principles for H-convex functions on subdomains of Heisenberg groups. Our approach is based on the notion of the sub-Riemannian horizontal normal mapping and uses degree theory for set-valued maps. The comparison principle, combined with a Harnack inequality, is applied to prove an Aleksandrov-type maximum principle describing the correct boundary behavior of continuous H-convex functions vanishing at the boundary of horizontally bounded subdomains of Heisenberg groups. This result answers a question by Garofalo and Tournier. The sharpness of our results is illustrated by examples.
Abstract:
2000 Mathematics Subject Classification: 35J15, 35J25, 35B05, 35B50
Abstract:
We consider a parametric semilinear Dirichlet problem driven by the Laplacian plus an indefinite unbounded potential and with a reaction of superdiffusive type. Using variational and truncation techniques, we show that there exists a critical parameter value λ_{∗}>0 such that for all λ> λ_{∗} the problem has at least two positive solutions, for λ= λ_{∗} the problem has at least one positive solution, and no positive solutions exist when λ∈(0,λ_{∗}). Also, we show that for λ≥ λ_{∗} the problem has a smallest positive solution.
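A typical prototype of such a parametric problem (our illustrative choice of reaction, not necessarily the exact hypotheses of the paper) is the superdiffusive logistic equation

\[
-\Delta u(z) + \beta(z)\,u(z) = \lambda\, u(z)^{q} - u(z)^{p} \ \ \text{in } \Omega, \qquad u|_{\partial\Omega}=0, \quad u>0, \quad 1<q<p,
\]

where \(\beta\) stands for the indefinite, possibly unbounded potential. The bifurcation-type behavior in \(\lambda\) described above (two positive solutions, then one, then none) is characteristic of this superdiffusive regime.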
Stabilized Petrov-Galerkin methods for the convection-diffusion-reaction and the Helmholtz equations
Abstract:
We present two new stabilized high-resolution numerical methods for the convection–diffusion–reaction (CDR) and the Helmholtz equations, respectively. The work begins with an a priori analysis of some consistency-recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrix theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency-recovery methods are employed. Hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrix theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The current method follows the line of earlier methods that may be viewed as an upwinding plus a discontinuity-capturing operator. Some remarks are also made on the extension of the HRPG method to multiple dimensions. We then present a new numerical scheme for the Helmholtz equation that yields quasi-exact solutions. The focus is on approximating the solution of the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is a mesh/grid resolution of at least eight elements per wavelength. No stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analysis in 1D and 2D illustrates the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
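As a concrete illustration of the stencil-averaging idea for the 1D Helmholtz equation u'' + k^2 u = 0, the following sketch (our own reconstruction for linear elements on a uniform grid, not code from the thesis) estimates the dispersion error of the averaged Galerkin/finite-difference stencil at around eight elements per wavelength:

```python
import numpy as np

def numerical_wavenumber(kh):
    """Numerical phase theta = k_h * h for the averaged Galerkin/FDM stencil.

    Galerkin FEM (linear elements): (1/h)[-1, 2, -1] - (k^2 h / 6)[1, 4, 1]
    Central FDM (scaled by h):      (1/h)[-1, 2, -1] - (k^2 h)   [0, 1, 0]
    Average of the two:             (1/h)[-1, 2, -1] - (k^2 h/12)[1, 10, 1]

    Inserting a plane wave u_j = exp(i*theta*j) into the averaged stencil
    and solving for cos(theta) gives the dispersion relation used below.
    """
    cos_theta = (1.0 - 5.0 * kh**2 / 12.0) / (1.0 + kh**2 / 12.0)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

for n_per_wavelength in (8, 16, 32):
    kh = 2.0 * np.pi / n_per_wavelength      # exact wavenumber k times mesh size h
    theta = numerical_wavenumber(kh)         # numerical wavenumber k_h times h
    rel_phase_error = abs(theta - kh) / kh
    print(f"{n_per_wavelength:2d} elements/wavelength: "
          f"relative phase error = {rel_phase_error:.2e}")
```

The small phase error at eight elements per wavelength is consistent with the quasi-exact behavior claimed above; the thesis itself works with 1D and 2D dispersion analysis of the full scheme.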
Abstract:
In the analysis of equilibrium policies in a differential game, if agents have different time preference rates, the cooperative (Pareto optimal) solution obtained by applying Pontryagin's Maximum Principle becomes time inconsistent. In this work we derive a set of dynamic programming equations (in discrete and continuous time) whose solutions are time-consistent equilibrium rules for N-player cooperative differential games in which agents differ in their instantaneous utility functions and also in their discount rates of time preference. The results are applied to the study of a cake-eating problem describing the management of a common-property exhaustible natural resource. The extension of the results to a simple common-property renewable natural resource model over an infinite horizon is also discussed.
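The source of the time inconsistency can be sketched as follows (a schematic summary in our own notation). At time t the cooperative planner maximizes a weighted sum of the agents' discounted utilities,

\[
W(t) = \sum_{i=1}^{N} \lambda_i \int_{t}^{T} e^{-\rho_i (s-t)}\, u_i\bigl(c_i(s)\bigr)\, ds ,
\]

so the effective welfare weight attached to agent i at a future instant s is \(\lambda_i e^{-\rho_i (s-t)}\). When the discount rates \(\rho_i\) differ, the ratios of these weights change with the evaluation time t, and a plan that is Pareto optimal from the standpoint of time t is no longer optimal when re-evaluated later; this is the gap the time-consistent dynamic programming equations derived in the paper are designed to close.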
Abstract:
Pontryagin's maximum principle from optimal control theory is used to find the optimal allocation of energy between growth and reproduction when lifespan may be finite and the trade-off between growth and reproduction is linear. Analyses of the optimal allocation problem to date have generally yielded bang-bang solutions, i.e. determinate growth: life histories in which growth is followed by reproduction, with no intermediate phase of simultaneous reproduction and growth. Here we show that an intermediate strategy (indeterminate growth) can be selected for if the rates of production and mortality either both increase or both decrease with increasing body size; this arises as a singular solution to the problem. Our conclusion is that indeterminate growth is optimal in more cases than was previously realized. The relevance of our results to natural situations is discussed.
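A minimal version of the underlying allocation model (our schematic notation; the paper's assumptions may differ in detail) makes the bang-bang/singular dichotomy explicit. With body size W, survivorship l, production rate P(W), mortality rate \(\mu(W)\), and a fraction \(u \in [0,1]\) of production allocated to growth,

\[
\dot W = u\,P(W), \qquad \dot l = -\mu(W)\,l, \qquad \text{fitness} = \int_0^T (1-u)\,P(W)\,l\, dt .
\]

The Hamiltonian \(H = (1-u)P(W)\,l + p\,u\,P(W) - q\,\mu(W)\,l\) is linear in u, so the maximum principle gives u = 1 or u = 0 according to the sign of the switching function \(\sigma = P(W)(p - l)\) (bang-bang, i.e. determinate growth), while indeterminate growth corresponds to a singular arc on which \(\sigma\) vanishes over an interval.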
Abstract:
We deal with the numerical solution of heat conduction problems featuring steep gradients. To solve the associated partial differential equation, a finite volume technique on unstructured grids is employed. A discrete maximum principle for triangulations of Delaunay type is developed. To capture thin boundary layers incorporating steep gradients, an anisotropic mesh adaptation technique is implemented. Computational tests are performed for an academic problem, where the exact solution is known, as well as for a real-world problem: a computer simulation of the thermoregulation of premature infants.
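The role of the Delaunay property can be made concrete for a linear approximation on triangles (a standard fact restated here, not a statement quoted from the paper): the off-diagonal matrix entry associated with an interior edge ij shared by two triangles with opposite angles \(\alpha_{ij}\) and \(\beta_{ij}\) is

\[
a_{ij} = -\tfrac{1}{2}\bigl(\cot\alpha_{ij} + \cot\beta_{ij}\bigr),
\]

and the Delaunay condition \(\alpha_{ij} + \beta_{ij} \le \pi\) guarantees \(a_{ij} \le 0\). Together with diagonal dominance this makes the system matrix an M-matrix, which is the algebraic mechanism behind a discrete maximum principle of the kind developed in the paper.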
Abstract:
This paper considers left-invariant control systems defined on the Lie groups SU(2) and SO(3). Such systems have a number of applications in both classical and quantum control problems. The purpose of this paper is two-fold. Firstly, the optimal control problem for a system varying on these Lie groups, with a cost that is quadratic in the control, is lifted to the associated Hamiltonian vector fields through the Maximum Principle of optimal control and explicitly solved. Secondly, the control systems are integrated down to the level of the group to give the solutions for the optimal paths corresponding to the optimal controls. In addition, it is shown that integrating these equations on the Lie algebra su(2) gives simpler solutions than when they are integrated on the Lie algebra so(3).
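To indicate what the lifted Hamiltonian system looks like, consider the left-invariant system \(\dot g = g\,(u_1 A_1 + u_2 A_2 + u_3 A_3)\) on SO(3), with \(\{A_i\}\) a basis of so(3) and cost \(\tfrac12\int (c_1 u_1^2 + c_2 u_2^2 + c_3 u_3^2)\,dt\) (a generic sketch in our notation, not the paper's exact statement). The maximum principle yields the quadratic Hamiltonian

\[
H = \frac12\left(\frac{M_1^2}{c_1} + \frac{M_2^2}{c_2} + \frac{M_3^2}{c_3}\right), \qquad u_i = \frac{M_i}{c_i},
\]

where the \(M_i\) are the momentum functions associated with the \(A_i\); the corresponding Hamiltonian equations are Euler-type (Lie-Poisson) equations \(\dot M = M \times \nabla_M H\), which are then integrated down to the group as described above. The su(2) case carries the same Lie-Poisson structure.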
Abstract:
This paper tackles the problem of computing smooth, optimal trajectories on the Euclidean group of motions SE(3). The problem is formulated as an optimal control problem in which the cost function to be minimized is equal to the integral of the classical curvature squared. This problem is analogous to the elastic problem from differential geometry, and thus the resulting rigid body motions will trace elastic curves. An application of the Maximum Principle to this optimal control problem shifts the emphasis to the language of symplectic geometry and to the associated Hamiltonian formalism. This results in a system of first-order differential equations that yield coordinate-free necessary conditions for optimality for these curves. From these necessary conditions we identify an integrable case, and this particular set of curves is solved analytically. These analytic solutions provide interpolating curves between an initial given position and orientation and a desired position and orientation, which would be useful in motion planning for systems such as robotic manipulators and autonomous oriented vehicles.
Abstract:
This note investigates the motion control of an autonomous underwater vehicle (AUV). The AUV is modeled as a nonholonomic system as any lateral motion of a conventional, slender AUV is quickly damped out. The problem is formulated as an optimal kinematic control problem on the Euclidean Group of Motions SE(3), where the cost function to be minimized is equal to the integral of a quadratic function of the velocity components. An application of the Maximum Principle to this optimal control problem yields the appropriate Hamiltonian and the corresponding vector fields give the necessary conditions for optimality. For a special case of the cost function, the necessary conditions for optimality can be characterized more easily and we proceed to investigate its solutions. Finally, it is shown that a particular set of optimal motions trace helical paths. Throughout this note we highlight a particular case where the quadratic cost function is weighted in such a way that it equates to the Lagrangian (kinetic energy) of the AUV. For this case, the regular extremal curves are constrained to equate to the AUV's components of momentum and the resulting vector fields are the d'Alembert-Lagrange equations in Hamiltonian form.
Abstract:
This paper considers the motion planning problem for oriented vehicles travelling at unit speed in a 3-D space. A Lie group formulation arises naturally, and the vehicles are modeled as kinematic control systems with drift defined on the orthonormal frame bundles of particular Riemannian manifolds, specifically the 3-D space forms: Euclidean space E-3, the sphere S-3, and the hyperboloid H-3. The corresponding frame bundles are equal to the Euclidean group of motions SE(3), the rotation group SO(4), and the Lorentz group SO(1,3). The maximum principle of optimal control shifts the emphasis for these systems to the associated Hamiltonian formalism. For an integrable case, the extremal curves are explicitly expressed in terms of elliptic functions. In this paper, a study of the singularities of the extremal curves is given; these singularities correspond to critical points of the elliptic functions. The extremal curves are characterized as the intersections of invariant surfaces and are illustrated graphically at the singular points. It is then shown that the projections of the extremals onto the base space, called elastica, are curves of constant curvature and torsion at these singular points, which in turn implies that the oriented vehicles trace helices.
Abstract:
In this paper, we discuss the problem of globally computing sub-Riemannian curves on the Euclidean group of motions SE(3). In particular, we derive a global result for special sub-Riemannian curves whose Hamiltonian satisfies a particular condition. In this paper, sub-Riemannian curves are defined in the context of a constrained optimal control problem. The maximum principle is then applied to this problem to yield an appropriate left-invariant quadratic Hamiltonian. A number of integrable quadratic Hamiltonians are identified. We then proceed to derive convenient expressions for sub-Riemannian curves in SE(3) that correspond to particular extremal curves. These equations are then used to compute sub-Riemannian curves that could potentially be used for motion planning of underwater vehicles.