760 results for VARIATIONAL PROLAPSE
Abstract:
The first motivation for this note is to obtain a general version of the following result: let E be a Banach space and f : E → R a differentiable function, bounded below and satisfying the Palais-Smale condition; then f is coercive, i.e., f(x) tends to infinity as ||x|| tends to infinity. In recent years many variants and extensions of this result have appeared; see [3], [5], [6], [9], [14], [18], [19] and the references therein. A general result of this type was given in [3, Theorem 5.1] for a lower semicontinuous function defined on a Banach space, through an approach based on an abstract notion of subdifferential operator and taking into account the “smoothness” of the Banach space. Here we give (Theorem 1) an extension in a metric setting, based on the notion of slope from [11], with coercivity considered in a generalized sense inspired by [9]; our result allows us to recover, for example, the coercivity result of [19], where a weakened version of the Palais-Smale condition is used. Our main tool (Proposition 1) is a consequence of Ekeland’s variational principle extending [12, Corollary 3.4]; it deals with a function f which is, in some sense, the “uniform” Γ-limit of a sequence of functions.
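For reference, the classical statement and the metric notion used to extend it can be written out as follows; the notation is ours (a sketch, not quoted from the note), with the strong slope as in the line of work cited as [11].

```latex
% Classical coercivity result: f : E -> R differentiable, bounded below,
% and satisfying the Palais--Smale condition (every sequence (x_n) with
% (f(x_n)) bounded and \|f'(x_n)\| -> 0 has a convergent subsequence)
% is coercive:
\[
  \lim_{\|x\| \to \infty} f(x) = +\infty .
\]
% In a metric space (X,d), the role of \|f'(x)\| is played by the
% (strong) slope of f at x:
\[
  |\nabla f|(x) \;=\; \limsup_{y \to x}
    \frac{\bigl(f(x) - f(y)\bigr)^{+}}{d(x,y)} .
\]
```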
Abstract:
We consider the problem of minimizing the maximum of two convex functions from both the approximation and the sensitivity points of view. This leads us to study the epi-convergence of a sequence of level sums of convex functions and the related dual problems.
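For orientation, the level sum referred to here is usually defined as follows (notation ours, assuming the standard definition from the literature on level sums):

```latex
% Level sum of two convex functions f, g : X -> (-infty, +infty]:
\[
  (f \diamond g)(x) \;=\; \inf_{y + z = x} \max\{\, f(y),\, g(z) \,\},
\]
% so minimizing max(f, g) is tied to the behaviour of f \diamond g, and
% epi-convergence of the approximating level sums controls the
% convergence of the associated minimal values and minimizers.
```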
Abstract:
Bayesian algorithms pose a limit to the performance that learning algorithms can achieve. Natural selection should guide the evolution of information processing systems towards those limits. What can we learn from this evolution, and what properties do the intermediate stages have? While this question is too general to permit any answer, progress can be made by restricting the class of information processing systems under study. We present analytical and numerical results for the evolution of on-line algorithms for learning from examples for neural network classifiers, which may or may not include a hidden layer. The analytical results are obtained by solving a variational problem to determine the learning algorithm that leads to maximum generalization ability. Simulations using evolutionary programming, for programs that implement learning algorithms, confirm and extend the results. The principal result is not just that the evolution is towards a Bayesian limit; indeed, that limit is essentially reached. In addition, we find that evolution is driven by the discovery of useful structures or combinations of variables and operators, and that the temporal order in which such combinations are discovered is the same across different runs. The main result is that combinations that signal the surprise brought by an example always arise before combinations that serve to gauge the performance of the learning algorithm; these latter structures can be used to implement annealing schedules. The temporal ordering can also be understood analytically by carrying out the functional optimization in restricted functional spaces. We also show data suggesting that the appearance of these traits follows the same temporal ordering in biological systems. © 2006 American Institute of Physics.
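To make the object of the variational problem concrete: a minimal Python sketch of an on-line perceptron rule in which each update is scaled by a modulation function of the local field. The two modulation functions shown are standard textbook choices, not the optimized one derived in the paper, and all names are ours.

```python
import numpy as np

def online_perceptron(X, y, modulation, n_features):
    """One pass of on-line learning: each example moves the weights
    along sigma * x, scaled by a modulation function F(h, sigma)."""
    w = np.zeros(n_features)
    for x, sigma in zip(X, y):
        h = (w @ x) / np.sqrt(n_features)        # student's local field
        w += modulation(h, sigma) * sigma * x / np.sqrt(n_features)
    return w

# Illustrative modulation functions (not the variationally optimal one):
hebbian    = lambda h, sigma: 1.0                              # learn from every example
perceptron = lambda h, sigma: 1.0 if h * sigma < 0 else 0.0    # learn only on errors

# Tiny usage example with a random teacher.
rng = np.random.default_rng(0)
N, P = 100, 500
teacher = rng.standard_normal(N)
X = rng.standard_normal((P, N))
y = np.sign(X @ teacher)
w = online_perceptron(X, y, perceptron, N)
overlap = (w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
print(f"teacher-student overlap after one pass: {overlap:.3f}")
```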
Abstract:
In this letter, we derive continuum equations for the generalization error of the Bayesian online algorithm (BOnA) for the one-layer perceptron with a spherical covariance matrix, using the Rosenblatt potential, and show by numerical calculations that the asymptotic performance of the algorithm is the same as that of the optimal algorithm found by means of variational methods, with the added advantage that the BOnA does not use any inaccessible information during learning. © 2007 IEEE.
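For intuition only: a sketch of one Gaussian-posterior ("Bayesian on-line") update step for a noiseless perceptron, with the covariance kept spherical (Σ = cI). These are the standard Gaussian-approximation formulas of Opper-style Bayesian on-line learning, not the Rosenblatt-potential derivation of the letter, and the spherical projection of the covariance update is our simplification.

```python
import numpy as np
from math import erfc, exp, pi, sqrt

def pdf(z):  # standard normal density
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def cdf(z):  # standard normal distribution function
    return 0.5 * erfc(-z / sqrt(2.0))

def bayes_online_step(mu, c, x, sigma):
    """One Bayesian on-line update for a perceptron teacher, with a
    Gaussian posterior N(mu, c*I) re-projected to spherical form."""
    N = mu.size
    s = sqrt(c * (x @ x))          # predictive std of the local field
    z = sigma * (mu @ x) / s       # signed, normalized field
    r = pdf(z) / cdf(z)            # inverse Mills ratio
    mu = mu + sigma * (c / s) * r * x      # posterior mean update
    c = c * (1.0 - r * (z + r) / N)        # spherical covariance update
    return mu, c
```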
Abstract:
We prove that if f is a real-valued lower semicontinuous function on a Banach space X, and if there exists a C^1, real-valued, Lipschitz continuous function on X with bounded support that is not identically zero, then f is Lipschitz continuous with constant K provided all lower subgradients of f are bounded by K. As an application, we give a regularity result for viscosity supersolutions (or subsolutions) of Hamilton-Jacobi equations in infinite dimensions that satisfy a coercivity condition. This last result slightly improves earlier work by G. Barles and H. Ishii.
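In symbols (our notation, with ∂⁻f standing for the lower subdifferential used in the paper):

```latex
% If X admits a nontrivial C^1 Lipschitz bump function, then
\[
  \sup_{x \in X} \; \sup_{p \in \partial^{-} f(x)} \|p\| \;\le\; K
  \quad\Longrightarrow\quad
  |f(x) - f(y)| \;\le\; K\,\|x - y\| \quad \text{for all } x, y \in X .
\]
```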
Deformation Lemma, Ljusternik-Schnirelmann Theory and Mountain Pass Theorem on C^1-Finsler Manifolds
Abstract:
We overview our recent developments in the theory of dispersion-managed (DM) solitons in the context of optical applications. First, we present a class of localized solutions whose period is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced model based on ordinary differential equations, we discuss the key features of these structures, such as a smaller energy compared to traditional DM solitons of the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
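As a companion to the reduced models mentioned above, here is a minimal split-step Fourier sketch of the nonlinear Schrödinger equation with a periodic two-step dispersion map; this is the standard direct numerical approach, not the paper's ODE or variational reduction, and the map parameters and sign convention below are illustrative.

```python
import numpy as np

def split_step_nlse(u0, t, n_steps, dz, beta2_of_z, gamma=1.0):
    """Symmetric split-step Fourier integration of
       dA/dz = -i (beta2(z)/2) A_tt + i gamma |A|^2 A
    with a z-dependent (here periodic, piecewise-constant) dispersion."""
    u = u0.astype(complex)
    dt = t[1] - t[0]
    omega = 2.0 * np.pi * np.fft.fftfreq(t.size, d=dt)
    z = 0.0
    for _ in range(n_steps):
        # half step of dispersion, applied in Fourier space
        half = np.exp(0.5j * beta2_of_z(z) * omega**2 * (dz / 2.0))
        u = np.fft.ifft(half * np.fft.fft(u))
        # full nonlinear step in physical space (|u| is conserved here)
        u *= np.exp(1j * gamma * np.abs(u)**2 * dz)
        # second half step of dispersion
        u = np.fft.ifft(half * np.fft.fft(u))
        z += dz
    return u

# Two-step dispersion map: alternating anomalous / normal segments.
def beta2_map(z, L=1.0, b_anom=-1.2, b_norm=1.0):
    return b_anom if (z % L) < L / 2.0 else b_norm

# Launch a Gaussian pulse through four map periods.
t = np.linspace(-20.0, 20.0, 1024, endpoint=False)
u = split_step_nlse(np.exp(-t**2), t, n_steps=400, dz=0.01, beta2_of_z=beta2_map)
```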
Abstract:
Mathematics Subject Classification: 26A33; 70H03, 70H25, 70S05; 49S05
Abstract:
MSC 2010: 26A33, 70H25, 46F12, 34K37. Dedicated to the 80th birthday of Prof. Rudolf Gorenflo.
Abstract:
2000 Mathematics Subject Classification: 90C26, 90C20, 49J52, 47H05, 47J20.
Abstract:
2000 Mathematics Subject Classification: 35J40, 49J52, 49J40, 46E30
Abstract:
2000 Mathematics Subject Classification: 49L20, 60J60, 93E20
Abstract:
MSC 2010: 49K05, 26A33
Abstract:
2000 Mathematics Subject Classification: 49J52, 49J50, 58C20, 26B09.