950 results for Convex Arcs
Abstract:
There is a natural norm associated with a starting point of the homogeneous self-dual (HSD) embedding model for conic convex optimization. In this norm, two measures of the HSD model's behavior are precisely controlled independent of the problem instance: (i) the sizes of ε-optimal solutions, and (ii) the maximum distance of ε-optimal solutions to the boundary of the cone of the HSD variables. This norm is also useful in developing a stopping-rule theory for HSD-based interior-point methods such as SeDuMi. Under mild assumptions, we show that a standard stopping rule implicitly involves the sum of the sizes of the ε-optimal primal and dual solutions, as well as the size of the initial primal and dual infeasibility residuals. This theory suggests possible criteria for developing starting points for the homogeneous self-dual model that might improve the resulting solution time in practice.
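The kind of stopping rule discussed above can be illustrated with a minimal sketch: terminate when the primal infeasibility residual, the dual infeasibility residual, and the duality gap are all small relative to the sizes of the current iterates. The function name, the specific residual scalings, and the test problem below are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def hsd_stopping_rule(A, b, c, x, y, s, tau, eps=1e-8):
    """Illustrative SeDuMi-style stopping test for the conic problem
    min c@x s.t. A@x = b, x in a cone, using HSD variables (x, y, s, tau).
    The scaling below is an assumption chosen for the sketch, not the
    paper's precise rule."""
    rp = np.linalg.norm(A @ x - tau * b)         # primal infeasibility residual
    rd = np.linalg.norm(A.T @ y + s - tau * c)   # dual infeasibility residual
    gap = abs(c @ x - b @ y)                     # duality gap
    # Scale by the sizes of the iterates, so the test is size-dependent,
    # in the spirit of the stopping-rule theory described in the abstract.
    scale = tau + np.linalg.norm(x) + np.linalg.norm(s)
    return max(rp, rd, gap) / scale <= eps
```

For a tiny linear program with a known optimal primal-dual pair, the rule fires at the optimum and not at a suboptimal feasible point.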
Abstract:
M. Hieber, I. Wood: The Dirichlet problem in convex bounded domains for operators with L^\infty-coefficients, Diff. Int. Eq. 20, no. 7 (2007), 721-734.
Abstract:
Iantchenko, A. (2007) 'Scattering poles near the real axis for two strictly convex obstacles', Annales de l'Institut Henri Poincaré 8, pp. 513-568. RAE2008
Abstract:
Wood, Ian; Hieber, M. (2007) 'The Dirichlet problem in convex bounded domains for operators with L^\infty-coefficients', Differential and Integral Equations 20, pp. 721-734. RAE2008
Abstract:
According to Michael's selection theorem, any surjective continuous linear operator from one Fr\'echet space onto another has a continuous (not necessarily linear) right inverse. Using this theorem, Herzog and Lemmert proved that if $E$ is a Fr\'echet space and $T:E\to E$ is a continuous linear operator such that the Cauchy problem $\dot x=Tx$, $x(0)=x_0$ is solvable on $[0,1]$ for any $x_0\in E$, then for any $f\in C([0,1],E)$ there exists a continuous map $S:[0,1]\times E\to E$, $(t,x)\mapsto S_tx$, such that for any $x_0\in E$ the function $x(t)=S_tx_0$ is a solution of the Cauchy problem $\dot x(t)=Tx(t)+f(t)$, $x(0)=x_0$ (they call $S$ a fundamental system of solutions of the equation $\dot x=Tx+f$). We prove the same theorem, replacing "continuous" by "sequentially continuous", for locally convex spaces from a class which contains strict inductive limits of Fr\'echet spaces and strong duals of Fr\'echet--Schwartz spaces and is closed with respect to finite products and sequentially closed subspaces. The key point of the proof is an extension of the theorem on the existence of a sequentially continuous right inverse of any surjective sequentially continuous linear operator to some class of non-metrizable locally convex spaces.
Abstract:
A locally convex space X is said to be integrally complete if each continuous mapping f: [0, 1] --> X is Riemann integrable. A criterion for integral completeness is established, and readily verifiable sufficient conditions for integral completeness are proved.
Abstract:
In the present paper we prove several results on the stratifiability of locally convex spaces. In particular, we show that a free locally convex sum of an arbitrary set of stratifiable LCS is a stratifiable LCS, and that all locally convex F'-spaces whose bounded subsets are metrizable are stratifiable. Moreover, we prove that a strict inductive limit of metrizable LCS is stratifiable and establish the stratifiability of many important general and specific spaces used in functional analysis. We also construct some examples that clarify the relationship between stratifiability and other properties.
Abstract:
Let $\Gamma$ be the class of sequentially complete locally convex spaces such that an existence theorem holds for the linear Cauchy problem $\dot x = Ax$, $x(0) = x_0$ with respect to functions $x: R\to E$. It is proved that if $E\in \Gamma$, then $E\times R^A\in\Gamma$ for an arbitrary set $A$. It is also proved that a topological product of infinitely many infinite-dimensional Fr\'echet spaces, each not isomorphic to $\omega$, does not belong to $\Gamma$.
Abstract:
This letter introduces the convex variable step-size (CVSS) algorithm. The convexity of the resulting cost function is guaranteed. Simulations show that the proposed algorithm matches the VSS algorithm in initial convergence, while offering potential performance gains when abrupt changes occur.
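The abstract does not give the CVSS update itself. For context, here is a minimal sketch of a classic variable step-size LMS filter of the type CVSS is compared against, using the well-known update mu(n+1) = alpha*mu(n) + gamma*e(n)^2; the parameter values and the system-identification setup are illustrative assumptions, and this is not the CVSS algorithm.

```python
import numpy as np

def vss_lms(x, d, L=4, mu0=0.05, alpha=0.97, gamma=4.8e-3,
            mu_min=1e-3, mu_max=0.1):
    """Variable step-size LMS sketch (Kwong-Johnston style update).
    x: input signal, d: desired signal, L: filter length.
    Parameter values here are illustrative, not from the letter."""
    w = np.zeros(L)                        # adaptive filter weights
    mu = mu0                               # time-varying step size
    errs = np.zeros(len(x))
    for n in range(L - 1, len(x)):
        u = x[n - L + 1:n + 1][::-1]       # regressor [x[n], ..., x[n-L+1]]
        e = d[n] - w @ u                   # a priori error
        w = w + mu * e * u                 # LMS weight update
        # Step size grows with large errors (fast tracking of abrupt
        # changes) and shrinks as the error dies out (low misadjustment).
        mu = np.clip(alpha * mu + gamma * e * e, mu_min, mu_max)
        errs[n] = e
    return w, errs
```

A typical use is identifying an unknown FIR system: feed the same input through the unknown system and the adaptive filter, and the weights converge toward the system's impulse response.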