38 results for Lipschitz trivial
Abstract:
Bacteria are the dominant form of life on the planet: they can survive in very adverse environments, and in some cases they can produce substances that are toxic to us when ingested. Their presence in food makes predictive microbiology an essential field within food microbiology for guaranteeing food safety. A bacterial culture can pass through four growth phases: lag, exponential, stationary and death. This work advances the understanding of the phenomena intrinsic to the lag phase, which is of great interest in predictive microbiology. The study, carried out over four years, was approached with the Individual-based Modelling (IbM) methodology using the INDISIM simulator (INDividual DIScrete SIMulation), which was improved for this purpose. INDISIM made it possible to study two causes of the lag phase separately, and to address the behaviour of the culture from a mesoscopic perspective. It was found that the lag phase must be studied as a dynamic process, not defined by a single parameter. Studying the evolution of variables such as the distribution of individual properties across the population (for instance, the mass distribution) or the growth rate made it possible to distinguish two stages within the lag phase, an initial stage and a transition stage, and to deepen the understanding of what happens at the cellular level. Several results predicted by the simulations were observed experimentally with flow cytometry. The agreement between simulations and experiments is neither trivial nor accidental: the system under study is a complex system, so the agreement over time of several interrelated parameters supports the methodology used in the simulations. It can therefore be stated that the soundness of the INDISIM methodology has been verified experimentally.
Abstract:
The classical Łojasiewicz inequality and its extensions to partial differential equation problems (Simon) and to o-minimal structures (Kurdyka) have had a considerable impact on the analysis of gradient-like methods and related problems: minimization methods, complexity theory, asymptotic analysis of dissipative partial differential equations, tame geometry. This paper provides alternative characterizations of this type of inequality for nonsmooth lower semicontinuous functions defined on a metric or a real Hilbert space. In a metric context, we show that a generalized form of the Łojasiewicz inequality (hereby called the Kurdyka-Łojasiewicz inequality) relates to metric regularity and to the Lipschitz continuity of the sublevel mapping, yielding applications to discrete methods (strong convergence of the proximal algorithm). In a Hilbert setting we further establish that asymptotic properties of the semiflow generated by -∂f are strongly linked to this inequality. This is done by introducing the notion of a piecewise subgradient curve: such curves have uniformly bounded lengths if and only if the Kurdyka-Łojasiewicz inequality is satisfied. Further characterizations in terms of talweg lines - a concept linked to the location of the least steep points on the level sets of f - and integrability conditions are given. In the convex case these results are significantly reinforced, allowing us in particular to establish the asymptotic equivalence of discrete gradient methods and continuous gradient curves. On the other hand, a counterexample of a convex C^2 function in R^2 is constructed to illustrate the fact that, contrary to our intuition, and unless a specific growth condition is satisfied, convex functions may fail to fulfill the Kurdyka-Łojasiewicz inequality.
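For reference, a standard smooth formulation of the inequality named in this abstract is the following (a sketch only; the paper's nonsmooth statement replaces the gradient norm by a metric slope or subdifferential):

```latex
% Kurdyka-Lojasiewicz inequality at a critical point \bar x (smooth sketch):
% there exist \eta > 0, a neighbourhood U of \bar x, and a concave
% desingularizing function \varphi \in C[0,\eta) \cap C^1(0,\eta)
% with \varphi(0) = 0 and \varphi' > 0, such that
\varphi'\bigl(f(x) - f(\bar x)\bigr)\,\bigl\|\nabla f(x)\bigr\| \;\ge\; 1
\qquad \text{for all } x \in U \text{ with } f(\bar x) < f(x) < f(\bar x) + \eta .
% The classical Lojasiewicz inequality is the case
% \varphi(s) = c\, s^{1-\theta}, \ \theta \in [0,1), which rearranges to
\bigl\|\nabla f(x)\bigr\| \;\ge\; C\,\bigl|f(x) - f(\bar x)\bigr|^{\theta}.
```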
Abstract:
CODEX SEARCH is an information retrieval engine specialized in immigration law, based on linguistic tools and knowledge. A retrieval engine, or Information Retrieval System (IRS), is a piece of software capable of locating information in large document collections (a non-trivial setting) in electronic format. A preliminary study showed that immigration law is a discursive domain in which it is difficult to express an information need in terms of a formal query, which is what current retrieval systems expect. Therefore, developing an efficient IRS for this domain requires more than a traditional IR model, i.e. comparing the terms of the query with those of the answer, essentially because such terms do not express implications and because there need not be a one-to-one relationship between them. The proposed linguistic solution is to incorporate the specialist's knowledge by integrating a case library into the system. Cases are examples of procedures applied by experts to solve problems that have occurred in practice and that ended in success or failure. The results obtained in this first phase are very encouraging, but further research is needed to improve the performance of the prototype, which can be accessed at http://161.116.36.139/~codex/.
Abstract:
We study the existence of solutions to general measure-minimization problems over topological classes that are stable under localized Lipschitz homotopy, including the standard Plateau problem without the need for restrictive assumptions such as orientability or even rectifiability of surfaces. In the case of problems posed over an open and bounded domain, we establish the existence of a "minimal candidate", obtained as the limit for the local Hausdorff convergence of a minimizing sequence for which the measure is lower semicontinuous. Although we do not yet give a way to control the topological constraint when taking the limit - except for some examples of topological classes preserving local separation or for periodic two-dimensional sets - we prove that this candidate is an Almgren-minimal set. Thus, using regularity results such as Jean Taylor's theorem, this could be a way to find solutions to the above minimization problems in a generic setting, in arbitrary dimension and codimension.
Abstract:
In this paper, we investigate the agency costs of government ownership and their impact on corporate governance and firm value. China is used as a laboratory because of the prevalent state shareholdings in exchange-listed firms. In this context, we specifically consider the trade-offs involved in the voluntary formation of an audit committee when the controlling shareholder is the state. The decision to improve corporate governance (in this case, to introduce an audit committee) is shown to be value-relevant and a function of existing agency relationships and non-trivial implementation costs. Our findings are robust to the level of pyramid groups, the ownership-control wedge, and financial leverage. The research adds to the debate regarding the effect of government shareholdings on corporate culture and performance - a topic that has taken on renewed importance in recent times.
Abstract:
The usual way to investigate the statistical properties of finitely generated subgroups of free groups, and of finite presentations of groups, is based on the so-called word-based distribution: subgroups are generated (finite presentations are determined) by randomly chosen k-tuples of reduced words, whose maximal length is allowed to tend to infinity. In this paper we adopt a different, though equally natural point of view: we investigate the statistical properties of the same objects, but with respect to the so-called graph-based distribution, recently introduced by Bassino, Nicaud and Weil. Here, subgroups (and finite presentations) are determined by randomly chosen Stallings graphs whose number of vertices tends to infinity. Our results show that these two distributions behave quite differently from each other, shedding new light on which properties of finitely generated subgroups can be considered frequent or rare. For example, we show that malnormal subgroups of a free group are negligible in the graph-based distribution, while they are exponentially generic in the word-based distribution. Quite surprisingly, a random finite presentation generically presents the trivial group in this new distribution, while in the classical one it is known to generically present an infinite hyperbolic group.
Abstract:
Vegetation maps are often used as proxies for a habitat stratification in order to generate continuous geographic distributions of organisms from discrete data by means of multivariate models. However, vegetation maps are usually ill-suited to being applied directly to this end, since their categories were not conceived to correspond to habitat types. In this article we present and apply the Double-Criterion Clustering method to generalize an extraordinarily detailed vegetation map (350 classes) of the Montseny Natural Park (Catalonia) into categories that remain coherent both from a structural point of view (through a spectral dissimilarity matrix computed from a SPOT-5 satellite image) and in terms of vegetation (through a dissimilarity matrix computed from vegetation properties derived from the map's hierarchical legend). The method simplifies 67% of the study area from 114 to 18 classes. Adding other, more trivial aggregations based exclusively on land-cover criteria, 73% of the study area goes from 167 to 25 categories. As an added value, the method identifies 10% of the original polygons as anomalous (by comparing the spectral properties of each polygon with those of the rest of its class), which implies either land-cover changes between the date of the source material used to generate the original map and that of the satellite image, or errors in the map's production.
Abstract:
This paper studies global webs on the projective plane with vanishing curvature. The study is based on an interplay of local and global arguments. The main local ingredient is a criterion for the regularity of the curvature in the neighborhood of a generic point of the discriminant. The main global ingredient, the Legendre transform, is an avatar of classical projective duality in the realm of differential equations. We show that the Legendre transforms of what we call reduced convex foliations are webs with zero curvature, and we exhibit a countably infinite family of convex foliations which give rise to a family of webs with zero curvature not admitting non-trivial deformations with zero curvature.
Abstract:
The paper develops a stability theory for the optimal value and the optimal set mapping of optimization problems posed in a Banach space. The problems considered in this paper have an arbitrary number of inequality constraints involving lower semicontinuous (not necessarily convex) functions and one closed abstract constraint set. The considered perturbations lead to problems of the same type as the nominal one (with the same space of variables and the same number of constraints), where the abstract constraint set can also be perturbed. The spaces of functions involved in the problems (objective and constraints) are equipped with the metric of uniform convergence on bounded sets, while in the space of closed sets we consider, coherently, the Attouch-Wets topology. The paper examines, in a unified way, the lower and upper semicontinuity of the optimal value function, and the closedness, lower and upper semicontinuity (in the sense of Berge) of the optimal set mapping. This paper can be seen as a second part of the stability theory presented in [17], where we studied the stability of the feasible set mapping (completed here with the analysis of the Lipschitz-like property).
Abstract:
In the world of video games, realism is a very important aspect to consider, since it gives the user a greater sense of immersion in the game. This is achieved in part by making the dynamics of objects realistic, having them follow Newton's laws of physics. To that end, several libraries known as "physics engines" have been developed, which work with variables such as mass, velocity, friction and wind resistance. The objectives of this project are to study different existing physics libraries, compare them, and see how they integrate into game engines. In addition, generating content whose behaviour responds to the functions defined in these libraries is not trivial, so an application will also be developed to generate, semi-automatically, walls that respond to impacts. To achieve these objectives it will be necessary, on the one hand, to compare the rigid bodies, joints and general operation of different physics libraries: Newton Game Dynamics, NVIDIA PhysX Technology, Open Dynamics Engine, Bullet Physics Library, Tokamak Physics Engine and Havok; and, on the other hand, to implement an application that, given a plan-view image of a wall or set of walls in vector format and the dimensions of a brick, generates walls that can react appropriately when hit by a given mass. The application will be implemented in C++ with the Microsoft Visual Studio 2005 development environment. Visualization will use OpenGL.
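The core of the wall-generation step described above can be illustrated with a minimal sketch. This is not the project's C++ application - it is a hypothetical Python illustration of the placement logic alone (a straight wall segment, a running-bond pattern, full bricks only), with all names invented here:

```python
# Minimal sketch of wall generation: given a straight wall segment and
# brick dimensions, place full bricks in a running-bond pattern (every
# other course shifted by half a brick). The real application works on
# vector plan-view input and feeds the result to a physics engine; this
# illustration only computes brick placements.

def generate_wall(wall_length, wall_height, brick_w, brick_h):
    """Return a list of (x, y) lower-left corners of full bricks."""
    bricks = []
    y = 0.0
    course = 0
    while y + brick_h <= wall_height + 1e-9:
        # odd courses start half a brick in (running bond)
        x = (brick_w / 2.0) if course % 2 else 0.0
        while x + brick_w <= wall_length + 1e-9:
            bricks.append((x, y))
            x += brick_w
        y += brick_h
        course += 1
    return bricks

bricks = generate_wall(4.0, 2.0, 1.0, 1.0)
# course 0 holds 4 bricks; course 1, offset by half a brick, holds 3
print(len(bricks))  # → 7
```

In the actual application each brick would then be registered as a rigid body in the chosen physics library so the wall can react to impacts.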
Abstract:
Motivated by the work of Mateu, Orobitg, Pérez and Verdera, who proved inequalities of the form $T_*f\lesssim M(Tf)$ or $T_*f\lesssim M^2(Tf)$ for certain singular integral operators $T$, such as the Hilbert or the Beurling transforms, we study the possibility of establishing this type of control for the Cauchy transform along a Lipschitz graph. We show that this is not possible in general, and we give a partial positive result when the graph is substituted by a Jordan curve.
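For readers outside harmonic analysis, the operators appearing in the inequalities above have the following standard definitions (a sketch; the kernel shown for the Cauchy transform is the usual one, with integration against arc length on the curve):

```latex
% Truncated maximal operator associated with a singular integral T:
T_* f(x) \;=\; \sup_{\varepsilon > 0} \bigl| T_\varepsilon f(x) \bigr|,
\qquad
T_\varepsilon f(x) \;=\; \int_{|x - y| > \varepsilon} K(x, y)\, f(y)\, dy .
% Hardy--Littlewood maximal function (M^2 = M \circ M is its iteration):
M f(x) \;=\; \sup_{r > 0} \frac{1}{|B(x, r)|} \int_{B(x, r)} |f(y)|\, dy .
% For the Cauchy transform along a Lipschitz graph \Gamma,
% K(z, w) = 1/(z - w), integrating with respect to arc length on \Gamma.
```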
Abstract:
This article analyzes Følner sequences of projections for bounded linear operators and their relationship to the class of finite operators introduced by Williams in the 1970s. We prove that each essentially hyponormal operator has a proper Følner sequence (i.e. a Følner sequence of projections strongly converging to 1). In particular, any quasinormal, any subnormal, any hyponormal and any essentially normal operator has a proper Følner sequence. Moreover, we show that an operator is finite if and only if it has a proper Følner sequence or a non-trivial finite-dimensional reducing subspace. We also analyze the structure of operators which have no Følner sequence and give examples of them. For this analysis we introduce the notion of strongly non-Følner operators, which are far from finite block-reducible operators in some uniform sense, and show that this class coincides with the class of non-finite operators.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, whether the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance to a co-integration framework, or one involving stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting equation and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration process of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented from scratch in MATLAB as part of this thesis. No other mathematical or statistical software was used.
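The Ornstein-Uhlenbeck dynamics mentioned in this abstract can be made concrete with a short simulation sketch. The thesis itself uses MATLAB; the Python version below is an illustration only, with illustrative parameter names, using the exact transition density of the process rather than an Euler scheme so the step size introduces no discretization bias:

```python
# Sketch of the Ornstein-Uhlenbeck mean-reverting dynamics underlying
# pairs-trading models:
#     dX_t = theta * (mu - X_t) dt + sigma dW_t
# where mu is the long-run level of the spread, theta the speed of
# mean reversion and sigma the noise intensity.
import math
import numpy as np

def simulate_ou(x0, mu, theta, sigma, dt, n_steps, seed=0):
    """Simulate one OU path with the exact (bias-free) discretization."""
    rng = np.random.default_rng(seed)
    a = math.exp(-theta * dt)  # per-step decay factor toward mu
    std = sigma * math.sqrt((1 - a * a) / (2 * theta)) if sigma > 0 else 0.0
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        x[i + 1] = mu + (x[i] - mu) * a + std * rng.standard_normal()
    return x

# Sanity check: with sigma = 0 the spread decays deterministically to mu.
path = simulate_ou(x0=2.0, mu=1.0, theta=5.0, sigma=0.0, dt=0.01, n_steps=500)
print(abs(path[-1] - 1.0) < 1e-6)  # → True
```

A pairs-trading rule would then open a position when the simulated (or fitted) spread deviates sufficiently from mu and close it on reversion; calibrating theta, mu and sigma to recent data is the frequently repeated step the paragraph above refers to.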
Abstract:
Recent studies of American politics show that the political polarization of both the electorate and the political elite has moved 'almost in tandem for the past half century' (McCarty et al., 2003, p.2), and that party polarization has steadily increased since the 1970s. On the other hand, the empirical literature on party platforms and implemented policies has consistently found an imperfect but non-negligible correlation between electoral platforms and governmental policies: while platforms tend to be polarized, policies are moderate or centrist. However, existing theoretical models of political competition are not manifestly compatible with these observations. In this paper, we distinguish between electoral platforms and implemented policies by incorporating a non-trivial policy-setting process. It follows that voters may care not only about the implemented policy but also about the platform they support with their vote. We find that while parties tend to polarize their positions, the risk of alienating their constituency prevents them from radicalizing. The analysis shows that the distribution of the electorate, and not only the (expected) location of a pivotal voter, matters in determining policies. Our results are consistent with the observation of polarized platforms and moderate policies, and with the alienation and indifference components of abstention.
Abstract:
Most central banks perceive a trade-off between stabilizing inflation and stabilizing the gap between output and desired output. However, the standard new Keynesian framework implies no such trade-off. In that framework, stabilizing inflation is equivalent to stabilizing the welfare-relevant output gap. In this paper, we argue that this property of the new Keynesian framework, which we call the divine coincidence, is due to a special feature of the model: the absence of non-trivial real imperfections. We focus on one such real imperfection, namely, real wage rigidities. When the baseline new Keynesian model is extended to allow for real wage rigidities, the divine coincidence disappears, and central banks indeed face a trade-off between stabilizing inflation and stabilizing the welfare-relevant output gap. We show that not only does the extended model have more realistic normative implications, but it also has appealing positive properties. In particular, it provides a natural interpretation for the dynamic inflation-unemployment relation found in the data.