958 results for Boyd-Lawton theorem
Abstract:
In this paper, we develop numerical algorithms with small storage and operation-count requirements for the computation of invariant tori in Hamiltonian systems (exact symplectic maps and Hamiltonian vector fields). The algorithms are based on the parameterization method and follow closely the proof of the KAM theorem given in [LGJV05] and [FLS07]. They essentially consist in solving, by a Newton method, a functional equation satisfied by the invariant tori. Using some geometric identities, it is possible to perform a Newton step using little storage and few operations. In this paper we focus on the numerical issues of the algorithms (speed, storage and stability) and refer to the papers mentioned above for the rigorous results. We show how to compute efficiently both maximal invariant tori and whiskered tori, together with the associated invariant stable and unstable manifolds of the latter. Moreover, we present fast algorithms for the iteration of quasi-periodic cocycles and for the computation of the invariant bundles, which is a preliminary step in the computation of whiskered invariant tori. Since quasi-periodic cocycles appear in other contexts, this section may be of independent interest. The numerical methods presented here make it possible to compute, in a unified way, both primary and secondary invariant KAM tori. Secondary tori are invariant tori that can be contracted to a periodic orbit. We present some preliminary results ensuring that the methods are indeed implementable and fast. We postpone optimized implementations and results on the breakdown of invariant tori to a future paper.
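For orientation, the functional equation at the heart of the parameterization method, and the linearized equation solved in each Newton step, can be written schematically as follows (shown here for an exact symplectic map F with frequency vector ω; this is only a sketch of the standard setting, and the cited papers give the precise hypotheses and the geometric identities that make the step cheap):

```latex
% Invariance equation for a parameterization K : \mathbb{T}^d \to M of the torus:
F(K(\theta)) = K(\theta + \omega).

% Given an approximation K with error
E(\theta) = F(K(\theta)) - K(\theta + \omega),

% one Newton step solves the linearized equation for the correction \Delta
% and updates K \leftarrow K + \Delta:
DF(K(\theta))\,\Delta(\theta) - \Delta(\theta + \omega) = -E(\theta).
```

The geometric identities referred to in the abstract are what allow this linearized equation to be reduced to simpler cohomological equations solvable in Fourier space, which is what keeps the storage and operation counts low.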
Abstract:
Two logically distinct and permissive extensions of iterative weak dominance are introduced for games with possibly vector-valued payoffs. The first, iterative partial dominance, builds on an easy-to-check condition but may lead to solutions that do not include any (generalized) Nash equilibria. However, the second and intuitively more demanding extension, iterative essential dominance, is shown to be an equilibrium refinement. The latter result includes Moulin's (1979) classic theorem as a special case when all players' payoffs are real-valued. Therefore, essential dominance solvability can be a useful solution concept for making sharper predictions in multicriteria games that feature a plethora of equilibria.
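As background for the dominance notions extended here, a minimal sketch of standard iterated elimination of weakly dominated strategies in a finite two-player game with real-valued payoffs (the special case covered by Moulin's theorem) might look as follows. The payoff matrices and the alternating elimination order are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def weakly_dominated(P):
    """Indices of row strategies of payoff matrix P that are weakly dominated
    (rows = own strategies, columns = opponent strategies)."""
    dominated = set()
    for i in range(P.shape[0]):
        for j in range(P.shape[0]):
            if i != j and np.all(P[j] >= P[i]) and np.any(P[j] > P[i]):
                dominated.add(i)
                break
    return dominated

def iterated_weak_dominance(A, B):
    """Alternately remove weakly dominated rows (player 1, payoffs A)
    and columns (player 2, payoffs B) until no elimination is possible."""
    rows, cols = list(range(A.shape[0])), list(range(A.shape[1]))
    while True:
        dr = weakly_dominated(A[np.ix_(rows, cols)])
        if dr:
            rows = [r for k, r in enumerate(rows) if k not in dr]
            continue
        dc = weakly_dominated(B[np.ix_(rows, cols)].T)
        if dc:
            cols = [c for k, c in enumerate(cols) if k not in dc]
            continue
        return rows, cols

# Illustrative 2x2 game: the row player's second strategy weakly dominates the first.
A = np.array([[1, 0], [1, 1]])   # row player's payoffs
B = np.array([[1, 1], [0, 1]])   # column player's payoffs
print(iterated_weak_dominance(A, B))  # surviving strategy indices, here ([1], [1])
```

Note that, as is well known, the surviving set can depend on the order of elimination; the sketch fixes one order purely for concreteness.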
Abstract:
In the line of research opened by Kalai and Muller (1997), we explore new conditions on preference domains which make it possible to avoid Arrow's impossibility result. In our main theorem, we provide a complete characterization of the domains admitting nondictatorial Arrovian social welfare functions with ties (i.e. including indifference in the range) by introducing a notion of strict decomposability. In the proof, we use integer programming tools, following an approach first applied to social choice theory by Sethuraman, Teo and Vohra ((2003), (2006)). In order to obtain a representation of Arrovian social welfare functions whose range can include indifference, we generalize Sethuraman et al.'s work and specify integer programs in which variables are allowed to assume values in the set {0, 1/2, 1}: indeed, we show that there exists a one-to-one correspondence between the solutions of an integer program defined on this set and the set of all Arrovian social welfare functions, without restrictions on the range.
Abstract:
We consider nonlinear elliptic problems involving a nonlocal operator: the square root of the Laplacian in a bounded domain with zero Dirichlet boundary conditions. For positive solutions to problems with power nonlinearities, we establish existence and regularity results, as well as a priori estimates of Gidas-Spruck type. In addition, among other results, we prove a symmetry theorem of Gidas-Ni-Nirenberg type.
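For readers unfamiliar with the operator, the square root of the Laplacian on a bounded domain with zero Dirichlet boundary conditions is commonly defined spectrally; the standard definition is recalled below for orientation (the paper's precise functional setting may differ in details):

```latex
% If \{\varphi_k\}_{k\ge 1} are the Dirichlet eigenfunctions of -\Delta on \Omega,
% with eigenvalues \lambda_k, and u = \sum_k a_k \varphi_k, then
(-\Delta)^{1/2} u = \sum_{k \ge 1} a_k\, \lambda_k^{1/2}\, \varphi_k .
```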
Abstract:
New sufficient conditions for representation of a function via the absolutely convergent Fourier integral are obtained in the paper. In the main result, Theorem 1.1, this is controlled by the behavior near infinity of both the function and its derivative. This result is extended to any dimension d ≥ 2.
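For context, "representation via the absolutely convergent Fourier integral" is usually understood in the following sense (a standard formulation; the precise conditions on the function and its derivative are those of Theorem 1.1 of the paper):

```latex
% f admits an absolutely convergent Fourier integral representation if
f(x) = \int_{\mathbb{R}^d} g(\xi)\, e^{i x \cdot \xi}\, d\xi
\quad \text{for some } g \in L^1(\mathbb{R}^d).
```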
Abstract:
The relationship between the operator norms of fractional integral operators acting on weighted Lebesgue spaces and the constant of the weights is investigated. Sharp bounds are obtained for both the fractional integral operators and the associated fractional maximal functions. As an application, improved Sobolev inequalities are obtained. Some of the techniques used include a sharp off-diagonal version of the extrapolation theorem of Rubio de Francia and characterizations of two-weight norm inequalities.
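For reference, the operators in question are usually defined as follows (standard definitions, stated here only for orientation; the sharp dependence of the operator norms on the weight constants is the subject of the paper):

```latex
% Fractional integral (Riesz potential) of order 0 < \alpha < n:
I_\alpha f(x) = \int_{\mathbb{R}^n} \frac{f(y)}{|x - y|^{\,n - \alpha}}\, dy,

% and the associated fractional maximal function:
M_\alpha f(x) = \sup_{Q \ni x} |Q|^{\alpha/n - 1} \int_Q |f(y)|\, dy .
```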
Abstract:
We study the existence of solutions to general measure-minimization problems over topological classes that are stable under localized Lipschitz homotopy, including the standard Plateau problem, without the need for restrictive assumptions such as orientability or even rectifiability of the surfaces. In the case of problems over an open and bounded domain we establish the existence of a “minimal candidate”, obtained as the limit, in the local Hausdorff convergence, of a minimizing sequence for which the measure is lower semicontinuous. Although we do not yet give a way to control the topological constraint when passing to the limit (except for some examples of topological classes preserving local separation, or for periodic two-dimensional sets), we prove that this candidate is an Almgren-minimal set. Thus, using regularity results such as Jean Taylor’s theorem, this could be a way to find solutions to the above minimization problems in a generic setting in arbitrary dimension and codimension.
Abstract:
This paper provides an explicit cofibrant resolution of the operad encoding Batalin-Vilkovisky algebras. Thus it defines the notion of homotopy Batalin-Vilkovisky algebras with the required homotopy properties. To define this resolution we extend the theory of Koszul duality to operads and properads that are defined by quadratic and linear relations. The operad encoding Batalin-Vilkovisky algebras is shown to be Koszul in this sense. This allows us to prove a Poincaré-Birkhoff-Witt Theorem for such an operad and to give an explicit small quasi-free resolution for it. This particular resolution enables us to describe the deformation theory and homotopy theory of BV-algebras and of homotopy BV-algebras. We show that any topological conformal field theory carries a homotopy BV-algebra structure which lifts the BV-algebra structure on homology. The same result is proved for the singular chain complex of the double loop space of a topological space endowed with an action of the circle. We also prove the cyclic Deligne conjecture with this cofibrant resolution of the operad BV. We develop the general obstruction theory for algebras over the Koszul resolution of a properad and apply it to extend a conjecture of Lian-Zuckerman, showing that certain vertex algebras have an explicit homotopy BV-algebra structure.
Abstract:
Research project carried out by a secondary-school student and awarded a CIRIT Prize in 2009 for fostering the scientific spirit of young people. "La programació al servei de la matemàtica" is a computer program written in Excel and Visual Basic. It solves first-degree equations, second-degree equations, systems of two linear equations in two unknowns, consistent determined systems of three linear equations in three unknowns, and finds zeros of functions using Bolzano's theorem. In each case, it plots the solutions graphically. To achieve this, the project required working, on the mathematical side, with equations, complex numbers and Cramer's rule for solving systems, and devising a way to program an iterative method based on Bolzano's theorem. For the graphical part, it was worked out how to build tables of values with two and three variables and how to work with lines and planes. On the programming side, a language new to the student was used and, above all, it was necessary to decide where to place each instruction, since shifting it by a single line can change everything. In addition, other programming problems were solved and the screen layouts were designed.
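The zero-finding part of the project rests on Bolzano's theorem: a continuous function that changes sign on an interval has a root there. A minimal bisection sketch of that iterative idea, written here in Python rather than the project's Excel/Visual Basic, might look like this:

```python
def bisection(f, a, b, tol=1e-10, max_iter=200):
    """Find a zero of a continuous f on [a, b], assuming f(a) and f(b)
    have opposite signs (Bolzano's theorem then guarantees a root exists)."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2          # midpoint of the current bracketing interval
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m
        if fa * fm < 0:          # root lies in [a, m]
            b, fb = m, fm
        else:                    # root lies in [m, b]
            a, fa = m, fm
    return (a + b) / 2

# Example: root of x^3 - x - 2 on [1, 2] (approximately 1.5214)
print(bisection(lambda x: x**3 - x - 2, 1.0, 2.0))
```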
Abstract:
In cognition, common factors play a crucial role. For example, different types of intelligence are highly correlated, pointing to a common factor, which is often called g. One might expect that a similar common factor would also exist for vision. Surprisingly, no one in the field has addressed this issue. Here, we provide the first evidence that there is no common factor for vision. We tested 40 healthy students' performance in six basic visual paradigms: visual acuity, vernier discrimination, two visual backward masking paradigms, Gabor detection, and bisection discrimination. One might expect that performance levels on these tasks would be highly correlated because some individuals generally have better vision than others due to superior optics, better retinal or cortical processing, or enriched visual experience. However, only four out of 15 correlations were significant, two of which were nontrivial. These results cannot be explained by high intraobserver variability or ceiling effects because test-retest reliability was high and the variance in our student population is commensurate with that from other studies with well-sighted populations. Using a variety of tests (e.g., principal components analysis, Bayes theorem, test-retest reliability), we show the robustness of our null results. We suggest that neuroplasticity operates during everyday experience to generate marked individual differences. Our results apply only to the normally sighted population (i.e., restricted range sampling). For the entire population, including those with degenerate vision, we expect different results.
Abstract:
The Athlete Biological Passport (ABP) is an individual electronic document that collects data regarding a specific athlete and is useful in differentiating between natural physiological variations of selected biomarkers and deviations caused by artificial manipulations. A subsidiary of the endocrine module of the ABP, here called the Athlete Steroidal Passport (ASP), collects data on markers of an altered metabolism of endogenous steroidal hormones measured in urine samples. The ASP aims to identify not only doping with anabolic-androgenic steroids, but also most indirect steroid-doping strategies, such as doping with estrogen receptor antagonists and aromatase inhibitors. The development of specific markers of steroid doping, the use of the athlete's previous measurements to define individual limits (with the athlete becoming his or her own reference), the inclusion of heterogeneous factors such as the UDP-glucuronosyltransferase 2B17 (UGT2B17) genotype of the athlete, the knowledge of potentially confounding effects such as heavy alcohol consumption, the development of an external quality-control system to control analytical uncertainty, and finally the use of Bayesian inferential methods to evaluate the value of indirect evidence have made the ASP a valuable alternative for deterring steroid doping in elite sports. The ASP can be used to target athletes for gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) testing, to temporarily withdraw an athlete from competition when an abnormality has been detected, and ultimately to lead to an antidoping infraction if that abnormality cannot be explained by a medical condition. Although the ASP has been developed primarily to ensure fairness in elite sports, its application in endocrinology for clinical purposes is straightforward within an evidence-based medicine paradigm.
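To illustrate the idea of individual limits in which the athlete becomes his or her own reference, a toy Bayesian updating scheme is sketched below. This is a deliberately simplified conjugate normal-normal model with made-up numbers; it is not the adaptive model actually used in the ABP/ASP:

```python
import math

def update_individual_limits(prior_mean, prior_sd, measurements, assay_sd, z=2.58):
    """Combine a population prior with an athlete's own past measurements
    (sequential normal-normal update) and return individualized limits
    for the next observation."""
    prec = 1.0 / prior_sd**2          # prior precision
    mean = prior_mean
    data_prec = 1.0 / assay_sd**2
    for x in measurements:
        mean = (prec * mean + data_prec * x) / (prec + data_prec)
        prec += data_prec
    # predictive sd for the next measurement = posterior sd + assay noise
    pred_sd = math.sqrt(1.0 / prec + assay_sd**2)
    return mean - z * pred_sd, mean + z * pred_sd

# Made-up steroidal-marker values (e.g. a T/E-like ratio) for one athlete:
low, high = update_individual_limits(prior_mean=1.3, prior_sd=0.8,
                                     measurements=[1.1, 1.0, 1.2], assay_sd=0.15)
print(f"individual reference interval: [{low:.2f}, {high:.2f}]")
```

The point of the sketch is only that, as more of the athlete's own values are observed, the limits tighten around that athlete's baseline rather than around the population mean.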
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. First, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Second, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness.
Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
Abstract:
The main goal of this article is to give an explicit rigid analytic uniformization of the maximal toric quotient of the Jacobian of a Shimura curve over Q at a prime dividing the level exactly. This result can be viewed as complementary to the classical theorem of Cerednik and Drinfeld, which provides rigid analytic uniformizations at primes dividing the discriminant. As a corollary, we offer a proof of a conjecture formulated by M. Greenberg in his paper on Stark-Heegner points and quaternionic Shimura curves, thus making Greenberg's construction of local points on elliptic curves over Q unconditional.
Abstract:
Forensic scientists working in 12 state or private laboratories participated in collaborative tests to improve the reliability of the presentation of DNA data at trial. These tests were motivated by the growing criticism of the power of DNA evidence. The experts' conclusions in the tests are presented and discussed in the context of the Bayesian approach to interpretation. The use of a Bayesian approach and subjective probabilities in trace evaluation permits, in an easy and intuitive manner, the integration of any revision of the measure of uncertainty into the decision procedure in the light of new information. Such an integration is especially useful with forensic evidence. Furthermore, we believe that this probabilistic model is a useful tool (a) to assist scientists in the assessment of the value of scientific evidence, (b) to help jurists in the interpretation of judicial facts, and (c) to clarify the respective roles of scientists and of members of the court. Respondents to the survey were reluctant to apply this methodology in the assessment of DNA evidence.
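The Bayesian approach referred to here is usually expressed through the likelihood ratio, which separates the scientist's task (assessing the evidence) from the court's (assessing prior and posterior odds). In standard notation, with H_p and H_d the prosecution and defence propositions and E the evidence:

```latex
\underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}}_{\text{likelihood ratio}}
\;\times\;
\underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds}}
```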
Abstract:
In this paper we included a very broad representation of grass family diversity (84% of tribes and 42% of genera). Phylogenetic inference was based on three plastid DNA regions, rbcL, matK and trnL-F, using maximum parsimony and Bayesian methods. Our results resolved most of the previously unclear subfamily relationships within the major clades (BEP and PACCMAD), such as, among others: (i) the BEP and PACCMAD sister relationship, (ii) the composition of clades and the sister relationship of Ehrhartoideae and Bambusoideae + Pooideae, (iii) the paraphyly of tribe Bambuseae, (iv) the position of Gynerium as sister to Panicoideae, and (v) the phylogenetic position of Micrairoideae. By allowing a relatively large amount of missing data, we were able to increase taxon sampling in our analyses substantially, from 107 to 295 taxa. However, bootstrap support and, to a lesser extent, Bayesian inference posterior probabilities were generally lower in analyses involving missing data than in those not including them. We produced a fully resolved phylogenetic summary tree for the grass family at the subfamily level and indicated the most likely relationships of all the tribes included in our analysis.