987 results for Birkhoff and Von Neumann ergodic theorems
Abstract:
Let there be a positive (exogenous) probability that, at each date, the human species will disappear. We postulate an Ethical Observer (EO) who maximizes intertemporal welfare under this uncertainty, with expected-utility preferences. Various social welfare criteria entail alternative von Neumann-Morgenstern utility functions for the EO: utilitarian, Rawlsian, and an extension of the latter that corrects for the size of population. Our analysis covers, first, a cake-eating economy (without production), where the utilitarian and the Rawlsian recommend the same allocation. Second, a productive economy with education and capital, where it turns out that the recommendations of the two EOs are in general different. But when the utilitarian program diverges, we prove it is optimal for the extended Rawlsian to ignore the uncertainty concerning the possible disappearance of the human species in the future. We conclude by discussing the implications for intergenerational welfare maximization in the presence of global warming.
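As a minimal illustration of the kind of problem the utilitarian EO faces in the cake-eating case, the sketch below assumes log utility and a constant per-period survival probability (hypothetical choices for illustration, not the paper's actual model): with log utility, the first-order conditions make optimal consumption proportional to the survival weight of each date.

```python
import math

def optimal_cake_path(cake, survival, T):
    """Maximize sum_t survival**t * ln(c_t) subject to sum_t c_t = cake.
    With log utility the optimum allocates consumption in proportion
    to survival**t, so the extinction hazard tilts eating toward today."""
    weights = [survival ** t for t in range(T + 1)]
    total = sum(weights)
    return [cake * w / total for w in weights]

def welfare(path, survival):
    """Expected intertemporal welfare of a consumption path."""
    return sum(survival ** t * math.log(c) for t, c in enumerate(path))

path = optimal_cake_path(1.0, 0.9, 20)

# Any feasible reallocation between dates should lower expected welfare:
perturbed = path[:]
perturbed[0] += 0.01
perturbed[1] -= 0.01
assert welfare(path, 0.9) > welfare(perturbed, 0.9)
```

The strict inequality holds because the objective is strictly concave, so the proportional-to-survival path is the unique optimum.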
Abstract:
This paper studies several magnitudes related to the social reproduction approach. First, three fundamental ideas are emphasized, the notions of: (a) outputs minus inputs; (b) outputs divided by inputs; (c) subsystems. Next, the obstacles to the direct quantification of these concepts are highlighted, and the routes suggested for circumventing the difficulties (by means of theoretical constructs proposed by Leontief, von Neumann, and Sraffa) are reviewed. Two new indicators are then examined: the "specific rate of surplus", which refers to self-reproducible goods, and the "net reproduction coefficient", which applies to all basic goods. In passing, some hints are given for establishing indicators of the same kind in fields such as ecological economics and feminist economics. Finally, some conjectures concerning the appropriate direction of technical change are noted.
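Notions (a) and (b), and the "specific rate of surplus", can be made concrete with a toy physical input-output table (hypothetical two-good numbers, not data from the paper):

```python
# Hypothetical physical data: total inputs consumed and gross outputs of
# two self-reproducible goods, in physical units (illustration only).
inputs  = {"corn": 80.0, "iron": 10.0}
outputs = {"corn": 100.0, "iron": 12.0}

for good in inputs:
    surplus = outputs[good] - inputs[good]       # (a) outputs minus inputs
    ratio = outputs[good] / inputs[good]         # (b) outputs divided by inputs
    specific_rate = surplus / inputs[good]       # a 'specific rate of surplus'
    print(good, surplus, ratio, specific_rate)
```

For corn this gives a physical surplus of 20 units, an output/input ratio of 1.25, and a specific surplus rate of 0.25; the point of the theoretical constructs discussed in the paper is precisely that such direct physical quantification is rarely available in practice.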
Abstract:
Recently a new Bell inequality was introduced by Collins et al. [Phys. Rev. Lett. 88, 040404 (2002)], which is strongly resistant to noise for maximally entangled states of two d-dimensional quantum systems. We prove that a larger violation, or equivalently a stronger resistance to noise, is found for a nonmaximally entangled state. It is shown that the resistance to noise is not a good measure of nonlocality, and we introduce some other possible measures. The nonmaximally entangled state also turns out to be more robust under these alternative measures. From these results it follows that two von Neumann measurements per party may not be optimal for detecting nonlocality. For d=3,4, we point out some connections between this inequality and distillability. Indeed, we demonstrate that any state violating it, with the optimal von Neumann settings, is distillable.
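For d = 2 the family of inequalities in question reduces to the familiar CHSH inequality. As a side illustration (the CHSH case, not the paper's d-dimensional computation), the sketch below evaluates the CHSH value for the maximally entangled two-qubit state, using the standard closed-form correlator E(a, b) = cos(a − b) for von Neumann measurements at angles a, b in the x-z plane of the Bloch sphere:

```python
import math

def E(a, b):
    """Correlator for von Neumann measurements at angles a, b (x-z plane)
    on the maximally entangled state (|00> + |11>)/sqrt(2)."""
    return math.cos(a - b)

# Standard optimal settings for CHSH:
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, -math.pi / 4

S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(S)  # ≈ 2*sqrt(2), Tsirelson's bound, exceeding the local bound of 2
```
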
Abstract:
Purpose: Atheromatic plaque progression is affected, among other phenomena, by biomechanical, biochemical, and physiological factors. In this paper, the authors introduce a novel framework able to provide both morphological (vessel radius, plaque thickness, and type) and biomechanical (wall shear stress and Von Mises stress) indices of coronary arteries. Methods: First, the approach reconstructs the three-dimensional morphology of the vessel from intravascular ultrasound (IVUS) and angiographic sequences, requiring minimal user interaction. Then, a computational pipeline allows automatic assessment of fluid-dynamic and mechanical indices. Ten coronary arteries are analyzed, illustrating the capabilities of the tool and confirming previous technical and clinical observations. Results: The relations between the arterial indices obtained by IVUS measurement and simulations have been quantitatively analyzed along the whole surface of the artery, extending the analysis of the coronary arteries shown in previous state-of-the-art studies. Additionally, for the first time in the literature, the framework allows the computation of the membrane stresses using a simplified mechanical model of the arterial wall. Conclusions: Circumferentially (within a given frame), statistical analysis shows an inverse relation between the wall shear stress and the plaque thickness. At the global level (comparing a frame with the entire vessel), it is observed that heavy plaque accumulations are in general calcified and are located in the areas of the vessel having high wall shear stress. Finally, in these experiments an inverse proportionality between fluid and structural stresses is observed.
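The Von Mises stress index reported by such frameworks follows from the principal stresses via the standard equivalent-stress formula; a minimal sketch (illustrative numbers, not the paper's data):

```python
import math

def von_mises(s1, s2, s3):
    """Von Mises equivalent stress from the three principal stresses."""
    return math.sqrt(((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 2.0)

# Uniaxial tension: the equivalent stress equals the applied stress.
print(von_mises(100.0, 0.0, 0.0))  # -> 100.0
```
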
Abstract:
The objective of this work was to evaluate the growth of the mangrove oyster Crassostrea gasar cultured in marine and estuarine environments. Oysters were cultured for 11 months in a longline system at two study sites, São Francisco do Sul and Florianópolis, in the state of Santa Catarina, Southern Brazil. Water chlorophyll-α concentration, temperature, and salinity were measured weekly. The oysters were measured monthly (shell size and weight gain) to assess growth. At the end of the culture period, the average wet flesh weight, dry flesh weight, and shell weight were determined, as well as the distribution of oysters per size class. Six nonlinear models (logistic, exponential, Gompertz, Brody, Richards, and Von Bertalanffy) were fitted to the oyster growth data set. Final mean shell sizes were higher in São Francisco do Sul than in Florianópolis. In addition, oysters cultured in São Francisco do Sul were more uniformly distributed across the four size classes than those cultured in Florianópolis. The highest average values of wet flesh weight and shell weight were observed in São Francisco do Sul, whereas dry flesh weight did not differ between the sites. The estuarine environment is more promising for the cultivation of oysters.
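Of the six candidate models, the Von Bertalanffy curve, for instance, has the closed form L(t) = L∞(1 − e^(−K(t − t₀))). A small sketch with hypothetical parameter values (chosen for illustration, not the study's estimates):

```python
import math

def von_bertalanffy(t, L_inf, K, t0):
    """Shell size at age t: asymptotic size L_inf, growth-rate constant K,
    theoretical age t0 at size zero."""
    return L_inf * (1.0 - math.exp(-K * (t - t0)))

# Hypothetical parameters for illustration only (mm, 1/month, months).
L_inf, K, t0 = 70.0, 0.25, 0.0
sizes = [von_bertalanffy(t, L_inf, K, t0) for t in range(12)]

# Growth decelerates monotonically toward the asymptote L_inf:
assert all(a < b < L_inf for a, b in zip(sizes, sizes[1:]))
```

Fitting in the study means estimating L∞, K, and t₀ (and the analogous parameters of the other five models) from the monthly shell-size measurements by nonlinear regression.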
Abstract:
The aim of this doctoral thesis was to study personality characteristics of patients at an early stage of Alzheimer's disease (AD), and more specifically to describe personality and its changes over time, and to explore its possible links with behavioral and psychological symptoms (BPS) and cognitive level. The results were compared to those of a group of participants without cognitive disorder through three empirical studies. In the first study, the findings showed significant personality changes that follow a specific trend in the clinical group. The profile of personality changes showed an increase in Neuroticism and a decrease in Extraversion, Openness to experiences, and Conscientiousness over time. The second study highlighted that personality changes and BPS occur early in the course of AD. Recognizing them as possible early signs of neurodegeneration may prove to be a key factor for early detection and intervention. In the third study, a significant association between personality changes and cognitive status was observed in the patients with incipient AD. Thus, changes in Neuroticism and Conscientiousness were linked with cognitive deterioration, whereas decreased Openness to experiences and Conscientiousness over time predicted loss of independence in daily functioning. Other well-known factors such as age, education level, or civil status were taken into account to predict cognitive decline.
The three studies suggested five important implications: (1) cost-effective screening should take into account premorbid and specific personality changes; (2) psycho-educative interventions should provide information on the possible personality changes and BPS that may occur at the beginning of the disease; (3) using personality traits alongside other variables in future studies on prevention might help to better understand AD's etiology; (4) individual treatment plans (psychotherapeutic, social, and pharmacological) might be adapted to the specific changes in personality profiles; (5) more research is needed to study the impact of socio-cultural and lifestyle variables on the development of AD.
Abstract:
In this article I deal with time as a notion of epistemological content, associated, however, with the notion of a subjective consciousness co-constitutive of physical reality. In this phenomenologically grounded approach I attempt to establish a 'metaphysical' aspect of time, within a strictly epistemological context, in the sense of an underlying absolute subjectivity which is non-objectifiable within objective temporality and thus not susceptible to any ontological designation. My arguments stem, on the one hand, from a version of quantum-mechanical theory (History Projection Operator theory, HPO theory) in view of its formal treatment of two different aspects of time within a quantum context: the discrete, partial-ordering properties (the notions of before and after), and the dynamical-parameter properties reflected in the wave equations of motion. On the other hand, to strengthen my arguments for a transcendental factor of temporality, I attempt an interpretation of some relevant conclusions in the work of J. Eccles ([5]) and of certain results of experimental research by S. Dehaene et al. ([2]) and others.
Abstract:
According to the veil of ignorance proposed by John Harsanyi (1953, 1955), the rational observer behind the veil of ignorance seeks to maximize the sum of individual utilities. However, Harsanyi's model rests on the erroneous assumption that the observer's von Neumann-Morgenstern utility function permits interpersonal comparison of well-being. This paper suggests a modification of Harsanyi's model that does permit interpersonal comparison of well-being, using life years in perfect utility, or happy life years, as the measure of well-being.
Abstract:
In this thesis, I show that the probability distribution of the Greenberger-Horne-Zeilinger (GHZ) quantum state under the local action of independent von Neumann measurements on each qubit follows a distribution that is a convex combination of two distributions. The coefficients of the combination are related to the equatorial parts of the measurements, and the distributions associated with these coefficients are related to the real parts of the measurements. One possible application of the result is that it allows the simulation of the GHZ state to be split in two. Simulating, in the worst case or on average, a quantum state such as GHZ with random resources, shared or private, and classical communication resources, or even exotic resources such as nonlocal boxes, is an important problem in quantum communication complexity. One can think of this simulation problem as one in which several parties each receive a von Neumann measurement to apply to the subsystem of the GHZ state they share with the other parties. Each party knows only the data describing its own measurement, and in no way does any party know the data describing another party's measurement. Each party obtains a classical random outcome. The joint distribution of these classical random outcomes follows the probability distribution found in this thesis. The goal is to simulate classically the probability distribution of the GHZ state. My result indicates a procedure that consists of first simulating the equatorial parts of the measurements, in order then to know which of the distributions associated with the real parts of the measurements must be simulated. Other researchers have found how to simulate the equatorial parts of von Neumann measurements with classical communication in the three-party case, but the simulation of the real parts still resists.
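The joint distribution in question has a standard closed form, which makes the split visible: the pairwise terms depend only on the polar angles θ (the "real parts"), while the genuinely tripartite term carries the azimuths φ (the "equatorial parts"). A sketch (the textbook GHZ formula, with each von Neumann measurement parameterized by Bloch-sphere angles θ, φ):

```python
import math

def ghz_prob(outcomes, thetas, phis):
    """Joint probability of outcomes (a, b, c) in {0,1}^3 when each qubit
    of (|000> + |111>)/sqrt(2) is measured along direction (theta_i, phi_i)."""
    a, b, c = outcomes
    t1, t2, t3 = thetas
    s = 1.0
    # pairwise terms: depend only on the 'real parts' (cos theta)
    s += (-1) ** (a + b) * math.cos(t1) * math.cos(t2)
    s += (-1) ** (b + c) * math.cos(t2) * math.cos(t3)
    s += (-1) ** (a + c) * math.cos(t1) * math.cos(t3)
    # tripartite term: carries the 'equatorial parts' (the azimuths phi)
    s += ((-1) ** (a + b + c) * math.sin(t1) * math.sin(t2) * math.sin(t3)
          * math.cos(sum(phis)))
    return s / 8.0

# All-z measurements (theta = 0) reproduce the perfect GHZ correlations:
print(ghz_prob((0, 0, 0), (0, 0, 0), (0, 0, 0)))  # -> 0.5
print(ghz_prob((0, 0, 1), (0, 0, 0), (0, 0, 0)))  # -> 0.0
```
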
Abstract:
The philosophical implications of the 1979 Prospect Theory, in particular those concerning the introduction of a value function over outcomes and a weighting coefficient over probabilities, have never been explored to date. The aim of this work is to build a philosophical theory of the will from the results of Prospect Theory. To understand how this theory could be developed, one must study Expected Utility Theory, of which it is the major critical outcome, that is, the axiomatizations of decision by Ramsey (1926), von Neumann and Morgenstern (1947), and finally Savage (1954), which constitute the foundations of classical decision theory. It is, among other things, the critique, by economics and cognitive psychology, of the independence principle and of the ordering and transitivity axioms that allowed the subjective representational elements from which Prospect Theory was built to emerge. These critiques were carried out by Allais (1953), Edwards (1954), Ellsberg (1961), and finally Slovic and Lichtenstein (1968); the study of these articles makes it possible to understand how the passage from Expected Utility Theory to Prospect Theory took place. Following these analyses and that of Prospect Theory itself, the notion of a Decisional Reference System is introduced, which is the natural generalization of the concepts of value function and weighting coefficient stemming from Prospect Theory. This system, whose operation is sometimes heuristic, serves to model decision-making within representation; it is articulated around three phases: aiming, editing, and evaluation.
From this structure, a new typology of decisions is proposed, together with a novel explanation of the phenomena of akrasia and procrastination, founded on the concepts of risk aversion and overvaluation of the present, both stemming from Prospect Theory.
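The two Prospect Theory ingredients the work builds on, the value function and the probability weighting, have well-known parametric forms; the sketch below uses the parameter estimates from Tversky and Kahneman's 1992 cumulative version as illustrative defaults (not values discussed in this work):

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theoretic value function: concave for gains, convex and
    steeper for losses (lam is the loss-aversion coefficient)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: overweights small probabilities
    and underweights large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Loss aversion: a loss looms larger than an equal gain.
print(abs(value(-10)) > value(10))          # -> True
# Small probabilities are overweighted, large ones underweighted.
print(weight(0.01) > 0.01, weight(0.99) < 0.99)  # -> True True
```

These two distortions are exactly what the notion of a Decisional Reference System generalizes.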
Abstract:
The thesis is divided into two main parts. The first part comprises Chapters 2 and 3; the second part comprises Chapters 4 and 5. The first part concerns the sampling of non-uniform continuous distributions with a guaranteed, fixed level of accuracy. Knuth and Yao showed in 1976 how to sample exactly from any discrete distribution using only a source of unbiased, independent and identically distributed bits. The first part of this thesis generalizes, in a sense, the theory of Knuth and Yao to non-uniform continuous distributions once the accuracy is fixed. A lower bound as well as upper bounds for generic algorithms such as inversion and discretization are among the results of this first part. In addition, a new simple proof of the main result of Knuth and Yao's original article is among the results of this thesis. The second part concerns the resolution of a problem in communication complexity theory, a problem born with the advent of quantum computing. Given a discrete distribution parameterized by a real vector of dimension N, and a network of N computers with access to a source of unbiased, independent and identically distributed bits, where each computer holds one and only one of the N parameters, a distributed protocol is established to sample exactly from said distribution.
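The Knuth-Yao construction mentioned above can be sketched for dyadic distributions (a minimal illustration, not the thesis's generalization to continuous distributions): outcome i receives a leaf at depth d of the discrete distribution generating (DDG) tree exactly when the d-th binary digit of p_i is 1, and one unbiased bit is consumed per level descended.

```python
import random

def knuth_yao(probs_bits, next_bit):
    """Sample exactly from a dyadic discrete distribution using unbiased bits.
    probs_bits[i][d] is the (d+1)-th binary digit of p_i after the point;
    next_bit() returns one unbiased random bit."""
    u, d = 0, 0
    while True:
        u = 2 * u + next_bit()   # descend one level of the DDG tree
        # hand out the leaves present at this depth, in outcome order
        for i, bits in enumerate(probs_bits):
            if d < len(bits) and bits[d]:
                if u == 0:
                    return i
                u -= 1           # this leaf belongs to an earlier node
        d += 1                   # remaining u indexes the internal nodes

# p = (1/2, 1/4, 1/4) -> binary expansions .10, .01, .01
probs_bits = [[1, 0], [0, 1], [0, 1]]

counts = [0, 0, 0]
for _ in range(10000):
    counts[knuth_yao(probs_bits, lambda: random.getrandbits(1))] += 1
print(counts)  # roughly [5000, 2500, 2500]
```

Knuth and Yao's result is that this scheme is exact and consumes, in expectation, at most the entropy of the distribution plus two bits.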
Abstract:
In [4], Guillard and Viozat propose a finite volume method for the simulation of inviscid steady as well as unsteady flows at low Mach numbers, based on a preconditioning technique. The scheme satisfies the results of a single-scale asymptotic analysis in a discrete sense and has the advantage that it can be derived by a slight modification of the dissipation term within the numerical flux function. Unfortunately, it can be observed in numerical experiments that the preconditioned approach combined with an explicit time integration scheme turns out to be unstable if the time step Δt does not satisfy the requirement of being O(M²) as the Mach number M tends to zero, whereas the corresponding standard method remains stable up to Δt = O(M) as M → 0, which results from the well-known CFL condition. We present a comprehensive mathematical substantiation of this numerical phenomenon by means of a von Neumann stability analysis, which reveals that, in contrast to the standard approach, the dissipation matrix of the preconditioned numerical flux function possesses an eigenvalue growing like M⁻² as M tends to zero, thus causing the diminishment of the stability region of the explicit scheme. Thereby, we present statements for both the standard preconditioner used by Guillard and Viozat [4] and the more general one due to Turkel [21]. The theoretical results are afterwards confirmed by numerical experiments.
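The kind of von Neumann stability analysis used here can be illustrated on a simpler model problem (linear advection, not the preconditioned Euler flux of the paper): insert a Fourier mode u_j^n = Gⁿ e^(iκj) into the scheme and require the amplification factor to satisfy |G(κ)| ≤ 1 for all wavenumbers κ.

```python
import cmath
import math

def G_upwind(kappa, c):
    """Amplification factor of first-order upwind for u_t + a u_x = 0,
    with CFL number c = a*dt/dx; stable iff 0 <= c <= 1."""
    return 1 - c * (1 - cmath.exp(-1j * kappa))

def G_ftcs(kappa, c):
    """Forward-time, centered-space scheme: |G|^2 = 1 + c^2 sin^2(kappa),
    hence unconditionally unstable for advection."""
    return 1 - 1j * c * math.sin(kappa)

kappas = [2 * math.pi * k / 64 for k in range(64)]
print(max(abs(G_upwind(k, 0.9)) for k in kappas) <= 1 + 1e-12)  # -> True
print(max(abs(G_ftcs(k, 0.9)) for k in kappas) > 1)             # -> True
```

The paper's analysis is the same procedure applied to the (matrix-valued) amplification operator of the preconditioned flux, where the M⁻² eigenvalue of the dissipation matrix is what shrinks the stable time-step to O(M²).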
Abstract:
This document is a case study developed in accordance with the guidelines set out in the 2010-2014 National Development Plan "Prosperity for All", in which the Government establishes that 1,000,000 housing solutions must be provided nationwide during this presidential term, of which 254,920 are the responsibility of the Fondo Nacional del Ahorro (FNA). The strategies the FNA has been developing are therefore analyzed, with the aim of proposing alternatives that allow the entity's senior management to make decisions consistent with the housing promotion models, which have been aligned with meeting the objectives defined by the National Government on the central axis of housing.
Abstract:
The strategic equilibrium of an N-person cooperative game with transferable utility is a system composed of a cover collection of subsets of N and a set of extended imputations attainable through such an equilibrium cover. The system describes a state of coalitional bargaining stability where every player has a bargaining alternative against any other player to support his corresponding equilibrium claim. Any coalition in the stable system may form and divide the characteristic value function of the coalition as prescribed by the equilibrium payoffs. If syndicates are allowed to form, a formed coalition may become a syndicate, using the equilibrium payoffs as disagreement values in bargaining for a part of the complementary coalition's incremental value to the grand coalition when formed. The emergent well-known constant-sum derived game in partition function form is described in terms of parameters that result from incumbent binding agreements. The strategic equilibrium corresponding to the derived game gives an equal value claim to all players. This surprising result is alternatively explained in terms of strategic-equilibrium-based possible outcomes by a sequence of bargaining stages: when the binding agreements are in the right sequential order, von Neumann and Morgenstern (vN-M) non-discriminatory solutions emerge. In these solutions a branch preferred by a sufficient number of players is identified: the weaker players syndicate against the stronger player. This condition is referred to as the stronger-player paradox. A strategic alternative available to the stronger player to overcome the anticipated undesirable results is to voluntarily lower his bargaining equilibrium claim. In doing so, the original strategic equilibrium is modified, and vN-M discriminatory solutions may occur; but a different stronger player may also emerge who will eventually have to lower his equilibrium claim.
A sequence of such measures converges to the equal-opportunity-for-all vN-M solution anticipated by the strategic equilibrium of the partition-function derived game.