92 results for modern atomic theory
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Felipe Pérez Martí, who was the Venezuelan Minister of Planning and Development in the government of Hugo Chávez, proposes an economic model that he calls the altruistic economy or fourth way, which pushes cooperative game theory to its logical extreme by postulating a pure communism. Here we argue, first, that in Pérez Martí's model it is impossible to allocate non-primary goods at the margin to those most in need or who value them most, so the model faces a problem of defective economic calculation, and second, that in order to achieve equality he would have to replace his atomic local planners with a central planner, who would be unable to overcome the problem of imperfect and incomplete information.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
In this article, we present a new approach to Nekhoroshev theory for a generic unperturbed Hamiltonian which completely avoids small-divisor problems. The proof extends a method introduced by P. Lochak, combining averaging along periodic orbits with simultaneous Diophantine approximation, and uses geometric arguments designed by the second author to handle generic integrable Hamiltonians. This method makes it possible to deal with generic non-analytic Hamiltonians and to obtain new results on generic stability around linearly stable tori.
Abstract:
Much of the self-image of the Western university hangs on the idea that research and teaching are intimately connected. The central axiom here is that research and teaching are mutually supportive of each other. An institution lacking such a set of relationships between research and teaching falls short of what it means to be a university. This set of beliefs raises certain questions: Is it the case that the presence of such a mutually supportive set of relationships between research and teaching is a necessary condition of the fulfilment of the idea of the university? (A conceptual question). And is it true that, in practice today, such a mutually supportive set of relationships between research and teaching characterises universities? (An empirical question). In my talk, I want to explore these matters in a critical vein. I shall suggest that: a) In practice today, such a mutually supportive set of relationships between research and teaching is in jeopardy. Far from supporting each other, very often research and teaching contend against each other. Research and teaching are becoming two separate ideologies, with their own interest structures. b) Historically, the supposed tight link between research and teaching is both of recent origin and far from universally achieved in universities. Institutional separateness between research and teaching is and has been evident, both across institutions and even across departments in the same institution. c) Conceptually, research and teaching are different activities: each is complex and neither is reducible to the other. In theory, therefore, research and teaching may be said to constitute a holy alliance but in practice, we see more of an unholy alliance. If, then, in an ideal world, a positive relationship between research and teaching is still a worthwhile goal, how might it be construed and worked for? Seeing research and teaching as two discrete and unified sets of activity is now inadequate. 
Much better is a construal of research and teaching as themselves complexes, as intermingling pools of activity helping to form the liquid university that is emerging today. On this view, research and teaching are fluid spaces, ever on the move, taking up new shapes, and themselves dividing and reforming, as the university reworks its own destiny in modern society. On such a perspective, working out a productive relationship between research and teaching is a complex project. This is an alliance that is neither holy nor unholy. It is an uneasy alliance, with temporary accommodations and continuous new possibilities.
Abstract:
Is there a link between decentralized governance and conflict prevention? This article tries to answer that question by presenting the state of the art at the intersection of the two concepts. Given that social conflict is inevitable, and given the appearance of new threats and types of violence, as well as new demands for people-centred security (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and by creating power-sharing arrangements and incentives for the inclusion of minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance may have very positive effects on reducing the causes that bring about conflicts, thanks to its ability to foster the creation of war/violence preventers. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).
Abstract:
This article addresses the normative dilemma located within the application of ‘securitization’ as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of the theory, one which is sensitive to and self-reflective of the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes have normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School’s formation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
In this paper we prove a formula for the analytic index of a basic Dirac-type operator on a Riemannian foliation, solving a problem that has been open for many years. We also consider more general indices given by twisting the basic Dirac operator by a representation of the orthogonal group. The formula is a sum of integrals over blowups of the strata of the foliation and also involves eta invariants of associated elliptic operators. As a special case, a Gauss-Bonnet formula for the basic Euler characteristic is obtained using two independent proofs.
Abstract:
Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization in which countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw a connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.
Abstract:
We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial-time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based purely on comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
Abstract:
Vintage capital growth models were at the heart of growth theory in the 1960s. This research line collapsed in the late 1960s with the so-called embodiment controversy and the technical sophistication of the vintage models. This paper analyzes the astonishing revival of this literature in the 1990s. In particular, it outlines three methodological breakthroughs explaining this resurgence: a growth accounting revolution, taking advantage of the availability of new time series; an optimal control revolution, making it possible to study vintage capital optimal growth models safely; and a vintage human capital revolution which, along with the rise of economic demography, accounts for the vintage structure of human capital in the same way as the age structure of physical capital. The related literature is surveyed.
Abstract:
After a historical survey of temperament in Johann Sebastian Bach’s Well-Tempered Clavier, the work is analyzed by applying a number of historical good temperaments as well as some recent proposals. The results obtained show that the global dissonance for all preludes and fugues in major keys can be minimized using the Kirnberger II temperament. The method of analysis used for this research is based on the mathematical theories of sensory dissonance developed by authors such as Hermann Ludwig Ferdinand von Helmholtz, Harry Partch, Reinier Plomp, Willem J. M. Levelt and William A. Sethares.
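The sensory-dissonance approach mentioned above can be illustrated with a short sketch. The function below uses Sethares’s well-known parameterization of the Plomp–Levelt dissonance curve; the constants and the sum-over-pairs-of-partials scheme are standard in that literature, not taken from this particular study, and the numerical values are illustrative only.

```python
import math

def pair_dissonance(f1, a1, f2, a2):
    """Sensory dissonance of two pure tones, using Sethares's
    parameterization of the Plomp-Levelt curve (constants from his
    published fit; treat them as an illustrative assumption here)."""
    d_star, s1, s2 = 0.24, 0.0207, 18.96  # locate the roughness maximum
    b1, b2 = 3.51, 5.75                   # decay rates of the curve
    s = d_star / (s1 * min(f1, f2) + s2)
    x = s * abs(f2 - f1)
    return min(a1, a2) * (math.exp(-b1 * x) - math.exp(-b2 * x))

def total_dissonance(partials):
    """Global dissonance of a sonority: sum over all pairs of partials,
    each partial given as a (frequency_hz, amplitude) tuple."""
    return sum(pair_dissonance(*partials[i], *partials[j])
               for i in range(len(partials))
               for j in range(i + 1, len(partials)))
```

A semitone (440 Hz against 466 Hz) then scores far rougher than an octave (440 Hz against 880 Hz), which is the qualitative behaviour a temperament comparison of this kind relies on.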
Abstract:
This project studies the commissioning of a commercial ALD system for obtaining nanometre-scale alumina thin films using water vapour and TMA as precursors. To verify the soundness of the experimental recipes supplied by the manufacturer, as well as to check some aspects of ALD theory, a series of samples was prepared by varying the experimental parameters, mainly the deposition temperature, the number of cycles, the cycle duration and the type of substrate. Ellipsometry, one of the few non-destructive techniques capable of measuring layer or interface thicknesses of a few ångströms or nanometres with great precision, was used to determine the nanometric thicknesses of the films and hence the growth rates. In a first stage, the experimental values supplied by the manufacturer of the ALD system were used to determine the growth rate as a function of deposition temperature and of the number of cycles, in both cases on several substrates. The growth rate was shown to increase slightly with deposition temperature, although the variation is small, of the order of 12% for a 70 °C change in deposition temperature. Likewise, the linearity of the thickness with the number of cycles was demonstrated, although an exact proportionality is not observed. In a second stage, the experimental parameters, essentially the purge times between pulses, were optimized in order to considerably reduce the duration of the experiments carried out at relatively low temperatures. In this case it was verified that the growth rates were maintained, with differences of 3.6%, 4.8% and 5.5% upon optimizing the cycles by 6.65 h, 8.31 h and 8.33 h, respectively. Moreover, for one of these conditions it was shown that the high conformality of the alumina films was maintained. In addition, a study of the thickness homogeneity of the films across the whole deposition zone of the ALD reactor was carried out. The thickness variation of films deposited at 120 °C was shown to be at most 6.2% over a surface of 110 cm², confirming the exceptional thickness control of the ALD technique.
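The linear-but-not-exactly-proportional thickness behaviour reported above can be checked with an ordinary least-squares fit, where a nonzero intercept captures the deviation from strict proportionality. The numbers below are invented for illustration (the real ellipsometry data are in the project report); a growth per cycle near 0.1 nm is merely a typical value for TMA/water alumina ALD.

```python
import numpy as np

# Hypothetical ellipsometry thicknesses (nm) vs. number of ALD cycles;
# invented values, chosen only to mimic a typical TMA/H2O alumina run.
cycles = np.array([100.0, 200.0, 400.0, 800.0])
thickness = np.array([11.5, 22.4, 44.1, 87.9])

# Fit thickness = gpc * cycles + offset.  The slope is the growth per
# cycle (GPC); a small positive offset reflects linearity without exact
# proportionality (e.g. substrate-dependent nucleation in early cycles).
gpc, offset = np.polyfit(cycles, thickness, 1)
```

Comparing the fitted GPC across deposition temperatures is then a direct way to quantify the roughly 12% variation over 70 °C mentioned above.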
Abstract:
In order to explain the speed of Vesicular Stomatitis Virus (VSV) infections, we develop a simple model that improves previous approaches to the propagation of virus infections. For VSV infections, we find that the delay time elapsed between the adsorption of a viral particle into a cell and the release of its progeny has a very important effect. Moreover, this delay time makes the adsorption rate essentially irrelevant for predicting VSV infection speeds. Numerical simulations are in agreement with the analytical results. Our model satisfactorily explains the experimentally measured speeds of VSV infections.
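The qualitative effect of the delay time can be illustrated with the front speed of a generic time-delayed (hyperbolic) reaction–diffusion equation, v = 2·sqrt(kD)/(1 + kτ/2). This is a standard delayed-RD expression used here only to show the qualitative point; it is not claimed to be the exact formula derived in the paper, and the parameter values are illustrative, not fitted to VSV data.

```python
import math

def front_speed(D, k, tau):
    """Front speed of a hyperbolic (time-delayed) reaction-diffusion
    front, v = 2*sqrt(k*D)/(1 + k*tau/2).  Setting tau = 0 recovers
    the classical Fisher speed 2*sqrt(k*D)."""
    return 2.0 * math.sqrt(k * D) / (1.0 + k * tau / 2.0)

# Illustrative parameters (assumed, not experimental): diffusivity D,
# replication rate k, and delay tau between adsorption and release.
v_no_delay = front_speed(1.0, 1.0, 0.0)
v_delayed = front_speed(1.0, 1.0, 1.0)  # a nonzero delay slows the front
```

The comparison makes the abstract’s point concrete: increasing the delay τ lowers the predicted infection speed, while the speed is insensitive to parameters that only enter through fast processes.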