987 results for Birkhoff and Von Neumann ergodic theorems


Relevance: 100.00%

Abstract:

From the beginning, the world of game-playing by machine has been fortunate in attracting contributions from the leading names of computer science. Charles Babbage, Konrad Zuse, Claude Shannon, Alan Turing, John von Neumann, John McCarthy, Allen Newell, Herb Simon and Ken Thompson all come to mind, and each reader will wish to add to this list. Recently, the Journal has saluted both Claude Shannon and Herb Simon. Ken’s retirement from Lucent Technologies’ Bell Labs to the start-up Entrisphere is also a good moment for reflection.

Relevance: 100.00%

Abstract:

A total of 86 growth profiles from meat and egg strains of chickens (male and female) were used in this study. Flexible growth functions were evaluated for their ability to describe the relationship between live weight and age and were compared with the Gompertz and logistic equations, which have a fixed point of inflection. Six growth functions were used: Gompertz, logistic, Lopez, Richards, France, and von Bertalanffy. A comparative analysis was carried out based on model behavior and statistical performance. The results of this study confirmed the initial concern about the limitation of a fixed point of inflection, such as in the Gompertz equation. Flexible growth functions are therefore recommended as alternatives to the simpler equations (with a fixed point of inflection) for describing the relationship between live weight and age, for the following reasons: they are easy to fit; because of their flexibility they very often give a closer fit to the data points, and therefore a smaller residual sum of squares (RSS), than the simpler models; and, through the addition of an extra parameter, they encompass the simpler models, which is especially important when the behavior of a particular data set is not known in advance.

Relevance: 100.00%

Abstract:

Platelets perform a central role in haemostasis and thrombosis. They adhere to subendothelial collagens exposed at sites of blood vessel injury via the glycoprotein (GP) Ib-V-IX receptor complex, GPVI and integrin alpha(2)beta(1). These receptors perform distinct functions in the regulation of cell signalling involving non-receptor tyrosine kinases (e.g. Src, Fyn, Lyn, Syk and Btk), adaptor proteins, phospholipase C and lipid kinases such as phosphoinositide 3-kinase. They are also coupled to an increase in cytosolic calcium levels and protein kinase C activation, leading to the secretion of paracrine/autocrine platelet factors and an increase in integrin receptor affinities. Through the binding of plasma fibrinogen and von Willebrand factor to integrin alpha(IIb)beta(3), a platelet thrombus is formed. Although increasing evidence indicates that each of the adhesion receptors GPIb-V-IX and GPVI and the integrins alpha(2)beta(1) and alpha(IIb)beta(3) contributes to the signalling that regulates this process, the individual roles of each are only beginning to be dissected. By contrast, adhesion receptor signalling through platelet endothelial cell adhesion molecule 1 (PECAM-1) is implicated in the inhibition of platelet function and thrombus formation in the healthy circulation. Recent studies indicate that understanding platelet adhesion signalling mechanisms might enable the development of new strategies to treat and prevent thrombosis.

Relevance: 100.00%

Abstract:

Simulations of ozone loss rates using a three-dimensional chemical transport model and a box model during recent Antarctic and Arctic winters are compared with experimentally derived loss rates. The study focuses on the Antarctic winter of 2003, during which the first Antarctic Match campaign was organized, and on the Arctic winters 1999/2000 and 2002/2003. The maximum ozone loss rates retrieved by the Match technique for the winters and levels studied reached 6 ppbv/sunlit hour, and both types of simulation could generally reproduce the observations at the 2-sigma error-bar level. In some cases, for example for the Arctic winter 2002/2003 at the 475 K level, excellent agreement within the 1-sigma standard deviation was obtained. An overestimation was also found with the box-model simulation at some isentropic levels for the Antarctic winter and the Arctic winter 1999/2000, indicating an overestimation of chlorine activation in the model. Loss rates in the Antarctic show signs of saturation in September, which has to be considered in the comparison. Sensitivity tests were performed with the box model to assess the impact of the kinetic parameters of the ClO-Cl2O2 catalytic cycle and of the total bromine content on the ozone loss rate. These tests resulted in a maximum change in ozone loss rates of 1.2 ppbv/sunlit hour, generally under high solar zenith angle conditions. In some cases, better agreement was achieved with faster photolysis of Cl2O2 and an additional source of total inorganic bromine, but at the expense of overestimating the smaller ozone loss rates derived later in the winter.
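The sensitivity of the loss rate to the ClO-Cl2O2 cycle can be sketched back-of-envelope: when photolysis of the dimer is the rate-limiting step, each photolysed Cl2O2 destroys two odd-oxygen molecules. All numbers below are illustrative assumptions chosen only to land in the ppbv/sunlit-hour range quoted above; they are not values from the study.

```python
# Hypothetical illustrative numbers, not values from the study.
j_cl2o2 = 1.5e-3   # photolysis frequency of the ClO dimer, s^-1
n_cl2o2 = 0.4e9    # Cl2O2 number density, molecules cm^-3
n_air = 3.0e18     # air number density, molecules cm^-3

# In the ClO-Cl2O2 cycle each photolysed dimer destroys two O3,
# so the odd-oxygen loss rate is approximately 2 * J * [Cl2O2].
loss = 2 * j_cl2o2 * n_cl2o2                    # molecules cm^-3 s^-1
loss_ppbv_per_hour = loss / n_air * 1e9 * 3600  # convert to ppbv/hour
print(round(loss_ppbv_per_hour, 2))  # -> 1.44
```

This also makes clear why a faster Cl2O2 photolysis rate (larger J) directly scales the modelled loss, as probed in the sensitivity tests.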

Relevance: 100.00%

Abstract:

With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means horizontally-explicit vertically-implicit (HEVI) time-stepping schemes warrant further attention. In this analysis, Runge-Kutta implicit-explicit schemes from the literature are analysed for their stability and accuracy using a von Neumann stability analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phase diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode; and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered; and a comparison is made of errors accumulated from multiple stability-constrained shorter time-steps from the HEVI scheme with a single integration from a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.

Relevance: 100.00%

Abstract:

We describe infinitely scalable pipeline machines with perfect parallelism, in the sense that every instruction of an inline program is executed, on successive data, on every clock tick. Programs with shared data effectively execute in less than a clock tick. We show that pipeline machines are faster than single- or multi-core von Neumann machines for sufficiently many runs of a sufficiently time-consuming program. Our pipeline machines exploit the totality of transreal arithmetic and the known waiting time of statically compiled programs to deliver the interesting property that they need no hardware or software exception handling.
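The throughput claim (one result per clock tick once the pipeline is full) can be illustrated with a toy tick count. The 3-stage pipeline below is a hypothetical example, not the paper's machine:

```python
# Toy comparison of pipelined vs sequential execution of an
# n-stage program over many inputs.
def pipeline_ticks(num_stages, num_inputs):
    # fill latency of num_stages ticks, then one result per tick
    return num_stages + num_inputs - 1

def sequential_ticks(num_stages, num_inputs):
    # each input must pass through every stage before the next starts
    return num_stages * num_inputs

print(pipeline_ticks(3, 100))    # -> 102
print(sequential_ticks(3, 100))  # -> 300
```

As the number of runs grows, the fill latency is amortized and the pipelined cost per input approaches one tick, which is the sense in which such machines win for sufficiently many runs.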

Relevance: 100.00%

Abstract:

Civil wars are the most common type of large-scale violent conflict. They are long and brutal and continue to harm societies long after the shooting stops. Post-conflict countries face extraordinary challenges with respect to development and security. In this paper we examine how countries can recover economically from these devastating conflicts and how international interventions can help to build lasting peace. We revisit the aid-and-growth debate and confirm that aid does not increase growth in general. However, we find that countries experience increased growth after the end of a war and that aid helps them make the most of this peace dividend. Aid is only growth-enhancing once the violence has stopped; in violent post-war societies it has no growth-enhancing effect. We also find that good governance is robustly correlated with growth, but we cannot confirm that aid increases growth conditional on good policies. We examine various aspects of aid and governance by disaggregating the aid and governance variables. Our analysis does not provide a clear picture of which types of aid and policy should be prioritized. We find little evidence for a growth-enhancing effect of UN missions and suggest that case studies may provide better insight into the relationship between security guarantees and economic stabilization.

Relevance: 100.00%

Abstract:

This chapter considers aspects of urban design and the associated identity of place, which shift over time and must reconcile economic pressures to develop with cultural concerns about change.

Relevance: 100.00%

Abstract:

Interactions between host nutrition and feeding behaviour are central to understanding the pathophysiological consequences of infections of the digestive tract with parasitic nematodes. The manipulation of host nutrition provides useful options to control gastrointestinal nematodes as a component of an integrated strategy. Focused mainly on the Haemonchus contortus infection model in small ruminants, this chapter (i) illustrates the relationship between quantitative (macro- and micro-nutrients) and qualitative (plant secondary metabolites) aspects of host nutrition and nematode infection, and (ii) shows how basic studies aimed at addressing some generic questions can help provide solutions, despite the considerable diversity of epidemiological situations and breeding systems.

Relevance: 100.00%

Abstract:

Using McKenzie’s taxonomy of optimal accumulation in the long run, we report a “uniform turnpike” theorem of the third kind in a model original to Robinson, Solow and Srinivasan (RSS) and further studied by Stiglitz. Our results are presented in the undiscounted, discrete-time setting emphasized in the recent work of Khan-Mitra, and they rely on the importance of strictly concave felicity functions or, alternatively, on the value of a “marginal rate of transformation”, ξσ, from one period to the next not being unity. Our results, despite their specificity, contribute to the methodology of intertemporal optimization theory as developed in economics by Ramsey, von Neumann and their followers.

Relevance: 100.00%

Abstract:

We define a subgame perfect Nash equilibrium under Knightian uncertainty for two players by means of a recursive backward induction procedure. We prove an extension of the Zermelo-von Neumann-Kuhn theorem for games of perfect information, i.e., that the recursive procedure generates a Nash equilibrium under uncertainty (Dow and Werlang (1994)) of the whole game. We apply the notion to two well-known games: the chain store and the centipede. On the one hand, we show that subgame perfection under Knightian uncertainty explains the chain-store paradox in a one-shot version. On the other hand, we show that subgame perfection under uncertainty does not account for the leaving behavior observed in the centipede game. This is in contrast to Dow, Orioli and Werlang (1996), where we explain the experiments of McKelvey and Palfrey (1992) by means of Nash equilibria under uncertainty (but not subgame perfect ones). Finally, we show that there may be nontrivial subgame perfect equilibria under uncertainty in more complex extensive-form games, as in the case of the finitely repeated prisoner's dilemma, which accounts for cooperation in the early stages of the game.
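The recursive backward induction procedure, in its classical form without Knightian uncertainty, can be sketched for a short centipede game. The doubling payoffs below are hypothetical, not those of McKelvey and Palfrey (1992):

```python
# Classical backward induction on a 4-node centipede game.
def solve(payoffs, node=0):
    """payoffs[i] = (mover's payoff, other's payoff) if 'take' at node i.
    Returns the induced outcome from the current mover's viewpoint and
    that mover's equilibrium action."""
    take = payoffs[node]
    if node == len(payoffs) - 1:
        return take, "take"
    # The continuation outcome is expressed from the NEXT mover's
    # viewpoint, so swap the pair to see it from the current mover's.
    (nxt_mover, nxt_other), _ = solve(payoffs, node + 1)
    keep_passing = (nxt_other, nxt_mover)
    if take[0] >= keep_passing[0]:
        return take, "take"
    return keep_passing, "pass"

# Stakes double at each node, but taking now always beats the "other"
# share one node later, so induction unravels to taking immediately.
payoffs = [(4, 1), (8, 2), (16, 4), (32, 8)]
outcome, action = solve(payoffs)
print(action, outcome)  # -> take (4, 1)
```

This immediate-take prediction is exactly what experimental subjects fail to follow, which motivates equilibrium notions, such as those under uncertainty above, that can rationalize continuing further into the game.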

Relevance: 100.00%

Abstract:

Prospect Theory is one of the foundations of Behavioral Finance and models investor behavior differently from the von Neumann and Morgenstern Utility Theory. Behavioral characteristics are evaluated for different control groups, validating the violation of the Utility Theory axioms. Naïve diversification is also verified, using the 1/n heuristic strategy for investment-fund allocations. This strategy produces fixed-income and equity allocations that differ from the desired exposure, given the exposure of the subsample that answered an unconstrained allocation question. Compared to non-specialists, specialists in finance are less risk averse and allocate more of their wealth to equity.
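The 1/n heuristic is simple to state in code: wealth is split equally across whatever menu of funds is offered, so the resulting equity exposure tracks the menu's composition rather than the investor's stated preferences. Fund names and menus here are hypothetical:

```python
# Naive 1/n diversification over a menu of funds.
def one_over_n(menu):
    return {fund: 1.0 / len(menu) for fund in menu}

def equity_share(allocation):
    # hypothetical naming convention: equity funds start with "equity"
    return sum(w for fund, w in allocation.items()
               if fund.startswith("equity"))

# The same 1/n investor ends up with very different equity exposure
# depending on how the menu is composed.
menu_a = ["equity_fund", "bond_fund_1", "bond_fund_2"]
menu_b = ["equity_fund_1", "equity_fund_2", "bond_fund"]
print(round(equity_share(one_over_n(menu_a)), 2))  # -> 0.33
print(round(equity_share(one_over_n(menu_b)), 2))  # -> 0.67
```

This menu dependence is what distinguishes the heuristic from an allocation chosen to match a desired exposure.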


Relevance: 100.00%

Abstract:

The main objective of this article is to test the hypothesis that utility preferences incorporating asymmetric reactions to gains and losses generate better results in the Brazilian market than the classic von Neumann-Morgenstern utility functions. The asymmetric behavior can be captured by introducing a disappointment (or loss) aversion coefficient into the classical expected utility function, which increases the impact of losses relative to gains. The results generated by both the traditional and the loss-aversion utility functions are compared with real data from the Brazilian market on stock market participation in the investment portfolios of pension funds and individual investors.
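A loss aversion coefficient of this kind can be illustrated with a piecewise value function in which losses are amplified by a factor greater than one. The functional form and parameter values below are illustrative assumptions, not those estimated in the article:

```python
# Piecewise value function with a loss aversion coefficient lambda_ > 1;
# alpha and lambda_ are illustrative, not estimated values.
def value(x, alpha=0.88, lambda_=2.25):
    if x >= 0:
        return x ** alpha
    return -lambda_ * ((-x) ** alpha)

# A gain and a loss of the same size are felt asymmetrically:
print(round(value(100), 2))   # -> 57.54
print(round(value(-100), 2))  # -> -129.47
```

Because losses are weighted more than twice as heavily as equal-sized gains under these parameters, such an investor holds less equity than a von Neumann-Morgenstern maximizer with the same curvature would.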

Relevance: 100.00%

Abstract:

This study presents the implementation and embedding of an Artificial Neural Network (ANN) in hardware, on a programmable device: a field-programmable gate array (FPGA). The work explored different implementations, described in VHDL, of multilayer perceptron ANNs. Because of the parallelism inherent to ANNs, software implementations are at a disadvantage owing to the sequential nature of von Neumann architectures. A hardware implementation, by contrast, can exploit all the parallelism implicit in this model. There is currently growing use of FPGAs as a platform for implementing neural networks in hardware, exploiting their high processing power, low cost, ease of programming and circuit reconfigurability, which allows the network to adapt to different applications. In this context, the aim is to develop arrays of neural networks in hardware with a flexible architecture, in which it is possible to add or remove neurons and, above all, to modify the network topology, so as to enable a modular, fixed-point-arithmetic network on an FPGA. Five syntheses of VHDL descriptions were produced: two for a neuron with one or two inputs, and three for different ANN architectures. The descriptions of the architectures used are highly modular, easily allowing the number of neurons to be increased or decreased. As a result, several complete neural networks were implemented on an FPGA, in fixed-point arithmetic, with high-capacity parallel processing.
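The behaviour of a single fixed-point perceptron neuron of the kind such a design computes can be sketched in software. The Q4.12 fixed-point format and the hard-threshold activation below are assumptions for illustration, not details from the study:

```python
# Software model of a fixed-point multiply-accumulate neuron.
FRAC_BITS = 12          # assumed Q4.12 format
SCALE = 1 << FRAC_BITS

def to_fixed(x):
    return int(round(x * SCALE))

def neuron(inputs, weights, bias):
    # integer multiply-accumulate; each product is rescaled by a
    # right shift, as a hardware MAC unit would do
    acc = to_fixed(bias)
    for x, w in zip(inputs, weights):
        acc += (to_fixed(x) * to_fixed(w)) >> FRAC_BITS
    # hard-threshold (step) activation, cheap to realize in hardware
    return 1 if acc >= 0 else 0

print(neuron([1.0, 0.5], [0.25, -0.75], 0.2))  # -> 1
```

All arithmetic stays in integers, which is what makes the datapath cheap to replicate across many parallel neurons on an FPGA.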