890 results for continuous model theory


Relevance: 30.00%

Abstract:

We present existence, uniqueness and continuous dependence results for some kinetic equations motivated by models for the collective behavior of large groups of individuals. Models of this kind have recently been proposed to study the behavior of large groups of animals, such as flocks of birds, swarms, or schools of fish. Our aim is to give a well-posedness theory for general models which possibly include a variety of effects: an interaction through a potential, such as a short-range repulsion and long-range attraction; a velocity-averaging effect where individuals try to adapt their own velocity to that of other individuals in their surroundings; and self-propulsion effects, which take into account effects on one individual that are independent of the others. We develop our theory in a space of measures, using mass transportation distances. As consequences of our theory, we also show the convergence of particle systems to their corresponding kinetic equations, and the local-in-time convergence to the hydrodynamic limit for one of the models.

Relevance: 30.00%

Abstract:

In this paper, we analyse the asymptotic behavior of solutions of the continuous kinetic version of flocking by Cucker and Smale [16], which describes the collective behavior of an ensemble of organisms, animals or devices. This kinetic version, introduced in [24], is here obtained starting from a Boltzmann-type equation. The large-time behavior of the distribution in phase space is subsequently studied by means of particle approximations and a stability property in distances between measures. A continuous analogue of the theorems of [16] is shown to hold for the solutions of the kinetic model. More precisely, the solutions concentrate their velocities exponentially fast around the mean velocity, while in space they converge towards a translational flocking solution.
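As a concrete illustration of the particle approximations mentioned above, the following sketch simulates the standard Cucker-Smale system dv_i/dt = (K/N) Σ_j ψ(|x_j − x_i|)(v_j − v_i) with communication rate ψ(r) = (1 + r²)^(−β). The parameter values, step size and population size are illustrative, not taken from the paper.

```python
import numpy as np

def cucker_smale_step(x, v, dt, K=1.0, beta=0.4):
    """One explicit Euler step of the Cucker-Smale particle system:
    dv_i/dt = (K/N) * sum_j psi(|x_j - x_i|) * (v_j - v_i)."""
    diff_x = x[:, None, :] - x[None, :, :]          # pairwise position differences
    dist = np.linalg.norm(diff_x, axis=-1)
    psi = K / (1.0 + dist**2) ** beta               # communication rate psi(r)
    # mean over j of psi_ij * (v_j - v_i); the i == j term is zero
    dv = (psi[:, :, None] * (v[None, :, :] - v[:, None, :])).mean(axis=1)
    return x + dt * v, v + dt * dv

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))                        # 50 particles in the plane
v = rng.normal(size=(50, 2))
v_mean0 = v.mean(axis=0)                            # mean velocity is conserved
spread0 = np.var(v, axis=0).sum()
for _ in range(2000):                               # integrate up to t = 20
    x, v = cucker_smale_step(x, v, dt=0.01)
spread = np.var(v, axis=0).sum()                    # velocity spread decays (flocking)
```

Since ψ is symmetric, the mean velocity is conserved while the spread of velocities around it contracts, which is the discrete counterpart of the exponential concentration described in the abstract.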

Relevance: 30.00%

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. However, in the case of imperfect information some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered.
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
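Backward induction, for which Chapters 1 and 3 derive sufficient epistemic conditions, can be sketched concretely on a toy perfect-information game tree. The tree and payoff values below are invented purely for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    player: Optional[int] = None       # None marks a terminal node
    payoffs: tuple = ()                # utility profile at terminal nodes
    children: list = field(default_factory=list)

def backward_induction(node):
    """Return the payoff profile reached by rational play from `node`:
    solve all subgames first, then let the mover pick her best reply."""
    if node.player is None:
        return node.payoffs
    outcomes = [backward_induction(c) for c in node.children]
    return max(outcomes, key=lambda p: p[node.player])

# A two-stage game of perfect information: player 0 moves first, then player 1.
game = Node(player=0, children=[
    Node(player=1, children=[Node(payoffs=(3, 1)), Node(payoffs=(0, 0))]),
    Node(player=1, children=[Node(payoffs=(1, 2)), Node(payoffs=(2, 3))]),
])
outcome = backward_induction(game)     # payoff profile under backward induction
```

Player 1 would pick (3, 1) in the left subgame and (2, 3) in the right one; anticipating this, player 0 moves left, so rational play yields (3, 1).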

Relevance: 30.00%

Abstract:

The evolution of a quantitative phenotype is often envisioned as a trait substitution sequence where mutant alleles repeatedly replace resident ones. In infinite populations, the invasion fitness of a mutant in this two-allele representation of the evolutionary process is used to characterize features of long-term phenotypic evolution, such as singular points, convergence stability (established from first-order effects of selection), branching points, and evolutionary stability (established from second-order effects of selection). Here, we try to characterize long-term phenotypic evolution in finite populations from this two-allele representation of the evolutionary process. We construct a stochastic model describing evolutionary dynamics at non-rare mutant allele frequency. We then derive stability conditions based on stationary average mutant frequencies in the limit of vanishing mutation rates. We find that the second-order stability condition obtained from second-order effects of selection is identical to convergence stability. Thus, in two-allele systems in finite populations, convergence stability is enough to characterize long-term evolution under the trait substitution sequence assumption. We perform individual-based simulations to confirm our analytic results.
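The flavor of such finite-population, two-allele dynamics can be sketched with a simple haploid Wright-Fisher simulation. This is an illustrative stand-in, not the paper's model; the population size and selection coefficient are arbitrary.

```python
import numpy as np

def wright_fisher(p0, n, s, generations, rng):
    """Track mutant frequency p in a haploid Wright-Fisher population of
    size n, where the mutant has constant selection coefficient s."""
    p = p0
    for _ in range(generations):
        w = p * (1 + s) / (p * (1 + s) + (1 - p))   # frequency after selection
        p = rng.binomial(n, w) / n                   # genetic drift: resampling
        if p in (0.0, 1.0):                          # allele lost or fixed
            break
    return p

rng = np.random.default_rng(1)
# fixation probability of a single beneficial mutant, estimated over replicates
fixed = [wright_fisher(1 / 200, 200, 0.05, 10_000, rng) == 1.0
         for _ in range(2000)]
fix_prob = float(np.mean(fixed))
```

For a single copy of a mutant with small advantage s, the classical approximation puts the fixation probability near 2s (here about 0.1), which the replicate average should roughly reproduce.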

Relevance: 30.00%

Abstract:

Descriptive set theory is mainly concerned with studying subsets of the space of all binary sequences of countable length. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw the connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.

Relevance: 30.00%

Abstract:

This paper provides a new benchmark for the analysis of the international diversification puzzle in a tractable new open economy macroeconomic model. Building on Cole and Obstfeld (1991) and Heathcote and Perri (2009), this model specifies an equilibrium model of perfect risk sharing in incomplete markets, with endogenous portfolios and number of varieties. Equity home bias may not be a puzzle but a perfectly optimal allocation for hedging risk. In contrast to previous work, the model shows that: (i) optimal international portfolio diversification is driven by home bias in capital goods, independently of home bias in consumption, and by the share of income accruing to labour. The model explains reasonably well the recent patterns of portfolio allocations in developed economies; and (ii) optimal portfolio shares are independent of market dynamics.

Relevance: 30.00%

Abstract:

How do monopolistically competitive industries react to shocks in the context of a New Keynesian macro model? I bridge macroeconomics and trade theory by considering market dynamics. I use an analytically tractable closed-economy model with endogenous entry of firms and show the implications of market structure for the transmission of real shocks on aggregate variables and welfare. Shock sources become crucial for the results: traditional productivity shocks cause an extensive effect on production; shocks on innovation cause an intensive impact. More patient populations bring the economy to a richer market, although this cushions the extensive effect after an innovation shock.

Relevance: 30.00%

Abstract:

Gifted children develop asynchronously, often advanced for their age cognitively, but at or between their chronological and mental ages socially and emotionally (Robinson, 2008). In order to help gifted children and adolescents develop and practice social and emotional self-regulation skills, we investigated the use of an Adlerian play therapy approach during pen-and-paper role-playing games. Additionally, we used Goffman's (1961, 1974) social role identification and distance to encourage participants to experiment with new identities. Herein, we propose a psychosocial model of interactions during role-playing games based on Goffman's theory and Adlerian play therapy techniques, and suggest that role-playing games are an effective way of intervening with gifted children and adolescents to improve their intra- and interpersonal skills. We specifically targeted intrapersonal skills of exercising creativity, becoming self-aware, and setting individual goals by raising participants' awareness of their privately logical reasons for making decisions and their levels of social interest. We also targeted their needs and means of seeking significance in the group to promote collaboration and interaction skills with other gifted peers through role analysis, embracement, and distancing. We report results from a case study and conclude that role-playing games deserve more attention, both from researchers and clinical practitioners, because they encourage change while improving young clients' social and emotional development.

Relevance: 30.00%

Abstract:

Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear for non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for the analysis of the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, one should rely on multiple repeated measurements at different times, and a linear random-effects model is the standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method - which consists in adjusting the estimated regression coefficient of observed change on baseline value for the measurement error variance - provides accurate estimates. The adequacy of the methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
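A minimal simulation sketch of the two-wave problem, applying a Blomqvist-style correction as described above: rescale the naive slope of observed change on observed baseline using a known measurement-error variance. All parameter values (means, variances, the true slope of 0.2) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n, sigma_e2 = 100_000, 1.0                 # measurement-error variance assumed known

t1 = rng.normal(20.0, 2.0, n)              # true baseline BMI
t2 = t1 + 0.2 * (t1 - 20.0) + rng.normal(0.0, 0.5, n)   # true change slope = 0.2
x1 = t1 + rng.normal(0.0, np.sqrt(sigma_e2), n)          # observed values, with
x2 = t2 + rng.normal(0.0, np.sqrt(sigma_e2), n)          # independent errors

d = x2 - x1                                # observed change
# naive slope of change on baseline: biased by mathematical coupling with e1
b_naive = np.cov(d, x1)[0, 1] / np.var(x1)
# Blomqvist-style correction: remove the error variance shared by d and x1
b_corr = (b_naive * np.var(x1) + sigma_e2) / (np.var(x1) - sigma_e2)
```

Here Cov(d, x1) = 0.2·Var(t1) − σ_e², so the naive slope is pushed negative (regression to the mean) while the corrected estimate recovers the true value 0.2.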

Relevance: 30.00%

Abstract:

Leaders must scan the internal and external environment, chart strategic and task objectives, and provide performance feedback. These instrumental leadership (IL) functions go beyond the motivational and quid pro quo leader behaviors that comprise the full-range (transformational, transactional, and laissez-faire) leadership model. In four studies we examined the construct validity of IL. We found evidence for a four-factor IL model that was highly prototypical of good leadership. IL predicted top-level leader emergence controlling for the full-range factors, initiating structure, and consideration. It also explained unique variance in outcomes beyond the full-range factors; the effects of transformational leadership were vastly overstated when IL was omitted from the model. We discuss the importance of a "fuller full-range" leadership theory for theory and practice. We also showcase our methodological contributions regarding corrections for common method variance (i.e., endogeneity) bias using two-stage least squares (2SLS) regression and Monte Carlo split-sample designs.
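The 2SLS correction mentioned at the end can be sketched numerically. The synthetic data below (an instrument z and a shared error u standing in for common method variance) are purely illustrative and not the studies' data.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
z = rng.normal(size=n)                  # instrument: drives x, unrelated to u
u = rng.normal(size=n)                  # shared (common-method-style) error
x = 0.8 * z + u + rng.normal(size=n)    # endogenous regressor
y = 1.5 * x + 2.0 * u + rng.normal(size=n)   # true effect of x on y is 1.5

def slope(y, x):
    """OLS slope of y on x, with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

b_ols = slope(y, x)                     # inflated by the shared error u
# stage 1: project x on the instrument z
coef = np.linalg.lstsq(np.column_stack([np.ones_like(z), z]), x, rcond=None)[0]
x_hat = coef[0] + coef[1] * z           # fitted values, purged of u
# stage 2: regress y on the projection
b_2sls = slope(y, x_hat)                # consistent estimate of the x -> y effect
```

Because u enters both x and y, plain OLS overstates the effect (about 2.26 here analytically), whereas the instrumented estimate recovers the true 1.5 — the same mechanism by which the studies argue transformational-leadership effects were overstated.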

Relevance: 30.00%

Abstract:

The article is intended to improve our understanding of the reasons underlying the intellectual migration of scientists from well-known cognitive domains to nascent scientific fields. To that purpose we present, first, a number of findings from the sociology of science that give different insights about this phenomenon. We then attempt to bring some of these insights together under the conceptual roof of an actor-based approach linking expected utility and diffusion theory. Intellectual migration is regarded as the rational choice of scientists who decide under uncertainty and on the basis of a number of decision-making variables, which define probabilities, costs, and benefits of the migration.

Relevance: 30.00%

Abstract:

In order to explain the speed of Vesicular Stomatitis Virus (VSV) infections, we develop a simple model that improves previous approaches to the propagation of virus infections. For VSV infections, we find that the delay time elapsed between the adsorption of a viral particle into a cell and the release of its progeny has a very important effect. Moreover, this delay time makes the adsorption rate essentially irrelevant in order to predict VSV infection speeds. Numerical simulations are in agreement with the analytical results. Our model satisfactorily explains the experimentally measured speeds of VSV infections.

Relevance: 30.00%

Abstract:

In eusocial Hymenoptera, queens and workers are in conflict over optimal sex allocation. Sex ratio theory, while generating predictions on the extent of this conflict under a wide range of conditions, has largely neglected the fact that worker control of investment almost certainly requires the manipulation of brood sex ratio. This manipulation is likely to incur costs, for example, if workers eliminate male larvae or rear more females as sexuals rather than workers. In this article, we present a model of sex ratio evolution under worker control that incorporates costs of brood manipulation. We assume cost to be a continuous, increasing function of the magnitude of sex ratio manipulation. We demonstrate that costs counterselect sex ratio biasing, which leads to less female-biased population sex ratios than expected on the basis of relatedness asymmetry. Furthermore, differently shaped cost functions lead to different equilibria of manipulation at the colony level. While linear and accelerating cost functions generate monomorphic equilibria, decelerating costs lead to a process of evolutionary branching and hence split sex ratios.

Relevance: 30.00%

Abstract:

This paper examines a dataset which is modeled well by the Poisson-Log Normal process and by this process mixed with Log Normal data, which are both turned into compositions. This generates compositional data that has zeros without any need for conditional models or assuming that there is missing or censored data that needs adjustment. It also enables us to model dependence on covariates and within the composition.
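A minimal sketch of how the Poisson-Log Normal construction produces compositions with genuine zeros: draw correlated log-normal rates, draw Poisson counts from them, and close the counts to unit sum. The dimensions and parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([2.0, 1.0, -1.0])             # log-scale means of the 3 parts
sigma = np.array([[0.5, 0.2, 0.0],
                  [0.2, 0.5, 0.1],
                  [0.0, 0.1, 0.5]])         # log-scale covariance (dependence)

lam = np.exp(rng.multivariate_normal(mu, sigma, size=500))  # log-normal rates
counts = rng.poisson(lam)                   # Poisson counts: zeros arise naturally
total = counts.sum(axis=1)
comp = counts[total > 0] / total[total > 0, None]   # closure to a composition

zero_fraction = (comp == 0).mean()          # zeros survive the closure
```

Because the zeros come from the Poisson sampling layer rather than from censoring, no conditional zero-model is needed, which is the point the abstract makes; covariates could enter through the log-scale mean mu.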

Relevance: 30.00%

Abstract:

OBJECTIVE: An animal model has been developed to compare the effects of suture technique on the luminal dimensions and compliance of end-to-side vascular anastomoses. METHODS: Carotid and internal mammary arteries (IMAs) were exposed in three pigs (90 kg). IMAs were sectioned distally to perform end-to-side anastomoses on carotid arteries. One anastomosis was performed with 7/0 polypropylene running suture. The other was performed with the automated suture delivery device (Perclose/Abbott Labs Inc.), which makes a 7/0 polypropylene interrupted suture. Four piezoelectric crystals were sutured on the toe, heel and both lateral sides of each anastomosis to measure the anastomotic axes. Anastomotic cross-sectional area (CSAA) was calculated as CSAA = π × m × M / 4, where m and M are the minor and major axes of the elliptical anastomosis. Cross-sectional anastomotic compliance (CSAC) was calculated as CSAC = ΔCSAA/ΔP, where ΔP is the mean pulse pressure and ΔCSAA is the mean change in CSAA during the cardiac cycle. RESULTS: We collected a total of 1,200,000 pressure-length data points per animal. For running suture, mean systolic CSAA was 26.94 ± 0.4 mm² and mean diastolic CSAA was 26.30 ± 0.5 mm² (mean ΔCSAA was 0.64 mm²). CSAC for running suture was 4.5 × 10⁻⁶ m²/kPa. For interrupted suture, mean systolic CSAA was 21.98 ± 0.2 mm² and mean diastolic CSAA was 17.38 ± 0.3 mm² (mean ΔCSAA was 4.6 ± 0.1 mm²). CSAC for interrupted suture was 11 × 10⁻⁶ m²/kPa. CONCLUSIONS: This model, despite some limitations, can be a reliable source of information for improving the outcome of vascular anastomoses. The study demonstrates that suture technique has a substantial effect on the cross-sectional anastomotic compliance of end-to-side anastomoses. Interrupted suture may maximise the anastomotic lumen and provides a considerably higher CSAC than continuous suture, which may reduce flow turbulence, shear stress and intimal hyperplasia.
The Heartflo anastomosis device is a reliable instrument that facilitates performance of interrupted suture anastomoses.
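The two formulas in the abstract can be applied directly. In the sketch below, the axis values (in mm) and the pulse pressure (in kPa) are hypothetical, chosen only to illustrate the computation, not taken from the study's measurements.

```python
import math

def csaa(m, M):
    """Cross-sectional anastomotic area of an ellipse: CSAA = pi * m * M / 4,
    where m and M are the minor and major axes."""
    return math.pi * m * M / 4.0

def csac(csaa_sys, csaa_dia, pulse_pressure):
    """Cross-sectional anastomotic compliance: CSAC = delta CSAA / delta P."""
    return (csaa_sys - csaa_dia) / pulse_pressure

# hypothetical axes in systole and diastole, and a hypothetical pulse pressure
a_sys = csaa(5.2, 6.6)                 # systolic area, mm^2
a_dia = csaa(4.6, 6.2)                 # diastolic area, mm^2
compliance = csac(a_sys, a_dia, 5.0)   # mm^2 per kPa
```

A stiffer (less compliant) anastomosis shows a smaller systolic-diastolic area swing for the same pulse pressure, which is exactly the contrast the study reports between running and interrupted sutures.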