853 results for Refined Solution

Relevance: 20.00%

Publisher:

Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. Dynamic games can be further classified according to perfect and imperfect information: a dynamic game exhibits perfect information whenever, at any point of the game, every player has full informational access to all choices made so far, whereas under imperfect information some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. Indeed, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be made by rational players. The ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered.
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
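The backward-induction procedure that Chapters 1 and 3 refer to can be illustrated on a toy two-player perfect-information game; the tree and payoffs below are hypothetical, not taken from the thesis:

```python
# Minimal backward induction on a finite perfect-information game tree.
# A terminal node is a tuple of numeric payoffs (player 0, player 1);
# an internal node is a pair (player_to_move, list_of_children).

def backward_induction(node):
    """Return the payoff vector reached under sequentially rational play."""
    if all(isinstance(x, (int, float)) for x in node):
        return node  # terminal node: the payoffs themselves
    player, children = node
    # The player to move picks the child whose induced outcome is best for her.
    return max((backward_induction(child) for child in children),
               key=lambda payoff: payoff[player])

# Player 0 moves first; player 1 replies at each of 0's two moves.
tree = (0, [
    (1, [(2, 1), (0, 0)]),  # after 0's left move, 1 picks between (2,1) and (0,0)
    (1, [(3, 0), (1, 2)]),  # after 0's right move, 1 picks between (3,0) and (1,2)
])
print(backward_induction(tree))  # -> (2, 1)
```

Player 0 anticipates that her right move would be answered by (1, 2), so she moves left, where player 1's best reply yields (2, 1).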

Abstract:

http://resfranco.cochrane.org/sites/resfranco.cochrane.org/files/uploads/Arrettabac2009.pdf

Abstract:

We analyze the rate of convergence towards self-similarity for the subcritical Keller-Segel system in the radially symmetric two-dimensional case and in the corresponding one-dimensional case for logarithmic interaction. We measure convergence in Wasserstein distance. The rate of convergence towards self-similarity does not degenerate as we approach the critical case. As a byproduct, we obtain a proof of the logarithmic Hardy-Littlewood-Sobolev inequality in the one-dimensional and radially symmetric two-dimensional cases, based on optimal transport arguments. In addition, we prove that the one-dimensional equation is a contraction with respect to the Fourier distance in the subcritical case.
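For context, in one dimension the Wasserstein-1 distance between two equal-size empirical samples has a closed form: with uniform weights the optimal transport plan is the monotone coupling, so the distance is the mean absolute difference of the sorted samples. A toy numerical sketch, illustrative only and not part of the paper's argument:

```python
# Empirical 1-D Wasserstein-1 distance between two samples of equal size.

def wasserstein1(xs, ys):
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)  # monotone (sorted-to-sorted) coupling
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

print(wasserstein1([0.0, 1.0, 2.0], [1.0, 2.0, 3.0]))  # -> 1.0
print(wasserstein1([2.0, 0.0], [0.0, 2.0]))            # -> 0.0
```

The second call returns 0.0 because the two samples are equal as multisets, even though they are listed in different orders.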

Abstract:

The main result is a proof of the existence of a unique viscosity solution for a Hamilton-Jacobi equation in which the Hamiltonian is discontinuous with respect to one variable, usually interpreted as the spatial one. The generalized solution obtained is continuous, but not necessarily differentiable.

Abstract:

The work in this paper deals with the development of momentum and thermal boundary layers when a power-law fluid flows over a flat plate. At the plate we impose either a constant temperature, a constant flux, or a Newton cooling condition. The problem is analysed using similarity solutions, integral momentum and energy equations, and an approximation technique which is a form of the Heat Balance Integral Method. The fluid properties are assumed to be independent of temperature, hence the momentum equation uncouples from the thermal problem. We first derive the similarity equations for the velocity and present exact solutions for the case where the power-law index n = 2. The similarity solutions are used to validate the new approximation method. This new technique is then applied to the thermal boundary layer, where a similarity solution can only be obtained for the case n = 1.
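For the Newtonian case n = 1 the velocity similarity equation reduces to the classical Blasius equation f''' + ½ f f'' = 0 with f(0) = f'(0) = 0 and f'(∞) = 1. The shooting method below is the standard textbook approach to such similarity problems, sketched only to illustrate how they are solved numerically; it is not the paper's approximation technique:

```python
# Shooting method for the Blasius equation f''' + 0.5*f*f'' = 0,
# f(0) = f'(0) = 0, f'(inf) = 1: bisect on the wall shear s = f''(0).

def deriv(y):
    # First-order system for (f, f', f''): f''' = -0.5 * f * f''
    return (y[1], y[2], -0.5 * y[0] * y[2])

def integrate(s, eta_max=10.0, h=0.01):
    """Classical RK4 from the wall; return f'(eta_max)."""
    y = (0.0, 0.0, s)
    for _ in range(int(eta_max / h)):
        k1 = deriv(y)
        k2 = deriv(tuple(y[i] + 0.5 * h * k1[i] for i in range(3)))
        k3 = deriv(tuple(y[i] + 0.5 * h * k2[i] for i in range(3)))
        k4 = deriv(tuple(y[i] + h * k3[i] for i in range(3)))
        y = tuple(y[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                  for i in range(3))
    return y[1]

# f'(eta_max) grows monotonically with s, so bisection finds f'(inf) = 1.
lo, hi = 0.1, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if integrate(mid) < 1.0 else (lo, mid)

print(round(0.5 * (lo + hi), 3))  # -> 0.332 (classical value ~ 0.33206)
```

The recovered wall shear f''(0) ≈ 0.332 is the well-known Blasius value for this normalization of the equation.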

Abstract:

There has been good progress in inferring the evolutionary relationships within trypanosomes from DNA data; until relatively recently, many relationships remained rather speculative. Ongoing molecular studies have provided data that adequately show Trypanosoma to be monophyletic and, rather surprisingly, that there are sharply contrasting levels of genetic variation within and between the major trypanosomatid groups. There are still, however, areas of research that could benefit from further development and resolution, which broadly fall under three questions. Are the current statements of evolutionary homology within ribosomal small sub-unit genes in need of refinement? Can the published phylograms be expanded upon to form 'supertrees' depicting further relationships? Does a bifurcating tree structure impose an untenable dogma upon trypanosomatid phylogeny where hybridisation or reticulate evolutionary steps have played a part? This article briefly addresses these three questions and, in so doing, hopes to stimulate further interest in the molecular evolution of the group.

Abstract:

A new and original reagent based on the use of highly fluorescent cadmium telluride (CdTe) quantum dots (QDs) in aqueous solution is proposed to detect weak fingermarks in blood on non-porous surfaces. To assess the efficiency of this approach, comparisons were performed with one of the most efficient blood reagents on non-porous surfaces, Acid Yellow 7 (AY7). To this end, four non-porous surfaces were studied: glass, transparent polypropylene, black polyethylene, and aluminium foil. To evaluate the sensitivity of both reagents, sets of depleted fingermarks were prepared using the same finger, initially soaked with blood and then successively applied to the same surface without recharging it with blood or latent secretions. The successive marks were then cut in halves, and the halves were treated separately with each reagent. The results showed that QDs were as efficient as AY7 on glass, polyethylene and polypropylene surfaces, and superior to AY7 on aluminium. The use of QDs in new, sensitive and highly efficient latent and blood mark detection techniques therefore appears highly promising. Health and safety issues related to the use of cadmium are also discussed; it is suggested that applying QDs in aqueous solution (rather than as a dry dusting powder) considerably lowers the toxicity risks.

Abstract:

The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at any iteration (thus conservative fluxes can be obtained).
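The closing remark, that a scheme converging to the exact solution can be regarded as a linear solver, is the standard view of preconditioned iterations x ← x + M⁻¹(b − Ax). A minimal sketch with a plain Jacobi preconditioner M = diag(A) on a toy system (the MsFV operator plays the role of a far more elaborate M in the paper):

```python
# Preconditioned stationary iteration x <- x + M^{-1} (b - A x),
# with M = diag(A) (Jacobi), on a small diagonally dominant system.

A = [[4.0, -1.0, 0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0, 4.0]]
b = [3.0, 2.0, 3.0]
n = len(b)

x = [0.0] * n
for _ in range(50):
    # residual r = b - A x
    r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    # preconditioned update: M^{-1} r with M = diag(A)
    x = [x[i] + r[i] / A[i][i] for i in range(n)]

print([round(v, 6) for v in x])  # -> [1.0, 1.0, 1.0], the exact solution
```

Because the iteration drives the residual to zero, running it to convergence solves the linear system exactly; Krylov methods such as GMRES accelerate exactly this kind of fixed-point scheme.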

Abstract:

Malaria during pregnancy can be severe in non-immune women, but in areas of stable transmission, where women are semi-immune and often asymptomatic during infection, malaria is an insidious cause of disease and death for mothers and their offspring. Sequelae, such as severe anaemia and hypertension in the mother and low birth weight and infant mortality in the offspring, are often not recognised as consequences of infection. Pregnancy malaria, caused by Plasmodium falciparum, is mediated by infected erythrocytes (IEs) that bind to chondroitin sulphate A and are sequestered in the placenta. These parasites have a unique adhesion phenotype and distinct antigenicity, which indicates that novel targets may be required for development of an effective vaccine. Women become resistant to malaria as they acquire antibodies against placental IE, which leads to higher haemoglobin levels and heavier babies. Proteins exported from the placental parasites have been identified, including both variant and conserved antigens, and some of these are in preclinical development for vaccines. A vaccine that prevents P. falciparum malaria in pregnant mothers is feasible and would potentially save hundreds of thousands of lives each year.

Abstract:

In this paper we axiomatize the strong constrained egalitarian solution (Dutta and Ray, 1991) over the class of weak superadditive games using constrained egalitarianism, order-consistency, and converse order-consistency. JEL classification: C71, C78. Keywords: Cooperative TU-game, strong constrained egalitarian solution, axiomatization.

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as non-zero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models: an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
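The two-stage construction described above, an incidence stage deciding which parts are zero followed by a distribution of the unit over the non-zero parts, can be sketched generatively. The parameters below are toy values, and the exponentiate-and-renormalize logistic-normal stage is a deliberate simplification of the paper's conditional models:

```python
# Toy generative sketch of a two-stage zero/composition model:
# stage 1: independent Bernoulli incidence (which parts are non-zero);
# stage 2: simplified logistic-normal spread of the unit over present parts.

import math
import random

random.seed(1)

def simulate_composition(p_present, mu, sigma):
    """p_present[i]: probability that part i is non-zero; mu, sigma:
    per-part parameters of the normal draws behind the logistic-normal stage."""
    present = [random.random() < p for p in p_present]
    if not any(present):
        present[0] = True  # a composition needs at least one non-zero part
    # exponentiate independent normal draws and renormalize over present parts
    w = [math.exp(random.gauss(mu[i], sigma[i])) if present[i] else 0.0
         for i in range(len(p_present))]
    total = sum(w)
    return [v / total for v in w]

comp = simulate_composition([0.9, 0.5, 0.8], [0.0] * 3, [1.0] * 3)
print(comp, sum(comp))  # components sum to 1; absent parts are exactly zero
```

The simulated data have exactly the structure the paper describes: an incidence pattern (which entries are zero) plus a composition over the remaining parts.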

Abstract:

Cobre Las Cruces is a renowned copper mining company located in Sevilla, with unexpected problems in wireless communications that directly affect production. Therefore, the main goals are to improve the WiFi infrastructure, to secure it, and to detect and prevent attacks and the installation of rogue (non-authorized) APs, all of it integrated with the current ICT infrastructure. This project has been divided into four phases, although only two of them are included in the TFC: the analysis of the current situation and the design of a WLAN solution. Once the analysis was finished, some weaknesses were detected. Issues such as a lack of connectivity and control, ignorance about the installed WiFi devices and their location and state and, by and large, the use of weak security mechanisms were some of the problems found. Additionally, because the working area became larger and new WiFi infrastructures were added, the first phase took more time than expected. As a result of the detailed analysis, goals were defined and a centralized approach able to cope with them was designed. A solution based on the 802.11i and 802.1X protocols, digital certificates, a probe system acting as an IDS/IPS, and lightweight APs in conjunction with a Wireless LAN Controller are its main features.