927 results for Random finite set theory


Relevance: 100.00%

Abstract:

In Part I, we formulate and examine some systems that have arisen in the study of the constructible hierarchy; we find numerous transitive models for them, among which are supertransitive models containing all ordinals which show that Devlin's system BS lies strictly between Gandy's systems PZ and BST'; and we use our models to show that BS fails to handle even the simplest rudimentary functions and is thus inadequate for the use intended for it in Devlin's treatise. In Part II we propose and study an enhancement of the underlying logic of these systems, build further models to show where the previous hierarchy of systems is preserved by our enhancement, and consider three systems that might serve for Devlin's purposes: one the enhancement of a version of BS, one a formulation of Gandy-Jensen set theory, and the third a subsystem common to those two. In Part III we give new proofs of results of Boffa by constructing three models in which, respectively, TCo, AxPair and AxSing fail; we give some sufficient conditions for a set not to belong to the rudimentary closure of another set, and thus answer a question of McAloon; and we comment on Gandy's numerals and correct and sharpen others of his observations.
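For orientation, the rudimentary functions invoked here are, in Jensen's standard textbook formulation (a general definition, not taken from this abstract), the smallest class of functions of sets closed under the following schemata:

```latex
\begin{align*}
F(x_1,\dots,x_n) &= x_i && \text{(projection)}\\
F(x_1,\dots,x_n) &= x_i \setminus x_j && \text{(difference)}\\
F(x_1,\dots,x_n) &= \{x_i, x_j\} && \text{(pairing)}\\
F(\bar{x}) &= G(H_1(\bar{x}),\dots,H_k(\bar{x})) && \text{(composition)}\\
F(y,\bar{x}) &= \bigcup_{z \in y} G(z,\bar{x}) && \text{(bounded union)}
\end{align*}
```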

Relevance: 100.00%

Abstract:

We consider collective choice problems where a set of agents have to choose an alternative from a finite set and agents may or may not become users of the chosen alternative. An allocation is a pair given by the chosen alternative and the set of its users. Agents have gregarious preferences over allocations: given an allocation, they prefer that the set of users becomes larger. We require that the final allocation be efficient and stable (no agent can be forced to be a user and no agent who wants to be a user can be excluded). We propose a two-stage sequential mechanism whose unique subgame perfect equilibrium outcome is an efficient and stable allocation which also satisfies a maximal participation property.
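As an illustration of the stability requirement (not of the paper's actual mechanism), here is a minimal sketch under an invented gregarious-preference rule: agent i wants to be a user exactly when the user set has at least thresholds[i] members.

```python
def stable_users(thresholds):
    """Largest user set S such that no member of S is forced (each wants to be
    a user given |S|) and no agent who wants to be a user is excluded.
    Hypothetical rule: agent i wants in iff |S| >= thresholds[i]."""
    users = set(range(len(thresholds)))      # start from full participation
    while True:
        keep = {i for i in users if len(users) >= thresholds[i]}
        if keep == users:
            return users                     # fixed point: stable allocation
        users = keep

# Agents 0 and 1 join unconditionally; agent 2 joins only with 3+ users.
print(stable_users([1, 1, 3]))   # -> {0, 1, 2}
print(stable_users([1, 1, 4]))   # -> {0, 1}
```

Because gregarious preferences make "wanting in" monotone in the size of the user set, shrinking from full participation converges to the largest stable set, echoing the maximal participation property mentioned above.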

Relevance: 100.00%

Abstract:

Bilateral oligopoly is a simple model of exchange in which a finite set of sellers seek to exchange the goods they are endowed with for money with a finite set of buyers, and no price-taking assumptions are imposed. If trade takes place via a strategic market game, bilateral oligopoly can be thought of as two linked proportional-sharing contests: in one, the sellers share the aggregate bid from the buyers in proportion to their supply; in the other, the buyers share the aggregate supply in proportion to their bids. The analysis can be separated into two ‘partial games’. First, fix the aggregate bid at $B$; in the first partial game the sellers contest this fixed prize in proportion to their supply, and the aggregate supply in the equilibrium of this game is $\tilde{X}(B)$. Next, fix the aggregate supply at $X$; in the second partial game the buyers contest this fixed prize in proportion to their bids, and the aggregate bid in the equilibrium of this game is $\tilde{B}(X)$. The analysis of these two partial games takes into account competition within each side of the market. Equilibrium in bilateral oligopoly must also take into account competition between sellers and buyers, and requires, for example, $\tilde{B}(\tilde{X}(B)) = B$. When all traders have Cobb-Douglas preferences, $\tilde{X}(B)$ does not depend on $B$ and $\tilde{B}(X)$ does not depend on $X$: whilst there is competition within each side of the market, there is no strategic interdependence between the sides of the market. The Cobb-Douglas assumption provides a tractable framework in which to explore the features of fully strategic trade, but it misses perhaps the most interesting feature of bilateral oligopoly, the implications of which are investigated.
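A minimal numerical sketch of the equilibrium condition $\tilde{B}(\tilde{X}(B)) = B$; the functional forms below are purely illustrative assumptions, not the paper's:

```python
def X_tilde(B, n_sellers=3):
    """Sellers' equilibrium aggregate supply given aggregate bid B (assumed form)."""
    return n_sellers * B ** 0.5

def B_tilde(X, n_buyers=4):
    """Buyers' equilibrium aggregate bid given aggregate supply X (assumed form)."""
    return n_buyers * X ** 0.25

# Equilibrium of the full game is a fixed point B* with B_tilde(X_tilde(B*)) = B*.
B = 1.0
for _ in range(100):            # plain fixed-point iteration
    B = B_tilde(X_tilde(B))
print(B, X_tilde(B))            # equilibrium aggregate bid and aggregate supply
```

In the Cobb-Douglas case described above, X_tilde would be constant in B (and B_tilde constant in X), so the iteration would settle in a single step: the two sides decouple exactly as the abstract says.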

Relevance: 100.00%

Abstract:

The usual way to investigate the statistical properties of finitely generated subgroups of free groups, and of finite presentations of groups, is based on the so-called word-based distribution: subgroups are generated (finite presentations are determined) by randomly chosen k-tuples of reduced words, whose maximal length is allowed to tend to infinity. In this paper we adopt a different, though equally natural point of view: we investigate the statistical properties of the same objects, but with respect to the so-called graph-based distribution, recently introduced by Bassino, Nicaud and Weil. Here, subgroups (and finite presentations) are determined by randomly chosen Stallings graphs whose number of vertices tends to infinity. Our results show that these two distributions behave quite differently from each other, shedding new light on which properties of finitely generated subgroups can be considered frequent or rare. For example, we show that malnormal subgroups of a free group are negligible in the graph-based distribution, while they are exponentially generic in the word-based distribution. Quite surprisingly, a random finite presentation generically presents the trivial group in this new distribution, while in the classical one it is known to generically present an infinite hyperbolic group.
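As a rough illustration of the word-based distribution (a generic sketch, not code from the paper): a reduced word in a free group is a string of generators and inverses in which no letter is immediately followed by its own inverse, so sampling one is a walk that never backtracks.

```python
import random

def random_reduced_word(max_len, rank=2):
    """Sample a reduced word in the free group of the given rank: a letter is
    a pair (generator index, +/-1), and no letter may cancel its predecessor."""
    length = random.randint(1, max_len)
    word = []
    while len(word) < length:
        letter = (random.randrange(rank), random.choice((1, -1)))
        if word and word[-1] == (letter[0], -letter[1]):
            continue                     # would cancel: reject and redraw
        word.append(letter)
    return word

# A subgroup in the word-based distribution: a k-tuple of random reduced words.
k, max_len = 3, 10
generators = [random_reduced_word(max_len) for _ in range(k)]
print(generators)
```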

Relevance: 100.00%

Abstract:

The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and the student model, which includes elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is best prepared to study. We use fuzzy set theory to deal with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used to assign values to linguistic variables. For each competence, the level of difficulty and the level of knowledge of its prerequisites are calculated from the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence indicating its level of recommendation.
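A minimal sketch of this kind of fuzzy IF-THEN inference; the two linguistic variables mirror the "level of difficulty" and "level of knowledge of prerequisites" described above, but the membership functions, rule base and category names are all invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def recommend(difficulty, prereq_knowledge):
    """Map two linguistic variables (0-10 scales) to a crisp recommendation
    category: fire each fuzzy rule, keep the category with highest strength."""
    low_d, high_d = tri(difficulty, -1, 0, 5), tri(difficulty, 5, 10, 11)
    low_p, high_p = tri(prereq_knowledge, -1, 0, 5), tri(prereq_knowledge, 5, 10, 11)
    rules = {
        "recommended":     min(low_d, high_p),   # IF difficulty LOW AND prereqs HIGH
        "neutral":         max(min(low_d, low_p), min(high_d, high_p)),
        "not recommended": min(high_d, low_p),   # IF difficulty HIGH AND prereqs LOW
    }
    return max(rules, key=rules.get)

print(recommend(difficulty=2, prereq_knowledge=9))   # -> recommended
```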

Relevance: 100.00%

Abstract:

The control and prediction of wastewater treatment plants has an important goal: to avoid breaking the environmental balance by always keeping the system in stable operating conditions. It is known that qualitative information, coming from microscopic examinations and subjective remarks, has a deep influence on the activated sludge process; in particular, on the total amount of effluent suspended solids, one of the measures of overall plant performance. The search for an input-output model of this variable and the prediction of sudden increases (bulking episodes) is thus a central concern in ensuring the fulfillment of current discharge limitations. Unfortunately, the strong interrelation between variables, their heterogeneity and the large amount of missing information make the use of traditional techniques difficult, or even impossible. Through the combined use of several methods, mainly rough set theory and artificial neural networks, reasonable prediction models are found, which also serve to show the differing importance of the variables and provide insight into the process dynamics.
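A toy sketch of such a prediction pipeline: synthetic data stand in for the plant variables, simple mean imputation stands in for the paper's handling of missing information, and a small neural network plays the predictive role (the rough-set analysis of variable importance is not reproduced here).

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # stand-in plant variables (flows, loads, ...)
y = 2 * X[:, 0] + X[:, 3] + rng.normal(scale=0.1, size=200)  # effluent suspended solids
X[rng.random(X.shape) < 0.2] = np.nan    # ~20% missing values, as in real plant data

model = make_pipeline(
    SimpleImputer(strategy="mean"),      # cope with the missing information
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X[:150], y[:150])
print(model.score(X[150:], y[150:]))     # R^2 on held-out data
```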

Relevance: 100.00%

Abstract:

In the domain of bilateral assignment games, we present an axiomatization of the nucleolus as the unique solution satisfying consistency with respect to the derived game defined by Owen (1992) and monotonicity of the sectors' complaints with respect to their cardinality. As a consequence, we obtain a geometric characterization of the nucleolus by means of a bisection property stronger than the one satisfied by the points of the kernel (Maschler et al., 1979).

Relevance: 100.00%

Abstract:

Executive Summary: The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation intended to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the proposed way of aggregating performance measures leads to realized portfolio returns that first-order stochastically dominate the ones resulting from optimization with respect to a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, that is, the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
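The two dominance checks just described can be sketched compactly. The empirical-CDF and expected-shortfall constructions below are standard; the return samples and every parameter are illustrative, not taken from the thesis:

```python
import numpy as np

def first_order_dominates(a, b):
    """a FOSD b iff the empirical CDF of a lies at or below that of b
    (checked on an inner grid to dampen tail-sampling noise)."""
    pooled = np.concatenate([a, b])
    grid = np.linspace(np.percentile(pooled, 1), np.percentile(pooled, 99), 200)
    cdf = lambda x: np.mean(x[:, None] <= grid, axis=0)
    return bool(np.all(cdf(a) <= cdf(b)))

def second_order_dominates(a, b, quantiles=np.linspace(0.01, 1.0, 100)):
    """a SOSD b iff a's absolute Lorenz curve (p times the expected shortfall
    at level p) lies at or above b's at every quantile p."""
    def lorenz(x):
        s = np.sort(x)
        return np.array([p * s[: max(1, int(p * len(s)))].mean() for p in quantiles])
    return bool(np.all(lorenz(a) >= lorenz(b)))

rng = np.random.default_rng(1)
agg = rng.normal(0.08, 0.10, 5000)      # stand-in returns, aggregated-measure portfolio
single = rng.normal(0.05, 0.10, 5000)   # stand-in returns, single-measure portfolio
print(first_order_dominates(agg, single), second_order_dominates(agg, single))
```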

Relevance: 100.00%

Abstract:

We give a sufficient condition for a set of block subspaces in an infinite-dimensional Banach space to be weakly Ramsey. Using this condition we prove that in the Levy collapse of a Mahlo cardinal, every projective set is weakly Ramsey. This, together with a construction of W. H. Woodin, is used to show that the Axiom of Projective Determinacy implies that every projective set is weakly Ramsey. In the case of $c_0$ we prove similar results for a stronger Ramsey property. And for hereditarily indecomposable spaces we show that the Axiom of Determinacy plus the Axiom of Dependent Choices imply that every set is weakly Ramsey. These results are generalizations to the class of projective sets of some theorems of W. T. Gowers and of our paper "Weakly Ramsey sets in Banach spaces."

Relevance: 100.00%

Abstract:

This paper describes L. A. Zadeh's theory of fuzzy sets (background, characteristics, and implications) and the areas in which fuzziness has been applied in psychology and social psychology (developmental psychology, stimulus processing, perception of information, prototypes, and other applications). On this basis, we suggest how fuzziness could be useful in the study of social interaction, assuming the simultaneously vague and precise character of reality, and in the use of concepts such as the notion of the self from a complex perspective that considers, from a pluralist standpoint, diverse theoretical and methodological positions.

Relevance: 100.00%

Abstract:

We consider the problem of a society whose members must choose an alternative from a finite set. After learning the chosen alternative, members may reconsider their membership in the society by either staying or exiting. In turn, as a consequence of the exit of some of its members, other members might then find it undesirable to belong to the society as well. We analyze the voting behavior of members who take into account the effect of their votes not only on the chosen alternative, but also on the final composition of the society.
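A toy sketch of the exit cascade this abstract alludes to; the stay rule (an agent remains only while at least half of the fellow members it cares about are still present) is an invented stand-in, not the paper's model:

```python
def final_membership(dislikes, friends):
    """Members who dislike the chosen alternative exit at once; the rest
    reconsider iteratively, staying only while at least half of the fellow
    members they care about remain (hypothetical rule)."""
    members = {i for i in friends if i not in dislikes}
    while True:
        stay = {i for i in members
                if len(friends[i] & members) * 2 >= len(friends[i])}
        if stay == members:
            return members                 # fixed point: no one else wants to exit
        members = stay                     # each exit can trigger further exits

# Agent 1 dislikes the chosen alternative; 2 cares only about 1, 3 about 1 and 2.
friends = {0: set(), 1: set(), 2: {1}, 3: {1, 2}}
print(final_membership(dislikes={1}, friends=friends))   # -> {0}
```

The run illustrates the chain the abstract describes: agent 1's exit makes membership undesirable for agent 2, whose exit in turn drives out agent 3.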