993 results for Choice Functions
Abstract:
Adaptive Resonance Theory (ART) models are real-time neural networks for category learning, pattern recognition, and prediction. Unsupervised fuzzy ART and supervised fuzzy ARTMAP networks synthesize fuzzy logic and ART by exploiting the formal similarity between the computations of fuzzy subsethood and the dynamics of ART category choice, search, and learning. Fuzzy ART self-organizes stable recognition categories in response to arbitrary sequences of analog or binary input patterns. It generalizes the binary ART 1 model, replacing the set-theoretic intersection (∩) with the fuzzy intersection (∧), or component-wise minimum. A normalization procedure called complement coding leads to a symmetric theory in which the fuzzy intersection and the fuzzy union (∨), or component-wise maximum, play complementary roles. A geometric interpretation of fuzzy ART represents each category as a box that increases in size as weights decrease. This paper analyzes fuzzy ART models that employ various choice functions for category selection. One such function minimizes total weight change during learning. Benchmark simulations compare performance of fuzzy ARTMAP systems that use different choice functions.
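The two operations named in this abstract, complement coding and the category choice function, can be sketched concretely. The fragment below is a minimal illustration (parameter names are mine; `alpha` is the small choice parameter of the standard Weber-law form T_j = |I ∧ w_j| / (α + |w_j|)), not a full fuzzy ART implementation:

```python
import numpy as np

def complement_code(a):
    """Complement coding: map an input a in [0,1]^M to (a, 1 - a),
    which normalizes every coded input to L1 norm M."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def choice(I, w, alpha=0.001):
    """Weber-law choice function T_j = |I ^ w_j| / (alpha + |w_j|),
    where ^ is the fuzzy intersection (component-wise minimum)
    and |.| is the L1 norm."""
    m = np.minimum(I, w)
    return m.sum() / (alpha + w.sum())
```

When a category's weight vector equals the coded input, the choice value approaches 1, so the category that best matches the input relative to its own size wins the competition.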
Abstract:
Necessary and sufficient conditions for choice functions to be rational have been studied intensively in the past. In these studies, however, the choice function is completely specified: given any subset of options, called an issue, the best option over that issue is always known, whereas in real-world scenarios only a few choices are typically known rather than all of them. In this paper, we study partial choice functions and investigate necessary and sufficient rationality conditions for situations where only a few choices are known. We prove that our necessary and sufficient condition for partial choice functions reduces to the necessary and sufficient conditions for complete choice functions proposed in the literature. Choice functions have been instrumental in belief revision theory: in most approaches to belief revision, the problem studied can simply be described as the choice of possible worlds compatible with the input information, given an agent's prior belief state. The main effort has been to devise strategies to infer the agent's revised belief state. Our study considers the converse problem: given a collection of input information items and their corresponding revision results (as provided by an agent), does there exist a rational revision operation used by the agent and a consistent belief state that may explain the observed results?
Abstract:
The rationalizability of a choice function by means of a transitive relation has been analyzed thoroughly in the literature. However, not much seems to be known when transitivity is weakened to quasi-transitivity or acyclicity. We describe the logical relationships between the different notions of rationalizability involving, for example, the transitivity, quasi-transitivity, or acyclicity of the rationalizing relation. Furthermore, we discuss sufficient conditions and necessary conditions for rational choice on arbitrary domains. Transitive, quasi-transitive, and acyclical rationalizability are fully characterized for domains that contain all singletons and all two-element subsets of the universal set.
Abstract:
We analyze infinite-horizon choice functions within the setting of a simple linear technology. Time consistency and efficiency are characterized by stationary consumption and inheritance functions, as well as a transversality condition. In addition, we consider the equity axioms Suppes-Sen, Pigou-Dalton, and resource monotonicity. We show that Suppes-Sen and Pigou-Dalton imply that the consumption and inheritance functions are monotone with respect to time—thus justifying sustainability—while resource monotonicity implies that the consumption and inheritance functions are monotone with respect to the resource. Examples illustrate the characterization results.
Abstract:
It is not uncommon that a society facing a choice problem also has to choose the choice rule itself. In such a situation, voters' preferences over alternatives induce preferences over the voting rules. Such a setting immediately gives rise to a natural question concerning consistency between these two levels of choice. If a choice rule employed to resolve the society's original choice problem does not choose itself when it is also used in choosing the choice rule, then this phenomenon can be regarded as inconsistency of the choice rule, as it rejects itself according to its own rationale. Koray (2000) proved that the only neutral, unanimous, universally self-selective social choice functions are the dictatorial ones. Here we introduce to our society a constitution, which rules out inefficient social choice rules. When inefficient social choice rules become unavailable for comparison, the property of self-selectivity becomes weaker, and we show that some non-trivial self-selective social choice functions do exist. Under certain assumptions on the constitution we describe all of them.
Abstract:
We analyze an alternative to the standard rationalizability requirement for observed choices by considering non-deteriorating selections. A selection function is a generalization of a choice function where selected alternatives may depend on a reference (or status quo) alternative in addition to the set of feasible options. A selection function is non-deteriorating if there exists an ordering over the universal set of alternatives such that the selected alternatives are at least as good as the reference option. We characterize non-deteriorating selection functions in an abstract framework and in an economic environment.
Abstract:
Single-peaked preferences have played an important role in the literature ever since they were used by Black (1948) to formulate a domain restriction that is sufficient for the exclusion of cycles according to the majority rule. In this paper, we approach single-peakedness from a choice-theoretic perspective. We show that the well-known axiom independence of irrelevant alternatives (a form of contraction consistency) and a weak continuity requirement characterize a class of single-peaked choice functions. Moreover, we examine the rationalizability and the rationalizability-representability of these choice functions.
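For readers unfamiliar with the underlying notion, single-peakedness of a preference ranking with respect to a fixed left-right order can be checked with the standard interval test: for every k, the k best alternatives must form a contiguous interval around the peak. This sketch illustrates the definition only, not the paper's choice-theoretic characterization:

```python
def is_single_peaked(ranking, axis):
    """Check whether a strict ranking (best alternative first) is
    single-peaked with respect to the left-right order `axis`:
    each successive alternative must extend the interval of
    already-ranked alternatives by one position."""
    pos = {x: i for i, x in enumerate(axis)}
    left = right = pos[ranking[0]]        # position of the peak
    for x in ranking[1:]:
        p = pos[x]
        if p == left - 1:
            left = p
        elif p == right + 1:
            right = p
        else:
            return False                  # a gap: preference is not single-peaked
    return True
```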
Abstract:
We study the problem of deriving a complete welfare ordering from a choice function. Under the sequential solution, the best alternative is the alternative chosen from the universal set; the second best is the one chosen when the best alternative is removed; and so on. We show that this is the only completion of Bernheim and Rangel's (2009) welfare relation that satisfies two natural axioms: neutrality, which ensures that the names of the alternatives are welfare-irrelevant; and persistence, which stipulates that every choice function between two welfare-identical choice functions must exhibit the same welfare ordering.
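The sequential solution described above is directly algorithmic: choose from the universal set, remove the chosen alternative, and repeat. A minimal sketch (function names are mine):

```python
def sequential_welfare_ranking(universe, choose):
    """The sequential solution: the best alternative is the one
    chosen from the full set; remove it and choose again for the
    second best; and so on.  `choose` maps a non-empty frozenset
    of alternatives to a single chosen element."""
    remaining = set(universe)
    ranking = []
    while remaining:
        best = choose(frozenset(remaining))
        ranking.append(best)
        remaining.remove(best)
    return ranking
```

Any choice function over subsets can be plugged in as `choose`; the output is the complete welfare ordering the paper studies.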
Abstract:
This paper shows that, in production economies, the generalized serial social choice functions defined by Shenker (1992) are securely implementable (in the sense of Saijo et al., 2007) and that they include the well-known fixed path social choice functions.
Abstract:
Consistency of a binary relation requires any preference cycle to involve indifference only. As shown by Suzumura (1976b), consistency is necessary and sufficient for the existence of an ordering extension of a relation. Because of this important role of consistency, it is of interest to examine the rationalizability of choice functions by means of consistent relations. We describe the logical relationships between the different notions of rationalizability obtained if reflexivity or completeness are added to consistency, both for greatest-element rationalizability and for maximal-element rationalizability. All but one notion of consistent rationalizability are characterized for general domains, and all of them are characterized for domains that contain all two-element subsets of the universal set.
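The consistency condition can be stated operationally: whenever x is connected to y by a chain of R-steps, y must not be strictly preferred to x. A brute-force sketch for finite relations given as sets of pairs (the naive closure is quadratic per pass, which is fine for illustration):

```python
def transitive_closure(R):
    """Transitive closure of a finite relation given as a set of pairs."""
    tc = set(R)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(tc):
            for (c, d) in list(tc):
                if b == c and (a, d) not in tc:
                    tc.add((a, d))
                    changed = True
    return tc

def is_consistent(R):
    """Suzumura consistency: whenever (x, y) lies in the transitive
    closure of R, y must not be strictly preferred to x; i.e. no
    preference cycle contains a strict step."""
    tc = transitive_closure(R)
    return not any((y, x) in R and (x, y) not in R for (x, y) in tc)
```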
Abstract:
We examine the maximal-element rationalizability of choice functions with arbitrary domains. While rationality formulated in terms of the choice of greatest elements according to a rationalizing relation has been analyzed relatively thoroughly in the earlier literature, this is not the case for maximal-element rationalizability, except when it coincides with greatest-element rationalizability because of properties imposed on the rationalizing relation. We develop necessary and sufficient conditions for maximal-element rationalizability by itself, and for maximal-element rationalizability in conjunction with additional properties of a rationalizing relation such as reflexivity, completeness, P-acyclicity, quasi-transitivity, consistency and transitivity.
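The distinction between the two notions is easy to make concrete: a greatest element must be weakly preferred to everything in the set, while a maximal element merely has nothing strictly preferred to it. For an incomplete relation the two can differ, which is why the abstract treats them separately. A minimal sketch with relations as sets of pairs:

```python
def greatest_elements(S, R):
    """x is a greatest element of S iff x R y for every y in S."""
    return {x for x in S if all((x, y) in R for y in S)}

def maximal_elements(S, R):
    """x is a maximal element of S iff no y in S is strictly
    preferred to x, i.e. no y with y R x but not x R y."""
    return {x for x in S
            if not any((y, x) in R and (x, y) not in R for y in S)}
```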
Abstract:
We study the problem of assigning indivisible and heterogeneous objects (e.g., houses, jobs, offices, school or university admissions) to agents. Each agent receives at most one object and monetary compensations are not possible. We consider mechanisms satisfying a set of basic properties (unavailable-type-invariance, individual-rationality, weak non-wastefulness, or truncation-invariance). In the house allocation problem, where at most one copy of each object is available, deferred-acceptance (DA) mechanisms allocate objects based on exogenously fixed objects' priorities over agents and the agent-proposing deferred-acceptance algorithm. For house allocation we show that DA-mechanisms are characterized by our basic properties and (i) strategy-proofness and population-monotonicity or (ii) strategy-proofness and resource-monotonicity. Once we allow for multiple identical copies of objects, on the one hand the first characterization breaks down and there are unstable mechanisms satisfying our basic properties, strategy-proofness, and population-monotonicity. On the other hand, our basic properties together with strategy-proofness and resource-monotonicity characterize the (most general) class of DA-mechanisms based on objects' fixed choice functions that are acceptant, monotonic, substitutable, and consistent. These choice functions are used by objects to reject agents in the agent-proposing deferred-acceptance algorithm. Therefore, in the general model resource-monotonicity is the "stronger" comparative-statics requirement because it characterizes (together with our basic requirements and strategy-proofness) choice-based DA-mechanisms, whereas population-monotonicity (together with our basic properties and strategy-proofness) does not.
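For the one-copy house-allocation case mentioned above, an object's choice function reduces to its priority ranking over agents, and the agent-proposing deferred-acceptance algorithm takes a simple form. The sketch below is illustrative only (data shapes and names are mine; it assumes every agent appears in every object's priority list):

```python
def deferred_acceptance(agent_prefs, priorities):
    """Agent-proposing deferred acceptance for house allocation
    (one copy per object).  agent_prefs: agent -> ordered list of
    acceptable objects; priorities: object -> ordered list of
    agents, earlier = higher priority (the object's choice
    function in the one-copy case)."""
    next_idx = {a: 0 for a in agent_prefs}   # next object each agent will propose to
    held = {}                                # object -> tentatively accepted agent
    free = list(agent_prefs)
    while free:
        a = free.pop()
        prefs = agent_prefs[a]
        if next_idx[a] >= len(prefs):
            continue                         # no acceptable objects left; a stays unmatched
        o = prefs[next_idx[a]]
        next_idx[a] += 1
        rank = {ag: i for i, ag in enumerate(priorities[o])}
        incumbent = held.get(o)
        if incumbent is None or rank[a] < rank[incumbent]:
            if incumbent is not None:
                free.append(incumbent)       # object rejects its current holder
            held[o] = a                      # object tentatively accepts the proposer
        else:
            free.append(a)                   # object rejects the proposer
    return {a: o for o, a in held.items()}
```

In the multi-copy generalization the paper studies, the priority-based rejection step is replaced by an acceptant, monotonic, substitutable, and consistent choice function for each object.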
Abstract:
In this thesis I examine a variety of linguistic elements which involve "alternative" semantic values (a class arguably including focus, interrogatives, indefinites, and disjunctions) and the connections between these elements. This study focusses on the analysis of such elements in Sinhala, with comparison to Malayalam, Tlingit, and Japanese. The central part of the study concerns the proper syntactic and semantic analysis of Q[uestion]-particles (including Sinhala "da", Malayalam "-oo", Japanese "ka"), which, in many languages, appear not only in interrogatives, but also in the formation of indefinites, disjunctions, and relative clauses. This set of contexts is syntactically heterogeneous, and so syntax does not offer an explanation for the appearance of Q-particles in this particular set of environments. I propose that these contexts can be united in terms of semantics, as all involving some element which denotes a set of "alternatives". Both wh-words and disjunctions can be analysed as creating Hamblin-type sets of "alternatives". Q-particles can be treated as uniformly denoting variables over choice functions which apply to the aforementioned Hamblin-type sets, thus "restoring" the derivation to normal Montagovian semantics. The treatment of Q-particles as uniformly denoting variables over choice functions provides an explanation for why these particles appear in just this set of contexts: they all include an element with Hamblin-type semantics. However, we also find variation in the use of Q-particles, including, in some languages, the appearance of multiple morphologically-distinct Q-particles in different syntactic contexts. Such variation can be handled largely by positing that Q-particles may vary in their formal syntactic feature specifications, determining which syntactic contexts they are licensed in.
The unified analysis of Q-particles as denoting variables over choice functions also raises various questions about the proper analysis of interrogatives, indefinites, and disjunctions, including issues concerning the nature of the semantics of wh-words and the syntactic structure of disjunction. As well, I observe that indefinites involving Q-particles have a crosslinguistic tendency to be epistemic indefinites, i.e. indefinites which explicitly signal ignorance of details regarding who or what satisfies the existential claim. I provide an account of such indefinites which draws on the analysis of Q-particles as variables over choice functions. These pragmatic "signals of ignorance" (which I argue to be presuppositions) also have a further role to play in determining the distribution of Q-particles in disjunctions. The final section of this study investigates the historical development of focus constructions and Q-particles in Sinhala. This diachronic study allows us not only to observe the origin and development of such elements, but also serves to delimit the range of possible synchronic analyses, thus providing us with further insights into the formal syntactic and semantic properties of Q-particles. This study highlights both the importance of considering various components of the grammar (e.g. syntax, semantics, pragmatics, morphology) and the use of philology in developing plausible formal analyses of complex linguistic phenomena such as the crosslinguistic distribution of Q-particles.
Abstract:
The basic requirements for an autopilot are fast response and minimum steady-state error for better guidance performance. The highly nonlinear nature of the missile dynamics, due to the severe kinematic and inertial coupling of the missile airframe as well as the aerodynamics, poses a challenge for an autopilot that is required to perform satisfactorily under all flight conditions in probable engagements. Dynamic inversion is a very popular nonlinear controller for this kind of scenario, but its drawback is sensitivity to parameter perturbation. To overcome this problem, a neural network has been used to capture the parameter uncertainty online. The choice of basis function plays the major role in capturing the unknown dynamics. In this paper, many basis functions have been studied for approximating the unknown dynamics. The cosine basis function yielded the best response of all basis functions considered. A neural network with a cosine basis function improved both autopilot performance and robustness compared to dynamic inversion without a neural network.
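To make the role of the basis function concrete, the sketch below fits a cosine expansion to a stand-in "unknown dynamics" map by batch least squares. This is only an illustration of function approximation with a cosine basis; the paper's method is an online neural-network update inside the control loop, and the target function here is invented for the example:

```python
import numpy as np

def cosine_basis(x, n_terms=8):
    """Feature matrix of cosine basis functions on [0, 1]:
    phi_k(x) = cos(k * pi * x), k = 0 .. n_terms - 1."""
    x = np.atleast_1d(x)
    return np.cos(np.pi * np.outer(x, np.arange(n_terms)))

# Fit the expansion to a stand-in unknown map by least squares
# (a batch stand-in for the online weight-update in the paper).
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x)                  # invented "unknown dynamics"
Phi = cosine_basis(x)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
approx = Phi @ coef
```

With only eight cosine terms the fit is already close, which hints at why a well-chosen basis matters for capturing unmodeled dynamics quickly.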
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions to discriminate between them, it is obvious that selecting the optimal forecasting model is challenging. The aim of this thesis is to thoroughly investigate how effective many commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they all can identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts does vary.
That is, QLIKE is identified as the most effective loss function, followed by portfolio variance which is then followed by MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions’ ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
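The two statistical loss functions compared above have standard multivariate forms, sketched below for a covariance forecast H evaluated against a volatility proxy S. The QLIKE form shown (log|H| + tr(H⁻¹S)) is one common variant, chosen here for illustration; the thesis may parameterise it differently:

```python
import numpy as np

def mse_loss(H, S):
    """MSE loss: squared Frobenius distance between the covariance
    forecast H and the volatility proxy S."""
    D = np.asarray(H) - np.asarray(S)
    return float(np.sum(D * D))

def qlike_loss(H, S):
    """A common multivariate QLIKE loss: log|H| + tr(H^{-1} S).
    Its expectation is minimised at H = E[S], which helps keep
    forecast rankings robust to zero-mean noise in the proxy S."""
    _, logdet = np.linalg.slogdet(H)
    return float(logdet + np.trace(np.linalg.solve(H, S)))
```

QLIKE's asymmetric penalty on under- versus over-prediction of variance is one reason it is often found to be the more effective discriminator, consistent with the findings reported above.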