101 results for random utility
Abstract:
In this paper we consider an insider with privileged information that is affected by an independent noise vanishing as the revelation time approaches. At this time, information is available to every trader. Our financial markets are based on Wiener space. In probabilistic terms we obtain an infinite-dimensional extension of Jacod's theorem to cover cases of progressive enlargement of filtrations. The application of this result gives the semimartingale decomposition of the original Wiener process under the progressively enlarged filtration. As an application we prove that if the rate at which the additional noise in the insider's information vanishes is slow enough, then there is no arbitrage and the additional utility of the insider is finite.
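For orientation, decompositions of this kind typically take the following shape (a generic sketch in our own notation; the paper's specific drift depends on the structure of the vanishing noise and is not reproduced here):

\[
W_t \;=\; \widetilde{W}_t \;+\; \int_0^t \mu_s \, ds ,
\]

where $W$ is the original Wiener process, $\widetilde{W}$ is a Wiener process with respect to the insider's enlarged filtration, and the information drift $\mu$ captures the insider's advantage. In this literature the insider's additional logarithmic utility is typically related to $\tfrac{1}{2}\,\mathbb{E}\int_0^T \mu_s^2 \, ds$, so its finiteness hinges on the integrability of $\mu$.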
Abstract:
Random coefficient regression models have been applied in different fields and they constitute a unifying setup for many statistical problems. The nonparametric study of this model started with Beran and Hall (1992) and it has become a fruitful framework. In this paper we propose and study statistics for testing a basic hypothesis concerning this model: the constancy of coefficients. The asymptotic behavior of the statistics is investigated and bootstrap approximations are used in order to determine the critical values of the test statistics. A simulation study illustrates the performance of the proposals.
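As a point of reference, the simple-regression version of the model studied by Beran and Hall (1992) can be written as follows (schematic notation of our own; the paper's exact formulation may differ):

\[
Y_i \;=\; \alpha_i + \beta_i X_i , \qquad i = 1, \dots, n ,
\]

where the pairs $(\alpha_i, \beta_i)$ are unobserved random coefficients, i.i.d. and independent of the $X_i$. The constancy hypothesis then amounts to degeneracy of the coefficient distribution, e.g. $H_0\colon \operatorname{Var}(\beta_i) = 0$.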
Abstract:
This paper generalizes the original random matching model of money by Kiyotaki and Wright (1989) (KW) in two aspects: first, the economy is characterized by an arbitrary distribution of agents who specialize in producing a particular consumption good; and second, these agents have preferences such that they want to consume any good with some probability. The results depend crucially on the size of the fraction of producers of each good and the probability with which different agents want to consume each good. KW and other related models are shown to be parameterizations of this more general one.
Abstract:
Expected utility theory (EUT) has been challenged as a descriptive theory in many contexts. The medical decision analysis context is not an exception. Several researchers have suggested that rank-dependent utility theory (RDUT) may accurately describe how people evaluate alternative medical treatments. Recent research in this domain has addressed a relevant feature of RDU models, probability weighting, but to date no direct test of this theory has been made. This paper provides a test of the main axiomatic difference between EUT and RDUT when health profiles are used as outcomes of risky treatments. Overall, EU best described the data. However, evidence of the editing and cancellation operations hypothesized in Prospect Theory and Cumulative Prospect Theory was apparent in our study. We found that RDU outperformed EU in the presentation of the risky treatment pairs in which the common outcome was not obvious. The influence of framing effects on the performance of RDU and their importance as a topic for future research are discussed.
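For readers comparing the two theories, the standard rank-dependent evaluation of a lottery with outcomes ranked from worst to best, $x_1 \preceq \cdots \preceq x_n$, with probabilities $p_1, \dots, p_n$, is (standard textbook form, not necessarily the paper's notation):

\[
\mathrm{RDU} \;=\; \sum_{i=1}^{n} \left[ w\!\Big(\sum_{j=i}^{n} p_j\Big) - w\!\Big(\sum_{j=i+1}^{n} p_j\Big) \right] u(x_i) ,
\]

where $w$ is a strictly increasing probability weighting function with $w(0)=0$ and $w(1)=1$; EU is recovered exactly when $w$ is the identity.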
Abstract:
Confidence in decision making is an important dimension of managerial behavior. However, what is the relation between confidence, on the one hand, and the fact of receiving or expecting to receive feedback on decisions taken, on the other? To explore this and related issues in the context of everyday decision making, use was made of the ESM (Experience Sampling Method) to sample decisions taken by undergraduates and business executives. For several days, participants received 4 or 5 SMS messages daily (on their mobile telephones) at random moments, at which point they completed brief questionnaires about their current decision-making activities. Issues considered here include differences between the types of decisions faced by the two groups, their structure, feedback (received and expected), and confidence in decisions taken as well as in the validity of feedback. No relation was found between confidence in decisions and whether participants received or expected to receive feedback on those decisions. In addition, although participants are clearly aware that feedback can provide both confirming and disconfirming evidence, their ability to specify appropriate feedback is imperfect. Finally, difficulties experienced in using the ESM are discussed, as are possibilities for further research using this methodology.
Abstract:
We characterize the prekernel of NTU games by means of consistency, converse consistency, and five axioms of the Nash type on bilateral problems. The intersection of the prekernel and the core is also characterized with the same axioms over the class of games where the core is nonempty.
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those that use weights that involve area-specific estimates of bias and variance; and (b) those that use weights that involve a common variance and a common squared-bias estimate for all the areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
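To make the distinction concrete, a composite estimator for area $a$ has the generic form (illustrative notation, not taken from the paper):

\[
\hat{\theta}_a^{\,C} \;=\; w_a\, \hat{\theta}_a^{\,\mathrm{dir}} + (1 - w_a)\, \hat{\theta}_a^{\,\mathrm{ind}} , \qquad 0 \le w_a \le 1 ,
\]

where type (a) estimators let $w_a$ depend on area-specific bias and variance estimates, while type (b) estimators build all the $w_a$ from a common variance and a common squared-bias estimate.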
Abstract:
This article introduces a model of rationality that combines procedural utility over actions with consequential utility over payoffs. It applies the model to the Prisoner's Dilemma and shows that empirically observed cooperative behaviors can be rationally explained by a procedural utility for cooperation. The model characterizes the situations in which cooperation emerges as a Nash equilibrium. When rational individuals are not solely concerned with the consequences of their behavior but also care about the process by which these consequences are obtained, there is no single rational solution to a Prisoner's Dilemma. Rational behavior depends on the payoffs at stake and on the procedural utility of individuals. In this manner, this model of procedural utility reflects how ethical considerations, social norms or emotions can transform a game of consequences.
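A minimal sketch of the idea, under the assumption (ours, for illustration; not the article's formal model) that the two utilities combine additively and that the procedural term is a fixed bonus theta for choosing to cooperate; the payoff numbers are a standard Prisoner's Dilemma:

    # Illustrative sketch: total utility = consequential PD payoff plus an
    # additive procedural bonus `theta` for choosing to cooperate.
    import itertools

    ACTIONS = ("C", "D")
    PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

    def total_utility(own, other, theta):
        """Consequential payoff plus procedural utility for cooperating."""
        return PAYOFF[(own, other)] + (theta if own == "C" else 0.0)

    def pure_nash_equilibria(theta):
        """Pure-strategy Nash equilibria of the symmetric two-player game."""
        eqs = []
        for a1, a2 in itertools.product(ACTIONS, repeat=2):
            best1 = all(total_utility(a1, a2, theta) >= total_utility(d, a2, theta)
                        for d in ACTIONS)
            best2 = all(total_utility(a2, a1, theta) >= total_utility(d, a1, theta)
                        for d in ACTIONS)
            if best1 and best2:
                eqs.append((a1, a2))
        return eqs

    print(pure_nash_equilibria(0.0))  # [('D', 'D')]: the classic dilemma
    print(pure_nash_equilibria(2.5))  # [('C', 'C')]: cooperation sustained

In this parameterization, once the procedural bonus exceeds the gain from unilateral defection (here 5 - 3 = 2), mutual cooperation becomes the unique pure equilibrium, which is the kind of payoff-dependent characterization the abstract describes.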
Abstract:
This paper argues that any specific utility or disutility for gambling must be excluded from expected utility because such a theory is consequential, while a pleasure or displeasure for gambling is a matter of process, not of consequences. A (dis)utility for gambling is modeled as a process utility which monotonically combines with expected utility restricted to consequences. This allows a process (dis)utility for gambling to be revealed. As an illustration, the model shows how empirical observations in the Allais paradox can reveal a process disutility of gambling. A more general model of rational behavior combining processes and consequences is then proposed and discussed.
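Schematically, and in our own notation rather than the paper's, the evaluation of a lottery $P$ would take the form

\[
V(P) \;=\; \Phi\big(\mathrm{EU}(P),\, g(P)\big) ,
\]

where $\mathrm{EU}(P)$ is expected utility computed on consequences alone, $g(P)$ is the process (dis)utility attached to gambling itself, and $\Phi$ is monotone in both arguments; observed deviations from EU, such as Allais-type choices, can then be attributed to $g$.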
Abstract:
This paper proposes an exploration of the methodology of utility functions that distinguishes interpretation from representation. While representation univocally assigns numbers to the entities of the domain of utility functions, interpretation relates these entities with empirically observable objects of choice. This allows us to make explicit the standard interpretation of utility functions, which assumes that two objects have the same utility if and only if the individual is indifferent between them. We explore the underlying assumptions of such a hypothesis and propose a non-standard interpretation according to which objects of choice have a well-defined utility although individuals may vary in the way they treat these objects in a specific context. We provide examples of such a methodological approach that may explain some reversals of preferences and suggest possible mathematical formulations for further research.
Abstract:
This paper proposes a common and tractable framework for analyzing different definitions of fixed and random effects in a constant-slope, variable-intercept model. It is shown that, regardless of whether effects (i) are treated as parameters or as an error term, (ii) are estimated in different stages of a hierarchical model, or (iii) are allowed to be correlated with the regressors, when the same information on effects is introduced into all estimation methods, the resulting slope estimator is also the same across methods. If different methods produce different results, it is ultimately because different information is being used for each method.
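The model in question can be written in a standard form such as (our notation; the paper may index and parameterize it differently):

\[
y_{it} \;=\; \alpha_i + \beta\, x_{it} + \varepsilon_{it} , \qquad i = 1, \dots, N , \; t = 1, \dots, T ,
\]

where $\beta$ is the common (constant) slope and the $\alpha_i$ are unit-specific intercepts, treated either as fixed parameters or as random draws, which is exactly where the fixed- versus random-effects definitions diverge.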
Abstract:
Recently, several anonymization algorithms have appeared for privacy preservation on graphs. Some of them are based on randomization techniques and others on k-anonymity concepts. Both can be used to obtain an anonymized graph with a given k-anonymity value. In this paper we compare algorithms based on both techniques in order to obtain an anonymized graph with a desired k-anonymity value, analyzing the complexity of these methods in generating anonymized graphs and the quality of the resulting graphs.
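As a concrete reference point, one widely used instantiation of k-anonymity on graphs is k-degree anonymity: every degree value must be shared by at least k vertices. The compared algorithms may use other k-anonymity notions; the following minimal sketch only makes the quantity concrete:

    # k-degree anonymity: the largest k such that every degree value in the
    # graph occurs at least k times.
    from collections import Counter

    def k_degree_anonymity(degrees):
        """Largest k such that every degree value occurs at least k times."""
        return min(Counter(degrees).values())

    # A 4-cycle (all degrees 2) is 4-degree-anonymous; a 4-vertex star is
    # only 1-degree-anonymous because the hub's degree is unique.
    print(k_degree_anonymity([2, 2, 2, 2]))  # 4
    print(k_degree_anonymity([3, 1, 1, 1]))  # 1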
Abstract:
In this paper we analyse, using Monte Carlo simulation, the possible consequences of incorrect assumptions about the true structure of the random-effects covariance matrix and the true correlation pattern of the residuals on the performance of an estimation method for nonlinear mixed models. The procedure under study is the well-known linearization method due to Lindstrom and Bates (1990), implemented in the nlme library of S-Plus and R. Its performance is studied in terms of bias, mean square error (MSE), and true coverage of the associated asymptotic confidence intervals. Setting aside other criteria, such as the convenience of avoiding over-parameterised models, it appears worse to erroneously assume some structure than to assume no structure when the latter would be adequate.
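The three performance criteria have standard Monte Carlo estimates; over $R$ simulated replications yielding estimates $\hat{\theta}^{(r)}$ of a true value $\theta$ (notation ours, not taken from the paper):

\[
\widehat{\mathrm{Bias}} = \frac{1}{R}\sum_{r=1}^{R}\hat{\theta}^{(r)} - \theta , \qquad
\widehat{\mathrm{MSE}} = \frac{1}{R}\sum_{r=1}^{R}\big(\hat{\theta}^{(r)} - \theta\big)^{2} , \qquad
\widehat{\mathrm{Coverage}} = \frac{1}{R}\sum_{r=1}^{R}\mathbf{1}\big\{\theta \in \mathrm{CI}^{(r)}\big\} ,
\]

where $\mathrm{CI}^{(r)}$ is the asymptotic confidence interval computed in replication $r$.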
Abstract:
Using the once and thrice energy-weighted moments of the random-phase-approximation strength function, we have derived compact expressions for the average energy of surface collective oscillations of clusters and spheres of metal atoms. The L=0 volume mode has also been studied. We have carried out quantal and semiclassical calculations for Na and Ag systems in the spherical-jellium approximation. We present a rather thorough discussion of surface diffuseness and quantal size effects on the resonance energies.
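For context, the energy-weighted moments of a strength function $S(E)$ and the average-energy estimator built from the once ($k=1$) and thrice ($k=3$) energy-weighted moments are conventionally written as (standard sum-rule notation; the paper's compact expressions are not reproduced here):

\[
m_k \;=\; \int_0^{\infty} E^{\,k}\, S(E)\, dE , \qquad \bar{E} \;\simeq\; \sqrt{\frac{m_3}{m_1}} .
\]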