876 results for Theory of Rational Choice


Relevance: 100.00%

Abstract:

Eutrophication of the Baltic Sea is a serious problem. This thesis estimates the benefit to Finns from reduced eutrophication in the Gulf of Finland, the most eutrophied part of the Baltic Sea, by applying the choice experiment method, which belongs to the family of stated preference methods. Because stated preference methods have been criticized, e.g. for their hypothetical survey context, this thesis contributes to the discussion by studying two anomalies that may lead to biased welfare estimates: respondent uncertainty and preference discontinuity. The former refers to the difficulty of stating one's preferences for an environmental good in a hypothetical context. The latter implies a departure from the continuity assumption of conventional consumer theory, which forms the basis for the method and the analysis. In the three essays of the thesis, discrete choice data are analyzed with the multinomial logit and mixed logit models. On average, Finns are willing to contribute to the water quality improvement. The probability of willingness increases with residential or recreational contact with the gulf, higher than average income, younger than average age, and the absence of dependent children in the household. On average, the characteristic of water quality most important to Finns is water clarity, followed by fewer occurrences of blue-green algae. For future nutrient reduction scenarios, the annual mean household willingness-to-pay estimates range from 271 to 448 euros, and the aggregate welfare estimates for Finns range from 28 billion to 54 billion euros, depending on the model and the intensity of the reduction. Of the respondents (N=726), 72.1% state in a follow-up question that they are either Certain or Quite certain about their answer when choosing the preferred alternative in the experiment.
Based on the analysis of other follow-up questions and another sample (N=307), 10.4% of the respondents are identified as potentially having discontinuous preferences. For both anomalies, respondent- and questionnaire-specific variables are found among the underlying causes, and a departure from standard analysis may improve the model fit and the efficiency of estimates, depending on the chosen modeling approach. Introducing uncertainty about the future state of the gulf increases the acceptance of the valuation scenario, which may indicate increased credibility of the proposed scenario. In conclusion, modeling preference heterogeneity is an essential part of the analysis of discrete choice data. The results regarding uncertainty in stating one's preferences and non-standard choice behavior are promising: accounting for these anomalies in the analysis may improve the precision of the estimates of the benefit from reduced eutrophication in the Gulf of Finland.
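The mechanics of the multinomial logit model used in these essays can be illustrated with a short sketch. All coefficients and alternatives below are hypothetical, not the thesis's estimates; the willingness-to-pay calculation (the negative ratio of an attribute coefficient to the cost coefficient) is the standard device for converting logit coefficients into money terms.

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: choice probabilities from deterministic utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical coefficients for water clarity, algae occurrences, annual cost.
beta_clarity, beta_algae, beta_cost = 0.8, -0.5, -0.01

def utility(clarity, algae, cost):
    return beta_clarity * clarity + beta_algae * algae + beta_cost * cost

# Three hypothetical alternatives: status quo and two improvement scenarios,
# each described by (clarity level, algae level, annual cost in euros).
alts = [(0, 3, 0), (1, 2, 150), (2, 1, 300)]
probs = mnl_probabilities([utility(*a) for a in alts])

# Marginal willingness to pay for one unit of clarity: -beta_clarity / beta_cost.
wtp_clarity = -beta_clarity / beta_cost
```

The mixed logit used in the thesis additionally lets the beta coefficients vary randomly across respondents, which is how preference heterogeneity is modeled.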

Relevance: 100.00%

Abstract:

Measurements of the electrical resistivity of thin potassium wires at temperatures near 1 K have revealed a minimum in the resistivity as a function of temperature. By proposing that the electrons in these wires have undergone localization, albeit with a large localization length, and that inelastic-scattering events destroy the coherence of that state, we can explain both the magnitude and the shape of the temperature-dependent resistivity data. Localization of electrons in these wires is to be expected because, owing to the high purity of the potassium, the elastic mean free path is comparable to the diameters of the thinnest samples, making the Thouless length l_T (or inelastic diffusion length) much larger than the diameter, so that the wire is effectively one dimensional. The inelastic events effectively break the wire into a series of localized segments, whose resistances can be added to obtain the total resistance of the wire. The ensemble-averaged resistance over all possible segmented wires, weighted with a Poisson distribution of inelastic-scattering lengths along the wire, yields a length dependence for the resistance proportional to L^3/l_in(T), provided that l_in(T) ≪ L, where L is the sample length and l_in(T) is an effective temperature-dependent one-dimensional inelastic-scattering length. A more sophisticated approach using a Poisson distribution in inelastic-scattering times, which takes into account the diffusive motion of the electrons along the wire through the Thouless length, yields a length- and temperature-dependent resistivity proportional to (L/l_T)^4 under appropriate conditions. Inelastic-scattering lifetimes are inferred from the temperature-dependent bulk resistivities (i.e., those of thicker, effectively three-dimensional samples), assuming that a minimum amount of energy must be exchanged for a collision to be effective in destroying the phase coherence of the localized state.
If the dominant inelastic mechanism is electron-electron scattering, then our result, given an appropriate choice of the channel-number parameter, is consistent with the data. If electron-phason scattering were of comparable importance, our results would remain consistent; however, the inelastic-scattering lifetime inferred from bulk resistivity data would then be too short, because the electron-phason mechanism would dominate the inelastic-scattering rate even though the two mechanisms may be of comparable importance for the bulk resistivity. Possible reasons why the electron-phason mechanism might be less effective in thin wires than in bulk are discussed.
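The segmentation picture can be sketched numerically. The toy model below only illustrates the series addition of phase-coherent segments, with break points drawn from a Poisson process of mean spacing l_in and each segment given the exponentially growing resistance of a localized one-dimensional conductor; it does not reproduce the paper's ensemble-averaged L^3/l_in or (L/l_T)^4 scaling laws, and all parameter values (r0, xi, l_in, L) are arbitrary illustrative choices.

```python
import math
import random

def segment_lengths(L, l_in, rng):
    """Break a wire of length L at Poisson-distributed inelastic events
    with mean spacing l_in; return the resulting segment lengths."""
    lengths, x = [], 0.0
    while x < L:
        step = rng.expovariate(1.0 / l_in)
        lengths.append(min(step, L - x))
        x += step
    return lengths

def wire_resistance(L, l_in, xi, rng, r0=1.0):
    """One realization: phase-coherent segments add in series, each with the
    exponentially growing resistance r0 * (exp(2*l/xi) - 1) characteristic of
    a localized 1D conductor with localization length xi."""
    return sum(r0 * (math.exp(2.0 * l / xi) - 1.0)
               for l in segment_lengths(L, l_in, rng))

rng = random.Random(7)
samples = [wire_resistance(L=100.0, l_in=5.0, xi=40.0, rng=rng)
           for _ in range(2000)]
mean_R = sum(samples) / len(samples)  # ensemble average over segmentations
```

Shortening l_in (stronger inelastic scattering) cuts the wire into shorter segments and suppresses the exponential growth, which is the qualitative mechanism behind the resistivity minimum.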

Relevance: 100.00%

Abstract:

The paper proposes a study of symmetrical and related components based on the theory of linear vector spaces. Using the concept of equivalence, the transformation matrices of Clarke, Kimbark, Concordia, Boyajian and Koga are shown to be column equivalent to Fortescue's symmetrical-component transformation matrix. With a constraint on power, criteria are presented for the choice of bases for the voltage and current vector spaces. In particular, it is shown that, for power invariance, either the same orthonormal (self-reciprocal) basis must be chosen for both the voltage and current vector spaces, or the basis of one must be chosen to be reciprocal to that of the other. The original α, β, 0 components of Clarke are modified to achieve power invariance. For machine analysis, it is shown that invariant transformations lead to reciprocal mutual inductances between the equivalent circuits. The relative merits of the various components are discussed.
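A minimal numerical check of the power-invariance criterion: with the self-reciprocal (orthonormal) scaling 1/sqrt(3), the Fortescue matrix becomes unitary, so the complex power v^H i is the same in phase and sequence coordinates. The sequence voltage and current values below are purely illustrative.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator

# Fortescue transformation: phase quantities = A @ sequence quantities
# (zero, positive, negative sequence ordering).
A = np.array([[1, 1, 1],
              [1, a**2, a],
              [1, a, a**2]], dtype=complex)

# Power-invariant (self-reciprocal) scaling: divide by sqrt(3).
A_pi = A / np.sqrt(3)

# Unitarity check: A_pi^H A_pi = I, the algebraic content of power invariance.
gram = A_pi.conj().T @ A_pi

# Complex power computed in phase and in sequence coordinates agrees.
v_seq = np.array([0.1 + 0j, 1.0, 0.05])          # illustrative sequence voltages
i_seq = np.array([0.2 + 0.1j, 0.9 - 0.2j, 0.0])  # illustrative sequence currents
v_ph, i_ph = A_pi @ v_seq, A_pi @ i_seq
s_phase = v_ph.conj() @ i_ph
s_seq = v_seq.conj() @ i_seq
```

Without the 1/sqrt(3) factor, A^H A = 3I and the sequence-frame power is off by a factor of three, which is exactly why Clarke's original components had to be modified for power invariance.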

Relevance: 100.00%

Abstract:

A disadvantage of multiple-choice tests is that students have incentives to guess. To discourage guessing, it is common to use scoring rules that either penalize wrong answers or reward omissions. These scoring rules are considered equivalent in psychometrics, although experimental evidence has not always been consistent with this claim. We model students' decisions and show, first, that equivalence holds only under risk neutrality and, second, that the two rules can be modified so that they become equivalent even under risk aversion. This paper presents the results of a field experiment in which we analyze the decisions of subjects taking multiple-choice exams. The evidence suggests that differences between scoring rules are due to risk aversion as theory predicts. We also find that the number of omitted items depends on the scoring rule, knowledge, gender and other covariates.
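The risk-aversion mechanism can be sketched in an expected-utility calculation. The specific rules below (penalty rule: +1 right, -1/(m-1) wrong, 0 omit; reward rule: +1 right, 0 wrong, +1/m omit) and the CARA utility function are standard textbook choices, assumed here for illustration rather than taken from the paper.

```python
import math

def cara_utility(x, rho):
    """CARA utility; rho = 0 recovers risk neutrality (linear utility)."""
    return x if rho == 0 else (1.0 - math.exp(-rho * x)) / rho

def eu_guess_penalty(p, m, rho):
    """Penalty rule: +1 if right (prob p), -1/(m-1) if wrong."""
    return p * cara_utility(1.0, rho) + (1 - p) * cara_utility(-1.0 / (m - 1), rho)

def eu_guess_reward(p, m, rho):
    """Reward rule: +1 if right, 0 if wrong."""
    return p * cara_utility(1.0, rho)

def omit_value_penalty(rho):
    return cara_utility(0.0, rho)          # omission scores 0

def omit_value_reward(m, rho):
    return cara_utility(1.0 / m, rho)      # omission is rewarded 1/m

m, p = 4, 0.25  # four options, pure guessing

# Risk neutral: both rules leave a pure guesser exactly indifferent.
rn_pen = eu_guess_penalty(p, m, 0) - omit_value_penalty(0)
rn_rew = eu_guess_reward(p, m, 0) - omit_value_reward(m, 0)

# Risk averse (rho > 0): guessing becomes strictly unattractive, and more so
# under the penalty rule, whose payoff spread is wider.
ra_pen = eu_guess_penalty(p, m, 2.0) - omit_value_penalty(2.0)
ra_rew = eu_guess_reward(p, m, 2.0) - omit_value_reward(m, 2.0)
```

The gap between ra_pen and ra_rew is the kind of behavioral difference between rules that the field experiment attributes to risk aversion.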

Relevance: 100.00%

Abstract:

There is a growing body of experimental evidence suggesting that people often deviate from the predictions of game theory. Some scholars attempt to explain the observations by introducing errors into behavioral models. However, most of these modifications are situation dependent and do not generalize. A new theory, called the rational novice model, is introduced as an attempt to provide a general theory that takes account of erroneous behavior. The rational novice model is based on two central principles. The first is that people systematically make inaccurate guesses when they are evaluating their options in a game-like situation. The second is that people treat their decisions as a portfolio problem. As a result, actions that are non-optimal in a game-theoretic sense may be included in the rational novice strategy profile with positive weights.

The rational novice model can be divided into two parts: the behavioral model and the equilibrium concept. In a theoretical chapter, the mathematics of the behavioral model and the equilibrium concept are introduced. The existence of the equilibrium is established. In addition, the Nash equilibrium is shown to be a special case of the rational novice equilibrium. In another chapter, the rational novice model is applied to a voluntary contribution game. Numerical methods were used to obtain the solution. The model is estimated with data obtained from the Palfrey and Prisbrey experimental study of the voluntary contribution game. It is found that the rational novice model explains the data better than the Nash model. Although a formal statistical test was not used, pseudo R^2 analysis indicates that the rational novice model is better than a Probit model similar to the one used in the Palfrey and Prisbrey study.

The rational novice model is also applied to a first price sealed bid auction. Again, computing techniques were used to obtain a numerical solution. The model is estimated with data from the Chen and Plott study. The rational novice model outperforms the CRRAM, the primary Nash model in that study. However, the rational novice model is not the best amongst all models: a sophisticated rule of thumb, called the SOPAM, offers the best explanation of the data.
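The two principles above can be caricatured in a few lines of code. This is not the authors' model: it is a crude stand-in in which a player re-evaluates each action's payoff with Gaussian noise and then spreads her play logit-style across actions, so non-optimal actions receive positive weight; all payoff values and parameters are hypothetical.

```python
import math
import random

def portfolio_weights(values, noise_sd, precision, rng, draws=5000):
    """Average logit weights over noisy re-evaluations of each action's value:
    inaccurate guesses (principle one) plus portfolio-style mixing (principle
    two) put positive weight on every action."""
    m = len(values)
    acc = [0.0] * m
    for _ in range(draws):
        noisy = [v + rng.gauss(0.0, noise_sd) for v in values]
        exps = [math.exp(precision * v) for v in noisy]
        z = sum(exps)
        for i, e in enumerate(exps):
            acc[i] += e / z
    return [a / draws for a in acc]

rng = random.Random(0)
values = [1.0, 0.6, 0.2]  # hypothetical payoffs; action 0 is optimal
w = portfolio_weights(values, noise_sd=0.5, precision=2.0, rng=rng)

# As noise vanishes and precision grows, all weight concentrates on the best
# reply, mirroring the abstract's claim that Nash equilibrium is a special
# case of the rational novice equilibrium.
w_sharp = portfolio_weights(values, noise_sd=0.0, precision=50.0, rng=rng, draws=1)
```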

Relevance: 100.00%

Abstract:

Open source projects are networks of developers, distributors and end-users of non-proprietary created knowledge goods. It has been argued that this form of organization has some advantages over firm or market coordination. I show that for sufficiently convex and modular projects, proprietary licences are not able to sustain sequential knowledge production, which can nevertheless be carried out if the project is run on an open source basis.

Relevance: 100.00%

Abstract:

Although the theory of planned behaviour (TPB) has been applied successfully in the area of food choice, it has been criticized for its purely utilitarian approach to the factors determining behaviour. Despite the increase in the predictive power of the model with added components such as affective attitude and moral and ethical concerns, in most studies the elicitation process still addresses only people's utilitarian beliefs about the behaviour, with little attention paid to other aspects. This study compares the traditional method of eliciting advantages and disadvantages with two other methods (word association and open-ended) in the elicitation of beliefs, attitudes and moral concerns in relation to the consumption of organic foods. Results show the traditional method to be best for eliciting cognitive beliefs, the open-ended emotion task best for eliciting emotional beliefs, and the open-ended beliefs task best for eliciting moral concerns. The advantages and disadvantages of each method are discussed.

Relevance: 100.00%

Abstract:

Statistical techniques are fundamental in science, and linear regression analysis is perhaps one of the most widely used methodologies. It is well known from the literature that, under certain conditions, linear regression is an extremely powerful statistical tool. Unfortunately, in practice some of those conditions are rarely satisfied and the regression models become ill-posed, making the application of traditional estimation methods unfeasible. This work presents some contributions to maximum entropy theory in the estimation of ill-posed models, in particular in the estimation of linear regression models with small samples affected by collinearity and outliers. The research is developed along three lines, namely the estimation of technical efficiency with state-contingent production frontiers, the estimation of the ridge parameter in ridge regression and, finally, new developments in maximum entropy estimation. In the estimation of technical efficiency with state-contingent production frontiers, the work shows better performance of maximum entropy estimators relative to the maximum likelihood estimator. This good performance is notable in models with few observations per state and in models with a large number of states, which are commonly affected by collinearity. It is hoped that the use of maximum entropy estimators will contribute to the much-desired increase in empirical work with these production frontiers. In ridge regression, the greatest challenge is the estimation of the ridge parameter. Although numerous procedures are available in the literature, none outperforms all the others. This work proposes a new estimator of the ridge parameter that combines ridge trace analysis with maximum entropy estimation.
The results obtained in the simulation studies suggest that this new estimator is one of the best procedures available in the literature for estimating the ridge parameter. The Leuven maximum entropy estimator is based on the least squares method, on Shannon entropy and on concepts from quantum electrodynamics. It overcomes the main criticism levelled at the generalized maximum entropy estimator, since it dispenses with the supports for the parameters and errors of the regression model. This work presents new contributions to maximum entropy theory in the estimation of ill-posed models, based on the Leuven maximum entropy estimator, information theory and robust regression. The estimators developed show good performance in linear regression models with small samples affected by collinearity and outliers. Finally, some computational code for maximum entropy estimation is presented, thus adding to the scarce computational resources currently available.
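The ridge-regression setting the thesis addresses can be sketched as follows. The maximum-entropy choice of the ridge parameter is the thesis's own contribution and is not reproduced here; this sketch only shows the standard ridge estimator (X'X + kI)^(-1) X'y and the ridge trace over a grid of k values, on synthetic near-collinear data.

```python
import numpy as np

def ridge_estimates(X, y, ks):
    """Ridge trace: coefficient estimates (X'X + k I)^(-1) X'y over a grid of k."""
    p = X.shape[1]
    XtX, Xty = X.T @ X, X.T @ y
    return {k: np.linalg.solve(XtX + k * np.eye(p), Xty) for k in ks}

# Synthetic small sample with two nearly collinear regressors.
rng = np.random.default_rng(42)
n = 30
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)   # near-perfect collinearity
X = np.column_stack([x1, x2])
y = x1 + x2 + 0.1 * rng.normal(size=n)

trace = ridge_estimates(X, y, ks=[0.0, 0.1, 1.0, 10.0])
norms = {k: np.linalg.norm(b) for k, b in trace.items()}  # shrinkage with k
```

Inspecting where the trace stabilizes is the classical heuristic for picking k; the thesis combines this inspection with a maximum entropy criterion to make the choice automatic.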

Relevance: 100.00%

Abstract:

The law regulating the availability of abortion is problematic both legally and morally. It is dogmatic in its requirements of women and doctors and ignores would-be fathers. In practice its application is liberal - with s1(1)(a) of the Abortion Act 1967 treated as a 'catch-all' ground, it allows abortion on demand - yet this is not reflected in the 'law'. Against this outdated legislation I propose a model of autonomy which seeks to tether our moral concerns to a new legal approach to abortion. I do so by maintaining that a legal conception of autonomy is derivable from the categorical imperative resulting from Gewirth's argument to the Principle of Generic Consistency: act in accordance with the generic rights of your recipients as well as of yourself. This model of Gewirthian Rational Autonomy, I suggest, provides a guide for both public and private notions of autonomy and for how our autonomous interests can be balanced across social structures in order to legitimately empower choice. I claim, ultimately, that the relevant rights in the context of abortion are derivable from this model.

Relevance: 100.00%

Abstract:

This paper examines the role of higher-order moments in portfolio choice within an expected-utility framework. We consider two-, three-, four- and five-parameter density functions for portfolio returns and derive exact conditions under which investors would all be optimally plungers rather than diversifiers. Through comparative statics we show the importance of higher-order risk preference properties, such as riskiness, prudence and temperance, in determining plunging behaviour. Empirical estimates for the S&P500 provide evidence for the optimality of diversification.
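The role of the higher-order preference properties named here can be illustrated with the standard fourth-order Taylor expansion of expected utility around mean wealth, using CRRA utility. This expansion and the CRRA choice are common textbook devices, assumed for illustration; the paper itself works with parametric return densities rather than this approximation.

```python
def crra_derivatives(w, gamma):
    """First four derivatives of CRRA utility u(w) = w^(1-gamma)/(1-gamma),
    for gamma != 1. Signs encode risk aversion (u''<0), prudence (u'''>0)
    and temperance (u''''<0)."""
    u1 = w ** (-gamma)
    u2 = -gamma * w ** (-gamma - 1)
    u3 = gamma * (gamma + 1) * w ** (-gamma - 2)
    u4 = -gamma * (gamma + 1) * (gamma + 2) * w ** (-gamma - 3)
    return u1, u2, u3, u4

def approx_expected_utility(mu, var, m3, m4, gamma):
    """Fourth-order expansion around the mean:
    E u(W) ~ u(mu) + u''(mu) var/2 + u'''(mu) m3/6 + u''''(mu) m4/24,
    where m3 and m4 are the third and fourth central moments."""
    u0 = mu ** (1 - gamma) / (1 - gamma)
    _, u2, u3, u4 = crra_derivatives(mu, gamma)
    return u0 + u2 * var / 2 + u3 * m3 / 6 + u4 * m4 / 24

gamma = 3.0
_, u2, u3, u4 = crra_derivatives(1.0, gamma)
```

With these signs, variance and kurtosis lower expected utility while positive skewness raises it, which is the channel through which higher-order moments shape the plunging-versus-diversifying decision.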

Relevance: 100.00%

Abstract:

The theory of fractional calculus (FC) is a useful mathematical tool in many applied sciences. Nevertheless, only in recent decades have researchers been motivated to adopt FC concepts. There are several reasons for this state of affairs, namely the co-existence of different definitions and interpretations, and the need for approximation methods for the real-time calculation of fractional derivatives (FDs). In the first part, this paper introduces a probabilistic interpretation of the fractional derivative based on the Grünwald-Letnikov definition. In the second part, the calculation of fractional derivatives through Padé fraction approximations is analyzed. It is observed that the probabilistic interpretation and the frequency response of fraction approximations of FDs reveal a clear correlation between the two concepts.
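The Grünwald-Letnikov definition mentioned here lends itself directly to computation: the order-alpha derivative is a weighted sum of past function values, with generalized binomial weights (-1)^k C(alpha, k). The probabilistic interpretation rests on these weights acting like a distribution over past samples. Step size, truncation length and test functions below are arbitrary illustrative choices.

```python
import math

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k C(alpha, k), via the recursion
    w_0 = 1, w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def gl_derivative(f, x, alpha, h=1e-3, n=2000):
    """Truncated GL fractional derivative of order alpha at x:
    D^alpha f(x) ~ h^(-alpha) * sum_k w_k f(x - k h)."""
    w = gl_weights(alpha, n)
    return sum(wk * f(x - k * h) for k, wk in enumerate(w)) / h ** alpha

# alpha = 1 recovers the backward first difference: D^1 (t^2) at t = 1 is 2.
d1 = gl_derivative(lambda t: t * t, 1.0, alpha=1.0)

# Known closed form: the half-derivative of f(t) = t is 2*sqrt(t/pi),
# so at t = 1 it should approach 2/sqrt(pi).
d_half = gl_derivative(lambda t: t if t > 0 else 0.0, 1.0, alpha=0.5)
```

For 0 < alpha < 1 the weight w_0 is positive and the remaining weights are negative with total mass tending to -1, which is the structure the paper's probabilistic reading exploits.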

Relevance: 100.00%

Abstract:

The present study tested the applicability of Ajzen's (1985) theory of planned behaviour (TPB), an extension of Fishbein and Ajzen's (1975) theory of reasoned action (TRA), for the first time in the context of abused women's decisions to leave their abusive relationships. The TPB, as a means of predicting women's decisions to leave their abusive partners, was drawn from Strube's (1988, 1991) proposed decision-making model, which holds that the decision-making process is a rational, deliberative one and that, regardless of outcome, the decision results from a logical assessment of the available data. As a means of predicting behaviours not under volitional control, Ajzen's (1985) TPB incorporated a measure of perceived behavioural control. Data were collected in two phases, 6 months to 1 year apart. It was hypothesized that, to the extent that an abused woman held positive attitudes, subjective norms conducive to leaving, and perceived control over leaving, she would form an intention to leave and thus increase the likelihood of actually leaving her partner. Furthermore, it was expected that perceptions of control would predict leaving behaviour over and above attitude and subjective norm. In addition, severity and frequency of abuse were assessed, as were demographic variables. The TPB failed to account significantly for variability in either intentions or leaving behaviour. All of the explained variance was attributed to the variables associated with the theory of reasoned action, with social influence emerging as the strongest predictor of a woman's intentions. The poor performance of the model is attributed to measurement problems with aspects of attitude and perceived control, as well as a lack of power due to the small sample size. The insufficiency of perceived control to predict behaviour also suggests that, on the surface at least, other factors may be at work in this context.
Implications of these results are discussed, along with recommendations for future research on decision-making: obtaining representative samples, including self-esteem and emotions as predictor variables in the model, re-evaluating the target behaviour as nonvolitional, and conducting longitudinal studies spanning a longer time period.

Relevance: 100.00%

Abstract:

This paper presents a new theory of random consumer demand. The primitive is a collection of probability distributions, rather than a binary preference. Various assumptions constrain these distributions, including analogues of common assumptions about preferences such as transitivity, monotonicity and convexity. Two results establish a complete representation of theoretically consistent random demand. The purpose of this theory of random consumer demand is application to empirical consumer demand problems. To this end, the theory has several desirable properties. It is intrinsically stochastic, so the econometrician can apply it directly without adding extrinsic randomness in the form of residuals. Random demand is parsimoniously represented by a single function on the consumption set. Finally, we have a practical method for statistical inference based on the theory, described in McCausland (2004), a companion paper.

Relevance: 100.00%

Abstract:

McCausland (2004a) describes a new theory of random consumer demand. Theoretically consistent random demand can be represented by a "regular" "L-utility" function on the consumption set X. The present paper is about Bayesian inference for regular L-utility functions. We express prior and posterior uncertainty in terms of distributions over the infinite-dimensional parameter set of a flexible functional form. We propose a class of proper priors on the parameter set. The priors are flexible, in the sense that they put positive probability in the neighborhood of any L-utility function that is regular on a large subset bar(X) of X; and regular, in the sense that they assign zero probability to the set of L-utility functions that are irregular on bar(X). We propose methods of Bayesian inference for an environment with indivisible goods, leaving the more difficult case of infinitely divisible goods for another paper. We analyse individual choice data from a consumer experiment described in Harbaugh et al. (2001).

Relevance: 100.00%

Abstract:

This paper makes some steps toward a formal political economy of environmental policy. Economists' quasi-unanimous preference for sophisticated incentive regulation is reconsidered. First, we recast the question of instrument choice within the general mechanism-design literature and provide an incomplete-contract approach to political economy. Then, in various settings, we show why constitutional constraints on the instruments of environmental policy may be desirable, even though they appear inefficient from a purely standard economic viewpoint.