4 results for evidence-in-chief

in CaltechTHESIS


Relevance: 90.00%

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and higher model complexity. There is now a plethora of models, based on different assumptions and applicable in different contextual settings, and selecting the right model tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We first look at evidence from controlled laboratory experiments, in which subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate these choices differently and make distinct predictions about the subjects' choices, so theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests; this imposes computational and economic constraints on classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn determines the next most informative test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees relative to the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
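As a rough illustration of the adaptive loop described above, here is a minimal sketch, not the thesis's implementation: it treats each theory as its own equivalence class, uses deterministic binary predictions (so the EC2 objective reduces to the posterior-weighted mass of disagreeing theory pairs, since those edges are cut whichever outcome occurs), and assumes a fixed subject error rate. The theory names, uniform prior, and 10% error rate are all illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 competing theories, 20 candidate binary choice
# tests (subject picks lottery A or B). predictions[h, t] is the outcome
# theory h predicts on test t.
theories = ["expected_value", "prospect", "CRRA", "moments"]
n_tests = 20
predictions = rng.integers(0, 2, size=(len(theories), n_tests))
posterior = np.full(len(theories), 1.0 / len(theories))  # uniform prior

def edges_cut(post, t):
    """Posterior-weighted mass of edges between theories that disagree on
    test t. With deterministic binary predictions these edges are cut
    whichever outcome occurs, so this equals the expected EC2 objective."""
    return sum(post[i] * post[j]
               for i, j in itertools.combinations(range(len(theories)), 2)
               if predictions[i, t] != predictions[j, t])

def update(post, t, outcome, eps=0.1):
    """Bayesian update with error rate eps: each theory assigns probability
    1 - eps to its predicted outcome and eps to the other."""
    like = np.where(predictions[:, t] == outcome, 1.0 - eps, eps)
    post = post * like
    return post / post.sum()

true_theory = 1  # pretend "prospect" generated the data
remaining = set(range(n_tests))
for _ in range(8):
    t = max(remaining, key=lambda t: edges_cut(posterior, t))  # greedy step
    remaining.discard(t)
    outcome = predictions[true_theory, t]
    if rng.random() < 0.1:            # simulate an occasional subject error
        outcome = 1 - outcome
    posterior = update(posterior, t, outcome)

print(dict(zip(theories, np.round(posterior, 3))))
```

In the actual BROAD procedure, hypotheses are parameterized theories grouped into equivalence classes, and the adaptive submodularity of EC2 is what licenses the accelerated (lazy) greedy evaluation mentioned above.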

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and are sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of the CRRA and moments models. Classifying the subjects into types shows that most subjects are best described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility of strategic manipulation: subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out both because it is infeasible in practice and because we find no signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and for hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
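For reference, the standard functional forms of these families, as usually parameterized in the literature (the thesis's exact parameterization, e.g. its (α, β) quasi-hyperbolic form, may differ), are

\[
D_{\mathrm{exp}}(t) = \delta^{t}, \qquad
D_{\mathrm{hyp}}(t) = \frac{1}{1 + k t}, \qquad
D_{\mathrm{qh}}(t) = \begin{cases} 1, & t = 0 \\ \beta \, \delta^{t}, & t > 0 \end{cases}, \qquad
D_{\mathrm{gh}}(t) = (1 + \alpha t)^{-\beta/\alpha},
\]

with $\delta, \beta \in (0, 1]$ and $k, \alpha > 0$; present bias corresponds to $\beta < 1$ in the quasi-hyperbolic form.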

In all of these models the passage of time is treated as linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and time-inconsistent choice.
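One standard illustration of how randomness of this kind generates hyperbolic-family discounting (a textbook mixture argument, not necessarily the thesis's construction): if an exponential discounter's effective rate $r$ is random with a Gamma$(k, \theta)$ distribution, then

\[
D(t) = \mathbb{E}\!\left[ e^{-r t} \right] = (1 + \theta t)^{-k},
\]

which is exactly the generalized-hyperbolic form above and is time-inconsistent, even though every realization of $r$ corresponds to time-consistent exponential discounting.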

We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than its price elasticity alone can explain. More importantly, when the item is no longer discounted, demand for its close substitutes will increase disproportionately. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
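A minimal sketch of the kind of discrete choice model described here, assuming a multinomial logit with a reference-price term; the parameter names, values, and the max-based gain/loss split are illustrative, not the thesis's specification.

```python
import numpy as np

def choice_probs(prices, ref_prices, alpha=1.0, eta=0.2, lam=0.6):
    """Multinomial-logit choice probabilities with reference-dependent
    utility: gains (price below reference) and losses (price above
    reference) enter asymmetrically; lam > eta indicates loss aversion."""
    gains = np.maximum(ref_prices - prices, 0.0)
    losses = np.maximum(prices - ref_prices, 0.0)
    utility = -alpha * prices + eta * gains - lam * losses
    expu = np.exp(utility - utility.max())   # numerically stable softmax
    return expu / expu.sum()

# Two substitute items: item 0 just came off a discount (consumers'
# reference price is anchored at the sale level), item 1 priced normally.
prices = np.array([10.0, 10.0])
ref_prices = np.array([8.0, 10.0])
print(choice_probs(prices, ref_prices))      # demand shifts toward item 1
```

Here the post-discount price is coded as a loss relative to the discounted reference price, which shifts predicted demand toward the substitute, the pattern the abstract describes.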

In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance: 90.00%

Abstract:

In Part I, we construct a symmetric stress-energy-momentum pseudo-tensor for the gravitational fields of Brans-Dicke theory, and use this to establish rigorously conserved integral expressions for the energy-momentum $P_i$ and angular momentum $J_{ik}$. Application of the two-dimensional surface integrals to the exact static spherical vacuum solution of Brans leads to an identification of our conserved mass with the active gravitational mass. Application to the distant fields of an arbitrary stationary source reveals that $P_i$ and $J_{ik}$ have the same physical interpretation as in general relativity. For gravitational waves whose wavelength is small on the scale of the background radius of curvature, averaging over several wavelengths in the Brill-Hartle-Isaacson manner produces a stress-energy-momentum tensor for gravitational radiation which may be used to calculate the changes in the $P_i$ and $J_{ik}$ of their source.
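For context, the Brans-Dicke theory referenced throughout is defined (in its standard form; the thesis's conventions may differ) by the action

\[
S = \frac{1}{16\pi} \int d^{4}x \, \sqrt{-g} \left( \phi R - \frac{\omega}{\phi} \, g^{\mu\nu} \, \partial_{\mu}\phi \, \partial_{\nu}\phi \right) + S_{\text{matter}},
\]

where the scalar field $\phi$ plays the role of an inverse gravitational constant and $\omega$ is the dimensionless Brans-Dicke coupling; general relativity is recovered in the limit $\omega \to \infty$.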

In Part II, we develop strong evidence in favor of a conjecture by Penrose: that, in the Brans-Dicke theory, relativistic gravitational collapse in three dimensions produces black holes identical to those of general relativity. After pointing out that any black hole solution of general relativity also satisfies Brans-Dicke theory (with a constant scalar field, the vacuum field equations reduce to Einstein's), we establish the Schwarzschild and Kerr geometries as the only possible spherical and axially symmetric black hole exteriors, respectively. Also, we show that a Schwarzschild geometry is necessarily formed in the collapse of an uncharged sphere.

Appendices discuss relationships among relativistic gravity theories and an example of a theory in which black holes do not exist.

Relevance: 80.00%

Abstract:

This work concerns itself with the possibility of solutions, both cooperative and market-based, to pollution abatement problems. In particular, we are interested in pollutant emissions in Southern California and possible solutions to the abatement problems enumerated in the 1990 Clean Air Act. A tradable pollution permit program has been implemented to reduce emissions, creating property rights associated with various pollutants.

Before we discuss the performance of market-based solutions to LA's pollution woes, we consider the existence of cooperative solutions. In Chapter 2, we examine pollutant emissions as a transboundary public bad. We show that for a class of environments in which pollution moves in a bi-directional, acyclic manner, there exists a sustainable coalition structure and associated levels of emissions. We do so via a new core concept, one more appropriate to modeling cooperative emissions agreements (and potential defection from them) than the standard definitions.
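For reference, the standard core concept that the chapter's new definition adapts: in a coalitional game $(N, v)$, an allocation $x$ is in the core if no coalition can do better by defecting, i.e.

\[
\sum_{i \in S} x_i \ge v(S) \quad \text{for all } S \subseteq N, \qquad \sum_{i \in N} x_i = v(N).
\]

With transboundary pollution, what a defecting coalition can guarantee itself depends on the emissions of outsiders, which is why the standard definition needs modification before sustainable coalition structures can be established.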

However, this leaves the question of implementing pollution abatement programs unanswered. While the existence of a cost-effective permit market equilibrium has long been understood, the implementation of such programs has been difficult. The design of Los Angeles' REgional CLean Air Incentives Market (RECLAIM) alleviated some of the implementation problems but in part exacerbated others. For example, it created two overlapping cycles of permits and two zones of permits for different geographic regions. While these design features create a market that allows some measure of regulatory control, they establish a very difficult trading environment, with the potential for inefficiency arising from the transaction costs enumerated above and the illiquidity induced by the myriad assets and relatively few participants in this market.

It was with these concerns in mind that the ACE market (Automated Credit Exchange) was designed. The ACE market utilizes an iterated combined-value call market (CV Market). Before discussing the performance of the RECLAIM program in general and the ACE mechanism in particular, we test experimentally whether a portfolio trading mechanism can overcome market illiquidity. Chapter 3 demonstrates the ability of such a mechanism to overcome portfolio rebalancing problems, thereby inducing sufficient liquidity for markets to fully equilibrate.
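A minimal, hypothetical sketch of the combined-value idea: package bids are all-or-nothing bundles over the distinct permit types (e.g. zones or cycles), and the exchange selects the feasible subset of bids that maximizes reported surplus. The bid data and the zero-net-inventory clearing rule are illustrative simplifications, not the ACE mechanism's actual rules.

```python
import itertools
import numpy as np

# Each bid: (net quantities per permit type, limit payment offered).
# Positive quantities are buys, negative are sells; a negative payment
# means the bidder must be paid at least that amount to trade.
bids = [
    (np.array([+2, -1]), 5.0),   # swap across permit types (rebalance)
    (np.array([-2, 0]), -3.0),   # sell 2 units of type 0 for >= 3.0
    (np.array([0, +1]), 2.5),    # buy 1 unit of type 1 for <= 2.5
]

best_value, best_set = 0.0, ()
for k in range(1, len(bids) + 1):
    for subset in itertools.combinations(range(len(bids)), k):
        net = sum(bids[i][0] for i in subset)
        # Feasible iff buys and sells balance in every permit type
        # (the exchange holds no inventory).
        if np.all(net == 0):
            value = sum(bids[i][1] for i in subset)
            if value > best_value:
                best_value, best_set = value, subset

print(best_set, best_value)  # brute force suffices for a handful of bids
```

Because each bid trades a whole portfolio atomically, a facility rebalancing across zones or cycles never risks executing only one leg of its trade, which is precisely the illiquidity problem the portfolio mechanism addresses.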

With experimental evidence in hand, we consider the CV Market's performance in the real world. We find that as the allocation of permits declines toward the level of historical emissions, prices increase. As of April of this year, prices are roughly equal to the cost of the Best Available Control Technology (BACT). This took longer than expected, due both to tendencies to mis-report emissions under the old regime and to abatement technology advances encouraged by the program. We also find that the ACE market provides liquidity where needed, encouraging long-term planning on behalf of polluting facilities.

Relevance: 80.00%

Abstract:

This thesis is divided into two parts: interacting dark matter and fluctuations in cosmology. There is a tension between the properties dark matter is expected to possess in the early universe and those it appears to possess in the late universe. Weakly-interacting dark matter yields the observed dark matter relic density and is consistent with large-scale structure formation; however, there is strong astrophysical evidence that dark matter has large self-interactions. The first part of this thesis presents two models in which the nature of dark matter fundamentally changes as the universe evolves. In the first model, the dark matter mass and couplings depend on the value of a chameleonic scalar field that changes as the universe expands. In the second model, dark matter is charged under a hidden SU(N) gauge group and eventually undergoes confinement. These models introduce very different mechanisms to explain the separation between the physics relevant for freezeout and the physics relevant for small-scale dynamics.
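For context, the relic abundance referenced above is governed by the standard freezeout Boltzmann equation (textbook form, not specific to either of the thesis's models):

\[
\frac{dn}{dt} + 3 H n = -\langle \sigma v \rangle \left( n^{2} - n_{\mathrm{eq}}^{2} \right),
\]

where $n$ is the dark matter number density, $H$ the Hubble rate, and $\langle \sigma v \rangle$ the thermally averaged annihilation cross-section. Models in which the couplings change after freezeout can preserve the observed relic density while producing the large late-time self-interactions the astrophysical evidence favors.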

As the universe continues to evolve, it will asymptote to a de Sitter vacuum phase. Since there is a finite temperature associated with de Sitter space, the universe is typically treated as a thermal system, subject to rare thermal fluctuations, such as Boltzmann brains. The second part of this thesis begins by attempting to escape this unacceptable situation within the context of known physics: vacuum instability induced by the Higgs field. The vacuum decay rate competes with the production rate of Boltzmann brains, and the cosmological measures that have a sufficiently low occurrence of Boltzmann brains are given more credence. Upon further investigation, however, there are certain situations in which de Sitter space settles into a quiescent vacuum with no fluctuations. This reasoning not only provides an escape from the Boltzmann brain problem, but it also implies that vacuum states do not uptunnel to higher-energy vacua and that perturbations do not decohere during slow-roll inflation, suggesting that eternal inflation is much less common than often supposed. Instead, decoherence occurs during reheating, so this analysis does not alter the conventional understanding of the origin of density fluctuations from primordial inflation.