5 results for validation tests of PTO in CaltechTHESIS
Abstract:
In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.
We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles, or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise, or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
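The greedy step at the heart of an EC2-style procedure can be sketched in a few lines. This is an illustrative reconstruction, not the thesis's BROAD implementation: hypotheses are grouped into classes (one per theory), an edge connects each pair of hypotheses in different classes with weight equal to the product of their posterior weights, and each candidate test is scored by the edge weight expected to survive it (noiseless binary responses assumed; all names are hypothetical).

```python
import itertools

def edge_weight(weights, classes):
    # Total weight of edges joining hypotheses in different classes;
    # hypotheses with zero weight contribute nothing.
    idx = range(len(weights))
    return sum(weights[i] * weights[j]
               for i, j in itertools.combinations(idx, 2)
               if classes[i] != classes[j])

def select_test(tests, weights, classes, predict):
    """One greedy EC2-style step: pick the test whose expected
    post-response edge weight is smallest (largest expected cut).

    predict(h, t) gives hypothesis h's predicted response to test t.
    """
    total = sum(weights)

    def expected_remaining(t):
        rem = 0.0
        for y in (0, 1):  # binary choice tasks
            # Zero out hypotheses inconsistent with response y.
            w_y = [w if predict(h, t) == y else 0.0
                   for h, w in enumerate(weights)]
            p_y = sum(w_y) / total
            rem += p_y * edge_weight(w_y, classes)
        return rem

    return min(tests, key=expected_remaining)
```

A test that separates whole theory classes cuts more weighted edges than one that merely separates hypotheses within a class, which is why EC2 targets model discrimination rather than raw uncertainty reduction.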
We use BROAD to perform two experiments. First, we compare the main classes of theories of decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries they chose. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out both because it is infeasible in practice and because we find no signatures of it in our data.
In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present-bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
In these models the passage of time is treated as linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
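The temporal choice inconsistency at issue can be illustrated numerically. This is a minimal sketch with hypothetical payoffs and parameter values (not the thesis's estimates or its subjective-time construction): hyperbolic discounting reverses a smaller-sooner preference as both rewards recede into the future, while exponential discounting never does.

```python
def exp_discount(t, delta=0.5):
    """Exponential discounting: constant factor delta per period."""
    return delta ** t

def hyp_discount(t, k=1.0):
    """Hyperbolic discounting: 1 / (1 + k t)."""
    return 1.0 / (1.0 + k * t)

def prefers_sooner(discount, front_delay):
    """$55 at front_delay vs. $100 one period later (illustrative)."""
    return 55 * discount(front_delay) > 100 * discount(front_delay + 1)
```

Under the exponential rule the ratio of the two discounted values depends only on the one-period gap, so the choice is the same from any vantage point; under the hyperbolic rule the near-term decline in the discount function is steeper than the long-run decline, producing the reversal.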
We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, demand for it will be greater than its price elasticity alone explains. More importantly, when the item is no longer discounted, demand for its close substitute would increase disproportionately. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
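The mechanism behind the substitution prediction can be sketched with a minimal multinomial-logit model. The functional form, parameter values (λ = 2.25 is a commonly cited loss-aversion coefficient, not the estimate from this thesis), and reference-price treatment below are all illustrative assumptions, not the model actually estimated on the retailer's data.

```python
import math

def utility(price, ref_price, alpha=1.0, lam=2.25):
    """Reference-dependent utility: price paid, plus a gain/loss term
    relative to the reference price, with losses weighted by lam."""
    gap = ref_price - price                      # positive gap = perceived gain
    gain_loss = gap if gap >= 0 else lam * gap   # losses loom larger
    return -alpha * price + gain_loss

def choice_probs(prices, ref_prices, **kw):
    """Standard logit choice probabilities over the offered items."""
    u = [utility(p, r, **kw) for p, r in zip(prices, ref_prices)]
    m = max(u)                                   # subtract max for stability
    e = [math.exp(x - m) for x in u]
    s = sum(e)
    return [x / s for x in e]
```

When a discount ends, the item's current price sits above the reference price the discount established, so the loss term depresses its utility and shifts choice probability toward an otherwise identical substitute, exactly the excess substitution the abstract describes.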
In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.
Abstract:
The material included within this report is the result of a series of tests of concrete specimens taken during the construction of various buildings in the cities of Pasadena and Los Angeles over a period of eight months.
The object of the investigation is to determine the effect of the water ratio on the ultimate strength of the concrete, as obtained from data observed and recorded from specimens taken from actual building practice rather than from laboratory specimens made under ideal, or at least more nearly standard, conditions.
Abstract:
Several patients of P. J. Vogel who had undergone cerebral commissurotomy for the control of intractable epilepsy were tested on a variety of tasks to measure aspects of cerebral organization concerned with lateralization in hemispheric function. From tests involving identification of shapes it was inferred that in the absence of the neocortical commissures, the left hemisphere still has access to certain types of information from the ipsilateral field. The major hemisphere can still make crude differentiations between various left-field stimuli, but is unable to specify exact stimulus properties. Most of the time the major hemisphere, having access to some ipsilateral stimuli, dominated the minor hemisphere in control of the body.
Competition for control of the body between the hemispheres is seen most clearly in tests of minor hemisphere language competency, in which it was determined that though the minor hemisphere does possess some minimal ability to express language, the major hemisphere prevented its expression much of the time. The right hemisphere was superior to the left in tests of perceptual visualization, and the two hemispheres appeared to use different strategies in attempting to solve the problems, namely, analysis for the left hemisphere and synthesis for the right hemisphere.
Analysis of the patients' verbal and performance I.Q.'s, as well as observations made throughout testing, suggest that the corpus callosum plays a critical role in activities that involve functions in which the minor hemisphere normally excels, that the motor expression of these functions may normally come through the major hemisphere by way of the corpus callosum.
Lateral specialization is thought to be an evolutionary adaptation which overcame problems of a functional antagonism between the abilities normally associated with the two hemispheres. The tests of perception suggested that this function lateralized into the mute hemisphere because of an active counteraction by language. This latter idea was confirmed by the finding that left-handers, in whom there is likely to be bilateral language centers, are greatly deficient on tests of perception.
Abstract:
The distal half of the bacteriophage T4 tail fiber interacts with the surface of the bacterium during adsorption. The largest polypeptide in this half fiber is the product of gene 37 (P37). During assembly of the tail fiber, P37 interacts with the product of gene 38 (P38). These two gene products are incompatible with the corresponding gene products from the related phage T2. T2 P37 does not interact with T4 P38 and T2 P38 does not interact with T4 P37. Crosses between T2 and T4 phages mutant in genes 37 and 38 have shown that the carboxyl end of P37 interacts with P38 and with the bacterial surface. In the corresponding region of gene 37 and in gene 38 there is no recombination between T2 and T4. In the rest of gene 37 there are two small regions with relatively high recombination and a region of low recombination.
When T2/T4 heteroduplex DNA molecules are examined in the electron microscope, four nonhomologous loops appear in the region of genes 37 and 38. Heteroduplexes between hybrid phages which have part of gene 37 from T4 and part from T2 have roughly located gene 37 mutations in the heteroduplex pattern. For a more precise location of the mutations, a physical map of gene 37 was constructed by determining the molecular weights of amber polypeptide fragments on polyacrylamide gels in the presence of sodium dodecyl sulfate. When the physical and heteroduplex maps are aligned, the regions of low recombination correspond to regions of nonhomology between T2 and T4; regions with relatively high recombination are homologous.
The molecular weight of T2 P37 is about 13,000 greater than that of T4 P37. Analysis of hybrid phage has shown that this molecular weight difference is all at the carboxyl end of P37.
An antiserum has been prepared which is specific for the distal half fiber of T4. Tests of the ability of gene 37 hybrids to block this antiserum show that there are at least 4 subclasses of antigen specified by different parts of P37.
Observations in the electron microscope of the tail fiber-antibody complexes formed by the gene 37 hybrids and the specific antiserum have shown that P37 is oriented linearly in the distal half fiber, with its N-terminus near the joint between the two half fibers and its C-terminus near the tip of the fiber. These observations lead to a simple model for the structure of the distal half fiber.
The high recombination in T4 gene 34 was also investigated. A comparison of genetic and physical maps of gene 34 showed that there is a gradient of increasing recombination near one end of the gene.
Abstract:
The evoked response, a signal present in the electro-encephalogram when specific sense modalities are stimulated with brief sensory inputs, has not yet revealed as much about brain function as it apparently promised when first recorded in the late 1940s. One of the problems has been recording the responses at a large number of points on the surface of the head; thus, in order to achieve greater spatial resolution than previously attained, a 50-channel recording system was designed to monitor experiments with human visually evoked responses.
Conventional voltage versus time plots of the responses were found inadequate as a means of making qualitative studies of such a large data space. This problem was solved by creating a graphical display of the responses in the form of equipotential maps of the activity at successive instants during the complete response. In order to ascertain the necessary complexity of any models of the responses, factor analytic procedures were used to show that models characterized by only five or six independent parameters could adequately represent the variability in all recording channels.
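The idea that a 50-channel data space can be represented by five or six independent parameters can be illustrated with a singular-value sketch. This uses synthetic data and an SVD as a stand-in for the thesis's factor-analytic procedures; the channel count matches the recording system, but everything else is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
k_true = 5      # number of latent components driving the channels
n_chan, n_time = 50, 200

# Synthetic multi-channel "responses": 50 channels that are all linear
# mixtures of k_true latent waveforms, plus a little sensor noise.
latents = rng.standard_normal((k_true, n_time))
mixing = rng.standard_normal((n_chan, k_true))
data = mixing @ latents + 0.01 * rng.standard_normal((n_chan, n_time))

# Singular value spectrum of the mean-centered channels x time matrix:
# the cumulative variance explained by the top components reveals how
# many parameters a model of the responses actually needs.
s = np.linalg.svd(data - data.mean(axis=1, keepdims=True),
                  compute_uv=False)
var = s ** 2
explained = var.cumsum() / var.sum()
```

When nearly all variance is captured by the first five components, a five-parameter source model is a defensible description of the full 50-channel recording, which is the kind of argument the factor analysis supports.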
One type of equivalent source for the responses which meets these specifications is the electrostatic dipole. Two different dipole models were studied: the dipole in a homogeneous sphere and the dipole in a sphere comprised of two spherical shells (of different conductivities) concentric with and enclosing a homogeneous sphere of a third conductivity. These models were used to determine nonlinear least squares fits of dipole parameters to a given potential distribution on the surface of a spherical approximation to the head. Numerous tests of the procedures were conducted with problems having known solutions. After these theoretical studies demonstrated the applicability of the technique, the models were used to determine inverse solutions for the evoked response potentials at various times throughout the responses. It was found that reliable estimates of the location and strength of cortical activity were obtained, and that the two models differed only slightly in their inverse solutions. These techniques enabled information flow in the brain, as indicated by locations and strengths of active sites, to be followed throughout the evoked response.
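A heavily simplified sketch of such an inverse solution follows. The key structural fact it illustrates is that, for a fixed dipole location, the surface potential is linear in the dipole moment, so the moment can be solved by linear least squares while only the location needs to be searched. The forward model here is a dipole in an unbounded homogeneous medium, a deliberate simplification of the thesis's homogeneous-sphere and concentric-shell models, and all names, conductivity values, and candidate grids are illustrative.

```python
import numpy as np

def dipole_potential(electrodes, r0, p, sigma=0.33):
    """Potential of a current dipole with moment p at location r0,
    in an unbounded homogeneous medium of conductivity sigma
    (a simplification of the spherical head models)."""
    d = electrodes - r0
    dist3 = np.linalg.norm(d, axis=1) ** 3
    return (d @ p) / (4 * np.pi * sigma * dist3)

def fit_dipole(electrodes, v, candidates, sigma=0.33):
    """Grid-search candidate locations; at each one the best moment
    follows from linear least squares, since V is linear in p."""
    best = None
    for r0 in candidates:
        d = electrodes - r0
        A = d / (4 * np.pi * sigma *
                 (np.linalg.norm(d, axis=1) ** 3)[:, None])
        p, *_ = np.linalg.lstsq(A, v, rcond=None)
        err = np.linalg.norm(A @ p - v)
        if best is None or err < best[0]:
            best = (err, r0, p)
    return best[1], best[2]
```

A full implementation would replace the grid search with nonlinear least squares over the location, as the thesis does, and swap in the shell-model forward solution, but the linear-in-moment structure carries over unchanged.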