923 results for Field theories in lower dimensions
Abstract:
The dilaton action in 3 + 1 dimensions plays a crucial role in the proof of the a-theorem. This action arises from Wess-Zumino consistency conditions and relies crucially on the existence of the trace anomaly. Since there are no anomalies in odd dimensions, it is interesting to ask how such an action could arise otherwise. Motivated by this, we use the AdS/CFT correspondence to examine both even- and odd-dimensional conformal field theories. We find that in even dimensions, by promoting the cutoff to a field, one can obtain an action for this field which coincides with the Wess-Zumino action in flat space. In three dimensions, we observe that by finding an exact Hamilton-Jacobi counterterm, one can construct a non-polynomial action which is invariant under global Weyl rescalings. We comment on how this finding is tied to the F-theorem conjectures.
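For orientation, the flat-space Wess-Zumino dilaton term referred to here is, schematically (overall normalization and sign conventions vary between references),

S_{\mathrm{WZ}} \simeq 2\,\Delta a \int d^4x \left[\, 2\,(\partial\tau)^2\,\Box\tau - (\partial\tau)^4 \,\right], \qquad \Delta a = a_{\mathrm{UV}} - a_{\mathrm{IR}},

where τ is the dilaton; it is this term whose coefficient is fixed by anomaly matching in the proof of the a-theorem.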
Abstract:
In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.
We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests: Bayesian Rapid Optimal Adaptive Designs (BROAD), which sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn determine the next test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to establish theoretical guarantees relative to the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
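A minimal sketch of such an adaptive loop is given below (this is not the thesis code; the hypothesis names, the prediction table, and the simplified EC2-style objective are illustrative assumptions). At each step the greedy rule picks the test expected to cut the most "edge weight" between hypotheses in different equivalence classes, the response is observed, and the posterior is updated by Bayes' rule.

```python
# Sketch of a BROAD-style adaptive testing loop (illustrative, not the authors' code).
# Each hypothesis h predicts, for every candidate test t (a binary choice between
# two lotteries), the probability that the subject picks option A.

import itertools
import random

def posterior_update(prior, likelihoods, response):
    """Bayes update of the posterior over hypotheses given a binary response."""
    post = {}
    for h, p in prior.items():
        like = likelihoods[h] if response == "A" else 1.0 - likelihoods[h]
        post[h] = p * like
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

def edge_weight(posterior, theory_of):
    """EC2-style objective: total weight of edges joining hypotheses
    that belong to different theory classes."""
    return sum(posterior[h1] * posterior[h2]
               for h1, h2 in itertools.combinations(posterior, 2)
               if theory_of[h1] != theory_of[h2])

def expected_edge_weight_after(test, posterior, predict, theory_of):
    """Expected residual edge weight after running `test`, averaged over responses."""
    total = 0.0
    for response in ("A", "B"):
        likelihoods = {h: predict(h, test) for h in posterior}
        p_resp = sum(posterior[h] * (likelihoods[h] if response == "A"
                                     else 1.0 - likelihoods[h])
                     for h in posterior)
        if p_resp == 0.0:
            continue
        total += p_resp * edge_weight(
            posterior_update(posterior, likelihoods, response), theory_of)
    return total

def choose_next_test(tests, posterior, predict, theory_of):
    """Greedy step: the test that minimizes the expected residual edge weight
    (equivalently, maximizes the expected edge weight cut)."""
    return min(tests, key=lambda t: expected_edge_weight_after(
        t, posterior, predict, theory_of))

if __name__ == "__main__":
    # Toy usage with made-up hypotheses, tests, and predictions.
    hypotheses = ["EV_riskneutral", "CRRA_r05", "PT_lossaverse"]
    theory_of = {"EV_riskneutral": "EV", "CRRA_r05": "CRRA", "PT_lossaverse": "PT"}
    tests = list(range(10))
    random.seed(0)
    table = {(h, t): random.uniform(0.05, 0.95) for h in hypotheses for t in tests}
    predict = lambda h, t: table[(h, t)]

    posterior = {h: 1.0 / len(hypotheses) for h in hypotheses}
    for _ in range(5):
        t = choose_next_test(tests, posterior, predict, theory_of)
        response = "A" if random.random() < predict("PT_lossaverse", t) else "B"
        posterior = posterior_update(
            posterior, {h: predict(h, t) for h in hypotheses}, response)
        tests.remove(t)
    print(posterior)
```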
We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and are sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries they chose. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out because it is infeasible in practice and because we find no signatures of it in our data.
In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present-bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
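For reference, these model classes are usually associated with the following discount functions (parameter names and normalizations vary across the literature; the forms below are the standard textbook ones, not necessarily the exact parameterizations estimated in the thesis):

D_{\mathrm{exp}}(t) = \delta^{t}, \qquad D_{\mathrm{hyp}}(t) = \frac{1}{1 + k t}, \qquad D_{\mathrm{qh}}(t) = \begin{cases} 1 & t = 0 \\ \beta\,\delta^{t} & t > 0 \end{cases}, \qquad D_{\mathrm{gh}}(t) = (1 + \alpha t)^{-\beta/\alpha}.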
In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinctly different from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than can be explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
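A minimal sketch of how loss aversion can enter a logit discrete-choice demand model is shown below; the functional form, parameter values, and reference-price rule are illustrative assumptions, not the specification estimated in the thesis.

```python
# Illustrative logit discrete-choice model with a loss-averse
# (reference-dependent) price term.

import math

def utility(price, ref_price, quality, beta=1.0, gamma=0.8, lam=2.25):
    """Deterministic utility with loss aversion around a reference price:
    paying more than the reference is penalized lam times more strongly
    than the ordinary price disutility."""
    loss = max(price - ref_price, 0.0)   # only price increases relative to the reference count as losses
    return beta * quality - gamma * price - (lam - 1.0) * gamma * loss

def choice_probabilities(options, ref_prices):
    """Multinomial logit probabilities over a set of substitute items."""
    utils = [utility(p, r, q) for (p, q), r in zip(options, ref_prices)]
    m = max(utils)
    expu = [math.exp(u - m) for u in utils]
    z = sum(expu)
    return [e / z for e in expu]

# Toy example: item A returns to full price after a discount (so its reference
# price is anchored at the discounted level); item B is a close substitute.
options = [(10.0, 5.0), (9.5, 4.6)]      # (price, quality index) for items A and B
print(choice_probabilities(options, ref_prices=[8.0, 9.5]))
```

With a loss-averse term, demand shifts toward the substitute when the formerly discounted item reverts to full price, which is the qualitative pattern the field test looks for.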
In future work, BROAD can be applied widely for testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.
Abstract:
The present thesis deals with the study of certain aspects of pathological higher field theories. It brings to light some new abnormalities and new examples of abnormal theories, and also puts forward a novel approach to the construction of trouble-free theories.
Abstract:
We study the properties of the lower bound on the exchange-correlation energy in two dimensions. First we review the derivation of the bound and show how it can be written in a simple density-functional form. This form allows an explicit determination of the prefactor of the bound and a test of its tightness. Next we focus on finite two-dimensional systems and examine how their distance from the bound depends on the system geometry. The results for the high-density limit suggest that a finite system that comes as close as possible to the ultimate bound on the exchange-correlation energy has circular geometry and a weak confining potential with a negative curvature.
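The density-functional form in question is the standard two-dimensional Lieb-Oxford-type bound (quoted here for orientation rather than from the paper itself; in Hartree atomic units, with C a dimensionless prefactor):

E_{xc}[n] \ge -C \int d^2r\, n(\mathbf{r})^{3/2}.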
Abstract:
Using the functional integral formalism for the statistical generating functional in statistical (finite-temperature) quantum field theory, we prove the equivalence of many-photon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. As an illustration, we calculate the one-loop polarization operators in both theories and demonstrate their coincidence.
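For context, the statistical generating functional referred to here is the standard imaginary-time (Matsubara) one; schematically, for a bosonic field φ at temperature T = 1/β,

Z[J] \propto \int_{\phi(0,\mathbf{x}) = \phi(\beta,\mathbf{x})} \mathcal{D}\phi\, \exp\!\left( -S_E[\phi] + \int_0^{\beta} d\tau \int d^3x\, J\,\phi \right),

with periodic boundary conditions in Euclidean time; the many-particle Green's functions compared in the two theories follow from functional derivatives with respect to J.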
Abstract:
The addition of a topological Chern-Simons term to three-dimensional higher-derivative gravity is not a good therapy for curing the nonunitarity of the latter theory. Moreover, R + R² gravity in (2+1)D, which is unitary at the tree level, becomes tree-level nonunitary when it is augmented by this topological term. Therefore, contrary to what is claimed in the literature, topological higher-derivative gravity in (2+1)D is not tree-level unitary, and neither is topological three-dimensional R + R² gravity.
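Schematically, the class of actions at issue combines (2+1)-dimensional higher-derivative gravity with the gravitational Chern-Simons term; up to normalization and sign conventions,

S = \int d^3x \sqrt{-g}\,\left( \sigma R + \alpha R^2 + \beta R_{\mu\nu}R^{\mu\nu} \right) + \frac{1}{\mu} S_{CS}, \qquad S_{CS} = \frac{1}{2}\int d^3x\, \varepsilon^{\lambda\mu\nu}\,\Gamma^{\rho}{}_{\lambda\sigma}\left( \partial_\mu \Gamma^{\sigma}{}_{\rho\nu} + \tfrac{2}{3}\,\Gamma^{\sigma}{}_{\mu\tau}\Gamma^{\tau}{}_{\nu\rho} \right),

where "R + R² gravity" corresponds to keeping only the σR and αR² terms.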
Abstract:
The zero-curvature representation for two-dimensional integrable models is generalized to spacetimes of dimension d + 1 by the introduction of a d-form connection. The new generalized zero-curvature conditions can be used to represent the equations of motion of some relativistic invariant field theories of physical interest in 2 + 1 dimensions (BF theories, Chern-Simons, 2 + 1 gravity and the CP1 model) and 3 + 1 dimensions (self-dual Yang-Mills theory and the Bogomolny equations). Our approach leads to new methods of constructing conserved currents and solutions. In a submodel of the 2 + 1-dimensional CP1 model, we explicitly construct an infinite number of previously unknown non-trivial conserved currents. For each positive integer spin-j representation of sl(2), we construct 2j + 1 conserved currents, leading to 2j + 1 Lorentz scalar charges.
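Schematically (in the usual formulation of this construction; the precise conventions may differ from the paper's), the generalized zero-curvature conditions involve a flat connection A_\mu and the Hodge dual \tilde{B}^{\mu} of the d-form connection B:

\partial_\mu A_\nu - \partial_\nu A_\mu + [A_\mu, A_\nu] = 0, \qquad \partial_\mu \tilde{B}^{\mu} + [A_\mu, \tilde{B}^{\mu}] = 0,

and conserved currents are obtained by conjugating \tilde{B}^{\mu} with the holonomy W of A_\mu, i.e. J^{\mu} = W^{-1}\tilde{B}^{\mu} W, whose conservation follows directly from the two conditions above.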
Abstract:
We prove the equivalence of many-gluon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. The proof is based on the functional integral formulation of the statistical generating functional in finite-temperature quantum field theory. As an illustration, we calculate the one-loop polarization operators in both theories and show that their expressions indeed coincide.
Abstract:
We use ideas on integrability in higher dimensions to define Lorentz invariant field theories with an infinite number of local conserved currents. The models considered have a two-dimensional target space. Requiring the existence of a Lagrangian and the stability of static solutions singles out a class of models which have an additional conformal symmetry. That symmetry is used to explain the existence of an ansatz leading to solutions with non-trivial Hopf charges.