Abstract:
This thesis describes simple extensions of the standard model with new sources of baryon number violation but no proton decay. The motivation for constructing such theories comes from the failure of the standard model to explain the generation of the baryon asymmetry of the universe, and from the absence of experimental evidence for proton decay. However, the lack of any direct evidence for baryon number violation in general puts strong bounds on the naturalness of some of those models and favors theories with baryon number violation suppressed below the TeV scale. The initial part of the thesis concentrates on models containing new scalars responsible for baryon number breaking. A model with new color-sextet scalars is analyzed in more detail. Apart from generating the cosmological baryon number, it yields nontrivial predictions for neutron-antineutron oscillations, the electric dipole moment of the neutron, and neutral meson mixing. The second model discussed in the thesis contains a new scalar leptoquark. Although this model predicts mainly lepton flavor violation and a nonzero electric dipole moment of the electron, it includes, in its original form, baryon-number-violating nonrenormalizable dimension-five operators that trigger proton decay. Imposing an appropriate discrete symmetry forbids such operators. Finally, a supersymmetric model with gauged baryon and lepton numbers is proposed. It provides a natural explanation for proton stability and predicts lepton-number-violating processes below the supersymmetry breaking scale, which can be tested at the Large Hadron Collider. The dark matter candidate in this model carries baryon number and can also be searched for in direct detection experiments. The thesis concludes with the construction and brief discussion of a minimal extension of the standard model with gauged baryon, lepton, and flavor symmetries.
Abstract:
In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining greater precision and descriptive power. Increased psychological realism, however, comes at the cost of more parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in different contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.
We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles, or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), which sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories; the updated posterior informs the next most informative test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which yields orders-of-magnitude speedups over other methods.
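For concreteness, here is a minimal sketch of the kind of adaptive greedy loop described above, using a soft (noise-tolerant) version of the EC2 edge-cutting score. The toy likelihood table, the parameter values, and the exact scoring rule are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_theories, n_tests, n_outcomes = 4, 30, 2
prior = np.full(n_theories, 1.0 / n_theories)
# lik[h, t, y] = P(outcome y | theory h, test t); random toy values here.
lik = rng.dirichlet(np.ones(n_outcomes), size=(n_theories, n_tests))

def edge_weight(w):
    """EC2-style objective: total weight of edges between competing
    theories, with edge weight w[h] * w[h'] for each pair h != h'."""
    return 0.5 * (w.sum() ** 2 - (w ** 2).sum())

def expected_weight_after(w, t):
    """Residual edge weight expected after running test t: each outcome
    reweights theories by its likelihood (soft handling of noise)."""
    return sum(edge_weight(w * lik[:, t, y]) for y in range(n_outcomes))

true_theory = 2                      # ground truth for the simulation
w = prior.copy()                     # unnormalized posterior weights
for step in range(10):
    # Greedy step: pick the test cutting the most edge weight in expectation.
    t = min(range(n_tests), key=lambda t: expected_weight_after(w, t))
    y = rng.choice(n_outcomes, p=lik[true_theory, t])  # simulated response
    w = w * lik[:, t, y]             # Bayes update (unnormalized)
    posterior = w / w.sum()
    print(f"step {step}: test {t}, outcome {y}, posterior {np.round(posterior, 3)}")
```

The accelerated variant mentioned above would additionally cache test scores and re-evaluate them lazily, a shortcut that adaptive submodularity justifies.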
We use BROAD to perform two experiments. First, we compare the main classes of theories of decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility of strategic manipulation: subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and since we find no signatures of it in our data.
In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
In these models the passage of time is linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
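For orientation, the standard discount functions being compared take the following textbook forms (common parameter names, which need not match the thesis's notation):

\[
D_{\text{exp}}(t) = e^{-\rho t}, \qquad
D_{\text{hyp}}(t) = \frac{1}{1 + k t}, \qquad
D_{\text{quasi-hyp}}(t) = \begin{cases} 1, & t = 0,\\ \beta\,\delta^{t}, & t \ge 1. \end{cases}
\]

Only the exponential form is dynamically consistent: the ratio \(D(t+s)/D(t)\) does not depend on \(t\), so preferences between two dated rewards never reverse as they draw nearer. The hyperbolic and quasi-hyperbolic forms violate this, producing exactly the temporal choice inconsistency discussed above.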
We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in a distinctly different way than the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, demand for it will be greater than its price elasticity alone would explain. Even more importantly, when the item is no longer discounted, demand for its close substitute should increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
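One standard way to embed loss aversion in a discrete choice (logit) model is sketched below; the parameter names and the exact reference-price specification are ours for illustration, not necessarily the thesis's specification:

\[
u_{ij} = \beta^{\top} x_{j} - \alpha\, p_{j}
  + \eta\,[\,r_{j} - p_{j}\,]_{+} - \lambda \eta\,[\,p_{j} - r_{j}\,]_{+},
\qquad
P(i \text{ chooses } j) = \frac{e^{u_{ij}}}{\sum_{j'} e^{u_{ij'}}},
\]

where \(p_j\) is the price, \(r_j\) a reference price (e.g., a recent average), \([z]_+ = \max(z, 0)\), and \(\lambda > 1\) measures loss aversion: a price below reference adds utility at rate \(\eta\), while a price above reference subtracts it at the amplified rate \(\lambda\eta\), producing the asymmetric demand response around discounts described above.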
In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, could be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.
Abstract:
The topological phases of matter have been a major part of condensed matter physics research since the discovery of the quantum Hall effect in the 1980s. Recently, much of this research has focused on the study of systems of free fermions, such as the integer quantum Hall effect, quantum spin Hall effect, and topological insulator. Though these free fermion systems can play host to a variety of interesting phenomena, the physics of interacting topological phases is even richer. Unfortunately, there is a shortage of theoretical tools that can be used to approach interacting problems. In this thesis I will discuss progress in using two different numerical techniques to study topological phases.
Recently much research in topological phases has focused on phases made up of bosons. Unlike fermions, free bosons form a condensate and so interactions are vital if the bosons are to realize a topological phase. Since these phases are difficult to study, much of our understanding comes from exactly solvable models, such as Kitaev's toric code, as well as Levin-Wen and Walker-Wang models. We may want to study systems for which such exactly solvable models are not available. In this thesis I present a series of models which are not solvable exactly, but which can be studied in sign-free Monte Carlo simulations. The models work by binding charges to point topological defects. They can be used to realize bosonic interacting versions of the quantum Hall effect in 2D and topological insulator in 3D. Effective field theories of "integer" (non-fractionalized) versions of these phases were available in the literature, but our models also allow for the construction of fractional phases. We can measure a number of properties of the bulk and surface of these phases.
Few interacting topological phases have been realized experimentally, but there is one very important exception: the fractional quantum Hall effect (FQHE). Though the fractional quantum Hall effect was discovered over 30 years ago, it can still produce novel phenomena. Of much recent interest is the existence of non-Abelian anyons in FQHE systems. Though it is possible to construct wave functions that realize such particles, whether these wave functions are the ground state is a difficult quantitative question that must be answered numerically. In this thesis I describe progress in using a density-matrix renormalization group algorithm to study a bilayer system thought to host non-Abelian anyons. We find phase diagrams in terms of experimentally relevant parameters, and also find evidence for a non-Abelian phase known as the "interlayer Pfaffian".
Abstract:
In this paper the photorefractive sensitivity defined for single-centre holographic recording is modified to accommodate two-centre holographic recording. Based on the time-dependent analytic solution of the Kukhtarev equations for doubly doped crystals, an analytical expression for the photorefractive sensitivity is given. For comparison with single-centre holographic recording, and to capture the electron competition between the deeper and shallower traps, an effective electron transport length is proposed, which varies with the intensity ratio of the recording light to the sensitizing light. According to the analyses in this paper, the lower photorefractive sensitivity in two-centre holographic recording is mainly due to the lower concentration of unionized dopants in the shallower centre and the shorter effective electron transport length.
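As orientation, the single-centre band-transport (Kukhtarev) model that the two-centre treatment extends reads, in its standard textbook form (the notation here is ours, not necessarily the paper's):

\[
\frac{\partial N_D^{+}}{\partial t} = (sI + \beta)\,(N_D - N_D^{+}) - \gamma_R\, n\, N_D^{+},
\qquad
\frac{\partial n}{\partial t} = \frac{\partial N_D^{+}}{\partial t} + \frac{1}{e}\,\frac{\partial J}{\partial x},
\]
\[
J = e\mu n E + \mu k_B T\, \frac{\partial n}{\partial x} + \kappa\, s I\,(N_D - N_D^{+}),
\qquad
\varepsilon \varepsilon_0\, \frac{\partial E}{\partial x} = e\,(N_D^{+} - N_A - n),
\]

where \(N_D\) and \(N_D^{+}\) are the total and ionized donor densities, \(n\) the electron density, \(s\) the photoionization cross-section, \(\beta\) the thermal generation rate, \(\gamma_R\) the recombination constant, and \(\kappa\) the photovoltaic coefficient. In the two-centre case (e.g., LiNbO₃ doped with both a deep and a shallow trap species), an analogous rate equation is written for the shallower centre, and the two levels compete for the photoexcited electrons.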
Abstract:
Theoretical and experimental studies were conducted to investigate wave-induced oscillations in an arbitrary shaped harbor of constant depth connected to the open sea.
A theory termed the “arbitrary shaped harbor” theory is developed. The solution of the Helmholtz equation, ∇²f + k²f = 0, is formulated as an integral equation; an approximate method is employed to solve the integral equation by converting it to a matrix equation. The final solution is obtained by equating, at the harbor entrance, the wave amplitude and its normal derivative obtained from the solutions for the regions outside and inside the harbor.
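As a schematic illustration of the integral-equation-to-matrix-equation step (not the thesis's actual harbor formulation, which matches interior and exterior solutions at the entrance), the sketch below discretizes a single-layer potential for the 2D Helmholtz equation on a closed boundary by simple collocation; the geometry, wavenumber, and crude self-term regularization are all illustrative assumptions:

```python
import numpy as np
from scipy.special import hankel1

k = 2.0                                  # wavenumber
N = 200                                  # number of boundary collocation points
theta = 2 * np.pi * (np.arange(N) + 0.5) / N
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # unit-circle boundary
ds = 2 * np.pi / N                       # arc-length element

def G(r):
    """2D free-space Helmholtz Green's function, (i/4) H0^(1)(k r)."""
    return 0.25j * hankel1(0, k * r)

# Discretize the boundary integral operator into an N x N matrix.
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(r, ds / (2 * np.e))     # crude regularization of the self-term
A = G(r) * ds

# Enforce u_inc + u_scat = 0 on the boundary (sound-soft condition)
# and solve the resulting matrix equation for the source density.
u_inc = np.exp(1j * k * pts[:, 0])       # incident plane wave on the boundary
sigma = np.linalg.solve(A, -u_inc)

# The scattered field anywhere follows by summing sigma against G.
x = np.array([3.0, 0.0])
u_scat = np.sum(G(np.linalg.norm(x - pts, axis=1)) * sigma) * ds
print(abs(u_inc[0]), abs(u_scat))
```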
Two special theories called the circular harbor theory and the rectangular harbor theory are also developed. The coordinates inside a circular and a rectangular harbor are separable; therefore, the solution for the region inside these harbors is obtained by the method of separation of variables. For the solution in the open-sea region, the same method is used as that employed for the arbitrary shaped harbor theory. The final solution is also obtained by a matching procedure similar to that used for the arbitrary shaped harbor theory. These two special theories provide a useful analytical check on the arbitrary shaped harbor theory.
Experiments were conducted to verify the theories in a wave basin 15 ft wide by 31 ft long with an effective system of wave energy dissipators mounted along the boundary to simulate the open-sea condition.
Four harbors were investigated theoretically and experimentally: circular harbors with a 10° opening and a 60° opening, a rectangular harbor, and a model of the East and West Basins of Long Beach Harbor located in Long Beach, California.
Theoretical solutions for these four harbors were obtained using the arbitrary shaped harbor theory. In addition, theoretical solutions for the circular harbors and the rectangular harbor were obtained using the two special theories. In each case, the theories agree well with the experimental data.
It is found that: (1) the resonant frequencies for a specific harbor are predicted correctly by the theory, although the amplification factors at resonance are somewhat larger than those found experimentally; (2) for the circular harbors, as the width of the harbor entrance increases, the amplification at resonance decreases, but the wave number bandwidth at resonance increases; (3) each peak in the curve of entrance velocity vs. incident wave period corresponds to a distinct mode of resonant oscillation inside the harbor, thus the velocity at the harbor entrance appears to be a good indicator for resonance in harbors of complicated shape; (4) the present theory can be applied with confidence to prototype harbors with relatively uniform depth and reflective interior boundaries.
Abstract:
The phase contrast across the crystal thickness induced by the internal field is measured by digital holographic interferometry just after a congruent lithium niobate crystal is partially poled. The direction of the applied external field is antiparallel to that of the internal field, and the measured phase contrast varies linearly with the applied external field. A new internal field, named the effective internal field, is obtained by this method. A distinct discrepancy between the effective and equivalent internal fields is observed. The authors attribute this effect to a new macroscopic representation of the elastic dipole components of defect complexes in the crystal. (c) 2007 American Institute of Physics.
Abstract:
This study analyzes the problems involved in realizing the right to health in Brazil, the allocative conflicts underlying the subject, and the role of representative institutions, the Judiciary, and civil society in this process. It seeks to reaffirm the importance of protecting the right to health while, at the same time, criticizing a certain doctrinal and jurisprudential euphoria that has taken hold in recent years and come to regard the Judiciary as the last guardian of constitutional promises left unfulfilled by the representative branches. The study examines the constitutional experiences of countries that do not embrace the dogma of judicial supremacy and that attempt to reconcile judicial review with more democratic mechanisms. On that basis, it proposes the theoretical framework of institutional dialogue theories as a less unilateral alternative for facing the challenges arising in the health field. The study emphasizes the importance of concern for the effectiveness of the constitution, but proposes a reflection on what the best alternative for achieving it would be, reaching the counterintuitive conclusion that perhaps the most effective path involves weak-form judicial review, one that does not dismiss the potential of the Law, but places greater trust in democracy and in the synergistic interaction between the representative branches and civil society.
Abstract:
The general superresolution theories for uniform-amplitude beams and intercepted Gaussian beams are investigated. For these two types of incident beam, both two-zone amplitude and pure-phase filters are adopted to provide specific numerical descriptions of the differences in their superresolution performance. Simulation results comparing their performance indicate that, for the same spot-size ratio, the intercepted Gaussian beam achieves a higher central image brightness ratio and a significantly lower side-lobe effect, irrespective of the filter used. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
This research is set within a context of extensive debate about the quality of education in Brazil, motivated mainly by the country's poor results in international assessments. These constant debates have even given rise to movements that, blaming the PCNs for the widespread failure in literacy instruction, call for a return to traditional teaching. Indeed, despite the immense contributions of the studies underpinning the PCNs, their institutionalization alone did not guarantee effective change in education. In the field of literacy in particular, misguided interpretations led, for example, to a process of "de-methodization" of teaching, characterized in recent years by the complete exclusion of methodological questions from the agenda of discussion. This phenomenon, which Soares (2004) calls the "disinvention" of literacy, and which other researchers have also addressed (cf. MORAIS, 2006; FRADE, 2003; CARVALHO, 2007), naturally ended up being reflected in the new textbooks evaluated and recommended by the MEC. The problem is that the new textbooks do not seem to be meeting teachers' needs. Recent studies have revealed both teachers' dissatisfaction with these materials and the persistence of teaching practices advocated by traditional methods (cf. BRITO et al., 2007; SILVA, 2008; MORAIS & ALBUQUERQUE, 2008). Considering that: (a) it is now recognized that the processes of alfabetização (learning the writing system) and letramento (literacy practice) are complementary and inseparable (SOARES, 2004); (b) in the Brazilian context textbooks are still central resources for classroom work; (c) the books are evaluated against strict criteria aligned with the most recent theories; and (d) the free distribution of these materials demands a large government investment, the aim of this study was to critically analyze one of the PNLD/2010 literacy textbooks (L.E.R., Leitura, escrita e reflexão, 1st year, FTD), in an attempt to uncover clues about the possible reasons for teachers' non-adoption of the new books. To this end, a critical documentary analysis with a qualitative approach was carried out, examining the following aspects of the book: the space devoted to teaching the alphabetic writing system; the articulation of that work with literacy practices; the coherence between the declared pedagogical orientation and the proposed activities; and the clarity and objectivity of the instructions and suggestions provided to the teacher. Among the most relevant findings, the analysis shows that the book still devotes very little space to activities teaching the writing system and does not articulate those activities satisfactorily with the ones aimed at literacy practices, corroborating the findings of the other studies mentioned above. These results may indicate that the evaluation criteria for these books need to be revised so as to balance the goals of alfabetização and letramento with the needs of teaching practice. To deepen this study, I believe its data could later be complemented by teachers' own analyses of the book studied, or even by research on its actual use in the classroom.
Abstract:
This study addresses special collective adverse possession (usucapião especial coletivo), which stands as one of the legal instruments chosen by the legislature to promote the realization of constitutional values, especially the social function of property. The institute is regulated by arts. 10 to 14 of the City Statute (Estatuto da Cidade) and applies to urban areas larger than two hundred and fifty square meters, provided they are occupied by low-income populations for housing, with possession qualified under the requirements of art. 183 of the 1988 Federal Constitution, and where it is not possible to identify the plot occupied by each possessor. It therefore carries a twofold task: not only to regularize land tenure, but also to enable the urbanization of areas occupied by low-income populations. Accordingly, emphasis is placed on possession as a factual, existential situation of holding and occupying a thing, endowed with an autonomous nature, since through it a person can meet vital needs such as housing and cultivation; hence one speaks of a qualified possession, that is, possession-through-work (posse-trabalho). It should be stressed, however, that even for the State to act to promote effective land regularization through special collective adverse possession, it is imperative to recognize those who will benefit from its action as holders of rights, that is, as members of equal standing in the political community. The subject is therefore approached through the lens of theories of recognition, specifically the approaches adopted by Axel Honneth and Nancy Fraser. These theories form the connecting thread of the thesis's chapters, and through them the study seeks to overcome the existence of different social classes and statuses, as well as to reshape the paradigms that culminated in this situation, as a way of realizing and promoting the right to housing.