846 results for Efficient market theory
Abstract:
The consumer is the vulnerable agent in the international consumption relationship. For the consumer, globalization presents itself as a globalization of consumption. The globalization of consumption is characterized by the international trade and supply of products and services by transnational/global businesses/suppliers, using world-renowned brands accessible to every consumer on the planet, and it aggravates the consumer's vulnerability in the market. Legal protection of the international consumer is a need that national legal systems have proven unable to provide adequately, as has International Law. This thesis demonstrates the inadequacy of legal science in protecting the consumer in the context of globalization; it shows how international trade itself is harmed when consumer protection is not given absolute and effective priority within the WTO, and when trade remains indifferent to the differing levels of protection afforded to consumers by each national legal system; it further shows how uniform, global consumer protection through a body of law common to all States is possible and can make the globalization of consumption economically more efficient by encouraging more intense consumer participation in the international market; and it proposes the construction of a new branch of law devoted to the problem, International Consumer Law (DIC), through the elaboration of a Theory of International Consumer Law. International Consumer Law is intended to be a common, universal body of consumer protection law, founded on universal legal methods, concepts, institutions, norms, and principles. The DIC will engage in dialogue with other branches of public and private law, especially International Economic Law, International Trade Law, Private International Law, International Civil Procedure, and Consumer Law. The aim is to serve the ideal of promoting free international trade with respect for Human Rights.
Abstract:
Time, risk, and attention are all integral to economic decision making. The aim of this work is to understand those key components of decision making using a variety of approaches: providing axiomatic characterizations to investigate time discounting, generating measures of visual attention to infer consumers' intentions, and examining data from unique field settings.
Chapter 2, co-authored with Federico Echenique and Kota Saito, presents the first revealed-preference characterizations of the exponentially discounted utility model and its generalizations. My characterizations provide non-parametric revealed-preference tests. I apply the tests to data from a recent experiment and find that the axiomatization delivers new insights on a dataset that had previously been analyzed by traditional parametric methods.
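To illustrate the flavor of such a non-parametric test (a minimal sketch, not the chapter's axiomatization; the data and the linear-utility assumption are hypothetical), one can ask whether a set of binary choices over dated payments is consistent with exponential discounting for some discount factor:

```python
# A minimal sketch: check whether binary choices over dated payments
# (amount, delay) are consistent with exponential discounting
# D(x, t) = x * delta**t for SOME delta in (0, 1], assuming linear
# utility. The choice data below are hypothetical.
import numpy as np

choices = [  # (chosen_amount, chosen_delay, rejected_amount, rejected_delay)
    (100, 0, 110, 1),   # took $100 now over $110 in one period
    (150, 2, 120, 0),   # took $150 in two periods over $120 now
]

deltas = np.linspace(0.01, 1.0, 1000)
feasible = np.ones_like(deltas, dtype=bool)
for xc, tc, xr, tr in choices:
    feasible &= xc * deltas**tc >= xr * deltas**tr

if feasible.any():
    lo, hi = deltas[feasible][0], deltas[feasible][-1]
    print(f"consistent with exponential discounting for delta in [{lo:.3f}, {hi:.3f}]")
else:
    print("no single delta rationalizes these choices")
```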
Chapter 3, co-authored with Min Jeong Kang and Colin Camerer, investigates whether "pre-choice" measures of visual attention improve prediction of consumers' purchase intentions. We measure participants' visual attention using eyetracking or mousetracking while they make hypothetical as well as real purchase decisions. I find that different patterns of visual attention are associated with hypothetical and real decisions. I then demonstrate that including information on visual attention improves prediction of purchase decisions when attention is measured with mousetracking.
Chapter 4 investigates individuals' attitudes towards risk in a high-stakes environment using data from the TV game show Jeopardy!. I first quantify players' subjective beliefs about answering questions correctly. Using those beliefs in estimation, I find that the representative player is risk averse. I then find that trailing players tend to wager more than the "folk" strategies known among the community of contestants and fans would prescribe, and that this tendency is related to their confidence. I also find gender differences: male players take more risk than female players, and even more so when they are competing against two other male players.
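For context, the best-known "folk" wagers can be written down directly; the sketch below is illustrative only (hypothetical scores, and not the chapter's econometric analysis):

```python
# A minimal sketch of widely known "folk" Final Jeopardy! wagers.
def leader_cover_bet(leader: int, second: int) -> int:
    """Leader wagers just enough to stay ahead even if the
    second-place player doubles their score."""
    return max(0, 2 * second - leader + 1)

def trailer_keepout_bet(leader: int, second: int) -> int:
    """One folk response for second place: wager just enough to pass
    the leader's post-miss score if the leader makes the cover bet
    and answers incorrectly."""
    leader_after_miss = leader - leader_cover_bet(leader, second)
    return max(0, leader_after_miss - second + 1)

print(leader_cover_bet(20000, 12000))    # 4001
print(trailer_keepout_bet(20000, 12000)) # 4000
```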
Chapter 5, co-authored with Colin Camerer, investigates the dynamics of the favorite-longshot bias (FLB) using data on horse race betting from an online exchange that allows bettors to trade "in-play." I find that probabilistic forecasts implied by market prices before the start of the races are well calibrated, but that the degree of FLB increases significantly as the events approach their end.
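A minimal sketch of such a calibration check, under the standard assumption that a decimal-odds price implies a win probability p = 1/odds (hypothetical data, not the chapter's dataset):

```python
# Bin events by market-implied win probability and compare each bin's
# average implied probability with its empirical win rate; a
# well-calibrated market puts both near the 45-degree line.
import numpy as np

odds = np.array([1.5, 2.0, 3.0, 6.0, 11.0, 1.8, 4.0, 21.0])  # decimal odds
won  = np.array([1,   1,   0,   0,   0,    1,   1,   0])     # 1 = horse won

p = 1.0 / odds
bins = np.array([0.0, 0.1, 0.25, 0.5, 1.0])
idx = np.digitize(p, bins) - 1
for b in range(len(bins) - 1):
    mask = idx == b
    if mask.any():
        print(f"implied {p[mask].mean():.2f}  vs  empirical {won[mask].mean():.2f}")
```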
Abstract:
Er³⁺-doped strontium lead bismuth glass for developing upconversion lasers has been fabricated and characterized. The Judd-Ofelt intensity parameters Ω_t (t = 2, 4, 6), calculated from the experimental absorption spectrum and Judd-Ofelt theory, were found to be Ω₂ = 2.95 × 10⁻²⁰, Ω₄ = 0.91 × 10⁻²⁰, and Ω₆ = 0.36 × 10⁻²⁰ cm². Under 975 nm excitation, intense green and red emissions centered at 525, 546, and 657 nm, corresponding to the transitions ²H₁₁/₂ → ⁴I₁₅/₂, ⁴S₃/₂ → ⁴I₁₅/₂, and ⁴F₉/₂ → ⁴I₁₅/₂ respectively, were observed. The upconversion mechanisms are discussed based on energy matching and the quadratic dependence on excitation power; the dominant mechanisms are excited-state absorption and energy-transfer upconversion for the green and red emissions. The long-lived ⁴I₁₁/₂ level is believed to serve as the intermediate state responsible for the upconversion processes. (C) 2004 Published by Elsevier B.V.
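For reference, the Ω_t parameters enter the standard Judd-Ofelt expression for the electric-dipole line strength of a transition J → J' (quoted from standard theory, not from this paper):

```latex
S_{\mathrm{ed}}(J \to J') \;=\; \sum_{t=2,4,6} \Omega_t \,
  \bigl| \langle (S,L)J \,\|\, U^{(t)} \,\|\, (S',L')J' \rangle \bigr|^{2}
```

where the U^(t) are the rank-t unit tensor operators, whose reduced matrix elements for Er³⁺ are tabulated in the literature.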
Abstract:
This paper explores the fiscal situation of the European Union as well as the different approaches proposed for achieving fiscal policy coordination among the member states. Furthermore, it argues that fiscal integration is a pressing need if the European Union is to achieve its aim of functioning as an efficient single market and economic unit. To that end, it analyzes the theoretical lines of Modern Money Theory as a possible framework for further integration and evaluates the different proposals made for fiscal integration, assessing their compatibility with this theory.
Abstract:
Many studies attempt to predict the potential return on stock portfolios, with the aim of obtaining better returns on invested capital. Several models have been used, the best known being those that relate risk to return. Prominent in this line are Portfolio Theory, proposed by Markowitz, and Sharpe's CAPM. These theories explain the influence of the covariance of returns and show that, for better portfolio performance, it is not enough to evaluate each asset individually. On the other hand, various criticisms of the CAPM have prompted complementary studies in search of other variables that improve asset-selection methods. Fama and French (1993) studied variables complementary to the CAPM beta, using size and the book-to-market ratio, and obtained better results than the traditional CAPM. The present study takes into account the reinvestment of generated profit and, using the Gordon model, proposes a variable for classifying growth companies and value companies, a concept already used in the finance literature. Based on this variable, stock portfolios are formed between 2005 and 2012, and it is observed that gains can be obtained with the proposed logic. Over the period, the selected portfolios would have earned up to 107.85%, against returns of 55.58% for portfolios holding all assets. Organizing the same assets by the book-to-market ratio yields a total return of 90.42% for the period. A clear change in behavior is nonetheless apparent: only in the first four years of the study are the value portfolios superior, while in the last four periods the growth portfolios perform best. These results are consistent with those of Braga and Leal (2000) and Mescolin, Martinelli Braga, and da Costa Jr. (1997), which found better performance for value companies.
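A minimal sketch of a Gordon-model-based growth/value split (the thesis's exact classification variable is not reproduced here; the figures and the cutoff below are hypothetical):

```python
# Under the Gordon model, the fair price of a stock is P = D1 / (k - g),
# and with earnings reinvestment the implied growth rate is g = b * ROE,
# where b is the retention (plowback) ratio.
def implied_growth(roe: float, payout_ratio: float) -> float:
    return roe * (1.0 - payout_ratio)       # g = b * ROE

def gordon_price(next_dividend: float, k: float, g: float) -> float:
    assert k > g, "discount rate must exceed growth"
    return next_dividend / (k - g)

g = implied_growth(roe=0.18, payout_ratio=0.30)            # 0.126
print(g, gordon_price(next_dividend=2.0, k=0.15, g=g))     # ~83.33
label = "growth" if g > 0.08 else "value"                  # hypothetical cutoff
print(label)
```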
Abstract:
In this paper, a quantum chemistry method was used to investigate the effect of different sizes of substituted phenanthrolines on absorption, energy transfer, and the electroluminescent performance of a series of Eu(TTA)₃L (L = [1,10]phenanthroline (Phen), pyrazino[2,3-f][1,10]phenanthroline (PyPhen), 2-methylpyrazino[2,3-f][1,10]phenanthroline (MPP), dipyrido[3,2-a:2',3'-c]phenazine (DPPz), 11-methyldipyrido[3,2-a:2',3'-c]phenazine (MDPz), 11,12-dimethyldipyrido[3,2-a:2',3'-c]phenazine (DDPz), and benzo[i]dipyrido[3,2-a:2',3'-c]phenazine (BDPz)) complexes. Absorption spectra calculations show that different sizes of secondary ligands have different effects on transition characters, intensities, and absorption peak positions.
Abstract:
This research is concerned with designing representations for analytical reasoning problems (of the sort found on the GRE and LSAT). These problems test the ability to draw logical conclusions. A computer program was developed that takes as input a straightforward predicate calculus translation of a problem, requests additional information if necessary, decides what to represent and how, designs representations capturing the constraints of the problem, and creates and executes a LISP program that uses those representations to produce a solution. Even though these problems are typically difficult for theorem provers to solve, the LISP program that uses the designed representations is very efficient.
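As a minimal illustration of the representation problem (not the paper's LISP system), an ordering puzzle of this kind can be encoded as predicates filtered over permutations:

```python
# An LSAT-style ordering puzzle (hypothetical): five speakers A-E;
# A speaks before B; C speaks immediately after D; E does not speak first.
from itertools import permutations

speakers = "ABCDE"
constraints = [
    lambda s: s.index("A") < s.index("B"),
    lambda s: s.index("C") == s.index("D") + 1,
    lambda s: s[0] != "E",
]

solutions = [s for s in permutations(speakers) if all(c(s) for c in constraints)]
print(len(solutions), solutions[0])
```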
Abstract:
In this paper we introduce a theory of policy routing dynamics based on fundamental axioms of routing update mechanisms. We develop a dynamic policy routing model (DPR) that extends the static formalism of the stable paths problem (introduced by Griffin et al.) with discrete synchronous time. DPR captures the propagation of path changes in any dynamic network irrespective of its time-varying topology. We introduce several novel structures, such as causation chains, dispute fences, and policy digraphs, that model different aspects of routing dynamics and provide insight into how these dynamics manifest in a network. We demonstrate the practicality of the theoretical foundation provided by DPR on two fundamental problems: routing dynamics minimization and policy conflict detection. The dynamics minimization problem utilizes policy digraphs, which capture the dependencies in routing policies irrespective of underlying topology dynamics, to solve a graph optimization problem. This optimization problem explicitly minimizes the number of routing update messages in a dynamic network by optimally changing the path preferences of a minimal subset of nodes. The conflict detection problem, on the other hand, utilizes a theoretical result of DPR whereby the root cause of a causation cycle (i.e., a cycle of routing update messages) can be precisely inferred as either a transient route flap or a dispute wheel (i.e., a policy conflict). Using this result we develop SafetyPulse, a token-based distributed algorithm to detect policy conflicts in a dynamic network. SafetyPulse is privacy preserving, computationally efficient, and provably correct.
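A minimal sketch of the kind of discrete synchronous dynamics DPR formalizes (this is not the authors' model code; the topology and policies are hypothetical, chosen so that the synchronous updates oscillate, i.e., exhibit a causation cycle of the sort the conflict-detection result targets):

```python
# Each node ranks its permitted paths to destination 0; in every discrete
# round all nodes simultaneously re-select their best permitted path that
# extends a neighbor's current path; path changes are counted as a proxy
# for routing update messages.
neighbors = {1: [0, 2], 2: [0, 1], 3: [1, 2]}
ranked = {  # permitted paths, most preferred first ("disagree"-style policies)
    1: [(1, 2, 0), (1, 0)],
    2: [(2, 1, 0), (2, 0)],
    3: [(3, 1, 0), (3, 2, 0)],
}
path = {0: (0,), 1: None, 2: None, 3: None}

for rnd in range(10):
    new, messages = {}, 0
    for u in (1, 2, 3):  # compute all choices from the OLD state (synchronous)
        avail = {(u,) + path[v] for v in neighbors[u]
                 if path[v] is not None and u not in path[v]}
        new[u] = next((p for p in ranked[u] if p in avail), None)
    for u in (1, 2, 3):
        if new[u] != path[u]:
            messages += 1
            path[u] = new[u]
    print(f"round {rnd}: {path}, updates={messages}")
    if messages == 0:   # never reached here: nodes 1 and 2 flap forever
        break
```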
Abstract:
We propose Trade & Cap (T&C), an economics-inspired mechanism that incentivizes users to voluntarily coordinate their consumption of the bandwidth of a shared resource (e.g., a DSLAM link) so as to converge on what they perceive to be an equitable allocation, while ensuring efficient resource utilization. Under T&C, rather than acting as an arbiter, an Internet Service Provider (ISP) acts as an enforcer of what the community of rational users sharing the resource decides is a fair allocation of that resource. Our T&C mechanism proceeds in two phases. In the first, software agents acting on behalf of users engage in a strategic trading game in which each user agent selfishly chooses bandwidth slots to reserve in support of primary, interactive network usage activities. In the second phase, each user is allowed to acquire additional bandwidth slots in support of a presumed open-ended need for fluid bandwidth, catering to secondary applications. The acquisition of this fluid bandwidth is subject to each user's remaining "buying power" and to prevailing "market prices" – both of which are determined by the results of the trading phase and a desirable aggregate cap on link utilization. We present analytical results that establish the underpinnings of our T&C mechanism, including game-theoretic results pertaining to the trading phase, and pricing of fluid bandwidth allocation pertaining to the capping phase. Using real network traces, we present extensive experimental results that demonstrate the benefits of our scheme, which we also show to be practical by highlighting the salient features of an efficient implementation architecture.
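As a rough illustration of the capping phase only (T&C's actual prices emerge from the trading game; the budget-proportional split below is a simplifying assumption of this sketch, with hypothetical numbers):

```python
# Split an aggregate fluid-bandwidth cap in proportion to each user's
# remaining "buying power" left over from the trading phase.
def fluid_shares(budgets: dict, cap: float) -> dict:
    total = sum(budgets.values())
    return {u: cap * b / total for u, b in budgets.items()}

print(fluid_shares({"alice": 6.0, "bob": 3.0, "carol": 1.0}, cap=20.0))
# {'alice': 12.0, 'bob': 6.0, 'carol': 2.0}
```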
Abstract:
In work that involves mathematical rigor, there are numerous benefits to adopting a representation of models and arguments that can be supplied to a formal reasoning or verification system: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of formal verification tools that can provide these benefits. In earlier work [Lap09a], we attempt to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. This work expands one aspect of the earlier work by considering more extensively an essential capability for any formal reasoning system whose design is oriented around simulating the natural context: native support for a collection of mathematical relations that deal with common constructs in arithmetic and set theory. We provide a formal definition for a context of relations that can be used to both validate and assist formal reasoning activities. We provide a proof that any algorithm that implements this formal structure faithfully will necessarily converge. Finally, we consider the efficiency of an implementation of this formal structure that leverages modular implementations of well-known data structures: balanced search trees and transitive closures of hypergraphs.
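One ingredient named above can be sketched directly: keeping a set of relation facts closed under transitivity so that derived facts can be validated (a minimal sketch, not the paper's context-of-relations structure):

```python
# Warshall-style closure over pairs: from ("a","b") and ("b","c"),
# derive ("a","c"), so a query like a < d can be answered from chains.
def transitive_closure(facts: set) -> set:
    closure = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

lt = transitive_closure({("a", "b"), ("b", "c"), ("c", "d")})
print(("a", "d") in lt)  # True
```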
Abstract:
Strategic reviews of the Irish Food and Beverage Industry have consistently emphasised the need for food and beverage firms to improve their innovation and marketing capabilities, in order to maintain competitiveness in both domestic and overseas markets. In particular, the functional food and beverages market has been singled out as an extremely important emerging market, which Irish firms could benefit from through an increased technological and market orientation. Although health and wellness have been the most significant drivers of new product development (NPD) in recent years, failure rates for new functional foods and beverages have been reportedly high. In that context, researchers in the US, UK, Denmark and Ireland have reported a marked divergence between NPD practices within food and beverage firms and normative advice for successful product development. The high reported failure rates for new functional foods and beverages suggest a failure to manage customer knowledge effectively, as well as a lack of knowledge management between functional disciplines involved in the NPD process. This research explored the concept of managing customer knowledge at the early stages of the NPD process, and applied it to the development of a range of functional beverages, through the use of advanced concept optimisation research techniques, which provided for a more market-oriented approach to new food product development. A sequential exploratory research design strategy using mixed research methods was chosen for this study. First, the qualitative element of this research investigated customers’ choice motives for orange juice and soft drinks, and explored their attitudes and perceptions towards a range of new functional beverage concepts through a combination of 15 in-depth interviews and 3 focus groups. Second, the quantitative element of this research consisted of 3 conjoint-based questionnaires administered to 400 different customers in each study in order to model their purchase preferences for chilled nutrient-enriched and probiotic orange juices, and stimulant soft drinks. The in-depth interviews identified the key product design attributes that influenced customers’ choice motives for orange juice. The focus group discussions revealed that groups of customers were negative towards the addition of certain functional ingredients to natural foods and beverages. K-means cluster analysis was used to quantitatively identify segments of customers with similar preferences for chilled nutrient-enriched and probiotic orange juices, and stimulant soft drinks. Overall, advanced concept optimisation research methods facilitate the integration of the customer at the early stages of the NPD process, which promotes a multi-disciplinary approach to new food product design. This research illustrated how advanced concept optimisation research methods could contribute towards effective and efficient knowledge management in the new food product development process.
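A minimal sketch of the segmentation step (synthetic data standing in for the study's conjoint part-worth utilities):

```python
# Cluster respondents by their conjoint part-worths with k-means, as the
# study does to identify customer segments for the beverage concepts.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# rows: respondents; columns: synthetic part-worths standing in for
# attributes such as price, flavour, functional ingredient, brand
partworths = rng.normal(size=(400, 4))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(partworths)
for k in range(3):
    seg = partworths[km.labels_ == k]
    print(f"segment {k}: n={len(seg)}, mean part-worths={seg.mean(axis=0).round(2)}")
```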
Abstract:
There is much common ground between the areas of coding theory and systems theory. Fitzpatrick has shown that a Gröbner basis approach leads to efficient algorithms in the decoding of Reed-Solomon codes and in scalar interpolation and partial realization. This thesis simultaneously generalizes and simplifies that approach and presents applications to discrete-time modeling, multivariable interpolation, and list decoding. Gröbner basis theory has come into its own in the context of software and algorithm development. By generalizing the concept of polynomial degree, term orders are provided for multivariable polynomial rings and free modules over polynomial rings. The orders are not, in general, unique, and this adds, in no small way, to the power and flexibility of the technique. As well as being generating sets for ideals or modules, Gröbner bases always contain an element which is minimal with respect to the corresponding term order. Central to this thesis is a general algorithm, valid for any term order, that produces a Gröbner basis for the solution module (or ideal) of elements satisfying a sequence of generalized congruences. These congruences, based on shifts and homomorphisms, are applicable to a wide variety of problems, including key equations and interpolations. At the core of the algorithm is an incremental step. Iterating this step lends a recursive/iterative character to the algorithm. As a consequence, not all of the input to the algorithm need be available from the start, and different "paths" can be taken to reach the final solution. The existence of a suitable chain of modules satisfying the criteria of the incremental step is a prerequisite for applying the algorithm.
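A minimal sketch of a Gröbner basis computation with an off-the-shelf tool (SymPy), illustrating how the choice of term order changes the basis; the thesis's congruence-based incremental algorithm is not reproduced here:

```python
from sympy import groebner, symbols

x, y = symbols("x y")
polys = [x**2 + y**2 - 1, x*y - 1]

# The same ideal, two term orders: the resulting bases differ.
print(groebner(polys, x, y, order="lex"))
print(groebner(polys, x, y, order="grevlex"))
```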
Abstract:
In this work, the properties of strained tetrahedrally bonded materials are explored theoretically, with special focus on group-III nitrides. In order to do so, a multiscale approach is taken: accurate quantitative calculations of material properties are carried out in a quantum first-principles frame, for small systems. These properties are then extrapolated, and empirical methods are employed to make predictions for larger systems, such as alloys or nanostructures. We focus our attention on elasticity and electric polarization in semiconductors. These quantities serve as input for the calculation of the optoelectronic properties of these systems. Regarding the methods employed, our first-principles calculations use highly accurate density functional theory (DFT) within both standard Kohn-Sham and generalized (hybrid functional) Kohn-Sham approaches. We have developed our own empirical methods, including valence force field (VFF) and a point-dipole model for the calculation of local polarization and local polarization potential. Our local polarization model gives insight, for the first time, into local fluctuations of the electric polarization at an atomistic level. At the continuum level, we have studied composition-engineering optimization of nitride nanostructures for built-in electrostatic field reduction, and have developed a highly efficient hybrid analytical-numerical staggered-grid computational implementation of continuum elasticity theory, which is used to treat larger systems, such as quantum dots.
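For reference, a commonly used VFF of the Keating type writes the elastic energy of a tetrahedrally bonded crystal as follows (quoted from the general literature; not necessarily the authors' exact parameterization):

```latex
E = \sum_{i}\sum_{j} \frac{3\,\alpha_{ij}}{16\,d_{ij}^{2}}
      \left( \mathbf{r}_{ij}\cdot\mathbf{r}_{ij} - d_{ij}^{2} \right)^{2}
  + \sum_{i}\sum_{j,\,k>j} \frac{3\,\beta_{ijk}}{8\,d_{ij}d_{ik}}
      \left( \mathbf{r}_{ij}\cdot\mathbf{r}_{ik} + \tfrac{1}{3}\,d_{ij}d_{ik} \right)^{2}
```

where r_ij are bond vectors, d_ij equilibrium bond lengths, and α, β the bond-stretching and bond-bending constants; the +1/3 term reflects the ideal tetrahedral angle (cos θ₀ = −1/3).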
Abstract:
Future high speed communications networks will transmit data predominantly over optical fibres. As consumer and enterprise computing will remain the domain of electronics, the electro-optical conversion will get pushed further downstream towards the end user. Consequently, efficient tools are needed for this conversion, and due to many potential advantages, including low cost and high output powers, long wavelength Vertical Cavity Surface Emitting Lasers (VCSELs) are a viable option. Drawbacks, such as broader linewidths than competing options, can be mitigated through additional techniques such as Optical Injection Locking (OIL), which can require significant expertise and expensive equipment. This thesis addresses these issues by removing some of the experimental barriers to achieving performance increases via remote OIL. Firstly, numerical simulations of the phase and the photon and carrier numbers of an OIL semiconductor laser allowed the classification of the stable locking phase limits into three distinct groups. The frequency detuning of constant phase values (φ) was considered, in particular φ = 0, where the modulation response parameters were shown to be independent of the linewidth enhancement factor, α. A new method to estimate α and the coupling rate in a single experiment was formulated. Secondly, a novel technique to remotely determine the locked state of a VCSEL, based on voltage variations of 2 mV to 30 mV during detuned injection, has been developed which can identify oscillatory and locked states. 2D and 3D maps of voltage, optical, and electrical spectra illustrate the corresponding behaviours. Finally, the use of directly modulated VCSELs as light sources for passive optical networks was investigated by successful transmission of data at 10 Gbit/s over 40 km of single mode fibre (SMF), using cost-effective electronic dispersion compensation to mitigate errors due to wavelength chirp. A widely tuneable MEMS-VCSEL was established as a good candidate for an externally modulated colourless source after a record error-free transmission at 10 Gbit/s over 50 km of SMF across a 30 nm single mode tuning range. The ability to remotely set the emission wavelength using the novel methods developed in this thesis was demonstrated.
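For reference, the textbook stable-locking condition for an optically injection-locked semiconductor laser is asymmetric in the detuning precisely because of α (a standard result, not taken from the thesis; sign conventions vary across the literature):

```latex
-\,\kappa \sqrt{1+\alpha^{2}}\;\sqrt{\frac{P_{\mathrm{inj}}}{P_{0}}}
\;\le\; \Delta\omega \;\le\;
\kappa\,\sqrt{\frac{P_{\mathrm{inj}}}{P_{0}}}
```

where κ is the coupling rate, P_inj/P_0 the injection power ratio, and Δω the master-slave angular-frequency detuning; the range is wider on the negative-detuning side by the factor √(1+α²).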