926 results for Efficient Market Hypothesis


Relevance: 30.00%

Abstract:

Reducing tariffs and increasing consumption taxes is standard IMF advice to countries that want to open up their economies without hurting government finances. Indeed, theoretical analysis of such a tariff–tax reform shows an unambiguous increase in welfare and government revenues. The present paper examines whether the country that implements such a reform actually ends up opening its markets to international trade, i.e. whether its market access improves. It is shown that this is not necessarily so. We also show that, compared with a reform of tariffs alone, the tariff–tax reform is a less efficient proposal with respect to both market access and welfare.

Relevance: 30.00%

Abstract:

Market microstructure is "the study of the trading mechanisms used for financial securities" (Hasbrouck, 2007). It seeks to understand the sources of value and reasons for trade in a setting with different types of traders and different private and public information sets. The actual mechanisms of trade are a continually changing object of study; they include continuous markets, auctions, limit order books, dealer markets, and combinations of these operating as a hybrid market. Microstructure also has to allow for the possibility of multiple prices. At any given time an investor may be faced with a multitude of different prices, depending on whether he or she is buying or selling, the quantity he or she wishes to trade, and the required speed for the trade. The price may also depend on the relationship the trader has with potential counterparties. In this research, I touch upon all of the above issues by studying three specific areas, all of which have both practical and policy implications. First, I study the role of information in trading and pricing securities in markets with a heterogeneous population of traders, some of whom are informed and some not, and who trade for different private or public reasons. Second, I study the price discovery of stocks in a setting where they are simultaneously traded in more than one market. Third, I contribute to the ongoing discussion about market design, i.e. the question of which trading systems and ways of organizing trading are most efficient. A common characteristic throughout the thesis is the use of high-frequency datasets, i.e. tick data, which include all trades and quotes in a given security rather than just the daily closing prices used in the traditional asset-pricing literature. The thesis consists of four separate essays. In the first essay I study price discovery for European companies cross-listed in the United States, along with explanatory variables for differences in price discovery. In the second essay I contribute to earlier research on two issues of broad interest in market microstructure, market transparency and informed trading, by examining the effects of a change to an anonymous market at the OMX Helsinki Stock Exchange. I broaden the focus slightly in the third essay to include releases of macroeconomic data in the United States, analyzing the effect of these releases on European cross-listed stocks. The fourth and last essay applies standard methodologies of price discovery analysis in a novel way: I study price discovery within one market, between local and foreign traders.
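
The point that an investor faces many prices at once, depending on side, size and urgency, can be made concrete with a toy limit order book. The sketch below uses invented quotes (nothing here is from the thesis's data) and walks the book to show the volume-weighted execution price diverging across side and quantity:

```python
# Toy limit order book: execution price depends on side and quantity.
# All quotes are hypothetical; this merely illustrates "multiple prices".

bids = [(10.00, 300), (9.99, 500), (9.98, 800)]   # (price, size), best first
asks = [(10.02, 200), (10.03, 400), (10.04, 900)]

def avg_execution_price(book, quantity):
    """Walk the book and return the volume-weighted execution price."""
    remaining, cost = quantity, 0.0
    for price, size in book:
        take = min(remaining, size)
        cost += take * price
        remaining -= take
        if remaining == 0:
            return cost / quantity
    raise ValueError("order exceeds displayed depth")

print(avg_execution_price(asks, 100))   # small buy: 10.02
print(avg_execution_price(asks, 600))   # larger buy walks the book: ~10.027
print(avg_execution_price(bids, 600))   # sell of the same size: ~9.995
```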

Relevance: 30.00%

Abstract:

This paper investigates the clustering pattern in the Finnish stock market. Using trading volume and time as factors capturing the clustering pattern in the market, the Keim and Madhavan (1996) and Engle and Russell (1998) models provide the framework for the analysis. The descriptive and parametric analyses provide evidence that an important determinant of the famous U-shape pattern in the market is the rate of information arrival, as measured by large trading volumes and durations at the market open and close. Specifically: (1) the larger the trading volume, the greater the impact on prices in both the short and the long run, so prices will differ across quantities; (2) large trading volume is a non-linear function of price changes in the long run; (3) arrival times are positively autocorrelated, indicating a clustering pattern; and (4) information arrivals, as approximated by durations, are negatively related to trading flow.
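
The Engle and Russell (1998) framework referred to here is the autoregressive conditional duration (ACD) model, in which the expected time between trades depends on past durations. A minimal simulation sketch of an ACD(1,1) with exponential innovations, using purely illustrative parameter values, shows how positively autocorrelated durations arise:

```python
import numpy as np

# ACD(1,1): x_i = psi_i * eps_i,  psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}
# with eps_i ~ Exponential(1). Parameters are illustrative, not estimates.
omega, alpha, beta = 0.1, 0.15, 0.8
n = 10_000

rng = np.random.default_rng(0)
x = np.empty(n)
psi = omega / (1 - alpha - beta)          # start at the unconditional mean
prev_x = psi
for i in range(n):
    psi = omega + alpha * prev_x + beta * psi
    x[i] = psi * rng.exponential()
    prev_x = x[i]

# Positive autocorrelation in durations = clustering of trade arrivals.
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(f"lag-1 autocorrelation of durations: {lag1:.3f}")
```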

Relevance: 30.00%

Abstract:

This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry, and size portfolio returns with asymmetric GARCH specifications following Glosten et al. (1993), using daily return data from January 2, 1987 to December 30, 1998. We find that the world shock significantly enters the domestic models, and that its impact has increased over time. The same holds for the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. Surprisingly, the asymmetry parameter is non-significant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns does not usually include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
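
The Glosten et al. (1993) specification (GJR-GARCH) adds an indicator for negative shocks to the variance equation, so that bad news can raise volatility more than good news; a non-significant asymmetry parameter, as reported above, corresponds to gamma near zero. A minimal simulation sketch with illustrative parameters:

```python
import numpy as np

# GJR-GARCH(1,1): sigma2_t = omega + (alpha + gamma * 1{r_{t-1} < 0}) * r_{t-1}^2
#                            + beta * sigma2_{t-1}
# Parameter values are illustrative only.
omega, alpha, gamma, beta = 0.02, 0.05, 0.10, 0.85
n = 5_000

rng = np.random.default_rng(1)
r = np.empty(n)
sigma2 = omega / (1 - alpha - gamma / 2 - beta)   # unconditional variance
for t in range(n):
    r[t] = np.sqrt(sigma2) * rng.standard_normal()
    shock = alpha + (gamma if r[t] < 0 else 0.0)   # asymmetry: negative shocks
    sigma2 = omega + shock * r[t] ** 2 + beta * sigma2

print(f"sample std of simulated returns: {r.std():.3f}")
```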

Relevance: 30.00%

Abstract:

This research investigates the impact of agricultural market liberalization on food security in developing countries and evaluates the supply perspective of food security. The theme is applied to the agricultural sectors of Kenya and Zambia by studying the role policies played in the maize sub-sector. Selected policies introduced at the beginning of the 1980s are evaluated, along with an assessment of whether those policies influenced maize output. A theoretical model of agricultural production is then formulated to reflect cereal production in a developing-country setting. The study begins with a review of the general framework and aims of the structural adjustment programs and proceeds to their application in the maize sector in Kenya and Zambia. A literature review of the supply and demand synthesis of food security is presented, with examples from various developing countries. In contrast to previous studies on food security, this study assesses two countries with divergent economic orientations and evaluates the agricultural sector's response to economic and institutional policies in different settings. Finally, a dynamic time series econometric model is applied to assess the effects of policy on maize output. The empirical findings suggest a weak policy influence on maize output, while the precipitation and acreage variables stand out as core determinants. The policy dimension of acreage, and how markets influence it, is not discussed at length in this study: due to weak land rights and tenure structures in these countries, the direct impact of policy change on land markets cannot be precisely measured. Recurring government intervention during the structural policy implementation period impeded the efficient functioning of input and output markets, particularly in Zambia. Input and output prices of maize and fertilizer responded more strongly in Kenya than in Zambia, where the state often ceded to public pressure by revoking pertinent policy measures. These interpretations are based on the policy variables, which are more responsive in Kenya than in Zambia. According to the regression results, agricultural markets in general, and the maize sub-sector in particular, responded more positively to the implemented policies in Kenya than in Zambia, which supported a more socialist economic system. The results indicate that for policies to be effective, sector and regional dimensions need to be considered; these dimensions were not taken into account in the formulation and implementation of the structural adjustment policies of the 1980s. Countries with vibrant economic structures and institutions fared better than those with a firm, socially founded system.
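
The "dynamic time series econometric model" of maize output can be illustrated with a simple autoregressive distributed lag regression. This is a hedged sketch only: the file name and the columns (output, rainfall, acreage, a liberalization dummy) are hypothetical stand-ins for the study's variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical annual data: maize output, rainfall, acreage, and a dummy
# equal to 1 in liberalization years. File and column names are illustrative.
df = pd.read_csv("maize_annual.csv")
df["output_lag1"] = df["output"].shift(1)
df = df.dropna()

# ADL(1,0): current output on its own lag, precipitation, acreage and policy.
model = smf.ols("output ~ output_lag1 + rainfall + acreage + reform_dummy",
                data=df).fit()
print(model.summary())
```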

Relevance: 30.00%

Abstract:

We address the problem of mining targeted association rules over multidimensional market-basket data. Here, each transaction has, in addition to the set of purchased items, ancillary dimension attributes associated with it. Based on these dimensions, transactions can be visualized as distributed over the cells of an n-dimensional cube. In this framework, a targeted association rule is of the form {X -> Y}_R, where R is a convex region in the cube and X -> Y is a traditional association rule within region R. We first describe the TOARM algorithm, based on classical techniques, for identifying targeted association rules. Then, we discuss the concepts of bottom-up aggregation and cubing, leading to the CellUnion technique. This approach is further extended, using notions of cube-count interleaving and credit-based pruning, to derive the IceCube algorithm. Our experiments demonstrate that IceCube consistently provides the best execution-time performance, especially for large and complex data cubes.
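
A brute-force baseline in the spirit of the TOARM formulation (not the optimized CellUnion or IceCube algorithms) restricts transactions to a region R of the dimension cube and then evaluates X -> Y inside it. A minimal sketch with invented transactions:

```python
# Brute-force targeted rule check over a region of the dimension cube.
# Data and the region predicate are invented for illustration.

transactions = [
    # (dimension vector: (store, month), purchased items)
    ((1, 1), {"bread", "milk"}),
    ((1, 2), {"bread", "milk", "eggs"}),
    ((2, 1), {"milk", "eggs"}),
    ((3, 2), {"bread", "eggs"}),
]

def targeted_confidence(transactions, region, X, Y):
    """Support and confidence of X -> Y over transactions whose
    dimension vector falls inside the (convex) region."""
    in_region = [items for dims, items in transactions if region(dims)]
    n_x = sum(1 for items in in_region if X <= items)
    n_xy = sum(1 for items in in_region if (X | Y) <= items)
    support = n_xy / len(in_region) if in_region else 0.0
    confidence = n_xy / n_x if n_x else 0.0
    return support, confidence

# Region R: stores 1-2, any month (an axis-aligned box, hence convex).
region = lambda dims: 1 <= dims[0] <= 2
print(targeted_confidence(transactions, region, {"bread"}, {"milk"}))
# (0.667, 1.0): within R, every bread basket also contains milk.
```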

Relevance: 30.00%

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and the more ambitiously we extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and higher model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests: Bayesian Rapid Optimal Adaptive Designs (BROAD), which sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, both theoretically and experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
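
The core loop of such an adaptive design can be sketched in simplified form: keep a posterior over candidate theories and greedily choose the test whose answer separates the most prior mass of theory pairs, an EC2-flavoured score. The toy below is noiseless and deterministic, an illustration of the idea rather than the thesis's implementation; the theories, tests and predicted choices are invented:

```python
from itertools import combinations

# Candidate theories and their deterministic predicted choice (0 or 1)
# on each candidate test. All values are invented for illustration.
predictions = {                  # test_id -> {theory: predicted choice}
    "t1": {"EV": 0, "PT": 1, "CRRA": 0},
    "t2": {"EV": 0, "PT": 0, "CRRA": 1},
    "t3": {"EV": 1, "PT": 1, "CRRA": 1},   # uninformative: separates nothing
}
posterior = {"EV": 1 / 3, "PT": 1 / 3, "CRRA": 1 / 3}

def ec2_score(test, posterior):
    """Posterior mass of theory pairs this test separates (its edges cut)."""
    preds = predictions[test]
    return sum(posterior[a] * posterior[b]
               for a, b in combinations(posterior, 2)
               if preds[a] != preds[b])

def update(test, observed, posterior):
    """Noiseless update: drop inconsistent theories, renormalize the rest."""
    new = {h: p for h, p in posterior.items()
           if predictions[test][h] == observed}
    total = sum(new.values())
    return {h: p / total for h, p in new.items()}

best = max(predictions, key=lambda t: ec2_score(t, posterior))
print("most informative test:", best)          # t1 or t2, never t3
posterior = update(best, observed=0, posterior=posterior)
print("posterior after observing choice 0:", posterior)
```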

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely: expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the chosen lotteries. Aggregate posterior probabilities over the theories show limited evidence in favour of the CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and since we do not find any signatures of it in our data.
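
The competing theory classes score a lottery in different ways, which is what makes choices between lotteries informative. A sketch of the textbook functional forms (the moments class, a weighted sum of mean, variance and skewness, follows the same pattern and is omitted); parameter values are illustrative, not the thesis's estimates:

```python
# Textbook valuations of a lottery [(probability, outcome), ...].
# Functional forms are standard; parameter values are illustrative.

lottery = [(0.5, 100.0), (0.5, -50.0)]

def expected_value(lottery):
    return sum(p * x for p, x in lottery)

def crra_utility(lottery, rho=0.5, wealth=200.0):
    # CRRA over final wealth: u(w) = w**(1 - rho) / (1 - rho), rho != 1.
    return sum(p * (wealth + x) ** (1 - rho) / (1 - rho) for p, x in lottery)

def prospect_value(lottery, alpha=0.88, lam=2.25, gamma=0.61):
    # Kahneman-Tversky value function with probability weighting.
    def v(x):
        return x ** alpha if x >= 0 else -lam * (-x) ** alpha
    def w(p):
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    return sum(w(p) * v(x) for p, x in lottery)

print(expected_value(lottery))   # 25.0: attractive to an EV maximizer
print(crra_utility(lottery))
print(prospect_value(lottery))   # negative: loss aversion flips the sign
```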

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and for hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
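
The discounting families under comparison differ only in how the discount factor D(t) declines with delay t. A compact sketch, writing the quasi-hyperbolic model in the common (β, δ) notation; all parameter values are illustrative:

```python
# Discount factors D(t) for the families compared in the experiment.
# Parameter values are illustrative only.

def exponential(t, delta=0.95):
    return delta ** t

def hyperbolic(t, k=0.25):
    return 1.0 / (1.0 + k * t)

def quasi_hyperbolic(t, beta=0.7, delta=0.95):
    # "Present bias": immediate rewards are undiscounted; delayed rewards
    # take a one-off penalty beta on top of exponential discounting.
    return 1.0 if t == 0 else beta * delta ** t

def generalized_hyperbolic(t, a=1.0, b=2.0):
    return (1.0 + a * t) ** (-b / a)

for t in (0, 1, 5, 20):
    print(t, exponential(t), hyperbolic(t),
          quasi_hyperbolic(t), generalized_hyperbolic(t))
```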

In these models the passage of time is linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and time-inconsistent choices.

We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinctly different from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than its price elasticity explains. More importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
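
A minimal version of such a loss-averse discrete choice specification is a multinomial logit whose price utility is kinked around a reference price. The sketch below uses hypothetical coefficients and products, not the retailer's data; it reproduces the qualitative prediction that an item coming off a discount loses share to its substitute:

```python
import numpy as np

# Multinomial logit with reference-dependent price utility. Coefficients
# and products are hypothetical; this is a sketch of the model class.

def utility(price, ref_price, a=1.0, eta=0.05, lam=2.0):
    """a: base attractiveness; eta: price sensitivity; lam > 1 means
    losses (price above reference) loom larger than equal-sized gains."""
    gain = max(ref_price - price, 0.0)
    loss = max(price - ref_price, 0.0)
    return a - eta * price + eta * gain - lam * eta * loss

def choice_probs(prices, ref_prices):
    u = np.array([utility(p, r) for p, r in zip(prices, ref_prices)])
    expu = np.exp(u - u.max())          # stable softmax
    return expu / expu.sum()

# Item 1 just came off a discount (reference 8, price back at 10); the
# perceived "loss" depresses its share relative to never-discounted item 2.
print(choice_probs(prices=[10.0, 10.0], ref_prices=[8.0, 10.0]))
```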

In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance: 30.00%

Abstract:

This thesis describes some aspects of a computer system for doing medical diagnosis in the specialized field of kidney disease. Because such a system faces the spectre of combinatorial explosion, this discussion concentrates on heuristics which control the number of concurrent hypotheses, and on efficient "compiled" representations of medical knowledge. In particular, the differential diagnosis of hematuria (blood in the urine) is discussed in detail. A protocol of a simulated doctor/patient interaction is presented and analyzed to determine the crucial structures and processes involved in the diagnosis procedure. The data structure proposed for representing medical information revolves around elementary hypotheses which are activated when certain findings are present; the process of disposing of findings, activating hypotheses, evaluating hypotheses locally and combining hypotheses globally is examined for its heuristic implications. The thesis attempts to fit the problem of medical diagnosis into the framework of other Artificial Intelligence problems and paradigms, and in particular explores the notions of pure search vs. heuristic methods, linearity and interaction, local vs. global knowledge, and the structure of hypotheses within the world of kidney disease.
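
The finding-triggered activation described above can be sketched as a simple loop: findings activate candidate hypotheses, which are then scored locally against everything observed, keeping the set of concurrent hypotheses small. All medical content below is invented for illustration:

```python
# Toy finding-triggered hypothesis activation, in the spirit of the
# architecture described above. All medical content is invented.

KNOWLEDGE = {
    # hypothesis: (triggering findings, expected supporting findings)
    "glomerulonephritis": ({"hematuria"}, {"hematuria", "proteinuria", "edema"}),
    "kidney_stone":       ({"hematuria", "flank_pain"}, {"hematuria", "flank_pain"}),
    "uti":                ({"dysuria"}, {"dysuria", "fever", "hematuria"}),
}

def diagnose(findings, threshold=0.5):
    # Activate only hypotheses with at least one triggering finding,
    # keeping the set of concurrent hypotheses small.
    active = [h for h, (triggers, _) in KNOWLEDGE.items()
              if triggers & findings]
    # Local evaluation: fraction of the hypothesis's expected findings seen.
    scored = {h: len(KNOWLEDGE[h][1] & findings) / len(KNOWLEDGE[h][1])
              for h in active}
    return {h: s for h, s in scored.items() if s >= threshold}

print(diagnose({"hematuria", "proteinuria"}))
# approximately {'glomerulonephritis': 0.67, 'kidney_stone': 0.5}
```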

Relevance: 30.00%

Abstract:

We propose Trade & Cap (T&C), an economics-inspired mechanism that incentivizes users to voluntarily coordinate their consumption of the bandwidth of a shared resource (e.g., a DSLAM link) so as to converge on what they perceive to be an equitable allocation, while ensuring efficient resource utilization. Under T&C, rather than acting as an arbiter, an Internet Service Provider (ISP) acts as an enforcer of what the community of rational users sharing the resource decides is a fair allocation of that resource. The T&C mechanism proceeds in two phases. In the first, software agents acting on behalf of users engage in a strategic trading game in which each user agent selfishly chooses bandwidth slots to reserve in support of primary, interactive network usage activities. In the second phase, each user is allowed to acquire additional bandwidth slots in support of a presumed open-ended need for fluid bandwidth, catering to secondary applications. The acquisition of this fluid bandwidth is subject to the remaining "buying power" of each user and to prevailing "market prices", both of which are determined by the results of the trading phase and a desirable aggregate cap on link utilization. We present analytical results that establish the underpinnings of the T&C mechanism, including game-theoretic results pertaining to the trading phase and the pricing of fluid bandwidth allocation in the capping phase. Using real network traces, we present extensive experimental results that demonstrate the benefits of our scheme, which we also show to be practical by highlighting the salient features of an efficient implementation architecture.
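
One plausible reading of the capping phase is a proportional market-clearing rule: given each user's leftover buying power and an aggregate cap on fluid bandwidth, a single clearing price allocates slots in proportion to budgets. This is a hedged sketch consistent with the description above, not the paper's exact pricing result; all numbers are invented:

```python
# Proportional market clearing for the "fluid bandwidth" phase: one
# price clears the market, so allocations are proportional to each
# user's remaining buying power. A plausible sketch, not the paper's
# exact mechanism. Budgets and cap are invented.

def clear_fluid_bandwidth(budgets, cap):
    """budgets: user -> leftover buying power; cap: total fluid slots."""
    total_budget = sum(budgets.values())
    price = total_budget / cap              # market-clearing price per slot
    return price, {u: b / price for u, b in budgets.items()}

budgets = {"alice": 30.0, "bob": 10.0, "carol": 60.0}
price, alloc = clear_fluid_bandwidth(budgets, cap=50.0)
print(f"price per slot: {price}")            # 2.0
print(alloc)                                  # allocations sum to the cap
```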

Relevance: 30.00%

Abstract:

Strategic reviews of the Irish Food and Beverage Industry have consistently emphasised the need for food and beverage firms to improve their innovation and marketing capabilities in order to maintain competitiveness in both domestic and overseas markets. In particular, the functional food and beverages market has been singled out as an extremely important emerging market, from which Irish firms could benefit through an increased technological and market orientation. Although health and wellness have been the most significant drivers of new product development (NPD) in recent years, reported failure rates for new functional foods and beverages have been high. In that context, researchers in the US, UK, Denmark and Ireland have reported a marked divergence between NPD practices within food and beverage firms and normative advice for successful product development. The high reported failure rates for new functional foods and beverages suggest a failure to manage customer knowledge effectively, as well as a lack of knowledge management between the functional disciplines involved in the NPD process. This research explored the concept of managing customer knowledge at the early stages of the NPD process and applied it to the development of a range of functional beverages, using advanced concept optimisation research techniques that provide for a more market-oriented approach to new food product development. A sequential exploratory research design using mixed methods was chosen for this study. First, the qualitative element investigated customers' choice motives for orange juice and soft drinks, and explored their attitudes and perceptions towards a range of new functional beverage concepts, through a combination of 15 in-depth interviews and 3 focus groups. Second, the quantitative element consisted of 3 conjoint-based questionnaires, each administered to 400 different customers, modelling their purchase preferences for chilled nutrient-enriched and probiotic orange juices and stimulant soft drinks. The in-depth interviews identified the key product design attributes that influenced customers' choice motives for orange juice. The focus group discussions revealed that groups of customers were negative towards the addition of certain functional ingredients to natural foods and beverages. K-means cluster analysis was used to quantitatively identify segments of customers with similar preferences for chilled nutrient-enriched and probiotic orange juices and stimulant soft drinks. Overall, advanced concept optimisation research methods facilitate the integration of the customer at the early stages of the NPD process, which promotes a multi-disciplinary approach to new food product design. This research illustrated how such methods can contribute towards effective and efficient knowledge management in the new food product development process.
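
The segmentation step, k-means on respondents' conjoint-derived preferences, can be sketched directly. The part-worth matrix below is randomly generated for illustration; in the study, each row would hold a respondent's estimated utilities for the beverage design attributes:

```python
import numpy as np
from sklearn.cluster import KMeans

# K-means segmentation of conjoint part-worths. The data here are random
# placeholders: one row per respondent, one column per attribute level.
rng = np.random.default_rng(42)
partworths = rng.normal(size=(400, 6))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(partworths)
print("segment sizes:", np.bincount(kmeans.labels_))
print("segment centroids (mean part-worths):")
print(kmeans.cluster_centers_.round(2))
```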

Relevance: 30.00%

Abstract:

Consumer demand is revolutionizing the way products are produced, distributed and marketed. In the dairy sector in developing countries, aspects of milk quality are receiving more attention from both society and government. However, milk quality management needs to be better addressed in dairy production systems to guarantee stakeholders, mainly smallholders, access to dairy markets. The present study analyses the interaction of the upstream part of the dairy supply chain (farmers and dairies) in the Mantaro Valley (Peruvian central Andes), in order to understand the constraints both stakeholders face in implementing milk quality controls and practices, and to evaluate ex-ante how different strategies suggested to improve milk quality could affect farmers' and processors' profits. The analysis is based on three complementary field studies conducted between 2012 and 2013. Our work has shown that the presence of a dual supply chain combining formal and informal markets has a direct impact on dairy production at the technical and organizational levels, affecting small formal dairy processors' ability to implement contracts, including agreements on milk quality standards. The analysis of milk quality management from farms to dairy plants highlighted the poor hygiene in the study area, even though average values of milk composition were usually high. Some husbandry practices evaluated at farm level proved cost-effective and had a large impact on hygienic quality; however, regular application of these practices was limited, since small-scale farmers do not receive a bonus for producing hygienic milk. On the basis of these two results, we co-designed with formal small-scale dairy processors a simulation tool to explore prospective scenarios in which they could select their best product portfolio and also design milk payment systems that reward farmers for high milk-quality performance. This approach allowed dairy processors to realize the importance of including milk quality management in their collection and manufacturing processes, especially in a context of high competition for milk supply. We conclude that the improvement of milk quality in a smallholder farming context requires a more coordinated effort among stakeholders: successful implementation of strategies will depend on the willingness of small-scale dairy processors to reward farmers producing high-quality milk, but also on State support providing incentives to stakeholders in the formal sector.
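
The kind of milk payment system the simulation tool lets processors design can be illustrated with a simple rule: a base price per litre adjusted by composition bonuses and a hygiene bonus or penalty. All thresholds and rates below are invented for illustration:

```python
# A simple quality-based milk payment rule of the kind the simulation
# tool lets processors design. All thresholds and rates are invented.

def milk_price(fat_pct, protein_pct, bacterial_count, base=0.30):
    """Price per litre: base adjusted for composition and hygiene."""
    price = base
    price += 0.02 * max(fat_pct - 3.5, 0.0)       # fat bonus above 3.5%
    price += 0.03 * max(protein_pct - 3.0, 0.0)   # protein bonus above 3.0%
    if bacterial_count <= 100_000:                 # cfu/ml: hygiene bonus
        price += 0.03
    elif bacterial_count > 500_000:                # hygiene penalty
        price -= 0.05
    return round(price, 3)

print(milk_price(4.0, 3.2, 80_000))    # clean, rich milk earns a premium
print(milk_price(4.0, 3.2, 800_000))   # same composition, poor hygiene
```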

Relevance: 30.00%

Abstract:

Invasion ecology urgently requires predictive methodologies that can forecast the ecological impacts of existing, emerging and potential invasive species. We argue that many ecologically damaging invaders are characterised by their more efficient use of resources. Consequently, comparison of the classical 'functional response' (the relationship between resource use and availability) between invasive and trophically analogous native species may allow prediction of invader ecological impact. We review the utility of species trait comparisons and the history and context of the use of functional responses in invasion ecology, then present our framework for the use of comparative functional responses. We show that functional response analyses, by describing the resource use of species over a range of resource availabilities, avoid many pitfalls of 'snapshot' assessments of resource use. Our framework demonstrates how comparisons of invader and native functional responses, within and between Type II and III functional responses, allow testing of the likely population-level outcomes of invasions for affected species. Furthermore, we describe how recent studies support the predictive capacity of this method; for example, the invasive 'bloody red shrimp' Hemimysis anomala shows higher Type II functional responses than native mysids, and this corroborates, and could have predicted, actual invader impacts in the field. The comparative functional response method can also be used to examine differences in the impact of two or more invaders, two or more populations of the same invader, and the abiotic (e.g. temperature) and biotic (e.g. parasitism) context-dependencies of invader impacts. Our framework may also address the previous lack of rigour in testing major hypotheses in invasion ecology, such as the 'enemy release' and 'biotic resistance' hypotheses, as our approach explicitly considers demographic consequences for impacted resources, such as native and invasive prey species. We also identify potential challenges in the application of comparative functional responses in invasion ecology. These include the incorporation of numerical responses, multiple predator effects and trait-mediated indirect interactions, replacement versus non-replacement study designs, and the inclusion of functional responses in risk assessment frameworks. In future, the generation of sufficient case studies for a meta-analysis could test the overall hypothesis that comparative functional responses can indeed predict invasive species impacts.
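
The Type II and Type III curves at the centre of the framework are the Holling functional responses. A sketch comparing a hypothetical invader (higher attack rate, shorter handling time) with a native analogue, in the spirit of the Hemimysis comparison; parameter values are invented:

```python
import numpy as np

# Holling functional responses: intake rate f(N) at resource density N.
# Type II:  f(N) = a*N / (1 + a*h*N)
# Type III: f(N) = a*N**2 / (1 + a*h*N**2)
# a: attack rate, h: handling time. Parameter values are invented.

def type_ii(N, a, h):
    return a * N / (1 + a * h * N)

def type_iii(N, a, h):
    return a * N**2 / (1 + a * h * N**2)

N = np.linspace(0, 50, 6)
native  = type_ii(N, a=0.4, h=0.10)   # native consumer
invader = type_ii(N, a=1.2, h=0.08)   # invader: higher a, shorter h

for n, f_nat, f_inv in zip(N, native, invader):
    print(f"N={n:5.1f}  native={f_nat:5.2f}  invader={f_inv:5.2f}")
# The invader's uniformly higher curve is the signature the framework
# uses to flag likely high ecological impact.
```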

Relevance: 30.00%

Abstract:

A sample of 445 consumers resident in distinct Lisbon areas was analyzed through direct observation in order to determine the current proportion of each lifestyle, applying the Whitaker Lifestyle™ Method. Hypothesis tests on the population proportions reveal that the Neo-Traditional and Modern Whitaker lifestyles have the significantly highest proportions, while the overall presence of the different lifestyles varies across neighborhoods. The research further demonstrates the validity of the Whitaker observation techniques, differences in media consumption among lifestyles, and the importance of style and aesthetics when segmenting consumers by lifestyle. Finally, market opportunities are identified for firms operating in Lisbon.
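
The hypothesis tests on lifestyle proportions are one-sample tests of a population proportion. A sketch using statsmodels; the counts and the hypothesized proportion are placeholders, not the study's data:

```python
from statsmodels.stats.proportion import proportions_ztest

# One-sample z-test of a population proportion, as used to compare
# lifestyle shares. Counts are placeholders, not the study's data.
count, nobs = 130, 445          # e.g. observed Neo-Traditional consumers
p0 = 0.20                        # hypothesized population proportion

stat, pvalue = proportions_ztest(count, nobs, value=p0)
print(f"z = {stat:.2f}, p-value = {pvalue:.4f}")
```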

Relevance: 30.00%

Abstract:

This study aims to replicate Apple's stock market movement by modeling major investment profiles and investors. The model recreates a live exchange to forecast any predictability in stock price variation, given knowledge of how investors act when making investment decisions. This methodology is particularly relevant if, just by observing historical prices and knowing the tendencies in other players' behavior, risk-adjusted profits can be made. Empirical research in academia shows that abnormal returns are hardly consistent without a clear idea of who is in the market at a given moment and of the corresponding market shares. Therefore, even when investors' individual investment profiles are known, it is not clear how they affect aggregate markets.
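
Such a profile-based replication is, in effect, an agent-based simulation: heterogeneous investor types map the price history to demands, and the price moves with the aggregate imbalance. A toy sketch of that mechanism; the rules, market shares and step size are invented, not the thesis's calibration:

```python
import numpy as np

# Toy agent-based price formation: each investor type maps price history
# to demand, and price moves with aggregate imbalance. Rules and market
# shares are invented, not the thesis's calibration.
rng = np.random.default_rng(7)

def momentum(prices):       # chases the recent trend
    return np.sign(prices[-1] - prices[-5])

def fundamentalist(prices, fair_value=100.0):   # trades toward fair value
    return np.sign(fair_value - prices[-1])

def noise(prices):          # uninformed flow
    return rng.choice([-1.0, 1.0])

agents = [(momentum, 0.3), (fundamentalist, 0.5), (noise, 0.2)]
prices = [100.0] * 5
for _ in range(250):
    imbalance = sum(w * rule(prices) for rule, w in agents)
    prices.append(prices[-1] * (1 + 0.002 * imbalance))

print(f"final price: {prices[-1]:.2f}")
```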

Relevance: 30.00%

Abstract:

This paper studies changes in the composition of European stock market indexes from 1995 to 2015. It finds mixed price effects producing abnormal returns around the effective replacement of added and deleted stocks. The price pressure hypothesis seems to hold for added stocks in some indexes, but not for deleted stocks, as there is no clear inversion of behaviour after the replacement. Finally, the construction and back-testing of a trading strategy aiming to capture some of those abnormal returns shows that it yields a Sharpe ratio of 1.4 and an annualised alpha of 11%.
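
The quoted performance figures come from standard computations on the strategy's daily returns: an annualised Sharpe ratio and a CAPM-style alpha from a regression on the market. A sketch on placeholder return series:

```python
import numpy as np

# Annualised Sharpe ratio and CAPM-style alpha from daily returns.
# The return series below are placeholders, not the strategy's data.
rng = np.random.default_rng(3)
strategy = rng.normal(0.0006, 0.008, 2520)   # ~10 years of daily returns
market = rng.normal(0.0003, 0.010, 2520)

sharpe = np.sqrt(252) * strategy.mean() / strategy.std()

beta, intercept = np.polyfit(market, strategy, 1)   # strategy on market
annual_alpha = (1 + intercept) ** 252 - 1            # compound daily alpha

print(f"annualised Sharpe ratio: {sharpe:.2f}")
print(f"beta: {beta:.2f}, annualised alpha: {annual_alpha:.2%}")
```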