930 results for "the efficient market hypothesis"
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The literature on bond markets and interest rates has focused largely on the term structure of interest rates, specifically, on the so-called expectations hypothesis. At the same time, little is known about the nature of the spread of the interest rates in the money market beyond the fact that such spreads are generally unstable. However, with the evolution of complex financial instruments, it has become imperative to identify the time series process that can help one accurately forecast such spreads into the future. This article explores the nature of the time series process underlying the spread between three-month and one-year US rates, and concludes that the movements in this spread over time are best captured by a GARCH(1,1) process. It also suggests the use of a relatively long-term measure of interest rate volatility as an explanatory variable. This exercise has gained added importance in view of the revelation that GARCH-based estimates of option prices consistently outperform the corresponding estimates based on the stylized Black-Scholes algorithm.
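The GARCH(1,1) recursion named in the abstract can be illustrated with a minimal simulation (a generic sketch, not the article's estimation on US rate spreads; the parameter values below are purely illustrative):

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85, seed=42):
    """Simulate a GARCH(1,1) series:
    sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1],
    with eps[t] ~ N(0, sigma2[t])."""
    rng = np.random.default_rng(seed)
    eps = np.empty(n)
    sigma2 = np.empty(n)
    # Start at the unconditional variance omega / (1 - alpha - beta).
    sigma2[0] = omega / (1.0 - alpha - beta)
    eps[0] = rng.standard_normal() * np.sqrt(sigma2[0])
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = rng.standard_normal() * np.sqrt(sigma2[t])
    return eps, sigma2

eps, sigma2 = simulate_garch11(5000)
```

With alpha + beta < 1 the process is covariance-stationary; because a large shock raises next period's variance, the series exhibits the volatility clustering that makes GARCH useful for forecasting unstable spreads.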
Abstract:
This research examines evolving issues in applied computer science and applies economic and business analyses as well. There are two main areas. The first is internetwork communications as embodied by the Internet. The goal of the research is to devise an efficient pricing, prioritization, and incentivization plan that could be realistically implemented on the existing infrastructure. Criteria include practical and economic efficiency, and proper incentives for both users and providers. Background information on the evolution and functional operation of the Internet is given, and relevant literature is surveyed and analyzed. Economic analysis is performed on the incentive implications of the current pricing structure and organization. The problems are identified, and minimally disruptive solutions are proposed for all levels of implementation, down to the lowest-level protocol. Practical issues are considered and performance analyses are done. The second area of research is mass market software engineering, and how this differs from classical software engineering. Software life-cycle revenues are analyzed and software pricing and timing implications are derived. A profit maximizing methodology is developed to select or defer the development of software features for inclusion in a given release. An iterative model of the stages of the software development process is developed, taking into account new communications capabilities as well as profitability.
Abstract:
This paper provides an agent-based software exploration of the well-known free-market efficiency/equality trade-off. Our study simulates the interaction of agents producing, trading and consuming goods in the presence of different market structures, and looks at how efficient the producers/consumers mapping turns out to be, as well as the resulting distribution of welfare among agents at the end of an arbitrarily large number of iterations. Two market mechanisms are compared: the competitive market (a double auction market in which agents outbid each other in order to buy and sell products) and the random one (in which products are allocated randomly). Our results confirm that the superior efficiency of the competitive market (an effective and never stopping producers/consumers mapping and a superior aggregative welfare) comes at a very high price in terms of inequality (especially when severe budget constraints are in play).
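The efficiency comparison the abstract describes can be sketched with a toy model (a hypothetical illustration, not the authors' agent-based simulation): match buyers' valuations to sellers' costs either assortatively, as a stylized double auction would, or at random, and compare the realized surplus.

```python
import random

def surplus(pairs):
    """Total gains from trade: sum of (buyer valuation - seller cost)
    over pairs where trade is mutually beneficial."""
    return sum(b - a for b, a in pairs if b >= a)

def competitive_match(bids, asks):
    # Stylized double auction outcome: highest bids trade with lowest asks.
    return list(zip(sorted(bids, reverse=True), sorted(asks)))

def random_match(bids, asks, rng):
    # Random allocation: buyers and sellers paired arbitrarily.
    b, a = bids[:], asks[:]
    rng.shuffle(b)
    rng.shuffle(a)
    return list(zip(b, a))

bids = [10, 8, 6, 4]   # buyers' valuations (illustrative)
asks = [3, 5, 7, 9]    # sellers' costs (illustrative)
rng = random.Random(0)
comp = surplus(competitive_match(bids, asks))  # trades (10,3) and (8,5): surplus 10
rand = surplus(random_match(bids, asks, rng))  # never exceeds the competitive surplus
```

The competitive matching maximizes aggregate surplus for these valuations, but the surplus accrues to the highest-valuation traders, which is the equality side of the trade-off the paper measures.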
Abstract:
Master's in Finance
Abstract:
Due to their unpredictable behavior, stock markets are examples of complex systems. Yet, the dominant analysis of these markets assumes simple stochastic variations, possibly tainted by short-lived memory. This paper proposes an alternative strategy, based on a stochastic geometry defining a robust index of the structural dynamics of the markets and based on notions of topology defining a new coefficient that identifies the structural changes occurring on the S&P500 set of stocks. The results demonstrate the consistency of the random hypothesis as applied to normal periods, but they also show its inadequacy as to the analysis of periods of turbulence, for which the emergence of collective behavior of sectoral clusters of firms is measured. This behavior is identified as a meta-routine.
Abstract:
The internet and digital technologies revolutionized the economy. Regulating the digital market has become a priority for the European Union. While promoting innovation and development, EU institutions must assure that the digital market maintains a competitive structure. Among the numerous elements characterizing the digital sector, users' data are particularly important. Digital services are centered around personal data, the accumulation of which contributed to the centralization of market power in the hands of a few large providers. As a result, data-driven mergers and data-related abuses gained a central role for the purposes of EU antitrust enforcement. In light of these considerations, this work aims at assessing whether EU competition law is well-suited to address data-driven mergers and data-related abuses of dominance. These conducts are of crucial importance to the maintenance of competition in the digital sector, insofar as the accumulation of users' data constitutes a fundamental competitive advantage. To begin with, part 1 addresses the specific features of the digital market and their impact on the definition of the relevant market and the assessment of dominance by antitrust authorities. Secondly, part 2 analyzes the EU's case law on data-driven mergers to verify if merger control is well-suited to address these concentrations. Thirdly, part 3 discusses abuses of dominance in the phase of data collection and the legal frameworks applicable to these conducts. Fourthly, part 4 focuses on access to "essential" datasets and the indirect effects of anticompetitive conducts on rivals' ability to access users' information. Finally, part 5 discusses differential pricing practices implemented online and based on personal data. As will be assessed, the combination of efficient competition law enforcement and the hoped-for adoption of a specific regulation seems to be the best solution to address the challenges raised by "data-related dominance".
Abstract:
Ecological systems are vulnerable to irreversible change when key system properties are pushed over thresholds, resulting in the loss of resilience and the precipitation of a regime shift. Perhaps the most important of such properties in human-modified landscapes is the total amount of remnant native vegetation. In a seminal study Andren proposed the existence of a fragmentation threshold in the total amount of remnant vegetation, below which landscape-scale connectivity is eroded and local species richness and abundance become dependent on patch size. Despite the fact that species patch-area effects have been a mainstay of conservation science there has yet to be a robust empirical evaluation of this hypothesis. Here we present and test a new conceptual model describing the mechanisms and consequences of biodiversity change in fragmented landscapes, identifying the fragmentation threshold as a first step in a positive feedback mechanism that has the capacity to impair ecological resilience, and drive a regime shift in biodiversity. The model considers that local extinction risk is defined by patch size, and immigration rates by landscape vegetation cover, and that the recovery from local species losses depends upon the landscape species pool. Using a unique dataset on the distribution of non-volant small mammals across replicate landscapes in the Atlantic forest of Brazil, we found strong evidence for our model predictions - that patch-area effects are evident only at intermediate levels of total forest cover, where landscape diversity is still high and opportunities for enhancing biodiversity through local management are greatest. Furthermore, high levels of forest loss can push native biota through an extinction filter, and result in the abrupt, landscape-wide loss of forest-specialist taxa, ecological resilience and management effectiveness. 
The proposed model links hitherto distinct theoretical approaches within a single framework, providing a powerful tool for analysing the potential effectiveness of management interventions.
Abstract:
Mechanical injuries and diseases in stone fruit are important causes for market rejection. The objectives of this research were to quantify and characterize the mechanical injuries and diseases in peaches, nectarines and plums at São Paulo's wholesale market, the largest in Brazil. Incidence of injuries was assessed weekly in 1% of the marketed fruit (2973 fruit/week), from September to December in 2003 and 2004. Mechanical injuries were the most frequent injuries in both years, ranging from 8.73% (plum) to 44.5% (nectarine) of injured fruit. There was a significant positive correlation between the incidence of postharvest mechanical injuries and postharvest diseases. Incidence of postharvest diseases varied from 2.5% to 6.6%. Cladosporium rot (Cladosporium sp.) and brown rot (Monilinia fructicola) were the most frequent diseases, and were mostly detected in the apexes of nectarines and peaches. Aurora (peach), Sunraycer (nectarine) and Gulfblaze (plum) varieties were the most susceptible to injuries and diseases. (c) 2007 Elsevier B.V. All rights reserved.
Abstract:
The synthesis of new chiral amino alcohols by Heck arylation of an enecarbamate is described. These compounds were used as chiral ligands for the catalytic asymmetric arylation of aldehydes and can be easily recovered. Chiral, nonracemic diarylmethanols were obtained in high yields and enantioselectivities.
Abstract:
The glass ceiling hypothesis states that not only is it more difficult for women than for men to be promoted up levels of authority hierarchies within workplaces, but also that the obstacles women face relative to men become greater as they move up the hierarchy. Gender-based discrimination in promotions is not simply present across levels of hierarchy but is more intense at higher levels. Empirically, this implies that the relative rates of women being promoted to higher levels compared to men should decline with the level of the hierarchy. This article explores this hypothesis with data from three countries: the United States, Australia, and Sweden. The basic conclusion is that while there is strong evidence for a general gender gap in authority (the odds of women having authority are less than those of men), there is no evidence for systematic glass ceiling effects in the United States and only weak evidence for such effects in the other two countries.
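The hypothesis's empirical implication can be made concrete with a small odds-ratio computation (the promotion probabilities below are hypothetical, not the article's data): a glass ceiling effect would show up as women-to-men promotion odds ratios that shrink at higher hierarchy levels, rather than a constant gap.

```python
def odds(p):
    """Odds corresponding to probability p."""
    return p / (1.0 - p)

def odds_ratio(p_women, p_men):
    """Relative odds of promotion, women vs. men."""
    return odds(p_women) / odds(p_men)

# Hypothetical promotion probabilities (women, men) at each level.
levels = [
    ("to supervisor", 0.20, 0.25),
    ("to manager",    0.10, 0.15),
    ("to executive",  0.04, 0.08),
]
ratios = [odds_ratio(pw, pm) for _, pw, pm in levels]
# ratios declines (roughly 0.75, 0.63, 0.48): the glass ceiling pattern.
```

A general gender gap without a glass ceiling would instead yield ratios below 1 that stay roughly constant across levels, which is closer to what the article reports for the United States.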
Abstract:
For many species of marine invertebrates, variability in larval settlement behaviour appears to be the rule rather than the exception. This variability has the potential to affect larval dispersal, because settlement behaviour will influence the length of time larvae are in the plankton. Despite the ubiquity and importance of this variability, relatively few sources of variation in larval settlement behaviour have been identified. One important factor that can affect larval settlement behaviour is the nutritional state of larvae. Non-feeding larvae often become less discriminating in their 'choice' of settlement substrate, i.e. more desperate to settle, when energetic reserves run low. We tested whether variation in larval size (and presumably in nutritional reserves) also affects the settlement behaviour of 3 species of colonial marine invertebrate larvae, the bryozoans Bugula neritina and Watersipora subtorquata and the ascidian Diplosoma listerianum. For all 3 species, larger larvae delayed settlement for longer in the absence of settlement cues, and settlement of Bugula neritina larvae was accelerated by the presence of settlement cues, independently of larval size. In the field, larger W. subtorquata larvae also took longer to settle than smaller larvae and were more discriminating towards settlement surfaces. These differences in settlement time are likely to result in differences in the distance that larvae disperse in the field. We suggest that species that produce non-feeding larvae can affect the dispersal potential of their offspring by manipulating larval size and thus larval desperation.
Abstract:
The efficient expression and purification of an interfacially active peptide (mLac21) was achieved by using bioprocess-centered molecular design (BMD), wherein key bioprocess considerations are addressed during the initial molecular biology work. The 21 amino acid mLac21 peptide sequence is derived from the lac repressor protein and is shown to have high affinity for the oil-water interface, causing a substantial reduction in interfacial tension following adsorption. The DNA coding for the peptide sequence was cloned into a modified pET-31(b) vector to permit the expression of mLac21 as a fusion to ketosteroid isomerase (KSI). Rational iterative molecular design, taking into account the need for a scalable bioprocess flowsheet, led to a simple and efficient bioprocess yielding mLac21 at 86% purity following ion exchange chromatography (and >98% following chromatographic polishing). This case study demonstrates that it is possible to produce acceptably pure peptide for potential commodity applications using common scalable bioprocess unit operations. Moreover, it is shown that BMD is a powerful strategy that can be deployed to reduce bioseparation complexity. (C) 2004 Wiley Periodicals, Inc.