9 results for Market penetration
at Duke University
Abstract:
Recent empirical findings suggest that the long-run dependence in U.S. stock market volatility is best described by a slowly mean-reverting fractionally integrated process. The present study complements this existing time-series-based evidence by comparing the risk-neutralized option pricing distributions from various ARCH-type formulations. Utilizing a panel data set consisting of newly created exchange-traded long-term equity anticipation securities, or LEAPS, on the Standard and Poor's 500 stock market index with times to maturity ranging up to three years, we find that the degree of mean reversion in the volatility process implicit in these prices is best described by a Fractionally Integrated EGARCH (FIEGARCH) model. © 1999 Elsevier Science S.A. All rights reserved.
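For readers less familiar with the model class, the FIEGARCH log-volatility dynamics are standardly written as follows (our notation, a sketch of the canonical specification rather than the paper's exact parameterization):

$$\ln \sigma_t^2 = \omega + \phi(L)^{-1} (1 - L)^{-d} \big[ 1 + \psi(L) \big]\, g(z_{t-1}), \qquad g(z) = \theta z + \gamma \big( |z| - E|z| \big),$$

where $L$ denotes the lag operator and $z_t$ the standardized innovation. The fractional differencing parameter $0 < d < 1$ implies slow, hyperbolic decay of shocks to log volatility, nesting the standard EGARCH model ($d = 0$, exponential decay) and the integrated case ($d = 1$, infinite persistence).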
Abstract:
Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
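The score-matching idea sketched above can be written compactly (in our notation, as a generic statement of the criterion rather than the paper's exact formulation): fit an auxiliary conditional density $f(y_t \mid x_t; \eta)$ to the observed data by quasi-maximum likelihood to obtain $\hat{\eta}$, simulate a long path $\{\hat{y}_\tau(\theta)\}_{\tau=1}^{N}$ from the structural model, and choose the structural parameters to make the auxiliary score vanish on the simulated data under a GMM metric:

$$m_N(\theta) = \frac{1}{N} \sum_{\tau=1}^{N} \frac{\partial \log f\big(\hat{y}_\tau(\theta) \mid \hat{x}_\tau(\theta); \hat{\eta}\big)}{\partial \eta}, \qquad \hat{\theta} = \arg\min_{\theta}\; m_N(\theta)'\, \hat{W}\, m_N(\theta),$$

where $\hat{W}$ is a consistent estimate of the inverse asymptotic covariance of the score. If the structural model is correct, the simulated data reproduce the conditional density of the observations and the criterion is approximately zero at the true $\theta$.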
Abstract:
Consistent with the implications of a simple asymmetric information model for the bid-ask spread, we present empirical evidence that the size of the bid-ask spread in the foreign exchange market is positively related to the underlying exchange rate uncertainty. The estimation results are based on an ordered probit analysis that captures the discreteness in the spread distribution, with the uncertainty of the spot exchange rate quantified through a GARCH-type model. The data set consists of more than 300,000 continuously recorded Deutschemark/dollar quotes over the period from April 1989 to June 1989. © 1994.
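A minimal sketch of the two ingredients described above, with invented parameter values and simulated data (the paper's actual specification and estimation are more elaborate): a GARCH(1,1) filter for exchange-rate uncertainty, and an ordered-probit log-likelihood linking that uncertainty to the discrete spread categories.

```python
# Sketch: GARCH(1,1) volatility filter + ordered probit over discrete spreads.
import numpy as np
from scipy.stats import norm

def garch_variance(returns, omega=1e-6, alpha=0.05, beta=0.90):
    """Conditional variance h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}."""
    h = np.empty_like(returns)
    h[0] = returns.var()
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return h

def ordered_probit_loglik(params, spread_cat, volatility):
    """Log-likelihood of spread categories 0..K-1 given a volatility regressor."""
    slope, cuts = params[0], np.sort(params[1:])     # increasing cutpoints
    xb = slope * volatility
    edges = np.concatenate(([-np.inf], cuts, [np.inf]))
    probs = norm.cdf(edges[spread_cat + 1] - xb) - norm.cdf(edges[spread_cat] - xb)
    return np.sum(np.log(np.clip(probs, 1e-12, None)))

# Toy usage with simulated returns and pseudo spread classes
rng = np.random.default_rng(0)
ret = rng.normal(0.0, 0.001, 1000)
vol = np.sqrt(garch_variance(ret))
cats = np.digitize(vol, np.quantile(vol, [0.33, 0.66]))  # 3 invented categories
print(ordered_probit_loglik(np.array([50.0, -1.0, 1.0]), cats, vol))
```

In an actual estimation both the GARCH and the probit parameters would be estimated rather than fixed; the values above are placeholders.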
Abstract:
Surgery is one of the most effective and widely used procedures in treating human cancers, but a major problem is that the surgeon often fails to remove the entire tumor, leaving behind tumor-positive margins, metastatic lymph nodes, and/or satellite tumor nodules. Here we report the use of a hand-held spectroscopic pen device (termed SpectroPen) and near-infrared contrast agents for intraoperative detection of malignant tumors, based on wavelength-resolved measurements of fluorescence and surface-enhanced Raman scattering (SERS) signals. The SpectroPen utilizes a near-infrared diode laser (emitting at 785 nm) coupled to a compact head unit for light excitation and collection. This pen-shaped device effectively removes silica Raman peaks from the fiber optics and attenuates the reflected excitation light, allowing sensitive analysis of both fluorescence and Raman signals. Its overall performance has been evaluated by using a fluorescent contrast agent (indocyanine green, or ICG) as well as a SERS contrast agent (pegylated colloidal gold). Under in vitro conditions, the detection limits are approximately 2-5 × 10⁻¹¹ M for the indocyanine dye and 0.5-1 × 10⁻¹³ M for the SERS contrast agent. Ex vivo tissue penetration data show attenuated but resolvable fluorescence and Raman signals when the contrast agents are buried 5-10 mm deep in fresh animal tissues. In vivo studies using mice bearing bioluminescent 4T1 breast tumors further demonstrate that the tumor borders can be precisely detected preoperatively and intraoperatively, and that the contrast signals are strongly correlated with tumor bioluminescence. After surgery, the SpectroPen device permits further evaluation of both positive and negative tumor margins around the surgical cavity, raising new possibilities for real-time tumor detection and image-guided surgery.
Abstract:
Policy makers and analysts are often faced with situations where it is unclear whether market-based instruments hold real promise of reducing costs, relative to conventional uniform standards. We develop analytic expressions that can be employed with modest amounts of information to estimate the potential cost savings associated with market-based policies, with an application to the environmental policy realm. These simple formulae can identify instruments that merit more detailed investigation. We illustrate the use of these results with an application to nitrogen oxides control by electric utilities in the United States.
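To illustrate the flavor of such expressions, consider a textbook special case (a standard derivation under strong simplifying assumptions, not the paper's actual formulae): $N$ sources face linear marginal abatement costs $MC_i(q) = a_i + b q$ with a common slope $b$, a uniform standard requires abatement $\bar{q}$ from each source, and the market-based instrument achieves the same aggregate abatement $N\bar{q}$ while equalizing marginal costs across sources. The aggregate cost saving from trading is then

$$\Delta C = \frac{N}{2b} \operatorname{Var}(a_i),$$

so the gains from a market-based policy grow with the heterogeneity of marginal abatement costs across sources and shrink as the cost curves steepen.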
Abstract:
To maintain a strict balance between demand and supply in the US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines the startup and shutdown times, the amount of power produced, and the provision of spinning and non-spinning generation reserves. This deterministic optimization model takes as input the characteristics of all generating units, such as installed capacity, ramp rates, minimum up and down time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. The reserve requirement is determined based on the likelihood of outages on the supply side and on the levels of forecast error in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than taking a fixed reserve target as an input, stochastic market clearing models consider different scenarios of wind power and determine reserve schedules as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic market clearing models. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, owing to the multi-dimensional performance metrics considered here and the difficulty of setting the models' parameters in a way that does not advantage or disadvantage either modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL), and wind spillage costs have on the comparison of stochastic vs. deterministic market clearing models.
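As a toy illustration of the deterministic clearing formulation described above, the sketch below uses the open-source PuLP library with invented two-unit data (real ISO models add ramp rates, minimum up/down times, and multi-period coupling): it minimizes production cost subject to a power balance constraint and a fixed reserve requirement, the very input the stochastic formulation replaces with scenarios.

```python
# Minimal deterministic market-clearing sketch (invented unit data; assumes PuLP).
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

units = {"coal": dict(cap=400, mc=20), "gas": dict(cap=300, mc=35)}
demand, reserve_req = 500, 50  # MW; the fixed reserve target of the deterministic model

prob = LpProblem("market_clearing", LpMinimize)
p = {u: LpVariable(f"p_{u}", lowBound=0) for u in units}       # power output (MW)
r = {u: LpVariable(f"r_{u}", lowBound=0) for u in units}       # spinning reserve (MW)
on = {u: LpVariable(f"on_{u}", cat=LpBinary) for u in units}   # commitment decision

prob += lpSum(units[u]["mc"] * p[u] for u in units)            # objective: production cost
prob += lpSum(p[u] for u in units) == demand                   # power balance
prob += lpSum(r[u] for u in units) >= reserve_req              # system reserve requirement
for u in units:
    prob += p[u] + r[u] <= units[u]["cap"] * on[u]             # capacity only if committed

prob.solve()
for u in units:
    print(u, "power:", p[u].value(), "reserve:", r[u].value())
```

A stochastic variant would replace the fixed `reserve_req` with wind-scenario-dependent balance constraints and let the reserve schedule emerge from the optimization.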
Abstract:
The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed.
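As a small, concrete illustration of the RDF modeling theme, the sketch below uses Python's rdflib with an invented example namespace (the hackathon models target specific life-science domains and vocabularies): it builds a tiny graph and queries it with SPARQL.

```python
# Toy RDF graph and SPARQL query (invented ex: namespace; assumes rdflib).
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/bio/")
g = Graph()
g.add((EX.gene123, RDF.type, EX.Gene))           # typed resource
g.add((EX.gene123, EX.symbol, Literal("TP53")))  # literal property

results = g.query("""
    PREFIX ex: <http://example.org/bio/>
    SELECT ?gene ?sym WHERE { ?gene a ex:Gene ; ex:symbol ?sym . }
""")
for row in results:
    print(row.gene, row.sym)
```

Interoperability across such graphs (shared ontologies, common metadata, and federated SPARQL endpoints) is precisely what the sub-groups listed above worked toward.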
Abstract:
Market failures associated with environmental pollution interact with market failures associated with the innovation and diffusion of new technologies. These combined market failures provide a strong rationale for a portfolio of public policies that foster emissions reduction as well as the development and adoption of environmentally beneficial technology. Both theory and empirical evidence suggest that the rate and direction of technological advance are influenced by market and regulatory incentives, and can be cost-effectively harnessed through the use of economic-incentive-based policy. In the presence of weak or nonexistent environmental policies, investments in the development and diffusion of new environmentally beneficial technologies are very likely to be less than would be socially desirable. Positive knowledge and adoption spillovers and information problems can further weaken innovation incentives. While environmental technology policy is fraught with difficulties, a long-term view suggests a strategy of experimenting with policy approaches and systematically evaluating their success. © 2005 Elsevier B.V. All rights reserved.