10 results for Market segmentation
at Duke University
Abstract:
Recent empirical findings suggest that the long-run dependence in U.S. stock market volatility is best described by a slowly mean-reverting fractionally integrated process. The present study complements this existing time-series-based evidence by comparing the risk-neutralized option pricing distributions from various ARCH-type formulations. Utilizing a panel data set consisting of newly created exchange traded long-term equity anticipation securities, or LEAPS, on the Standard and Poor's 500 stock market index with times to maturity ranging up to three years, we find that the degree of mean reversion in the volatility process implicit in these prices is best described by a Fractionally Integrated EGARCH (FIEGARCH) model. © 1999 Elsevier Science S.A. All rights reserved.
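For reference, the long-memory volatility dynamics at issue here are usually written in the standard FIEGARCH(1,d,1) form below; this is an illustrative sketch, and the paper's exact parameterization may differ:

```latex
% Standard FIEGARCH(1,d,1) log-variance dynamics (illustrative form; the
% paper's exact parameterization may differ), with z_t = \epsilon_t/\sigma_t:
\[
\ln\sigma_{t}^{2}
  = \omega + \phi(L)^{-1}(1-L)^{-d}\bigl[1+\psi(L)\bigr]\,g(z_{t-1}),
\qquad
g(z_{t}) = \theta z_{t} + \gamma\bigl(\lvert z_{t}\rvert - \mathrm{E}\lvert z_{t}\rvert\bigr).
\]
```

For 0 < d < 1, shocks to the log variance decay at a slow hyperbolic rate, nesting ordinary EGARCH at d = 0 and integrated EGARCH at d = 1; this is what "degree of mean reversion" refers to above.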
Abstract:
Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
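The score-matching estimator described here can be sketched schematically as follows (notation is illustrative, not the paper's): fit an auxiliary conditional density f to the observed data by quasi-maximum likelihood to obtain η̂, then choose the structural parameters θ so that a long simulation from the model approximately zeroes the auxiliary score under a GMM criterion.

```latex
% Schematic score-matching criterion (notation illustrative): \hat{\eta} is the
% quasi-MLE of an auxiliary conditional density f fitted to the observed data;
% \hat{y}_\tau(\theta), \hat{x}_{\tau-1}(\theta) are a long simulation of
% length N from the structural model at parameter \theta.
\[
m(\theta) = \frac{1}{N}\sum_{\tau=1}^{N}
  \frac{\partial}{\partial\eta}
  \log f\bigl(\hat{y}_{\tau}(\theta)\,\big|\,\hat{x}_{\tau-1}(\theta);\,\hat{\eta}\bigr),
\qquad
\hat{\theta} = \arg\min_{\theta}\; m(\theta)'\,\hat{W}\,m(\theta),
\]
```

where Ŵ is a consistent estimate of the inverse covariance of the auxiliary score; matching the score of a flexible nonparametric density estimate is what makes the moment conditions statistically informative.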
Abstract:
Consistent with the implications from a simple asymmetric information model for the bid-ask spread, we present empirical evidence that the size of the bid-ask spread in the foreign exchange market is positively related to the underlying exchange rate uncertainty. The estimation results are based on an ordered probit analysis that captures the discreteness in the spread distribution, with the uncertainty of the spot exchange rate being quantified through a GARCH type model. The data set consists of more than 300,000 continuously recorded Deutschemark/dollar quotes over the period from April 1989 to June 1989. © 1994.
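A minimal two-step sketch of this idea in Python, using the `arch` and `statsmodels` packages; the synthetic data, the 0/1/2 spread coding, and the two-step (rather than joint) estimation are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np
import pandas as pd
from arch import arch_model
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)

# Synthetic stand-ins for DM/dollar quote returns and discrete spread
# categories (the actual study uses >300,000 continuously recorded quotes).
n = 5000
returns = 0.05 * rng.standard_normal(n)   # percent log returns between quotes
spread_cat = rng.integers(0, 3, size=n)   # spreads on a coarse grid, coded 0/1/2

# Step 1: quantify exchange rate uncertainty with a GARCH(1,1) model.
garch_res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
cond_vol = garch_res.conditional_volatility   # fitted sigma_t

# Step 2: ordered probit of the discrete spread on estimated volatility,
# respecting the discreteness of the spread distribution.
probit = OrderedModel(pd.Series(spread_cat),
                      pd.DataFrame({"cond_vol": cond_vol}),
                      distr="probit")
probit_res = probit.fit(method="bfgs", disp=False)

# A positive cond_vol coefficient means wider spreads under higher uncertainty.
print(probit_res.summary())
```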
Abstract:
Segmentation of anatomical and pathological structures in ophthalmic images is crucial for the diagnosis and study of ocular diseases. However, manual segmentation is often a time-consuming and subjective process. This paper presents an automatic approach for segmenting retinal layers in Spectral Domain Optical Coherence Tomography images using graph theory and dynamic programming. Results show that this method segments eight retinal layer boundaries in normal adult eyes, matching an expert grader more closely than a second expert grader does.
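The graph-plus-dynamic-programming idea can be illustrated with a minimal Python sketch (a toy under stated assumptions, not the authors' exact graph construction or edge weights): treat pixels as nodes, make strong vertical intensity transitions cheap to cross, and trace one boundary as a minimum-cost left-to-right path.

```python
import numpy as np

def boundary_dp(img):
    """Trace one layer boundary left to right by dynamic programming.

    Pixels are nodes; stepping to an adjacent pixel in the next column is
    cheap where the vertical intensity gradient is strong, since SD-OCT
    layer boundaries appear as sharp dark-to-bright transitions.
    """
    grad = np.abs(np.gradient(img.astype(float), axis=0))
    cost = grad.max() - grad                  # low cost on strong edges
    rows, cols = cost.shape
    acc = cost.copy()                         # accumulated path cost
    back = np.zeros((rows, cols), dtype=int)  # backpointers to previous column
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(r - 1, 0), min(r + 2, rows)
            prev = acc[lo:hi, c - 1]
            back[r, c] = lo + int(np.argmin(prev))
            acc[r, c] = cost[r, c] + prev.min()
    path = np.empty(cols, dtype=int)          # backtrack the cheapest path
    path[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 1, 0, -1):
        path[c - 1] = back[path[c], c]
    return path

# Toy usage: a synthetic two-layer B-scan with the boundary at row 40.
rng = np.random.default_rng(0)
img = np.vstack([np.zeros((40, 100)), np.ones((60, 100))])
img += 0.05 * rng.standard_normal(img.shape)
print(boundary_dp(img)[:5])                   # rows near the 0-to-1 transition
```

Repeating the search with previously found boundaries excluded (and with a true shortest-path solver on the graph rather than this column-wise recursion) yields the multiple layer boundaries described above.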
Abstract:
While genome-wide gene expression data are generated at an increasing rate, the repertoire of approaches for pattern discovery in these data is still limited. Identifying subtle patterns of interest in large amounts of data (tens of thousands of profiles) associated with a certain level of noise remains a challenge. A microarray time series was recently generated to study the transcriptional program of the mouse segmentation clock, a biological oscillator associated with the periodic formation of the segments of the body axis. A method related to Fourier analysis, the Lomb-Scargle periodogram, was used to detect periodic profiles in the dataset, leading to the identification of a novel set of cyclic genes associated with the segmentation clock. Here, we applied four distinct mathematical methods to the same microarray time series dataset to identify significant patterns in gene expression profiles. These methods, called phase consistency, address reduction, the cyclohedron test, and stable persistence, are based on different conceptual frameworks that are either hypothesis- or data-driven. Some of the methods, unlike Fourier transforms, do not depend on the assumption that the pattern of interest is periodic. Remarkably, these methods blindly identified the expression profiles of known cyclic genes as the most significant patterns in the dataset. Many candidate genes predicted by more than one approach appeared to be true positive cyclic genes and will be of particular interest for future research. In addition, these methods predicted novel candidate cyclic genes that were consistent with previous biological knowledge and experimental validation in mouse embryos. Our results demonstrate the utility of these novel pattern detection strategies, notably for the detection of periodic profiles, and suggest that combining several distinct mathematical approaches to analyze microarray datasets is a valuable strategy for identifying genes that exhibit novel, interesting transcriptional patterns.
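As a point of reference for the Lomb-Scargle baseline mentioned above, here is a minimal Python sketch of periodicity detection on a synthetic expression profile, using `scipy.signal.lombscargle`; the gene, the sampling times, and the roughly 2 h period (typical of the mouse segmentation clock) are illustrative assumptions:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# Hypothetical expression profile at unevenly spaced time points (hours);
# the ~2 h period is assumed here for illustration.
t = np.sort(rng.uniform(0, 10, 40))
expr = np.sin(2 * np.pi * t / 2.0) + 0.3 * rng.standard_normal(40)

periods = np.linspace(0.5, 5.0, 400)   # candidate periods in hours
ang_freq = 2 * np.pi / periods         # lombscargle expects angular frequencies
power = lombscargle(t, expr - expr.mean(), ang_freq, normalize=True)

print(f"strongest periodicity near {periods[np.argmax(power)]:.2f} h")
```

Unlike an ordinary FFT, the Lomb-Scargle periodogram tolerates uneven sampling, but it still presumes the pattern of interest is periodic, which is the assumption the other four methods above relax.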
Abstract:
Policy makers and analysts are often faced with situations where it is unclear whether market-based instruments hold real promise of reducing costs, relative to conventional uniform standards. We develop analytic expressions that can be employed with modest amounts of information to estimate the potential cost savings associated with market-based policies, with an application to the environmental policy realm. These simple formulae can identify instruments that merit more detailed investigation. We illustrate the use of these results with an application to nitrogen oxides control by electric utilities in the United States.
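To see the flavor of such formulae, consider a minimal two-firm illustration under an assumed quadratic abatement cost (for exposition only; these are not the paper's expressions), where total abatement Q must be achieved:

```latex
% Two-firm illustration, quadratic abatement costs C_i(q_i) = c_i q_i^2 / 2,
% total abatement requirement Q (assumed functional form, for exposition only).
\[
\text{uniform standard } (q_1 = q_2 = Q/2):\qquad
C^{U} = \frac{(c_1 + c_2)\,Q^{2}}{8},
\]
\[
\text{market-based } (c_1 q_1 = c_2 q_2):\qquad
C^{M} = \frac{c_1 c_2}{2\,(c_1 + c_2)}\,Q^{2},
\qquad
C^{U} - C^{M} = \frac{(c_1 - c_2)^{2}}{8\,(c_1 + c_2)}\,Q^{2}.
\]
```

The savings vanish when the marginal cost schedules coincide and grow with their dispersion; expressions of this kind can be evaluated with modest information to decide whether a market-based instrument merits more detailed investigation.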
Abstract:
© 2016 The Author(s). Mid-ocean ridges display tectonic segmentation defined by discontinuities of the axial zone, and geophysical and geochemical observations suggest segmentation of the underlying magmatic plumbing system. Here, observations of tectonic and magmatic segmentation at ridges spreading at fast to ultraslow rates are reviewed in light of influential concepts of ridge segmentation, including the notion of hierarchical segmentation, spreading cells, and centralized versus multiple supply of mantle melts. The observations support the concept of quasi-regularly spaced principal magmatic segments, which are 30-50 km long on average at fast- to slow-spreading ridges and are fed by melt accumulations in the shallow asthenosphere. Changes in ridge properties approaching or crossing transform faults are often comparable with those observed at smaller offsets, and even very small discontinuities can be major boundaries in ridge properties. Thus, hierarchical segmentation models, which suggest that large-scale transform-fault-bounded segmentation arises from deeper-level processes in the asthenosphere than the finer-scale segmentation, are not generally supported. The boundaries between some, but not all, principal magmatic segments defined by ridge-axis geophysical properties coincide with geochemical boundaries reflecting changes in source composition or melting processes. Where geochemical boundaries occur, they can coincide with discontinuities of a wide range of scales.
Abstract:
To maintain a strict balance between demand and supply in U.S. power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines the startup and shutdown times, the amount of power produced, and the provisioning of spinning and non-spinning generation reserves. Such a deterministic optimization model takes as input the characteristics of all generating units, such as installed capacity, ramp rates, minimum up- and down-time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is set based on the likelihood of outages on the supply side and on the forecast errors in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative. Rather than taking a fixed reserve target as an input, stochastic market clearing models consider different scenarios of wind power and determine reserve schedules as output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from Bonneville Power Administration (BPA) data, this paper compares the performance of stochastic and deterministic market clearing models. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, both because of the multi-dimensional performance metrics considered here and because of the difficulty of setting the models' parameters in a way that does not advantage one modeling framework over the other. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL), and wind spillage costs have on the comparison.
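Schematically, the two formulations being compared differ as follows; this is a stylized sketch with illustrative symbols, and actual market clearing models include many more constraints (unit commitment logic, ramping, transmission):

```latex
% Stylized deterministic clearing with a fixed reserve target R_t:
\[
\min_{u,p,r}\; \sum_{t}\sum_{g}\bigl(c^{\mathrm{su}}_{g}u_{gt}+c_{g}p_{gt}\bigr)
\quad\text{s.t.}\quad
\sum_{g}p_{gt}+\hat{W}_{t}=D_{t},\qquad
\sum_{g}r_{gt}\ge R_{t}\quad\forall t.
\]

% Stylized two-stage stochastic clearing over wind scenarios s (probability \pi_s):
\[
\min_{u,p,\ell,w}\; \sum_{t}\sum_{g}c^{\mathrm{su}}_{g}u_{gt}
+\sum_{s}\pi_{s}\sum_{t}\Bigl[\sum_{g}c_{g}p_{gts}
+\mathrm{VOLL}\,\ell_{ts}+c^{\mathrm{sp}}w_{ts}\Bigr]
\quad\text{s.t.}\quad
\sum_{g}p_{gts}+W_{ts}-w_{ts}+\ell_{ts}=D_{t}\quad\forall s,t.
\]
```

Here ℓ is load shedding penalized at VOLL and w is wind spillage penalized at an assumed cost c^sp; reserves are no longer a fixed input but emerge from requiring feasible dispatch in every scenario, which is why assumptions about VOLL and spillage costs drive the comparison described above.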
Abstract:
Market failures associated with environmental pollution interact with market failures associated with the innovation and diffusion of new technologies. These combined market failures provide a strong rationale for a portfolio of public policies that foster emissions reduction as well as the development and adoption of environmentally beneficial technology. Both theory and empirical evidence suggest that the rate and direction of technological advance is influenced by market and regulatory incentives, and can be cost-effectively harnessed through the use of economic-incentive based policy. In the presence of weak or nonexistent environmental policies, investments in the development and diffusion of new environmentally beneficial technologies are very likely to be less than would be socially desirable. Positive knowledge and adoption spillovers and information problems can further weaken innovation incentives. While environmental technology policy is fraught with difficulties, a long-term view suggests a strategy of experimenting with policy approaches and systematically evaluating their success. © 2005 Elsevier B.V. All rights reserved.