962 results for Buffer stock model
Abstract:
In this paper, we consider the impact of the introduction of a closing call auction on the market quality of the London Stock Exchange. We employ the market model, RDD and MEC metrics of market quality. These signify substantial improvements to market quality at both the close and the open for migrating stocks. We note that these improvements are larger at the open than at the close. An important contribution of our paper is that we show that changes to market quality are stronger in those securities that have the lowest liquidity in the pre-call period. In contrast, market quality changes following the introduction of a closing call auction are approximately neutral for high-liquidity securities. We conclude that the implementation of a closing call auction for high-liquidity securities may not enhance market quality.
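The abstract does not spell out RDD or MEC; one common market-quality measure carrying the latter acronym is the Market Efficiency Coefficient of Hasbrouck and Schwartz, a variance ratio of long-horizon to short-horizon returns. A minimal sketch under that assumed definition (function name and data are illustrative, not from the paper):

```python
import numpy as np

def mec(prices, k=5):
    """Market Efficiency Coefficient (Hasbrouck-Schwartz style):
    variance of k-period log returns over k times the variance of
    1-period log returns. Near 1 suggests a random walk; below 1,
    short-horizon noise / mean reversion."""
    logp = np.log(prices)
    r1 = np.diff(logp)                 # 1-period log returns
    rk = logp[k:] - logp[:-k]          # overlapping k-period log returns
    return np.var(rk) / (k * np.var(r1))

rng = np.random.default_rng(7)
# Random-walk prices: the MEC should come out close to 1.
walk = np.exp(np.cumsum(rng.normal(0, 0.01, 20000)))
print(round(mec(walk), 2))
```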
Abstract:
Many papers claim that a Log Periodic Power Law (LPPL) model fitted to financial market bubbles that precede large market falls or 'crashes', contains parameters that are confined within certain ranges. Further, it is claimed that the underlying model is based on influence percolation and a martingale condition. This paper examines these claims and their validity for capturing large price falls in the Hang Seng stock market index over the period 1970 to 2008. The fitted LPPLs have parameter values within the ranges specified post hoc by Johansen and Sornette (2001) for only seven of these 11 crashes. Interestingly, the LPPL fit could have predicted the substantial fall in the Hang Seng index during the recent global downturn. Overall, the mechanism posited as underlying the LPPL model does not do so, and the data used to support the fit of the LPPL model to bubbles does so only partially. © 2013.
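The LPPL specification referred to here models the log-price ahead of a crash as a power law decorated with log-periodic oscillations, ln p(t) = A + B(tc − t)^m + C(tc − t)^m cos(ω ln(tc − t) − φ). A sketch of fitting it by nonlinear least squares on synthetic data with known parameters (all values illustrative; fitting real bubbles is far more delicate and multi-modal):

```python
import numpy as np
from scipy.optimize import curve_fit

def lppl(t, tc, m, omega, A, B, C, phi):
    """Log-Periodic Power Law for the log-price ahead of critical time tc."""
    dt = np.abs(tc - t)   # abs keeps the fit defined if the optimizer steps tc into the sample
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

# Synthetic "bubble" series generated from known parameters, then refit.
t = np.linspace(0, 9, 300)
true = dict(tc=10.0, m=0.5, omega=8.0, A=5.0, B=-0.3, C=0.05, phi=1.0)
y = lppl(t, **true) + np.random.default_rng(0).normal(0, 0.002, t.size)

p0 = [10.5, 0.6, 7.5, 5.0, -0.2, 0.1, 0.5]   # starting values near the truth
popt, _ = curve_fit(lppl, t, y, p0=p0, maxfev=20000)
tc_hat, m_hat, omega_hat = popt[0], popt[1], popt[2]
print(round(tc_hat, 1), round(m_hat, 2), round(omega_hat, 1))
```

The post-hoc parameter ranges discussed in the abstract are restrictions on the fitted exponent m and angular log-frequency ω of exactly this kind of fit.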
Abstract:
A special inventory problem is presented: aircraft spares that are repaired and returned to spares, called rotable inventory. Rotable inventory is not consumed, so it does not change in the medium term, but is rotated through operational, maintenance and stock phases. The objective for inventory performance is fleet Service Level (SL), which affects aircraft dispatch performance. A model is proposed in which the fleet SL drives combined stock levels such that cost is optimized. By holding greater numbers of lower-cost items and lower levels of more expensive items, it is possible to achieve substantial cost savings while maintaining performance. This approach is shown to be an advance over the current literature and is tested with case data, with conclusive results.
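The cost trade-off described — more units of cheap rotables, fewer of expensive ones — can be illustrated with textbook marginal analysis: model each item's fill rate as the Poisson probability that spares cover the units in the repair pipeline, and greedily buy the spare with the best service-level gain per unit cost. A sketch under those assumptions (item names, costs and pipeline rates are invented, and this is not the paper's actual model):

```python
import math

def poisson_cdf(s, lam):
    """P(pipeline demand <= s) for Poisson demand with mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(s + 1))

# Hypothetical rotables: (unit cost, mean units in the repair pipeline).
items = {"pump": (5000, 2.0), "valve": (800, 3.5), "card": (15000, 1.2)}
stock = {name: 0 for name in items}

def fleet_sl(levels):
    # Fleet service level as the product of per-item fill rates (series logic).
    sl = 1.0
    for name, (_, lam) in items.items():
        sl *= poisson_cdf(levels[name], lam)
    return sl

target = 0.95
while fleet_sl(stock) < target:
    # Greedy marginal analysis: buy the spare with the best SL gain per unit cost.
    def gain_per_cost(name):
        cost, _ = items[name]
        trial = dict(stock)
        trial[name] += 1
        return (math.log(fleet_sl(trial)) - math.log(fleet_sl(stock))) / cost
    best = max(items, key=gain_per_cost)
    stock[best] += 1

print(stock, round(fleet_sl(stock), 3))
```

The greedy rule naturally ends up holding deeper stock of the cheap, high-demand item and shallow stock of the expensive one, which is the effect the abstract describes.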
Abstract:
Purpose – On 29 January 2001, Euronext LIFFE introduced single security futures contracts on a range of global companies. The purpose of this paper is to examine the impact that the introduction of these futures contracts had on the behaviour of opening and closing UK equity returns. Design/methodology/approach – The paper models the price discovery process using the Amihud and Mendelson partial adjustment model which can be estimated using a Kalman filter. Findings – Empirical results show that during the pre-futures period both opening and closing returns under-react to new information. After the introduction of futures contracts opening returns over-react. A rise in the partial adjustment coefficient also takes place for closing returns but this is not large enough to cause over-reaction. Originality/value – This is the first study to examine the impact of a single security futures contract on the speed of spot market price discovery.
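The Amihud and Mendelson partial adjustment model posits that the observed price closes only a fraction g of the gap to a latent intrinsic value each period, with g < 1 under-reaction and g > 1 over-reaction. In a simulation where the latent value is known, g can be recovered by a simple regression; in practice the value is unobserved, which is why the paper estimates the model with a Kalman filter. A sketch (all parameter values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
g_true, n = 0.6, 2000
v = np.cumsum(rng.normal(0, 1.0, n))    # latent "true" value: a random walk
p = np.zeros(n)
for t in range(1, n):
    # Partial adjustment: the price closes fraction g of the gap each period.
    p[t] = p[t - 1] + g_true * (v[t] - p[t - 1]) + rng.normal(0, 0.2)

# With v observable (only in simulation), g is the slope of price changes
# on the pricing gap; g_hat < 1 would be read as under-reaction.
dp = np.diff(p)
gap = v[1:] - p[:-1]
g_hat = np.cov(dp, gap)[0, 1] / np.var(gap)
print(round(g_hat, 2))
```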
Abstract:
Recent investigations into cross-country convergence follow Mankiw, Romer, and Weil (1992) in using a log-linear approximation to the Swan-Solow growth model to specify regressions. These studies tend to assume a common and exogenous technology. In contrast, the technology catch-up literature endogenises the growth of technology. The use of capital stock data renders the approximations and over-identification of the Mankiw model unnecessary and enables us, using dynamic panel estimation, to estimate the separate contributions of diminishing returns and technology transfer to the rate of conditional convergence. We find that both effects are important.
Abstract:
The information domain is a recognised sphere for the influence, ownership, and control of information and its specifications, format, exploitation and explanation (Thompson, 1967). The article presents a description of the financial information domain issues related to the organisation and operation of a stock market. We review the strategic, institutional and standards dimensions of the stock market information domain in relation to current semantic web knowledge, and consider how and whether this could be used in modern web-based stock market information systems to provide the quality of information that their stakeholders want. The analysis is based on the FINE model (Blanas, 2003) and leads to a number of questions for future research.
Abstract:
We examine how the most prevalent stochastic properties of key financial time series have been affected during the recent financial crises. In particular we focus on changes associated with the remarkable economic events of the last two decades in the volatility dynamics, including the underlying volatility persistence and volatility spillover structure. Using daily data from several key stock market indices, the results of our bivariate GARCH models show the existence of time varying correlations as well as time varying shock and volatility spillovers between the returns of FTSE and DAX, and those of NIKKEI and Hang Seng, which became more prominent during the recent financial crisis. Our theoretical considerations on the time varying model which provides the platform upon which we integrate our multifaceted empirical approaches are also of independent interest. In particular, we provide the general solution for time varying asymmetric GARCH specifications, which is a long standing research topic. This enables us to characterize these models by deriving, first, their multistep ahead predictors, second, the first two time varying unconditional moments, and third, their covariance structure.
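A volatility spillover of the kind estimated here can be written as a GARCH(1,1) variance equation augmented with the other market's lagged squared shock. A simulation sketch of that mechanism (all coefficients are invented, and the paper's actual specification is bivariate and time varying rather than this one-directional toy):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3000
# Hypothetical spillover GARCH(1,1): market 2's lagged shocks feed market 1's variance.
w1, a1, b1, s12 = 0.05, 0.08, 0.85, 0.04   # s12 = spillover coefficient
w2, a2, b2 = 0.05, 0.10, 0.85
h1, h2 = np.ones(n), np.ones(n)            # conditional variances
e1, e2 = np.zeros(n), np.zeros(n)          # return shocks
for t in range(1, n):
    h1[t] = w1 + a1 * e1[t - 1]**2 + b1 * h1[t - 1] + s12 * e2[t - 1]**2
    h2[t] = w2 + a2 * e2[t - 1]**2 + b2 * h2[t - 1]
    e1[t] = np.sqrt(h1[t]) * rng.standard_normal()
    e2[t] = np.sqrt(h2[t]) * rng.standard_normal()

# Persistence check: coefficient sums below 1 keep both variance processes stable.
print(a1 + b1 + s12 < 1, a2 + b2 < 1)
```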
Abstract:
Video streaming via Transmission Control Protocol (TCP) networks has become a popular and highly demanded service, but its quality assessment in both objective and subjective terms has not been properly addressed. In this paper, based on statistical analysis, a full analytic model of a no-reference objective metric, namely pause intensity (PI), for video quality assessment is presented. The model characterizes the video playout buffer behavior in connection with the network performance (throughput) and the video playout rate. This allows for instant quality measurement and control without requiring a reference video. PI specifically addresses the need for assessing quality in terms of the continuity of playout of TCP streaming videos, which cannot be properly measured by other objective metrics such as peak signal-to-noise ratio, structural similarity, and buffer underrun or pause frequency. The performance of the analytical model is rigorously verified by simulation results and subjective tests using a range of video clips. It is demonstrated that PI is closely correlated with viewers' opinion scores regardless of the vastly different composition of individual elements, such as pause duration and pause frequency, which jointly constitute this new quality metric. It is also shown that the correlation performance of PI is consistent and content independent. © 2013 IEEE.
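Pause intensity can be thought of as the fraction of session time spent stalled while the playout buffer refills. A toy discrete-time buffer simulation of that idea (the definition, rates and rebuffering threshold here are assumptions for illustration, not the paper's analytic model):

```python
# Hypothetical playout-buffer simulation for a pause-intensity (PI) style metric.
# PI here = total paused time / total session time (an assumed definition).
def simulate(throughput, playout_rate, duration, start_buf=2.0, rebuf=2.0, dt=0.01):
    buf, played, paused, playing, t = start_buf, 0.0, 0.0, True, 0.0
    while played < duration:
        buf += throughput * dt          # network fills the buffer
        if playing:
            if buf >= playout_rate * dt:
                buf -= playout_rate * dt
                played += playout_rate * dt
            else:
                playing = False         # buffer underrun: pause
        else:
            paused += dt
            if buf >= rebuf:
                playing = True          # resume once rebuffered to the threshold
        t += dt
    return paused / t                   # pause intensity

# Throughput below the playout rate forces pauses; above it, playout never stalls.
print(simulate(0.8, 1.0, 60.0) > simulate(1.2, 1.0, 60.0))
```

The simulation reproduces the qualitative link the paper formalizes: PI is driven by the throughput-to-playout-rate ratio, combining pause duration and pause frequency into one number.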
Abstract:
MSC 2010: 26A33, 33E12, 33C60, 44A20
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly impacting the relief provided to victims during the disaster. Managing these factors appropriately is critical to reducing suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases.
The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, along with the existence of room for improvement for Mexican organisations in flood management.
Abstract:
Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for both the stock price level and its volatility. Three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower traditional dividends with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models, and a stochastic discount rate in place of a constant discount rate. Empirical results show that the model described above predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices are still largely detached from fundamental prices. The bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate Vector Autoregression (TVAR) model and a single equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. Then, I apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices. Such bubbles are found in S&P data during the late 1990s.
Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in stock prices of conventional economic sectors on the New York Stock Exchange. The ‘old economy’ as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and in the Real Estate industry group.
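The sup augmented Dickey-Fuller (SADF) test used in these chapters computes right-tailed ADF statistics over expanding subsamples and takes the supremum; explosive bubble episodes push the statistic upward. A lag-free sketch on simulated data (a real application would include lag terms and proper right-tailed critical values):

```python
import numpy as np

def adf_stat(y):
    # t-statistic on rho in: dy_t = c + rho * y_{t-1} + e_t (no lag terms, for brevity)
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    e = dy - X @ beta
    s2 = e @ e / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sup_adf(y, min_frac=0.4):
    # Sup ADF: right-tailed ADF over expanding windows, take the supremum.
    n = len(y)
    return max(adf_stat(y[:k]) for k in range(int(min_frac * n), n + 1))

rng = np.random.default_rng(3)
rw = np.cumsum(rng.normal(size=400))                     # pure unit root: no bubble
bubble = rw.copy()
bubble[300:] += 0.1 * np.exp(0.06 * np.arange(100))      # explosive final leg
print(sup_adf(bubble) > sup_adf(rw))
```

The explosive leg drives the supremum up relative to the plain random walk, which is the signature the test exploits to date periodically collapsing bubbles.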
Abstract:
My dissertation investigates the financial linkages and transmission of economic shocks between the US and the smallest emerging markets (frontier markets). The first chapter sets up an empirical model that examines the impact of US market returns and conditional volatility on the returns and conditional volatilities of twenty-one frontier markets. The model is estimated via maximum likelihood, utilizes a GARCH model of errors, and is applied to daily country data from MSCI Barra. We find limited but statistically significant exposure of frontier markets to shocks from the US. Our results suggest that it is not the lagged US market returns that have impact; rather, it is the expected US market returns that influence frontier market returns. The second chapter sets up an empirical time-varying parameter (TVP) model to explore the time-variation in the impact of mean US returns on mean frontier market returns. The model utilizes the Kalman filter algorithm as well as a GARCH model of errors and is applied to daily country data from MSCI Barra. The TVP model detects statistically significant time-variation in the impact of US returns, and a low but statistically and quantitatively important impact of US market conditional volatility. The third chapter studies the risk-return relationship in twenty frontier country stock markets by setting up an international version of the intertemporal capital asset pricing model. The systematic risk in this model comes from the covariance of frontier market stock index returns with world returns. Both the systematic risk and the risk premium are time-varying in our model. We also incorporate own-country variances as additional determinants of frontier country returns. Our results suggest a statistically significant impact of both world and own-country risk in explaining frontier country returns. Time-variation in the world risk premium is also found to be statistically significant for most frontier market returns.
However, own country risk is found to be quantitatively more important.
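The TVP model of the second chapter can be illustrated with a scalar Kalman filter tracking a random-walk coefficient in a regression of frontier on US returns. A simulation sketch (the data-generating process and all variances are invented for illustration, not taken from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1500
# Hypothetical TVP regression: frontier return = beta_t * US return + noise,
# where beta_t follows a random walk and is recovered by a Kalman filter.
beta = np.cumsum(rng.normal(0, 0.02, n)) + 0.3   # latent time-varying exposure
x = rng.normal(0, 1.0, n)                        # US market return
y = beta * x + rng.normal(0, 0.5, n)             # frontier market return

q, r = 0.02**2, 0.5**2                           # state / observation variances
b_hat, P = 0.0, 1.0
path = np.empty(n)
for t in range(n):
    P += q                                       # predict: beta is a random walk
    K = P * x[t] / (x[t]**2 * P + r)             # Kalman gain for y_t = beta_t * x_t + v_t
    b_hat += K * (y[t] - b_hat * x[t])           # update with the new observation
    P *= (1 - K * x[t])
    path[t] = b_hat

print(round(float(np.mean(np.abs(path[200:] - beta[200:]))), 3))
```

After a short burn-in the filtered path tracks the drifting exposure closely, which is the sense in which a TVP model "detects time-variation" that a constant-coefficient regression would average away.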
Modelling the effectiveness of grass buffer strips in managing muddy floods under a changing climate
Abstract:
Muddy floods occur when rainfall generates runoff on agricultural land, detaching and transporting sediment into the surrounding natural and built environment. In the Belgian Loess Belt, muddy floods occur regularly and lead to considerable economic costs associated with damage to property and infrastructure. Mitigation measures designed to manage the problem have been tested in a pilot area within Flanders and were found to be cost-effective within three years. This study assesses whether these mitigation measures will remain effective under a changing climate. To test this, the Water Erosion Prediction Project (WEPP) model was used to examine muddy flooding diagnostics (precipitation, runoff, soil loss and sediment yield) for a case study hillslope in Flanders where grass buffer strips are currently used as a mitigation measure. The model was run for present day conditions and then under 33 future site-specific climate scenarios. These future scenarios were generated from three earth system models driven by four representative concentration pathways and downscaled using quantile mapping and the weather generator CLIGEN. Results reveal that under the majority of future scenarios, muddy flooding diagnostics are projected to increase, mostly as a consequence of large scale precipitation events rather than mean changes. The magnitude of muddy flood events for a given return period is also generally projected to increase. These findings indicate that present day mitigation measures may have a reduced capacity to manage muddy flooding given the changes imposed by a warming climate with an enhanced hydrological cycle. Revisions to the design of existing mitigation measures within existing policy frameworks are considered the most effective way to account for the impacts of climate change in future mitigation planning.
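Quantile mapping, the downscaling step mentioned above, corrects a climate model's distribution by passing each value through the model-historical empirical CDF and inverting with the observed-historical quantiles. A sketch on synthetic gamma-distributed daily precipitation (all distributions and parameters are illustrative):

```python
import numpy as np

def quantile_map(model_future, model_hist, obs_hist):
    """Empirical quantile mapping: look up each future value's probability
    under the model-historical CDF, then read off the observed-historical
    quantile at that probability."""
    ranks = np.searchsorted(np.sort(model_hist), model_future, side="right")
    probs = np.clip(ranks / len(model_hist), 0.001, 0.999)
    return np.quantile(obs_hist, probs)

rng = np.random.default_rng(4)
obs = rng.gamma(2.0, 5.0, 5000)     # "observed" daily precipitation (mm)
hist = rng.gamma(2.0, 3.0, 5000)    # biased model: too dry over the historical period
future = rng.gamma(2.0, 3.6, 5000)  # model future: scale increased by 20%
corrected = quantile_map(future, hist, obs)
# Correction lifts the dry model onto the observed scale while carrying
# the model's projected relative change through to the corrected series.
print(round(float(corrected.mean() / obs.mean()), 2))
```

The clipping of extreme probabilities is one simple guard for future values outside the historical range; how to extrapolate the tails is exactly where large-event changes like those discussed above become sensitive to the method.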
Abstract:
This study presents the validation of the observations made by the fisheries observer programme known as the Programa Bitácoras de Pesca (PBP) during the period 2005-2011, across the distribution area in which industrial purse-seine vessels fish the north-central stock of the Peruvian anchoveta (Engraulis ringens). For the same period and area, the magnitudes of discards due to excess catch, discards of juveniles, and the incidental catch of this fishery were also estimated. A total of 3,768 trips were observed out of 302,859, a coverage of 1.2%. The data on discards due to excess catch, juvenile discards and incidental catch recorded on the observed trips were characterised by a high proportion of zeros. To validate the observations, a simulation study based on Monte Carlo methodology was carried out using a negative binomial distribution model. This makes it possible to infer the optimal coverage level and to determine whether the information obtained by the observer programme is reliable. From this analysis, it is concluded that the current observation levels should be increased to a coverage of at least 10% of all trips made each year by the industrial purse-seine vessels fishing the north-central stock of the Peruvian anchoveta. Discards due to excess catch, discards of juveniles and incidental catch were estimated using three methodologies: bootstrap, a general linear model (GLM) and a delta model. Each methodology produced different magnitudes with similar trends.
The estimated magnitudes were compared using a Bayesian ANOVA, which showed little evidence that the estimates of discards due to excess catch differed across methodologies; the same held for incidental catch, whereas for juvenile discards there were substantial differences. The methodology that satisfied the assumptions and explained the most variability in the modelled variables was the delta model, which appears to be the better alternative for estimation given the high proportion of zeros in the data. The average estimates of discards due to excess catch, discards of juveniles and incidental catch under the delta model were 252,580, 41,772 and 44,823 tonnes respectively, together representing 5.74% of landings. In addition, using the estimated magnitude of juvenile discards, a biomass projection exercise was carried out under the hypothetical scenario of no fishing mortality, with the discarded juveniles assumed to measure only 8 to 11 cm; it found that the biomass made unavailable to the fishery lies between 52 and 93 thousand tonnes.
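The coverage question can be illustrated by Monte Carlo in the spirit of the study: simulate zero-inflated per-trip discards from a negative binomial, then compare the sampling error of expansion estimates of the fleet total at roughly the observed 1.2% coverage versus the recommended 10%. A sketch with invented fleet parameters (not the study's data or estimator):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical fleet: per-trip discards (tonnes) from a negative binomial
# with many zeros (~69% of trips record none; mean roughly 30 t per trip).
n_trips, r, p = 300_000, 0.06, 0.002
discards = rng.negative_binomial(r, p, n_trips)
true_total = discards.sum()

def rel_error(coverage, reps=200):
    # Observe a random fraction of trips and scale the observed sum to the fleet;
    # return the relative standard deviation of that expansion estimate.
    k = int(coverage * n_trips)
    ests = [discards[rng.choice(n_trips, k, replace=False)].sum() / coverage
            for _ in range(reps)]
    return np.std(ests) / true_total

for cov in (0.012, 0.10):               # observed 1.2% vs recommended 10%
    print(f"coverage {cov:.1%}: rel. error {rel_error(cov):.3f}")
```

Raising coverage shrinks the sampling error of the fleet-total estimate, which is the quantitative basis for a minimum-coverage recommendation of the kind the study makes.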