915 results for Forecasting Volatility


Relevance: 20.00%

Abstract:

This paper studies whether error-correction models that assume the stationarity of long-maturity forward exchange rates, which earlier calculations found to have excellent out-of-sample forecasting power when applied to the major industrial countries that account for roughly 75 per cent of world foreign exchange turnover, are capable of forecasting the euro exchange rates of three Central and East European currencies: the Czech koruna, the Hungarian forint and the Polish zloty. The results for the three currencies differ considerably by currency pair and are on the whole less favourable than those obtained earlier for the major currencies. These unfavourable results are attributed to the combined influence of managed (not fully flexible) exchange-rate regimes, the short time series available, uncertainties related to future euro-zone entry, the existence of foreign exchange risk and forward term premia, and the Balassa-Samuelson effect. JEL classification: E43, F31, F47.
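The abstract does not spell out the forecasting equation; as a rough, hypothetical illustration of an error-correction forecast of the kind described, assuming simulated log spot and long-maturity forward rates:

```python
import numpy as np
import statsmodels.api as sm

# Simulated monthly log spot rate and a long-maturity log forward rate
# (purely illustrative; the forward premium is stationary by construction).
rng = np.random.default_rng(0)
spot = np.cumsum(rng.normal(0.0, 0.02, 200))
fwd = spot + rng.normal(0.0, 0.01, 200)

# Error-correction term: the lagged deviation of the forward rate from the spot rate.
ect = fwd[:-1] - spot[:-1]
dspot = np.diff(spot)  # change in the spot rate to be explained

# Regress the spot-rate change on the lagged error-correction term ...
ecm = sm.OLS(dspot, sm.add_constant(ect)).fit()

# ... and form a one-step-ahead forecast from the last observed forward premium.
forecast_change = ecm.params @ np.array([1.0, fwd[-1] - spot[-1]])
print(ecm.params, forecast_change)
```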

Relevance: 20.00%

Abstract:

It is a widely supported empirical fact that greater volatility in itself reduces market liquidity: the more volatile a market is, the greater the price impact of a transaction is expected to be. The paper examines whether the temporary decrease in liquidity observed during the 2007/2008 crisis in the market of the OTP share, traded on the Budapest Stock Exchange, can be attributed simply to the increased volatility, or whether other factors also played a role (e.g. a drastic change in the composition and behaviour of market participants, or a general reduction of financing sources). Volatility is represented by the standard deviation of log returns and by the true range, while illiquidity is represented by the Budapest Liquidity Measure (BLM). First, in the case of OTP the true range correlates more closely with the BLM than the standard deviation does. Second, the pre-crisis relationship between volatility and liquidity changed markedly during and after the crisis: during the crisis, illiquidity was considerably greater than the increase in volatility alone would have implied, while after the crisis subsided this relation reversed.
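As a rough illustration of the two volatility proxies named above, here is a minimal sketch assuming a hypothetical daily price table `px` with `high`, `low` and `close` columns and an illiquidity series `blm` (names are placeholders, not the paper's data):

```python
import numpy as np
import pandas as pd

def volatility_proxies(px: pd.DataFrame, window: int = 20) -> pd.DataFrame:
    """Rolling standard deviation of log returns and rolling mean true range."""
    logret = np.log(px["close"]).diff()
    prev_close = px["close"].shift(1)
    # True range: the widest of the day's range and the gaps from the previous close.
    tr = pd.concat([px["high"] - px["low"],
                    (px["high"] - prev_close).abs(),
                    (px["low"] - prev_close).abs()], axis=1).max(axis=1)
    return pd.DataFrame({"std_logret": logret.rolling(window).std(),
                         "true_range": tr.rolling(window).mean()})

# Hypothetical usage: correlate each proxy with an illiquidity series such as the BLM.
# proxies = volatility_proxies(px)
# print(proxies.corrwith(blm))
```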

Relevance: 20.00%

Abstract:

The article presents the overall structure of the labour-market forecasting system of the Institute of Economics of the Research Centre for Economic and Regional Studies of the Hungarian Academy of Sciences (MTA KRTK KTI), and the main principles followed in devising that structure. The authors describe the data system, uniquely comprehensive and wide-ranging in Hungarian practice, on which the estimation and forecasting were carried out, and briefly discuss the forecasting of sectoral GDP, the operation of the demand and supply sides of the model, and the analysis of the dynamics of the discrepancies between demand and supply.

Relevance: 20.00%

Abstract:

A significant share of companies face the problem that demand for many of their products arises relatively rarely. For such products, classical forecasting methods such as the moving average or exponential smoothing cannot be applied. Products for which demand appears only occasionally are called sporadic- (intermittent-) demand products; the distinction between sporadic and non-sporadic products is often drawn by rule of thumb, although the literature offers guidance on it. Demand for these products is not continuous in time: it is diffuse and random, with a large proportion of zero values in the analysed time series, so the information available from previous selling periods is patchy and traditional forecasting techniques do not yield reliable forecasts. New demand-forecasting methods recommended specifically for such sporadic-demand products have appeared in the international literature over the last decade. The aim of this paper is to test these recommendations, as a case study, on the real data of a Hungarian pharmaceutical wholesaler: the company's product portfolio is classified from a demand-forecasting point of view, demand for the sporadic-demand products is forecast with several techniques, including those developed for sporadic demand, and the quality of these forecasts and the literature's recommendations on the methods to apply are examined. Papers discussing this topic in a real application environment are rare even in the international literature, and to our knowledge no scholarly work on it has yet been published in Hungarian.
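The paper does not list the specific sporadic-demand techniques it tests; the classic method in this literature is Croston's method, sketched below purely for illustration:

```python
import numpy as np

def croston(demand, alpha=0.1):
    """Croston's method for intermittent demand: smooth non-zero demand sizes
    and the intervals between them separately, then forecast their ratio."""
    demand = np.asarray(demand, dtype=float)
    size = interval = None        # smoothed demand size and inter-demand interval
    periods_since = 1
    for d in demand:
        if d > 0:
            if size is None:      # initialise on the first non-zero demand
                size, interval = d, periods_since
            else:
                size += alpha * (d - size)
                interval += alpha * (periods_since - interval)
            periods_since = 1
        else:
            periods_since += 1
    return np.nan if size is None else size / interval

# Hypothetical usage on a sparse demand history (demand per period forecast).
print(croston([0, 0, 3, 0, 0, 0, 5, 0, 2, 0, 0, 4]))
```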

Relevance: 20.00%

Abstract:

The author investigates the impact of the financial crisis that began in 2008 on the error of earnings-per-share (EPS) forecasts. Numerous publications since the 1980s have shown that analysts give systematically more favourable values in their EPS forecasts than the figures subsequently reported, i.e. they are generally optimistic. Other investigations have found that the EPS forecast error grows in an uncertain environment, while there is also ample evidence that analysts under-react to negative information in their forecasts. The financial crisis confronted analysts with a great deal of negative information to consider when preparing forecasts, and it also significantly increased uncertainty across the whole economy. The article therefore examines how the crisis affected the EPS forecast error, distinguishing the period in which the crisis was merely negative news from the period in which, as a consequence of the crisis, uncertainty had already risen substantially.
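The abstract does not say how the forecast error is measured; a common convention, shown here only as an assumption, is the signed error scaled by the absolute actual value (or by the share price):

```python
import numpy as np

def eps_forecast_error(forecast, actual, scale=None):
    """Signed EPS forecast error; positive values indicate optimistic forecasts.
    Scaled by `scale` (e.g. share price) when given, otherwise by |actual|."""
    forecast, actual = np.asarray(forecast, float), np.asarray(actual, float)
    denom = np.asarray(scale, float) if scale is not None else np.abs(actual)
    return (forecast - actual) / denom

# Hypothetical usage: average error before and during a crisis period.
pre = eps_forecast_error([2.10, 1.50], [2.00, 1.40])
post = eps_forecast_error([1.80, 1.20], [1.10, 0.60])
print(pre.mean(), post.mean())
```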

Relevance: 20.00%

Abstract:

Following the introduction of the Basel II capital accord, banks and credit institutions in Hungary also began to build their own internal rating systems, whose maintenance and development is a continuing task. The author examines whether the predictive power of bankruptcy-forecasting models can be increased with traditional mathematical-statistical methods by also incorporating into the models the extent to which the financial ratios change over time. The empirical results suggest that the temporal behaviour of the financial ratios of Hungarian firms carries important information about their future solvency, since using it significantly improves the predictive power of the bankruptcy models. The author also examines whether correcting extremely high or low observation values before modelling improves the classification performance of the models.
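A minimal sketch of the idea of feeding both ratio levels and their year-on-year changes, with extreme values corrected, into a standard classifier; the column names and the logistic specification are illustrative assumptions, not the author's model:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def winsorize(s: pd.Series, lower=0.01, upper=0.99) -> pd.Series:
    """Clip extreme values to the given quantiles before modelling."""
    return s.clip(s.quantile(lower), s.quantile(upper))

def build_features(df: pd.DataFrame, ratios: list[str]) -> pd.DataFrame:
    """Level of each ratio in the last year plus its change from the year before."""
    out = {}
    for r in ratios:
        out[r] = winsorize(df[f"{r}_t"])
        out[f"{r}_change"] = winsorize(df[f"{r}_t"] - df[f"{r}_t_minus_1"])
    return pd.DataFrame(out)

# Hypothetical usage with columns like "liquidity_t", "liquidity_t_minus_1", "default".
# X = build_features(firms, ["liquidity", "leverage", "profitability"])
# clf = LogisticRegression(max_iter=1000).fit(X, firms["default"])
```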

Relevance: 20.00%

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapter Three and Chapter Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics.

Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements.

Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market; the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model had potential to rationalize the Uncovered Interest Parity Puzzle.

Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate negative cross-correlations between real exchange rates and relative consumption across two countries as observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model could replicate the stylized fact that real exchange rates tend to move in an opposite direction with respect to relative consumption.
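Chapter Two's comparison against the random walk can be illustrated with a generic sketch (a simple AR(1) stand-in on simulated data, not the dissertation's FIGARCH specification): fit on a rolling window and compare out-of-sample errors with a no-change forecast.

```python
import numpy as np

rng = np.random.default_rng(42)
log_px = np.cumsum(rng.normal(0, 0.006, 1500))   # hypothetical daily log exchange rate
ret = np.diff(log_px)

window, errs_model, errs_rw = 500, [], []
for t in range(window, len(ret)):
    history = ret[t - window:t]
    # A simple AR(1) mean forecast stands in for a richer specification.
    phi = np.corrcoef(history[:-1], history[1:])[0, 1]
    forecast = phi * history[-1]
    errs_model.append((ret[t] - forecast) ** 2)
    errs_rw.append(ret[t] ** 2)                  # the random walk predicts a zero return

print("model RMSE:", np.sqrt(np.mean(errs_model)))
print("random-walk RMSE:", np.sqrt(np.mean(errs_rw)))
```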

Relevance: 20.00%

Abstract:

In essay 1 we develop a new autoregressive conditional process to capture both the changes in and the persistence of the intraday seasonal (U-shape) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant, and specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying, and the covariance of the informational and noisy components is no longer restricted to be zero. The resultant measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed by the procedure developed in the first essay; the resultant seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
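The deterministic filtering procedures that essay 1 generalizes are typically of the following form; this minimal sketch assumes a hypothetical series of intraday returns and a time-of-day bin for each observation:

```python
import numpy as np
import pandas as pd

def deseasonalize_intraday(returns: pd.Series, bins: pd.Series) -> pd.Series:
    """Deterministic intraday filter (the special case the essay generalizes):
    divide each return by the average absolute return of its time-of-day bin."""
    seasonal = returns.abs().groupby(bins).transform("mean")
    return returns / seasonal

# Hypothetical usage: returns indexed by timestamps, grouped by time of day.
idx = pd.date_range("2024-01-01 09:30", periods=400, freq="5min")
returns = pd.Series(np.random.default_rng(0).normal(0, 0.001, len(idx)), index=idx)
filtered = deseasonalize_intraday(returns, pd.Series(idx.time, index=idx))
```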

Relevance: 20.00%

Abstract:

Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, neither can account for both the level of stock prices and their volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations are: replacing the narrower, more commonly used traditional dividends with broad dividends; a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models; and a stochastic discount rate in place of the constant discount rate. Empirical results show that this model predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate vector autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices; such bubbles are found in the S&P data during the late 1990s. Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and in the Real Estate industry group.
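The sup augmented Dickey-Fuller (bubble) test mentioned in the third chapter can be sketched, in simplified form, with forward-expanding windows; critical values and the exact specification are omitted here:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def sup_adf(prices, min_window=50):
    """Simplified sup-ADF statistic: the maximum ADF statistic over
    forward-expanding windows that all start at the first observation."""
    prices = np.asarray(prices, dtype=float)
    stats = [adfuller(prices[:end], regression="c", autolag="AIC")[0]
             for end in range(min_window, len(prices) + 1)]
    return max(stats)

# Hypothetical usage: this is a right-tailed test, so a large positive statistic
# (relative to right-tailed critical values) points toward explosive behaviour.
rng = np.random.default_rng(1)
print(sup_adf(np.cumsum(rng.normal(0, 1, 200)) + 100))
```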

Relevance: 20.00%

Abstract:

An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. The scheme is derived from the conventional kernel-estimator-based prediction model by associating real-time nonlinear impacts caused by the traffic patterns of neighboring arcs with the historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of Jacksonville. Experimental results show that the proposed scheme significantly reduces both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain high-quality real-time traffic information, which is essential to the performance of the AMPRFP algorithm, a data clean scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This method investigates the correlation between distance and direction in the geometrical map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance, and a direction filter is developed to clean joints that have a negative influence on the localization accuracy. Synthetic experiments in urban, suburban and rural environments are designed to evaluate the performance of the DCSEEL algorithm in determining a cellular probe's position; the results show that the probe's localization accuracy can be notably improved by the DCSEEL algorithm. Additionally, a new fast correlation technique is developed to overcome the time-efficiency problem of the existing correlation-algorithm-based floating car data (FCD) technique. The matching process is transformed into a one-dimensional (1-D) curve matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson product-moment correlation coefficient (PMCC) algorithm in order to meet the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in reducing the computational cost without affecting the accuracy of the matching process.
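A generic sketch of FFT-based normalized cross-correlation for 1-D curve matching (not the dissertation's exact FNCC implementation), replacing a separate Pearson coefficient computed at every offset:

```python
import numpy as np
from scipy.signal import fftconvolve

def fast_ncc(signal, template):
    """FFT-based zero-normalized cross-correlation of a short template over a signal,
    a stand-in for computing a Pearson coefficient separately at every lag."""
    signal, t = np.asarray(signal, float), np.asarray(template, float)
    m = len(t)
    t0 = t - t.mean()
    # Numerator at every lag in one FFT-based pass (correlation is convolution
    # with the reversed, zero-mean template).
    num = fftconvolve(signal, t0[::-1], mode="valid")
    # Sliding mean and variance of the signal windows for the normalization.
    csum = np.cumsum(np.insert(signal, 0, 0.0))
    csum2 = np.cumsum(np.insert(signal ** 2, 0, 0.0))
    win_mean = (csum[m:] - csum[:-m]) / m
    win_var = (csum2[m:] - csum2[:-m]) / m - win_mean ** 2
    return num / (m * np.sqrt(np.maximum(win_var, 1e-12)) * t.std())

# Hypothetical usage: locate where a probe-vehicle speed profile best matches a link.
rng = np.random.default_rng(0)
link_speeds = rng.normal(50, 5, 1000)
probe = link_speeds[400:430] + rng.normal(0, 1, 30)
print(int(np.argmax(fast_ncc(link_speeds, probe))))   # best-matching offset, near 400
```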

Relevance: 20.00%

Abstract:

Extensive data sets on water quality and seagrass distributions in Florida Bay have been assembled under complementary, but independent, monitoring programs. This paper presents the landscape-scale results from these monitoring programs and outlines a method for exploring the relationships between two such data sets. Seagrass species occurrence and abundance data were used to define eight benthic habitat classes from 677 sampling locations in Florida Bay. Water quality data from 28 monitoring stations spread across the Bay were used to construct a discriminant function model that assigned a probability of a given benthic habitat class occurring for a given combination of water quality variables. Mean salinity, salinity variability, the amount of light reaching the benthos, sediment depth, and mean nutrient concentrations were important predictor variables in the discriminant function model. Using a cross-validated classification scheme, this discriminant function identified the most likely benthic habitat type as the actual habitat type in most cases. The model predicted that the distribution of benthic habitat types in Florida Bay would likely change if water quality and water delivery were changed by human engineering of freshwater discharge from the Everglades. Specifically, an increase in the seasonal delivery of freshwater to Florida Bay should cause an expansion of seagrass beds dominated by Ruppia maritima and Halodule wrightii at the expense of the Thalassia testudinum-dominated community that now occurs in northeast Florida Bay. These statistical techniques should prove useful for predicting landscape-scale changes in community composition in diverse systems where communities are in quasi-equilibrium with environmental drivers.
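A minimal analogue of the discriminant function model described, using scikit-learn's linear discriminant analysis; the column names and file are hypothetical placeholders:

```python
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per station, water quality predictors plus the
# observed benthic habitat class.
predictors = ["mean_salinity", "salinity_variability", "benthic_light",
              "sediment_depth", "mean_nutrients"]
# df = pd.read_csv("florida_bay_stations.csv")

lda = LinearDiscriminantAnalysis()
# Cross-validated classification accuracy, then class probabilities per station.
# scores = cross_val_score(lda, df[predictors], df["habitat_class"], cv=5)
# lda.fit(df[predictors], df["habitat_class"])
# probs = lda.predict_proba(df[predictors])   # probability of each habitat class
```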

Relevance: 20.00%

Abstract:

Urban growth models have been used for decades to forecast urban development in metropolitan areas. Since the 1990s cellular automata, with simple computational rules and an explicitly spatial architecture, have been heavily utilized in this endeavor. One such cellular-automata-based model, SLEUTH, has been successfully applied around the world to better understand and forecast not only urban growth but also other forms of land-use and land-cover change; like other models, however, it must be fed information about which lands in the modeled area are available for development. Some of the lands excluded from urban growth fall into categories that are difficult to quantify because their function is dictated by policy. One such category comprises voluntary differential assessment programs, whereby farmers agree not to develop their lands in exchange for significant tax breaks. Because these programs are voluntary, lands excluded today may become available for development at some point in the future. Mapping the shifting mosaic of parcels enrolled in such programs allows this information to be used in modeling and forecasting. In this study, we added information about California's Williamson Act to SLEUTH's excluded layer for Tulare County. Assumptions about the voluntary differential assessments were used to create a sophisticated excluded layer that was fed into SLEUTH's urban growth forecasting routine. The results not only demonstrate a successful execution of this method but also yield high goodness-of-fit metrics both for the calibration of enrollment termination and for the urban growth modeling itself.
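A toy sketch of the kind of weighted excluded layer described, assuming the convention that 100 marks land fully excluded from growth and intermediate values mark partial resistance; the weight assigned to enrolled parcels is an illustrative assumption:

```python
import numpy as np

def build_excluded_layer(hard_exclusions: np.ndarray,
                         enrolled_parcels: np.ndarray,
                         enrollment_weight: int = 60) -> np.ndarray:
    """Sketch of a SLEUTH-style excluded layer: 100 means development is treated as
    impossible (e.g. water, protected land), 0 means fully available, and voluntarily
    enrolled parcels get an intermediate resistance because they may leave the program."""
    layer = np.zeros_like(hard_exclusions, dtype=np.uint8)
    layer[enrolled_parcels.astype(bool)] = enrollment_weight
    layer[hard_exclusions.astype(bool)] = 100
    return layer

# Hypothetical usage with two boolean rasters covering the study area.
# excluded = build_excluded_layer(water_and_parks, williamson_act_parcels)
```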

Relevance: 20.00%

Abstract:

In their dialogue entitled “The Food Service Industry Environment: Market Volatility Analysis,” Alex F. De Noble, Assistant Professor of Management, San Diego State University, and Michael D. Olsen, Associate Professor and Director, Division of Hotel, Restaurant & Institutional Management at Virginia Polytechnic Institute and State University, preface the discussion by saying: “Hospitality executives, as a whole, do not believe they exist in a volatile environment and spend little time or effort in assessing how current and future activity in the environment will affect their success or failure. The authors highlight potential differences that may exist between executives' perceptions and objective indicators of environmental volatility within the hospitality industry and suggest that executives change these perceptions by incorporating the assumption of a much more dynamic environment into their future strategic planning efforts. Objective, empirical evidence of the dynamic nature of the hospitality environment is presented and compared to several studies pertaining to environmental perceptions of the industry.” That weighty thesis statement presumes that hospitality executives and managers do not fully comprehend the environment in which they operate. The authors provide a contrast which conventional wisdom would seem to support. “Broadly speaking, the operating environment of an organization is represented by its task domain,” say the authors. “This task domain consists of such elements as a firm's customers, suppliers, competitors, and regulatory groups.” These are dynamic actors and the underpinnings of change, the authors note by way of citation. “The most difficult aspect for management in this regard tends to be the development of a proper definition of the environment of their particular firm. Being able to precisely define who the customers, competitors, suppliers, and regulatory groups are within the environment of the firm is no easy task, yet is imperative if proper planning is to occur,” De Noble and Olsen add in support of their thesis. The article is dense with tables, both survey-based and empirically derived, that illustrate market volatility; that is not necessarily a bad thing. One such table is the Bates and Eldredge outline, Table 6 in the article. “This comprehensive outline…should prove to be useful to most executives in expanding their perception of the environment of their firm,” say De Noble and Olsen. “It is, however, only a suggested outline,” they advise. “…risk should be incorporated into every investment decision, especially in a volatile environment,” say the authors. De Noble and Olsen close with an intriguing formula for gauging volatility in an environment.

Relevance: 20.00%

Abstract:

Background: Type 2 diabetes mellitus (T2DM) is increasingly becoming a major public health problem worldwide. Estimating the future burden of diabetes is instrumental to guide the public health response to the epidemic. This study aims to project the prevalence of T2DM among adults in Syria over the period 2003–2022 by applying a modelling approach to the country's own data.

Methods: Future prevalence of T2DM in Syria was estimated among adults aged 25 years and older for the period 2003–2022 using the IMPACT Diabetes Model (a discrete-state Markov model).

Results: According to our model, the prevalence of T2DM in Syria is projected to double in the period between 2003 and 2022 (from 10% to 21%). The projected increase in T2DM prevalence is higher in men (148%) than in women (93%). The increase in prevalence of T2DM is expected to be most marked in people younger than 55 years, especially the 25–34 years age group.

Conclusions: The future projections of T2DM in Syria put it amongst the countries with the highest levels of T2DM worldwide. It is estimated that by 2022 approximately a fifth of the Syrian population aged 25 years and older will have T2DM.
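The IMPACT Diabetes Model itself is not specified in the abstract; the mechanics of a discrete-state Markov projection can be illustrated with a toy three-state sketch whose transition probabilities are purely invented:

```python
import numpy as np

# Toy discrete-state Markov projection with assumed (illustrative) annual
# transition probabilities between three states: no diabetes, T2DM, dead.
P = np.array([[0.975, 0.015, 0.010],   # from "no diabetes"
              [0.000, 0.975, 0.025],   # from "T2DM" (no remission assumed)
              [0.000, 0.000, 1.000]])  # "dead" is absorbing

state = np.array([0.90, 0.10, 0.00])   # hypothetical 2003 distribution of adults 25+

for year in range(2003, 2023):
    alive = state[:2].sum()
    print(year, f"T2DM prevalence among the living: {state[1] / alive:.1%}")
    state = state @ P                  # advance the cohort by one year
```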

Relevance: 20.00%

Abstract:

Space-for-time substitution is often used in predictive models because long-term time-series data are not available. Critics of this method suggest factors other than the target driver may affect ecosystem response and could vary spatially, producing misleading results. Monitoring data from the Florida Everglades were used to test whether spatial data can be substituted for temporal data in forecasting models. Spatial models that predicted bluefin killifish (Lucania goodei) population response to a drying event performed comparably and sometimes better than temporal models. Models worked best when results were not extrapolated beyond the range of variation encompassed by the original dataset. These results were compared to other studies to determine whether ecosystem features influence whether space-for-time substitution is feasible. Taken in the context of other studies, these results suggest space-for-time substitution may work best in ecosystems with low beta-diversity, high connectivity between sites, and small lag in organismal response to the driver variable.