883 results for FORECASTING


Relevance:

10.00%

Publisher:

Abstract:

In this thesis we deal with the concept of risk. The objective is to synthesize and draw conclusions from normative findings on quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling phases, and from this classification derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bull and bear markets: it is strongly positive in rising markets, whereas bear markets are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data; it is well behaved neither in stability nor in dependency over time. Based on these observations, we recommend simple, transparent methods that are likely to be more robust across differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
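As a minimal illustration of the realized-volatility technique examined in essays two and three, the following sketch (an illustrative assumption of this summary, not code from the thesis) estimates one day's variance as the sum of squared intraday log returns, using simulated constant-volatility data:

```python
import numpy as np

def realized_variance(prices):
    """Realized variance: the sum of squared intraday log returns."""
    log_returns = np.diff(np.log(prices))
    return float(np.sum(log_returns ** 2))

# Simulate one trading day from a constant-volatility process.
rng = np.random.default_rng(0)
n = 390                      # one-minute returns over a 6.5 h session
true_daily_var = 1e-4
returns = rng.normal(0.0, np.sqrt(true_daily_var / n), size=n)
prices = 100.0 * np.exp(np.cumsum(np.concatenate([[0.0], returns])))
rv = realized_variance(prices)   # close to true_daily_var
```

With autocorrelated returns or a lower sampling frequency, this estimator becomes biased or noisier, which is the effect studied in the third essay.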

First, in Essay 1, we test whether Finnish Options Index return volatility can be forecast by examining the out-of-sample predictive ability of several common volatility models using alternative well-known methods, and we find additional evidence for the predictability of volatility and for the superiority of the more complicated models over the simpler ones. Second, in Essay 2, the aggregate volatility of stocks listed on the Helsinki Stock Exchange is decomposed into market-, industry- and firm-level components, and it is found that firm-level (i.e., idiosyncratic) volatility has increased over time, is more substantial than the other two components, predicts GDP growth, moves countercyclically and, like the other components, is persistent. Third, in Essay 3, we are among the first in the literature to search for firm-specific determinants of idiosyncratic volatility in a multivariate setting. For the cross-section of stocks listed on the Helsinki Stock Exchange we find that industrial focus, trading volume and block ownership are positively associated with idiosyncratic volatility estimates (obtained from both the CAPM and the Fama-French three-factor model with local and international benchmark portfolios), whereas firm age and size are negatively related to idiosyncratic volatility.
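A hedged sketch of how firm-level volatility of the kind studied in Essays 2 and 3 might be measured: idiosyncratic volatility as the standard deviation of residuals from a CAPM-style regression. The data below are simulated, not from the Helsinki Stock Exchange, and the thesis's exact estimation procedure may differ:

```python
import numpy as np

def idiosyncratic_volatility(stock_returns, market_returns):
    """Std of residuals from a CAPM-style regression of stock on market."""
    X = np.column_stack([np.ones_like(market_returns), market_returns])
    beta, *_ = np.linalg.lstsq(X, stock_returns, rcond=None)
    residuals = stock_returns - X @ beta
    return float(residuals.std(ddof=2))

# Simulated daily returns: beta of 1.2 plus idiosyncratic noise of sd 0.02.
rng = np.random.default_rng(1)
market = rng.normal(0.0, 0.01, size=250)
stock = 0.001 + 1.2 * market + rng.normal(0.0, 0.02, size=250)
ivol = idiosyncratic_volatility(stock, market)   # close to 0.02
```

In a Fama-French variant, the design matrix would simply gain size and value factor columns.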

The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by, e.g., Feinstein (1989), Jackwerth and Rubinstein (1996) and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced into implied volatility. This has two important consequences. First, implied volatility is actually expected to exceed realized volatility over long periods of time, only to fall far short of realized volatility during infrequent periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than the market assessed ex ante, the classic regression test tends to reject the hypothesis of informational efficiency even if markets are informationally efficient.
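The classic forecasting regression mentioned above can be sketched as follows. In this synthetic example (an assumption of this summary, not the paper's data), realized volatility systematically falls short of implied volatility, as it would in a sample where priced volatility jumps failed to occur, and the OLS slope comes out below one:

```python
import numpy as np

def forecast_regression(realized, implied):
    """OLS estimates (a, b) of RV_t = a + b * IV_t + e_t."""
    X = np.column_stack([np.ones_like(implied), implied])
    coef, *_ = np.linalg.lstsq(X, realized, rcond=None)
    return float(coef[0]), float(coef[1])

# Synthetic sample in which realized vol falls short of implied vol.
rng = np.random.default_rng(2)
iv = rng.uniform(0.1, 0.4, size=200)
rv = 0.9 * iv + rng.normal(0.0, 0.02, size=200)
a, b = forecast_regression(rv, iv)   # slope below one
```

A test of informational efficiency asks whether a = 0 and b = 1; a sample with fewer jumps than the market priced ex ante pushes b below one and leads the test to reject.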

This study contributes to the neglect-effect literature by looking at relative trading volume in terms of value. Market capitalisation and analyst coverage have been used as proxies for neglect in prior studies; these measures, however, do not take into account the effort analysts put into estimating corporate pre-tax profits. The results for the Swedish market show a significant positive relationship between the accuracy of estimation and relative trading volume. I also find evidence that the industry of the firm influences the accuracy of estimation. In addition, supporting earlier findings, loss-making firms are associated with larger forecasting errors. Further, I find that the average forecast error in Sweden increased in the year 2000.
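One common way to quantify the accuracy of estimation is a scaled absolute forecast error; the study's exact definition may differ, and the figures below are hypothetical:

```python
def forecast_error(actual_profit, forecast_profit):
    """Absolute forecast error scaled by the absolute actual outcome."""
    return abs(actual_profit - forecast_profit) / abs(actual_profit)

# Hypothetical pre-tax profit figures (MSEK) for two firms.
error_traded = forecast_error(120.0, 110.0)     # heavily traded firm
error_neglected = forecast_error(15.0, 25.0)    # neglected firm
```

In the spirit of the results above, the same absolute miss of 10 MSEK translates into a much larger relative error for the smaller, less-traded firm.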

Mesoscale weather phenomena, such as the sea-breeze circulation or lake-effect snow bands, are typically too large to be observed at one point, yet too small to be caught in a traditional network of weather stations. Hence, the weather radar is one of the best tools for observing, analyzing and understanding their behavior and development. A weather radar network is a complex system with many structural and technical features to be tuned, from the location of each radar to the number of pulses averaged in the signal processing. These design parameters have no universally optimal values; their selection depends on the nature of the weather phenomena to be monitored as well as on the applications for which the data will be used. The priorities and critical values differ for forest fire forecasting, aviation weather service or the planning of snow ploughing, to name a few radar-based applications. The main objective of the work performed within this thesis has been to combine knowledge of the technical properties of radar systems with our understanding of weather conditions, in order to produce better applications that can efficiently support decision making in weather- and safety-related services for modern society in northern conditions. When a new application is developed, it must be tested against ground truth. Two new verification approaches for radar-based hail estimates are introduced in this thesis. For mesoscale applications, finding a representative reference can be challenging, since these phenomena are by definition difficult to catch with surface observations. Hence, almost any valuable information that can be distilled from unconventional data sources, such as newspapers and holiday snapshots, is welcome. However, just as important as obtaining the data is obtaining estimates of data quality, and judging to what extent two disparate information sources can be compared.
The new applications presented here do not rely on radar data alone, but ingest information from auxiliary sources such as temperature fields. The author concludes that the radar will remain a key source of data and information in the future, especially when used effectively together with other meteorological data.
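Verification of binary estimates such as radar-detected hail against ground-truth reports is commonly summarized with contingency-table scores. The sketch below (standard scores, not necessarily the thesis's two new approaches; the data are invented) computes the probability of detection (POD) and false alarm ratio (FAR):

```python
def contingency_scores(radar_hail, ground_truth):
    """Probability of detection and false alarm ratio from paired
    binary radar hail estimates and ground-truth hail reports."""
    pairs = list(zip(radar_hail, ground_truth))
    hits = sum(1 for r, g in pairs if r and g)
    misses = sum(1 for r, g in pairs if not r and g)
    false_alarms = sum(1 for r, g in pairs if r and not g)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

radar = [1, 1, 0, 1, 0, 0, 1, 0]   # radar says hail
truth = [1, 0, 0, 1, 1, 0, 1, 0]   # e.g. newspaper or insurance reports
pod, far = contingency_scores(radar, truth)
```

When the reference itself is uncertain, as with newspaper reports, such scores must be interpreted jointly with an estimate of the reference's own quality.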

This dissertation develops a strategic management accounting perspective on inventory routing. The thesis studies the drivers of cost-efficiency gains by identifying the roles of the underlying cost structure, demand, information sharing, forecasting accuracy, service levels, vehicle fleet, planning horizon and other strategic factors, as well as the interaction effects among these factors with respect to performance outcomes. The task is to enhance knowledge of the strategic situations that favor the implementation of inventory routing systems, to understand cause-and-effect relationships and linkages, and to gain a holistic view of the value proposition of inventory routing. The thesis applies an exploratory case-study design based on normative quantitative empirical research using optimization, simulation and factor analysis. Data and results are drawn from a real-world application to cash supply chains. The first research paper shows that performance gains require a common cost component and cannot be explained by simple linear or affine cost structures. Inventory management and distribution decisions become separable in the absence of a set-dependent cost structure, and in this case neither economies of scope nor coordination problems are present. The second research paper analyzes whether information sharing improves overall forecasting accuracy. The analysis suggests that the potential for information sharing is limited to the coordination of replenishments, and that central information does not yield more accurate forecasts based on joint forecasting. The third research paper develops a novel formulation of the stochastic inventory routing model that accounts for minimal service levels and forecasting accuracy. The developed model allows studying the interaction of minimal service levels and forecasting accuracy with the underlying cost structure in inventory routing.
Interestingly, the results show that the factors minimal service level and forecasting accuracy are not statistically significant, and consequently not relevant for the strategic decision problem of whether to introduce inventory routing, or in other words, to effectively internalize inventory management and distribution decisions at the supplier. Consequently, the main contribution of this thesis is the finding that the cost benefits of inventory routing derive from the joint decision model that accounts for the underlying set-dependent cost structure, rather than from the level of information sharing. This result suggests that the value of sharing demand and inventory data is likely to be overstated in the prior literature. In other words, the cost benefits of inventory routing are primarily determined by the cost structure (i.e., the level of fixed costs and transportation costs) rather than by the level of information sharing, joint forecasting, forecasting accuracy or service levels.

A linear optimization model was used to calculate seven wood procurement scenarios for the years 1990, 2000 and 2010. Productivity and cost functions for seven cutting methods, five terrain transport methods, three long-distance transport methods, and various work supervision and scaling methods were derived from available work study reports. All methods are based on the Nordic cut-to-length system. Finland was divided into three parts to describe harvesting conditions, and twenty imaginary wood processing points with their wood procurement areas were created for these parts. The procurement systems, which consist of the harvesting conditions and work productivity functions, were described as a simulation model. In the LP model, the wood procurement system has to fulfil the volume and wood assortment requirements of the processing points while minimizing the procurement cost. The model consists of 862 variables and 560 restrictions. The results show that it is economical to increase the share of mechanized work in harvesting. Cost-increase alternatives have only a small effect on the profitability of manual work. The areas of later thinnings and of seed-tree and shelterwood cuttings increase at the expense of first thinnings. In mechanized work, one method, the 10-tonne single-grip harvester with forwarder, gains an advantage over the other methods; forwarder working hours decrease, in contrast to those of the harvester. There is little need to increase the number of harvesters and trucks, or their drivers, from today's level. Quite large fluctuations in the level of procurement and cost can be handled with a constant number of machines, by varying the number of seasonal workers and by running machines in two shifts. This is possible provided that some environmental problems of large-scale summertime harvesting can be solved.
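The structure of such an LP, minimizing procurement cost subject to a volume requirement, can be sketched in miniature (two decision variables instead of 862; all costs and capacities below are hypothetical, not the study's data):

```python
from scipy.optimize import linprog

# Decision variables: cubic metres harvested manually vs mechanically.
cost = [12.0, 8.0]                    # EUR per m3, hypothetical figures
A_ub = [[-1.0, -1.0],                 # x_manual + x_mech >= 100 000 m3
        [0.0, 1.0]]                   # mechanized capacity <= 80 000 m3
b_ub = [-100_000.0, 80_000.0]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
# Optimum: fill the cheaper mechanized capacity first, cover the rest manually.
```

The full model adds one variable per method-region-assortment combination and one restriction per processing-point requirement, but the minimization principle is the same.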

To enhance the utilization of wood, sawmills are forced to place more emphasis on planning in order to master the whole production chain from the forest to the end product. One significant obstacle to integrating the forest-sawmill-market production chain is the lack of appropriate information about forest stands. Since the wood procurement point of view has been almost totally disregarded in forest planning systems, there has been a great need to develop an easy and efficient pre-harvest measurement method that allows separate measurement of stands prior to harvesting. The main purpose of this study was to develop a measurement method for pine stands which forest managers could use to describe the properties of the standing trees for sawing production planning. The study materials were collected from ten Scots pine (Pinus sylvestris) stands located in North Häme and South Pohjanmaa, in southern Finland. The data comprise test sawing data on 314 pine stems, dbh and height measurements of all trees, measurements of the quality parameters of pine sawlog stems in all ten study stands, and the locations of all trees in six stands. The study was divided into four sub-studies dealing with pine quality prediction, construction of diameter and dead-branch-height distributions, sampling designs, and the application of height and crown height models. The final proposal for the pre-harvest measurement method is a synthesis of the individual sub-studies. The quality analysis resulted in choosing dbh, the distance from stump height to the first dead branch (dead branch height), crown height and tree height as the most appropriate quality characteristics of Scots pine. Dbh and dead branch height are measured for each pine sample tree, while height and crown height are derived from dbh with the aid of mixed height and crown height models. Pine and spruce diameter distributions, as well as the dead branch height distribution, are most effectively predicted by the kernel function.
Roughly 25 sample trees seem to be appropriate in pure pine stands. In mixed stands, the number of sample trees needs to be increased in proportion to the share of pines in order to attain the same level of accuracy.
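A kernel estimate of a diameter distribution of the kind used above can be sketched as follows (simulated dbh values and an arbitrary bandwidth, not the study's data or its bandwidth selection rule):

```python
import numpy as np

def gaussian_kernel_density(x_grid, data, bandwidth):
    """Kernel estimate of a stand's dbh distribution (Gaussian kernel)."""
    u = (x_grid[:, None] - data[None, :]) / bandwidth
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    return k.mean(axis=1) / bandwidth

rng = np.random.default_rng(3)
dbh = rng.normal(25.0, 5.0, size=25)       # 25 sample trees, dbh in cm
grid = np.linspace(5.0, 45.0, 200)
density = gaussian_kernel_density(grid, dbh, bandwidth=3.0)
```

Unlike a fitted parametric distribution, the kernel estimate follows whatever shape the 25 sample trees suggest, which is why it predicts diameter and dead-branch-height distributions effectively.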

This study uses the European Centre for Medium-Range Weather Forecasts (ECMWF) model-generated high-resolution 10-day predictions for the Year of Tropical Convection (YOTC) 2008. The precipitation forecast skill of the model over the tropics is evaluated against Tropical Rainfall Measuring Mission (TRMM) estimates. It is shown that the model captures the monthly to seasonal mean features of tropical convection reasonably well. Northward propagation of convective bands over the Bay of Bengal was also forecast realistically up to 5 days in advance, including the onset phase of the monsoon during the first half of June 2008. However, large errors exist in the daily datasets, especially at longer lead times and over smaller domains. For shorter lead times (less than 4-5 days), forecast errors are much smaller over the oceans than over land. Moreover, the rate of increase of errors with lead time is rapid over the oceans and is confined to regions where observed precipitation shows large day-to-day variability. It is shown that this rapid growth of errors over the oceans is related to the spatial pattern of near-surface air temperature, probably owing to the one-way air-sea interaction in the atmosphere-only model used for forecasting. While the prescribed surface temperature over the oceans remains realistic at shorter lead times, the pattern, and hence the gradient, of the surface temperature is not altered by changes in atmospheric parameters at longer lead times. It is also shown that the ECMWF model has considerable difficulty forecasting very low and very heavy precipitation intensities over South Asia: the model has too few grid points with "zero" precipitation and with heavy (above 40 mm per day) precipitation, while drizzle-like precipitation is too frequent compared with the TRMM datasets.
Further analysis shows that a major source of error in the ECMWF precipitation forecasts is the diurnal cycle over the South Asian monsoon region. The peak precipitation intensity in the model forecasts over land (ocean) appears about 6 (9) hours earlier than in the observations. Moreover, the amplitude of the diurnal cycle is much higher in the model forecasts than in the TRMM estimates, and the phase error of the diurnal cycle increases with forecast lead time. The error in monthly mean 3-hourly precipitation forecasts is about 2-4 times the error in the daily mean datasets. Thus, effort should be devoted to improving the phase and amplitude of the model's forecast diurnal cycle of precipitation.
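Diurnal-cycle phase and amplitude of the kind compared above are commonly extracted by projecting 3-hourly data onto the 24-hour harmonic. The sketch below uses a synthetic cycle (an assumption for illustration, not model or TRMM data):

```python
import numpy as np

def diurnal_harmonic(precip_3hourly):
    """Amplitude and peak hour of the 24 h harmonic of 3-hourly precip."""
    hours = np.arange(0, 24, 3)
    omega = 2.0 * np.pi / 24.0
    a = 2.0 * np.mean(precip_3hourly * np.cos(omega * hours))
    b = 2.0 * np.mean(precip_3hourly * np.sin(omega * hours))
    amplitude = float(np.hypot(a, b))
    peak_hour = float((np.arctan2(b, a) / omega) % 24.0)
    return amplitude, peak_hour

# Synthetic cycle peaking at 15 LT, amplitude 2 mm/3h around a 5 mm mean.
hours = np.arange(0, 24, 3)
precip = 5.0 + 2.0 * np.cos(2.0 * np.pi / 24.0 * (hours - 15.0))
amp, peak = diurnal_harmonic(precip)
```

Comparing the fitted peak hour between forecasts and observations quantifies the 6-9 hour phase error noted above, and the fitted amplitude quantifies the overestimation.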

The performance of the Advanced Regional Prediction System (ARPS) in simulating an extreme rainfall event is evaluated, and the physical mechanisms leading to its initiation and sustenance are then explored. As a case study, the heavy precipitation event that produced 65 cm of rainfall accumulation in a span of around 6 h (1430-2030 LT) over Santacruz (Mumbai, India) on 26 July 2005 is selected. Three sets of numerical experiments were conducted. The first set (EXP1) consisted of a four-member ensemble and was carried out in an idealized mode with a model grid spacing of 1 km. In spite of the idealized framework, signatures of heavy rainfall were seen in two of the ensemble members. The second set (EXP2) consisted of a five-member ensemble, with four-level one-way nested integration and grid spacings of 54, 18, 6 and 1 km. The model was able to simulate a realistic spatial structure on the 54, 18 and 6 km grids; with the 1 km grid, however, the simulations were dominated by the prescribed boundary conditions. The third and final set (EXP3) consisted of a five-member ensemble, with four-level one-way nesting and grid spacings of 54, 18, 6 and 2 km. The Scaled Lagged Average Forecasting (SLAF) methodology was employed to construct the ensemble members. The model simulations in this case were closer to observations than in EXP2. Specifically, among all the experiments, the timing of maximum rainfall, the abrupt increase in rainfall intensity that was a major feature of this event, and the rainfall intensities simulated in EXP3 (at 6 km resolution) were closest to observations. Analysis of the physical mechanisms causing the initiation and sustenance of the event reveals some interesting aspects. Deep convection was found to be initiated by mid-tropospheric convergence that extended to lower levels during the later stage.
In addition, there was a strong negative vertical gradient of equivalent potential temperature, indicating strong atmospheric instability prior to and during the event. Finally, the presence of conducive vertical wind shear in the lower and mid-troposphere is thought to be one of the major factors influencing the longevity of the event.
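The SLAF idea of building ensemble members from age-scaled differences between older forecasts and the current analysis can be sketched schematically. The scaling rule and toy fields below are illustrative assumptions; the study's actual SLAF configuration may differ:

```python
import numpy as np

def slaf_members(analysis, lagged_forecasts):
    """SLAF-style initial ensemble: paired perturbations built from the
    difference between each older forecast (valid at the analysis time)
    and the analysis, scaled down with forecast age."""
    members = [analysis]
    for lag, forecast in enumerate(lagged_forecasts, start=1):
        perturbation = (forecast - analysis) / lag
        members.append(analysis + perturbation)
        members.append(analysis - perturbation)
    return members

analysis = np.array([300.0, 301.0, 299.5])        # toy analysis field
lagged = [np.array([300.5, 300.0, 299.0]),        # 1-day-old forecast
          np.array([301.0, 302.0, 300.5])]        # 2-day-old forecast
ensemble = slaf_members(analysis, lagged)         # five members
```

Adding perturbations in plus-minus pairs keeps the ensemble mean equal to the analysis, which is one reason lagged-forecast perturbations are attractive when no ensemble data assimilation system is available.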

The aim of this thesis is to examine how the economic recession affected the attractiveness of housing investment and the discussion on the housing market in 2008. At that time the Finnish economy contracted sharply and rapidly, surprising even the experts who prepare forecasts. Economists' statements lacked consistency and precision, and could change significantly over a short period. In a recession, investment communication is cautious and carefully worded. Investors readily believe the opinions and perceptions of other investors even when these are not always backed by verified facts. People want to believe experts' statistics even while harboring doubts about them; on the other hand, they want to believe in the profitability and financial returns of housing investment even if experts could prove otherwise. The study was conducted as a qualitative case study, analyzed by adapting Greimas's actantial model. The research material consisted of 14 articles on housing investment published in Helsingin Sanomat and 13 opinion pieces published on the Taloussanomat discussion forum. The meanings of the messages were examined semiotically by assigning roles to the different actants. In the narrative, the object of the investor-subject is the dwelling, through which the aim is to achieve the greatest possible financial return. The senders include, among others, the compilers of statistics and investment advisors. All the actants influencing an optimal investment decision are examined in more detail toward the end of the study.

A real-time operational methodology has been developed for multipurpose reservoir operation for irrigation and hydropower generation with application to the Bhadra reservoir system in the state of Karnataka, India. The methodology consists of three phases of computer modelling. In the first phase, the optimal release policy for a given initial storage and inflow is determined using a stochastic dynamic programming (SDP) model. Streamflow forecasting using an adaptive AutoRegressive Integrated Moving Average (ARIMA) model constitutes the second phase. A real-time simulation model is developed in the third phase using the forecast inflows of phase 2 and the operating policy of phase 1. A comparison of the optimal monthly real-time operation with the historical operation demonstrates the relevance, applicability and the relative advantage of the proposed methodology.
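The streamflow-forecasting phase can be illustrated with a stripped-down stand-in for the adaptive ARIMA model: a recursively applied AR(1) forecast fitted to past inflows. The inflow series below is hypothetical, and the methodology's actual ARIMA orders and adaptation scheme are richer than this sketch:

```python
import numpy as np

def ar1_forecast(flows, steps=1):
    """Recursive inflow forecasts from a fitted AR(1) model."""
    x = np.asarray(flows, dtype=float)
    mu = x.mean()
    dev = x - mu
    phi = float(np.dot(dev[:-1], dev[1:]) / np.dot(dev[:-1], dev[:-1]))
    forecasts, last = [], x[-1]
    for _ in range(steps):
        last = mu + phi * (last - mu)
        forecasts.append(last)
    return forecasts

monthly_inflow = [120, 140, 180, 220, 260, 240, 200, 160]  # hypothetical Mm3
fc = ar1_forecast(monthly_inflow, steps=3)   # reverts toward the mean
```

In the methodology, forecasts of this kind feed the real-time simulation of phase 3, which applies the SDP release policy of phase 1 to the forecast inflows.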

The equatorial Indian Ocean is warmer in the east, has a deeper thermocline and mixed layer, and supports a more convective atmosphere than in the west. During certain years, the eastern Indian Ocean becomes unusually cold, anomalous winds blow from east to west along the equator and southeastward off the coast of Sumatra, the thermocline and mixed layer lift, and atmospheric convection is suppressed. At the same time, the western Indian Ocean becomes warmer, enhancing atmospheric convection there. This coupled ocean-atmosphere phenomenon, in which convection, winds, sea surface temperature (SST) and the thermocline all take part actively, is known as the Indian Ocean Dipole (IOD). The propagation of baroclinic Kelvin and Rossby waves excited by anomalous winds plays an important role in the development of the SST anomalies associated with the IOD. Since the mean thermocline in the Indian Ocean is deep compared with the Pacific, it was long believed that the Indian Ocean is passive and merely responds to atmospheric forcing. The discovery of the IOD, and the studies that followed, demonstrate that the Indian Ocean can sustain its own intrinsic coupled ocean-atmosphere processes. About 50% of the IOD events in the past 100 years have co-occurred with the El Nino Southern Oscillation (ENSO), and the other half have occurred independently. Coupled models have been able to reproduce IOD events, and process experiments with such models (switching ENSO on and off) support the hypothesis, based on observations, that IOD events develop either in the presence or in the absence of ENSO. There is a general consensus among different coupled models, as well as from analysis of data, that IOD events co-occurring with ENSO are forced by a zonal shift of the descending branch of the Walker cell over to the eastern Indian Ocean. The processes that initiate the IOD in the absence of ENSO are not clear, although several studies suggest that anomalies of the Hadley circulation are the most probable forcing.
The impact of the IOD is felt in the vicinity of the Indian Ocean as well as in remote regions. During IOD events, the biological productivity of the eastern Indian Ocean increases, and this in turn leads to the death of corals over a large area. Moreover, the IOD affects rainfall over the maritime continent, the Indian subcontinent, Australia and eastern Africa: the maritime continent and Australia suffer rainfall deficits, whereas India and east Africa receive excess rainfall. Despite the successful hindcast of the 2006 IOD by a coupled model, forecasting IOD events and their implications for rainfall variability remains a major challenge, as does understanding the reasons behind the increase in the frequency of IOD events in recent decades.
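IOD events are conventionally tracked with the Dipole Mode Index (DMI), the difference between western and eastern box-averaged SST anomalies. The sketch below uses invented anomaly values for a positive IOD phase; the box definitions follow the standard convention:

```python
import numpy as np

def dipole_mode_index(sst_anom_west, sst_anom_east):
    """DMI: west-box minus east-box SST anomaly (degrees C)."""
    return float(np.mean(sst_anom_west) - np.mean(sst_anom_east))

# Hypothetical monthly box-averaged SST anomalies during a positive IOD.
west = np.array([0.6, 0.7, 0.5])      # 50E-70E, 10S-10N
east = np.array([-0.8, -0.6, -0.9])   # 90E-110E, 10S-0
dmi = dipole_mode_index(west, east)   # positive during a positive IOD
```

A warm west combined with an anomalously cold east, as in the events described above, yields a strongly positive DMI.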

The suitability of the European Centre for Medium-Range Weather Forecasts (ECMWF) operational wind analysis for the period 1980-1991 for studying interannual variability is examined. Changes in the model and the analysis procedure are shown to give rise to a systematic and significant trend in large-scale circulation features. A new method for removing the systematic errors at all levels is presented, using multivariate EOF analysis. The objectively detrended analysis of the three-dimensional wind field agrees well at the surface with the independent Florida State University (FSU) wind analysis: the interannual variations in the detrended surface analysis agree with those of the FSU analysis in both amplitude and spatial pattern. Therefore, the detrended analyses at other levels are also expected to be useful for studies of variability and predictability on interannual time scales. It is demonstrated that the trend in the wind field is due to a shift in the climatologies from the period 1980-1985 to the period 1986-1991.
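The core of an EOF-based detrending step can be sketched with a toy (time, space) anomaly matrix: extract the leading mode by singular value decomposition and subtract it. The synthetic field below is an illustrative assumption; the paper's multivariate EOF procedure operates on the full three-dimensional wind field:

```python
import numpy as np

def leading_eof(anomalies):
    """Leading EOF pattern and principal component of a (time, space)
    anomaly matrix, via singular value decomposition."""
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    return vt[0], u[:, 0] * s[0]      # spatial pattern, PC time series

rng = np.random.default_rng(4)
t = np.linspace(-1.0, 1.0, 12)                 # 12 "years"
pattern = np.array([1.0, 0.5, -0.5, -1.0])     # systematic spatial trend
field = np.outer(t, pattern) + 0.01 * rng.normal(size=(12, 4))
eof, pc = leading_eof(field)
detrended = field - np.outer(pc, eof)          # leading mode removed
```

When a spurious trend dominates the leading mode, as with the analysis-system changes described above, removing that mode leaves the genuine interannual variability in the residual field.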

Owing to the lack of atmospheric vertical-profile data with sufficient accuracy and vertical resolution, the response of the deep atmosphere to the passage of monsoon systems over the Bay of Bengal had not been satisfactorily elucidated. Under the Indian Climate Research Programme, a special observational programme called the Bay of Bengal Monsoon Experiment (BOBMEX) was conducted during July-August 1999. The present study is based on the high-resolution radiosondes launched during BOBMEX in the northern Bay of Bengal. Clear changes in the vertical thermal structure of the atmosphere between active and weak phases of convection were observed: the atmosphere cooled below 6 km height and warmed between 6 and 13 km height, with the warmest layer located between 8 and 10 km and the coldest layer just below 5 km. The largest fluctuations in the humidity field occurred in the mid-troposphere. The observed changes between active and weak phases of convection are compared with results from an atmospheric general circulation model similar to that used at the National Centre for Medium Range Weather Forecasting, New Delhi. The model is not able to capture realistically some important features of the temperature and humidity profiles in the lower troposphere and in the boundary layer during the active and weak spells.