183 results for Pooling of forecasts
Abstract:
Harmful Algal Blooms (HABs) have become an important environmental concern along the western coast of the United States. Toxic and noxious blooms adversely impact the economies of coastal communities in the region, pose risks to human health, and cause mortality events that have resulted in the deaths of thousands of fish, marine mammals and seabirds. One goal of field-based research efforts on this topic is the development of predictive models of HABs that would enable rapid response, mitigation and ultimately prevention of these events. In turn, these objectives are predicated on understanding the environmental conditions that stimulate these transient phenomena. An embedded sensor network (Fig. 1), under development in the San Pedro Shelf region off the Southern California coast, provides tools for acquiring chemical, physical and biological data at high temporal and spatial resolution. These data help document the emergence and persistence of HAB events, support the design and testing of predictive models, and provide contextual information for experimental studies designed to reveal the environmental conditions promoting HABs. The sensor platforms in this network include pier-based sensor arrays, ocean moorings and HF radar stations, along with mobile sensor nodes in the form of surface and subsurface autonomous vehicles. Freewave™ radio modems facilitate network communication and form a minimally intrusive, wireless communication infrastructure throughout the Southern California coastal region, allowing rapid and cost-effective data transfer. An emerging focus of this project is the incorporation of a predictive ocean model that assimilates near-real-time, in situ data from deployed Autonomous Underwater Vehicles (AUVs) to increase the skill of both nowcasts and forecasts, thus providing insight into bloom initiation as well as the movement of blooms or other oceanic features of interest (e.g., thermoclines, fronts, river discharge). From these predictions, deployed mobile sensors can be tasked to track a designated feature. This focus has led to the creation of a technology chain in which algorithms are being implemented for innovative trajectory design for AUVs. Such intelligent mission planning is required to maneuver a vehicle to the precise depths and locations that are the sites of active blooms, or of physical/chemical features that might be sources of bloom initiation or persistence. The embedded network yields high-resolution temporal and spatial measurements of pertinent environmental parameters and the resulting biology (see Fig. 1). Supplementing this with ocean current information, remotely sensed imagery and meteorological data, we obtain a comprehensive foundation for developing a fundamental understanding of HAB events. This understanding then directs labor-intensive and costly sampling efforts and analyses. Additionally, we provide coastal municipalities, managers and state agencies with detailed information to aid their efforts in providing responsible environmental stewardship of their coastal waters.
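The abstract does not describe the assimilation scheme itself; as a purely illustrative sketch under that caveat, a scalar Kalman-style update shows how an in situ AUV measurement can be blended with a model nowcast, with the observation weighted by the relative uncertainty of model and sensor (all names and values here are hypothetical):

```python
def assimilate(model_state, model_var, obs, obs_var):
    """Blend a model nowcast with one in situ observation (scalar Kalman update)."""
    gain = model_var / (model_var + obs_var)      # trust the obs more when the model is uncertain
    analysis = model_state + gain * (obs - model_state)
    analysis_var = (1.0 - gain) * model_var       # assimilation shrinks the uncertainty
    return analysis, analysis_var

# Hypothetical example: the model predicts 16.0 C at a grid cell; an AUV measures 15.2 C.
state, var = assimilate(model_state=16.0, model_var=0.5, obs=15.2, obs_var=0.1)
print(state, var)  # the analysis is pulled toward the more certain AUV reading
```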
Abstract:
Planning the utilization of train-sets is one of the key tasks of transport organization for passenger-dedicated railways in China, and it is closely related to timetable scheduling and station operation plans. To execute such a task in a railway hub pooling multiple railway lines, the characteristics of multiple routing for train-sets are discussed in terms of the semicircle of train-set turnover. In formulating the described problem, minimum dwell time is selected as the objective, subject to constraints on train-set dispatch, connecting conditions, the uniqueness principle for train-sets, and priority for connections in the same direction based on a time tolerance σ. A compact connection algorithm based on the time tolerance is then designed. The feasibility of the model and the algorithm is demonstrated by a case study. The results indicate that the circulation model and algorithm for multiple routing can handle connections between train-sets from multiple directions and reduce the impact of trains arriving at and departing from the station throat.
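The paper's exact formulation is not reproduced in the abstract; the following is a hedged sketch of what a compact-connection heuristic with a same-direction preference might look like, with the minimum turnaround time, data layout and tie-breaking rules all assumed for illustration:

```python
MIN_TURNAROUND = 20  # minutes a train-set must dwell before reuse (assumed value)

def plan_connections(arrivals, departures, sigma=15):
    """Greedy compact-connection heuristic (illustrative sketch).

    arrivals, departures: lists of (time_in_minutes, direction) tuples.
    Returns {departure_index: arrival_index} connections; departures with no
    feasible arrival are left unserved (a fresh set from the depot).
    """
    free = list(enumerate(arrivals))                  # train-sets waiting at the hub
    plan = {}
    for j, (dep_t, dep_dir) in sorted(enumerate(departures), key=lambda x: x[1][0]):
        feasible = [(i, t, d) for i, (t, d) in free if t + MIN_TURNAROUND <= dep_t]
        if not feasible:
            continue
        # Compact choice: the latest feasible arrival minimises dwell time.
        i_min, t_min, d_min = max(feasible, key=lambda x: x[1])
        # Prefer a same-direction arrival if its extra dwell is within sigma.
        same = [f for f in feasible if f[2] == dep_dir and t_min - f[1] <= sigma]
        chosen = max(same, key=lambda x: x[1]) if same else (i_min, t_min, d_min)
        plan[j] = chosen[0]
        free = [(i, a) for i, a in free if i != chosen[0]]
    return plan

arr = [(100, 'north'), (110, 'south')]
dep = [(140, 'north'), (150, 'south')]
print(plan_connections(arr, dep))  # {0: 0, 1: 1}: some dwell traded for same-direction links
```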
Abstract:
Since the establishment of the first national strategic development plan in the early 1970s, the construction industry has played an important role in the economic, social and cultural development of Indonesia. The industry's contribution to Indonesia's gross domestic product (GDP) increased from 3.9% in 1973 to 7.7% in 2007. Business Monitoring International (2009) forecasts that Indonesia is home to one of the fastest-growing construction industries in Asia, despite the average construction growth rate being expected to remain under 10% over the period 2006–2010. Similarly, Howlett and Powell (2006) place Indonesia among the 20 largest construction markets in 2010. Although the prospects for the Indonesian construction industry are now very promising, many local construction firms still face serious difficulties, such as poor performance and low competitiveness. There are two main reasons behind this problem: one is that the environment these firms face is not favourable; the other is a lack of strategic direction to improve competitiveness and performance. Furthermore, although strategic management is now widely used by many large construction firms in developed countries, practical examples and empirical studies related to the Indonesian construction industry remain scarce. In addition, research endeavours related to these topics in developing countries appear to be limited. This has potentially become one of the factors hampering efforts to guide Indonesian construction enterprises. This research aims to construct a conceptual model to enable Indonesian construction enterprises to develop a sound long-term corporate strategy that generates competitive advantage and superior performance. The conceptual model seeks to address the main prescription of the dynamic capabilities framework (Teece, Pisano & Shuen, 1997; Teece, 2007) within the context of the Indonesian construction industry. It is hypothesised that in a rapidly changing and varied environment, competitive success arises from the continuous development and reconfiguration of firm-specific assets: achieving competitive advantage depends not only on the exploitation of specific assets/capabilities, but on the exploitation of all of the asset and capability combinations in the dynamic capabilities framework. The model is refined through sequential statistical regression analyses of survey results with a sample of 120 valid responses. The results of this study provide empirical evidence that competitive advantage achieved via the implementation of a dynamic capabilities framework is an important way for a construction enterprise to improve its organisational performance. The characteristics of asset-capability combinations were found to be significant determinants of the competitive advantage of Indonesian construction enterprises, and such advantage in turn contributes to organisational performance. If the dynamic capabilities framework can work in the context of Indonesia, it has potential applicability in other emerging and developing countries. This study also demonstrates the importance of the multi-stage nature of the model, which provides a rich understanding of the dynamic process by which assets and capabilities should be exploited in combination by construction firms operating in varying levels of environmental hostility.
Such findings are believed to be useful to both academics and practitioners. However, as this research applies the dynamic capabilities framework at the enterprise level, future studies should continue to explore and examine the framework at other levels of strategic management in construction, as well as in other countries where different cultures or similar conditions prevail.
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions used to discriminate between them, selecting the optimal forecasting model is clearly challenging. The aim of this thesis is to thoroughly investigate how effective commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they can all identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies: QLIKE is identified as the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions' ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
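The abstract names the loss functions but not their exact specifications; the sketch below uses standard textbook forms of the three discriminating losses to rank two hypothetical covariance forecasts against a realised proxy (all matrices are made up for illustration, and the thesis's definitions may differ in detail):

```python
import numpy as np

def mse_loss(sigma, h):
    """Frobenius MSE between realised covariance sigma and forecast h."""
    return np.sum((sigma - h) ** 2)

def qlike_loss(sigma, h):
    """Multivariate QLIKE, log|H| + tr(H^{-1} Sigma), up to a constant."""
    return np.log(np.linalg.det(h)) + np.trace(np.linalg.inv(h) @ sigma)

def portfolio_variance_loss(sigma, h):
    """Realised variance of the minimum-variance portfolio implied by forecast H."""
    ones = np.ones(h.shape[0])
    w = np.linalg.solve(h, ones)
    w /= ones @ w                     # global minimum-variance weights from the forecast
    return w @ sigma @ w              # evaluated against the realised covariance

sigma = np.array([[1.0, 0.3], [0.3, 2.0]])   # realised proxy
h1 = np.array([[1.1, 0.25], [0.25, 1.9]])    # forecast A: close to the proxy
h2 = np.array([[0.8, 0.6], [0.6, 2.5]])      # forecast B: mis-specified
for name, loss in [("MSE", mse_loss), ("QLIKE", qlike_loss), ("PVar", portfolio_variance_loss)]:
    print(name, loss(sigma, h1), loss(sigma, h2))  # each loss prefers forecast A here
```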
A longitudinal study of corporate earnings guidance in Australia’s continuous disclosure environment
Abstract:
Since the introduction of a statutory-backed continuous disclosure regime (CDR) in 1994, regulatory reforms have significantly increased litigation risk in Australia for failure to disclose material information or for false and misleading disclosure. However, there is almost no empirical research on the impact of the reforms on corporate disclosure behaviour. Motivated by this absence of research and using management earnings forecasts (MEFs) as a disclosure proxy, this study examines (1) why managers issue earnings forecasts, (2) what firm-specific factors influence MEF characteristics, and (3) how MEF behaviour changes as litigation risk increases. Based on theories in information economics, a theoretical framework for MEF behaviour is formulated which includes antecedent influencing factors related to firms' internal and external environments. Applying this framework, hypotheses are developed and tested using multivariate models and a large sample of hand-collected MEFs (7,213) issued by top-500 ASX-listed companies over the 1994 to 2008 period. The results reveal strong support for the hypotheses. First, MEFs are issued to reduce information asymmetry and litigation risk, and to signal superior performance. Second, firms with better financial performance, smaller earnings changes and lower operating uncertainty provide better-quality MEFs. Third, forecast frequency and quality (accuracy, timeliness and precision) noticeably improve as litigation risk increases. However, managers appear to remain reluctant to disclose earnings forecasts when there are large earnings changes, and an asymmetric treatment of news type continues to prevail (a good-news bias). Thus, the findings generally support the effectiveness of the CDR regulatory reforms in improving disclosure behaviour and will be valuable to market participants and corporate regulators in understanding the implications of management forecasting decisions and areas for further improvement.
Abstract:
Purpose – The purpose of this paper is to jointly assess the impact of regulatory reform for corporate fundraising in Australia (CLERP Act 1999) and the relaxation of ASX admission rules in 1999 on the accuracy of management earnings forecasts in initial public offer (IPO) prospectuses. The relaxation of ASX listing rules permitted a new category of new-economy firms (commitments test entities (CTEs)) to list without a prior history of profitability, while the CLERP Act (introduced in 2000) was accompanied by tighter disclosure obligations and stronger enforcement action by the corporate regulator (ASIC). Design/methodology/approach – All IPO earnings forecasts in prospectuses lodged between 1998 and 2003 are examined to assess the pre- and post-CLERP Act impact. Based on active ASIC enforcement action in the post-reform period, IPO firms are hypothesised to provide more accurate forecasts, particularly CTE firms, which are less likely to have a reasonable basis for forecasting. Research models are developed to empirically test the impact of the reforms on CTE and non-CTE IPO firms. Findings – The new regulatory environment has had a positive impact on management forecasting behaviour. In the post-CLERP Act period, the accuracy of prospectus forecasts and their revisions significantly improved and, as expected, the results are primarily driven by CTE firms. However, the majority of prospectus forecasts continue to be materially inaccurate. Originality/value – The results highlight the need to control for both the changing nature of listed firms and the level of enforcement action when examining responses to regulatory changes to corporate fundraising activities.
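The abstract does not state the accuracy metric used; a common convention in this literature, shown here as an assumption rather than the paper's definition, is the forecast error scaled by the absolute value of the forecast:

```python
def forecast_errors(actual, forecast):
    """Signed and absolute forecast error, deflated by |forecast|.

    This deflator is a common convention in the earnings-forecast
    literature; the paper's exact specification may differ.
    """
    fe = (actual - forecast) / abs(forecast)
    return fe, abs(fe)

# Hypothetical example: a prospectus forecasts profit of $10.0m; actual is $8.5m.
fe, afe = forecast_errors(actual=8.5, forecast=10.0)
print(f"FE = {fe:.1%}, AFE = {afe:.1%}")  # FE = -15.0%, AFE = 15.0%
```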
Abstract:
The global financial crisis (GFC) impacted all world economies throughout 2008 and 2009. This impact has not been confined to the finance industries but has had a direct and indirect impact on the property industry worldwide, from both an ownership and an investment perspective. Property markets have experienced various levels of impact from this event, but universally the greatest impact has been on the traditional commercial and industrial property sectors from the investor perspective, with investment and superannuation funds reporting significant declines in the reported value of these investments. Beyond this very direct impact of declining property markets, the GFC has also had a very significant indirect impact on the various property professions and how these professions now operate in a declining property market. Of particular interest is the comparison of the property market forecasts made in late 2007 with the actual results in 2008/2009.
Abstract:
Purpose – Managers generally have discretion in determining how components of earnings are presented in financial statements in distinguishing between 'normal' earnings and items classified as unusual, special, significant, exceptional or abnormal. Prior research has found that such intra-period classificatory choice is used as a form of earnings management. Prior to 2001, Australian accounting standards mandated that unusually large items of revenue and expense be classified as 'abnormal items' for financial reporting, but this classification was removed from accounting standards from 2001. This move by the regulators was partly in response to concerns that the abnormal classification was being used opportunistically to manage reported pre-abnormal earnings. This study extends the earnings management literature by examining the reporting of abnormal items for evidence of intra-period classificatory earnings management in the unique Australian setting. Design/methodology/approach – This study investigates associations between the reporting of abnormal items and incentives in the form of analyst following and the earnings benchmarks of analysts' forecasts, earnings levels, and earnings changes, for a sample of Australian top-500 firms for the seven-year period from 1994 to 2000. Findings – The findings suggest there are systematic differences between firms reporting abnormal items and those with no abnormal items. Results show evidence that, on average, firms shifted expense items from pre-abnormal earnings to bottom-line net income through reclassification as abnormal losses. Originality/value – These findings suggest that the standard setters were justified in removing the 'abnormal' classification from the accounting standard. However, it cannot be assumed that all firms acted opportunistically in the classification of items as abnormal. With the removal of the standardised classification of items outside normal operations as 'abnormal', firms lost the opportunity to use such disclosures as a signalling device, with the consequential effect of limiting the scope for effectively communicating information about the nature of items presented in financial reports.
Abstract:
Forecasts generated by time series models traditionally place greater weight on more recent observations. This paper develops an alternative semi-parametric method for forecasting that does not rely on this convention and applies it to the problem of forecasting asset return volatility. In this approach, a forecast is a weighted average of historical volatility, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in volatility across time (as a measure of market conditions) by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are significantly more accurate than a number of competing approaches at both short and long forecast horizons.
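The abstract specifies the weighting idea but not its implementation details; the following is a minimal sketch, assuming a Gaussian kernel over the trailing window of volatilities as the measure of "similar market conditions" (the window length and bandwidth are illustrative choices, not the paper's):

```python
import numpy as np

def kernel_forecast(vol, window=5, bandwidth=0.005):
    """Semi-parametric volatility forecast (illustrative sketch).

    The forecast is a weighted average of historical volatilities, with
    weights from a Gaussian kernel comparing each past short-term trend
    (the trailing `window` observations) to the current one.
    """
    current = vol[-window:]
    weights, targets = [], []
    for t in range(window, len(vol)):
        past = vol[t - window:t]                      # trend that preceded vol[t]
        dist2 = np.sum((past - current) ** 2)
        weights.append(np.exp(-dist2 / (2 * bandwidth ** 2)))
        targets.append(vol[t])                        # volatility that followed that trend
    weights = np.asarray(weights)
    return np.dot(weights, targets) / weights.sum()

rng = np.random.default_rng(0)
vol = np.abs(rng.normal(0.01, 0.003, 500))            # toy daily volatility series
print(kernel_forecast(vol))                           # one-step-ahead forecast
```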
Abstract:
Forecasts of volatility and correlation are important inputs into many practical financial problems. Broadly speaking, there are two ways of generating forecasts of these variables. First, time-series models apply a statistical weighting scheme to historical measurements of the variable of interest. The alternative methodology extracts forecasts from the market-traded value of option contracts. An efficient options market should be able to produce superior forecasts, as it utilises a larger information set comprising not only historical information but also the equilibrium expectations of options market participants. While much research has been conducted into the relative merits of these approaches, this thesis extends the literature along several lines through three empirical studies. First, it is demonstrated that there are statistically significant benefits to adjusting implied volatility for the volatility risk premium for the purposes of univariate volatility forecasting. Second, high-frequency option-implied measures are shown to lead to superior forecasts of the stochastic component of intraday volatility, which in turn lead to superior forecasts of total intraday volatility. Finally, the use of realised and option-implied measures of equicorrelation is shown to dominate measures based on daily returns.
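The thesis's risk-premium adjustment is not spelled out in the abstract; one simple illustrative estimator, assumed here purely for the sketch, subtracts the historical mean gap between implied variance and the realised variance that followed it:

```python
import numpy as np

def premium_adjusted_forecast(implied_var, realised_var):
    """Adjust the latest implied variance for the volatility risk premium.

    The premium is estimated as the mean historical gap between implied
    variance and the subsequent realised variance (an illustrative
    estimator; the thesis's adjustment may be more refined).
    """
    premium = np.mean(implied_var[:-1] - realised_var[1:])   # implied at t vs realised over t+1
    return implied_var[-1] - premium

implied = np.array([0.040, 0.042, 0.039, 0.045])    # option-implied variances (toy data)
realised = np.array([0.035, 0.033, 0.036, 0.037])   # realised variances in the following period
print(premium_adjusted_forecast(implied, realised)) # 0.045 less the average premium of 0.005
```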
Abstract:
We review the literature on the impact of litigation risk (a form of external governance) on corporate prospective disclosure decisions as reflected in management earnings forecasts (MEFs). From this analysis we identify four key areas for future research. First, litigation risk warrants more attention from researchers; currently it tends to be treated as a secondary factor impacting MEF decisions. Second, it would be informative from a governance perspective for researchers to explore why litigation risk has a differential impact on MEF decisions across countries. Third, understanding the interaction between litigation risk and forecast/firm-specific characteristics is important from management, investor and regulatory perspectives but is currently under-explored. Last, research on the link between litigation risk and MEF attributes is piecemeal and incomplete, requiring more integrated and expanded analysis.
Abstract:
Knowledge about customers is vital for supply chains in order to ensure customer satisfaction. In an ideal supply chain environment, supply chain partners are able to perform planning tasks collaboratively because they share information. However, customers are not always able or willing to share information with their suppliers. End consumers, on the one hand, do not usually provide a retail company with demand information; on the other hand, industrial customers might consciously hide information. Wherever a supply chain is not provided with demand forecast information, it needs to derive these demand forecasts by other means. Customer Relationship Management (CRM) provides a set of tools to overcome this informational uncertainty. We show how CRM and Supply Chain Management (SCM) information can be integrated on the conceptual as well as the technical level in order to provide supply chain managers with relevant information.
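As a hedged illustration of the technical side of such an integration (the paper's actual architecture is not described in the abstract, and all column names below are hypothetical), CRM order history can be aggregated into a demand series from which a simple smoothing forecast is derived:

```python
import pandas as pd

# Hypothetical CRM order history; in practice this would come from the CRM system.
crm_orders = pd.DataFrame({
    "customer": ["A", "A", "B", "A", "B", "B"],
    "month":    ["2024-01", "2024-02", "2024-02", "2024-03", "2024-03", "2024-04"],
    "units":    [100, 120, 80, 110, 90, 95],
})

# Aggregate customer-level orders into the monthly demand series the SCM side needs...
demand = crm_orders.groupby("month")["units"].sum().sort_index()

# ...and derive a next-period forecast by simple exponential smoothing.
alpha, forecast = 0.4, float(demand.iloc[0])
for d in demand.iloc[1:]:
    forecast = alpha * d + (1 - alpha) * forecast
print(f"next-month demand forecast: {forecast:.0f} units")
```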
Abstract:
The occurrence of extreme movements in the spot price of electricity represents a significant source of risk to retailers. A range of models has been considered for electricity prices; these models, however, have relied on time-series approaches, which typically use restrictive decay schemes that place greater weight on more recent observations. This study develops an alternative, semi-parametric method for forecasting, which uses state-dependent weights derived from a kernel function. The forecasts obtained using this method are accurate and therefore potentially useful to electricity retailers in terms of risk management.
Abstract:
An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problems is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are 'doubly stochastic', i.e., obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches to the problem either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render complex spatio-temporal forecasts interpretable. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a discretized grid of large scale and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
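A minimal sketch of the doubly stochastic structure on a discretised grid may help: the first layer draws a latent Gaussian field, the second draws Poisson counts given the exponentiated field. This is simulation only, with an assumed squared-exponential covariance and length scale; the paper's inference procedure is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20                                            # 20 x 20 spatial grid (illustrative)
xs = np.linspace(0, 1, n)
xx, yy = np.meshgrid(xs, xs)
pts = np.column_stack([xx.ravel(), yy.ravel()])   # grid-cell centres

# Squared-exponential covariance between cells (length scale 0.2 assumed);
# the jitter term keeps the Cholesky factorisation numerically stable.
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / (2 * 0.2 ** 2)) + 1e-8 * np.eye(n * n)

# Layer 1: latent Gaussian field g; Layer 2: Poisson counts given exp(g).
g = np.linalg.cholesky(K) @ rng.standard_normal(n * n)
intensity = np.exp(g)                             # cell-wise intensity of the Cox process
counts = rng.poisson(intensity).reshape(n, n)     # simulated incident counts per cell
print(counts.sum(), "incidents simulated")
```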
Abstract:
This article presents new theoretical and empirical evidence on the forecasting ability of prediction markets. We develop a model that predicts that the time until expiration of a prediction market should negatively affect the accuracy of prices as a forecasting tool, in the direction of a 'favourite/longshot bias': high-likelihood events are underpriced, and low-likelihood events are overpriced. We confirm this result using a large data set of prediction market transaction prices. Prediction markets are reasonably well calibrated when time to expiration is relatively short, but prices are significantly biased for events farther in the future. When the time value of money is considered, the miscalibration can be exploited to earn excess returns only when the trader has a relatively low discount rate.
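Calibration of this kind can be checked by binning contracts on price and comparing each bin's mean price with the empirical frequency of the event occurring; the sketch below builds such a table on synthetic data with a deliberately planted longshot bias (the data-generating rule is an assumption for illustration, not the paper's data):

```python
import numpy as np

def calibration_table(prices, outcomes, n_bins=5):
    """Mean contract price vs empirical outcome frequency per price bin.

    Under the favourite/longshot bias, low-price bins resolve 'yes' less
    often than their price implies, and high-price bins more often.
    """
    bins = np.linspace(0, 1, n_bins + 1)
    rows = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (prices >= lo) & (prices < hi)
        if mask.any():
            rows.append((lo, hi, prices[mask].mean(), outcomes[mask].mean()))
    return rows

rng = np.random.default_rng(2)
prices = rng.uniform(0.05, 0.95, 10_000)
# Synthetic longshot bias: true probabilities are more extreme than prices.
true_p = np.clip(1.5 * prices - 0.25, 0.0, 1.0)
outcomes = (rng.random(10_000) < true_p).astype(float)
for lo, hi, p, f in calibration_table(prices, outcomes):
    print(f"[{lo:.1f}, {hi:.1f}): mean price {p:.2f} vs frequency {f:.2f}")
```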