54 results for Forecasts


Relevance: 10.00%

Abstract:

Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure, such as maintenance, rehabilitation and construction works, can pose risks and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concepts of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts. For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified the use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
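
The statistical probability approach that the report ranks highest lends itself to a simple illustration. The sketch below is a minimal Monte Carlo risk assessment of a single hypothetical proposal; the distributions, parameter values and the benefit-cost criterion are assumptions for illustration, not figures from the report.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo draws

# Hypothetical forecast distributions for one infrastructure proposal:
# capital cost is right-skewed (overruns more likely than underruns),
# present-value benefits are uncertain around a central forecast.
cost = rng.lognormal(mean=np.log(100.0), sigma=0.25, size=n)   # $m
benefit = rng.normal(loc=130.0, scale=30.0, size=n)            # $m

bcr = benefit / cost  # benefit-cost ratio in each simulated future

print(f"Mean BCR: {bcr.mean():.2f}")
print(f"P(BCR < 1): {(bcr < 1).mean():.1%}")   # chance the project fails the test
print(f"5th-95th percentile BCR: {np.percentile(bcr, [5, 95])}")
```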

Relevance: 10.00%

Abstract:

In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers' behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users' actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The time period rarely affects navigational queries, while rates for transactional queries vary across periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
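
The paper does not reproduce its model specification here, but a one-step prediction with a transfer-function input can be sketched with statsmodels' SARIMAX, using an exogenous query-volume series as the input; all series, frequencies and model orders below are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical hourly series derived from a search log: total query volume as
# the input, sponsored-link clicks as the output to be predicted.
rng = np.random.default_rng(0)
hours = pd.date_range("2024-01-01", periods=240, freq="h")
queries = pd.Series(1000 + 200 * np.sin(2 * np.pi * np.arange(240) / 24)
                    + rng.normal(0, 30, 240), index=hours)
clicks = 0.1 * queries + rng.normal(0, 10, 240)   # roughly 10% sponsored clicks

# ARMA(1,1) for the clicks series with query volume entering as an exogenous
# (transfer-function style) input.
res = SARIMAX(clicks, exog=queries, order=(1, 0, 1)).fit(disp=False)

# In-sample one-step-ahead predictions for the final day, as in phase two.
pred = res.get_prediction(start=len(clicks) - 24)
print(pred.predicted_mean.head())
```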

Relevance: 10.00%

Abstract:

The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML of waste water per day (Douglas Partners, 2004) containing high levels of dissolved ions. At present a series of treatment ponds are used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from this. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer comprised of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments, for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed based on bore monitoring and rainfall data, using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered, with the alluvium represented as a single hydrogeological unit. Both models were calibrated to steady-state conditions, and sensitivity analyses of the parameters demonstrated that both models are very stable for changes in the range of ±10% for all parameters and still reasonably stable for changes up to ±20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site: water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, while the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady-state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short-term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken, it is necessary to establish a regular program of groundwater monitoring and maintain a long-term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores required for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by undertaking pump testing to investigate hydrogeological properties in the aquifer.
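
As a rough illustration of the two-layer steady-state model described above, the sketch below uses the open-source FloPy wrapper for MODFLOW-2005 rather than the GMS 6.5/PEST workflow the study actually used; the grid dimensions, conductivities and boundary values are invented stand-ins.

```python
import numpy as np
import flopy

# Hypothetical 2-layer steady-state grid standing in for the Josephville plain:
# layer 1 = low-permeability silts/clays, layer 2 = coarse sand/gravel aquifer.
nlay, nrow, ncol = 2, 40, 40
mf = flopy.modflow.Modflow("josephville_sketch", exe_name="mf2005")
dis = flopy.modflow.ModflowDis(mf, nlay=nlay, nrow=nrow, ncol=ncol,
                               delr=50.0, delc=50.0,
                               top=20.0, botm=[9.0, 0.0],
                               nper=1, steady=True)
bas = flopy.modflow.ModflowBas(mf, ibound=np.ones((nlay, nrow, ncol), dtype=int),
                               strt=15.0)
# Hydraulic conductivity: semi-confining upper unit vs productive basal aquifer.
lpf = flopy.modflow.ModflowLpf(mf, hk=[0.01, 30.0], vka=[0.001, 3.0])
# River boundary in the basal layer along one edge:
# [layer, row, col, stage, conductance, river-bottom elevation]
riv_cells = [[1, r, 0, 14.0, 100.0, 10.0] for r in range(nrow)]
riv = flopy.modflow.ModflowRiv(mf, stress_period_data={0: riv_cells})
rch = flopy.modflow.ModflowRch(mf, rech=1e-4)   # steady rainfall recharge
pcg = flopy.modflow.ModflowPcg(mf)
oc = flopy.modflow.ModflowOc(mf)

mf.write_input()
# mf.run_model()  # requires a MODFLOW-2005 executable on the path
```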

Relevance: 10.00%

Abstract:

In recent years the air transport industry has experienced unprecedented growth, driven by strong local and global economies. Whether this growth can continue in the face of anticipated oil crises, uncertain international economic forecasts and recent influenza outbreaks is yet to be seen. One thing is certain: airport owners and operators will continue to be faced with challenging environments in which to do business. In response, many airports recognise the value in diversifying their revenue streams through a variety of landside property developments within the airport boundary. In Australia it is the type and intended market of this development that is a point of contention between private airport corporations and their surrounding municipalities. The aim of this preliminary research is to identify and categorise on-airport development occurring at the twenty-two privatised Australian airports administered under the Airports Act 1996. This new knowledge will assist airport and municipal planners in understanding the current extent and category of on-airport land use, allowing them to make better decisions when proposing development both within airport master plans and beyond the airport boundary in local town and municipal plans.

Relevance: 10.00%

Abstract:

The main objective of this PhD was to further develop Bayesian spatio-temporal models (specifically the Conditional Autoregressive (CAR) class of models) for the analysis of sparse disease outcomes such as birth defects. The motivation for the thesis arose from problems encountered when analyzing a large birth defect registry in New South Wales. The specific components and related research objectives of the thesis were developed from gaps in the literature on current formulations of the CAR model, and from health service planning requirements. Data from a large probabilistically-linked database from 1990 to 2004, consisting of fields from two separate registries, the Birth Defect Registry (BDR) and the Midwives Data Collection (MDC), were used in the analyses in this thesis. The main objective was split into smaller goals. The first goal was to determine how the specification of the neighbourhood weight matrix affects the smoothing properties of the CAR model, and this is the focus of chapter 6. Secondly, I evaluated the usefulness of incorporating a zero-inflated Poisson (ZIP) component as well as a shared-component model for modelling a sparse outcome, and this is carried out in chapter 7. The third goal was to identify optimal sampling and sample size schemes designed to select individual-level data for a hybrid ecological spatial model, and this is done in chapter 8. Finally, I put together the earlier improvements to the CAR model and, along with demographic projections, provide forecasts for birth defects at the Statistical Local Area (SLA) level; chapter 9 describes how this is done. For the first objective, I examined a series of neighbourhood weight matrices, and showed how smoothing the relative risk estimates according to similarity on an important covariate (i.e. maternal age) helped improve the model's ability to recover the underlying risk, as compared to the traditional adjacency (specifically the Queen) method of applying weights. Next, to address the sparseness and excess zeros commonly encountered in the analysis of rare outcomes such as birth defects, I compared several models, including an extension of the usual Poisson model to encompass excess zeros in the data. This was achieved via a mixture model, which also encompassed the shared-component model to improve the estimation of sparse counts by borrowing strength across a shared component (e.g. latent risk factor/s) with the referent outcome (caesarean section was used in this example). Using the Deviance Information Criterion (DIC), I showed how the proposed model performed better than the usual models, but only when both outcomes shared a strong spatial correlation. The next objective involved identifying the optimal sampling and sample size strategy for incorporating individual-level data with areal covariates in a hybrid study design. I performed extensive simulation studies, evaluating thirteen different sampling schemes along with variations in sample size. This was done in the context of an ecological regression model that incorporated spatial correlation in the outcomes, as well as accommodating both individual and areal measures of covariates. Using the average mean squared error (AMSE), I showed how a simple random sample of 20% of the SLAs, followed by selecting all cases in the SLAs chosen along with an equal number of controls, provided the lowest AMSE. The final objective involved combining the improved spatio-temporal CAR model with population (i.e. women) forecasts, to provide 30-year annual estimates of birth defects at the SLA level in New South Wales, Australia. The projections were illustrated using sixteen different SLAs, representing the various areal measures of socio-economic status and remoteness. A sensitivity analysis of the assumptions used in the projection was also undertaken. By the end of the thesis, I show how challenges in the spatial analysis of rare diseases such as birth defects can be addressed by specifically formulating the neighbourhood weight matrix to smooth according to a key covariate (i.e. maternal age), incorporating a ZIP component to model excess zeros in outcomes, and borrowing strength from a referent outcome (i.e. caesarean counts). An efficient strategy to sample individual-level data and sample size considerations for rare diseases are also presented. Finally, projections in birth defect categories at the SLA level are made.
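
A schematic statement of the ZIP mixture with a CAR spatial effect and a shared component, consistent with the description above, though the exact parameterisation used in the thesis is an assumption:

```latex
% y_i = birth-defect count in SLA i, E_i = expected count,
% \pi_i = excess-zero probability, \delta_0 = point mass at zero.
\begin{align*}
y_i &\sim \pi_i\,\delta_0 + (1-\pi_i)\,\mathrm{Poisson}\!\left(E_i\,e^{\eta_i}\right),\\
\eta_i &= \beta_0 + \beta_1 x_i + \phi_i + \lambda\,\psi_i,\\
\phi_i \mid \phi_{-i} &\sim \mathcal{N}\!\left(\frac{\sum_j w_{ij}\phi_j}{\sum_j w_{ij}},\;
\frac{\sigma^2_{\phi}}{\sum_j w_{ij}}\right).
\end{align*}
```

Here the w_ij are the neighbourhood weights (adjacency or covariate-similarity), phi is the outcome-specific CAR effect, and psi is the component shared with the referent outcome (caesarean counts), scaled by lambda.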

Relevance: 10.00%

Abstract:

Much research has investigated the differences between option-implied volatilities and econometric model-based forecasts. Implied volatility is a market-determined forecast, in contrast to model-based forecasts that employ some degree of smoothing of past volatility to generate forecasts. Implied volatility therefore has the potential to reflect information that a model-based forecast could not. This paper considers two issues relating to the informational content of the S&P 500 VIX implied volatility index: first, whether it subsumes information on how historical jump activity contributed to price volatility; and second, whether the VIX reflects any incremental information pertaining to future jump activity relative to model-based forecasts. It is found that the VIX index both subsumes information relating to past jump contributions to total volatility and reflects incremental information pertaining to future jump activity. This issue has not been examined previously and expands our understanding of how option markets form their volatility forecasts.
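
One hedged way to operationalise the paper's second question is to regress a future jump measure on the VIX and a model-based forecast and test the VIX coefficient. The sketch below assumes a hypothetical data file with return, VIX and realized-jump columns, and substitutes a GARCH(1,1) fit from the arch package for whichever econometric model the paper used.

```python
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

# Hypothetical daily data: S&P 500 returns, VIX levels and a realized jump
# measure (e.g. realized variance minus bipower variation, floored at zero).
df = pd.read_csv("spx_vix_jumps.csv", parse_dates=["date"], index_col="date")

# Model-based volatility forecast: in-sample one-step-ahead GARCH(1,1) vol.
garch = arch_model(100 * df["ret"], vol="GARCH", p=1, q=1).fit(disp="off")
df["garch_fcst"] = garch.conditional_volatility

# Does the VIX carry incremental information about future jump activity
# beyond the model-based forecast? Regress next-period jumps on both.
y = df["jump"].shift(-1).dropna()
X = sm.add_constant(df.loc[y.index, ["vix", "garch_fcst"]])
res = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 5})
print(res.summary())  # a significant VIX coefficient suggests incremental content
```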

Relevance: 10.00%

Abstract:

At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied. To accommodate the simultaneity of fatality and injury crash outcomes, simultaneous estimation of the models was conducted. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities. However, caution must be exercised with such models.
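
A minimal sketch of the single-equation version of such a model, using negative binomial regression in statsmodels (the paper estimated the fatal and injury equations simultaneously, which this sketch does not attempt); the data file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical traffic-analysis-zone (TAZ) data mirroring the paper's covariates.
taz = pd.read_csv("taz_crashes.csv")
X = sm.add_constant(taz[["pop_density", "pct_under_18", "intersection_density"]])

# Negative binomial regression accommodates overdispersed crash counts.
fatal = sm.GLM(taz["fatal_crashes"], X,
               family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(fatal.summary())

# Forecast fatal crashes for a hypothetical future-year scenario in which all
# zone inputs grow by 10%.
future = sm.add_constant(taz[["pop_density", "pct_under_18",
                              "intersection_density"]] * 1.1)
taz["fatal_forecast"] = fatal.predict(future)
```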

Relevance: 10.00%

Abstract:

Harmful Algal Blooms (HABs) have become an important environmental concern along the western coast of the United States. Toxic and noxious blooms adversely impact the economies of coastal communities in the region, pose risks to human health, and cause mortality events that have resulted in the deaths of thousands of fish, marine mammals and seabirds. One goal of field-based research efforts on this topic is the development of predictive models of HABs that would enable rapid response, mitigation and ultimately prevention of these events. In turn, these objectives are predicated on understanding the environmental conditions that stimulate these transient phenomena. An embedded sensor network (Fig. 1), under development in the San Pedro Shelf region off the Southern California coast, is providing tools for acquiring chemical, physical and biological data at high temporal and spatial resolution to help document the emergence and persistence of HAB events, support the design and testing of predictive models, and provide contextual information for experimental studies designed to reveal the environmental conditions promoting HABs. The sensor platforms contained within this network include pier-based sensor arrays, ocean moorings and HF radar stations, along with mobile sensor nodes in the form of surface and subsurface autonomous vehicles. Freewave™ radio modems facilitate network communication and form a minimally intrusive, wireless communication infrastructure throughout the Southern California coastal region, allowing rapid and cost-effective data transfer. An emerging focus of this project is the incorporation of a predictive ocean model that assimilates near-real-time, in situ data from deployed Autonomous Underwater Vehicles (AUVs). The model assimilates these data to increase the skill of both nowcasts and forecasts, thus providing insight into bloom initiation as well as the movement of blooms or other oceanic features of interest (e.g., thermoclines, fronts, river discharge, etc.). From these predictions, deployed mobile sensors can be tasked to track a designated feature. This focus has led to the creation of a technology chain in which algorithms are being implemented for innovative trajectory design for AUVs. Such intelligent mission planning is required to maneuver a vehicle to the precise depths and locations that are the sites of active blooms, or of physical/chemical features that might be sources of bloom initiation or persistence. The embedded network yields high-resolution temporal and spatial measurements of pertinent environmental parameters and the resulting biology (see Fig. 1). Supplementing this with ocean current information, remotely sensed imagery and meteorological data, we obtain a comprehensive foundation for developing a fundamental understanding of HAB events. This then directs labor-intensive and costly sampling efforts and analyses. Additionally, we provide coastal municipalities, managers and state agencies with detailed information to aid their efforts in providing responsible environmental stewardship of their coastal waters.
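
The assimilation step can be illustrated schematically with a scalar Kalman update, though the operational system uses a full ocean circulation model rather than this toy; all numbers are invented.

```python
# Schematic scalar Kalman update: how assimilating one in situ AUV observation
# sharpens a model nowcast of, say, chlorophyll concentration at a station.

x_prior, p_prior = 4.0, 2.0   # model nowcast (ug/L) and its error variance
z, r = 6.5, 0.5               # AUV observation and its error variance

k = p_prior / (p_prior + r)            # Kalman gain: weight on the observation
x_post = x_prior + k * (z - x_prior)   # updated (analysis) estimate
p_post = (1 - k) * p_prior             # reduced uncertainty after assimilation

print(f"nowcast {x_prior:.2f} -> analysis {x_post:.2f}, "
      f"variance {p_prior:.2f} -> {p_post:.2f}")
```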

Relevance: 10.00%

Abstract:

Since the establishment of the first national strategic development plan in the early 1970s, the construction industry has played an important role in the economic, social and cultural development of Indonesia. The industry's contribution to Indonesia's gross domestic product (GDP) increased from 3.9% in 1973 to 7.7% in 2007. Business Monitoring International (2009) forecasts that Indonesia is home to one of the fastest-growing construction industries in Asia, despite the average construction growth rate being expected to remain under 10% over the period 2006-2010. Similarly, Howlett and Powell (2006) place Indonesia as one of the 20 largest construction markets in 2010. Although the prospects for the Indonesian construction industry are now very promising, many local construction firms still face serious difficulties, such as poor performance and low competitiveness. There are two main reasons behind this problem: one is that the environment they face is not favourable; the other is the lack of strategic direction to improve competitiveness and performance. Furthermore, although strategic management has now become more widely used by many large construction firms in developed countries, practical examples and empirical studies related to the Indonesian construction industry remain scarce. In addition, research endeavours related to these topics in developing countries appear to be limited. This has potentially become one of the factors hampering efforts to guide Indonesian construction enterprises. This research aims to construct a conceptual model to enable Indonesian construction enterprises to develop a sound long-term corporate strategy that generates competitive advantage and superior performance. The conceptual model seeks to address the main prescription of a dynamic capabilities framework (Teece, Pisano & Shuen, 1997; Teece, 2007) within the context of the Indonesian construction industry. It is hypothesised that in a rapidly changing and varied environment, competitive success arises from the continuous development and reconfiguration of a firm's specific assets: achieving competitive advantage is not only dependent on the exploitation of specific assets/capabilities, but on the exploitation of all of the asset and capability combinations in the dynamic capabilities framework. Thus, the model is refined through sequential statistical regression analyses of survey results, with a sample size of 120 valid responses. The results of this study provide empirical evidence in support of the notion that competitive advantage achieved via the implementation of a dynamic capabilities framework is an important way for a construction enterprise to improve its organisational performance. The characteristics of asset-capability combinations were found to be significant determinants of the competitive advantage of Indonesian construction enterprises, and such advantage sequentially contributes to organisational performance. If a dynamic capabilities framework can work in the context of Indonesia, it suggests that the framework has potential applicability in other emerging and developing countries. This study also demonstrates the importance of the multi-stage nature of the model, which provides a rich understanding of the dynamic process by which assets and capabilities should be exploited in combination by construction firms operating at varying levels of hostility. Such findings are believed to be useful to both academics and practitioners; however, as this research represents a dynamic capabilities framework at the enterprise level, future studies should continue to explore and examine the framework at other levels of strategic management in construction, as well as in other countries where different cultures or similar conditions prevail.

Relevance: 10.00%

Abstract:

Numerous econometric models have been proposed for forecasting property market performance, but limited success has been achieved in finding a reliable and consistent model to predict property market movements over a five- to ten-year timeframe. This research focuses on office rental growth forecasts and reviews many of the office rent models that have evolved over the past 20 years. A model by DiPasquale and Wheaton is selected for testing in the Brisbane, Australia office market. The adaptation of this study did not provide explanatory variables that could assist in developing a reliable, predictive model of office rental growth. In light of this result, the paper suggests a system dynamics framework that includes an econometric model based on historical data as well as user-input guidance for the primary variables. The rent forecast outputs would be assessed having regard to market expectations, with probability profiling undertaken for use in simulation exercises. The paper concludes with ideas for ongoing research.
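
The system dynamics framework the paper proposes can be caricatured with a DiPasquale-Wheaton style rent adjustment loop, in which rents respond to the gap between actual and natural vacancy; the parameter values and feedback strengths below are purely illustrative.

```python
# Minimal stock-flow sketch of a rent adjustment process in the spirit of
# DiPasquale and Wheaton. All parameter values are hypothetical.

natural_vacancy = 0.08   # equilibrium vacancy rate
adjust_speed = 0.4       # rent response per unit of vacancy gap

rent, vacancy = 400.0, 0.12   # $/sqm p.a., current vacancy
for year in range(1, 11):
    rent_growth = adjust_speed * (natural_vacancy - vacancy)
    rent *= 1 + rent_growth
    # Hypothetical feedback: falling rents lift absorption (lower vacancy),
    # while vacancy is also pulled back toward its natural level.
    vacancy += 0.3 * rent_growth - 0.25 * (vacancy - natural_vacancy)
    print(f"year {year:2d}: rent growth {rent_growth:+.1%}, "
          f"rent {rent:6.1f}, vacancy {vacancy:.1%}")
```

In the paper's framework, the purely mechanical loop above would be supplemented by user input for the primary variables and by probability profiling of the outputs.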

Relevance: 10.00%

Abstract:

Overall, computer models and simulations have a rather disappointing record within the management sciences as tools for predicting the future. Social and market environments can be influenced by an overwhelming number of variables, and it is therefore difficult to use computer models to make forecasts or to test hypotheses concerning the relationship between individual behaviours and macroscopic outcomes. At the same time, however, advocates of computer models argue that they can be used to overcome the human mind's inability to cope with several complex variables simultaneously or to understand concepts that are highly counterintuitive. This paper seeks to bridge the gap between these two perspectives by suggesting that management research can indeed benefit from computer models by using them to formulate fruitful hypotheses.
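
A toy example of the kind of hypothesis-generating simulation the paper has in mind is a threshold imitation model (in the spirit of Granovetter), where near-identical individual rules can tip the macro outcome between extremes; everything here is illustrative.

```python
import random

# Each agent adopts an innovation once the adoption share reaches its private
# threshold; a few unconditional innovators seed the process.
random.seed(1)
N = 500
thresholds = [random.random() for _ in range(N)]
adopted = [t < 0.02 for t in thresholds]

for step in range(50):
    share = sum(adopted) / N
    adopted = [a or (t <= share) for a, t in zip(adopted, thresholds)]

print(f"final adoption share: {sum(adopted) / N:.0%}")
# Rerunning with slightly different threshold draws can tip the system between
# near-zero and near-total adoption, a hypothesis-generating observation rather
# than a forecast.
```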

Relevance: 10.00%

Abstract:

Since the introduction of a statutory-backed continuous disclosure regime (CDR) in 1994, regulatory reforms have significantly increased litigation risk in Australia for failure to disclose material information or for false and misleading disclosure. However, there is almost no empirical research on the impact of the reforms on corporate disclosure behaviour. Motivated by this absence of research and using management earnings forecasts (MEFs) as a disclosure proxy, this study examines (1) why managers issue earnings forecasts, (2) what firm-specific factors influence MEF characteristics, and (3) how MEF behaviour changes as litigation risk increases. Based on theories in information economics, a theoretical framework for MEF behaviour is formulated which includes antecedent influencing factors related to firms' internal and external environments. Applying this framework, hypotheses are developed and tested using multivariate models and a large sample of hand-collected MEFs (7,213) issued by top 500 ASX-listed companies over the 1994 to 2008 period. The results reveal strong support for the hypotheses. First, MEFs are issued to reduce information asymmetry and litigation risk, and to signal superior performance. Second, firms with better financial performance, smaller earnings changes, and lower operating uncertainty provide better quality MEFs. Third, forecast frequency and quality (accuracy, timeliness and precision) noticeably improve as litigation risk increases. However, managers appear to remain reluctant to disclose earnings forecasts when there are large earnings changes, and an asymmetric treatment of news type continues to prevail (a good news bias). Thus, the findings generally provide support for the effectiveness of the CDR regulatory reforms in improving disclosure behaviour, and will be valuable to market participants and corporate regulators in understanding the implications of management forecasting decisions and areas for further improvement.
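
One of the study's multivariate tests might look like the following hedged sketch, regressing an accuracy proxy on firm characteristics with standard errors clustered by firm; the data file, variable names and specification are assumptions, not the study's actual models.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical firm-year panel of management earnings forecasts.
mef = pd.read_csv("mef_sample.csv")

# Absolute forecast error (accuracy proxy) on performance, earnings-change
# magnitude, operating uncertainty and a post-reform litigation-risk indicator.
y = mef["abs_forecast_error"]
X = sm.add_constant(mef[["roa", "abs_earnings_change",
                         "earnings_volatility", "post_reform"]])
res = sm.OLS(y, X).fit(cov_type="cluster",
                       cov_kwds={"groups": mef["firm_id"]})
print(res.summary())  # a negative post_reform coefficient => improved accuracy
```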

Relevance: 10.00%

Abstract:

Purpose – The purpose of this paper is to jointly assess the impact of regulatory reform for corporate fundraising in Australia (the CLERP Act 1999) and the relaxation of ASX admission rules in 1999 on the accuracy of management earnings forecasts in initial public offer (IPO) prospectuses. The relaxation of ASX listing rules permitted a new category of new economy firms (commitments test entities, CTEs) to list without a prior history of profitability, while the CLERP Act (introduced in 2000) was accompanied by tighter disclosure obligations and stronger enforcement action by the corporate regulator (ASIC). Design/methodology/approach – All IPO earnings forecasts in prospectuses lodged between 1998 and 2003 are examined to assess the pre- and post-CLERP Act impact. Based on active ASIC enforcement action in the post-reform period, IPO firms are hypothesised to provide more accurate forecasts, particularly CTE firms, which are less likely to have a reasonable basis for forecasting. Research models are developed to empirically test the impact of the reforms on CTE and non-CTE IPO firms. Findings – The new regulatory environment has had a positive impact on management forecasting behaviour. In the post-CLERP Act period, the accuracy of prospectus forecasts and their revisions significantly improved and, as expected, the results are primarily driven by CTE firms. However, the majority of prospectus forecasts continue to be materially inaccurate. Originality/value – The results highlight the need to control for both the changing nature of listed firms and the level of enforcement action when examining responses to regulatory changes to corporate fundraising activities.
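
The research design lends itself to a difference-in-differences style sketch: compare forecast error across CTE and non-CTE firms before and after the CLERP Act. The file and variable names below are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical IPO prospectus forecast sample.
ipo = pd.read_csv("ipo_forecasts.csv")

# post = 1 for prospectuses lodged after the CLERP Act, cte = 1 for
# commitments-test entities; the interaction carries the hypothesised effect.
model = smf.ols("abs_forecast_error ~ post + cte + post:cte", data=ipo).fit()
print(model.summary())  # a negative post:cte coefficient supports the hypothesis
```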

Relevance: 10.00%

Abstract:

The 31st TTRA conference was held in California's San Fernando Valley, home of Hollywood and Burbank's movie and television studios. The twin themes of Hollywood and the new Millennium promised and delivered "something old, yet something new". The meeting offered a historical summary, not only of the year in review but also of many features of travel research since the first literature in the field appeared in the 1970s. Also, the millennium theme set the scene for some stimulating and forward-thinking discussions. The Hollywood location offered an opportunity to ponder the value of movie-induced tourism for Los Angeles, at a time when Hollywood Boulevard was in the midst of a much-needed redevelopment programme. Hollywood Chamber of Commerce speaker Oscar Arslanian acknowledged that the face of the famous district had become tired, and that its ability to continue to attract visitors in the future lay in redeveloping its past heritage. In line with the Hollywood theme, a feature of the conference was a series of six special sessions with "Stars of Travel Research". These sessions featured Clare Gunn, Stanley Plog, Charles Goeldner, John Hunt, Brent Ritchie, Geoffrey Crouch, Peter Williams, Douglas Frechtling, Turgut Var, Robert Christie-Mill, and John Crotts. Delegates were indeed privileged to hear from many of the pioneers of tourism research. Clare Gunn, Charles Goeldner, Turgut Var and Stanley Plog, for example, traced the history of different aspects of the tourism literature and, in line with the millennium theme, offered some thought-provoking discussion on the future challenges facing tourism. These included the commoditisation of airlines and destinations, airport and traffic congestion, environmental sustainability responsibility and the looming burst of the baby-boomer bubble. Included in the conference proceedings are four papers presented by five of the "Stars". Brent Ritchie and Geoffrey Crouch discuss the critical success factors for destinations, Clare Gunn shares his concerns about tourism being a smokestack industry, Doug Frechtling provides forecasts of outbound travel from 20 countries, and Charles Goeldner, who has attended all 31 TTRA conferences, reflects on the changes that have taken place in tourism research over 35 years...