133 results for patronage forecasting
Abstract:
Crash prediction models are used for a variety of purposes, including forecasting the expected future performance of transportation system segments with similar traits. The influence of intersection features on safety has been examined extensively because intersections experience a relatively large proportion of motor vehicle conflicts and crashes compared to other segments in the transportation system. The effects of left-turn lanes at intersections in particular have produced mixed results in the literature: some researchers have found that left-turn lanes are beneficial to safety, while others have reported detrimental effects. This inconsistency is not surprising given that the installation of left-turn lanes is often endogenous, that is, influenced by crash counts and/or traffic volumes. Endogeneity creates problems in econometric and statistical models and is likely to account for the inconsistencies reported in the literature. This paper reports on a limited-information maximum likelihood (LIML) estimation approach to compensate for endogeneity between left-turn lane presence and angle crashes. The approach mitigates the effects of endogeneity, revealing the unbiased effect of left-turn lanes on crash frequency for a dataset of Georgia intersections. The research shows that without accounting for endogeneity, left-turn lanes 'appear' to contribute to crashes; however, when endogeneity is accounted for in the model, left-turn lanes reduce angle crash frequencies, as expected by engineering judgment. Other endogenous variables may lurk in crash models as well, suggesting that the method may be used to correct simultaneity problems with other variables and in other transportation modeling contexts.
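The correction described can be illustrated in code. The sketch below is not the paper's LIML estimator; it uses a two-stage control-function approach, a common alternative for handling an endogenous binary regressor in a count model, on simulated data with hypothetical variable names (log_aadt, left_turn).

```python
# Simplified sketch of correcting for an endogenous binary regressor in a
# crash-count model. The paper uses LIML; a two-stage control-function
# approach illustrates the same idea. Variables are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Simulated intersection data: traffic volume drives both left-turn lane
# installation and angle crashes, creating endogeneity via unobserved risk.
log_aadt = rng.normal(9.0, 0.5, n)                       # log traffic volume
u = rng.normal(0, 1, n)                                  # unobserved risk
left_turn = (0.8 * log_aadt + u + rng.normal(0, 1, n) > 8.0).astype(float)
lam = np.exp(-6.0 + 0.7 * log_aadt - 0.4 * left_turn + 0.5 * u)
crashes = rng.poisson(lam)

# Stage 1: model lane presence, then form the control-function residual.
stage1 = sm.Probit(left_turn, sm.add_constant(log_aadt)).fit(disp=0)
resid = left_turn - stage1.predict()

# Stage 2: Poisson crash model including the stage-1 residual, which
# absorbs the correlation between lane presence and unobserved risk.
X = sm.add_constant(np.column_stack([log_aadt, left_turn, resid]))
stage2 = sm.GLM(crashes, X, family=sm.families.Poisson()).fit()
print(stage2.summary())   # the left_turn coefficient should now be negative
```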
Abstract:
Predicting safety on roadways is standard practice for road safety professionals and has a correspondingly extensive literature. The majority of safety prediction models are estimated using roadway segment and intersection (microscale) data, while more recently efforts have been undertaken to predict safety at the planning level (macroscale). Safety prediction models typically include roadway, operations, and exposure variables, factors known to affect safety in fundamental ways. Environmental variables, in particular variables attempting to capture the effect of rain on road safety, are difficult to obtain and have rarely been considered. In the few cases where weather variables have been included, historical averages rather than the actual weather conditions under which crashes were observed have been used. Without the inclusion of weather-related variables, researchers have had difficulty explaining regional differences in the safety performance of various entities (e.g. intersections, road segments, highways). As part of the NCHRP 8-44 research effort, researchers developed PLANSAFE, or planning-level safety prediction models. These models make use of socio-economic, demographic, and roadway variables for predicting planning-level safety. Accounting for regional differences, as in the experience with microscale safety models, has been problematic during the development of planning-level safety prediction models. More specifically, without weather-related variables there is an insufficient set of variables for explaining safety differences across regions and states. Furthermore, omitted variable bias resulting from excluding these important variables may adversely impact the coefficients of included variables, thus contributing to difficulty in model interpretation and accuracy. This paper summarizes the results of an effort to include weather-related variables, particularly various measures of rainfall, into models of accident frequency and of the frequency of fatal and/or injury crashes. The purpose of the study was to determine whether these variables improve the overall goodness of fit of the models, whether they explain some or all of the observed regional differences, and to identify the estimated effects of rainfall on safety. The models are based on Traffic Analysis Zone level datasets from Michigan, and from Pima and Maricopa Counties in Arizona. Numerous rain-related variables were found to be statistically significant, selected rain-related variables improved the overall goodness of fit, and inclusion of these variables reduced the portion of the model explained by the constant in the base models without weather variables. Rain tends to diminish safety, as expected, in fairly complex ways that depend on rain frequency and intensity.
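As a rough illustration of the modelling step described, the sketch below adds rainfall covariates to a negative binomial crash frequency model and compares fit with a likelihood-ratio test. The file name and columns are placeholders, not the NCHRP datasets.

```python
# Hypothetical sketch: testing whether rainfall covariates improve a
# planning-level (TAZ) crash frequency model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

# Placeholder file and columns; the study's datasets are not public here.
taz = pd.read_csv("taz.csv")  # crashes, pop, vmt, rain_days, rain_intensity

base = smf.glm("crashes ~ np.log(pop) + np.log(vmt)",
               data=taz, family=sm.families.NegativeBinomial()).fit()
rain = smf.glm("crashes ~ np.log(pop) + np.log(vmt) + rain_days"
               " + rain_intensity",
               data=taz, family=sm.families.NegativeBinomial()).fit()

# Likelihood-ratio test: do the rain variables improve goodness of fit?
lr = 2 * (rain.llf - base.llf)
p = stats.chi2.sf(lr, df=2)
print(f"LR = {lr:.2f}, p = {p:.4f}")
```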
Abstract:
A hybrid genetic algorithm/scaled conjugate gradient regularisation method is designed to alleviate ANN 'over-fitting'. In application to day-ahead load forecasting, the proposed algorithm performs better than early stopping and Bayesian regularisation, showing promising initial results.
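A minimal sketch of the hybrid idea follows, with two stated substitutions: scikit-learn offers no scaled-conjugate-gradient solver, so L-BFGS stands in, and the genetic algorithm is reduced to a bare selection/mutation loop over the weight-decay penalty. All data are synthetic.

```python
# Bare-bones GA searching the regularisation strength (alpha) of a small
# forecasting ANN. L-BFGS substitutes for scaled conjugate gradient.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 8))                 # e.g. lagged loads, temperature
y = X @ rng.normal(size=8) + 0.3 * rng.normal(size=600)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def fitness(alpha):
    net = MLPRegressor(hidden_layer_sizes=(16,), alpha=alpha,
                       solver="lbfgs", max_iter=500, random_state=0)
    net.fit(X_tr, y_tr)
    return -np.mean((net.predict(X_val) - y_val) ** 2)   # negative val. MSE

pop = list(10 ** rng.uniform(-5, 1, 8))      # initial population of alphas
for _ in range(5):                           # a few GA generations
    elite = sorted(pop, key=fitness, reverse=True)[:4]          # selection
    pop = elite + [a * 10 ** rng.normal(0, 0.3) for a in elite]  # mutation

print("best alpha:", max(pop, key=fitness))
```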
Abstract:
Short-term traffic flow data are characterized by rapid and dramatic fluctuations, reflecting the frequent congestion in the lane and exhibiting strongly nonlinear behaviour. Traffic state estimation based on data gathered by electronic sensors is critical for intelligent traffic management and traffic control. In this paper, a solution to freeway traffic estimation in Beijing is proposed using a particle filter based on a macroscopic traffic flow model, which estimates both traffic density and speed. The particle filter is a nonlinear prediction method with clear advantages for traffic flow prediction. However, as the sampling period increases, the traffic state curve becomes more volatile, which affects prediction accuracy and makes forecasting more difficult. In this paper, the particle filter model is applied to estimate short-term traffic flow. A numerical study is conducted on Beijing freeway data with a sampling period of 2 min. The relatively high accuracy of the results indicates the superiority of the proposed model.
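The following toy sketch, assuming a single freeway cell and a Greenshields speed-density relation rather than the paper's full macroscopic model, shows the bootstrap particle filter mechanics: propagate density particles, weight them by the likelihood of a noisy flow measurement, and resample.

```python
# Toy bootstrap particle filter for one freeway cell, estimating traffic
# density (veh/km); speed follows a Greenshields relation. All parameter
# values are illustrative, not the Beijing data.
import numpy as np

rng = np.random.default_rng(2)
vf, rho_jam = 100.0, 120.0                 # free-flow speed, jam density
N, T = 1000, 60                            # particles, 2-min time steps

def speed(rho):
    return vf * np.clip(1.0 - rho / rho_jam, 0.05, 1.0)

# Simulated "true" density and noisy detector flow measurements (veh/h).
true_rho = 30 + 15 * np.sin(np.arange(T) / 6.0)
z = true_rho * speed(true_rho) + rng.normal(0, 200, T)

particles = rng.uniform(5, 100, N)         # initial density particles
est = []
for t in range(T):
    particles += rng.normal(0, 3.0, N)     # random-walk state transition
    particles = np.clip(particles, 0.1, rho_jam)
    pred_flow = particles * speed(particles)
    w = np.exp(-0.5 * ((z[t] - pred_flow) / 200.0) ** 2)   # likelihood
    w /= w.sum()
    particles = particles[rng.choice(N, size=N, p=w)]      # resampling
    est.append(particles.mean())

print("mean abs density error:", np.mean(np.abs(np.array(est) - true_rho)))
```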
Abstract:
Background: The transmission of hemorrhagic fever with renal syndrome (HFRS) is influenced by climatic variables. However, few studies have examined the quantitative relationship between climate variation and HFRS transmission.
Objective: We examined the potential impact of climate variability on HFRS transmission and developed climate-based forecasting models for HFRS in northeastern China.
Methods: We obtained data on monthly counts of reported HFRS cases in Elunchun and Molidawahaner counties for 1997–2007 from the Inner Mongolia Center for Disease Control and Prevention, and climate data from the Chinese Bureau of Meteorology. Cross-correlations assessed crude associations between climate variables, including rainfall, land surface temperature (LST), relative humidity (RH), and the multivariate El Niño Southern Oscillation (ENSO) index (MEI), and monthly HFRS cases over a range of lags. We used time-series Poisson regression models to examine the independent contribution of climatic variables to HFRS transmission.
Results: Cross-correlation analyses showed that rainfall, LST, RH, and MEI were significantly associated with monthly HFRS cases at lags of 3–5 months in both study areas. The Poisson regression results indicated that, after controlling for autocorrelation, seasonality, and long-term trend, rainfall, LST, RH, and MEI at lags of 3–5 months were associated with HFRS in both study areas. The final model had good accuracy in forecasting the occurrence of HFRS.
Conclusions: Climate variability plays a significant role in HFRS transmission in northeastern China. The model developed in this study has implications for HFRS control and prevention.
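A sketch of the kind of model described, a Poisson time-series regression with 3-month-lagged climate covariates plus a long-term trend and seasonal harmonics, might look as follows; the file and column names are hypothetical, not the Inner Mongolia data.

```python
# Illustrative climate-based Poisson time-series model with lagged
# covariates, trend, and seasonality. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("hfrs_monthly.csv")   # cases, rainfall, lst, rh, mei
for col in ["rainfall", "lst", "rh", "mei"]:
    df[f"{col}_lag3"] = df[col].shift(3)   # 3-month lag

df["t"] = np.arange(len(df))                       # long-term trend
df["sin12"] = np.sin(2 * np.pi * df["t"] / 12)     # annual seasonality
df["cos12"] = np.cos(2 * np.pi * df["t"] / 12)

model = smf.glm(
    "cases ~ rainfall_lag3 + lst_lag3 + rh_lag3 + mei_lag3"
    " + t + sin12 + cos12",
    data=df.dropna(), family=sm.families.Poisson(),
).fit()
print(model.summary())
```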
Abstract:
Operations management is an area concerned with the production of goods and services, ensuring that business operations are efficient in utilizing resources and effective in meeting customer requirements. It deals with the design and management of products, processes, services and supply chains, and considers the acquisition, development, and effective and efficient utilization of resources. Unlike other engineering subjects, the content of these units can be very broad. It is therefore necessary to cover the content most relevant to contemporary industry, and to understand which engineering management skills are critical for engineers working in contemporary organisations. Most operations management books cover traditional Operations Management techniques. For example, 'inventory management' is an important topic in operations management, and all OM books deal with effective methods of inventory management; however, the new trend in OM is just-in-time (JIT) delivery, or minimization of inventory. It is therefore important to decide whether to emphasise keeping inventory (as most books suggest) or minimizing it. Similarly, for OM decisions such as forecasting, optimization and linear programming, most organisations nowadays use software, so it is important to determine whether some of this software should be introduced in tutorial/lab classes and, if so, which packages. It is established in the teaching and learning literature that there must be strong alignment between unit objectives, assessment and learning activities to engage students in learning, and that engaging students is vital for learning. However, engineering units (more specifically, operations management) are quite different from other majors: alignment between objectives, assessment and learning activities alone cannot guarantee student engagement. Unit content must be practically oriented, and the skills developed should be those demanded by industry. The present active-learning research, using a multi-method research approach, redesigned the operations management content based on the latest developments in the engineering management area and the needs of Australian industries. The redesigned unit has significantly improved student engagement and learning. It was found that students engage in learning if they find that the content helps develop skills needed in their practical lives.
Abstract:
We examine the impact of continuous disclosure regulatory reform on the likelihood, frequency and qualitative characteristics of management earnings forecasts issued in New Zealand's low private litigation environment. Using a sample of 720 earnings forecasts issued by 94 firms listed on the New Zealand Exchange before and after the reform (1999–2005), we provide strong evidence of significant changes in forecasting behaviour in the post-reform period. Specifically, firms were more likely to issue earnings forecasts to pre-empt earnings announcements and, in contrast to findings in other legal settings, those earnings forecasts exhibited higher frequency and improved qualitative characteristics (better precision and accuracy). An important implication of our findings is that public regulatory reforms may have a greater benefit in a low private litigation environment; they thus add to the global debate about the effectiveness of alternative public regulatory reforms of corporate disclosure requirements.
Abstract:
In many product categories of durable goods, such as TVs, PCs, and DVD players, the largest component of sales is generated by consumers replacing existing units. Aggregate sales models proposed by diffusion-of-innovation researchers for the replacement component of sales have incorporated several different replacement distributions, such as the Rayleigh, Weibull, truncated normal and gamma. Although these alternative replacement distributions have been tested using both time-series sales data and individual-level actuarial 'life tables' of replacement ages, there is no consensus on which distributions are more appropriate for modelling replacement behaviour. In the current study we are motivated to develop a new 'modified gamma' distribution for two reasons. First, we recognise that replacements have two fundamentally different drivers: those forced by failure, and early, discretionary replacements. The replacement distribution for each of these drivers is expected to be quite different. Second, we observed a poor fit of other distributions to our empirical data. We conducted a survey of 8,077 households to empirically examine models of replacement sales for six electronic consumer durables: TVs, VCRs, DVD players, digital cameras, and personal and notebook computers. These data allow us to construct individual-level 'life tables' for replacement ages. We demonstrate that the new modified gamma model fits the empirical data better than existing models for all six products, using both a primary and a hold-out sample.
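The abstract does not give the functional form of the 'modified gamma' distribution, but the two-driver idea can be sketched with a generic two-component gamma mixture fitted by maximum likelihood to synthetic replacement ages; this is an illustration of the concept, not the paper's model.

```python
# Generic two-component mixture for replacement ages: one gamma component
# for failure-forced replacements and an earlier component for
# discretionary upgrades. Purely illustrative, fitted by MLE.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
# Synthetic replacement ages in years: early upgrades + failure replacements.
ages = np.concatenate([rng.gamma(9.0, 0.5, 300),    # discretionary (~4.5 y)
                       rng.gamma(16.0, 0.5, 700)])  # failure-driven (~8 y)

def neg_loglik(params):
    p, k1, s1, k2, s2 = params
    pdf = (p * stats.gamma.pdf(ages, k1, scale=s1)
           + (1 - p) * stats.gamma.pdf(ages, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))   # guard against log(0)

res = optimize.minimize(
    neg_loglik, x0=[0.3, 5.0, 1.0, 10.0, 1.0],
    bounds=[(0.01, 0.99), (0.1, 50), (0.05, 5), (0.1, 50), (0.05, 5)],
)
print("mixing weight, shapes, scales:", np.round(res.x, 3))
```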
Abstract:
Using GIS to evaluate travel behaviour is an important technique for increasing our understanding of the relationship between accessibility and transport demand. In this paper, the activity space concept was used to identify the nature of participation in activities (or lack of it) amongst a group of students, using a 2-day travel-activity diary. Three indicators, the number of unique locations visited, average daily distance travelled, and average daily activity duration, were used to measure the size of activity spaces; these reflect levels of accessibility, personal mobility, and the extent of participation respectively. Multiple regression analyses were used to assess the impacts of students' socio-economic status and the spatial characteristics of home location. Although no differences were found in the levels of accessibility and the extent of participation, home location with respect to a demand responsive transport (DRT) service was found to be the most important determinant of mobility patterns. Despite being able to travel longer distances, students who live outside the DRT service area were found to be temporally excluded from some opportunities. Student activity spaces were also visualised within a GIS environment, and a spatial analysis was conducted to underpin the evaluation of the performance of the DRT. This approach was also used to identify the activity spaces of individuals who are geographically excluded from the service. Evaluation of these results indicated that although the service currently covers areas of high demand, 90% of the activity spaces remained unserved by the DRT service. Using these data, six new routes were designed to meet the coverage goal of public transport, based on a measure of network impedance derived from inverse activity density. Following assessment of public transport service coverage, the study was extended using a Spatial Multi-Criteria Evaluation (SMCE) technique to assess the effect of service provision on patronage.
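The coverage analysis described can be sketched geometrically: buffer the DRT routes into a service area and measure how much of each activity space it intersects. The sketch below uses shapely with entirely stylised geometry; it illustrates the method, not the study's GIS workflow.

```python
# Stylised coverage check: buffer a DRT route into a service area and
# measure what share of each student's activity space it covers.
# Geometry and numbers are illustrative only (arbitrary units).
from shapely.geometry import LineString, MultiPoint

route = LineString([(0, 0), (5, 0), (5, 5)])   # a stylised DRT route
service_area = route.buffer(0.8)               # e.g. an 800 m catchment

# Activity spaces approximated as convex hulls of visited locations.
students = {
    "near": [(1.0, 0.2), (2.0, 0.6), (4.0, -0.4)],
    "far":  [(8.0, 8.0), (9.0, 7.0), (10.0, 9.0)],
}
for name, pts in students.items():
    space = MultiPoint(pts).convex_hull
    covered = space.intersection(service_area).area / space.area
    print(f"{name}: {covered:.0%} of activity space served")
```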
Abstract:
CRE (Corporate Real Estate) decisions should not simply deal with the management of individual facilities; they should especially be concerned with the relationships a facility has with the corporate business strategy and with the larger real estate markets. Both the practice and the research of CRE management have historically tended to emphasize real estate issues and ignore the corporation's business issues, causing real estate strategies to be disconnected from the goals and priorities of the corporation's senior management. With regard to office cycles, a large number of econometric models have been proposed during the last 20 years. However, evidence from historical data and previous research in the field of real estate forecasting seem to agree on only one thing: the existence of interconnected property cycles that are concentrated on vacancy rates (demand). Vacancy also represents the linkage between the inadequacy of existing CRE strategies and the inability of existing econometric models to correctly forecast office rent cycles. Business cycles across different industry sectors have decreased from 5-7 years to 1-3 years today, yet corporations are still entering into leases of 5-10 years, causing hidden vacancy levels to rise. Possibly, once CRE strategies are fully in tune with the overall business, hidden vacancy will fade away, providing forecasters with better-quality data. The aim of this paper is not to investigate whether and when the supply side will eventually evolve to provide flexible occupancy arrangements that accommodate corporate agility requirements, but rather to propose a general framework for corporations to improve the decision-making processes of their CRE executives, while emphasizing the importance of understanding the context as a precondition to effective real estate involvements.
Abstract:
Numerous econometric models have been proposed for forecasting property market performance, but limited success has been achieved in finding a reliable and consistent model to predict property market movements over a five- to ten-year timeframe. This research focuses on office rental growth forecasts and reviews many of the office rent models that have evolved over the past 20 years. A model by DiPasquale and Wheaton is selected for testing in the Brisbane, Australia office market. The adaptation of this model did not yield explanatory variables that could assist in developing a reliable, predictive model of office rental growth. In light of this result, the paper suggests a system dynamics framework that includes an econometric model based on historical data as well as user-input guidance for the primary variables. The rent forecast outputs would be assessed with regard to market expectations, with probability profiling undertaken for use in simulation exercises. The paper concludes with ideas for ongoing research.
Abstract:
The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures, such as the recent increase in environmental awareness and its likely effects on regulations, are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The IT revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale, including logistics, production, and research & development, became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting and Replenishment, owes something to the marketing efforts of software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).
Abstract:
Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations.
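For the Ornstein–Uhlenbeck case the transitional density is Gaussian in closed form, so exact maximum likelihood is straightforward, which is why it serves as a useful benchmark; a minimal sketch with illustrative parameter values follows. The CIR case would follow the same pattern with a noncentral chi-square transition density.

```python
# Exact maximum likelihood for the Ornstein-Uhlenbeck process
# dX = kappa*(theta - X) dt + sigma dW, whose transition density is
# Gaussian in closed form (no approximation needed, unlike many SDEs).
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(4)
kappa, theta, sigma, dt, n = 2.0, 0.05, 0.1, 1 / 252, 5000

# Simulate a path using the exact discretisation.
x = np.empty(n); x[0] = theta
for i in range(1, n):
    m = theta + (x[i - 1] - theta) * np.exp(-kappa * dt)
    v = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
    x[i] = m + np.sqrt(v) * rng.normal()

def neg_loglik(params):
    k, th, s = params
    m = th + (x[:-1] - th) * np.exp(-k * dt)
    v = s**2 * (1 - np.exp(-2 * k * dt)) / (2 * k)
    return -np.sum(stats.norm.logpdf(x[1:], loc=m, scale=np.sqrt(v)))

res = optimize.minimize(neg_loglik, x0=[1.0, 0.0, 0.2],
                        bounds=[(1e-4, 50), (-1, 1), (1e-4, 2)])
print("kappa, theta, sigma estimates:", np.round(res.x, 4))
```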
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions to discriminate between them, selecting the optimal forecasting model is clearly challenging. The aim of this thesis is to investigate thoroughly how effective commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they can all identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies: QLIKE is the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis finds that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility, and that the volatility proxy affects the statistical loss functions' ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
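The statistical and economic loss functions named above have standard matrix forms; a minimal sketch, assuming a realised-covariance proxy Sigma and candidate covariance forecasts H, is given below. The exact specifications used in the thesis may differ.

```python
# Standard loss functions for comparing multivariate volatility forecasts
# H against a volatility proxy Sigma (e.g. realised covariance).
import numpy as np

def mse_loss(H, Sigma):
    # Frobenius-norm MSE between forecast and proxy.
    d = H - Sigma
    return np.sum(d * d)

def qlike_loss(H, Sigma):
    # QLIKE: log|H| + tr(H^{-1} Sigma); minimised when H equals the true
    # conditional covariance, and robust to noise in the proxy.
    _, logdet = np.linalg.slogdet(H)
    return logdet + np.trace(np.linalg.solve(H, Sigma))

def portfolio_variance(H, w):
    # Economic loss input: variance of a portfolio with weights w.
    return w @ H @ w

# Example: rank two hypothetical forecasts against a proxy.
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])
H1 = np.array([[1.1, 0.25], [0.25, 1.9]])
H2 = np.array([[1.5, 0.0], [0.0, 1.5]])
for name, H in [("H1", H1), ("H2", H2)]:
    print(name, mse_loss(H, Sigma), qlike_loss(H, Sigma))
```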
A longitudinal study of corporate earnings guidance in Australia’s continuous disclosure environment
Abstract:
Since the introduction of a statutory-backed continuous disclosure regime (CDR) in 1994, regulatory reforms have significantly increased litigation risk in Australia for failure to disclose material information or for false and misleading disclosure. However, there is almost no empirical research on the impact of the reforms on corporate disclosure behaviour. Motivated by the absence of research and using management earnings forecasts (MEFs) as a disclosure proxy, this study examines (1) why managers issue earnings forecasts, (2) what firm-specific factors influence MEF characteristics, and (3) how MEF behaviour changes as litigation risk increases. Based on theories in information economics, a theoretical framework for MEF behaviour is formulated which includes antecedent influencing factors related to firms' internal and external environments. Applying this framework, hypotheses are developed and tested using multivariate models and a large sample of hand-collected MEFs (7,213) issued by top 500 ASX-listed companies over the 1994 to 2008 period. The results reveal strong support for the hypotheses. First, MEFs are issued to reduce information asymmetry and litigation risk, and to signal superior performance. Second, firms with better financial performance, smaller earnings changes, and lower operating uncertainty provide better quality MEFs. Third, forecast frequency and quality (accuracy, timeliness and precision) noticeably improve as litigation risk increases. However, managers appear to remain reluctant to disclose earnings forecasts when there are large earnings changes, and an asymmetric treatment of news type continues to prevail (a good news bias). Thus, the findings generally support the effectiveness of the CDR regulatory reforms in improving disclosure behaviour and will be valuable to market participants and corporate regulators in understanding the implications of management forecasting decisions and areas for further improvement.