133 results for patronage forecasting
in Queensland University of Technology - ePrints Archive
Abstract:
Improved forecasting of urban rail patronage is essential for effective policy development and efficient planning of new rail infrastructure. Past modelling and forecasting of urban rail patronage has relied on legacy modelling approaches and has often been conducted at the general level of public transport demand, rather than being specific to urban rail. This project canvassed current Australian practice and international best practice to develop and estimate time series and cross-sectional models of rail patronage for Australian mainland state capital cities. This involved a large online survey of rail riders and non-riders in each of the state capital cities, resulting in a comprehensive database of respondent socio-economic profiles, travel experience, attitudes to rail and other modes of travel, together with stated preference responses to a wide range of urban travel scenarios. Estimation of the models demonstrated their ability to provide information on the major influences on the urban rail travel decision. Rail fares, congestion and rail service supply all have a strong influence on rail patronage, while less significant factors such as fuel price and access to a motor vehicle are also influential. Of note, too, is the relative homogeneity of rail user profiles across the state capitals. Rail users tend to have higher incomes and education levels, and are younger and more likely to be in full-time employment than non-rail users. The analysis reported here represents only a small proportion of what could be accomplished using the survey database; more comprehensive investigation was beyond the scope of the project and has been left for future work.
Rainfall, Mosquito Density and the Transmission of Ross River Virus: A Time-Series Forecasting Model
Comparison of Regime Switching, Probit and Logit Models in Dating and Forecasting US Business Cycles
Abstract:
The Queensland Department of Public Works (DPW) holds a significant interest in the Brisbane Central Business District (CBD), controlling approximately 20 percent of the office space within its confines. This comprises a total of 333,903 square metres of space, of which 170,111 square metres is owned and 163,792 square metres is leased from the private sector. The department's nominal ownership extends to several enduring landmark buildings as well as several modern office towers. The portfolio includes the oldest building in the CBD, the former Commissariat Stores building, and one of the newest, a 15,000 square metre office tower under construction at 33 Charlotte Street.
Abstract:
With growing concern over the use of the car in our urbanised society, a number of lobby groups and professional bodies have emerged promoting a return to public transport, walking and cycling, with the urban village as the key driving land use, as a means of making our cities' transportation systems more sustainable. This research aimed to develop a framework, applicable to the Australian setting, that can facilitate increased passenger patronage of rail-based urban transport systems from adjacent or associated land uses. The framework specifically tested the application of the Park & Ride and Transit Oriented Development (TOD) concepts and their applicability within the cultural, institutional, political and transit operational characteristics of Australian society. The researcher found that, although the application of the TOD concept has been limited to small pockets of town houses and mixed-use developments around stations, the development industry and emerging groups within the community are poised to embrace the concept and bring with it increased rail patronage. The lack of a clear commitment to infrastructure and supporting land uses is a major barrier to the implementation of TODs. The research findings demonstrated significant scope for a TOD to expand to a much greater radius of activity from the public transport interchange than the commonly quoted 400 to 600 metres, thus incorporating many more residents and potential patrons. The provision of Park & Rides, and associated support facilities such as Kiss & Rides, has followed worldwide trends of high patronage demand from the middle and outer car-dependent suburbs of our cities. The data collected and analysed by the researcher demonstrated that in many cases Park & Rides should form part of a TOD to ensure ease of access to rail stations by all modes and patron types.
The question remains, however, of how best to plan the incorporation of a Park & Ride within a TOD while still maintaining those features that attract and promote TODs as a living entity.
Abstract:
In this thesis we are interested in financial risk, and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist, and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, at times controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness. Is skewness constant, or is there significant variability that in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). They are four-parameter distributions which allow the first four moments to be modelled nearly independently: in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs.
Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the DGP underlying the process and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. However, the GLDs suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
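The moving-window idea described above can be illustrated with a minimal historical-simulation VaR sketch. This is illustrative only: the thesis fits GLD parameters within each window, whereas the sketch below takes a raw empirical percentile, which is the historical-simulation benchmark the abstract compares against. All numbers are simulated.

```python
import numpy as np

def rolling_var(returns, window=250, level=0.99):
    """Moving-window historical-simulation VaR.

    For each day t, VaR is minus the empirical (1 - level) percentile of
    the previous `window` returns, reported as a positive loss.
    """
    returns = np.asarray(returns, dtype=float)
    out = np.full(returns.shape, np.nan)  # no estimate until a full window exists
    for t in range(window, len(returns)):
        past = returns[t - window:t]
        out[t] = -np.percentile(past, 100 * (1 - level))
    return out

# Illustrative use on simulated heavy-tailed (Student-t) returns.
rng = np.random.default_rng(0)
rets = rng.standard_t(df=4, size=1000) * 0.01
var99 = rolling_var(rets, window=250, level=0.99)
```

A GLD-based variant would replace the `np.percentile` call with the quantile of a GLD fitted to the window, which is where time variation in the distribution's shape enters.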
Abstract:
At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied. To accommodate the simultaneity of fatality and injury crash outcomes, simultaneous estimation of the models was conducted. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities. However, caution must be exercised with such models.
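The overdispersion rationale in the abstract can be made concrete: a Poisson model forces the variance of zone-level crash counts to equal the mean, so when the sample variance is well above the mean, a negative binomial model (variance mu + alpha * mu^2) is preferred. The sketch below checks this with a method-of-moments estimate of alpha on made-up counts; it is not the Tucson data or the paper's estimation method.

```python
import numpy as np

# Hypothetical zone-level crash counts (illustrative only, not the
# Tucson data used in the paper).
counts = np.array([0, 1, 1, 2, 3, 5, 8, 13, 2, 0, 4, 7, 1, 9, 6, 3])

mu = counts.mean()
s2 = counts.var(ddof=1)

# Poisson assumes variance == mean; a ratio well above 1 signals
# overdispersion, motivating the negative binomial model, whose
# variance function is mu + alpha * mu**2.
dispersion_ratio = s2 / mu

# Method-of-moments estimate of the NB overdispersion parameter alpha.
alpha_hat = max((s2 - mu) / mu**2, 0.0)
```

In practice the models in the paper are fitted by maximum likelihood with zone covariates (population density, intersection density, and so on) entering through a log link; the moment check above is only the diagnostic that motivates the NB choice.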
Abstract:
The driving task requires sustained attention during prolonged periods, and can be performed in highly predictable or repetitive environments. Such conditions can create hypovigilance and impair performance in responding to critical events. Identifying such impairment in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it in real time. This pilot study aims to show that performance decrements due to monotonous tasks can be predicted through mathematical modelling that takes sensation-seeking levels into account. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task, is used to assess participants' performance. The prediction framework developed on this task could be extended to a monotonous driving task. A Hidden Markov Model (HMM) is proposed to predict participants' lapses in alertness. The driver's vigilance evolution is modelled as a hidden state and is correlated to a surrogate measure: the participant's reaction times. This experiment shows that the monotony of the task can lead to a significant decline in performance in less than five minutes. This impairment can be predicted four minutes in advance with 86% accuracy using HMMs. The experiment showed that mathematical models such as HMMs can efficiently predict hypovigilance through surrogate measures. The presented model could lead to the development of an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, thus offering the potential to enhance road safety and prevent road crashes.
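The HMM setup described above can be sketched with a minimal two-state filter: a hidden vigilance state (alert vs. hypovigilant) observed through discretized reaction times. All probabilities below are illustrative assumptions, not the fitted values from the study, and the three-symbol reaction-time coding is likewise hypothetical.

```python
import numpy as np

# Hidden states: 0 = alert, 1 = hypovigilant.
# Observations: 0 = fast reaction, 1 = slow reaction, 2 = lapse.
pi = np.array([0.9, 0.1])                # initial state distribution (assumed)
A = np.array([[0.95, 0.05],              # state transition matrix (assumed)
              [0.10, 0.90]])
B = np.array([[0.80, 0.15, 0.05],        # emission probabilities (assumed)
              [0.20, 0.40, 0.40]])

def filter_vigilance(obs):
    """Forward-algorithm filtering: P(state_t | obs_1..t) for each t."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()                 # normalise to a probability vector
    path = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # predict with A, update with emission
        alpha /= alpha.sum()
        path.append(alpha)
    return np.array(path)

# Reaction times drifting from fast responses to lapses: the filtered
# probability of the hypovigilant state should rise over the sequence.
posterior = filter_vigilance([0, 0, 0, 1, 1, 2, 2, 2])
```

A real-time warning device of the kind the abstract envisages would threshold this filtered probability, and prediction minutes ahead corresponds to propagating the state forward through powers of the transition matrix.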
Abstract:
Purpose – The purpose of this paper is to jointly assess the impact of regulatory reform for corporate fundraising in Australia (CLERP Act 1999) and the relaxation of ASX admission rules in 1999, on the accuracy of management earnings forecasts in initial public offer (IPO) prospectuses. The relaxation of ASX listing rules permitted a new category of new economy firms (commitments test entities (CTEs)) to list without a prior history of profitability, while the CLERP Act (introduced in 2000) was accompanied by tighter disclosure obligations and stronger enforcement action by the corporate regulator (ASIC). Design/methodology/approach – All IPO earnings forecasts in prospectuses lodged between 1998 and 2003 are examined to assess the pre- and post-CLERP Act impact. Based on active ASIC enforcement action in the post-reform period, IPO firms are hypothesised to provide more accurate forecasts, particularly CTE firms, which are less likely to have a reasonable basis for forecasting. Research models are developed to empirically test the impact of the reforms on CTE and non-CTE IPO firms. Findings – The new regulatory environment has had a positive impact on management forecasting behaviour. In the post-CLERP Act period, the accuracy of prospectus forecasts and their revisions significantly improved and, as expected, the results are primarily driven by CTE firms. However, the majority of prospectus forecasts continue to be materially inaccurate. Originality/value – The results highlight the need to control for both the changing nature of listed firms and the level of enforcement action when examining responses to regulatory changes to corporate fundraising activities.
Abstract:
Forecasts generated by time series models traditionally place greater weight on more recent observations. This paper develops an alternative semi-parametric method for forecasting that does not rely on this convention and applies it to the problem of forecasting asset return volatility. In this approach, a forecast is a weighted average of historical volatility, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in volatility across time (as a measure of market conditions) by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are significantly more accurate than a number of competing approaches at both short and long forecast horizons.
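The weighting scheme described above can be sketched in a simplified form: forecast volatility as a kernel-weighted average of past volatility, with weight determined by how closely each period's short-term trend resembles the current one. This is a one-dimensional stand-in for the paper's multivariate kernel scheme; the bandwidth, trend length, and data are all illustrative assumptions.

```python
import numpy as np

def kernel_vol_forecast(vol, trend_len=5, bandwidth=0.01):
    """Kernel-weighted average of historical volatility.

    Each historical period is weighted by a Gaussian kernel on the
    distance between its preceding short-term trend and the current one,
    so periods with similar "market conditions" dominate the forecast.
    """
    vol = np.asarray(vol, dtype=float)
    current = vol[-trend_len:]                     # current market conditions
    weights, targets = [], []
    for t in range(trend_len, len(vol) - 1):
        past = vol[t - trend_len:t]                # candidate historical trend
        dist2 = np.sum((past - current) ** 2)
        weights.append(np.exp(-dist2 / (2 * bandwidth**2)))  # Gaussian kernel
        targets.append(vol[t])                     # volatility that followed
    weights = np.array(weights)
    targets = np.array(targets)
    return np.sum(weights * targets) / np.sum(weights)

# Illustrative use on a simulated volatility series.
rng = np.random.default_rng(1)
vols = np.abs(rng.normal(0.02, 0.005, size=300))
fc = kernel_vol_forecast(vols)
```

Because the forecast is a convex combination of observed volatilities, it always lies inside the historical range; the time-series models the paper compares against instead impose a fixed decay of weight with recency.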
Abstract:
Forecasts of volatility and correlation are important inputs into many practical financial problems. Broadly speaking, there are two ways of generating forecasts of these variables. Firstly, time-series models apply a statistical weighting scheme to historical measurements of the variable of interest. The alternative methodology extracts forecasts from the market-traded value of option contracts. An efficient options market should be able to produce superior forecasts, as it utilises a larger information set comprising not only historical information but also the market equilibrium expectation of options market participants. While much research has been conducted into the relative merits of these approaches, this thesis extends the literature along several lines through three empirical studies. Firstly, it is demonstrated that there exist statistically significant benefits to taking the volatility risk premium into account when using implied volatility for univariate volatility forecasting. Secondly, high-frequency option-implied measures are shown to lead to superior forecasts of the stochastic component of intraday volatility, and these in turn lead to superior forecasts of total intraday volatility. Finally, realised and option-implied measures of equicorrelation are shown to dominate measures based on daily returns.
Abstract:
Our aim is to develop a set of leading performance indicators to enable managers of large projects to forecast during project execution how various stakeholders will perceive success months or even years into the operation of the output. Large projects have many stakeholders who have different objectives for the project, its output, and the business objectives they will deliver. The output of a large project may have a lifetime that lasts for years, or even decades, and ultimate impacts that go beyond its immediate operation. How different stakeholders perceive success can change with time, and so the project manager needs leading performance indicators that go beyond the traditional triple constraint to forecast how key stakeholders will perceive success months or even years later. In this article, we develop a model for project success that identifies how project stakeholders might perceive success in the months and years following a project. We identify success or failure factors that will facilitate or militate against achievement of those success criteria, and a set of potential leading performance indicators that forecast how stakeholders will perceive success during the life of the project's output. We conducted a scale development study with 152 managers of large projects and identified two project success factor scales and seven stakeholder satisfaction scales that can be used by project managers to predict stakeholder satisfaction on projects, and so may be used by the managers of large projects as the basis of project control.
Abstract:
Client owners usually need an estimate or forecast of their likely building costs in advance of detailed design in order to confirm the financial feasibility of their projects. Because of their timing in the project life cycle, these early stage forecasts are characterized by the minimal amount of information available concerning the new (target) project, to the point that often only its size and type are known. One approach is to use the mean contract sum of a sample, or base group, of previous projects of a similar type and size to the project for which the estimate is needed. Bernoulli's law of large numbers implies that this base group should be as large as possible. However, increasing the size of the base group inevitably involves including projects that are less and less similar to the target project. Deciding on the optimal number of base group projects is known as the homogeneity or pooling problem. A method of solving the homogeneity problem is described, involving the use of closed-form equations to compare three different sampling arrangements of previous projects for their simulated forecasting ability by a cross-validation method, where a series of targets are extracted, with replacement, from the groups and compared with the mean value of the projects in the base groups. The procedure is then demonstrated with 450 Hong Kong projects (with different project types: Residential, Commercial centre, Car parking, Social community centre, School, Office, Hotel, Industrial, University and Hospital) clustered into base groups according to their type and size.
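The pooling trade-off described above can be illustrated with a small cross-validation sketch: forecast each project's cost as the mean of its k nearest-sized neighbours and watch the error as the base group grows. The data, the nearest-by-size grouping rule, and the k values are all illustrative assumptions rather than the paper's closed-form method or its Hong Kong dataset.

```python
import numpy as np

def pooling_error(costs, sizes, k):
    """Mean absolute forecast error when each project's cost is
    forecast as the mean contract sum of its k nearest-sized peers.
    Larger k gives a bigger, but less homogeneous, base group.
    """
    costs = np.asarray(costs, dtype=float)
    sizes = np.asarray(sizes, dtype=float)
    errors = []
    for i in range(len(costs)):                    # each project in turn is the target
        order = np.argsort(np.abs(sizes - sizes[i]))
        neighbours = [j for j in order if j != i][:k]
        forecast = costs[neighbours].mean()        # mean contract sum of base group
        errors.append(abs(forecast - costs[i]))
    return float(np.mean(errors))

# Hypothetical projects where cost scales with size plus noise.
rng = np.random.default_rng(2)
sz = rng.uniform(1000, 20000, size=60)             # floor area, square metres
cost = 2.5 * sz + rng.normal(0, 2000, size=60)     # contract sum

errs = {k: pooling_error(cost, sz, k) for k in (3, 10, 55)}
```

With cost strongly tied to size, a very large base group (here k = 55 of 59) collapses towards the global mean and its heterogeneity penalty outweighs the variance reduction that Bernoulli's law of large numbers promises, which is exactly the homogeneity problem.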
Abstract:
Air pollution has significant impacts on both the environment and human health. Urban areas have therefore received ever-growing attention, because they have not only the highest concentrations of air pollutants but also the highest human populations. In modern societies, urban air quality (UAQ) is routinely evaluated, and local authorities provide regular reports to the public about current UAQ levels. Both local and international authorities also recommend that certain air pollutant concentrations remain below set levels, with the aim of reducing emissions and improving air quality, both in urban areas and on a more regional scale. In some countries, protocols aimed at reducing emissions have come into force as a result of international agreements.