933 results for rainfall-runoff empirical statistical model
Abstract:
Recently, Drăgulescu and Yakovenko proposed an analytical formula, based on the Heston model, for computing the probability density function of stock log returns, which they tested empirically. Their research design inadvertently biased the results in favour of the Heston model, thus overstating its empirical fit. Furthermore, Drăgulescu and Yakovenko did not perform any goodness-of-fit statistical tests. This study employs a research design that facilitates statistical tests of the goodness-of-fit of the Heston model to empirical returns. Robustness checks are also performed. In brief, the Heston model outperformed the Gaussian model only at high frequencies, and even then did not provide a statistically acceptable fit to the data. The Gaussian model performed (marginally) better at medium and low frequencies, where the extra parameters of the Heston model have adverse impacts on the test statistics. © 2005 Taylor & Francis Group Ltd.
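The kind of goodness-of-fit testing this abstract calls for can be sketched as follows. The choice of the one-sample Kolmogorov-Smirnov statistic, and the synthetic stand-in returns, are illustrative assumptions, not the study's actual procedure:

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a Gaussian, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, mu, sigma):
    """One-sample Kolmogorov-Smirnov distance between the empirical
    CDF of `sample` and a fitted Gaussian CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = normal_cdf(x, mu, sigma)
        # Compare the theoretical CDF with the empirical CDF just
        # before and just after each order statistic.
        d = max(d, abs(cdf - i / n), abs(cdf - (i + 1) / n))
    return d

random.seed(0)
returns = [random.gauss(0.0, 0.01) for _ in range(1000)]  # stand-in log returns
mu = sum(returns) / len(returns)
sigma = (sum((r - mu) ** 2 for r in returns) / len(returns)) ** 0.5
d = ks_statistic(returns, mu, sigma)  # small D suggests the fit is not rejected
```

The same distance could be computed against a Heston-implied density instead of the Gaussian CDF; only `normal_cdf` would change.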
Abstract:
The techniques and insights from two distinct areas of financial economic modelling are combined to provide evidence of the influence of firm size on the volatility of stock portfolio returns. Portfolio returns are characterized by positive serial correlation induced by the varying levels of non-synchronous trading among the component stocks. This serial correlation is greatest for portfolios of small firms. The conditional volatility of stock returns has been shown to be well represented by the GARCH family of statistical processes. Using a GARCH model of the variance of capitalization-based portfolio returns, conditioned on the autocorrelation structure in the conditional mean, striking differences related to firm size are uncovered.
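The conditional-variance process this abstract refers to can be sketched as a GARCH(1,1) recursion applied to residuals from an AR(1) conditional mean; the parameter values below are illustrative assumptions, not estimates from the study:

```python
import random

def garch11_variance(returns, omega, alpha, beta, phi=0.0):
    """Conditional variances from a GARCH(1,1) model whose mean
    equation is AR(1) with coefficient `phi` (a positive phi mimics
    the serial correlation induced by non-synchronous trading)."""
    # Residuals from the AR(1) conditional mean.
    eps = [returns[0]] + [returns[t] - phi * returns[t - 1]
                          for t in range(1, len(returns))]
    # Start the recursion at the unconditional variance.
    h = [omega / (1.0 - alpha - beta)]
    for t in range(1, len(eps)):
        h.append(omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1])
    return h

random.seed(1)
rets = [random.gauss(0.0, 0.02) for _ in range(500)]  # stand-in portfolio returns
h = garch11_variance(rets, omega=1e-5, alpha=0.08, beta=0.9, phi=0.2)
```

In the study's setting, small-firm portfolios would be given a larger `phi`, so the size effect shows up through the interaction of the mean and variance equations.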
Abstract:
Information systems have developed to the stage that there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed, which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
Abstract:
This thesis examines the effect of rights issue announcements on the stock prices of companies listed on the Kuala Lumpur Stock Exchange (KLSE) between 1987 and 1996. The emphasis is on establishing whether the KLSE is semi-strongly efficient with respect to the announcement of rights issues, and on checking whether the implications of corporate finance theories for the effect of an event can be supported in the context of an emerging market. Once the effect is established, potential determinants of abnormal returns identified by previous empirical work and corporate finance theory are analysed. By examining 70 companies making clean rights issue announcements, this thesis will hopefully shed light on some important issues in long-term corporate financing. Event study analysis is used to check the efficiency of the Malaysian stock market, while cross-sectional regression analysis is executed to identify possible explanatory variables for the effect of the rights issue announcements. To ensure the results presented are not contaminated, econometric and statistical issues raised in both analyses have been taken into account. Given the small amount of empirical research conducted in this part of the world, the results of this study will hopefully be of use to investors, security analysts, corporate financial managers, regulators and policy makers, as well as those interested in capital-market-based research on an emerging market. It is found that the Malaysian stock market is not semi-strongly efficient, since there exists a persistent non-zero abnormal return. This finding is not consistent with the hypothesis that security returns adjust rapidly to reflect new information. It may be that the result is influenced by the sample, which consists mainly of below-average-size companies that tend to be thinly traded. Nevertheless, these issues have been addressed.
Another important issue emerging from the study is that there is some evidence of insider trading activity in this market. In addition to these findings, when the effect of the rights issue announcements is compared with the implications of corporate finance theories for predicting the sign of abnormal returns, the signalling model, the asymmetric information model, the perfect substitution hypothesis and Scholes' information hypothesis cannot be supported.
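The event study machinery behind the abnormal-return results above can be sketched with a market-model regression; the return series and the event-window effect injected here are synthetic assumptions for illustration:

```python
def market_model_car(stock, market, est_end, win_start, win_end):
    """Cumulative abnormal return over [win_start, win_end) using a
    market model (OLS of stock on market) fitted on [0, est_end)."""
    xs, ys = market[:est_end], stock[:est_end]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    alpha = my - beta * mx
    # Abnormal return = actual return minus the model's expected return.
    ars = [stock[t] - (alpha + beta * market[t])
           for t in range(win_start, win_end)]
    return sum(ars)

# Synthetic data: the stock tracks the market exactly in the estimation
# period, then earns an extra 0.5% per day over a 10-day event window.
market = [0.01 if t % 2 == 0 else -0.01 for t in range(120)]
stock = [0.002 + 0.8 * m for m in market]
for t in range(100, 110):
    stock[t] += 0.005
car = market_model_car(stock, market, est_end=100, win_start=100, win_end=110)
# car is approximately 0.05 (10 days x 0.005)
```

A persistent non-zero CAR of this kind, averaged across events, is what the thesis interprets as evidence against semi-strong efficiency.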
Abstract:
The research concerned the assessment of the pathways utilized by heavy metal pollutants in urban stormwater runoff. A separately sewered urban residential catchment of approximately 107 hectares in Chelmsley Wood, north-east Birmingham, was the subject of the field investigation. The catchment area, almost entirely residential, had no immediate industrial heavy metal input; however, industry was situated to the north of the catchment. The catchment was bounded on its northern and eastern sides by the M6 motorway, which was believed to contribute to aerial deposition. Metal inputs to the ground surface were assumed to be confined to normal suburban activities, namely aerial deposition, vehicular activity and anthropogenic activities. A programme of field work was undertaken over a 12-month period, from July 1983 to July 1984. Monthly deposition rates were monitored using a network of deposit canisters, and roadside sediment and soil samples were taken. Stormwater samples were obtained for 19 separate events. All samples were analysed for iron, lead, zinc, copper, chromium, nickel and cadmium content. Rainfall was recorded on site, with additional meteorological data obtained from local Meteorological Offices. A simple conceptual model designed for the catchment was used to substantiate hypotheses derived from site investigations and the literature, and to investigate the pathways utilized for the transportation of heavy metals throughout the catchment.
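A simple conceptual model of the kind described can be sketched with the classic exponential build-up/wash-off pair; the functional forms and every parameter value below are illustrative assumptions, not the thesis's calibrated model:

```python
import math

def surface_load(days_dry, max_load, buildup_k):
    """Exponential build-up of metal load on the surface between
    storms, saturating at `max_load`."""
    return max_load * (1.0 - math.exp(-buildup_k * days_dry))

def washoff(load, rainfall_mm, washoff_k):
    """Mass of the accumulated load mobilised by a storm event,
    increasing with rainfall depth."""
    return load * (1.0 - math.exp(-washoff_k * rainfall_mm))

# Hypothetical inter-event accumulation and a 10 mm storm.
load = surface_load(days_dry=7, max_load=120.0, buildup_k=0.2)
removed = washoff(load, rainfall_mm=10.0, washoff_k=0.1)
```

Monitoring deposition rates and event-mean concentrations, as in the field programme, is what would constrain the two rate constants in practice.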
Abstract:
This paper develops and tests a learning organization model derived from the HRM and dynamic capability literatures in order to ascertain the model's applicability across divergent global contexts. We define a learning organization as one capable of achieving ongoing strategic renewal, arguing, based on dynamic capability theory, that the model has three necessary antecedents: HRM focus, developmental orientation and customer-facing remit. Drawing on a sample comprising nearly 6000 organizations across 15 countries, we show that learning organizations exhibit higher performance than their less learning-inclined counterparts. We also demonstrate that innovation fully mediates the relationship between our conceptualization of the learning organization and organizational performance in 11 of the 15 countries we examined. To our knowledge, this is the first time these questions have been tested in a major cross-global study, and our work contributes to both the HRM and dynamic capability literatures, especially where the focus is the applicability of best-practice parameters across national boundaries.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
The financial community is well aware that continued underfunding of state and local government pension plans poses many public policy and fiduciary management concerns. However, a well-defined theoretical rationale has not been developed to explain why and how public sector pension plans underfund. This study uses three methods: a survey of national pension experts, an incomplete covariance panel method, and field interviews. A survey of national public sector pension experts was conducted to provide a conceptual framework by which underfunding could be evaluated. Experts suggest that plan design, fiscal stress, and political culture factors affect underfunding. However, experts do not agree with previous research findings that unions actively pursue underfunding to secure current wage increases. Within the conceptual framework and determinants identified by experts, several empirical regularities are documented for the first time. An analysis of 173 local government pension plans, observed from 1987 to 1992, was conducted. Findings indicate that underfunding occurs in plans that have lower retirement ages or increased costs due to benefit enhancements, when the sponsor faces current-year operating deficits, or when a local government relies heavily on inelastic revenue sources. Results also suggest that elected officials artificially inflate interest rate assumptions to reduce current pension costs, consequently shifting these costs to future generations. In concurrence with some experts, no data support the assumption that highly unionized employees secure more funding than less unionized employees. Empirical results provide satisfactory but not overwhelming statistical power, and only minor predictive capacity. To further explore why underfunding occurs, field interviews were carried out with 62 local government officials.
Practitioners indicated that perceived fiscal stress, the willingness of policymakers to advance funding, bargaining strategies used by union officials, apathy among employees and retirees, pension board composition, and the level of influence of internal pension experts all have an impact on funding outcomes. A pension funding process model was posited by triangulating the expert survey, empirical findings, and field survey results. This funding process model should help shape and refine our theoretical knowledge of state and local government pension underfunding in the future.
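The mechanism behind the interest-rate finding above, that a higher assumed return shrinks the present value of promised benefits and hence the contribution required today, can be illustrated with a simple annuity calculation; the cash flows and rates are hypothetical:

```python
def pv_of_benefits(annual_benefit, years, rate):
    """Present value of a level stream of pension benefit payments,
    discounted at the plan's assumed rate of return."""
    return sum(annual_benefit / (1.0 + rate) ** t
               for t in range(1, years + 1))

low_rate_pv = pv_of_benefits(1_000_000, 30, 0.06)   # conservative assumption
high_rate_pv = pv_of_benefits(1_000_000, 30, 0.08)  # inflated assumption
# Raising the assumed rate from 6% to 8% reduces the reported liability,
# and with it the actuarially required contribution, shifting costs forward.
```

The gap between the two figures is exactly the kind of cost-shifting to future generations that the empirical results point to.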
Abstract:
Crash reduction factors (CRFs) are used to estimate the number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach, which suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop SPFs for the different functional classes of the Florida State Highway System. Crash data from 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs were developed for both rural and urban roadway categories. The modeling data were based on one-mile segments containing homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationships between crashes and traffic exposure are nonlinear: crashes increase with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for each roadway category. The best model was selected for each category based on a combination of the Likelihood Ratio test, the Vuong statistical test, and Akaike's Information Criterion (AIC).
The NBRM was found to be appropriate for only one category, and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial model generally provides a better fit to the data than the Poisson model. In addition, the ZINB model was found to give the best fit when the count data exhibit excess zeros and over-dispersion, as was the case for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. However, with improved traffic and crash data quality, the crash prediction power of SPF models may be further improved.
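The EB adjustment step described in this abstract can be sketched numerically. The SPF prediction, dispersion parameter, and crash count below are hypothetical, and the weighting follows the standard negative-binomial EB form (Hauer's w = 1 / (1 + mu / k)), which the dissertation's exact formulation may refine:

```python
def eb_expected_crashes(spf_mu, dispersion_k, observed):
    """Empirical Bayes estimate of expected crashes at a site,
    blending the SPF (reference-site) prediction with the site's
    own observed count via a negative-binomial weight."""
    w = 1.0 / (1.0 + spf_mu / dispersion_k)
    return w * spf_mu + (1.0 - w) * observed

# The SPF predicts 4 crashes for comparable reference sites, while the
# candidate treatment site recorded 10; w = 1/3, so the estimate is
# pulled toward the observation, giving 8.0 rather than 10.
estimate = eb_expected_crashes(spf_mu=4.0, dispersion_k=2.0, observed=10)
```

Comparing this EB estimate with the post-treatment count, instead of the raw before-count, is what removes the regression-to-the-mean bias in the CRF.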