19 results for Estimating

in Aston University Research Archive


Relevance: 20.00%

Abstract:

Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce three novel techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.
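
The abstract does not name the three techniques, but for a periodic target such as wind direction a natural family is a conditional mixture of von Mises densities; the sketch below is an illustrative assumption, not the paper's method, and the parameter functions are invented for the example.

```python
# Sketch: p(theta | x) as a mixture of von Mises densities whose
# parameters depend on the conditioning input x (illustrative assumption).
import numpy as np
from scipy.stats import vonmises

def conditional_von_mises_mixture(theta, x, weight_fn, mean_fn, kappa_fn):
    """Evaluate p(theta | x) as a K-component von Mises mixture whose
    weights, mean directions and concentrations are functions of x."""
    w = weight_fn(x)        # shape (K,), non-negative, sums to 1
    mu = mean_fn(x)         # shape (K,), component mean directions
    kappa = kappa_fn(x)     # shape (K,), component concentrations
    return sum(w[k] * vonmises.pdf(theta, kappa[k], loc=mu[k])
               for k in range(len(w)))

# Toy usage: two components whose dominant direction rotates with x.
density = conditional_von_mises_mixture(
    theta=np.pi / 4, x=0.3,
    weight_fn=lambda x: np.array([0.6, 0.4]),
    mean_fn=lambda x: np.array([2 * np.pi * x, np.pi]),
    kappa_fn=lambda x: np.array([4.0, 1.0]))
```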

Relevance: 20.00%

Abstract:

It is well known that one of the obstacles to effective forecasting of exchange rates is heteroscedasticity (non-stationary conditional variance). The autoregressive conditional heteroscedastic (ARCH) model and its variants have been used to estimate a time-dependent variance for many financial time series. However, such models are essentially linear in form and we can ask whether a non-linear model for variance can improve results just as non-linear models (such as neural networks) for the mean have done. In this paper we consider two neural network models for variance estimation. Mixture Density Networks (Bishop 1994, Nix and Weigend 1994) combine a Multi-Layer Perceptron (MLP) and a mixture model to estimate the conditional data density. They are trained using a maximum likelihood approach. However, it is known that maximum likelihood estimates are biased and lead to a systematic under-estimate of variance. More recently, a Bayesian approach to parameter estimation has been developed (Bishop and Qazaz 1996) that shows promise in removing the maximum likelihood bias. However, up to now, this model has not been used for time series prediction. Here we compare these algorithms with two other models to provide benchmark results: a linear model (from the ARIMA family), and a conventional neural network trained with a sum-of-squares error function (which estimates the conditional mean of the time series with a constant variance noise model). This comparison is carried out on daily exchange rate data for five currencies.
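
A minimal sketch of the heteroscedastic likelihood such models optimise: a network that outputs a conditional mean and log-variance is trained by minimising the Gaussian negative log-likelihood. This is the single-component special case of the MDN's mixture likelihood; names are illustrative.

```python
# Sketch: per-point Gaussian negative log-likelihood for a network that
# predicts both mu(x) and log_var(x). Minimising this fits an
# input-dependent variance, unlike a sum-of-squares error, which
# implicitly assumes constant noise variance.
import numpy as np

def gaussian_nll(y, mu, log_var):
    """-log N(y | mu, exp(log_var)), summed over the batch by the caller."""
    return 0.5 * (log_var + (y - mu) ** 2 / np.exp(log_var)
                  + np.log(2 * np.pi))
```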

Relevance: 20.00%

Abstract:

This paper presents a general methodology for estimating and incorporating uncertainty in the controller and forward models for noisy nonlinear control problems. Conditional distribution modeling in a neural network context is used to estimate uncertainty around the prediction of neural network outputs. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localize the possible control solutions to consider. A nonlinear multivariable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis.
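
One plausible reading of the localisation step, sketched under illustrative assumptions (the forward_model interface and the threshold are invented for the example): candidate controls are considered only where the forward model is both accurate and confident, avoiding a full dynamic-programming search.

```python
# Sketch: prune candidate controls using forward-model predictive
# uncertainty, then rank the survivors by predicted tracking error.
import numpy as np

def select_candidate_controls(forward_model, state, candidates, target,
                              max_std=0.5):
    """forward_model(state, u) is assumed to return (predicted next
    state, predictive std). Keep controls the model predicts confidently
    and sort them by distance to the target state."""
    kept = []
    for u in candidates:
        mean, std = forward_model(state, u)
        if np.all(std < max_std):      # stay inside the model's trust region
            kept.append((np.sum((mean - target) ** 2), u))
    return [u for _, u in sorted(kept, key=lambda t: t[0])]
```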

Relevance: 20.00%

Abstract:

Professional English football combines publicly traded ownership shares with an active and observable wagering market. This article utilizes the information from these markets, presenting a model that may be used to estimate the impact of matches on club values. Such information is potentially useful as clubs assess the values of players and coaches based on their anticipated contributions to team performance. The article also illustrates the modelling of ‘binomial events,’ such as win/lose, hire/do not hire or approval/disapproval, and how market-determined price responses illuminate expectations.
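
Two ingredients such a model plausibly needs, sketched with illustrative names: implied match-outcome probabilities recovered from betting odds (normalising away the bookmaker's overround), and the "surprise" of a binomial event, i.e. the realised outcome minus its market-implied probability, against which abnormal share-price returns can be regressed.

```python
# Sketch: wagering-market probabilities and the binomial-event surprise.
def implied_probabilities(odds_home, odds_draw, odds_away):
    """Convert decimal betting odds to outcome probabilities,
    normalising away the bookmaker's margin (overround)."""
    raw = [1 / odds_home, 1 / odds_draw, 1 / odds_away]
    total = sum(raw)
    return [p / total for p in raw]

def event_surprise(outcome, p_expected):
    """Surprise of a binomial event: realised outcome (0 or 1) minus the
    market's prior probability. Regressing post-match abnormal share
    returns on this surprise estimates the value of a win to the club."""
    return outcome - p_expected
```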

Relevance: 20.00%

Abstract:

This paper aims to contribute to the debate about the role of the public sector in stimulating greater use of private sector equity for business start-up and growth in two ways. First, it examines the extent to which the provision of public sector equity finance enables individual firms to raise additional funds in the private sector marketplace. Second, it considers the methodological implications for an economic impact assessment of industrial policy interventions (especially those which include an equity component) at the level of the individual firm. We assess the extent to which there may be indirect positive effects (externalities) associated with public sector financial assistance to individual firms and, if so, how they distort standard evaluation methodologies designed to estimate the level of additionality of that support. The paper draws upon the results of a recent study of the impact of Enterprise Ireland (EI) financial assistance to indigenous Irish industry in the period 2000 to 2002. The paper demonstrates that a process of re-calibration is necessary in estimates of economic impact in order to account for these positive externalities; in this study the result was a ‘boost’ to additionality. In operational and conceptual terms, the study underlines the importance of the relationship between private and public sector sources of equity finance as an important dynamic in the attempt by industrial and regional policy to stimulate the number of firms with viable investment proposals accessing external equity finance.

Relevance: 20.00%

Abstract:

Simple models of time-varying risk premia are used to measure the risk premia in long-term UK government bonds. The parameters of the models can be estimated using nonlinear seemingly unrelated regression (NL-SUR), which permits efficient use of information across the entire yield curve and facilitates the testing of various cross-sectional restrictions. The estimated time-varying premia are found to be substantially different to those estimated using models that assume constant risk premia.
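
A hedged sketch of one standard NL-SUR recipe, two-step feasible GLS: a first-stage nonlinear least squares fit supplies residuals for a cross-equation covariance estimate, which then weights a joint second-stage criterion across maturities. The model interface is an assumption for illustration.

```python
# Sketch: feasible nonlinear SUR. model(params, X) is assumed to return a
# (T, N) matrix of fitted values, one column per bond maturity.
import numpy as np
from scipy.optimize import minimize

def nl_sur(model, params0, X, Y):
    # Stage 1: nonlinear least squares with identity weighting.
    ols = minimize(lambda p: np.sum((Y - model(p, X)) ** 2), params0)
    resid = Y - model(ols.x, X)                     # (T, N) residuals
    sigma_inv = np.linalg.inv(resid.T @ resid / len(Y))
    # Stage 2: minimise the GLS criterion with the estimated covariance,
    # weighting information efficiently across the whole yield curve.
    gls = minimize(lambda p: np.einsum('ti,ij,tj->', Y - model(p, X),
                                       sigma_inv, Y - model(p, X)), params0)
    return gls.x
```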

Relevance: 20.00%

Abstract:

Electrical compound action potentials (ECAPs) of the cochlear nerve are used clinically for quick and efficient cochlear implant parameter setting. The ECAP is the aggregate response of nerve fibres at various distances from the recording electrode, and the magnitude of the ECAP is therefore related to the number of fibres excited by a particular stimulus. Current methods, such as the masker-probe or alternating polarity methods, use the ECAP magnitude at various stimulus levels to estimate the neural threshold, from which the parameters are calculated. However, the correlation between ECAP threshold and perceptual threshold is not always good, with ECAP threshold typically being much higher than perceptual threshold. The lower correlation is partly due to the very different pulse rates used for ECAPs (below 100 Hz) and clinical programs (hundreds of Hz up to several kHz). Here we introduce a new method of estimating ECAP threshold for cochlear implants based upon the variability of the response. At neural threshold, where some but not all fibres respond, the response differs from trial to trial. This inter-trial variability can be detected on top of the constant variability of the system noise. The large stimulus artefact, which requires additional trials for artefact rejection in the standard ECAP magnitude methods, is not consequential, as it has little variability. The variability method therefore consists of simply presenting a pulse and recording the ECAP, and as such is quicker than other methods. It also has the potential to be run at high rates like clinical programs, potentially improving the correlation with behavioural threshold. Preliminary data are presented showing a detectable variability increase shortly after probe offset, at probe levels much lower than those producing a detectable ECAP magnitude. Care must be taken, however, to avoid saturation of the recording amplifier; in our experiments we found a gain of 300 to be optimal.
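
A minimal sketch of the variability statistic, under assumed windowing: compare inter-trial variance after the probe against the baseline system-noise variance; a ratio well above 1 signals that some (but not all) fibres are responding. The window indices are illustrative assumptions.

```python
# Sketch: inter-trial variability ratio for ECAP threshold detection.
import numpy as np

def variability_ratio(recordings, baseline_window, response_window):
    """recordings: array of shape (n_trials, n_samples) from repeated
    presentations of the same probe. Returns post-stimulus inter-trial
    variance divided by baseline (system noise) inter-trial variance."""
    var_base = recordings[:, baseline_window].var(axis=0).mean()
    var_resp = recordings[:, response_window].var(axis=0).mean()
    return var_resp / var_base
```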

Relevance: 20.00%

Abstract:

This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variation of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-)parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of systems which vary in dimensionality and non-linearity: the Ornstein–Uhlenbeck process, whose exact likelihood can be computed analytically; the univariate, highly non-linear stochastic double well; and the multivariate chaotic stochastic Lorenz ’63 (3D) model. As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz ’96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as hybrid Monte Carlo, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and empirically analyse their behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) parts of the model evolution equations using our new methods.
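
The Ornstein–Uhlenbeck benchmark is the one case here with a closed-form exact likelihood, which makes a compact illustration: simulate the exact discretised transition of dx = -theta*x dt + sigma dW and recover the drift and diffusion parameters analytically. Parameter values below are illustrative.

```python
# Sketch: simulate an OU process via its exact AR(1) transition, then
# recover theta and sigma from the closed-form maximum likelihood fit.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, dt, n = 2.0, 1.0, 0.01, 50_000

a = np.exp(-theta * dt)                             # exact AR(1) coefficient
noise_sd = sigma * np.sqrt((1 - a**2) / (2 * theta))
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = a * x[t] + noise_sd * rng.standard_normal()

# Closed-form MLE: AR(1) coefficient, then invert the discretisation.
a_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
theta_hat = -np.log(a_hat) / dt
sigma2_hat = np.var(x[1:] - a_hat * x[:-1]) * 2 * theta_hat / (1 - a_hat**2)
print(theta_hat, np.sqrt(sigma2_hat))               # approx 2.0 and 1.0
```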

Relevance: 20.00%

Abstract:

Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, they cannot provide adequate estimates of effort and hence cost for such projects. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method.

The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects.

The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
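
For reference, the basic COCOMO effort equation that JSD-COCOMO ultimately feeds: the organic-mode coefficients below are Boehm's published values, and treating a JSD-derived size count as KDSI is the illustrative step.

```python
# Sketch: basic COCOMO, organic mode. Size in thousands of delivered
# source instructions (KDSI) maps to effort in person-months.
def cocomo_effort(kdsi, a=2.4, b=1.05):
    """Effort (person-months) = a * KDSI^b."""
    return a * kdsi ** b

print(cocomo_effort(32))  # a 32 KDSI project -> roughly 91 person-months
```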

Relevance: 20.00%

Abstract:

In this paper we investigate whether consideration of store-level heterogeneity in marketing mix effects improves the accuracy of the marketing mix elasticities, fit, and forecasting accuracy of the widely applied SCAN*PRO model of store sales. Models with continuous and discrete representations of heterogeneity, estimated using hierarchical Bayes (HB) and finite mixture (FM) techniques, respectively, are empirically compared to the original model, which does not account for store-level heterogeneity in marketing mix effects, and is estimated using ordinary least squares (OLS). The empirical comparisons are conducted in two contexts: Dutch store-level scanner data for the shampoo product category, and an extensive simulation experiment. The simulation investigates how between- and within-segment variance in marketing mix effects, error variance, the number of weeks of data, and the number of stores impact the accuracy of marketing mix elasticities, model fit, and forecasting accuracy. Contrary to expectations, accommodating store-level heterogeneity does not improve the accuracy of marketing mix elasticities relative to the homogeneous SCAN*PRO model, suggesting that little may be lost by employing the original homogeneous SCAN*PRO model estimated using ordinary least squares. Improvements in fit and forecasting accuracy are also fairly modest. We pursue an explanation for this result since research in other contexts has shown clear advantages from assuming some type of heterogeneity in market response models. In an Afterthought section, we comment on the controversial nature of our result, distinguishing factors inherent to household-level data and associated models vs. general store-level data and associated models vs. the unique SCAN*PRO model specification.
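
A sketch of the homogeneous benchmark in log-linear form, estimated by OLS: the regressors (log own-price index, feature and display indicators) are standard SCAN*PRO ingredients, while the function interface and variable names are illustrative.

```python
# Sketch: homogeneous SCAN*PRO-style store sales model fitted by OLS.
import numpy as np

def fit_scanpro_ols(log_sales, log_price_index, feature, display):
    """Regress log unit sales on the log own-price index and 0/1
    promotion indicators; the price coefficient is the single,
    store-homogeneous price elasticity."""
    X = np.column_stack([np.ones_like(log_sales),
                         log_price_index, feature, display])
    coef, *_ = np.linalg.lstsq(X, log_sales, rcond=None)
    return {"intercept": coef[0], "price_elasticity": coef[1],
            "feature_lift": coef[2], "display_lift": coef[3]}
```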

Relevance: 20.00%

Abstract:

This paper explores the potential for cost savings in the General Practice units of a Primary Care Trust (PCT) in the UK. We have used Data Envelopment Analysis (DEA) to identify benchmark Practices, which offer the lowest aggregate referral and drugs costs controlling for the number, age, gender, and deprivation level of the patients registered with each Practice. For the remaining, non-benchmark Practices, estimates of the potential for savings on referral and drug costs were obtained. Such savings could be delivered through a combination of the following actions: (i) reducing the levels of referrals and prescriptions without affecting their mix (£15.74m of savings were identified, representing 6.4% of total expenditure); (ii) switching between inpatient and outpatient referrals and/or drug treatment to exploit differences in their unit costs (£10.61m of savings, representing 4.3% of total expenditure); (iii) seeking a different profile of referral and drug unit costs (£11.81m of savings, representing 4.8% of total expenditure).
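
A sketch of the input-oriented, constant-returns (CCR) DEA programme behind such benchmarking, solved as a linear programme. The data layout, inputs as referral and drug costs and outputs as case-mix-adjusted patient counts, is an assumption for illustration.

```python
# Sketch: input-oriented CCR DEA efficiency for one unit.
# X: (n_inputs, n_units) input matrix; Y: (n_outputs, n_units) outputs.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, unit):
    """Return theta in (0, 1]: the proportional input contraction a
    best-practice composite of peers could achieve while matching this
    unit's outputs. theta == 1 means the unit is a benchmark."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]              # minimise theta
    A_in = np.c_[-X[:, [unit]], X]           # sum_j l_j x_j <= theta * x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]      # sum_j l_j y_j >= y_o
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, unit]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.fun
```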

Relevance: 20.00%

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.