15 results for Nonparametric regression techniques

in the Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The kinematic mapping of a rigid open-link manipulator is a homomorphism between Lie groups. The homomorphism has solution groups that act on an inverse kinematic solution element. A canonical representation of solution group operators that act on a solution element of three and seven degree-of-freedom (dof) dextrous manipulators is determined by geometric analysis. Seven canonical solution groups are determined for the seven-dof Robotics Research K-1207 and Hollerbach arms. The solution element of a dextrous manipulator is a collection of trivial fibre bundles with solution fibres homotopic to the torus. If fibre solutions are parameterised by a scalar, a direct inverse function that maps the scalar and Cartesian base space coordinates to solution element fibre coordinates may be defined. A direct inverse parameterisation of a solution element may be approximated by a local linear map generated by an inverse augmented Jacobian correction of a linear interpolation. The action of canonical solution group operators on a local linear approximation of the solution element of inverse kinematics of dextrous manipulators generates cyclical solutions. The solution representation is proposed as a model of inverse kinematic transformations in primate nervous systems. Simultaneous calibration of a composition of stereo-camera and manipulator kinematic models is under-determined by equi-output parameter groups in the composition of stereo-camera and Denavit-Hartenberg (DH) models. An error measure for simultaneous calibration of a composition of models is derived, and parameter subsets with no equi-output groups are determined by numerical experiments to simultaneously calibrate the composition of homogeneous or pan-tilt stereo-camera with DH models.
To accelerate exact Newton second-order re-calibration of DH parameters after a sequential calibration of stereo-camera and DH parameters, an optimal numerical evaluation of DH matrix first-order and second-order error derivatives with respect to a re-calibration error function is derived, implemented and tested. A distributed object environment for point-and-click image-based tele-command of manipulators and stereo-cameras is specified and implemented that supports rapid prototyping of numerical experiments in distributed system control. The environment is validated by a hierarchical k-fold cross-validated calibration to Cartesian space of a radial basis function regression correction of an affine stereo model. Basic design and performance requirements are defined for scalable virtual micro-kernels that broker inter-Java-virtual-machine remote method invocations between components of secure, manageable, fault-tolerant, open, distributed, agile, Total Quality Managed, ISO 9000+ conformant, Just-in-Time manufacturing systems.
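The inverse-Jacobian correction idea can be illustrated in miniature with Newton iteration on a planar two-link arm, a far simpler system than the seven-dof manipulators studied in the thesis. This is a hedged sketch; all function names and the unit link lengths are hypothetical:

```python
import math

def fk(t1, t2, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm: joint angles -> (x, y)."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

def ik_newton(target, t1=0.3, t2=0.3, l1=1.0, l2=1.0, iters=100, tol=1e-10):
    """Iterative inverse kinematics: correct the joint angles with the
    inverse Jacobian until the Cartesian error is negligible."""
    tx, ty = target
    for _ in range(iters):
        x, y = fk(t1, t2, l1, l2)
        ex, ey = tx - x, ty - y
        if ex * ex + ey * ey < tol * tol:
            break
        # Jacobian of (x, y) with respect to (t1, t2)
        j11 = -l1 * math.sin(t1) - l2 * math.sin(t1 + t2)
        j12 = -l2 * math.sin(t1 + t2)
        j21 = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
        j22 = l2 * math.cos(t1 + t2)
        det = j11 * j22 - j12 * j21          # equals l1*l2*sin(t2)
        if abs(det) < 1e-12:                 # near a singularity: avoid blow-up
            det = math.copysign(1e-12, det or 1.0)
        # Newton step: dtheta = J^{-1} * error
        t1 += ( j22 * ex - j12 * ey) / det
        t2 += (-j21 * ex + j11 * ey) / det
    return t1, t2
```

Starting near the answer, the iteration converges quadratically; the thesis's augmented-Jacobian scheme additionally parameterises the redundant degrees of freedom, which this two-dof toy does not have.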

Relevance:

80.00%

Publisher:

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely, recurrent neural networks and kernel recursive least squares regression - techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.
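The paper's kernel recursive least squares predictor is beyond a short sketch, but its "finite memory" character can be illustrated with a simpler relative, Nadaraya-Watson kernel regression over a sliding window of past observations. All names and parameter values here are hypothetical:

```python
import math

def nw_forecast(history, bandwidth=0.5, memory=50):
    """One-step-ahead forecast by Nadaraya-Watson kernel regression of
    x[t+1] on x[t], using only the last `memory` observed pairs -- a
    finite-memory kernel predictor."""
    pairs = list(zip(history[:-1], history[1:]))[-memory:]
    x_now = history[-1]
    num = den = 0.0
    for x_prev, x_next in pairs:
        # Gaussian kernel weight: similar past states get more influence
        w = math.exp(-0.5 * ((x_now - x_prev) / bandwidth) ** 2)
        num += w * x_next
        den += w
    return num / den if den > 0.0 else x_now
```

A naive random walk benchmark of the kind the paper compares against would simply return `history[-1]`.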

Relevance:

80.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to investigate what sort of people become social entrepreneurs, and in what way they differ from business entrepreneurs. More importantly, it investigates in what socio-economic context entrepreneurial individuals are more likely to become social rather than business entrepreneurs. These questions are important for policy because there has been a shift from direct to indirect delivery of many public services in the UK, requiring a professional approach to social enterprise.
Design/methodology/approach – Evidence is presented from the Global Entrepreneurship Monitor (GEM) UK survey, based upon a representative sample of around 21,000 adults aged between 16 and 64 years interviewed in 2009. The authors use logistic multivariate regression techniques to identify differences between business and social entrepreneurs in demographic characteristics, effort, aspiration, use of resources, industry choice, deprivation, and organisational structure.
Findings – The results show that the odds of an early-stage entrepreneur being a social rather than a business entrepreneur are reduced if they are from an ethnic minority, if they work ten hours or more per week on the venture, and if they have a family business background; while they are increased if they have higher levels of education and if they are a settled in-migrant to their area. While social entrepreneurs are more likely than business entrepreneurs to be women, this is due to gender-based differences in time commitment to the venture. In addition, the more deprived the community they live in, the more likely women entrepreneurs are to be social than business entrepreneurs. However, this does not hold in the most deprived areas, where we argue civic society is weakest and therefore not conducive to supporting any form of entrepreneurial endeavour based on community engagement.
Originality/value – The paper's findings suggest that women may be motivated to become social entrepreneurs by a desire to improve the socio-economic environment of the community in which they live and see social enterprise creation as an appropriate vehicle with which to address local problems.

Relevance:

80.00%

Publisher:

Abstract:

This is the first study to provide comprehensive analyses of the relative performance of both socially responsible investment (SRI) and Islamic mutual funds. The analysis proceeds in two stages. In the first, the performance of the two categories of funds is measured using partial frontier methods. In the second stage, we use quantile regression techniques. By combining two variants of the Free Disposal Hull (FDH) methods (order-m and order-α) in the first stage of analysis and quantile regression in the second stage, we provide detailed analyses of the impact of different covariates across methods and across different quantiles. In spite of the differences in the screening criteria and portfolio management of both types of funds, variation in performance is only found for some of the quantiles of the conditional distribution of mutual fund performance. We establish that for the most inefficient funds the superior performance of SRI funds is significant. In contrast, for the best mutual funds this evidence vanishes and Islamic funds even perform better than SRI funds. These results show the benefits of performing the analysis using quantile regression.
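The second-stage quantile regression rests on the check (pinball) loss: the τ-th quantile is the value that minimises expected pinball loss. A minimal sketch of that principle for an unconditional sample (hypothetical names; a full quantile regression would minimise this loss over regression coefficients rather than a single location):

```python
def pinball_loss(tau, y, q):
    """Check (pinball) loss of candidate quantile q for sample y:
    under-predictions are weighted tau, over-predictions (1 - tau)."""
    total = 0.0
    for yi in y:
        u = yi - q
        total += u * (tau - (1.0 if u < 0 else 0.0))
    return total

def quantile_by_minimisation(tau, y):
    """For a finite sample the minimiser of the pinball loss lies at one
    of the data points, so a search over the sample suffices."""
    return min(y, key=lambda q: pinball_loss(tau, y, q))
```

With τ = 0.5 the loss reduces to (half) the sum of absolute deviations, recovering the median.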

Relevance:

80.00%

Publisher:

Abstract:

Estimation of economic relationships often requires imposition of constraints such as positivity or monotonicity on each observation. Methods to impose such constraints, however, vary depending upon the estimation technique employed. We describe a general methodology to impose (observation-specific) constraints for the class of linear regression estimators using a method known as constraint weighted bootstrapping. While this method has received attention in the nonparametric regression literature, we show how it can be applied for both parametric and nonparametric estimators. A benefit of this method is that imposing numerous constraints simultaneously can be performed seamlessly. We apply this method to Norwegian dairy farm data to estimate both unconstrained and constrained parametric and nonparametric models.
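The paper's constraint weighted bootstrapping method reweights observations, and a faithful implementation is beyond a short sketch. As a simpler, self-contained illustration of imposing an observation-wise monotonicity constraint on fitted values, here is the pool-adjacent-violators algorithm (PAVA) for isotonic regression — a different but related technique, named plainly; all names are hypothetical:

```python
def pava(y):
    """Pool-adjacent-violators: the least-squares fit to y that is
    non-decreasing. Adjacent blocks whose means violate monotonicity
    are merged and replaced by their pooled mean."""
    sums, counts = [], []          # running blocks of (sum, count)
    for v in y:
        s, c = float(v), 1
        # merge backwards while the previous block's mean exceeds ours
        while sums and sums[-1] / counts[-1] > s / c:
            s += sums.pop()
            c += counts.pop()
        sums.append(s)
        counts.append(c)
    fitted = []
    for s, c in zip(sums, counts):
        fitted.extend([s / c] * c)
    return fitted
```

Each fitted value satisfies the constraint by construction, which is the same "per-observation constraint" idea the paper pursues for general linear regression estimators.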

Relevance:

80.00%

Publisher:

Abstract:

This is the first study to provide comprehensive analyses of the relative performance of both socially responsible investment (SRI) and Islamic mutual funds. The analysis proceeds in two stages. In the first, the performance of the two categories of funds is measured using partial frontier methods. In the second stage, we use quantile regression techniques. By combining two variants of the Free Disposal Hull (FDH) methods (order-m and order-α) in the first stage of analysis and quantile regression in the second stage, we provide detailed analyses of the impact of different covariates across methods and across different quantiles. In spite of the differences in the screening criteria and portfolio management of both types of funds, variation in the performance is only found for some of the quantiles of the conditional distribution of mutual fund performance. We established that for the most inefficient funds the superior performance of SRI funds is significant. In contrast, for the best mutual funds this evidence vanished and even Islamic funds perform better than SRI. These results show the benefits of performing the analysis using quantile regression. © 2013 Elsevier B.V.

Relevance:

30.00%

Publisher:

Abstract:

The Bayesian analysis of neural networks is difficult because the prior over functions has a complex form, leading to implementations that either make approximations or use Monte Carlo integration techniques. In this paper I investigate the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis to be carried out exactly using matrix operations. The method has been tested on two challenging problems and has produced excellent results.
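The exact matrix-based prediction described here reduces, for the posterior mean, to k*ᵀ(K + σ²I)⁻¹y. A self-contained sketch with a squared-exponential (RBF) kernel follows; the function names and the tiny Gaussian-elimination solver are illustrative only, suitable for small problems:

```python
import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential covariance between two scalar inputs."""
    return math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6, length=1.0):
    """Exact GP posterior mean at x_star: k_*^T (K + noise * I)^{-1} y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], length) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, list(ys))
    return sum(rbf(x_star, xi, length) * ai for xi, ai in zip(xs, alpha))
```

No Monte Carlo integration or approximation is involved: the Gaussian prior over functions makes the predictive mean an exact linear-algebra computation, which is the paper's point.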

Relevance:

30.00%

Publisher:

Abstract:

Researchers often use 3-way interactions in moderated multiple regression analysis to test the joint effect of 3 independent variables on a dependent variable. However, further probing of significant interaction terms varies considerably and is sometimes error prone. The authors developed a significance test for slope differences in 3-way interactions and illustrate its importance for testing psychological hypotheses. Monte Carlo simulations revealed that sample size, magnitude of the slope difference, and data reliability affected test power. Application of the test to published data yielded detection of some slope differences that were undetected by alternative probing techniques and led to changes of results and conclusions. The authors conclude by discussing the test's applicability for psychological research. Copyright 2006 by the American Psychological Association.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a study of three techniques to improve the performance of some standard forecasting models, with application to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We have empirically compared these approaches and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in this thesis. Third, with regard to noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. This thesis proposes a novel algorithm to infer the parameters of Student-t noise models. The method is an extension of earlier work for models that are linear in parameters to the non-linear multilayer perceptron. Therefore, the proposed method broadens the range of models that can use a Student-t noise distribution. Because these techniques cannot stand alone, they must be combined with prediction models to improve their performance. We combined these techniques with some standard forecasting models: the multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provided good improvements to prediction performance.

Relevance:

30.00%

Publisher:

Abstract:

Non-linear relationships are common in microbiological research and often necessitate the use of the statistical techniques of non-linear regression or curve fitting. In some circumstances, the investigator may wish to fit an exponential model to the data, i.e., to test the hypothesis that a quantity Y either increases or decays exponentially with increasing X. This type of model is straightforward to fit, as taking logarithms of the Y variable linearises the relationship, which can then be treated by the methods of linear regression.
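The log-transform approach described above can be sketched directly: taking logs turns y = a·exp(bx) into the linear model ln y = ln a + bx, which ordinary least squares handles. Function names here are hypothetical:

```python
import math

def fit_exponential(xs, ys):
    """Fit y = a * exp(b * x) by ordinary least squares on log(y):
    ln(y) = ln(a) + b * x is linear in x. Requires all y > 0."""
    n = len(xs)
    ls = [math.log(y) for y in ys]
    mx = sum(xs) / n
    ml = sum(ls) / n
    # slope and intercept of the linearised relationship
    b = (sum((x - mx) * (l - ml) for x, l in zip(xs, ls))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(ml - b * mx)
    return a, b
```

One caveat the passage implies but is worth stating: the transformation also rescales the error structure, so log-space least squares weights proportional errors, not absolute ones.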

Relevance:

30.00%

Publisher:

Abstract:

1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry.
2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance, and the objective may be to provide a 'curve of best fit' for predictive purposes. In such cases, the fitting of a general polynomial-type curve may be the best approach.
3. An investigator may have a specific model in mind that relates Y to X, and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves.
4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required.
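Fitting a general polynomial "curve of best fit" reduces to linear least squares on powers of X. A minimal normal-equations sketch follows (hypothetical names; the normal equations become ill-conditioned for high degrees, so this is adequate only for the low-degree fits a clinical study would use):

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations
    (X^T X) coef = X^T y for the Vandermonde design matrix."""
    n = degree + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    coef = [0.0] * n
    for i in range(n - 1, -1, -1):
        coef[i] = (M[i][n]
                   - sum(M[i][c] * coef[c] for c in range(i + 1, n))) / M[i][i]
    return coef  # coef[k] multiplies x**k
```

Transformable models (point 3) and genuinely non-linear models (point 4) need, respectively, a change of variable before this step or an iterative non-linear optimiser instead of it.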

Relevance:

30.00%

Publisher:

Abstract:

Background: Evaluation of anterior chamber depth (ACD) can potentially identify those patients at risk of angle-closure glaucoma. We aimed to: compare van Herick's limbal chamber depth (LCDvh) grades with LCDorb grades calculated from the Orbscan anterior chamber angle values; determine Smith's technique ACD and compare to Orbscan ACD; and calculate a constant for Smith's technique using Orbscan ACD.
Methods: Eighty participants free from eye disease underwent LCDvh grading, Smith's technique ACD, and Orbscan anterior chamber angle and ACD measurement.
Results: LCDvh overestimated grades by a mean of 0.25 (coefficient of repeatability [CR] 1.59) compared to LCDorb. Smith's technique (constant 1.40 and 1.31) overestimated ACD by a mean of 0.33 mm (CR 0.82) and 0.12 mm (CR 0.79) respectively, compared to Orbscan. Using linear regression, we determined a constant of 1.22 for Smith's slit-length method.
Conclusions: Smith's technique (constant 1.31) provided an ACD that is closer to that found with Orbscan compared to a constant of 1.40 or LCDvh. Our findings also suggest that Smith's technique would produce values closer to that obtained with Orbscan by using a constant of 1.22.
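A constant of the kind this study derives can be estimated by a no-intercept least-squares regression of reference ACD on slit length. The sketch below uses that standard through-the-origin estimator with made-up numbers; the study's actual regression specification is not detailed in the abstract, so treat this purely as an illustration:

```python
def constant_through_origin(slit_lengths, acds):
    """Least-squares slope k for the no-intercept model ACD = k * slit
    length: k = sum(s * a) / sum(s * s)."""
    num = sum(s * a for s, a in zip(slit_lengths, acds))
    den = sum(s * s for s in slit_lengths)
    return num / den
```

Regression through the origin is appropriate here because a zero slit length must correspond to zero estimated depth.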

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a constrained nonparametric method of estimating an input distance function. A regression function is estimated via kernel methods without functional form assumptions. To guarantee that the estimated input distance function satisfies its properties, monotonicity constraints are imposed on the regression surface via the constraint weighted bootstrapping method borrowed from the statistics literature. The first, second, and cross partial analytical derivatives of the estimated input distance function are derived, and thus the elasticities measuring input substitutability can be computed from them. The method is then applied to a cross-section of 3,249 Norwegian timber producers.

Relevance:

30.00%

Publisher:

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression - techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies. © 2010 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The Dirichlet process mixture model (DPMM) is a ubiquitous, flexible Bayesian nonparametric statistical model. However, full probabilistic inference in this model is analytically intractable, so that computationally intensive techniques such as Gibbs sampling are required. As a result, DPMM-based methods, which have considerable potential, are restricted to applications in which computational resources and time for inference are plentiful. For example, they would not be practical for digital signal processing on embedded hardware, where computational resources are at a serious premium. Here, we develop a simplified yet statistically rigorous approximate maximum a posteriori (MAP) inference algorithm for DPMMs. This algorithm is as simple as DP-means clustering, solves the MAP problem as well as Gibbs sampling, while requiring only a fraction of the computational effort. (For freely available code that implements the MAP-DP algorithm for Gaussian mixtures see http://www.maxlittle.net/.) Unlike related small variance asymptotics (SVA), our method is non-degenerate and so inherits the "rich get richer" property of the Dirichlet process. It also retains a non-degenerate closed-form likelihood, which enables out-of-sample calculations and the use of standard tools such as cross-validation. We illustrate the benefits of our algorithm on a range of examples and contrast it to variational, SVA and sampling approaches from both a computational complexity perspective as well as in terms of clustering performance. We demonstrate the wide applicability of our approach by presenting an approximate MAP inference method for the infinite hidden Markov model whose performance contrasts favorably with a recently proposed hybrid SVA approach.
Similarly, we show how our algorithm can be applied to a semiparametric mixed-effects regression model where the random effects distribution is modelled using an infinite mixture model, as used in longitudinal progression modelling in population health science. Finally, we propose directions for future research on approximate MAP inference in Bayesian nonparametrics.
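The abstract calls the proposed MAP-DP algorithm "as simple as DP-means clustering". For readers unfamiliar with DP-means, here is a minimal one-dimensional sketch of that baseline (not MAP-DP itself; names are hypothetical, and `lam` is the penalty governing how readily new clusters are created):

```python
def dp_means(points, lam, iters=100):
    """DP-means: k-means-like clustering in which a point whose squared
    distance to every existing centre exceeds lam spawns a new cluster,
    so the number of clusters is inferred rather than fixed."""
    centres = [points[0]]
    assign = [0] * len(points)
    for _ in range(iters):
        changed = False
        for i, x in enumerate(points):
            d2 = [(x - c) ** 2 for c in centres]
            j = min(range(len(centres)), key=lambda k: d2[k])
            if d2[j] > lam:                 # too far from everything: new cluster
                centres.append(x)
                j = len(centres) - 1
            if assign[i] != j:
                assign[i] = j
                changed = True
        # update each centre to the mean of its members
        for j in range(len(centres)):
            members = [points[i] for i in range(len(points)) if assign[i] == j]
            if members:
                centres[j] = sum(members) / len(members)
        if not changed:
            break
    return centres, assign
```

Unlike this degenerate small-variance limit, the paper's MAP-DP keeps a proper likelihood and the Dirichlet process's "rich get richer" cluster weighting.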