57 results for SQUARES


Relevance:

20.00%

Publisher:

Abstract:

Adaptive filters are increasingly studied for their suitability for complex and non-stationary signals. Many adaptive filters utilise a reference input that is used to form an estimate of the noise in the target signal. In this paper we discuss the application of adaptive filters to electroencephalography (EEG) data heavily contaminated by electromyography (EMG). We propose the use of multiple reference inputs instead of the traditional single input. These references are formed using multiple EMG sensors during an EEG experiment. Each reference input is processed and ordered by first determining Pearson's r-squared correlation coefficient; from this a weighting metric is determined and used to scale and order the reference channels according to the paradigm shown in this paper. This paper presents the use and application of the Adaptive-Multi-Reference (AMR) least mean squares adaptive filter in the domain of EEG signal acquisition.
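
The multi-reference idea can be sketched as a cascade of LMS noise cancellers, one per EMG reference, applied in order of their squared correlation with the contaminated channel. The following is a minimal illustrative sketch (not the paper's exact AMR algorithm); the arrays `eeg_contaminated` and `emg` are synthetic stand-ins.

```python
import numpy as np

def multi_reference_lms(primary, refs, mu=0.01, n_taps=8):
    """Minimal multi-reference LMS noise canceller (illustrative sketch).

    primary : 1-D array, EMG-contaminated EEG channel
    refs    : 2-D array (n_refs, n_samples), EMG reference channels
    """
    n_refs, n_samples = refs.shape
    # Order references by squared Pearson correlation with the primary signal,
    # loosely following the weighting idea described in the abstract.
    r2 = np.array([np.corrcoef(primary, r)[0, 1] ** 2 for r in refs])
    order = np.argsort(r2)[::-1]

    w = np.zeros((n_refs, n_taps))           # one FIR weight vector per reference
    cleaned = primary.astype(float).copy()
    for n in range(n_taps, n_samples):
        e = cleaned[n]
        for k in order:                       # apply references in weighted order
            x = refs[k, n - n_taps:n][::-1]   # most recent samples first
            y_hat = w[k] @ x                  # estimated EMG contribution
            e = e - y_hat                     # subtract the estimate
            w[k] += mu * e * x                # LMS weight update
        cleaned[n] = e
    return cleaned

# Synthetic example
rng = np.random.default_rng(0)
emg = rng.standard_normal((2, 2000))
eeg_true = np.sin(np.linspace(0, 20 * np.pi, 2000))
eeg_contaminated = eeg_true + 0.5 * emg[0] + 0.3 * emg[1]
denoised = multi_reference_lms(eeg_contaminated, emg)
```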

Relevance:

20.00%

Publisher:

Abstract:

Phylogenetic generalised least squares (PGLS) is one of the most commonly employed phylogenetic comparative methods. The technique, a modification of generalised least squares, uses knowledge of phylogenetic relationships to produce an estimate of expected covariance in cross-species data. Closely related species are assumed to have more similar traits because of their shared ancestry and hence produce more similar residuals from the least squares regression line. By taking into account the expected covariance structure of these residuals, modified slope and intercept estimates are generated that can account for interspecific autocorrelation due to phylogeny. Here, we provide a basic conceptual background to PGLS, for those unfamiliar with the approach. We describe the requirements for a PGLS analysis and highlight the packages that can be used to implement the method. We show how phylogeny is used to calculate the expected covariance structure in the data and how this is applied to the generalised least squares regression equation. We demonstrate how PGLS can incorporate information about phylogenetic signal, the extent to which closely related species truly are similar, and how it controls for this signal appropriately, thereby negating concerns about unnecessarily ‘correcting’ for phylogeny. In addition to discussing the appropriate way to present the results of PGLS analyses, we highlight some common misconceptions about the approach and commonly encountered problems with the method. These include misunderstandings about what phylogenetic signal refers to in the context of PGLS (residual errors, not the traits themselves), and issues associated with unknown or uncertain phylogeny.
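
At its core, PGLS is a generalised least squares fit in which the residual covariance matrix is supplied by the phylogeny (for example, shared branch lengths under a Brownian-motion model). A minimal numpy sketch, assuming the covariance matrix `V` has already been computed from the tree:

```python
import numpy as np

def pgls_fit(X, y, V):
    """Generalised least squares with a phylogenetic covariance matrix V.

    X : (n, p) design matrix (include a column of ones for the intercept)
    y : (n,) response vector of species trait values
    V : (n, n) expected covariance of residuals implied by the phylogeny
    """
    Vinv = np.linalg.inv(V)
    # GLS slope and intercept estimates
    beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
    resid = y - X @ beta
    n, p = X.shape
    sigma2 = (resid @ Vinv @ resid) / (n - p)          # residual variance
    beta_cov = sigma2 * np.linalg.inv(X.T @ Vinv @ X)  # covariance of estimates
    return beta, np.sqrt(np.diag(beta_cov))            # estimates and standard errors
```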

Relevance:

20.00%

Publisher:

Abstract:

Understanding how agents formulate their expectations about Fed behavior is important for market participants, because they can potentially use this information to make more accurate estimates of stock and bond prices. Although it is commonly assumed that agents learn over time, there is scant empirical evidence in support of this assumption. Thus, in this paper we test whether the forecast of the three-month T-bill rate in the Survey of Professional Forecasters (SPF) is consistent with least squares learning when there are discrete shifts in monetary policy. We first derive the mean, variance and autocovariances of the forecast errors from a recursive least squares learning algorithm when there are breaks in the structure of the model. We then apply the Bai and Perron (1998) test for structural change to a forecasting model for the three-month T-bill rate in order to identify changes in monetary policy. Having identified the policy regimes, we then estimate the implied biases in the interest rate forecasts within each regime. We find that when the forecast errors from the SPF are corrected for the biases due to shifts in policy, the forecasts are consistent with least squares learning.
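
As background to the derivations described above, a recursive least squares learning rule updates the agent's coefficient estimates each period and produces one-step-ahead forecast errors. The sketch below is a generic textbook recursion with arbitrary design matrix `X` and series `y`, not the paper's specific forecasting model:

```python
import numpy as np

def recursive_ls_forecasts(X, y, delta=100.0):
    """One-step-ahead forecasts from recursive least squares learning.

    At each date t the agent forecasts y[t] with coefficients estimated
    from data up to t-1, then updates the estimate once y[t] is observed.
    """
    n, p = X.shape
    beta = np.zeros(p)
    P = delta * np.eye(p)            # large initial 'uncertainty' matrix
    forecasts = np.empty(n)
    for t in range(n):
        x = X[t]
        forecasts[t] = x @ beta      # forecast made before observing y[t]
        # standard recursive least squares update
        Px = P @ x
        k = Px / (1.0 + x @ Px)      # gain vector
        beta = beta + k * (y[t] - x @ beta)
        P = P - np.outer(k, Px)
    errors = y - forecasts           # the forecast errors analysed in the paper
    return forecasts, errors
```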

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the use and spatial patterns of newly developed public squares in urban villages in the City of Shenzhen, China. Given the lack of information about how this type of public space has been used by the Chinese, this paper provides insights that enable the development of more user-friendly public space in China. The research is based on fieldwork carried out in 2014 to examine public squares in four urban villages in Shenzhen. Direct observation and activity mapping were used as the major methods for this research. The focus of this paper is placed not only on formal aspects such as the design aspiration, scale and provision of public amenity, but also on usage, including the types of users, their daily activities and their location preferences.

Relevance:

20.00%

Publisher:

Abstract:

This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects.
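
One common formulation of the unrestricted weighted least squares average regresses the standardised effect sizes on their precisions with no intercept; the point estimate coincides with the fixed-effect weighted average, while the variance is scaled multiplicatively rather than assumed homogeneous. A small sketch under that reading, with hypothetical inputs `effects` and `ses`:

```python
import numpy as np

def uwls_meta(effects, ses):
    """Unrestricted weighted least squares meta-analytic average (sketch).

    Regress the standardised effects (effect / SE) on precision (1 / SE)
    with no intercept: the slope equals the inverse-variance weighted
    average, and its OLS standard error carries a multiplicative
    heterogeneity adjustment instead of the fixed-effect assumption.
    """
    effects = np.asarray(effects, float)
    ses = np.asarray(ses, float)
    t = effects / ses                   # standardised effects
    x = 1.0 / ses                       # precisions
    b = (x @ t) / (x @ x)               # same point estimate as fixed effect
    resid = t - b * x
    s2 = resid @ resid / (len(t) - 1)   # multiplicative variance term
    se_b = np.sqrt(s2 / (x @ x))
    return b, se_b
```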

Relevance:

20.00%

Publisher:

Abstract:

Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis, and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how and explain why an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, is as good as FE-MRA in all cases, and is better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable and perhaps somewhat preferable if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient.
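
Extending the same idea to meta-regression, the WLS-MRA can be sketched by dividing each study's equation through by its standard error and fitting ordinary least squares with an unrestricted error variance. An illustrative sketch, assuming hypothetical arrays `effects`, `ses` and `moderators`:

```python
import numpy as np

def wls_mra(effects, ses, moderators):
    """Unrestricted WLS meta-regression (illustrative sketch).

    Divide the meta-regression through by each study's standard error and
    fit by OLS; the error variance is left unrestricted rather than fixed
    at one (fixed effects) or augmented additively (random/mixed effects).
    """
    effects = np.asarray(effects, float)
    ses = np.asarray(ses, float)
    Z = np.column_stack([np.ones_like(effects), np.asarray(moderators, float)])
    Xw = Z / ses[:, None]                    # weighted design matrix
    yw = effects / ses                       # weighted effect sizes
    beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    resid = yw - Xw @ beta
    s2 = resid @ resid / (len(yw) - Z.shape[1])
    cov = s2 * np.linalg.inv(Xw.T @ Xw)      # unrestricted multiplicative variance
    return beta, np.sqrt(np.diag(cov))
```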

Relevance:

10.00%

Publisher:

Abstract:

Fuzzy logic provides a mathematical formalism for a unified treatment of the vagueness and imprecision that are ever present in decision support and expert systems in many areas. The choice of aggregation operators is crucial to the behavior of a system that is intended to mimic human decision making. This paper discusses how aggregation operators can be selected and adjusted to fit empirical data—a series of test cases. Both parametric and nonparametric regression are considered and compared. A practical application of the proposed methods to the electronic implementation of clinical guidelines is presented.
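
As a concrete illustration of fitting an aggregation operator to test cases, the sketch below estimates the weights of a simple weighted arithmetic mean by bounded least squares; it is only one of the operator families and fitting strategies the paper considers, and the final renormalisation step is a simplification:

```python
import numpy as np
from scipy.optimize import lsq_linear

def fit_weighted_mean(inputs, targets):
    """Fit the weights of a weighted arithmetic-mean aggregation operator.

    inputs  : (n_cases, n_criteria) membership values from empirical test cases
    targets : (n_cases,) desired aggregated scores
    Weights are constrained to [0, 1] and renormalised to sum to one
    afterwards (a simple sketch rather than a fully constrained fit).
    """
    res = lsq_linear(np.asarray(inputs, float), np.asarray(targets, float),
                     bounds=(0.0, 1.0))
    w = res.x / res.x.sum()
    return w
```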

Relevance:

10.00%

Publisher:

Abstract:

The lasso procedure is a shrinkage and variable-selection method. This paper shows that there always exists an interval of tuning parameter values such that the corresponding mean squared prediction error for the lasso estimator is smaller than for the ordinary least squares estimator. For an estimator satisfying some condition, such as unbiasedness, the paper defines a corresponding generalized lasso estimator. Its mean squared prediction error is shown to be smaller than that of the original estimator for values of the tuning parameter in some interval. This implies that no unbiased estimator is admissible. Simulation results for five models support the theoretical results.
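
A small simulation in the spirit of this comparison (not the paper's five models) can be run with scikit-learn, contrasting out-of-sample mean squared prediction error for OLS and for the lasso over a few tuning-parameter values:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 60, 10
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]                          # sparse true coefficients

mse_ols, mse_lasso = [], {a: [] for a in (0.01, 0.05, 0.1, 0.5)}
for _ in range(200):
    X = rng.standard_normal((n, p))
    y = X @ beta + rng.standard_normal(n)
    X_new = rng.standard_normal((n, p))              # fresh data for prediction error
    y_new = X_new @ beta + rng.standard_normal(n)

    ols = LinearRegression().fit(X, y)
    mse_ols.append(np.mean((y_new - ols.predict(X_new)) ** 2))
    for a in mse_lasso:
        m = Lasso(alpha=a).fit(X, y)
        mse_lasso[a].append(np.mean((y_new - m.predict(X_new)) ** 2))

print("OLS  MSPE:", np.mean(mse_ols))
for a, v in mse_lasso.items():
    print(f"lasso(alpha={a}) MSPE:", np.mean(v))
```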

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we explore the potential of several popular equalization techniques while overcoming their disadvantages. First, an extensive literature survey on equalization is conducted. The focus is placed on several popular linear equalization algorithms, such as the conventional least-mean-square (LMS) algorithm, the recursive least squares (RLS) algorithm, the filtered-X LMS algorithm, and their development. To analyse the performance of the filtered-X LMS algorithm, a heuristic method based on linear time-invariant operator theory is provided to assess its robust performance. It indicates that the extra filter can enhance the stability margin of the corresponding non-filtered-X structure. To overcome the slow convergence problem while keeping the simplicity of LMS-based algorithms, an H2-optimal initialization is proposed.
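
The distinguishing feature of the filtered-X structure is that the reference signal is passed through an estimate of the secondary path before it enters the weight update. A minimal illustrative sketch, with hypothetical FIR arrays `sec_path` and `sec_path_hat` standing in for the true and estimated secondary paths (not the paper's exact formulation):

```python
import numpy as np

def fxlms(reference, desired, sec_path, sec_path_hat, mu=0.005, n_taps=16):
    """Minimal filtered-X LMS loop (illustrative sketch).

    reference    : input signal x(n)
    desired      : signal to be matched / cancelled d(n)
    sec_path     : true secondary-path FIR coefficients (acts on the filter output)
    sec_path_hat : estimate of the secondary path used to filter the reference
    """
    n = len(reference)
    w = np.zeros(n_taps)
    y_buf = np.zeros(len(sec_path))                   # recent adaptive-filter outputs
    x_f = np.convolve(reference, sec_path_hat)[:n]    # reference filtered by the estimate
    errors = np.zeros(n)
    for k in range(n_taps, n):
        x = reference[k - n_taps:k][::-1]
        y = w @ x                                     # adaptive filter output
        y_buf = np.r_[y, y_buf[:-1]]
        errors[k] = desired[k] - sec_path @ y_buf     # error after the secondary path
        xf = x_f[k - n_taps:k][::-1]
        w += mu * errors[k] * xf                      # update uses the *filtered* reference
    return w, errors
```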

Relevance:

10.00%

Publisher:

Abstract:

This paper investigates the problem of obtaining the weights of ordered weighted aggregation (OWA) operators from observations. The problem is formulated as restricted least squares and uniform approximation problems. We take full advantage of the linearity of the problem. In the former case, the well known technique of non-negative least squares is used. In the case of uniform approximation, we employ a recently developed cutting angle method of global optimisation. Both presented methods give results superior to earlier approaches, and do not require complicated nonlinear constructions. Additional restrictions, such as the degree of orness of the operator, can be easily introduced.
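
A simple way to see the linearity being exploited: the OWA value is a weighted sum of each case's inputs after sorting, so weight identification is a linear least squares problem in the weights. The sketch below uses scipy's non-negative least squares and approximates the sum-to-one constraint with a heavily weighted extra equation (a simplification, not the paper's exact formulation):

```python
import numpy as np
from scipy.optimize import nnls

def fit_owa_weights(inputs, targets, penalty=1e3):
    """Fit OWA weights from observed (inputs, aggregated value) pairs."""
    B = -np.sort(-np.asarray(inputs, float), axis=1)   # sort each case's inputs descending
    A = np.vstack([B, penalty * np.ones(B.shape[1])])  # extra row approximates sum(w) = 1
    b = np.concatenate([np.asarray(targets, float), [penalty]])
    w, _ = nnls(A, b)                                   # non-negative least squares
    return w

def orness(w):
    """Degree of orness of the fitted operator (an additional restriction)."""
    n = len(w)
    return sum((n - 1 - i) * wi for i, wi in enumerate(w)) / (n - 1)
```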

Relevance:

10.00%

Publisher:

Abstract:

Many problems in chemistry depend on the ability to identify the global minimum or maximum of a function. Examples include applications in chemometrics, optimization of reaction or operating conditions, and non-linear least-squares analysis. This paper presents the results of the application of a new method of deterministic global optimization, called the cutting angle method (CAM), as applied to the prediction of molecular geometries. CAM is shown to be competitive with other global optimization techniques for several benchmark molecular conformation problems. CAM is a general method that can also be applied to other computational problems involving global minima, global maxima, or finding the roots of nonlinear equations.
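
The cutting angle method itself relies on Lipschitz underestimators and is not reproduced here; as a simpler stand-in that illustrates the kind of multi-minimum molecular conformation problem involved, the sketch below runs a multi-start local minimisation of a small Lennard-Jones cluster energy:

```python
import numpy as np
from scipy.optimize import minimize

def lj_energy(flat_coords):
    """Total Lennard-Jones energy of a small atomic cluster (reduced units)."""
    xyz = flat_coords.reshape(-1, 3)
    e = 0.0
    for i in range(len(xyz)):
        for j in range(i + 1, len(xyz)):
            r = np.linalg.norm(xyz[i] - xyz[j])
            e += 4.0 * (r ** -12 - r ** -6)
    return e

# Multi-start local minimisation as a simple global-search baseline
# (illustrative only -- NOT the cutting angle method described above).
rng = np.random.default_rng(1)
best = None
for _ in range(50):
    x0 = rng.uniform(-1.5, 1.5, size=4 * 3)      # random 4-atom starting geometry
    res = minimize(lj_energy, x0, method="L-BFGS-B")
    if best is None or res.fun < best.fun:
        best = res
print("lowest energy found:", best.fun)           # LJ4 global minimum is -6.0 (tetrahedron)
```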

Relevance:

10.00%

Publisher:

Abstract:

In this paper, the authors explore the potential of several popular equalization techniques while overcoming their disadvantages. First, an extensive literature survey on equalization is conducted. The focus is on popular linear equalization algorithms such as the conventional least-mean-square (LMS) algorithm, the recursive least-squares (RLS) algorithm, the filtered-X LMS algorithm, and their development. To overcome the slow convergence problem while keeping the simplicity of LMS-based algorithms, an H2-optimal initialization is proposed.
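
For comparison with the LMS family above, the standard exponentially weighted RLS recursion mentioned in the survey can be sketched as follows (an illustrative textbook form, with hypothetical `received` and `desired` training signals):

```python
import numpy as np

def rls_equaliser(received, desired, n_taps=8, lam=0.99, delta=100.0):
    """Standard exponentially weighted RLS adaptation (illustrative sketch)."""
    w = np.zeros(n_taps)
    P = delta * np.eye(n_taps)                 # inverse correlation matrix estimate
    out = np.zeros(len(received))
    for n in range(n_taps, len(received)):
        x = received[n - n_taps:n][::-1]       # most recent samples first
        out[n] = w @ x
        e = desired[n] - out[n]                # a-priori error
        Px = P @ x
        k = Px / (lam + x @ Px)                # gain vector
        w = w + k * e
        P = (P - np.outer(k, Px)) / lam        # update with forgetting factor lam
    return w, out
```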

Relevance:

10.00%

Publisher:

Abstract:

This paper investigates the relationship between different classes of institutional investors and firm performance. Using industry-level data from Finland, which is characterized by various institutional investors who own multiple ownership stakes in different firms across a broad spectrum of industries, the paper offers two novelties. First, unlike previous studies which treated institutional investors as a monolithic group, we segment them into classes. Second, we recognize the joint determination of firm performance and institutional ownership. We account for this issue in the context of a system of equations, using a three-stage least squares methodology. The empirical results suggest a significant two-way feedback between firm performance and institutional equity ownership. However, this effect is not symmetric. We find that institutional investors with likely investment and business ties with firms have an adverse (negative) effect on firm performance, and the impact is very significant in comparison to the negative effect of firm performance on institutional ownership.
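
The joint determination of performance and ownership means ownership cannot be treated as exogenous in a single-equation regression. As a simplified illustration of the instrumenting logic (the paper estimates the full system by three-stage least squares), here is a hand-rolled two-stage least squares for one equation, with hypothetical variable names:

```python
import numpy as np

def two_sls(y, X_endog, X_exog, instruments):
    """Hand-rolled two-stage least squares for one equation of the system.

    y           : firm performance
    X_endog     : endogenous regressor(s), e.g. an institutional ownership share
    X_exog      : included exogenous controls (with a constant column)
    instruments : excluded instruments for the endogenous regressor(s)
    (A simplified illustration; the paper estimates the full system by 3SLS.)
    """
    Z = np.column_stack([X_exog, instruments])          # all exogenous variables
    # First stage: project endogenous regressors on all exogenous variables
    gamma, *_ = np.linalg.lstsq(Z, X_endog, rcond=None)
    X_hat = Z @ gamma
    # Second stage: replace endogenous regressors by their fitted values
    X2 = np.column_stack([X_hat, X_exog])
    beta, *_ = np.linalg.lstsq(X2, y, rcond=None)
    return beta
```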

Relevance:

10.00%

Publisher:

Abstract:

The aetiology of osteoporotic vertebral fractures is multi-factorial, and cannot be explained solely by low bone mass. After sustaining an initial vertebral fracture, the risk of subsequent fracture increases greatly. Examination of physiologic loads imposed on vertebral bodies may help to explain a mechanism underlying this fracture cascade. This study tested the hypothesis that model-derived segmental vertebral loading is greater in individuals who have sustained an osteoporotic vertebral fracture compared to those with osteoporosis and no history of fracture. Flexion moments, and compression and shear loads, were calculated from T2 to L5 in 12 participants with fractures (66.4 ± 6.4 years, 162.2 ± 5.1 cm, 69.1 ± 11.2 kg) and 19 without fractures (62.9 ± 7.9 years, 158.3 ± 4.4 cm, 59.3 ± 8.9 kg) while standing. Static analysis was used to solve gravitational loads, while muscle-derived forces were calculated using a detailed trunk muscle model driven by optimization with a cost function set to minimise muscle fatigue. Least squares regression was used to derive polynomial functions describing normalised load profiles. Regression coefficients were compared between groups to examine differences in loading profiles. Loading at the fractured level, and at one level above and below, was also compared between groups. The fracture group had significantly greater normalised compression (p = 0.0008) and shear force (p < 0.0001) profiles and a trend towards a greater flexion moment profile. At the level of fracture, a significantly greater flexion moment (p = 0.001) and shear force (p < 0.001) were observed in the fracture group. A greater flexion moment (p = 0.003) and compression force (p = 0.007) one level below the fracture, and a greater flexion moment (p = 0.002) and shear force (p = 0.002) one level above the fracture, were observed in the fracture group. The differences observed in multi-level spinal loading between the groups may explain a mechanism for increased risk of subsequent vertebral fractures. Interventions aimed at restoring vertebral morphology or reducing thoracic curvature may assist in normalising spine load profiles.
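
The normalised load profiles were summarised by least squares polynomial functions of vertebral level whose coefficients were compared between groups. A minimal sketch of such a fit with numpy, using hypothetical data in place of the study's measurements:

```python
import numpy as np

# Hypothetical normalised data: vertebral level (T2..L5 mapped onto 0..1) and
# the normalised compression load at each level for one participant.
level = np.linspace(0.0, 1.0, 16)
compression = (0.2 + 1.1 * level - 0.4 * level ** 2
               + 0.05 * np.random.default_rng(2).standard_normal(16))

# Least squares polynomial describing the normalised load profile
coeffs = np.polyfit(level, compression, deg=2)   # regression coefficients
profile = np.poly1d(coeffs)

print("fitted coefficients:", coeffs)            # these are compared between groups
print("predicted load at mid-spine:", profile(0.5))
```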

Relevance:

10.00%

Publisher:

Abstract:

"No photographed images. All handmade. It's all these squares, lines. The main techniques were bleaching and dyeing and sticking letraset-type material to the film strip. Used the pos/neg thing, inserting film strips to sustain shapes, otherwise you're talking about the one film all the time: it begins to look the same. There is a growing need to sustain shapes, patterns, etc. Hence the squares, lines. Breaking away from the rush of shapes. It's more of a problem to get away from in Vision because there are no photographic images. A very ordered film. Very Dutch. Took it all out of 800 ft. of this type of stuff and ended up with 150 ft. of selected squares and circles. The images don't rush, they much more fold over the top of one another. Mondrian-inspired." http://www.innersense.com.au/mif/debruyn_films.html