900 results for Incorrect Generalized Least Squares


Relevance:

100.00%

Publisher:

Abstract:

Urban tourists are recognized as one of the fastest-growing segments in today's tourism markets. Monterrey, Mexico, one of the country's main urban destinations, is currently seeking to improve its competitiveness. This research set out to find evidence of a causal effect of travel motivation on the perceived image of the destination, two variables that are important because of their influence on visitor satisfaction. A literature review supported the proposal of theoretically grounded constructs, integrated into a survey instrument used to collect data from a representative sample. Using regression and structural equation modelling by partial least squares (PLS), the main components of both variables were identified and an explanatory model of the perceived destination image as a function of travel motivation was obtained. Finally, recommendations for the management of the urban destination are given on the basis of the results.
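
To make the analysis strategy more concrete, the following sketch fits a plain partial least squares regression of destination-image indicators on travel-motivation indicators with scikit-learn. It is only an illustration of the general PLS idea: the indicator names, sample size and two-component choice are assumptions, and the study itself used PLS structural equation modelling rather than simple PLS regression.

    # Minimal sketch: PLS relating travel-motivation indicators (X) to
    # destination-image indicators (Y). Names, sizes and component count
    # are illustrative assumptions, not the study's actual instrument.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n = 300                                  # hypothetical sample size
    motivation = rng.normal(size=(n, 4))     # e.g. leisure, culture, business, shopping items
    image = 0.6 * motivation[:, :2] + rng.normal(scale=0.5, size=(n, 2))

    pls = PLSRegression(n_components=2)
    pls.fit(motivation, image)
    print("PLS R^2 of destination image on travel motivation:",
          round(pls.score(motivation, image), 3))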

Relevance:

100.00%

Publisher:

Abstract:

A finite-strain solid–shell element is proposed. It is based on least-squares in-plane assumed strains and assumed natural transverse shear and normal strains. The singular value decomposition (SVD) is used to define local (integration-point) orthogonal frames of reference solely from the Jacobian matrix. The complete finite-strain formulation is derived and tested. Assumed strains obtained from least-squares fitting are an alternative to enhanced-assumed-strain (EAS) formulations and, in contrast with these, the result is an element that satisfies the patch test. Unlike the enhanced-assumed-strain approach, no additional degrees of freedom are required, not even degrees of freedom removed by static condensation. Least-squares fitting produces invariant finite-strain elements that are free of shear locking and amenable to incorporation in large-scale codes. With that goal, we use automatically generated code produced by AceGen and Mathematica. All benchmarks show excellent results, similar to the best available shell and hybrid solid elements, with significantly lower computational cost.
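
One ingredient mentioned here, building an orthogonal integration-point frame directly from the Jacobian via the SVD, can be sketched in a few lines. The snippet below takes the orthogonal polar factor of a 3x3 Jacobian; this is an illustrative reading of the idea under a stated assumption, not the element's actual AceGen-generated implementation.

    # Illustrative sketch: orthogonal local frame from a 3x3 Jacobian via SVD.
    # Uses the orthogonal polar factor U @ Vt of J = U S Vt; an assumption about
    # how such a frame can be built, not the paper's exact recipe.
    import numpy as np

    def local_frame(jacobian):
        """Return a proper orthogonal 3x3 frame extracted from the Jacobian."""
        u, _, vt = np.linalg.svd(jacobian)
        if np.linalg.det(u @ vt) < 0.0:      # enforce a right-handed frame
            u[:, -1] *= -1.0
        return u @ vt

    J = np.array([[1.2, 0.1, 0.0],
                  [0.0, 0.9, 0.2],
                  [0.1, 0.0, 1.1]])
    R = local_frame(J)
    print(np.round(R.T @ R, 6))              # identity: the frame is orthonormal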

Relevance:

100.00%

Publisher:

Abstract:

Two novelties are introduced: (i) a finite-strain semi-implicit integration algorithm compatible with current element technologies and (ii) its application to assumed-strain hexahedra. The Löwdin algorithm is adopted to obtain evolving frames applicable to finite-strain anisotropy, and a weighted least-squares algorithm is used to determine the mixed strain. Löwdin frames are very convenient for modeling anisotropic materials, and weighted least squares circumvents the use of internal degrees of freedom. The heterogeneity of element technologies introduces apparently incompatible constitutive requirements: assumed-strain and enhanced-strain elements can be formulated in terms of either the deformation gradient or the Green–Lagrange strain, many high-performance shell formulations are corotational, and constitutive constraints (such as incompressibility, plane stress and zero normal stress in shells) also depend on the specific element formulation. We propose a unified integration algorithm compatible with possibly all element technologies. To assess its validity, a least-squares-based hexahedral element is implemented and tested in depth. Basic linear problems as well as five finite-strain examples are inspected for correctness and competitive accuracy.
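
The Löwdin step named in the abstract admits a compact sketch: symmetric (Löwdin) orthogonalization maps a set of frame vectors A to A (A^T A)^(-1/2). The snippet below illustrates only this general procedure, not the paper's finite-strain frame update.

    # Sketch of Löwdin (symmetric) orthogonalization: A -> A (A^T A)^(-1/2).
    import numpy as np

    def lowdin(frame):
        """Symmetrically orthonormalize the columns of `frame`."""
        vals, vecs = np.linalg.eigh(frame.T @ frame)     # Gram matrix A^T A
        inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
        return frame @ inv_sqrt

    A = np.array([[1.0, 0.2, 0.0],
                  [0.1, 1.0, 0.3],
                  [0.0, 0.1, 1.0]])
    Q = lowdin(A)
    print(np.round(Q.T @ Q, 6))                          # identity up to round-off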

Relevance:

100.00%

Publisher:

Abstract:

This paper compares the performance of the complex nonlinear least squares algorithm implemented in the LEVM/LEVMW software with the performance of a genetic algorithm in the characterization of an electrical impedance of known topology. The effect of the number of measured frequency points and of measurement uncertainty on the estimation of circuit parameters is presented. The analysis is performed on the equivalent circuit impedance of a humidity sensor.
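
As a hedged illustration of complex nonlinear least squares applied to an equivalent-circuit impedance (the class of problem LEVM/LEVMW addresses), the sketch below fits a series resistor plus a parallel RC branch by stacking real and imaginary residuals for scipy. The circuit topology, parameter values and noise level are assumptions for illustration, not the humidity-sensor model of the paper.

    # Sketch: complex nonlinear least squares for an assumed Rs + (R || C) circuit.
    import numpy as np
    from scipy.optimize import least_squares

    def impedance(params, omega):
        rs, rp, c = params
        return rs + rp / (1.0 + 1j * omega * rp * c)

    omega = 2 * np.pi * np.logspace(0, 5, 40)           # measured frequency points
    true = np.array([50.0, 1e4, 1e-7])                  # "unknown" circuit parameters
    rng = np.random.default_rng(1)
    z_meas = impedance(true, omega) * (1 + 0.01 * rng.normal(size=omega.size))

    def residuals(params):
        diff = impedance(params, omega) - z_meas
        return np.concatenate([diff.real, diff.imag])   # stack Re and Im parts

    fit = least_squares(residuals, x0=[10.0, 1e3, 1e-8], x_scale=[10.0, 1e3, 1e-8])
    print("estimated Rs, Rp, C:", fit.x)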

Relevance:

100.00%

Publisher:

Abstract:

Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate these using simulation studies. Our comparative analysis uses methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some that violate standard regression assumptions. We assess the performance of each method using two measures and using statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but its Type I error rates were poorly controlled and it did not show the improvements in performance under model selection seen with the methods above. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be done cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on one or a few data sets, and should be considered a necessary foundation for statements of this type in the future.
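
To make concrete what a generalized least squares fit with spatially correlated errors involves, here is a hedged numpy sketch that builds an exponential covariance from pairwise distances and applies the textbook GLS estimator. The covariance form, range parameter and synthetic data are assumptions for illustration, not any of the fitted models compared in the study.

    # Sketch: GLS regression with an assumed exponential spatial error covariance.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    coords = rng.uniform(0, 10, size=(n, 2))             # synthetic site locations
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sigma = np.exp(-dist / 2.0) + 1e-8 * np.eye(n)       # assumed range parameter = 2

    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    errors = np.linalg.cholesky(sigma) @ rng.normal(size=n)
    y = X @ np.array([1.0, 0.5]) + errors                # true coefficients (1.0, 0.5)

    sigma_inv = np.linalg.inv(sigma)
    beta_gls = np.linalg.solve(X.T @ sigma_inv @ X, X.T @ sigma_inv @ y)
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    print("GLS:", np.round(beta_gls, 3), " OLS:", np.round(beta_ols, 3))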

Relevance:

100.00%

Publisher:

Abstract:

Objective

To examine whether early inflammation is related to cortisol levels at 18 months corrected age (CA) in children born very preterm.

Study Design

Infants born at ≤ 32 weeks gestational age were recruited in the NICU, and placental histopathology, MRI, and chart review were obtained. At 18 months CA, developmental assessment and collection of three salivary cortisol samples were carried out. Generalized least squares was used to analyze data from 85 infants providing 222 cortisol samples.

Results

Infants exposed to chorioamnionitis with funisitis had a significantly different pattern of cortisol across the samples compared to infants with chorioamnionitis alone or no prenatal inflammation (F[4,139] = 7.3996, P <.0001). Postnatal infections, necrotizing enterocolitis and chronic lung disease were not significantly associated with the cortisol pattern at 18 months CA.

Conclusion

In children born very preterm, prenatal inflammatory stress may contribute to altered programming of the HPA axis.

Keywords: preterm, chorioamnionitis, funisitis, premature infants, hypothalamic-pituitary-adrenal axis, infection, cortisol, stress

Relevance:

100.00%

Publisher:

Abstract:

A geostatistical version of the classical Fisher rule (linear discriminant analysis) is presented. This method is applicable when a large dataset of multivariate observations is available within a domain split into several known subdomains, and it assumes that the variograms (or covariance functions) are comparable between subdomains, which differ only in the mean values of the available variables. The method consists in finding the eigen-decomposition of the matrix W^-1 B, where W is the matrix of sills of all direct and cross-variograms, and B is the covariance matrix of the vectors of weighted means within each subdomain, obtained by generalized least squares. The method is used to map peat blanket occurrence in Northern Ireland with data from the Tellus survey, which requires a minimal change to the general recipe: using compositionally compliant variogram tools and models and working with log-ratio-transformed data.
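
As a hedged numerical sketch of the core step, the eigen-decomposition of W^-1 B, the snippet below solves the equivalent generalized eigenproblem B v = lambda W v with scipy for small made-up matrices; constructing W from variogram sills and B from GLS subdomain means is not reproduced here.

    # Sketch: discriminant directions from the eigen-decomposition of W^-1 B,
    # using illustrative (made-up) W and B matrices.
    import numpy as np
    from scipy.linalg import eigh

    W = np.array([[2.0, 0.3],    # stand-in for the matrix of variogram sills
                  [0.3, 1.0]])
    B = np.array([[1.5, 0.4],    # stand-in for the covariance of GLS subdomain means
                  [0.4, 0.5]])

    # Generalized symmetric eigenproblem B v = lambda W v, equivalent to eig(W^-1 B).
    eigvals, eigvecs = eigh(B, W)
    order = np.argsort(eigvals)[::-1]        # strongest discriminant direction first
    print("eigenvalues:", np.round(eigvals[order], 3))
    print("discriminant directions (columns):")
    print(np.round(eigvecs[:, order], 3))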

Relevance:

100.00%

Publisher:

Abstract:

This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
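
As a hedged single-equation sketch of the lead-and-lag augmentation described here, the snippet below regresses y on an integrated regressor x together with leads and lags of its first difference and reads off the cointegrating coefficient. The data-generating process and lag window are invented, and the feasible GLS step with the long-run covariance matrix is omitted.

    # Sketch: single-equation dynamic regression with leads/lags of the differenced
    # regressor (OLS only; the paper's multi-equation FGLS step is omitted).
    import numpy as np

    rng = np.random.default_rng(3)
    T, k = 400, 2                             # sample size, number of leads/lags
    x = np.cumsum(rng.normal(size=T))         # integrated regressor
    y = 1.0 + 0.8 * x + rng.normal(size=T)    # cointegrating relation, beta = 0.8

    dx = np.diff(x, prepend=x[0])
    idx = np.arange(k, T - k)
    design = np.column_stack(
        [np.ones(idx.size), x[idx]] + [dx[idx + j] for j in range(-k, k + 1)]
    )
    coef, *_ = np.linalg.lstsq(design, y[idx], rcond=None)
    print("estimated cointegrating coefficient:", round(coef[1], 3))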

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes finite-sample procedures for testing the SURE specification in multi-equation regression models, i.e. whether the disturbances in different equations are contemporaneously uncorrelated or not. We apply the technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] to obtain exact tests based on standard LR and LM zero correlation tests. We also suggest a MC quasi-LR (QLR) test based on feasible generalized least squares (FGLS). We show that the latter statistics are pivotal under the null, which provides the justification for applying MC tests. Furthermore, we extend the exact independence test proposed by Harvey and Phillips (1982) to the multi-equation framework. Specifically, we introduce several induced tests based on a set of simultaneous Harvey/Phillips-type tests and suggest a simulation-based solution to the associated combination problem. The properties of the proposed tests are studied in a Monte Carlo experiment which shows that standard asymptotic tests exhibit important size distortions, while MC tests achieve complete size control and display good power. Moreover, MC-QLR tests performed best in terms of power, a result of interest from the point of view of simulation-based tests. The power of the MC induced tests improves appreciably in comparison to standard Bonferroni tests and, in certain cases, outperforms the likelihood-based MC tests. The tests are applied to data used by Fischer (1993) to analyze the macroeconomic determinants of growth.
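
As a hedged sketch of the Monte Carlo test idea, simulating the null distribution of a pivotal statistic instead of relying on its asymptotic distribution, the snippet below computes the Breusch-Pagan LM statistic for zero contemporaneous correlation between two equations' residuals and attaches a simulated p-value. The bivariate setup, sample size and number of replications are illustrative and much simpler than the paper's multi-equation design.

    # Sketch: Monte Carlo p-value for the LM test of zero contemporaneous
    # correlation in a two-equation system (illustrative setup only).
    import numpy as np

    def lm_statistic(e1, e2):
        r = np.corrcoef(e1, e2)[0, 1]
        return len(e1) * r ** 2               # T * r^2 for two equations

    rng = np.random.default_rng(4)
    T, n_mc = 50, 999

    # "Observed" residuals from two equations (generated under the null here).
    obs = lm_statistic(rng.normal(size=T), rng.normal(size=T))

    # Simulate the statistic under the null of no correlation.
    null_draws = np.array([lm_statistic(rng.normal(size=T), rng.normal(size=T))
                           for _ in range(n_mc)])
    p_value = (1 + np.sum(null_draws >= obs)) / (n_mc + 1)
    print("MC p-value:", round(p_value, 3))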

Relevance:

100.00%

Publisher:

Abstract:

We study the role of natural resource windfalls in explaining the efficiency of public expenditures. Using a rich dataset of expenditures and public good provision for 1,836 municipalities in Peru for the period 2001-2010, we estimate a non-monotonic relationship between the efficiency of public good provision and the level of natural resource transfers. Local governments that were strongly favored by the boom in mineral prices were more efficient in using fiscal windfalls, whereas those that received only modest transfers were less efficient. These results can be explained by the increase in political competition associated with the boom. However, the fact that increases in efficiency were related to reductions in public good provision casts doubt on the beneficial effects of political competition in promoting efficiency.

Relevance:

100.00%

Publisher:

Abstract:

The scaling of metabolic rates to body size is widely considered to be of great biological and ecological importance, and much attention has been devoted to determining its theoretical and empirical value. Most debate centers on whether the underlying power law describing metabolic rates is 2/3 (as predicted by scaling of surface area/volume relationships) or 3/4 ("Kleiber's law"). Although recent evidence suggests that empirically derived exponents vary among clades with radically different metabolic strategies (such as ectotherms and endotherms), models such as the metabolic theory of ecology depend on the assumption that there is at least a predominant, if not universal, metabolic scaling exponent. Most analyses claiming to support the predictions of general models, however, failed to control for phylogeny. We used phylogenetic generalized least-squares models to estimate allometric slopes for both basal metabolic rate (BMR) and field metabolic rate (FMR) in mammals. Metabolic rate scaling conformed to no single theoretical prediction, but varied significantly among phylogenetic lineages. In some lineages we found a 3/4 exponent, in others a 2/3 exponent, and in yet others exponents differed significantly from both theoretical values. Analysis of the phylogenetic signal in the data indicated that the assumptions of neither species-level analysis nor independent contrasts were met. Analyses that assumed no phylogenetic signal in the data (species-level analysis) or a strong phylogenetic signal (independent contrasts) therefore returned estimates of allometric slopes that were erroneous in 30% and 50% of cases, respectively. Hence, quantitative estimation of the phylogenetic signal is essential for determining scaling exponents. The lack of evidence for a predominant scaling exponent in these analyses suggests that general models of metabolic scaling, and the macro-ecological theories that depend on them, have little explanatory power.
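
As a hedged sketch of the phylogenetic generalized least squares machinery, the snippet below estimates an allometric slope from log body mass and log metabolic rate with a GLS fit whose error covariance is proportional to shared branch lengths under a Brownian-motion model. The four-species covariance matrix and trait values are invented for illustration and are not the mammalian data analysed in the study.

    # Sketch: phylogenetic GLS estimate of an allometric (log-log) slope.
    import numpy as np

    # Shared-branch-length (Brownian motion) covariance for four hypothetical species.
    C = np.array([[1.0, 0.6, 0.2, 0.2],
                  [0.6, 1.0, 0.2, 0.2],
                  [0.2, 0.2, 1.0, 0.7],
                  [0.2, 0.2, 0.7, 1.0]])

    log_mass = np.array([1.0, 1.3, 2.0, 2.4])        # log10 body mass
    log_bmr = np.array([0.80, 1.05, 1.55, 1.85])     # log10 basal metabolic rate

    X = np.column_stack([np.ones_like(log_mass), log_mass])
    C_inv = np.linalg.inv(C)
    beta = np.linalg.solve(X.T @ C_inv @ X, X.T @ C_inv @ log_bmr)
    print("PGLS intercept and allometric slope:", np.round(beta, 3))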

Relevance:

100.00%

Publisher:

Abstract:

Roads and topography can determine patterns of land use and the distribution of forest cover, particularly in tropical regions. We evaluated how road density, land use, and topography affected forest fragmentation, deforestation and forest regrowth in a Brazilian Atlantic Forest region near the city of Sao Paulo. We mapped roads and land use/land cover for three years (1962, 1981 and 2000) from historical aerial photographs, and summarized the distribution of roads, land use/land cover and topography within a grid of 94 non-overlapping 100 ha squares. We used generalized least squares regression models for data analysis. Our models showed that forest fragmentation and deforestation depended on topography, land use and road density, whereas forest regrowth depended primarily on land use. However, the relationships between these variables and forest dynamics changed between the two studied periods; land use and slope were the strongest predictors from 1962 to 1981, and past (1962) road density and land use were the strongest predictors for the following period (1981-2000). Roads had the strongest relationship with deforestation and forest fragmentation when the expansion of agriculture and buildings was limited to already deforested areas, and when there was a rapid expansion of development under the influence of the city of Sao Paulo. Furthermore, the past (1962) road network was more important than the recent (1981) road network in explaining forest dynamics between 1981 and 2000, suggesting a long-term effect of roads. Roads are permanent scars on the landscape and facilitate deforestation and forest fragmentation through increased accessibility and land valorization, which control land-use and land-cover dynamics. Topography directly affected deforestation, agriculture and road expansion, mainly between 1962 and 1981. Forests are thus in peril where there are more roads, and long-term conservation strategies should consider ways to mitigate the role of roads as permanent landscape features and as drivers and facilitators of deforestation and forest fragmentation. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we deal with robust inference in heteroscedastic measurement error models. Rather than the normal distribution, we postulate a Student t distribution for the observed variables. Maximum likelihood estimates are computed numerically. Consistent estimation of the asymptotic covariance matrices of the maximum likelihood and generalized least squares estimators is also discussed. Three test statistics are proposed for testing hypotheses of interest, with the asymptotic chi-square distribution, which guarantees correct asymptotic significance levels. Results of simulations and an application to a real data set are also reported. (C) 2009 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
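
As a hedged sketch of the kind of estimation referred to here, the snippet below obtains maximum likelihood estimates of a simple linear regression with Student t errors by numerically minimizing the negative log-likelihood with scipy. This is a plain regression rather than the paper's heteroscedastic measurement error model, and the degrees of freedom are held fixed instead of being estimated.

    # Sketch: ML estimation of a linear regression with Student t errors (fixed df).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import t as student_t

    rng = np.random.default_rng(5)
    n, df = 200, 4
    x = rng.normal(size=n)
    y = 2.0 + 1.5 * x + student_t.rvs(df, size=n, random_state=rng)

    def neg_loglik(params):
        a, b, log_scale = params
        scale = np.exp(log_scale)                    # keep the scale positive
        resid = (y - a - b * x) / scale
        return -np.sum(student_t.logpdf(resid, df) - np.log(scale))

    fit = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
    print("ML estimates (intercept, slope, scale):",
          np.round([fit.x[0], fit.x[1], np.exp(fit.x[2])], 3))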

Relevance:

100.00%

Publisher:

Abstract:

This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
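
As a hedged sketch of a feasible GLS IV estimator, much simpler than the optimal-instrument construction discussed in the paper, the snippet below whitens a single equation with an estimated AR(1) error and then applies instrumental variables to the transformed data. The data-generating process, the single instrument and the AR(1) error structure are assumptions for illustration.

    # Sketch: feasible GLS + IV on one equation with an assumed AR(1) error.
    import numpy as np

    rng = np.random.default_rng(6)
    T, rho, beta_true = 300, 0.6, 1.5
    z = rng.normal(size=T)                       # exogenous instrument
    e = np.zeros(T)
    for t in range(1, T):                        # AR(1) error process
        e[t] = rho * e[t - 1] + rng.normal()
    x = 0.8 * z + 0.5 * e + rng.normal(size=T)   # endogenous regressor
    y = beta_true * x + e

    # Step 1: preliminary IV estimate, residual AR(1) coefficient, quasi-differencing.
    beta_iv = (z @ y) / (z @ x)
    u = y - beta_iv * x
    rho_hat = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])
    y_s, x_s, z_s = (v[1:] - rho_hat * v[:-1] for v in (y, x, z))

    # Step 2: IV on the whitened (GLS-transformed) equation.
    beta_fgls_iv = (z_s @ y_s) / (z_s @ x_s)
    print("IV:", round(beta_iv, 3), " FGLS IV:", round(beta_fgls_iv, 3))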