39 results for non-normal space


Relevance:

80.00%

Publisher:

Abstract:

This paper discusses inference in self-exciting threshold autoregressive (SETAR) models. Of main interest is inference for the threshold parameter. It is well known that the asymptotics of the corresponding estimator depend upon whether the SETAR model is continuous or not. In the continuous case, the limiting distribution is normal and standard inference is possible. In the discontinuous case, the limiting distribution is non-normal and cannot be estimated consistently. We show that valid inference can be drawn by using the subsampling method. Moreover, the method can even be extended to situations where the (dis)continuity of the model is unknown. In that case, inference for the regression parameters of the model also becomes difficult, and subsampling can be used advantageously there as well. In addition, we consider a hypothesis test for the continuity of the SETAR model. A simulation study examines small-sample performance.
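As a rough illustration of the subsampling idea for the threshold parameter, the Python sketch below computes the threshold estimate on every block of consecutive observations, rescales the deviations from the full-sample estimate, and inverts their quantiles into a confidence interval. The model (a two-regime SETAR(1) with delay one), the block length, and the convergence-rate exponent are assumptions for the example, not the paper's exact setup.

import numpy as np

def fit_threshold(y, candidates):
    """Least-squares threshold estimate for a two-regime SETAR(1) with delay 1."""
    x, z = y[:-1], y[1:]                      # lagged regressor and response
    best_r, best_ssr = None, np.inf
    for r in candidates:
        low, high = x <= r, x > r
        if low.sum() < 5 or high.sum() < 5:   # keep both regimes non-trivial
            continue
        ssr = 0.0
        for mask in (low, high):
            X = np.column_stack([np.ones(mask.sum()), x[mask]])
            beta, *_ = np.linalg.lstsq(X, z[mask], rcond=None)
            ssr += np.sum((z[mask] - X @ beta) ** 2)
        if ssr < best_ssr:
            best_r, best_ssr = r, ssr
    return best_r

def subsampling_ci(y, block_len, alpha=0.05, rate=1.0):
    """Subsampling interval; `rate` is the assumed convergence-rate exponent."""
    n = len(y)
    candidates = np.quantile(y, np.linspace(0.15, 0.85, 50))
    r_hat = fit_threshold(y, candidates)
    deviations = []
    for start in range(n - block_len + 1):    # all blocks of consecutive observations
        r_sub = fit_threshold(y[start:start + block_len], candidates)
        if r_sub is not None:
            deviations.append(block_len ** rate * (r_sub - r_hat))
    lo_q, hi_q = np.quantile(deviations, [alpha / 2, 1 - alpha / 2])
    return r_hat - hi_q / n ** rate, r_hat - lo_q / n ** rate

With a series stored in a NumPy array y, subsampling_ci(y, block_len=80) would return an approximate 95% interval for the threshold under these assumptions.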

Relevance:

80.00%

Publisher:

Abstract:

Structural equation models (SEM) are commonly used to analyze the relationship between variables, some of which may be latent, such as individual "attitude" to and "behavior" concerning specific issues. A number of difficulties arise when we want to compare a large number of groups, each with a large sample size, and the manifest variables are distinctly non-normally distributed. Using a specific data set, we evaluate the appropriateness of the following alternative SEM approaches: multiple-group versus MIMIC models, continuous versus ordinal variable estimation methods, and normal-theory versus non-normal estimation methods. The approaches are applied to the ISSP-1993 Environmental data set, with the purpose of exploring variation in the mean level of the "attitude" and "behavior" variables concerning environmental issues, and in their mutual relationship, across countries. Issues of both theoretical and practical relevance arise in the course of this application.
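For orientation, a schematic MIMIC specification of the kind referred to above can be written as (notation assumed here, not taken from the paper):

\eta = \Gamma x + \zeta, \qquad y^{*} = \Lambda \eta + \varepsilon, \qquad y_{j} = c \;\;\text{iff}\;\; \tau_{j,c-1} < y_{j}^{*} \le \tau_{j,c},

where x collects the group (country) covariates, \eta the latent "attitude" and "behavior" factors, y^{*} the underlying continuous responses behind the ordinal items, and \tau the category thresholds; the multiple-group alternative instead estimates the loadings, factor covariances and means separately per country, subject to invariance constraints.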

Relevance:

80.00%

Publisher:

Abstract:

The aim of this paper is to quantitatively characterize the climatology of daily precipitation indices in Catalonia (northeastern Iberian Peninsula) from 1951 to 2003. This work has been performed by analyzing a subset of the ETCCDI (Expert Team on Climate Change Detection and Indices) precipitation indices calculated from a new interpolated dataset of daily precipitation, namely SPAIN02, regular at 0.2° horizontal resolution (around 20 km), and from two high-quality stations: the Ebro and Fabra observatories. Using a jack-knife technique, we have found that the sampling error of the SPAIN02 regional average is relatively low. The trend analysis has been implemented using a Circular Block Bootstrap procedure applicable to non-normal distributions and autocorrelated series. A running trend analysis has been applied to assess trend persistence. No general trends are observed at the regional scale for either the annual or the seasonal regional-average series of any of the indices, for any of the time windows considered. Only the consecutive dry days index (CDD) at the annual scale shows a locally coherent spatial trend pattern; around 30% of the area of Catalonia has experienced an increase of around 2-3 days per decade. The Ebro and Fabra observatories show a similar CDD trend, mainly due to the summer contribution. Besides this, a significant decrease in total precipitation (around -10 mm per decade) and in the index "highest precipitation amount in a five-day period" (RX5DAY, around -5 mm per decade) has been found in summer for the Ebro observatory.
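A minimal Python sketch of a circular block bootstrap test for a linear trend in a single autocorrelated index series is given below; the block length, the ordinary-least-squares slope as the trend statistic, and the residual resampling scheme are assumptions for illustration, not the exact procedure of the paper.

import numpy as np

def ols_slope(y):
    t = np.arange(len(y))
    return np.polyfit(t, y, 1)[0]             # trend per time step

def circular_block_bootstrap_trend(y, block_len=5, n_boot=2000, seed=0):
    """Two-sided bootstrap p-value for H0: no trend, resampling detrended residuals."""
    rng = np.random.default_rng(seed)
    n = len(y)
    t = np.arange(n)
    slope_obs = ols_slope(y)
    resid = y - np.polyval(np.polyfit(t, y, 1), t)        # residuals under the fitted trend
    n_blocks = int(np.ceil(n / block_len))
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n, size=n_blocks)
        idx = (starts[:, None] + np.arange(block_len)) % n  # wrap-around: circular blocks
        resampled = resid[idx.ravel()][:n]
        slopes[b] = ols_slope(resampled)                   # null distribution of the slope
    p_value = np.mean(np.abs(slopes) >= np.abs(slope_obs))
    return slope_obs, p_value

With an annual CDD series in a NumPy array y, circular_block_bootstrap_trend(y) returns the observed slope and a bootstrap p-value for the no-trend null under these assumptions.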

Relevance:

80.00%

Publisher:

Abstract:

The present study discusses retention criteria for principal components analysis (PCA) applied to Likert-scale items typical of psychological questionnaires. The main aim is to recommend that applied researchers refrain from relying solely on the eigenvalue-greater-than-one criterion; alternative procedures are suggested for adjusting for sampling error. An additional objective is to add evidence on the consequences of applying this rule when PCA is used with discrete variables. The experimental conditions were studied by means of Monte Carlo sampling, including several sample sizes, different numbers of variables and answer alternatives, and four non-normal distributions. The results suggest that even when all the items, and thus the underlying dimensions, are independent, eigenvalues greater than one are frequent and can explain up to 80% of the variance in the data, thus meeting the empirical criterion. The consequences of using Kaiser's rule are illustrated with a clinical psychology example. The size of the eigenvalues turned out to be a function of the sample size and the number of variables, which is also the case for parallel analysis, as previous research shows. To facilitate the application of alternative criteria, an R package was developed for deciding the number of principal components to retain by means of confidence intervals constructed around the eigenvalues corresponding to a lack of relationship between the discrete variables.
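The Monte Carlo experiment sketched below (in Python rather than the authors' R package, with parameter values chosen arbitrarily for illustration) reproduces the basic point: with purely independent Likert items, the largest eigenvalue of the sample correlation matrix exceeds one essentially always.

import numpy as np

def simulate_max_eigenvalues(n_items=10, n_respondents=100, n_categories=5,
                             n_reps=500, seed=0):
    rng = np.random.default_rng(seed)
    largest = np.empty(n_reps)
    for r in range(n_reps):
        # Independent items: any structure in the correlations is pure sampling error.
        data = rng.integers(1, n_categories + 1, size=(n_respondents, n_items))
        eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))
        largest[r] = eig.max()
    return largest

largest = simulate_max_eigenvalues()
print("Share of replications with a largest eigenvalue > 1:",
      np.mean(largest > 1.0))                 # close to 1 even with no real components
print("95th percentile of the largest eigenvalue:", np.quantile(largest, 0.95))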

Relevance:

80.00%

Publisher:

Abstract:

Background: The aim of this study was to evaluate how hospital capacity was managed, focusing on standardizing the admission and discharge processes. Methods: This study was set in a 900-bed university-affiliated hospital of the National Health Service, near Barcelona (Spain). This is a cross-sectional study of a set of interventions that were gradually implemented between April and December 2008, focused mainly on standardizing the admission and discharge processes to improve patient flow. Primary administrative data were obtained from the 2007 and 2009 Hospital Database. The main outcome measures were median length of stay, percentage of planned discharges, number of surgery cancellations, and median number of delayed emergency admissions at 8:00 am. For the statistical bivariate analysis, we used a chi-squared test for linear trend for qualitative variables, and a Wilcoxon signed-rank test and a Mann-Whitney test for non-normal continuous variables. Results: The median global length of stay was 8.56 days in 2007 and 7.93 days in 2009 (p<0.051). The percentage of patients admitted on the same day as surgery increased from 64.87% in 2007 to 86.01% in 2009 (p<0.05). The number of interventions cancelled due to lack of beds was 216 in 2007 and 42 in 2009. The percentage of planned discharges went from 43.05% in 2007 to 86.01% in 2009 (p<0.01). The median number of emergency patients waiting for an in-hospital bed at 8:00 am was 5 in 2007 and 3 in 2009 (p<0.01). Conclusions: Standardization of the admission and discharge processes is largely within our control. There is a significant opportunity to create important benefits by increasing bed capacity and hospital throughput.
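As a purely illustrative sketch of the kind of bivariate comparison described in the Methods (simulated numbers, not the study's data), a Mann-Whitney test on length of stay between the two periods could look like this in Python:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
los_2007 = rng.lognormal(mean=2.1, sigma=0.6, size=300)   # skewed, non-normal length of stay
los_2009 = rng.lognormal(mean=2.0, sigma=0.6, size=300)

# Mann-Whitney U test for two independent, non-normal samples.
u_stat, p_value = stats.mannwhitneyu(los_2007, los_2009, alternative="two-sided")
print(f"median 2007 = {np.median(los_2007):.2f} days, "
      f"median 2009 = {np.median(los_2009):.2f} days, p = {p_value:.3f}")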

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study is to confirm the factorial structure of the Identification-Commitment Inventory (ICI) developed within the frame of the Human System Audit (HSA) (Quijano et al. in Revist Psicol Soc Apl 10(2):27-61, 2000; Pap Psicól Revist Col Of Psicó 29:92-106, 2008). Commitment and identification are understood by the HSA at an individual level as part of the quality of human processes and resources in an organization, and therefore as antecedents of important organizational outcomes such as personnel turnover intentions, organizational citizenship behavior, etc. (Meyer et al. in J Org Behav 27:665-683, 2006). The theoretical integrative model that underlies the ICI (Quijano et al. 2000) was tested in a sample (N = 625) of workers in a Spanish public hospital. Confirmatory factor analysis through structural equation modeling was performed. An elliptical least squares solution was chosen as the estimation procedure on account of the non-normal distribution of the variables. The results confirm the goodness of fit of an integrative model that underlies the relation between commitment and identification, although each construct remains operationally distinct.
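Schematically, the confirmatory factor model behind the ICI can be written as (notation assumed here):

x = \Lambda \xi + \delta, \qquad \Sigma(\theta) = \Lambda \Phi \Lambda^{\prime} + \Theta_{\delta},

where the observed ICI items x load on the correlated latent factors \xi (identification and commitment), and the estimator minimizes a discrepancy between the sample covariance matrix S and the implied \Sigma(\theta); roughly speaking, the elliptical least squares variant weights that discrepancy using a common kurtosis parameter, so that non-normal but elliptically distributed items are handled more adequately than under normal theory.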

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise, in any space dimension. The driving perturbation is a Gaussian noise which is white in time and has a spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance equal to that of the mild solution of the corresponding linear equation with additive noise.
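Schematically, a two-sided estimate of the type described has the form (constants and notation assumed here for illustration):

c_1\,\sigma_t^{-1} \exp\!\left(-\frac{|y - F_0(t,x)|^{2}}{c_2\,\sigma_t^{2}}\right) \;\le\; p_{t,x}(y) \;\le\; C_1\,\sigma_t^{-1} \exp\!\left(-\frac{|y - F_0(t,x)|^{2}}{C_2\,\sigma_t^{2}}\right),

where p_{t,x} denotes the density of the mild solution u(t,x), F_0(t,x) the contribution of the initial condition, and \sigma_t^{2} the variance of the mild solution of the corresponding linear equation with additive noise.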

Relevance:

40.00%

Publisher:

Abstract:

There are several determinants that influence household location decisions. More concretely, recent economic literature assigns an increasingly important role to the variables governing quality of life. Nevertheless, the spatial stationarity of the parameters is implicitly assumed in most studies. Here we analyse the role of quality of life in urban economics and test for the spatial stationarity of the relationship between city growth and quality of life.

Relevance:

30.00%

Publisher:

Abstract:

An algebraic decay rate is derived which bounds the time required for velocities to equilibrate in a spatially homogeneous flow-through model representing the continuum limit of a gas of particles interacting through slightly inelastic collisions. This rate is obtained by reformulating the dynamical problem as the gradient flow of a convex energy on an infinite-dimensional manifold. An abstract theory is developed for gradient flows in length spaces, which shows how degenerate convexity (or even non-convexity), if uniformly controlled, will quantify contractivity (limit expansivity) of the flow.
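For orientation (these are standard gradient-flow estimates, not the paper's precise statement): a \lambda-uniformly convex energy yields exponential contraction of the flow, while a degenerate modulus of convexity of order p > 2 typically yields only an algebraic rate, e.g.

d\big(x(t), y(t)\big) \le e^{-\lambda t}\, d\big(x(0), y(0)\big) \quad (\lambda\text{-convex case}), \qquad E\big(x(t)\big) - \inf E \;\le\; C\, t^{-p/(p-2)} \quad (\text{degenerate case, } p > 2).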

Relevance:

30.00%

Publisher:

Abstract:

This note describes ParallelKnoppix, a bootable CD that allows econometricians with average knowledge of computers to create and begin using a high performance computing cluster for parallel computing in very little time. The computers used may be heterogeneous machines, and clusters of up to 200 nodes are supported. When the cluster is shut down, all machines are in their original state, so their temporary use in the cluster does not interfere with their normal uses. An example shows how a Monte Carlo study of a bootstrap test procedure may be done in parallel. Using a cluster of 20 nodes, the example runs approximately 20 times faster than it does on a single computer.
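The note's example runs under the cluster tools shipped on the CD; as a purely illustrative stand-in on a single multi-core machine, a Monte Carlo study of a bootstrap test can be parallelized in Python as follows (all names and parameter values are assumptions for the sketch, not the ParallelKnoppix/MPI setup):

import numpy as np
from multiprocessing import Pool

def one_replication(seed, n=50, n_boot=199, alpha=0.05):
    """One Monte Carlo draw: bootstrap test of H0: mean = 0 on skewed data."""
    rng = np.random.default_rng(seed)
    x = rng.exponential(1.0, n) - 1.0            # mean zero but non-normal
    t_obs = abs(x.mean()) / (x.std(ddof=1) / np.sqrt(n))
    centered = x - x.mean()                      # impose H0 in the bootstrap world
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(centered, size=n, replace=True)
        t_boot[b] = abs(xb.mean()) / (xb.std(ddof=1) / np.sqrt(n))
    return float(np.mean(t_boot >= t_obs) < alpha)   # 1 if the test rejects

if __name__ == "__main__":
    with Pool() as pool:                         # spreads replications over the cores
        rejections = pool.map(one_replication, range(1000))
    print("Empirical size of the bootstrap test:", np.mean(rejections))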

Relevance:

30.00%

Publisher:

Abstract:

In this paper we prove T1-type necessary and sufficient conditions for the boundedness, on inhomogeneous Lipschitz spaces, of fractional integrals and singular integrals defined on a metric measure space whose measure satisfies an n-dimensional growth condition. We also show that hypersingular integrals are bounded on these spaces.
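For context, with notation assumed here, on a metric measure space (X, d, \mu) satisfying \mu(B(x,r)) \le C r^{n}, the fractional integrals in question take the form

I_{\alpha}f(x) = \int_{X} \frac{f(y)}{d(x,y)^{\,n-\alpha}}\, d\mu(y), \qquad 0 < \alpha < n,

and a T1-type condition requires, roughly, that the operator applied to the constant function 1 (suitably interpreted) belong to the target Lipschitz space.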

Relevance:

30.00%

Publisher:

Abstract:

This paper characterizes a mixed strategy Nash equilibrium in a one-dimensional Downsian model of two-candidate elections with a continuous policy space, where candidates are office motivated and one candidate enjoys a non-policy advantage over the other candidate. We assume that voters have quadratic preferences over policies and that their ideal points are drawn from a uniform distribution over the unit interval. In our equilibrium the advantaged candidate chooses the expected median voter with probability one and the disadvantaged candidate uses a mixed strategy that is symmetric around it. We show that this equilibrium exists if the number of voters is large enough relative to the size of the advantage.
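In the notation assumed here (not necessarily the paper's), a voter with ideal point \theta drawn uniformly on [0, 1] votes for the advantaged candidate A, who enjoys a valence advantage \delta > 0, whenever

-(x_A - \theta)^{2} + \delta \;>\; -(x_B - \theta)^{2};

in the equilibrium described above, A locates at the expected median 1/2 with probability one, while the disadvantaged candidate B randomizes according to a distribution symmetric about 1/2.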

Relevance:

30.00%

Publisher:

Abstract:

This paper tries to resolve some of the main shortcomings in the empirical literature on location decisions for new plants, namely spatial effects and overdispersion. Spatial effects are omnipresent, being both a source of overdispersion in the data and a factor shaping the functional relationship between the variables that explain a firm's location decisions. Using count data models, empirical researchers have dealt with overdispersion and excess zeros through developments of the Poisson regression model. This study aims to take this a step further by adopting Bayesian methods and models in order to tackle excess zeros, spatial and non-spatial overdispersion, and spatial dependence simultaneously. Data for Catalonia are used and location determinants are analysed to that end. The results show that spatial effects are decisive. Additionally, overdispersion is decomposed into an unstructured i.i.d. effect and a spatially structured effect.

Keywords: Bayesian Analysis, Spatial Models, Firm Location. JEL Classification: C11, C21, R30.
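Schematically (notation assumed here), the kind of Bayesian count model described above decomposes the log-rate of new-plant locations in area i into covariates, an unstructured term and a spatially structured term:

y_i \sim \mathrm{Poisson}(\mu_i), \qquad \log \mu_i = x_i^{\prime}\beta + v_i + u_i, \qquad v_i \sim N(0, \sigma_v^{2}) \;\text{i.i.d.}, \qquad u_i \mid u_{-i} \sim N\!\Big(\tfrac{1}{n_i}\textstyle\sum_{j \sim i} u_j,\; \tfrac{\sigma_u^{2}}{n_i}\Big),

where v_i captures unstructured overdispersion and the conditional autoregressive term u_i captures spatially structured overdispersion and dependence between neighbouring areas (j \sim i, with n_i neighbours of area i).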