15 results for Latent factor analysis

in Aston University Research Archive


Relevance:

100.00%

Abstract:

The Center for Epidemiologic Studies-Depression Scale (CES-D) is the most frequently used scale for measuring depressive symptomatology in caregiving research. The aim of this study is to test its construct structure and measurement equivalence between caregivers from two Spanish-speaking countries. Face-to-face interviews were carried out with 595 female dementia caregivers from Madrid, Spain, and from Coahuila, Mexico. The structure of the CES-D was analyzed using exploratory and confirmatory factor analysis (EFA and CFA, respectively). Measurement invariance across samples was analyzed by comparing a baseline model with a more restrictive model. Significant differences between means were found for 7 items. The results of the EFA clearly supported a four-factor solution. The CFA for the whole sample with the four factors revealed high and statistically significant loading coefficients for all items (except item number 4). When equality constraints were imposed to test for invariance between countries, the change in chi-square was significant, indicating that complete invariance could not be assumed. Significant between-countries differences were found for three of the four latent factor mean scores. Although the results provide general support for the original four-factor structure, caution should be exercised when reporting comparisons of depression scores between Spanish-speaking countries.
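For illustration, a minimal Python sketch of the nested-model comparison this abstract describes: a baseline CFA against a model with equality constraints, tested via the chi-square difference. The fit statistics below are hypothetical placeholders, not the study's actual values.

# A minimal sketch of the nested-model comparison described above, using
# scipy for the chi-square difference test. The chi-square and df values
# here are hypothetical placeholders, not the study's actual statistics.
from scipy.stats import chi2

def chi_square_difference(chisq_base, df_base, chisq_constrained, df_constrained):
    """Likelihood-ratio (chi-square difference) test for nested CFA models."""
    d_chisq = chisq_constrained - chisq_base   # constrained model fits no better
    d_df = df_constrained - df_base            # extra equality constraints
    p = chi2.sf(d_chisq, d_df)                 # survival function = upper tail
    return d_chisq, d_df, p

# Hypothetical fit statistics for the unconstrained (baseline) and the
# loading-constrained (invariance) models:
d_chisq, d_df, p = chi_square_difference(310.4, 142, 355.9, 158)
print(f"delta chi2 = {d_chisq:.2f} on {d_df} df, p = {p:.4f}")
# p < 0.05 -> the constraints significantly worsen fit, so complete
# measurement invariance cannot be assumed (as the study found).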

Relevance:

100.00%

Abstract:

Growth in the availability and capability of modern statistical software has resulted in greater numbers of research techniques being applied across the marketing discipline. However, with such advances come concerns that techniques may be misinterpreted by researchers. This issue is critical, since misinterpretation could cause erroneous findings. This paper investigates some assumptions regarding: 1) the assessment of discriminant validity; and 2) what confirmatory factor analysis accomplishes. Examples that address these points are presented, and some procedural remedies are suggested based upon the literature. This paper is, therefore, primarily concerned with the development of measurement theory and practice. If advances in theory development are not based upon sound methodological practice, we as researchers could be basing our work upon shaky foundations.
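One widely cited test of discriminant validity in this literature is the Fornell-Larcker criterion; the paper itself may advocate a different remedy, so the numpy sketch below is illustrative only, with invented loadings and inter-construct correlation.

# A minimal numpy sketch of the Fornell-Larcker criterion, one common test of
# discriminant validity: each construct's average variance extracted (AVE)
# should exceed its squared correlation with every other construct.
# Loadings and the correlation below are hypothetical illustration values.
import numpy as np

def ave(standardized_loadings):
    """Average variance extracted from a construct's standardized loadings."""
    lam = np.asarray(standardized_loadings)
    return np.mean(lam ** 2)

ave_a = ave([0.82, 0.77, 0.69, 0.74])    # construct A
ave_b = ave([0.71, 0.80, 0.65])          # construct B
phi = 0.62                               # estimated inter-construct correlation

discriminant = min(ave_a, ave_b) > phi ** 2
print(f"AVE_A={ave_a:.3f}, AVE_B={ave_b:.3f}, phi^2={phi**2:.3f}, "
      f"discriminant validity supported: {discriminant}")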

Relevance:

100.00%

Abstract:

Experiments combining different groups or factors are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions: information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the number of replications required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA are a more important indicator of the ‘power’ of the experiment than simply the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for each error term of the ANOVA. Finally, in a factorial experiment, it is important to define the design of the experiment in detail because this determines the appropriate type of ANOVA. We will discuss some of the common variations of factorial ANOVA in future Statnotes. If there is doubt about which ANOVA to use, the researcher should seek advice from a statistician with experience of research in applied microbiology.
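The replication rule above can be made concrete: for a fully crossed a x b factorial with n replicates per cell, the error term has ab(n - 1) DF. A short Python sketch finds the smallest n that reaches the suggested 15 error DF.

# A small sketch of the replication rule described above: for a fully
# crossed a x b factorial with n replicates per cell, the error term has
# ab(n - 1) degrees of freedom. The function finds the smallest n giving
# the suggested 15 error DF.
def error_df(a, b, n):
    """Error degrees of freedom for a two-factor factorial ANOVA."""
    return a * b * (n - 1)

def replicates_for_power(a, b, target_df=15):
    n = 2
    while error_df(a, b, n) < target_df:
        n += 1
    return n

# Example: a 2 x 3 factorial needs n replicates per cell so that
# 6(n - 1) >= 15, i.e. n = 4 (18 error DF).
for a, b in [(2, 2), (2, 3), (3, 3)]:
    n = replicates_for_power(a, b)
    print(f"{a}x{b} design: n = {n} replicates/cell -> {error_df(a, b, n)} error DF")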

Relevance:

100.00%

Abstract:

PCA/FA is a method of analyzing complex data sets in which there are no clearly defined X or Y variables. It has multiple uses including the study of the pattern of variation between individual entities such as patients with particular disorders and the detailed study of descriptive variables. In most applications, variables are related to a smaller number of ‘factors’ or PCs that account for the maximum variance in the data and hence, may explain important trends among the variables. An increasingly important application of the method is in the ‘validation’ of questionnaires that attempt to relate subjective aspects of a patient's experience with more objective measures of vision.
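A minimal scikit-learn sketch of the kind of reduction described above, extracting a few components that account for the maximum variance in a set of questionnaire-style variables; the data are random placeholders.

# A minimal scikit-learn sketch: reducing a set of questionnaire-style
# variables to a few components that account for the maximum variance.
# The data here are random placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))              # 100 respondents, 8 items (placeholder)

X_std = StandardScaler().fit_transform(X)  # PCA on the correlation matrix
pca = PCA(n_components=3).fit(X_std)
print("Variance explained by first 3 PCs:", pca.explained_variance_ratio_)
print("Loadings (components x items):\n", pca.components_)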

Relevance:

100.00%

Abstract:

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
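A numpy sketch of EM updates of the form the paper derives for probabilistic PCA, iterating on the sample covariance to estimate the principal subspace; dimensions and data below are placeholders.

# A minimal numpy sketch of the EM algorithm for probabilistic PCA in the
# spirit of this paper, iterating on the sample covariance S.
import numpy as np

def ppca_em(X, q, n_iter=200):
    """EM estimation of the q-dimensional principal subspace of X (n x d)."""
    n, d = X.shape
    S = np.cov(X, rowvar=False)                    # sample covariance
    rng = np.random.default_rng(0)
    W = rng.normal(size=(d, q))
    sigma2 = 1.0
    for _ in range(n_iter):
        M = W.T @ W + sigma2 * np.eye(q)           # q x q
        Minv = np.linalg.inv(M)
        SW = S @ W
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new
    return W, sigma2

X = np.random.default_rng(1).normal(size=(500, 10))
W, sigma2 = ppca_em(X, q=2)
print("Estimated noise variance:", sigma2)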

Relevance:

100.00%

Abstract:

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
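Complementing the EM sketch under the previous record, the same model also admits a closed-form maximum-likelihood solution in terms of the eigendecomposition of the sample covariance: sigma2 is the average of the discarded eigenvalues and W spans the leading eigenvectors. A numpy sketch with placeholder data:

# Closed-form ML estimates for probabilistic PCA (X is n x d), determined
# up to an arbitrary rotation of the latent space.
import numpy as np

def ppca_ml(X, q):
    S = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(S)           # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    sigma2 = eigvals[q:].mean()                    # average discarded variance
    W = eigvecs[:, :q] @ np.diag(np.sqrt(eigvals[:q] - sigma2))
    return W, sigma2

X = np.random.default_rng(2).normal(size=(500, 10))
W, sigma2 = ppca_ml(X, q=2)
print("sigma2 =", sigma2, " W shape:", W.shape)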

Relevance:

90.00%

Abstract:

Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this paper we introduce a form of non-linear latent variable model called the Generative Topographic Mapping (GTM), for which the parameters of the model can be determined using the EM algorithm. GTM provides a principled alternative to the widely used Self-Organizing Map (SOM) of Kohonen (1982), and overcomes most of the significant limitations of the SOM. We demonstrate the performance of the GTM algorithm on a toy problem and on simulated data from flow diagnostics for a multi-phase oil pipeline.
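A compact numpy sketch of a GTM training loop of the kind described: a regular grid of latent points mapped through RBF basis functions into data space and fitted as a constrained Gaussian mixture by EM. Grid sizes, basis width and regularizer are illustrative choices, and the data are a synthetic toy curve rather than the paper's oil-pipeline set.

import numpy as np

def gtm_fit(X, grid=10, n_rbf=4, sigma=1.0, alpha=1e-3, n_iter=50):
    N, D = X.shape
    Z = np.linspace(-1, 1, grid)[:, None]                   # K x 1 latent grid
    mu = np.linspace(-1, 1, n_rbf)[:, None]                 # M x 1 RBF centres
    Phi = np.exp(-((Z - mu.T) ** 2) / (2 * sigma ** 2))     # K x M basis matrix
    Phi = np.hstack([Phi, np.ones((grid, 1))])              # bias column
    W = np.random.default_rng(0).normal(size=(Phi.shape[1], D)) * 0.1
    beta = 1.0                                              # inverse noise variance
    for _ in range(n_iter):
        Y = Phi @ W                                         # K x D mixture centres
        dist2 = ((X[None, :, :] - Y[:, None, :]) ** 2).sum(-1)  # K x N
        log_r = -0.5 * beta * dist2
        log_r -= log_r.max(axis=0, keepdims=True)           # numerical stability
        R = np.exp(log_r)
        R /= R.sum(axis=0, keepdims=True)                   # responsibilities
        G = np.diag(R.sum(axis=1))
        # M-step: regularized weighted least squares for W, then update beta.
        A = Phi.T @ G @ Phi + (alpha / beta) * np.eye(Phi.shape[1])
        W = np.linalg.solve(A, Phi.T @ R @ X)
        beta = N * D / (R * ((X[None, :, :] - (Phi @ W)[:, None, :]) ** 2).sum(-1)).sum()
    return Z, Phi, W, beta

# Toy data: noisy 1D curve embedded in 2D.
t = np.linspace(0, 1, 200)
X = np.c_[t, np.sin(2 * np.pi * t)] + 0.05 * np.random.default_rng(1).normal(size=(200, 2))
Z, Phi, W, beta = gtm_fit(X)
print("Fitted inverse variance beta:", beta)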

Relevance:

90.00%

Abstract:

Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this paper we introduce a form of non-linear latent variable model called the Generative Topographic Mapping (GTM), for which the parameters of the model can be determined using the EM algorithm. GTM provides a principled alternative to the widely used Self-Organizing Map (SOM) of Kohonen (1982), and overcomes most of the significant limitations of the SOM. We demonstrate the performance of the GTM algorithm on a toy problem and on simulated data from flow diagnostics for a multi-phase oil pipeline.
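For contrast with the GTM sketch under the previous record, a minimal numpy version of Kohonen's Self-Organizing Map, the heuristic algorithm that the GTM reformulates probabilistically. The learning-rate and neighbourhood schedules are illustrative choices.

import numpy as np

def som_fit(X, n_units=10, n_iter=2000, lr0=0.5, width0=3.0):
    rng = np.random.default_rng(0)
    W = rng.normal(size=(n_units, X.shape[1])) * 0.1        # 1D chain of units
    pos = np.arange(n_units)
    for t in range(n_iter):
        frac = t / n_iter
        lr, width = lr0 * (1 - frac), max(width0 * (1 - frac), 0.5)
        x = X[rng.integers(len(X))]                         # random sample
        bmu = np.argmin(((W - x) ** 2).sum(1))              # best-matching unit
        h = np.exp(-((pos - bmu) ** 2) / (2 * width ** 2))  # neighbourhood kernel
        W += lr * h[:, None] * (x - W)                      # pull units toward x
    return W

t = np.linspace(0, 1, 200)
X = np.c_[t, np.sin(2 * np.pi * t)]
print(som_fit(X).round(2))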

Relevance:

90.00%

Abstract:

This thesis describes the Generative Topographic Mapping (GTM) --- a non-linear latent variable model, intended for modelling continuous, intrinsically low-dimensional probability distributions, embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map --- a widely established neural network model for unsupervised learning --- resolving many of its associated theoretical problems. An important, potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points, as they appear when visualized, to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required will grow exponentially with the intrinsic dimensionality of the density model. However, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different to that, the aim of maintaining an 'interpretable' structure, suitable for visualizing data, may come in conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
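The magnification-factor idea lends itself to a short sketch: for a smooth mapping y(z) from latent to data space, local areas scale by sqrt(det(J^T J)), with J the Jacobian of the mapping. The mapping below is a hypothetical stand-in for a trained GTM's RBF mapping.

import numpy as np

def magnification(y, z, eps=1e-5):
    """sqrt(det(J'J)) of mapping y at latent point z, by finite differences."""
    z = np.asarray(z, dtype=float)
    L, D = z.size, np.asarray(y(z)).size
    J = np.empty((D, L))
    for i in range(L):
        dz = np.zeros(L); dz[i] = eps
        J[:, i] = (y(z + dz) - y(z - dz)) / (2 * eps)   # central difference
    return np.sqrt(np.linalg.det(J.T @ J))

# Hypothetical 2D -> 3D mapping standing in for a GTM manifold:
y = lambda z: np.array([z[0], z[1], np.sin(z[0]) * np.cos(z[1])])
print(magnification(y, [0.3, -0.2]))   # > 1 means local stretching in data space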

Relevance:

90.00%

Abstract:

In Statnotes 24 and 25, multiple linear regression, a statistical method that examines the relationship between a single dependent variable (Y) and two or more independent variables (X), was described. The principal objective of such an analysis was to determine which of the X variables had a significant influence on Y and to construct an equation that predicts Y from the X variables. ‘Principal components analysis’ (PCA) and ‘factor analysis’ (FA) are also methods of examining the relationships between different variables but they differ from multiple regression in that no distinction is made between the dependent and independent variables, all variables being essentially treated the same. Originally, PCA and FA were regarded as distinct methods but in recent times they have been combined into a single analysis, PCA often being the first stage of an FA. The basic objective of a PCA/FA is to examine the relationships between the variables or the ‘structure’ of the variables and to determine whether these relationships can be explained by a smaller number of ‘factors’. This statnote describes the use of PCA/FA in the analysis of the differences between the DNA profiles of different MRSA strains introduced in Statnote 26.
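A short scikit-learn sketch of the two-stage PCA-then-FA analysis described above; the data are random placeholders, not the MRSA DNA-profile measurements from Statnote 26, and the varimax rotation option assumes a recent scikit-learn release.

import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.preprocessing import StandardScaler

X = StandardScaler().fit_transform(
    np.random.default_rng(0).normal(size=(60, 12)))      # 60 strains, 12 variables

pca = PCA().fit(X)
print("Eigenvalues:", pca.explained_variance_.round(2))  # scree inspection

fa = FactorAnalysis(n_components=3, rotation="varimax").fit(X)
print("Rotated loadings:\n", fa.components_.T.round(2))  # variables x factors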

Relevance:

90.00%

Abstract:

The pattern of correlation between two sets of variables can be tested using canonical variate analysis (CVA). CVA, like principal components analysis (PCA) and factor analysis (FA) (Statnote 27, Hilton & Armstrong, 2011b), is a multivariate analysis. Essentially, as in PCA/FA, the objective is to determine whether the correlations between two sets of variables can be explained by a smaller number of ‘axes of correlation’ or ‘canonical roots’.
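A minimal scikit-learn sketch of canonical correlation between two sets of variables, the kind of analysis described above (the technique is also commonly called canonical correlation analysis); the two random matrices are placeholders.

import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                     # first variable set
Y = 0.5 * X[:, :3] + rng.normal(size=(100, 3))    # second set, partly correlated

cca = CCA(n_components=2).fit(X, Y)
X_c, Y_c = cca.transform(X, Y)
# Correlation along each canonical root:
roots = [np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1] for i in range(2)]
print("Canonical correlations:", np.round(roots, 3))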

Relevance:

90.00%

Abstract:

This thesis explores efforts to conjoin organisational contexts and capabilities in explaining sustainable competitive advantage. Oliver (1997) argued organisations need to balance the need to conform to an industry's requirements to attain legitimization (e.g. DiMaggio & Powell, 1983) and the need for resource optimization (e.g. Barney, 1991). The author hypothesized that such balance can be viewed as movement along the homogeneity-heterogeneity continuum. An organisation in a homogeneous industry possesses similar characteristics to its competitors, as opposed to a heterogeneous industry, in which organisations are differentiated and competitively positioned (Oliver, 1997). The movement is influenced by the dynamic environmental conditions that an organisation is experiencing. The author extended Oliver's (1997) propositions of combining the RBV's focus on capabilities with institutional theory's focus on organisational context, as well as redefining organisational receptivity towards change (ORC) factors from Butler and Allen's (2008) findings. The author thereby contributed to the theoretical development of ORC theory to explain the attainment of sustainable competitive advantage. ORC adopts assumptions from both institutional and RBV theories, where the receptivity factors include both organisational contexts and capabilities. The thesis employed a mixed-method approach in which sequential qualitative and quantitative studies were deployed to establish a robust, reliable, and valid ORC scale. An updated version of Hinkin's (1995) three-phase scale development process was adopted, with items generated from interviews and literature reviews subjected to numerous rounds of exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) to achieve convergent, discriminant, and nomological validity. Samples in the first phase (semi-structured interviews) were hotel owners and managers. In the second phase, samples were MBA students and employees of the private and public sectors. In the third phase, samples were hotel managers. The final ORC scale is a parsimonious second-order latent construct. The first-order constructs comprise four latent receptivity factors: ideological vision (4 items), leading change (4 items), implementation capacity (4 items), and change orientation (7 items). Hypothesis testing revealed that high levels of perceived environmental uncertainty lead to high levels of the receptivity factors. Furthermore, the study found strong positive correlations between receptivity factors and competitive advantage, and between receptivity factors and organisational performance. Mediation analyses revealed that receptivity factors partially mediate the relationships between perceived environmental uncertainty, competitive advantage and organisational performance.
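As one way to picture the mediation analyses reported above (predictor through mediator to outcome), a compact statsmodels sketch using a Sobel test on simulated data; the variable names mirror the thesis constructs, but the data and effect sizes are invented for illustration, and the thesis itself may have used a different mediation procedure.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
peu = rng.normal(size=n)                         # perceived environmental uncertainty
receptivity = 0.5 * peu + rng.normal(size=n)     # mediator
performance = 0.3 * peu + 0.6 * receptivity + rng.normal(size=n)

m1 = sm.OLS(receptivity, sm.add_constant(peu)).fit()                      # a path
m2 = sm.OLS(performance, sm.add_constant(np.c_[peu, receptivity])).fit()  # b, c' paths
a, sa = m1.params[1], m1.bse[1]
b, sb = m2.params[2], m2.bse[2]
sobel_z = (a * b) / np.sqrt(a**2 * sb**2 + b**2 * sa**2)
print(f"indirect effect = {a*b:.3f}, Sobel z = {sobel_z:.2f}")
# A significant indirect effect alongside a significant direct path (c')
# indicates partial mediation, as reported in the thesis.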

Relevance:

90.00%

Abstract:

Purpose: In today's competitive scenario, effective supply chain management is increasingly dependent on third-party logistics (3PL) companies' capabilities and performance. The dissemination of information technology (IT) has contributed to changing the supply chain role of 3PL companies, and IT is considered an important element influencing the performance of modern logistics companies. Therefore, the purpose of this paper is to explore the relationship between IT and 3PLs' performance, assuming that logistics capabilities play a mediating role in this relationship. Design/methodology/approach: Empirical evidence based on a questionnaire survey conducted on a sample of logistics service companies operating in the Italian market was used to test a conceptual resource-based view (RBV) framework linking IT adoption, logistics capabilities and firm performance. Factor analysis and ordinary least squares (OLS) regression analysis have been used to test hypotheses. The focus of the paper is multidisciplinary in nature; management of information systems, strategy, logistics and supply chain management approaches have been combined in the analysis. Findings: The results indicate strong relationships among data gathering technologies, transactional capabilities and firm performance, in terms of both efficiency and effectiveness. Moreover, a positive correlation between enterprise information technologies and 3PL financial performance has been found. Originality/value: The paper successfully uses the concept of logistics capabilities as a mediating factor between IT adoption and firm performance. Objective measures have been proposed for IT adoption and logistics capabilities. Direct and indirect relationships among variables have been successfully tested. © Emerald Group Publishing Limited.
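A short sketch of the paper's two-step logic: factor analysis to obtain capability scores, then OLS regression of firm performance on those scores. All data are simulated placeholders, not the Italian survey responses.

import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = rng.normal(size=(120, 9))                 # 120 firms, 9 IT/capability items
performance = items[:, :3].mean(1) + rng.normal(scale=0.5, size=120)

fa = FactorAnalysis(n_components=2).fit(items)
scores = fa.transform(items)                      # factor scores per firm

ols = sm.OLS(performance, sm.add_constant(scores)).fit()
print(ols.summary().tables[1])                    # coefficients on each factor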

Relevance:

90.00%

Abstract:

Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.
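One standard reliability check behind statements like the scale's "good psychometric properties" is Cronbach's alpha; the abstract does not state which reliability coefficient was used, so the numpy sketch below, with a random placeholder response matrix, is illustrative only.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))                # shared 'effectiveness' signal
responses = latent + rng.normal(scale=1.0, size=(300, 20))   # 20-item scale
print(f"alpha = {cronbach_alpha(responses):.2f}") # high here: all items share one signal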

Relevance:

90.00%

Abstract:

OBJECTIVE: To analyze differences in the variables associated with severity of suicidal intent and in the main factors associated with intent when comparing younger and older adults. DESIGN: Observational, descriptive cross-sectional study. SETTING: Four general hospitals in Madrid, Spain. PARTICIPANTS: Eight hundred seventy suicide attempts by 793 subjects split into two groups: 18-54 year olds and subjects older than 55 years. MEASUREMENTS: The authors tested the factorial latent structure of suicidal intent through multigroup confirmatory factor analysis for categorical outcomes and performed statistical tests of invariance across age groups using the DIFFTEST procedure. They then tested a multiple indicators-multiple causes (MIMIC) model including different covariates regressed on the latent factor "intent" and performed two separate MIMIC models for younger and older adults to test for differential patterns. RESULTS: Older adults had higher suicidal intent than younger adults (z = 2.63, p = 0.009). The final model for the whole sample showed a relationship of intent with previous attempts, support, mood disorder, personality disorder, substance-related disorder, and schizophrenia and other psychotic disorders. The model showed an adequate fit (χ²(12) = 22.23, p = 0.035; comparative fit index = 0.986; Tucker-Lewis index = 0.980; root mean square error of approximation = 0.031; weighted root mean square residual = 0.727). All covariates had significant weights in the younger group, but in the older group, only previous attempts and mood disorders were significantly related to intent severity. CONCLUSIONS: The pattern of variables associated with suicidal intent varies with age. Recognition and treatment of geriatric depression may be the most effective measure to prevent suicidal behavior in older adults.
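The reported RMSEA can be roughly reproduced from the chi-square statistic with the standard formula RMSEA = sqrt(max(chi2 - df, 0) / (df * (N - 1))); the small gap from the reported 0.031 plausibly reflects the weighted estimator the study used.

import numpy as np

def rmsea(chisq, df, n):
    """Point estimate of RMSEA from a model chi-square, its df and sample size."""
    return np.sqrt(max(chisq - df, 0.0) / (df * (n - 1)))

# Using the abstract's chi2(12) = 22.23 and the 793 subjects:
print(f"RMSEA = {rmsea(22.23, 12, 793):.3f}")   # ~0.033, close to the reported 0.031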