967 results for Parallel factor analysis


Relevance:

100.00%

Publisher:

Abstract:


Purpose – The purpose of this paper is to investigate and uncover the key determinants of partners' commitment to risk management in public-private partnership projects, so that this commitment can be taken into account when forming optimal risk allocation strategies.

Design/methodology/approach – Based on an extensive literature review and an examination of the public-private partnership (PPP) market, an industry-wide questionnaire survey was conducted to collect data for a confirmatory factor analysis. The necessary statistical tests were conducted to ensure the validity of the analysis results.

Findings – The factor analysis results show that the confirmatory factor analysis procedure is statistically appropriate and satisfactory. As a result, partners' organizational commitment to risk management in public-private partnerships can be determined by a set of components, namely general attitude toward risk, perceived ability to manage risk, and perceived reward for bearing risk.

Practical implications – It is recommended, based on the empirical results shown in this paper, that, in addition to partners' risk management capability, decision-makers, both from public and private sectors, should also seriously consider partners' risk management commitment. Both factors influence the formation of optimal risk allocation strategies, either by their individual or interacting effects. Future research may therefore explore how to form optimal risk allocation strategies by integrating organizational capability and commitment, the determinants and measurement of which have been established in this study.

Originality/value – This paper makes an original contribution to the general body of knowledge on risk allocation in large-scale Australian infrastructure projects procured through public-private partnership. In particular, it establishes a measurement model of organizational commitment to risk management, which is crucial to determining optimal risk allocation strategies and in turn achieving project success. The score coefficients of all obtained components can be used to construct the components by linear combination, so that commitment to risk management can be measured. Previous research has barely addressed this topic.
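The abstract describes measuring commitment by forming components as linear combinations of survey items weighted by score coefficients. A minimal sketch of that computation, with simulated responses and made-up coefficients (not the paper's values):

```python
import numpy as np

# Hypothetical illustration: a commitment "component score" is a linear
# combination of standardized item responses. Coefficients are invented.
rng = np.random.default_rng(0)

items = rng.normal(size=(10, 3))                      # 10 respondents, 3 items
z = (items - items.mean(axis=0)) / items.std(axis=0)  # standardize each item

# Illustrative score coefficients for one component (e.g. attitude toward risk)
coef = np.array([0.45, 0.35, 0.20])
scores = z @ coef                                     # one score per respondent

print(scores.shape)  # (10,)
```

Because the items are standardized before weighting, each component score has mean zero across respondents, which makes scores comparable between components.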



Background: The Beck Depression Inventory (BDI) is frequently employed as a measure of depression in studies of obesity. The aim of this study was to assess the factorial structure of the BDI in obese patients prior to bariatric surgery.

Methods: Confirmatory factor analysis was conducted on the currently published factor analyses of the BDI. Three published models were initially analysed, with two additional modified models subsequently included. A sample of 285 patients presenting for Lap-Band® surgery was used.

Results: The published bariatric model by Munoz et al. was not an adequate fit to the data. The general model by Shafer et al. was a good fit but had substantial limitations. The weight loss item did not load significantly on any factor in either model. A modified Shafer model and a proposed model were tested; both were found to be a good fit to the data, with minimal differences between the two. The proposed model, in which two items, weight loss and appetite, were omitted, was suggested to be the better model, with good reliability.

Conclusions: The previously published factor analysis in bariatric candidates by Munoz et al. was a poor fit to the data, and use of this factor structure should be seriously reconsidered within the obese population. The hypothesised model was the best fit to the data. The findings suggest that the existing published models are not adequate for investigating depression in obese patients seeking surgery.


Multimedia content understanding research requires a rigorous approach to deal with the complexity of the data. At the crux of this problem is the method for dealing with multilevel data whose structure exists at multiple scales and across data sources. A common example is modeling tags jointly with images to improve retrieval, classification and tag recommendation. Associated contextual observations, such as metadata, are rich and can be exploited for content analysis. A major challenge is the need for a principled approach to systematically incorporate associated media with the primary data source of interest. Taking a factor modeling approach, we propose a framework that can discover low-dimensional structures for a primary data source together with other associated information. We cast this task as a subspace learning problem under the framework of Bayesian nonparametrics, so that the subspace dimensionality and the number of clusters are learnt from the data instead of being set a priori. Using Beta processes as the building block, we construct random measures in a hierarchical structure to generate multiple data sources and capture their shared statistical structure at the same time. The model parameters are inferred efficiently using a novel combination of Gibbs and slice sampling. We demonstrate the applicability of the proposed model in three applications: image retrieval, automatic tag recommendation and image classification. Experiments using two real-world datasets show that our approach outperforms various state-of-the-art related methods.
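The Beta-process construction above lets the number of active latent features be learnt rather than fixed. A minimal sketch of the idea, not the paper's model: a finite Beta-Bernoulli approximation to a Beta process prior, where only the features the data actually switch on count.

```python
import numpy as np

# Sketch (assumed, not from the paper): finite Beta-Bernoulli approximation
# to a Beta process. K is a truncation level, not the learnt dimensionality.
rng = np.random.default_rng(1)

K, N, alpha = 50, 20, 2.0                  # truncation, data items, concentration
pi = rng.beta(alpha / K, 1.0, size=K)      # per-feature inclusion probabilities
Z = rng.random((N, K)) < pi                # binary feature-assignment matrix

active = Z.any(axis=0).sum()               # features actually used by the data
print(Z.shape, int(active))
```

As K grows, the number of active features stays governed by alpha, which is how the effective dimensionality is "learnt from data" rather than set in advance.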


Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the method of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest-ranked nursing journals, based on their 5-year impact factors. The main findings were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues-greater-than-one rule and scree tests to decide upon the number of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established but outdated heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies.
Based on the findings from factor analysis research, it seems likely that the use of such methods has had a material, adverse effect on the solutions generated. We offer recommendations for future practice with respect to each of the five decisions discussed in this paper.
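The eigenvalues-greater-than-one (Kaiser) rule criticised above is easy to state concretely: retain as many factors as there are eigenvalues of the correlation matrix exceeding 1. A sketch on simulated data (not from the reviewed papers):

```python
import numpy as np

# Simulate 8 observed variables driven by two underlying factors,
# then apply the Kaiser rule to the correlation matrix.
rng = np.random.default_rng(2)

latent = rng.normal(size=(200, 2))                  # two latent factors
loadings = rng.normal(size=(2, 8))
X = latent @ loadings + 0.5 * rng.normal(size=(200, 8))

R = np.corrcoef(X, rowvar=False)                    # 8x8 correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]      # descending eigenvalues
n_retain = int((eigvals > 1.0).sum())               # Kaiser criterion
print(n_retain)
```

Note that the eigenvalues of a correlation matrix always sum to the number of variables (here 8), so "greater than one" means "above the average" -- one reason the rule tends to over- or under-extract and why the paper recommends better-supported alternatives.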


Background: Diet composition is one of the factors that may contribute to intraindividual variability in the anticoagulant response to warfarin.

Aim of the study: To determine the associations between food pattern and anticoagulant response to warfarin in a group of Brazilian patients with vascular disease.

Methods: Recent and usual food intakes were assessed in 115 patients receiving warfarin, and the corresponding plasma phylloquinone (vitamin K1), serum triglyceride concentrations, prothrombin time (PT), and International Normalized Ratio (INR) were determined. Factor analysis was used to examine the association of specific foods and biochemical variables with anticoagulant data.

Results: Mean age was 59 +/- 15 years. Inadequate anticoagulation, defined as INR values below 2 or above 3, was found in 48% of the patients. Soybean oil and kidney beans were the primary food sources of phylloquinone intake. Factor analysis yielded four separate factors, explaining 56.4% of the total variance in the data set. Intakes of kidney beans and soybean oil, 24-h recall of phylloquinone intake, PT and INR loaded significantly on factor 1. Triglycerides, PT, INR, plasma phylloquinone, and duration of anticoagulation therapy loaded on factor 3.

Conclusion: Fluctuations in phylloquinone intake, particularly from kidney beans, and plasma phylloquinone concentrations were associated with variation in measures of anticoagulation (PT and INR) in a Brazilian group of patients with vascular disease.


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


This study aimed at evaluating the validity, reliability, and factorial invariance of the complete (34-item) and shortened (8-item and 16-item) versions of the Body Shape Questionnaire (BSQ) when applied to Brazilian university students. A total of 739 female students with a mean age of 20.44 (standard deviation = 2.45) years participated. Confirmatory factor analysis was conducted to verify the degree to which the one-factor structure matches the BSQ's expected structure. Two items of the 34-item version were excluded because they had factor weights (λ) < .40. All models had adequate convergent validity (average variance extracted = .43-.58; composite reliability = .85-.97) and internal consistency (α = .85-.97). The 8-item B version was considered the best shortened BSQ version (Akaike information criterion = 84.07, Bayes information criterion = 157.75, Browne-Cudeck criterion = 84.46), with strong invariance for independent samples (Δχ²λ(7) = 5.06, Δχ²Cov(8) = 5.11, Δχ²Res(16) = 19.30).
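The shortened versions are compared here by information criteria, which trade likelihood against parameter count. A generic sketch of AIC and BIC with invented fit values (not the study's log-likelihoods):

```python
import math

# Information criteria: lower is better. k = free parameters, n = sample size.
def aic(log_lik: float, k: int) -> float:
    return 2 * k - 2 * log_lik

def bic(log_lik: float, k: int, n: int) -> float:
    return k * math.log(n) - 2 * log_lik

# Hypothetical comparison: a shorter model with fewer parameters and a
# similar likelihood wins on both criteria (n = 739 as in the study sample).
print(aic(-30.0, 12), aic(-31.0, 8))
print(bic(-30.0, 12, 739), bic(-31.0, 8, 739))
```

BIC penalises parameters by log(n) rather than 2, so with n = 739 it favours the smaller model even more strongly than AIC does.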


Factor analysis was used to develop a more detailed description of the human hand for use in creating glove sizes; current glove sizes are limited to small, medium, and large. The created sizes give glove designers the ability to produce designs that fit the majority of hand variations in both the male and female populations. The research used data from the 1988 U.S. Army anthropometric survey (ANSUR), which contains eighty-six length, width, height, and circumference measurements of the human hand for one thousand male subjects and thirteen hundred female subjects. Eliminating redundant measurements reduced the data to forty-six essential measurements. Factor analysis grouped the variables into three factors, which were used to generate hand sizes by taking percentiles along each factor axis. Two sizing systems were created: the first contains 125 sizes for males and females; the second contains 7 sizes for males and 14 sizes for females. Both systems were compared to another hand sizing system created from the ANSUR database; the comparison indicated that the systems created using factor analysis provide better fit.
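The sizing approach -- project measurements onto factor axes, then cut sizes at percentiles along each axis -- can be sketched with a plain principal-axis decomposition on simulated data (this is an illustration of the idea, not the study's procedure or the ANSUR data):

```python
import numpy as np

# Simulated stand-in for hand measurements: 100 hands, 5 measurements.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                    # scores on the first two axes

# Size boundaries at the 25th/50th/75th percentiles of each axis -> 4 bins
cuts = np.percentile(scores, [25, 50, 75], axis=0)
size_idx = np.digitize(scores[:, 0], cuts[:, 0])   # size index 0..3 on axis 1
print(sorted(set(size_idx)))  # [0, 1, 2, 3]
```

Crossing the bins from each factor axis yields the full size grid, which is how a small number of cuts per axis multiplies out to systems like the 125-size one described above.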


Introduction. The number of patients with terminal heart failure has grown faster than the supply of donor organs, leading to a high mortality rate on the waiting list. Use of marginal and expanded-criteria donors has increased due to the heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years with respect to donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008, only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the used donors was excluded, even those considered nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis assessed the impact on survival of the following factors: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CK-MB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age was significant; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years showed lower late survival rates (P = .0032). Conclusions. Donor age older than 40 years is an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.


In this thesis some multivariate spectroscopic methods for the analysis of solutions are proposed. Spectroscopy and multivariate data analysis form a powerful combination for obtaining both quantitative and qualitative information and it is shown how spectroscopic techniques in combination with chemometric data evaluation can be used to obtain rapid, simple and efficient analytical methods. These spectroscopic methods consisting of spectroscopic analysis, a high level of automation and chemometric data evaluation can lead to analytical methods with a high analytical capacity, and for these methods, the term high-capacity analysis (HCA) is suggested. It is further shown how chemometric evaluation of the multivariate data in chromatographic analyses decreases the need for baseline separation. The thesis is based on six papers and the chemometric tools used are experimental design, principal component analysis (PCA), soft independent modelling of class analogy (SIMCA), partial least squares regression (PLS) and parallel factor analysis (PARAFAC). The analytical techniques utilised are scanning ultraviolet-visible (UV-Vis) spectroscopy, diode array detection (DAD) used in non-column chromatographic diode array UV spectroscopy, high-performance liquid chromatography with diode array detection (HPLC-DAD) and fluorescence spectroscopy. The methods proposed are exemplified in the analysis of pharmaceutical solutions and serum proteins. In Paper I a method is proposed for the determination of the content and identity of the active compound in pharmaceutical solutions by means of UV-Vis spectroscopy, orthogonal signal correction and multivariate calibration with PLS and SIMCA classification. Paper II proposes a new method for the rapid determination of pharmaceutical solutions by the use of non-column chromatographic diode array UV spectroscopy, i.e. a conventional HPLC-DAD system without any chromatographic column connected. 
In Paper III an investigation is made of the ability of a control sample of known content and identity to diagnose and correct errors in multivariate predictions, something that, together with the use of multivariate residuals, can make it possible to use the same calibration model over time. In Paper IV a method is proposed for the simultaneous determination of serum proteins with fluorescence spectroscopy and multivariate calibration. Paper V proposes a method for the determination of chromatographic peak purity by means of PCA of HPLC-DAD data. In Paper VI PARAFAC is applied to decompose DAD data of some partially separated peaks into pure chromatographic, spectral and concentration profiles.
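The peak-purity idea in Paper V rests on a rank argument: a DAD data matrix (time x wavelength) from a single pure analyte is the outer product of one elution profile and one spectrum, hence rank one; a co-eluting impurity raises the rank. A minimal numpy sketch of that check, with simulated data rather than the thesis's measurements:

```python
import numpy as np

# Simulate a DAD matrix: rows are time points, columns are wavelengths.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 50)

elution = np.exp(-((t - 0.5) ** 2) / 0.01)         # one chromatographic peak
spectrum = rng.random(30)                           # one component spectrum
D_pure = np.outer(elution, spectrum)                # rank 1: pure peak

impurity = np.outer(np.exp(-((t - 0.6) ** 2) / 0.01), rng.random(30))
D_mixed = D_pure + 0.3 * impurity                   # rank 2: co-eluting impurity

def effective_rank(D, tol=1e-8):
    """Count singular values above tol relative to the largest."""
    s = np.linalg.svd(D, compute_uv=False)
    return int((s > tol * s[0]).sum())

print(effective_rank(D_pure), effective_rank(D_mixed))  # 1 2
```

With real, noisy data the tolerance would be set from the noise level rather than machine precision, but the principle is the same: more significant singular values than expected components signals an impure peak.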


During the last decade, a multi-modal approach has been established in human experimental pain research for assessing pain thresholds and responses to various experimental pain modalities. Studies have concluded that differences in responses to pain stimuli are mainly related to variation between individuals rather than variation in response to different stimulus modalities. In a factor analysis of 272 consecutive volunteers (137 men and 135 women) who underwent tests with different experimental pain modalities, it was determined whether responses to different pain modalities represent distinct, individually uncorrelated dimensions of pain perception. Volunteers underwent single painful electrical stimulation, repeated painful electrical stimulation (temporal summation), a test for reflex receptive field, pressure pain stimulation, heat pain stimulation, cold pain stimulation, and a cold pressor test (ice water test). Five distinct factors were found, representing responses to 5 distinct experimental pain modalities: pressure, heat, cold, electrical stimulation, and reflex-receptive fields. Each of the factors explained approximately 8% to 35% of the observed variance, and the 5 factors cumulatively explained 94% of the variance. The correlation between the 5 factors was near null (median ρ=0.00, range -0.03 to 0.05), with 95% confidence intervals for pairwise correlations between 2 factors excluding any relevant correlation. Results were similar for analyses stratified according to gender and age. Responses to different experimental pain modalities represent different specific dimensions and should be assessed in combination in future pharmacological and clinical studies to represent the complexity of nociception and pain experience.
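The near-null correlations between factors reported above are a property worth checking in any such analysis. A sketch of that check on simulated data (same sample size as the study, but random values, not its measurements): scores from an orthogonal decomposition are uncorrelated by construction.

```python
import numpy as np

# Simulated stand-in: 272 "volunteers" with 7 pain-test outcomes.
rng = np.random.default_rng(5)
X = rng.normal(size=(272, 7))

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T                     # five orthogonal factor scores

R = np.corrcoef(scores, rowvar=False)      # 5x5 correlation matrix
off_diag = R[~np.eye(5, dtype=bool)]
print(float(np.abs(off_diag).max()))       # ~0: factors are uncorrelated
```

In a real study the interesting question is whether the factors remain uncorrelated after an oblique rotation, where correlation is permitted; that is what makes the reported median ρ of 0.00 informative rather than automatic.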