895 results for "exploratory factor analysis"


Relevance: 100.00%

Abstract:

The optimal source precoding matrix and relay amplifying matrix have been developed in recent works on multiple-input multiple-output (MIMO) relay communication systems under the assumption that instantaneous channel state information (CSI) is available. In practical relay communication systems, however, the instantaneous CSI is unknown and therefore has to be estimated at the destination node. In this paper, we develop a novel channel estimation algorithm for two-hop MIMO relay systems using parallel factor (PARAFAC) analysis. The proposed algorithm provides the destination node with full knowledge of all channel matrices involved in the communication. Compared with existing approaches, the proposed algorithm requires fewer training data blocks, yields a smaller channel estimation error, and is applicable to both one-way and two-way MIMO relay systems with single or multiple relay nodes. Numerical examples demonstrate the effectiveness of the PARAFAC-based channel estimation algorithm.
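The PARAFAC machinery this abstract builds on can be sketched with a generic alternating-least-squares (ALS) fit of a third-order tensor. This is a minimal illustration of the CP/PARAFAC decomposition, not the paper's channel-estimation algorithm: the tensor below is synthetic and all dimensions are arbitrary.

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Khatri-Rao product: (I x R) and (J x R) -> (I*J x R)
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def unfold(T, mode):
    # Mode-n unfolding of a 3-way tensor (row-major reshape)
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def parafac_als(T, rank, n_iter=200, seed=0):
    # Alternating least squares for the CP/PARAFAC model:
    #   T[i,j,k] ~= sum_r A[i,r] * B[j,r] * C[k,r]
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((T.shape[0], rank))
    B = rng.standard_normal((T.shape[1], rank))
    C = rng.standard_normal((T.shape[2], rank))
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Synthetic rank-2 tensor standing in for stacked received-signal blocks
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = parafac_als(T, rank=2)
rel_err = np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(T)
```

The essential uniqueness of the recovered factors (up to scaling and permutation, under Kruskal's condition) is what makes PARAFAC attractive for identifying channel matrices from received data.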

Abstract:


Purpose – The purpose of this paper is to investigate and uncover key determinants that could explain partners' commitment to risk management in public-private partnership projects, so that partners' risk management commitment can be taken into consideration when forming optimal risk allocation strategies.

Design/methodology/approach – Based on an extensive literature review and an examination of the public-private partnership (PPP) market, an industry-wide questionnaire survey was conducted to collect data for a confirmatory factor analysis. The necessary statistical tests were conducted to ensure the validity of the analysis results.

Findings – The factor analysis results show that the confirmatory factor analysis procedure is statistically appropriate and satisfactory. As a result, partners' organizational commitment to risk management in public-private partnerships can be determined by a set of components, namely general attitude to a risk, perceived ability to manage a risk, and perceived reward for bearing a risk.

Practical implications – It is recommended, based on the empirical results shown in this paper, that, in addition to partners' risk management capability, decision-makers, both from public and private sectors, should also seriously consider partners' risk management commitment. Both factors influence the formation of optimal risk allocation strategies, either by their individual or interacting effects. Future research may therefore explore how to form optimal risk allocation strategies by integrating organizational capability and commitment, the determinants and measurement of which have been established in this study.

Originality/value – This paper makes an original contribution to the general body of knowledge on risk allocation in large-scale infrastructure projects in Australia adopting the procurement method of public-private partnership. In particular, this paper has innovatively established a measurement model of organizational commitment to risk management, which is crucial to determining optimal risk allocation strategies and in turn achieving project success. The score coefficients of all obtained components can be used to construct components by linear combination so that commitment to risk management can be measured. Previous research has barely focused on this topic.
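The "linear combination of score coefficients" mentioned above can be illustrated with the standard regression (Thurstone) method for computing factor scores. The loadings and simulated item responses below are hypothetical stand-ins, not the paper's survey data.

```python
import numpy as np

# Hypothetical one-component example: 5 commitment items driven by one latent
# factor. Loadings and noise level are invented for illustration only.
rng = np.random.default_rng(1)
n_resp, n_items = 300, 5
latent = rng.standard_normal(n_resp)                  # true commitment factor
loadings = np.array([0.80, 0.70, 0.75, 0.60, 0.65])   # assumed CFA loadings
Z = latent[:, None] * loadings + 0.5 * rng.standard_normal((n_resp, n_items))
Z = (Z - Z.mean(0)) / Z.std(0)                        # standardize items

# Regression (Thurstone) score coefficients: W = R^-1 Lambda
R = np.corrcoef(Z, rowvar=False)                      # item correlation matrix
W = np.linalg.solve(R, loadings[:, None])             # score coefficients
scores = Z @ W                                        # linear combination of items

recovery = np.corrcoef(scores.ravel(), latent)[0, 1]  # how well scores track the factor
```

With moderate loadings, the composite score tracks the underlying commitment factor closely, which is the sense in which the score coefficients "measure" commitment.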


Abstract:

This paper presents a new time-frequency approach to underdetermined blind source separation using the parallel factor decomposition of third-order tensors. Without any constraint on the number of active sources at an auto-term time-frequency point, this approach can directly separate the sources as long as the uniqueness condition of the parallel factor decomposition is satisfied. Compared with existing two-stage methods, where the mixing matrix must be estimated first and then used to recover the sources, our approach yields better source separation performance in the presence of noise. Moreover, the mixing matrix can be estimated at the same time as the sources are separated. Numerical simulations show that the proposed approach outperforms some existing two-stage blind source separation methods that also use the time-frequency representation.

Abstract:

Background: The Beck Depression Inventory (BDI) is frequently employed as a measure of depression in studies of obesity. The aim of this study was to assess the factorial structure of the BDI in obese patients prior to bariatric surgery.

Methods: Confirmatory factor analysis was conducted on the currently published factor analyses of the BDI. Three published models were initially analysed, with two additional modified models subsequently included. A sample of 285 patients presenting for Lap-Band® surgery was used.

Results: The published bariatric model by Munoz et al. was not an adequate fit to the data. The general model by Shafer et al. was a good fit to the data but had substantial limitations. The weight loss item did not load significantly on any factor in either model. A modified Shafer model and a newly proposed model were tested; both were found to fit the data well, with minimal differences between the two. The proposed model, in which two items (weight loss and appetite) were omitted, was suggested to be the better model, with good reliability.

Conclusions: The previously published factor analysis in bariatric candidates by Munoz et al. was a poor fit to the data, and use of this factor structure should be seriously reconsidered within the obese population. The hypothesised model was the best fit to the data. The findings suggest that the existing published models are not adequate for investigating depression in obese patients seeking surgery.

Abstract:

Multimedia content understanding research requires a rigorous approach to deal with the complexity of the data. At the crux of this problem is the method for dealing with multilevel data whose structure exists at multiple scales and across data sources. A common example is modeling tags jointly with images to improve retrieval, classification, and tag recommendation. Associated contextual observations, such as metadata, are rich and can be exploited for content analysis. A major challenge is the need for a principled approach to systematically incorporate associated media with the primary data source of interest. Taking a factor modeling approach, we propose a framework that can discover low-dimensional structures for a primary data source together with other associated information. We cast this task as a subspace learning problem under the framework of Bayesian nonparametrics, so that the subspace dimensionality and the number of clusters are automatically learnt from data instead of being set a priori. Using Beta processes as the building block, we construct random measures in a hierarchical structure to generate multiple data sources and capture their shared statistical properties at the same time. The model parameters are inferred efficiently using a novel combination of Gibbs and slice sampling. We demonstrate the applicability of the proposed model in three applications: image retrieval, automatic tag recommendation, and image classification. Experiments on two real-world datasets show that our approach outperforms various state-of-the-art related methods.

Abstract:

We examine bivariate extensions of Aït-Sahalia’s approach to the estimation of univariate diffusions. Our message is that extending his idea to a bivariate setting is not straightforward. In higher dimensions, as opposed to the univariate case, the elements of the Itô and Fokker-Planck representations do not coincide; and even imposing sensible assumptions on the marginal drifts and volatilities is not sufficient to obtain direct generalisations. We develop exploratory estimation and testing procedures, by parametrizing the drifts of both component processes and setting restrictions on the terms of either the Itô or the Fokker-Planck covariance matrices. This may lead to highly nonlinear ordinary differential equations, where the definition of boundary conditions is crucial. For the methods developed, the Fokker-Planck representation seems more tractable than Itô’s. Questions for further research include the design of regularity conditions on the time series dependence in the data, the kernels actually used and the bandwidths, to obtain asymptotic properties for the estimators proposed. A particular case seems promising: “causal bivariate models” in which only one of the diffusions contributes to the volatility of the other. Hedging strategies which estimate separately the univariate diffusions at stake may thus be improved.
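For reference, the two representations being compared are the Itô SDE and its forward Kolmogorov (Fokker-Planck) equation; a standard statement for a d-dimensional diffusion is:

```latex
dX_t = \mu(X_t)\,dt + \sigma(X_t)\,dW_t,
\qquad D(x) = \sigma(x)\sigma(x)^{\top},
\]
\[
\frac{\partial p}{\partial t}
  = -\sum_{i} \frac{\partial}{\partial x_i}\bigl[\mu_i(x)\,p\bigr]
  + \frac{1}{2}\sum_{i,j} \frac{\partial^2}{\partial x_i\,\partial x_j}\bigl[D_{ij}(x)\,p\bigr].
```

In one dimension the drift-diffusion pair maps one-to-one onto the Fokker-Planck coefficients; for d > 1 the second-order term mixes the entries of D = σσᵀ, including its off-diagonal elements, which is the sense in which the elements of the two representations no longer coincide.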

Abstract:

In this paper, a set of representative Brazilian commercial gasoline samples from São Paulo State, selected by HCA, plus six samples obtained directly from refineries, were analysed by a highly sensitive gas chromatographic (GC) method (ASTM D6733). The levels of saturated hydrocarbons and anhydrous ethanol obtained by GC were correlated with the quality assessed against the specifications of the Brazilian National Agency of Petroleum, Natural Gas and Biofuels (ANP) through exploratory analysis (HCA and PCA). This correlation showed that the GC method, together with HCA and PCA, could be employed as a screening technique to determine compliance of Brazilian gasoline with the prescribed legal standards.
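The PCA half of this screening idea can be sketched as follows. The numbers are invented stand-ins for GC-measured saturates and ethanol contents, not the paper's data; the point is only that projecting autoscaled measurements onto principal components makes an off-spec sample stand out.

```python
import numpy as np

# Invented stand-ins for GC results: column 0 = % saturates, column 1 = % ethanol
rng = np.random.default_rng(0)
conforming = rng.normal([45.0, 25.0], [1.0, 0.5], size=(30, 2))
offspec = np.array([[55.0, 10.0]])          # one clearly non-conforming sample
X = np.vstack([conforming, offspec])

# Autoscale (as usual in chemometrics), then rotate onto principal components
Xc = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                          # PC scores

# Flag the sample farthest from the centroid of the conforming group
center = scores[:30].mean(0)
dist = np.linalg.norm(scores - center, axis=1)
suspect = int(np.argmax(dist))
```

In the paper's workflow, HCA would additionally group samples by similarity; here a simple distance from the conforming centroid already isolates the off-spec sample (index 30).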

Abstract:

Background: Diet composition is one of the factors that may contribute to intraindividual variability in the anticoagulant response to warfarin.

Aim of the study: To determine the associations between food pattern and anticoagulant response to warfarin in a group of Brazilian patients with vascular disease.

Methods: Recent and usual food intakes were assessed in 115 patients receiving warfarin, and corresponding plasma phylloquinone (vitamin K1), serum triglyceride concentrations, prothrombin time (PT), and International Normalized Ratio (INR) were determined. A factor analysis was used to examine the association of specific foods and biochemical variables with anticoagulant data.

Results: Mean age was 59 ± 15 years. Inadequate anticoagulation, defined as INR values below 2 or above 3, was found in 48% of the patients. Soybean oil and kidney beans were the primary food sources of phylloquinone. Factor analysis yielded four separate factors, explaining 56.4% of the total variance in the data set. Intakes of kidney beans and soybean oil, 24-h recall of phylloquinone intake, PT, and INR loaded significantly on factor 1. Triglycerides, PT, INR, plasma phylloquinone, and duration of anticoagulation therapy loaded on factor 3.

Conclusion: Fluctuations in phylloquinone intake, particularly from kidney beans, and plasma phylloquinone concentrations were associated with variation in measures of anticoagulation (PT and INR) in a Brazilian group of patients with vascular disease.
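The INR used above is derived from the prothrombin time by the standard formula INR = (patient PT / mean normal PT)^ISI. A one-line sketch, with the 2-3 window as the therapeutic range whose violation defines inadequate anticoagulation; the times and ISI in the example are illustrative, not the study's measurements.

```python
def inr(pt_patient_s: float, pt_normal_s: float, isi: float) -> float:
    """International Normalized Ratio: (patient PT / mean normal PT) ** ISI."""
    return (pt_patient_s / pt_normal_s) ** isi

def in_therapeutic_range(value: float, low: float = 2.0, high: float = 3.0) -> bool:
    # The 2-3 window is the range used in the study to define
    # adequate anticoagulation
    return low <= value <= high
```

For example, a patient PT of 18 s against a mean normal PT of 12 s with ISI = 1.0 gives an INR of 1.5, below the therapeutic window.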

Abstract:

This study aimed at evaluating the validity, reliability, and factorial invariance of the complete (34-item) and shortened (8-item and 16-item) versions of the Body Shape Questionnaire (BSQ) when applied to Brazilian university students. A total of 739 female students with a mean age of 20.44 (standard deviation = 2.45) years participated. Confirmatory factor analysis was conducted to verify the degree to which the one-factor structure satisfies the proposed BSQ structure. Two items of the 34-item version were excluded because of low factor weights (λ < .40). All models had adequate convergent validity (average variance extracted = .43-.58; composite reliability = .85-.97) and internal consistency (α = .85-.97). The 8-item B version was considered the best shortened BSQ version (Akaike information criterion = 84.07, Bayes information criterion = 157.75, Browne-Cudeck criterion = 84.46), with strong invariance for independent samples (Δχ²λ(7) = 5.06, Δχ²Cov(8) = 5.11, Δχ²Res(16) = 19.30).
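Internal-consistency figures like the α = .85-.97 range above are Cronbach's alpha values. A minimal implementation of the classic formula α = k/(k-1) · (1 − Σσ²ᵢ / σ²ₜₒₜ); the data in the usage test are synthetic, not BSQ responses.

```python
import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents, n_items) matrix of item scores.
    # Classic tau-equivalent internal-consistency estimate.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
    return k / (k - 1) * (1 - item_var_sum / total_var)
```

Perfectly redundant items give α = 1, while mutually independent items give α near 0, which is why high alpha is read as the items measuring a common construct.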

Abstract:

Factor analysis was used to develop a more detailed description of the human hand to be used in the creation of glove sizes; current glove sizes are limited to small, medium, and large. The created glove sizes give glove designers the ability to create a glove design that fits the majority of hand variations in both the male and female populations. The research used the Anthropometric Survey of US Army Personnel (ANSUR) data collected in 1988. These data contain eighty-six length, width, height, and circumference measurements of the human hand for one thousand male subjects and thirteen hundred female subjects. Eliminating redundant measurements reduced the data to forty-six essential measurements. Factor analysis grouped the variables into three factors. The factors were used to generate hand sizes by taking percentiles along each factor axis. Two different sizing systems were created. The first system contains 125 sizes each for males and females. The second system contains 7 sizes for males and 14 sizes for females. The sizing systems were compared with another hand sizing system created from the ANSUR database, indicating that the systems created using factor analysis provide better fit.
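The percentile-based size generation can be sketched as follows: cutting each of three factor axes into five percentile bins yields up to 5³ = 125 distinct sizes, matching the first system described. The factor scores below are simulated stand-ins for the ANSUR-derived scores, and the choice of five equal-probability bins per axis is an assumption for illustration.

```python
import numpy as np

# Simulated factor scores: 2300 subjects x 3 hand factors (stand-ins for the
# ANSUR-derived scores; the real study used separate male and female pools)
rng = np.random.default_rng(42)
scores = rng.standard_normal((2300, 3))

# Cut each factor axis at the 20/40/60/80th percentiles -> 5 bins per axis,
# hence at most 5**3 = 125 distinct size triples
edges = [np.percentile(scores[:, j], [20, 40, 60, 80]) for j in range(3)]
bins = np.column_stack([np.digitize(scores[:, j], edges[j]) for j in range(3)])
sizes = {tuple(row) for row in bins}    # each distinct triple is one glove size
```

Each subject falls into exactly one (factor-1, factor-2, factor-3) bin triple, so a size label doubles as a region of the factor-score space.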

Abstract:

Introduction. The number of patients with terminal heart failure has grown faster than the supply of donor organs, leading to a high mortality rate on the waiting list. Use of marginal and expanded-criteria donors has increased because of the organ shortage. Objective. We analyzed all heart transplantations (HTx) in São Paulo state over 8 years with respect to donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of São Paulo, Brazil. From 2002 to 2008, only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the donors used was excluded, even those considered nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 ± 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis assessed the impact on survival of the following factors: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CK-MB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age remained significant; logistic regression showed a significant cutoff at 40 years: organs from donors older than 40 years showed a lower late survival rate (P = .0032). Conclusions. Donor age older than 40 years is an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.
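A univariate look at a cutoff such as donor age 40 can be illustrated with the odds ratio and Pearson chi-square statistic on a 2x2 table. The counts below are invented for illustration, not the study's data, and this is a generic sketch rather than the authors' actual analysis.

```python
def two_by_two_stats(a, b, c, d):
    """a, b = deaths / survivors with donors > 40 y;
    c, d = deaths / survivors with donors <= 40 y.
    Returns (odds ratio, Pearson chi-square statistic)."""
    n = a + b + c + d
    odds_ratio = (a * d) / (b * c)
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return odds_ratio, chi2

# Invented counts: 60/40 outcomes with older donors, 30/70 with younger donors
or_, chi2 = two_by_two_stats(60, 40, 30, 70)
```

An odds ratio well above 1 with a large chi-square would, as in the study, single the cutoff out as a candidate risk factor to carry into the multivariate model.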