59 results for data reduction by factor analysis

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:


Purpose – The purpose of this paper is to investigate and uncover the key determinants that explain partners' commitment to risk management in public-private partnership projects, so that partners' risk management commitment can be taken into consideration when forming optimal risk allocation strategies.

Design/methodology/approach – Based on an extensive literature review and an examination of the public-private partnership (PPP) market, an industry-wide questionnaire survey was conducted to collect the data for a confirmatory factor analysis. The necessary statistical tests were conducted to ensure the validity of the analysis results.

Findings – The factor analysis results show that the confirmatory factor analysis procedure is statistically appropriate and satisfactory. As a result, partners' organisational commitment to risk management in public-private partnerships can be determined by a set of components, namely: general attitude to a risk, perceived ability to manage a risk, and perceived reward for bearing a risk.

Practical implications – Based on the empirical results, it is recommended that decision-makers from both the public and private sectors seriously consider partners' risk management commitment in addition to partners' risk management capability. Both factors influence the formation of optimal risk allocation strategies, whether through their individual or interacting effects. Future research may therefore explore how to form optimal risk allocation strategies by integrating organisational capability and commitment, the determinants and measurement of which have been established in this study.

Originality/value – This paper makes an original contribution to the general body of knowledge on risk allocation in large-scale infrastructure projects in Australia that adopt the public-private partnership procurement method. In particular, it establishes a measurement model of organisational commitment to risk management, which is crucial to determining optimal risk allocation strategies and, in turn, achieving project success. The score coefficients of all obtained components can be used to construct the components by linear combination, so that commitment to risk management can be measured. Previous research has barely focused on this topic.
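
The scoring step described above, measuring commitment as a linear combination of items weighted by component score coefficients, can be sketched as follows. All coefficients, item counts and data below are hypothetical placeholders, not the study's estimates:

```python
import numpy as np

# Hypothetical component score coefficients for the three components
# named above (attitude, perceived ability, perceived reward); real
# values would come from the fitted confirmatory factor analysis.
score_coefs = np.array([
    [0.42, 0.05, 0.10],   # item 1
    [0.38, 0.12, 0.02],   # item 2
    [0.07, 0.45, 0.11],   # item 3
    [0.03, 0.40, 0.09],   # item 4
    [0.08, 0.06, 0.47],   # item 5
])

# Standardised survey responses: one row per partner organisation.
responses = np.random.default_rng(0).standard_normal((100, 5))

# Component scores as linear combinations of the items; an overall
# commitment measure can then aggregate the three components.
component_scores = responses @ score_coefs
commitment = component_scores.mean(axis=1)
print(commitment[:5].round(2))
```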


Relevance:

100.00%

Publisher:

Abstract:

This paper presents a new time-frequency approach to underdetermined blind source separation using the parallel factor decomposition of third-order tensors. Without any constraint on the number of active sources at an auto-term time-frequency point, this approach can directly separate the sources as long as the uniqueness condition of the parallel factor decomposition is satisfied. Compared with existing two-stage methods, in which the mixing matrix must first be estimated and then used to recover the sources, our approach yields better source separation performance in the presence of noise. Moreover, the mixing matrix can be estimated during the source separation process. Numerical simulations show that the proposed approach outperforms several existing two-stage blind source separation methods that also use time-frequency representations.
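
As a rough illustration of the core operation, the sketch below builds a synthetic third-order tensor of known rank and recovers its factors with a CP/PARAFAC decomposition, assuming the tensorly package; it is not the paper's full time-frequency pipeline, and the dimensions are arbitrary:

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
# Synthetic rank-3 third-order tensor (loosely standing in for a
# stacked spatial time-frequency distribution): T = sum_r a_r o b_r o c_r
A = rng.standard_normal((8, 3))   # mixing-related factor
B = rng.standard_normal((64, 3))  # time factor
C = rng.standard_normal((64, 3))  # frequency factor
T = tl.cp_to_tensor((np.ones(3), [A, B, C]))

# CP/PARAFAC decomposition; when the uniqueness (Kruskal) condition
# holds, the recovered factors match the true ones up to scaling
# and permutation.
weights, factors = parafac(tl.tensor(T), rank=3)
print([f.shape for f in factors])
```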

Relevance:

100.00%

Publisher:

Abstract:

Background: The Beck Depression Inventory (BDI) is frequently employed as a measure of depression in studies of obesity. The aim of this study was to assess the factorial structure of the BDI in obese patients prior to bariatric surgery.

Methods: Confirmatory factor analysis was conducted on previously published factor models of the BDI. Three published models were initially analysed, with two additional modified models subsequently included. A sample of 285 patients presenting for Lap-Band® surgery was used.

Results: The published bariatric model by Munoz et al. was not an adequate fit to the data. The general model by Shafer et al. was a good fit but had substantial limitations. The weight loss item did not load significantly on any factor in either model. A modified Shafer model and a newly proposed model were both tested and found to fit the data well, with minimal differences between the two. The proposed model, in which the weight loss and appetite items were omitted, was suggested to be the better model, with good reliability.

Conclusions: The previously published factor analysis in bariatric candidates by Munoz et al. was a poor fit to the data, and use of this factor structure should be seriously reconsidered within the obese population. The hypothesised model was the best fit to the data. The findings suggest that the existing published models are not adequate for investigating depression in obese patients seeking surgery.
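
For readers wanting to replicate this kind of model comparison, a minimal sketch using the semopy package is shown below. The factor specification, item names and file name are illustrative placeholders, not the published Shafer or Munoz structures:

```python
import pandas as pd
import semopy

# Illustrative two-factor CFA specification in lavaan-style syntax;
# the items assigned to each factor are placeholders, and bdi1..bdi21
# are hypothetical column names for the 21 BDI items.
desc = """
cognitive =~ bdi1 + bdi2 + bdi3 + bdi5 + bdi7
somatic   =~ bdi11 + bdi15 + bdi16 + bdi17 + bdi20
"""

data = pd.read_csv("bdi_items.csv")  # hypothetical item-level data
model = semopy.Model(desc)
model.fit(data)

# Compare fit indices (CFI, RMSEA, etc.) across candidate models,
# e.g. refitting after dropping the weight loss and appetite items.
print(semopy.calc_stats(model).T)
```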

Relevance:

100.00%

Publisher:

Abstract:

Multimedia content understanding research requires a rigorous approach to deal with the complexity of the data. At the crux of this problem is how to handle multilevel data whose structure exists at multiple scales and across data sources. A common example is modelling tags jointly with images to improve retrieval, classification and tag recommendation. Associated contextual observations, such as metadata, are rich and can be exploited for content analysis. A major challenge is the need for a principled approach to systematically incorporate associated media with the primary data source of interest. Taking a factor modelling approach, we propose a framework that can discover low-dimensional structures for a primary data source together with other associated information. We cast this task as a subspace learning problem under the framework of Bayesian nonparametrics, so that the subspace dimensionality and the number of clusters are learnt automatically from the data instead of being set a priori. Using Beta processes as the building block, we construct random measures in a hierarchical structure to generate multiple data sources and capture their shared statistical structure at the same time. The model parameters are inferred efficiently using a novel combination of Gibbs and slice sampling. We demonstrate the applicability of the proposed model in three applications: image retrieval, automatic tag recommendation and image classification. Experiments using two real-world datasets show that our approach outperforms various state-of-the-art related methods.
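
A heavily simplified sketch of the Beta process building block (a truncated stick-breaking draw of a Beta-Bernoulli process) illustrates how the number of active latent features is left to the data rather than fixed a priori. This is not the paper's full hierarchical model or its Gibbs/slice sampler:

```python
import numpy as np

rng = np.random.default_rng(1)

def beta_process_features(n_items, alpha=2.0, truncation=50):
    """Truncated stick-breaking draw of a Beta-Bernoulli process:
    feature weights pi_k decay geometrically in expectation, so the
    number of active features is effectively learnt from the data."""
    nu = rng.beta(alpha, 1.0, size=truncation)
    pi = np.cumprod(nu)                         # stick-breaking weights
    Z = rng.random((n_items, truncation)) < pi  # binary feature matrix
    return Z[:, Z.any(axis=0)]                  # drop unused features

Z = beta_process_features(n_items=100)
print(Z.shape)  # number of active latent features is inferred, not preset
```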

Relevance:

100.00%

Publisher:

Abstract:

Just-Noticeable-Differences (JND) as a dead-band in perceptual analysis has been widely used for more than a decade. This technique has been employed for data reduction in haptic data transmission systems by several researchers. Researchers typically use two different JND coefficients, JNDV and JNDF, for velocity and force data respectively. For position data, they usually rely on the resolution of the haptic display device to omit data that are imperceptible to humans. This paper addresses the pruning of undesirable position data produced by vibration of the device or subject and/or by noise in the transmission line. It is shown that using the inverse of JNDV for position data can prune such undesirable data. The results of the proposed method are compared with several well-known filters and with methods proposed by other researchers. The inverse-JNDV approach is shown to provide lower error with desirable curve smoothness and with as little computational effort and complexity as possible. It is also shown that this method achieves considerably greater data reduction than forward JNDV.
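
The dead-band idea reduces to transmitting a sample only when it differs perceptibly from the last transmitted one. A minimal sketch, with an illustrative 10% JND coefficient rather than the paper's tuned values:

```python
import numpy as np

def jnd_deadband(samples, jnd=0.1):
    """Dead-band data reduction sketch: keep a sample only when it
    differs from the last kept value by more than a JND fraction of
    that value. The coefficient 0.1 is illustrative only."""
    kept_idx, last = [0], samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last) > jnd * max(abs(last), 1e-9):
            kept_idx.append(i)
            last = x
    return kept_idx

signal = np.sin(np.linspace(0, 4 * np.pi, 1000)) + 0.3
kept = jnd_deadband(signal)
print(f"kept {len(kept)} of {len(signal)} samples")
```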

Relevance:

100.00%

Publisher:

Abstract:

Embodied energy (EE) analysis has become an important area of energy research, as it attempts to trace the direct and indirect energy requirements of products and services throughout their supply chain. Typically, input-output (I-O) models have been used to calculate EE because they are considered to be comprehensive in their analysis. However, a major deficiency of I-O models is that they contain inherent errors and therefore cannot be reliably applied to individual cases. Thus, there is a need to disaggregate an I-O model into its most important 'energy paths', for the purpose of integrating case-specific data. This paper presents a new hybrid method for conducting EE analyses of individual buildings, which retains the completeness of the I-O model. The new method is demonstrated by application to an Australian residential building. Only 52% of the energy paths derived from the I-O model were substituted using case-specific data. This indicates that previous system boundaries for EE studies of individual residential buildings are less than optimal. It is envisaged that the proposed method will provide construction professionals with more accurate and reliable data for conducting life cycle energy analysis of buildings. Furthermore, by analysing the unmodified energy paths, further data collection can be prioritised effectively.
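
The core I-O computation, deriving total energy intensities from the Leontief inverse and expanding them into individual energy paths that case-specific process data can replace, can be sketched on a toy model. All figures below are illustrative, not Australian I-O data:

```python
import numpy as np

# Toy 3-sector input-output model. A[i, j] is the input from sector i
# required per unit output of sector j; e is the direct energy
# intensity of each sector (hypothetical units, e.g. GJ per $).
A = np.array([[0.10, 0.25, 0.05],
              [0.20, 0.05, 0.15],
              [0.05, 0.10, 0.08]])
e = np.array([2.0, 0.5, 1.2])

# Total embodied energy intensities via the Leontief inverse:
# total = e (I - A)^(-1) = e (I + A + A^2 + ...)
total = e @ np.linalg.inv(np.eye(3) - A)

# Hybrid step (sketch): expand the series into individual "energy
# paths", here just the first-order terms e_i * A[i, j], substitute
# case-specific data for the largest paths, and retain the I-O
# remainder so the system boundary stays complete.
first_order = e[:, None] * A
print(total.round(3), first_order.round(3), sep="\n")
```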

Relevance:

100.00%

Publisher:

Abstract:

The optimal source precoding matrix and relay amplifying matrix have been developed in recent works on multiple-input multiple-output (MIMO) relay communication systems, assuming that instantaneous channel state information (CSI) is available. In practical relay communication systems, however, the instantaneous CSI is unknown and therefore has to be estimated at the destination node. In this paper, we develop a novel channel estimation algorithm for two-hop MIMO relay systems using parallel factor (PARAFAC) analysis. The proposed algorithm provides the destination node with full knowledge of all channel matrices involved in the communication. Compared with existing approaches, it requires fewer training data blocks, yields smaller channel estimation errors, and is applicable to both one-way and two-way MIMO relay systems with single or multiple relay nodes. Numerical examples demonstrate the effectiveness of the PARAFAC-based channel estimation algorithm.
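
A rough sketch of the idea, assuming the tensorly package: if the training scheme arranges the received blocks into a third-order tensor whose CP factors involve the two hop channels, a PARAFAC decomposition recovers both jointly, up to the usual scaling and permutation ambiguity. The tensor construction below is illustrative, not the paper's exact signal model:

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(2)
# Illustrative factors: relay-destination channel, source-relay
# channel (as one factor matrix), and a known training-dependent
# factor. All dimensions are arbitrary placeholders.
G = rng.standard_normal((4, 3))   # relay -> destination
H = rng.standard_normal((5, 3))   # source -> relay
S = rng.standard_normal((10, 3))  # training-dependent factor
T = tl.cp_to_tensor((np.ones(3), [G, H, S]))

# PARAFAC recovers all factors jointly, giving the destination
# knowledge of both hops up to scaling/permutation ambiguity,
# which known training symbols can then resolve.
weights, (G_hat, H_hat, S_hat) = parafac(tl.tensor(T), rank=3)
print(G_hat.shape, H_hat.shape)
```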

Relevance:

100.00%

Publisher:

Abstract:

Exploratory factor analysis (hereafter, factor analysis) is a complex statistical method that is integral to many fields of research. Using factor analysis requires researchers to make several decisions, each of which affects the solutions generated. In this paper, we focus on five major decisions that are made in conducting factor analysis: (i) establishing how large the sample needs to be, (ii) choosing between factor analysis and principal components analysis, (iii) determining the number of factors to retain, (iv) selecting a method of data extraction, and (v) deciding upon the methods of factor rotation. The purpose of this paper is threefold: (i) to review the literature with respect to these five decisions, (ii) to assess current practices in nursing research, and (iii) to offer recommendations for future use. The literature reviews illustrate that factor analysis remains a dynamic field of study, with recent research having practical implications for those who use this statistical method. The assessment was conducted on 54 factor analysis (and principal components analysis) solutions presented in the results sections of 28 papers published in the 2012 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. The main findings from the assessment were that researchers commonly used (a) participants-to-items ratios for determining sample sizes (used for 43% of solutions), (b) principal components analysis (61%) rather than factor analysis (39%), (c) the eigenvalues-greater-than-one rule and scree tests to decide upon the number of factors/components to retain (61% and 46%, respectively), (d) principal components analysis and unweighted least squares as methods of data extraction (61% and 19%, respectively), and (e) the Varimax method of rotation (44%). In general, well-established but outdated heuristics and practices informed decision making with respect to the performance of factor analysis in nursing studies. Based on the findings from factor analysis research, it seems likely that the use of such methods may have had a material, adverse effect on the solutions generated. We offer recommendations for future practice with respect to each of the five decisions discussed in this paper.
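
The five decisions can be made concrete in code. Below is a sketch using the factor_analyzer package, with parallel analysis in place of the eigenvalue-greater-than-one rule, maximum-likelihood extraction, and an oblique (oblimin) rotation; the dataset and file name are hypothetical:

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical item-level survey dataset, one column per item.
X = pd.read_csv("survey_items.csv")

# Decision (iii): parallel analysis instead of eigenvalues > 1.
# Retain factors whose observed eigenvalues exceed the mean
# eigenvalues of same-sized random data.
fa_full = FactorAnalyzer(rotation=None)
fa_full.fit(X)
obs_eig, _ = fa_full.get_eigenvalues()
rand_eig = np.mean(
    [np.linalg.eigvalsh(np.corrcoef(
        np.random.default_rng(s).standard_normal(X.shape),
        rowvar=False))[::-1] for s in range(100)],
    axis=0)
n_factors = int(np.sum(obs_eig > rand_eig))

# Decisions (ii), (iv), (v): common factor analysis (not PCA), with
# maximum-likelihood extraction and an oblique rotation.
fa = FactorAnalyzer(n_factors=n_factors, method="ml", rotation="oblimin")
fa.fit(X)
print(n_factors, fa.loadings_.round(2), sep="\n")
```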

Relevance:

100.00%

Publisher:

Abstract:

Open data has created an unprecedented opportunity, with new challenges, for ecosystem scientists. Skills in data management are essential to acquire, manage, publish, access and re-use data. These skills span many disciplines and require trans-disciplinary collaboration. Science synthesis centres support analysis and synthesis through collaborative 'Working Groups', where domain specialists work together to synthesise existing information to provide insight into critical problems. The Australian Centre for Ecological Analysis and Synthesis (ACEAS) served a wide range of stakeholders, from scientists to policy-makers to managers. This paper investigates the level of sophistication in data management in the ecosystem science community through the lens of the ACEAS experience, and identifies the important factors required for the community to benefit from this new data world and produce innovative science. ACEAS promoted the analysis and synthesis of data to solve transdisciplinary questions, and promoted the publication of the synthesised data. To do so, it provided support in many of the key skillsets required. Analysis and synthesis in multi-disciplinary and multi-organisational teams, and publishing data, were new to most participants. Data were difficult to discover, access and make ready for analysis, largely due to a lack of metadata. Data use and publication were hampered by concerns about data ownership and a desire for data citation. A web portal was created to visualise geospatial datasets and maximise data interpretation. By the end of the experience there was a significant increase in appreciation of the importance of a Data Management Plan. It is extremely doubtful that the work would have occurred, or the data been delivered, without the support of the synthesis centre, as few of the participants had the necessary networks or skills. It is argued that participation in the Centre provided an important learning opportunity and has resulted in improved knowledge and understanding of good data management practices.

Relevance:

100.00%

Publisher:

Abstract:

Physiological and genetic information has been critical to the successful diagnosis and prognosis of complex diseases. In this paper, we introduce a support-confidence-correlation framework to accurately discover truly meaningful and interesting association rules between complex physiological and genetic data for disease factor analysis, such as type II diabetes (T2DM). We propose a novel Multivariate and Multidimensional Association Rule mining system based on Change Detection (MMARCD). Given a complex data set u_i (e.g. u_1 numerical data streams, u_2 images, u_3 videos, u_4 DNA/RNA sequences) observed at each time tick t, MMARCD incrementally finds correlations and hidden variables that summarise the key relationships across the entire system. Based upon MMARCD, we are able to construct a correlation network for human diseases.
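
The support-confidence-correlation triple for a candidate rule can be computed directly. A toy sketch, using lift as the correlation measure; the transactions, item names and thresholds are illustrative, not the MMARCD system itself:

```python
# Sketch of the support-confidence-correlation framework on toy
# transaction data; each transaction is a set of observed indicators.
transactions = [
    {"high_glucose", "high_bmi", "t2dm"},
    {"high_glucose", "t2dm"},
    {"high_bmi"},
    {"high_glucose", "high_bmi", "t2dm"},
    {"normal_glucose"},
]
n = len(transactions)

def support(itemset):
    return sum(itemset <= t for t in transactions) / n

def rule_stats(lhs, rhs):
    sup = support(lhs | rhs)
    conf = sup / support(lhs)
    lift = conf / support(rhs)  # lift > 1: positive correlation
    return sup, conf, lift

# Rule: {high_glucose, high_bmi} -> {t2dm}
print(rule_stats({"high_glucose", "high_bmi"}, {"t2dm"}))
```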

Relevance:

100.00%

Publisher:

Abstract:

When wearable and personal health devices and sensors capture data such as heart rate and body temperature for fitness tracking and health services, they simply transfer the data without filtering or optimising. This can overload the sensors and rapidly drain their batteries when they interact with Internet of Things (IoT) networks, which are expected to grow and demand more health data from device wearers. To solve this problem, this paper proposes inferring sensed data to reduce the data volume, thereby reducing the bandwidth and battery power consumption that are essential constraints on sensor devices. This is achieved by applying beacon data points after inference over the data stream using variance rates, which compare each sensed value with the adjacent data before and after it. Experiments verify that this novel approach can reduce data volume by up to 99.5% with 98.62% accuracy. Whilst most existing work focuses on sensor network improvements such as routing, operation and data-reading algorithms, we efficiently reduce data volume to cut bandwidth and battery power consumption while maintaining accuracy, by implementing intelligence and optimisation in the sensor devices.
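
A minimal sketch of the variance-rate idea: a reading is transmitted as a beacon point only when it deviates from the value inferred from its neighbours by more than a threshold. The 2% rate and the synthetic heart-rate data are illustrative assumptions, not the paper's tuned values:

```python
import numpy as np

def beacon_reduce(samples, var_rate=0.02):
    """Keep a 'beacon' point only when a reading deviates from the
    value inferred from its adjacent neighbours by more than
    var_rate; endpoints are always kept so the stream can be
    reconstructed by interpolation."""
    beacons = [0]
    for i in range(1, len(samples) - 1):
        inferred = (samples[i - 1] + samples[i + 1]) / 2
        if abs(samples[i] - inferred) > var_rate * abs(inferred):
            beacons.append(i)
    beacons.append(len(samples) - 1)
    return beacons

hr = 70 + 3 * np.sin(np.linspace(0, 6, 500))  # synthetic heart rate
kept = beacon_reduce(hr)
print(f"transmitted {len(kept)}/{len(hr)} points "
      f"({100 * (1 - len(kept) / len(hr)):.1f}% reduction)")
```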

Relevance:

100.00%

Publisher:

Abstract:

This paper evaluates a recently developed hybrid method for the embodied energy analysis of the Australian construction industry. It was found that the truncation associated with process analysis can be up to 80%, whilst the use of input-output analysis alone does not always provide a perfect model for replacing process data. There is also a considerable lack in the quantity, and possibly the quality, of process data currently available. These findings suggest that current best-practice methods are sufficiently accurate for most typical applications, but this is heavily dependent upon data quality and availability. The hybrid method evaluated can be used for the optimisation of embodied energy and for identifying opportunities for improvements in energy efficiency.

Relevance:

100.00%

Publisher:

Abstract:

Background: Although there is increasing recognition that quality of life (QOL) and health-related quality of life (HRQOL) are important outcome variables in clinical trials for children with cerebral palsy, there are substantial limitations in existing measures of QOL. This study identified themes of QOL for children with cerebral palsy and their parents to guide the development of a new condition-specific QOL scale. Methods: A qualitative study of parent and child views on QOL composition was conducted using a grounded theory framework. Families participated in semistructured interviews on QOL until thematic saturation was reached (n = 28 families). Results: Overall, 13 themes emerged from the interviews: physical health; body pain and discomfort; daily living tasks; participation in regular physical and social activities; emotional well-being and self-esteem; interaction with the community; communication; family health; supportive physical environment; future QOL; provision of, and access to, services; financial stability; and social well-being. Conclusions: Research with parents and children with cerebral palsy, representative of severity across the disease spectrum and of socio-economic status, reinforced and expanded on the traditional themes that have underpinned QOL measurement development. This has implications not only for the development of a new QOL scale for children with cerebral palsy, but also for clinical interventions and community care management.