58 results for improved principal components analysis (IPCA) algorithm


Relevance: 100.00%

Abstract:

This is the first paper to introduce a nonlinearity test for principal component models. The methodology divides the data space into disjoint regions that are analysed by principal component analysis using the cross-validation principle. Several toy examples have been successfully analysed, and the nonlinearity test has subsequently been applied to data from an internal combustion engine.
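
The abstract leaves the regioning and comparison details open. The sketch below, in Python, illustrates the general idea under two assumed choices (regions formed by splitting along the first score direction, and cross-validated reconstruction error as the comparison statistic); it is not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import KFold

def cv_reconstruction_error(X, n_components, n_splits=5):
    """Cross-validated PCA reconstruction error within one data region."""
    errors = []
    for train, test in KFold(n_splits, shuffle=True, random_state=0).split(X):
        pca = PCA(n_components).fit(X[train])
        X_hat = pca.inverse_transform(pca.transform(X[test]))
        errors.append(np.mean((X[test] - X_hat) ** 2))
    return float(np.mean(errors))

def nonlinearity_test(X, n_components=2, n_regions=3):
    """Compare cross-validated PCA errors across disjoint regions.

    A linear model should generalise equally well in every region, so a
    large disparity between regions indicates nonlinearity."""
    t1 = PCA(1).fit_transform(X).ravel()           # first score, used to order samples
    order = np.argsort(t1)
    regions = np.array_split(X[order], n_regions)  # disjoint regions of the data space
    errors = [cv_reconstruction_error(R, n_components) for R in regions]
    return max(errors) / min(errors)               # ratio near 1 suggests linearity
```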

Relevance: 100.00%

Abstract:

This paper introduces a fast algorithm for moving window principal component analysis (MWPCA) that adapts a principal component model. It incorporates the concept of recursive adaptation within a moving window to (i) adapt the mean and variance of the process variables, (ii) adapt the correlation matrix, and (iii) adjust the PCA model by recomputing the decomposition. The paper shows that the new algorithm is computationally faster than conventional moving window techniques if the window size exceeds three times the number of variables, and that its speed is not affected by the window size. A further contribution is the introduction of an N-step-ahead horizon into process monitoring: the PCA model identified N steps earlier is used to analyze the current observation. For monitoring complex chemical systems, this work shows that the use of the horizon improves the ability to detect slowly developing drifts.
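
A minimal sketch of the monitoring logic follows. It recomputes the decomposition on every window rather than using the paper's fast recursive update of the mean, variance and correlation matrix, so it reproduces the behaviour, not the speed-up; the class name and interface are illustrative.

```python
from collections import deque
import numpy as np
from sklearn.decomposition import PCA

class MWPCAMonitor:
    def __init__(self, window_size, n_components, horizon):
        self.window = deque(maxlen=window_size)
        self.models = deque(maxlen=horizon + 1)  # keeps the model from N steps back
        self.n_components = n_components

    def update(self, x):
        """Add one observation; return its SPE under the model fitted N steps earlier."""
        spe = None
        if len(self.models) == self.models.maxlen:
            mean, std, pca = self.models[0]      # model identified N steps earlier
            z = (x - mean) / std
            residual = z - pca.inverse_transform(pca.transform(z[None, :]))[0]
            spe = float(residual @ residual)     # squared prediction error statistic
        self.window.append(x)
        if len(self.window) == self.window.maxlen:
            W = np.asarray(self.window)
            mean, std = W.mean(axis=0), W.std(axis=0)
            self.models.append((mean, std, PCA(self.n_components).fit((W - mean) / std)))
        return spe
```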

Relevance: 100.00%

Abstract:

The present study examined the consistency over time of individual differences in behavioral and physiological responsiveness of calves to intuitively alarming test situations as well as the relationships between behavioral and physiological measures. Twenty Holstein Friesian heifer calves were individually subjected to the same series of two behavioral and two hypothalamo-pituitary-adrenocortical (HPA) axis reactivity tests at 3, 13 and 26 weeks of age. Novel environment (open field, OF) and novel object (NO) tests involved measurement of behavioral, plasma cortisol and heart rate responses. Plasma ACTH and/or cortisol response profiles were determined after administration of exogenous CRH and ACTH, respectively, in the HPA axis reactivity tests. Principal component analysis (PCA) was used to condense correlated measures within ages into principal components reflecting independent dimensions underlying the calves' reactivity. Cortisol responses to the OF and NO tests were positively associated with the latency to contact and negatively related to the time spent in contact with the NO. Individual differences in scores of a principal component summarizing this pattern of inter-correlations, as well as differences in separate measures of adrenocortical and behavioral reactivity in the OF and NO tests, proved highly consistent over time. The cardiac response to confinement in a start box prior to the OF test was positively associated with the cortisol responses to the OF and NO tests at 26 weeks of age. HPA axis reactivity to ACTH or CRH was unrelated to adrenocortical and behavioral responses to novelty. These findings strongly suggest that the responsiveness of calves was mediated by stable individual characteristics. Correlated adrenocortical and behavioral responses to novelty may reflect underlying fearfulness, defining the individual's susceptibility to the elicitation of fear. Other independent characteristics mediating reactivity may include activity or coping style (related to locomotion) and underlying sociality (associated with vocalization).

Relevance: 100.00%

Abstract:

The monitoring of multivariate systems that exhibit non-Gaussian behavior is addressed. Existing work advocates the use of independent component analysis (ICA) to extract the underlying non-Gaussian data structure. Since some of the source signals may be Gaussian, the use of principal component analysis (PCA) is proposed to capture both the Gaussian and non-Gaussian source signals. A subsequent application of ICA then allows the extraction of non-Gaussian components from the retained principal components (PCs). A further contribution is the use of a support vector data description to determine a confidence limit for the non-Gaussian components. Finally, a statistical test is developed to determine how many non-Gaussian components are encapsulated within the retained PCs, and associated monitoring statistics are defined. The utility of the proposed scheme is demonstrated by a simulation example and by the analysis of recorded data from an industrial melter.
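
A hedged sketch of the pipeline described above: PCA retains the dominant subspace, ICA extracts non-Gaussian components from the retained PCs, and a one-class boundary provides the confidence limit. Two substitutions are made for illustration: scikit-learn's OneClassSVM stands in for the support vector data description (a closely related but not identical formulation), and a simple kurtosis screen replaces the paper's formal test for the number of non-Gaussian components.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import OneClassSVM

def fit_monitor(X, n_pcs=5, kurtosis_threshold=1.0):
    pca = PCA(n_components=n_pcs).fit(X)
    T = pca.transform(X)                          # retained principal components
    ica = FastICA(n_components=n_pcs, random_state=0).fit(T)
    S = ica.transform(T)                          # extracted independent components
    # Treat components with clearly non-zero excess kurtosis as non-Gaussian.
    non_gauss = np.abs(kurtosis(S, axis=0)) > kurtosis_threshold
    boundary = OneClassSVM(nu=0.01, gamma="scale").fit(S[:, non_gauss])
    return pca, ica, non_gauss, boundary
```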

Relevance: 100.00%

Abstract:

Objective
To examine the psychometric properties of an internet version of a children and young people's quality-of-life measure originally designed as a paper questionnaire.

Methods
Participants were 3,440 children aged 10 and 11 years in Northern Ireland who completed the KIDSCREEN-27 online as part of a general attitudinal survey. The questionnaire was animated using cartoon characters familiar to most children, and the questions appeared on screen and were read aloud by actors.

Results
Exploratory principal component analysis of the online version of the questionnaire supported the existence of five components, in line with the paper version. The items loaded on the components that would be expected from previous findings, with five domains: physical well-being, psychological well-being, autonomy and parents, social support and peers, and school environment. Internal consistency reliability of the five domains was measured using Cronbach's alpha, and the results suggested that the scale scores were reliable. The domain scores were similar to those reported in the literature for the paper version.

Conclusions
These results suggest that the factor structure and internal consistency reliability scores of the KIDSCREEN-27 embedded within an online survey are comparable to those reported in the literature for the paper version.
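
For reference, the internal-consistency statistic reported in the Results can be computed with the standard Cronbach's alpha formula (this is textbook code, not code from the study):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of scores for one domain."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of individual item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale
    return k / (k - 1) * (1 - item_vars / total_var)
```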

Relevance: 100.00%

Abstract:

In this paper, we present a Statistical Shape Model for Human Figure Segmentation in gait sequences. Point Distribution Models (PDM) generally use Principal Component Analysis (PCA) to describe the main directions of variation in the training set. However, PCA imposes a number of restrictions on the data that do not always hold. In this work, we explore the potential of Independent Component Analysis (ICA) as an alternative shape decomposition for PDM-based Human Figure Segmentation. The resulting shape model enables accurate estimation of human figures despite segmentation errors in the input silhouettes and has good convergence properties.
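
A small sketch of the decomposition swap the paper explores, assuming the training shapes are already aligned landmark vectors; the mode count and function names are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def fit_shape_model(shapes, n_modes=8, use_ica=True):
    """shapes: (n_samples, 2*n_landmarks) aligned landmark coordinates."""
    mean_shape = shapes.mean(axis=0)
    model = FastICA(n_modes, random_state=0) if use_ica else PCA(n_modes)
    coeffs = model.fit_transform(shapes - mean_shape)  # per-shape mode coefficients
    return mean_shape, model, coeffs

def reconstruct(mean_shape, model, b):
    """Generate a shape from a vector of mode coefficients b."""
    return mean_shape + model.inverse_transform(b[None, :])[0]
```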

Relevance: 100.00%

Abstract:

PURPOSE: To investigate the quality of life and priorities of patients with glaucoma.

METHODS: Patients diagnosed with glaucoma and no other ocular comorbidity were consecutively recruited. Clinical information was collected. Participants were asked to complete three questionnaires: EuroQol (EQ-5D), time tradeoff (TTO), and choice-based conjoint analysis. The latter used five attribute outcomes: (1) reading and seeing detail, (2) peripheral vision, (3) darkness and glare, (4) household chores, and (5) outdoor mobility. Visual field loss was estimated by using binocular integrated visual fields (IVFs).

RESULTS: Of 84 patients invited to participate, 72 were enrolled in the study. The conjoint utilities showed that the two main priorities were "reading and seeing detail" and "outdoor mobility." This rank order was stable across all segmentations of the data by demographic or visual state. However, the relative emphasis of these priorities changed with increasing visual field loss, with concerns about central vision increasing and those about outdoor mobility decreasing. Two subgroups of patients with differing priorities on the two main attributes were identified. Only 17% of patients (those with poorer visual acuity) were prepared to consider TTO. A principal component analysis revealed relatively independent components (i.e., low correlations) between the three different methodologies for assessing quality of life.

CONCLUSIONS: Assessments of quality of life using different methodologies have been shown to produce different outcomes with low intercorrelations between them. Only a minority of patients were prepared to trade time for a return to normal vision. Conjoint analysis showed two subgroups with different priorities. Severity of glaucoma influenced the relative importance of priorities.

Relevance: 100.00%

Abstract:

Geogenic nickel (Ni), vanadium (V) and chromium (Cr) are present at elevated levels in soils in Northern Ireland. Whilst Ni, V and Cr total soil concentrations share common geological origins, their respective levels of oral bioaccessibility are influenced by different soil-geochemical factors. Oral bioaccessibility extractions were carried out on 145 soil samples overlying 9 different bedrock types to measure the bioaccessible portions of Ni, V and Cr. Principal component analysis identified two components (PC1 and PC2) accounting for 69% of variance across 13 variables from the Northern Ireland Tellus Survey geochemical data. PC1 was associated with underlying basalt bedrock, higher bioaccessible Cr concentrations and lower Ni bioaccessibility. PC2 was associated with regional variance in soil chemistry and hosted factors accounting for higher Ni and V bioaccessibility. Eight per cent of total V was solubilised by gastric extraction on average across the study area. High median proportions of bioaccessible Ni were observed in soils overlying sedimentary rock types. Whilst Cr bioaccessible fractions were low (max = 5.4%), the highest measured bioaccessible Cr concentration reached 10.0 mg kg⁻¹, explained by factors linked to PC1 including high total Cr concentrations in soils overlying basalt bedrock.
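
The two-component summary could be reproduced along these lines (a generic sketch; the standardisation choice and 13-column layout are assumptions, not the Tellus Survey data schema):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def two_component_summary(X):
    """X: (n_samples, 13) array of geochemical variables."""
    Z = StandardScaler().fit_transform(X)
    pca = PCA(n_components=2).fit(Z)
    explained = pca.explained_variance_ratio_.sum()  # ~0.69 reported in the study
    loadings = pca.components_                       # which variables drive PC1, PC2
    return pca.transform(Z), explained, loadings
```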

Relevance: 100.00%

Abstract:

Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives because it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, significant redundancy often exists in the wafer measurement plans of mature processes. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset.
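
A sketch of the greedy logic usually meant by forward selection: at each step, add the site whose inclusion best reconstructs all sites by least squares. The same reconstruction doubles as a simple virtual-metrology estimate of the unmeasured sites. Stopping criteria and the paper's specific VM model are omitted.

```python
import numpy as np

def fsca(X, n_sites):
    """X: (n_wafers, n_candidate_sites) historical metrology matrix."""
    X = X - X.mean(axis=0)                    # work with centred measurements
    selected = []
    for _ in range(n_sites):
        best_site, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            S = X[:, selected + [j]]
            # Least-squares reconstruction of every site from the candidate set.
            B, *_ = np.linalg.lstsq(S, X, rcond=None)
            score = 1 - np.sum((X - S @ B) ** 2) / np.sum(X ** 2)
            if score > best_score:
                best_site, best_score = j, score
        selected.append(best_site)
    return selected
```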

Relevance: 100.00%

Abstract:

Handheld near infrared (NIR) instrumentation, as a tool for rapid analysis, has the potential for wide use in the animal feed sector. Handheld NIR and benchtop instruments were compared for the proximate analysis of poultry feed using off-the-shelf calibration models, including statistical analysis of the results. Additionally, melamine-adulterated soya bean products were used to develop qualitative and quantitative calibration models from the NIR spectral data, with excellent calibration and prediction statistics obtained. For the quantitative approach, the coefficients of determination (R²) were 0.94-0.99, while the corresponding root mean square errors of calibration and prediction were 0.081-0.215% and 0.095-0.288%, respectively. In addition, cross-validation was used to further validate the models, with the root mean square error of cross-validation found to be 0.101-0.212%. Furthermore, by adopting a qualitative approach and applying Principal Component Analysis to the spectral data, it was possible to discriminate between adulterated and pure samples.
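
The abstract does not name the regression method behind the off-the-shelf calibrations; partial least squares is the standard choice for NIR data, so this sketch uses it to show how statistics of the kind reported (R², RMSEC, RMSEP, RMSECV) are typically obtained. The split ratio and component count are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import cross_val_predict, train_test_split

def nir_calibration(spectra, melamine_pct, n_components=10):
    X_cal, X_val, y_cal, y_val = train_test_split(
        spectra, melamine_pct, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components).fit(X_cal, y_cal)
    rmse = lambda y, p: float(np.sqrt(np.mean((y - p.ravel()) ** 2)))
    rmsec = rmse(y_cal, pls.predict(X_cal))                       # calibration error
    rmsep = rmse(y_val, pls.predict(X_val))                       # prediction error
    rmsecv = rmse(y_cal, cross_val_predict(pls, X_cal, y_cal, cv=10))
    return r2_score(y_val, pls.predict(X_val).ravel()), rmsec, rmsep, rmsecv
```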

Relevance: 100.00%

Abstract:

Loose particles left inside sealed electronic devices are one of the main factors affecting the reliability of the whole system, and identifying the particle material is important for tracing its source. Conventional material identification algorithms rely mainly on time, frequency and wavelet domain features. However, these features are usually overlapping and redundant, resulting in unsatisfactory identification accuracy. The main objective of this paper is to improve the accuracy of material identification. First, principal component analysis (PCA) is employed to transform the nine features extracted from the time and frequency domains into six less correlated principal components. These principal components are then used for material identification with a support vector machine (SVM). Finally, experimental results show that the new method can effectively distinguish between material types, including wire, aluminum and tin particles.
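
A compact sketch of the PCA + SVM pipeline described above; the nine time- and frequency-domain features are assumed to have been extracted from the particle impact signals already.

```python
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_material_classifier(features, labels):
    """features: (n_samples, 9) time/frequency features; labels: wire/aluminum/tin."""
    clf = make_pipeline(
        StandardScaler(),           # put correlated features on a common scale
        PCA(n_components=6),        # six less correlated principal components
        SVC(kernel="rbf", C=1.0),   # multi-class SVM (one-vs-one internally)
    )
    return clf.fit(features, labels)
```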

Relevance: 100.00%

Abstract:

Statistics are regularly used to compare items of trace evidence or to deploy the exclusionary principle (Morgan and Bull, 2007) in forensic investigations. Trace evidence routinely comprises the results of particle size, chemical or modal analyses and as such constitutes compositional data. The issue is that compositional data (percentages, parts per million, etc.) carry only relative information. This may be problematic where a comparison of percentages and other constrained or closed data is deemed a statistically valid and appropriate way to present trace evidence in a court of law. The constant sum problem has been recognised since the seminal works of Pearson (1896) and Chayes (1960), and log-ratio techniques have been introduced to address it (Aitchison, 1986; Pawlowsky-Glahn and Egozcue, 2001; Pawlowsky-Glahn and Buccianti, 2011; Tolosana-Delgado and van den Boogaart, 2013). Nevertheless, the fact that a constant sum destroys the potential independence of variances and covariances required for correlation, regression analysis and empirical multivariate methods (principal component analysis, cluster analysis, discriminant analysis, canonical correlation) is all too often not acknowledged in the statistical treatment of trace evidence. Yet the need for a robust treatment of forensic trace evidence analyses is obvious. This research examines the issues and potential pitfalls for forensic investigators if the constant sum constraint is ignored in the analysis and presentation of forensic trace evidence. Forensic case studies involving particle size and mineral analyses as trace evidence are used to demonstrate a compositional data approach using a centred log-ratio (clr) transformation and multivariate statistical analyses.
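
The centred log-ratio transform at the heart of this approach is standard: clr(x) = log(x / g(x)), where g(x) is the geometric mean of the composition. A minimal implementation (zeros must be replaced before applying it, a routine preprocessing step in compositional analysis):

```python
import numpy as np

def clr(compositions):
    """compositions: (n_samples, n_parts) strictly positive rows (e.g. wt%)."""
    log_X = np.log(np.asarray(compositions, dtype=float))
    return log_X - log_X.mean(axis=1, keepdims=True)  # subtract the log geometric mean
```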

Relevance: 100.00%

Abstract:

In this study, we introduce an original distance definition for graphs, called the Markov-inverse-F measure (MiF). This measure enables the integration of classical graph theory indices with new knowledge pertaining to structural feature extraction from semantic networks. MiF improves on the conventional Jaccard and/or Simpson indices and reconciles geodesic information (random walk) with co-occurrence adjustment (degree balance and distribution). We measure the effectiveness of graph-based coefficients by applying linguistic graph information to neural activity recorded during conceptual processing in the human brain. Specifically, the MiF distance is computed between each of the nouns used in a previous neural experiment and each of the in-between words in a subgraph derived from the Edinburgh Word Association Thesaurus of English. From the MiF-based information matrix, a machine learning model can accurately obtain a scalar parameter that specifies the degree to which each voxel in (the MRI image of) the brain is activated by each word or each principal component of the intermediate semantic features. Furthermore, by correlating the voxel information with the MiF-based principal components, a new computational neurolinguistics model with a network connectivity paradigm is created. This allows two dimensions of context space to be incorporated with both semantic and neural distributional representations.
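
MiF itself is not defined in the abstract, so no faithful implementation can be sketched from it. For context, these are the classical neighbourhood-overlap indices the measure is said to improve upon:

```python
import networkx as nx

def jaccard_index(G, u, v):
    """|N(u) ∩ N(v)| / |N(u) ∪ N(v)| for nodes u, v of graph G."""
    nu, nv = set(G.neighbors(u)), set(G.neighbors(v))
    return len(nu & nv) / len(nu | nv) if nu | nv else 0.0

def simpson_index(G, u, v):
    """|N(u) ∩ N(v)| / min(|N(u)|, |N(v)|), the overlap coefficient."""
    nu, nv = set(G.neighbors(u)), set(G.neighbors(v))
    m = min(len(nu), len(nv))
    return len(nu & nv) / m if m else 0.0
```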