77 results for differential imaging, principal component analysis, exoplanets, SPHERE, IFS


Relevance:

100.00%

Publisher:

Abstract:

In this paper, we present a statistical shape model for human figure segmentation in gait sequences. Point Distribution Models (PDMs) generally use Principal Component Analysis (PCA) to describe the main directions of variation in the training set. However, PCA assumes a number of restrictions on the data that do not always hold. In this work, we explore the potential of Independent Component Analysis (ICA) as an alternative shape decomposition for PDM-based human figure segmentation. The resulting shape model enables accurate estimation of human figures despite segmentation errors in the input silhouettes and exhibits good convergence properties.
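As a minimal illustration of the two decompositions this abstract contrasts, the sketch below builds a synthetic point-distribution training set and fits both PCA and FastICA to it with scikit-learn. All data, dimensions, and mode shapes are invented for illustration; this is not the paper's shape model.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)

# Synthetic training set: 200 "shapes", each a flattened vector of
# 20 landmark points perturbed along two independent, non-Gaussian modes.
mean_shape = rng.normal(size=40)
mode_a, mode_b = rng.normal(size=40), rng.normal(size=40)
weights = rng.uniform(-1, 1, size=(200, 2))        # uniform: non-Gaussian
shapes = mean_shape + weights[:, :1] * mode_a + weights[:, 1:] * mode_b

# PCA: orthogonal directions of maximal variance.
pca = PCA(n_components=2).fit(shapes)

# ICA: statistically independent components, no orthogonality constraint,
# and no Gaussianity assumption on the training distribution.
ica = FastICA(n_components=2, random_state=0).fit(shapes)

# Both 2-component models span this rank-2 data, so the PCA subspace
# reconstructs it almost exactly.
recon = pca.inverse_transform(pca.transform(shapes))
print(np.allclose(recon, shapes, atol=1e-6))
```

The point of the ICA variant is that when the underlying variation is non-Gaussian (as with the uniform weights above), the independent components can align better with the true modes than the variance-ordered PCA axes.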

Relevance:

100.00%

Publisher:

Abstract:

Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives because it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, significant redundancy often exists in the wafer measurement plans of mature processes. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset. © 2012 IEEE.
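The core FSCA idea, greedily selecting the measurement sites that best reconstruct the full site matrix by least squares, can be sketched as follows. The toy data, site counts, and the simple greedy criterion are illustrative assumptions, not the paper's exact algorithm or dataset.

```python
import numpy as np

def fsca(X, k):
    """Greedy Forward Selection Component Analysis (simplified sketch).

    At each step, add the column (measurement site) whose inclusion in a
    least-squares reconstruction of the full matrix X explains the most
    remaining variance.
    """
    X = X - X.mean(axis=0)
    total = (X ** 2).sum()
    selected = []
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            S = X[:, selected + [j]]
            # Least-squares reconstruction of all sites from the candidates.
            coef, *_ = np.linalg.lstsq(S, X, rcond=None)
            err = ((X - S @ coef) ** 2).sum()
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
    return selected, 1 - best_err / total   # explained-variance fraction

# Toy "wafer metrology": 10 sites driven by only 3 spatial modes, so 3
# well-chosen sites suffice to rebuild the rest (a stand-in for the
# VM-style profile reconstruction described above).
rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 3))
mixing = rng.normal(size=(3, 10))
X = latent @ mixing
sites, explained = fsca(X, 3)
print(len(sites), round(explained, 4))
```

The least-squares coefficients found for the selected sites play the role of the VM reconstruction model: once fitted on historical data, the full wafer profile is estimated from the reduced site set alone.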

Relevance:

100.00%

Publisher:

Abstract:

This paper shows that current multivariate statistical monitoring technology may fail to detect incipient changes in the variable covariance structure or changes in the geometry of the underlying variable decomposition. To overcome these deficiencies, the local approach is incorporated into the multivariate statistical monitoring framework to define two new univariate statistics for fault detection. Fault isolation is achieved by constructing a fault diagnosis chart which reveals changes in the covariance structure resulting from the presence of a fault. A theoretical analysis is presented and the proposed monitoring approach is exemplified using application studies involving recorded data from two complex industrial processes. © 2007 Elsevier Ltd. All rights reserved.
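For context, the conventional PCA-based statistics whose blind spots the paper targets are Hotelling's T² on the retained scores and the squared prediction error (SPE) on the residual subspace. The sketch below computes both on synthetic in-control data; the data and dimensions are invented, and the paper's local-approach statistics are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# In-control training data: 5 correlated variables, 500 samples.
n, p, k = 500, 5, 2
A = rng.normal(size=(p, p))
train = rng.normal(size=(n, p)) @ A

# Conventional PCA monitoring model (the baseline the paper critiques):
mu = train.mean(axis=0)
Xc = train - mu
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:k].T                      # retained loadings (p x k)
lam = (s[:k] ** 2) / (n - 1)      # retained eigenvalues

def t2_spe(x):
    """Hotelling T^2 on the score space, SPE on the residual space."""
    xc = x - mu
    t = xc @ P
    t2 = float(np.sum(t ** 2 / lam))
    resid = xc - t @ P.T
    return t2, float(resid @ resid)

t2, spe = t2_spe(train[0])
print(round(t2, 3), round(spe, 3))
```

Both statistics summarise deviation from the training mean and residual magnitude; a change that rotates the covariance structure without inflating either summary is exactly the kind of incipient fault the abstract argues they can miss.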

Relevance:

100.00%

Publisher:

Abstract:

This is the first paper to show, and theoretically analyse, that the presence of auto-correlation can produce considerable alterations in the Type I and Type II errors of univariate and multivariate statistical control charts. To remove this undesired effect, linear inverse ARMA filters are employed, and the application studies in this paper show that false alarms (increased Type I errors) and insensitive monitoring statistics (increased Type II errors) were eliminated.
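The inverse-filtering idea can be sketched on the simplest case: an AR(1) series is whitened by charting the filtered residuals x_t − φ·x_{t−1} instead of the raw values. The AR(1) model, the lag-1 least-squares estimate of φ, and all data below are illustrative simplifications of the general ARMA filtering the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Strongly autocorrelated AR(1) process; a Shewhart chart on the raw
# series would see long runs on one side of the centre line, inflating
# the Type I error.
phi_true, n = 0.9, 2000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + e[t]

# Inverse (whitening) filter: estimate phi by lag-1 least squares, then
# chart the filtered residuals, which are approximately uncorrelated.
phi_hat = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])
resid = x[1:] - phi_hat * x[:-1]

def lag1_corr(v):
    """Sample lag-1 autocorrelation."""
    v = v - v.mean()
    return float((v[:-1] @ v[1:]) / (v @ v))

print(round(lag1_corr(x), 3), round(lag1_corr(resid), 3))
```

Control limits computed from the residual series then satisfy the independence assumption that standard chart constants rely on.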

Relevance:

100.00%

Publisher:

Abstract:

This paper theoretically analyses the recently proposed "Extended Partial Least Squares" (EPLS) algorithm. After pointing out some conceptual deficiencies, a revised algorithm is introduced that covers the middle ground between Partial Least Squares and Principal Component Analysis. It maximises a covariance criterion between a cause and an effect variable set (partial least squares) and allows a complete reconstruction of the recorded data (principal component analysis). The new and conceptually simpler EPLS algorithm has been applied successfully in detecting and diagnosing various fault conditions, where the original EPLS algorithm offered only fault detection.

Relevance:

100.00%

Publisher:

Abstract:

Guanine-rich DNA repeat sequences located at the terminal ends of chromosomal DNA can fold in a sequence-dependent manner into G-quadruplex structures, notably the terminal 150–200 nucleotides at the 3' end, which occur as a single-stranded DNA overhang. The crystal structures of quadruplexes with two and four human telomeric repeats show an all-parallel-stranded topology that is readily capable of forming extended stacks of such quadruplex structures, with external TTA loops positioned to potentially interact with other macromolecules. This study reports on possible arrangements for these quadruplex dimers and tetramers, which can be formed from 8 or 16 telomeric DNA repeats, and on a methodology for modeling their interactions with small molecules. A series of computational methods including molecular dynamics, free energy calculations, and principal components analysis have been used to characterize the properties of these higher-order G-quadruplex dimers and tetramers with parallel-stranded topology. The results confirm the stability of the central G-tetrads, the individual quadruplexes, and the resulting multimers. Principal components analysis has been carried out to highlight the dominant motions in these G-quadruplex dimer and multimer structures. The TTA loops are the most flexible part of the models, and the overall quadruplex multimer becomes more stable with the addition of further G-tetrads. The addition of a ligand to the model confirms the hypothesis that flat planar chromophores stabilize G-quadruplex structures by making them less flexible.
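Principal components analysis of a molecular-dynamics trajectory, as used above to highlight dominant motions, amounts to diagonalising the covariance matrix of the atomic coordinates across frames. The sketch below does this on a toy trajectory where one collective mode (a stand-in for the flexible TTA-loop motion) dominates; the atom counts and modes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "trajectory": 300 frames of 30 atoms (flattened xyz coordinates),
# driven by one large-amplitude mode and one small-amplitude mode.
n_frames, n_coords = 300, 90
mean_struct = rng.normal(size=n_coords)
loop_mode = rng.normal(size=n_coords)   # large, loop-like motion
core_mode = rng.normal(size=n_coords)   # small, core breathing
frames = (mean_struct
          + rng.normal(scale=3.0, size=(n_frames, 1)) * loop_mode
          + rng.normal(scale=0.5, size=(n_frames, 1)) * core_mode)

# PCA of the coordinate covariance matrix across frames.
centred = frames - frames.mean(axis=0)
cov = centred.T @ centred / (n_frames - 1)
evals, evecs = np.linalg.eigh(cov)      # ascending eigenvalues

frac = evals[-1] / evals.sum()          # variance in the top mode
top = evecs[:, -1]                      # dominant collective motion
overlap = abs(top @ loop_mode) / np.linalg.norm(loop_mode)
print(round(frac, 3), round(overlap, 3))
```

In a real analysis the frames would first be superposed to remove rigid-body rotation and translation, so that the leading eigenvectors describe internal flexibility such as loop motion rather than overall tumbling.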

Relevance:

100.00%

Publisher:

Abstract:

Geologic and environmental factors acting over varying spatial scales can control trace element distribution and mobility in soils. In turn, the mobility of an element in soil will affect its oral bioaccessibility. Geostatistics, kriging and principal component analysis (PCA) were used to explore factors and spatial ranges of influence over a suite of 8 element oxides, soil organic carbon (SOC), pH, and the trace elements nickel (Ni), vanadium (V) and zinc (Zn). Bioaccessibility testing was carried out previously using the Unified BARGE Method on a sub-set of 91 soil samples from the Northern Ireland Tellus soil archive. Initial spatial mapping of total Ni, V and Zn concentrations shows that their distributions are correlated spatially with local geologic formations, and prior correlation analyses showed that statistically significant controls were exerted over trace element bioaccessibility by the 8 oxides, SOC and pH. PCA applied to the geochemistry parameters of the bioaccessibility sample set yielded three principal components accounting for 77% of the cumulative variance in the data set. Geostatistical analysis of oxide, trace element, SOC and pH distributions using 6862 sample locations also identified distinct spatial ranges of influence for these variables, concluded to arise from geologic forming processes, weathering processes, and localised soil chemistry factors. Kriging was used to conduct a spatial PCA of Ni, V and Zn distributions, which identified two factors comprising the majority of the distribution variance: the first was accounted for spatially by basalt rock types, with the second component associated with sandstone and limestone in the region. The results suggest that trace element bioaccessibility and distribution are controlled by chemical and geologic processes which occur over variable spatial ranges of influence.

Relevance:

100.00%

Publisher:

Abstract:

The use of handheld near infrared (NIR) instrumentation as a tool for rapid analysis has the potential to be used widely in the animal feed sector. A comparison was made between handheld NIR and benchtop instruments in terms of proximate analysis of poultry feed, using off-the-shelf calibration models and including statistical analysis. Additionally, melamine-adulterated soya bean products were used to develop qualitative and quantitative calibration models from the NIRS spectral data, with excellent calibration models and prediction statistics obtained. With regard to the quantitative approach, the coefficients of determination (R2) were found to be 0.94-0.99, with the corresponding values for the root mean square error of calibration and prediction found to be 0.081-0.215 % and 0.095-0.288 % respectively. In addition, cross validation was used to further validate the models, with the root mean square error of cross validation found to be 0.101-0.212 %. Furthermore, by adopting a qualitative approach with the spectral data and applying Principal Component Analysis, it was possible to discriminate between adulterated and pure samples.

Relevance:

100.00%

Publisher:

Abstract:

Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum problem (closure) and the inherently multivariate relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical normalisation approach to a single element. The second approach uses the so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method that can answer the required geological or geochemical question whilst maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
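The quartz-dilution artefact and the log-ratio remedy described above can be shown in a few lines. The sketch uses the centred log-ratio (clr) transform on a made-up two-sample composition; the oxide names and percentages are purely illustrative.

```python
import numpy as np

def clr(comp):
    """Centred log-ratio transform of compositions (one row per sample)."""
    logc = np.log(comp)
    return logc - logc.mean(axis=1, keepdims=True)

# A quartz-dilution caricature: adding SiO2 shrinks every other
# percentage even though the underlying CaO/Fe2O3 relationship is
# unchanged (columns: SiO2, CaO, Fe2O3, values in %).
raw = np.array([
    [60.0, 10.0, 30.0],   # original rock
    [80.0,  5.0, 15.0],   # same rock, diluted by extra quartz
])
comp = raw / raw.sum(axis=1, keepdims=True)   # close to sum 1
z = clr(comp)

# Raw percentages change under dilution, but the clr difference
# log(CaO) - log(Fe2O3) is identical in both samples.
print(round(float(z[0, 1] - z[0, 2]), 6), round(float(z[1, 1] - z[1, 2]), 6))
```

Because clr coordinates depend only on ratios between components, any map or statistic built on them is immune to this dilution effect, which is exactly the guarantee the abstract says single-component maps lack.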

Relevance:

100.00%

Publisher:

Abstract:

Statistics are regularly used to make some form of comparison between trace evidence or to deploy the exclusionary principle (Morgan and Bull, 2007) in forensic investigations. Trace evidence is routinely the result of particle size, chemical or modal analyses, and as such constitutes compositional data. The issue is that compositional data, including percentages, parts per million, etc., carry only relative information. This may be problematic where a comparison of percentages and other constrained (closed) data is deemed a statistically valid and appropriate way to present trace evidence in a court of law. Notwithstanding an awareness of the constant sum problem since the seminal works of Pearson (1896) and Chayes (1960), and the introduction of log-ratio techniques (Aitchison, 1986; Pawlowsky-Glahn and Egozcue, 2001; Pawlowsky-Glahn and Buccianti, 2011; Tolosana-Delgado and van den Boogaart, 2013), the fact that a constant sum destroys the potential independence of the variances and covariances required for correlation, regression analysis and empirical multivariate methods (principal component analysis, cluster analysis, discriminant analysis, canonical correlation) is all too often not acknowledged in the statistical treatment of trace evidence. Yet the need for a robust treatment of forensic trace evidence analyses is obvious. This research examines the issues and potential pitfalls for forensic investigators if the constant sum constraint is ignored in the analysis and presentation of forensic trace evidence. Forensic case studies involving particle size and mineral analyses as trace evidence are used to demonstrate a compositional data approach using a centred log-ratio (clr) transformation and multivariate statistical analyses.
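The recommended workflow, clr transformation first and the empirical multivariate method second, can be sketched on hypothetical modal-mineralogy evidence. The two "sources", their mineral proportions, and the use of PCA as the multivariate step are all invented for illustration and are not the paper's case studies.

```python
import numpy as np
from sklearn.decomposition import PCA

def clr(comp):
    """Centred log-ratio transform of compositions (one row per sample)."""
    logc = np.log(comp)
    return logc - logc.mean(axis=1, keepdims=True)

rng = np.random.default_rng(7)

# Hypothetical closed (sum-to-100%) mineral compositions from two
# distinct sources, e.g. suspect and crime-scene soils.
src_a = rng.dirichlet([8, 3, 2, 1], size=10) * 100
src_b = rng.dirichlet([2, 3, 8, 1], size=10) * 100
samples = np.vstack([src_a, src_b])

# clr first, so the constant-sum constraint cannot induce spurious
# covariance, then the empirical multivariate method (here PCA).
scores = PCA(n_components=2).fit_transform(clr(samples / 100))

# Separation between the two sources in the transformed score space.
gap = float(np.linalg.norm(scores[:10].mean(0) - scores[10:].mean(0)))
print(round(gap, 3))
```

Running PCA on the raw percentages instead would mix genuine ratio differences with closure-induced correlations, which is precisely the pitfall the abstract warns against.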