14 results for "correlation analysis"
in the Cambridge University Engineering Department Publications Database
Abstract:
Copyright © (2014) by the International Machine Learning Society (IMLS). All rights reserved. Classical methods such as Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are ubiquitous in statistics. However, these techniques are only able to reveal linear relationships in data. Although nonlinear variants of PCA and CCA have been proposed, these are computationally prohibitive at large scale. In a separate strand of recent research, randomized methods have been proposed to construct features that help reveal nonlinear patterns in data. For basic tasks such as regression or classification, random features exhibit little or no loss in performance, while achieving drastic savings in computational requirements. In this paper we leverage randomness to design scalable new variants of nonlinear PCA and CCA; our ideas extend to key multivariate analysis tools such as spectral clustering or LDA. We demonstrate our algorithms through experiments on real-world data, on which we compare against the state of the art. A simple R implementation of the presented algorithms is provided.
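The core idea in this abstract — approximate a nonlinear kernel with random features, then run a cheap linear method on top — can be sketched as follows. This is a minimal illustration of random Fourier features plus linear PCA, not the authors' R implementation; the function names, default parameters (`D=200`, `gamma=1.0`), and test data are all assumptions.

```python
import numpy as np

def random_fourier_features(X, D=200, gamma=1.0, seed=0):
    """Map X (n x d) into D random Fourier features approximating the RBF
    kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    # Spectral density of the RBF kernel: frequencies ~ N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def randomized_nonlinear_pca(X, n_components=2, D=200, gamma=1.0):
    """Linear PCA carried out in the random feature space: a scalable
    stand-in for kernel (nonlinear) PCA."""
    Z = random_fourier_features(X, D, gamma)
    Z = Z - Z.mean(axis=0)                        # center the features
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_components].T                # nonlinear principal scores

X = np.random.default_rng(1).normal(size=(100, 3))
scores = randomized_nonlinear_pca(X)
print(scores.shape)  # (100, 2)
```

The cost is dominated by the SVD of the n × D feature matrix, which scales linearly in n for fixed D — the "drastic savings" the abstract refers to relative to an n × n kernel matrix.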
Abstract:
In the field of motor control, two hypotheses have been controversial: whether the brain acquires internal models that generate accurate motor commands, or whether the brain avoids this by using the viscoelasticity of the musculoskeletal system. Recent observations of relatively low stiffness during trained movements support the existence of internal models. However, no study has revealed the decrease in viscoelasticity associated with learning that would imply improvement of internal models as well as synergy between the two hypothetical mechanisms. Previously observed decreases in electromyogram (EMG) might have other explanations, such as trajectory modifications that reduce joint torques. To circumvent such complications, we required strict trajectory control and examined only successful trials having identical trajectory and torque profiles. Subjects were asked to perform a hand movement in unison with a target moving along a specified and unusual trajectory, with the shoulder and elbow in the horizontal plane at shoulder level. To evaluate joint viscoelasticity during the learning of this movement, we proposed an index of muscle co-contraction around the joint (IMCJ). The IMCJ was defined as the summation of the absolute values of antagonistic muscle torques around the joint and computed from the linear relation between surface EMG and joint torque. The IMCJ during isometric contraction, as well as during movements, was confirmed to correlate well with joint stiffness estimated using the conventional method, i.e., applying mechanical perturbations. Accordingly, the IMCJ during the learning of the movement was computed for each joint on each trial using the estimated EMG-torque relationship. At the same time, the performance error for each trial was specified as the root mean square of the distance between the target and the hand at each time step over the entire trajectory.
The time-series data of IMCJ and performance error were decomposed into long-term components that showed decreases in IMCJ in accordance with learning with little change in the trajectory and short-term interactions between the IMCJ and performance error. A cross-correlation analysis and impulse responses both suggested that higher IMCJs follow poor performances, and lower IMCJs follow good performances within a few successive trials. Our results support the hypothesis that viscoelasticity contributes more when internal models are inaccurate, while internal models contribute more after the completion of learning. It is demonstrated that the CNS regulates viscoelasticity on a short- and long-term basis depending on performance error and finally acquires smooth and accurate movements while maintaining stability during the entire learning process.
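The trial-by-trial cross-correlation analysis described above — testing whether high co-contraction follows poor performance a few trials later — can be sketched generically. The synthetic data, lag convention, and variable names (`imcj`, `error`) are illustrative assumptions, not the study's data.

```python
import numpy as np

def xcorr(x, y, max_lag=5):
    """Normalized cross-correlation r(k) = E[x_{t+k} * y_t] of standardized
    series; a peak at positive k means x lags (follows) y by k trials."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    out = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            out[k] = float(np.mean(x[k:] * y[:n - k]))
        else:
            out[k] = float(np.mean(x[:n + k] * y[-k:]))
    return out

# Synthetic illustration: co-contraction rises one trial after a large error.
rng = np.random.default_rng(0)
error = rng.normal(size=500)
imcj = np.roll(error, 1) + 0.3 * rng.normal(size=500)

r = xcorr(imcj, error)
best = max(r, key=r.get)
print(best, round(r[best], 2))  # peak at lag +1: IMCJ follows error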
Abstract:
Transport critical current measurements have been carried out on melt-processed thick films of YBa2Cu3O7-δ on yttria-stabilized zirconia in fields of up to 8 T, both within grains and across grain boundaries. These measurements yield Jc values of ∼3000 A cm-2 at 4.2 K and zero magnetic field and 400 A cm-2 at 77 K and zero magnetic field, taking the entire sample width as the definitive dimension. Optical and scanning electron microscopy reveals that the thick-film grains consist typically of a central "hub" region ∼50 μm in diameter, which is well connected to radial subgrains or "spokes" which extend ∼1 mm to define the complete grain structure. Attempts have been made to correlate the transport measurements of inter- and intra-hub-and-spoke (H-S) critical current with values of this parameter derived previously from magnetization measurements. Analysis of the transport measurements indicates that current flow through H-S grains is constrained to paths along the spokes via the grain hub. Taking the size of the hub as the definitive dimension yields an intra-H-S grain Jc of ∼60 000 A cm-2 at 4.2 K and 0 T, which is in reasonable agreement with the magnetization data. Experiments in which the hub is removed from individual grains confirm that this feature determines critically the Jc of the film.
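The abstract's central point — that Jc depends strongly on which cross-section is assumed to carry the current — is just Jc = Ic / A. A minimal sketch with entirely hypothetical numbers (the film thickness, measured current, and widths below are illustrative, not the paper's values):

```python
def jc(critical_current_A, width_cm, thickness_cm):
    """Critical current density (A/cm^2) from the transport critical current
    and the cross-section assumed to carry it."""
    return critical_current_A / (width_cm * thickness_cm)

# Same measured current, two choices of "definitive dimension":
ic, t = 0.3, 2e-3                 # hypothetical: 0.3 A, 20 um thick film
full_width = 0.5                  # hypothetical full sample width, cm
hub_width = 5e-3                  # ~50 um hub diameter, cm

print(jc(ic, full_width, t))      # whole-width Jc
print(jc(ic, hub_width, t))       # hub-only Jc: 100x larger here
```

Shrinking the assumed conducting width from the full sample to the ∼50 μm hub inflates Jc by the width ratio, which is why the hub-based figure (∼60 000 A cm-2) so greatly exceeds the whole-width figure (∼3000 A cm-2).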
Abstract:
LIMA (Laser-Induced Mass Analysis) is a new technique capable of compositional analysis of thin films and surface regions. Under UHV conditions a focused laser beam evaporates and ionizes a microvolume of specimen material from which a mass spectrum is obtained. LIMA has been used to examine a range of thin film materials with applications in electronic devices. The neutral photon probe avoids charging problems, and low-conductivity materials are examined without prior metallization. Analyses of insulating silicon oxides, nitrides, and oxynitrides confirm estimates of composition from infrared measurements. However, the hydrogen content of hydrogenated amorphous silicon (a-Si:H) found by LIMA shows no correlation with values given by infrared absorption analysis. Explanations are proposed and discussed. © 1985.
Abstract:
The Particle Image Velocimetry (PIV) technique is an image-processing tool for obtaining instantaneous velocity measurements during an experiment. The basic principle of PIV analysis is to divide the image into small patches and locate each patch in consecutive images with the help of cross-correlation functions. This paper focuses on the application of PIV analysis to dynamic centrifuge tests on small-scale tunnels in loose, dry sand. Digital images were captured during the application of the earthquake loading on tunnel models using a fast digital camera capable of taking images at 1000 frames per second at 1 megapixel resolution. This paper discusses the effectiveness of the existing methods used to conduct PIV analyses on dynamic centrifuge tests. Results indicate that PIV analysis in dynamic testing requires special measures in order to obtain reasonable deformation data. Nevertheless, it was possible to identify interesting mechanisms in the behaviour of the tunnels from the PIV analyses. © 2010 Taylor & Francis Group, London.
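The patch-matching step at the heart of PIV can be sketched as a brute-force normalized cross-correlation over integer pixel shifts. This is a minimal single-patch illustration, not the software used in the paper; the patch size, search radius, and synthetic shifted texture are assumptions.

```python
import numpy as np

def patch_displacement(frame_a, frame_b, top, left, size=16, search=8):
    """Estimate the integer-pixel displacement of one interrogation patch
    between two frames by normalized cross-correlation (the core PIV step)."""
    patch = frame_a[top:top + size, left:left + size].astype(float)
    patch = patch - patch.mean()
    best, best_dy, best_dx = -np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = frame_b[top + dy:top + dy + size,
                           left + dx:left + dx + size].astype(float)
            cand = cand - cand.mean()
            score = np.sum(patch * cand) / (np.linalg.norm(patch)
                                            * np.linalg.norm(cand) + 1e-12)
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx

# Synthetic check: a random texture shifted by (2, -3) pixels.
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (2, -3), axis=(0, 1))
print(patch_displacement(a, b, top=24, left=24))  # → (2, -3)
```

Production PIV codes do this in the frequency domain with sub-pixel peak fitting, but the correlation-peak principle is the same; the "special measures" the abstract mentions concern keeping this correlation reliable under fast dynamic loading.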
Abstract:
An approach by which the detrended fluctuation analysis (DFA) method can be used to help diagnose heart failure was demonstrated. DFA was applied to patients suffering from congestive heart failure (CHF) to check correlations between DFA indices and CHF, and to determine a correlation between DFA indices and mortality, with particular attention to the residue parameter, which is a measure of the departure of the DFA from its power-law approximation. DFA parameters proved to be useful as a complement to the physiological parameters Weber and FE in sorting the patients into three prognostic groups.
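DFA and the residue parameter described above can be sketched as follows: integrate the series, measure the RMS fluctuation after linear detrending in windows of size n, fit a straight line to log F(n) vs log n (slope = scaling exponent), and take the RMS of the fit residuals as the "residue". The scale choices and white-noise test signal are assumptions for illustration, not the study's settings.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: log n and log F(n) at each scale n."""
    y = np.cumsum(x - np.mean(x))                  # integrated "profile"
    F = []
    for n in scales:
        m = len(y) // n                            # non-overlapping windows
        segs = y[:m * n].reshape(m, n)
        t = np.arange(n)
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.square(resid))))
    return np.log(np.asarray(scales, float)), np.log(np.asarray(F))

def dfa_alpha_and_residue(x, scales=(8, 16, 32, 64, 128)):
    """Scaling exponent alpha plus the residue: RMS departure of log F(n)
    from its best straight-line (power-law) approximation."""
    logn, logF = dfa(x, scales)
    alpha, c = np.polyfit(logn, logF, 1)
    residue = np.sqrt(np.mean((logF - (alpha * logn + c)) ** 2))
    return alpha, residue

rng = np.random.default_rng(0)
alpha, residue = dfa_alpha_and_residue(rng.normal(size=4096))
print(round(alpha, 2), round(residue, 3))  # alpha near 0.5 for white noise
```

A large residue flags a series whose fluctuations do not follow a single power law, which is exactly the extra diagnostic information the abstract attributes to this parameter.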
Abstract:
One of the main causes of failure of historic buildings is differential settlement of the foundations. Finite element analysis provides a useful tool for predicting the consequences of given ground displacements in terms of structural damage and also for assessing the need for strengthening techniques. The current damage classification for buildings subject to settlement bases its assessment of potential damage on the expected crack pattern of the structure. In this paper, the correlation between the physical description of the damage in terms of crack width and the interpretation of the finite element analysis output is analyzed. Different discrete and continuum crack models are applied to simulate an experiment carried out on a scale model of a historical masonry building, the Loggia Palace in Brescia (Italy). Results are discussed and a modified version of the fixed total-strain smeared crack model is evaluated, in order to solve the problem of calculating the exact crack width.
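The crack-width calculation that smeared crack models struggle with reduces, in the standard crack-band approach, to lumping the smeared crack strain of an element into one discrete crack over a characteristic bandwidth h. A minimal sketch, assuming the common w ≈ h · ε_cr estimate (the numeric values below are hypothetical, not from the Loggia Palace study):

```python
def smeared_crack_width(crack_strain, crack_bandwidth_mm):
    """Crack-band estimate of crack width: the smeared crack strain in an
    element is assumed to localize into a single crack over the crack
    bandwidth h (typically tied to the finite element size)."""
    return crack_bandwidth_mm * crack_strain

# Hypothetical values: 20 mm crack bandwidth, 1e-2 smeared crack strain
print(smeared_crack_width(1e-2, 20.0))  # 0.2 mm
```

Because h is mesh-dependent, the recovered width is only as "exact" as the bandwidth assumption — which is the gap between FE output and the crack-width-based damage classification that the paper addresses.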
Abstract:
The sustainable remediation concept, aimed at maximizing the net environmental, social, and economic benefits in contaminated site remediation, is being increasingly recognized by industry, governments, and academia. However, there is limited understanding of actual sustainable behaviour being adopted and the determinants of such sustainable behaviour. The present study identified 27 sustainable practices in remediation. An online questionnaire survey was used to rank and compare them in the US (n=112) and the UK (n=54). The study also rated ten promoting factors, nine barriers, and 17 types of stakeholders' influences. Subsequently, factor analysis and general linear models were used to determine the effects of internal characteristics (i.e. country, organizational characteristics, professional role, personal experience and belief) and external forces (i.e. promoting factors, barriers, and stakeholder influences). It was found that US and UK practitioners adopted many sustainable practices to similar extents. Both US and UK practitioners perceived the most effectively adopted sustainable practices to be reducing the risk to site workers, protecting groundwater and surface water, and reducing the risk to the local community. Comparing the two countries, we found that the US adopted innovative in-situ remediation more effectively; while the UK adopted reuse, recycling, and minimizing material usage more effectively. As for the overall determinants of sustainable remediation, the country of origin was found not to be a significant determinant. Instead, organizational policy was found to be the most important internal characteristic. It had a significant positive effect on reducing distant environmental impact, sustainable resource usage, and reducing remediation cost and time (p<0.01). Customer competitive pressure was found to be the most extensively significant external force. 
In comparison, perceived stakeholder influence, especially that of primary stakeholders (site owner, regulator, and primary consultant), did not appear to have as extensive a correlation with the adoption of sustainability as one would expect.