900 results for ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA)


Relevance: 100.00%

Abstract:

We use long instrumental temperature series together with available field reconstructions of sea-level pressure (SLP) and three-dimensional climate model simulations to analyze relations between temperature anomalies and atmospheric circulation patterns over much of Europe and the Mediterranean for the late winter/early spring (January–April, JFMA) season. A Canonical Correlation Analysis (CCA) investigates interannual to interdecadal covariability between a new gridded SLP field reconstruction and seven long instrumental temperature series covering the past 250 years. We then present and discuss prominent atmospheric circulation patterns related to anomalous warm and cold JFMA conditions within different European areas spanning the period 1760–2007. Next, using a data assimilation technique, we link gridded SLP data with a climate model (EC-Bilt-Clio) for a better dynamical understanding of the relationship between large-scale circulation and European climate. We thus present an alternative approach to reconstruct climate for the pre-instrumental period based on the assimilated model simulations. Furthermore, we present an independent method to extend the dynamic circulation analysis for anomalously cold European JFMA conditions back to the sixteenth century. To this end, we use documentary records that are spatially representative for the long instrumental records and derive, through modern analogs, large-scale SLP, surface temperature and precipitation fields. The skill of the analog method is tested in the virtual world of two three-dimensional climate simulations (ECHO-G and HadCM3). This endeavor offers new possibilities both to constrain climate models into a reconstruction mode (through the assimilation approach) and to better assess documentary data in a quantitative way.
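The analog step described above can be sketched in a few lines. This is a minimal illustration with synthetic data: the library stands in for model states (such as ECHO-G/HadCM3 snapshots), and all sizes and values are arbitrary assumptions, not the study's data.

```python
import numpy as np

# Hypothetical analog library: 500 model states, each with an SLP pattern on
# 40 grid points and a physically linked temperature field on 25 grid points.
rng = np.random.default_rng(0)
slp_library = rng.normal(size=(500, 40))
temp_library = slp_library @ rng.normal(size=(40, 25))

# A target SLP anomaly (e.g. derived from documentary records) is matched to
# its closest analog in the library; the analog's temperature field serves as
# the reconstruction.
target_slp = slp_library[123] + 0.01 * rng.normal(size=40)
dist = np.linalg.norm(slp_library - target_slp, axis=1)
best = int(np.argmin(dist))          # index of the modern analog
temp_estimate = temp_library[best]   # reconstructed temperature field
```

In the real method the distance is computed between observed/documentary-derived patterns and simulated ones; the Euclidean norm here is one common but assumed choice of analog metric.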

Relevance: 100.00%

Abstract:

Magnetic resonance imaging of inhaled fluorinated inert gases ((19)F-MRI) such as sulfur hexafluoride (SF(6)) allows for analysis of ventilated air spaces. In this study, the possibility of using this technique to image lung function was assessed. For this, (19)F-MRI of inhaled SF(6) was compared with respiratory gas analysis, which is a global but reliable measure of alveolar gas fraction. Five anesthetized pigs underwent multiple-breath wash-in procedures with a gas mixture of 70% SF(6) and 30% oxygen. Two-dimensional (19)F-MRI and end-expiratory gas fraction analysis were performed after 4 to 24 inhaled breaths. Signal intensity of (19)F-MRI and end-expiratory SF(6) fraction were evaluated with respect to linear correlation and reproducibility. Time constants were estimated from both MRI and respiratory gas analysis data and compared for agreement. A good linear correlation between signal intensity and end-expiratory gas fraction was found (correlation coefficient 0.99+/-0.01). The data were reproducible (standard error of signal intensity 8% vs. that of gas fraction 5%) and the comparison of time constants yielded sufficient agreement. Given the good linear correlation and the acceptable reproducibility, we suggest that (19)F-MRI is a valuable tool for quantification of intrapulmonary SF(6) and hence lung function.
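The wash-in kinetics can be made concrete with a toy single-compartment model. This is an assumed idealisation for illustration, not the paper's analysis; the FRC and tidal volume values are hypothetical.

```python
import numpy as np

# Single-compartment wash-in: after each breath the alveolar SF6 fraction
# relaxes toward the inspired fraction, F_n = F_insp * (1 - r**n),
# with dilution ratio r = FRC / (FRC + V_T).
f_insp = 0.70                # inspired SF6 fraction, as in the study
frc, vt = 1.0, 0.3           # hypothetical FRC and tidal volume, litres
r = frc / (frc + vt)
breaths = np.arange(1, 25)   # 4-24 breaths were imaged in the study
fractions = f_insp * (1 - r ** breaths)

# A perfectly linear MRI signal tracks the gas fraction exactly, so their
# correlation coefficient is 1 by construction; measurement noise pulls it
# down toward the reported 0.99.
signal = 5.0 * fractions     # arbitrary linear gain
rho = np.corrcoef(signal, fractions)[0, 1]
```

The time constant compared in the study corresponds here to the decay rate of `r ** n`; fitting it to either the signal or the gas-fraction series should recover the same value when the signal is linear in the fraction.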

Relevance: 100.00%

Abstract:

This study aims at assessing the skill of several climate field reconstruction (CFR) techniques to reconstruct past precipitation over continental Europe and the Mediterranean at seasonal time scales over the last two millennia from proxy records. A number of pseudoproxy experiments are performed within the virtual reality of a regional paleoclimate simulation at 45 km resolution to analyse different aspects of reconstruction skill. Canonical Correlation Analysis (CCA), two versions of an Analog Method (AM) and Bayesian hierarchical modeling (BHM) are applied to reconstruct precipitation from a synthetic network of pseudoproxies that are contaminated with various types of noise. The skill of the derived reconstructions is assessed through comparison with precipitation simulated by the regional climate model. Unlike BHM, CCA systematically underestimates the variance. The AM can be adjusted to overcome this shortcoming, presenting an intermediate behaviour between the two aforementioned techniques. However, a trade-off between reconstruction-target correlations and reconstructed variance is the drawback of all CFR techniques. CCA (BHM) presents the largest (lowest) skill in preserving the temporal evolution, whereas the AM can be tuned to reproduce better correlation at the expense of losing variance. BHM has been shown to perform well for temperature, but it relies heavily on prescribed spatial correlation lengths; this assumption is valid for temperature yet hardly warranted for precipitation. In general, none of the methods outperforms the others. All experiments agree that a dense and regularly distributed proxy network is required to reconstruct precipitation accurately, reflecting its high spatial and temporal variability. This is especially true in summer, when localised convective precipitation events give the field a particularly short de-correlation distance from the proxy location.
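The variance underestimation noted for CCA is a general property of regression-based reconstructions, and a one-proxy example makes the mechanism explicit: the least-squares calibration carries only r² of the target variance into the reconstruction. The synthetic series below (white-noise pseudoproxy, signal-to-noise ratio 1) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
target = rng.normal(size=2000)          # "true" precipitation series
proxy = target + rng.normal(size=2000)  # pseudoproxy with unit white noise

# Least-squares calibration of target on proxy: the reconstruction variance
# equals r**2 times the target variance, hence is systematically too low.
slope = np.cov(target, proxy)[0, 1] / np.var(proxy)
recon = slope * (proxy - proxy.mean()) + target.mean()

var_ratio = np.var(recon) / np.var(target)   # close to r**2, about 0.5 here
```

Inflating the reconstruction to restore variance (as the tunable AM does) necessarily amplifies the noise as well, which is the correlation/variance trade-off the abstract describes.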

Relevance: 100.00%

Abstract:

In this paper we address the new reduction method called Proper Generalized Decomposition (PGD), a discretization technique based on the use of separated representations of the unknown fields, especially well suited for solving multidimensional parametric equations. In this case, it is applied to the solution of dynamics problems. We will focus on the dynamic analysis of a one-dimensional rod with a unit harmonic load of frequency (ω) applied at a point of interest. In what follows, we will present the application of the PGD methodology to the problem in order to approximate the displacement field as a sum of separated functions. We will consider as new variables of the problem, in addition to the frequency, model parameters associated with the characteristics of the materials. Finally, the quality of the results will be assessed based on an example.
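What a separated representation looks like can be shown with an SVD of a sampled two-parameter field. This is a stand-in for illustration only: PGD builds the terms greedily during the solve rather than decomposing a precomputed field, and the field used here is a hypothetical smooth function, not the rod's displacement.

```python
import numpy as np

# A field u(x, w) sampled on a grid is approximated by a short sum of
# products of one-dimensional functions, u ≈ sum_i F_i(x) * G_i(w).
x = np.linspace(0.0, 1.0, 100)   # space coordinate along the rod
w = np.linspace(1.0, 10.0, 80)   # load frequency treated as an extra coordinate
U = np.sin(np.pi * x)[:, None] / (1.0 + np.outer(x**2, w))  # hypothetical field

F, s, Gt = np.linalg.svd(U, full_matrices=False)
rank = 3
U3 = (F[:, :rank] * s[:rank]) @ Gt[:rank]   # 3-term separated approximation
rel_err = np.linalg.norm(U - U3) / np.linalg.norm(U)
```

The rapid decay of the singular values for smooth parametric fields is precisely what makes a few separated terms sufficient, and is the property PGD exploits without ever assembling the full multidimensional array.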

Relevance: 100.00%

Abstract:

A sensitive, labor-saving, and easily automatable nonradioactive procedure named APEX-FCS (amplified probe extension detected by fluorescence correlation spectroscopy) has been established to detect specific in vitro amplification of pathogen genomic sequences. As an example, Mycobacterium tuberculosis genomic DNA was subjected to PCR amplification with the Stoffel fragment of Thermus aquaticus DNA polymerase in the presence of nanomolar concentrations of a rhodamine-labeled probe (third primer), binding to the target in between the micromolar amplification primers. The probe becomes extended only when specific amplification occurs. Its low concentration avoids false-positives due to unspecific hybridization under PCR conditions. With increasing portion of extended probe molecules, the probe’s average translational diffusion properties gradually change over the course of the reaction, reflecting amplification kinetics. Following PCR, this change from a stage of high to a stage of low mobility can directly be monitored during a 30-s measurement using a fluorescence correlation spectroscopy device. Quantitation down to 10 target molecules in a background of 2.5 μg unspecific DNA without post-PCR probe manipulations could be achieved with different primer/probe combinations. The assay holds the promise to concurrently perform amplification, probe hybridization, and specific detection without opening the reaction chamber, if sealable foils are used.
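The sensitivity claim follows from simple amplification arithmetic. The sketch below is a back-of-envelope illustration (ideal doubling per cycle, an arbitrary detection threshold), not the APEX-FCS protocol itself.

```python
import math

def cycles_to_threshold(n0, threshold=1e9, efficiency=1.0):
    """Cycles of PCR at the given per-cycle efficiency for n0 starting
    copies to reach `threshold` copies (ideal doubling when efficiency=1)."""
    return math.ceil(math.log(threshold / n0, 1.0 + efficiency))

c10 = cycles_to_threshold(10)      # 10 molecules: 27 doublings to 1e9 copies
c1000 = cycles_to_threshold(1000)  # 100x more template saves ~6.6 cycles
```

Because the rhodamine-labeled probe is present at only nanomolar concentration, the extended fraction of probe molecules, and with it the average diffusion time seen by FCS, shifts over these same cycles, which is what lets the 30-s measurement report the amplification kinetics.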

Relevance: 100.00%

Abstract:

This paper shows the results of an experimental analysis of the bell tower of the “Chiesa della Maddalena” (Mola di Bari, Italy), carried out to better understand the structural behavior of slender masonry structures. The research aims to calibrate a numerical model by means of the Operational Modal Analysis (OMA) method, so that realistic conclusions about the dynamic behavior of the structure are obtained. The choice of OMA derives from the need to determine the modal parameters of a structure through non-destructive testing, especially for structures of cultural-historical value. By means of an easy and accurate process, it is possible to acquire in-situ ambient vibrations. The data collected are essential for estimating the mode shapes, the natural frequencies and the damping ratios of the structure. To analyze the data obtained from the monitoring, the Peak Picking method has been applied to the Fast Fourier Transforms (FFT) of the signals in order to identify the effective natural frequencies and damping factors of the structure. The main frequencies and damping ratios have been determined from measurements at some relevant locations. The responses have then been extrapolated and extended to the entire tower through a 3-D Finite Element Model. In this way, knowing the modes of vibration, it has been possible to understand the overall dynamic behavior of the structure.
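The Peak Picking step can be sketched on synthetic data. The 2.0 Hz mode, record length and noise level below are assumed for illustration, not tower measurements.

```python
import numpy as np

# Synthetic ambient-vibration record: one dominant mode at 2.0 Hz buried in
# broadband noise, as a stand-in for an accelerometer channel.
fs = 100.0                          # sampling rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)  # 60 s record
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.normal(size=t.size)

# Peak picking: compute the FFT magnitude spectrum and read the natural
# frequency off the dominant spectral peak (skipping the DC bin).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]
```

With several channels, comparing the relative peak amplitudes and phases across sensor locations is what yields the mode-shape estimates mentioned in the abstract; damping is commonly read from the width of each peak (half-power bandwidth).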

Relevance: 100.00%

Abstract:

Mode of access: Internet.

Relevance: 100.00%

Abstract:

Reprinted from: British medical journal, 1897, v.1, 1167-1172; 1229-1236; 1292-1300.

Relevance: 100.00%

Abstract:

In this paper, we compare a well-known semantic space model, Latent Semantic Analysis (LSA), with another model, the Hyperspace Analogue to Language (HAL), which is widely used in different areas, especially in automatic query refinement. We conduct this comparative analysis to test our hypothesis that, with respect to the ability to extract lexical information from a corpus of text, LSA is quite similar to HAL. We regard HAL and LSA as black boxes. Through a Pearson's correlation analysis of the outputs of these two black boxes, we conclude that LSA correlates highly with HAL, justifying the claim that LSA and HAL can potentially play a similar role in facilitating automatic query refinement. This paper evaluates LSA in a new application area and contributes an effective way to compare different semantic space models.
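The black-box comparison reduces to correlating the two models' scores on the same items. The similarity scores below are illustrative numbers, not actual LSA or HAL outputs.

```python
import numpy as np

def pearson(u, v):
    """Pearson correlation coefficient of two equally long score vectors."""
    return np.corrcoef(np.asarray(u, float), np.asarray(v, float))[0, 1]

# Hypothetical similarity scores for the same five word pairs produced by the
# two "black boxes":
lsa_scores = [0.91, 0.40, 0.75, 0.10, 0.62]
hal_scores = [0.88, 0.35, 0.80, 0.15, 0.58]
r = pearson(lsa_scores, hal_scores)   # close to 1 means the models agree
```

A value of `r` near 1 on a large, representative set of items is the kind of evidence the paper uses to argue the two models extract similar lexical information.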

Relevance: 100.00%

Abstract:

The thesis presents new methodology and algorithms that can be used to analyse and measure the hand tremor and fatigue of surgeons while performing surgery. This will assist them in deriving useful information about their fatigue levels, and make them aware of the changes in their tool point accuracies. This thesis proposes that muscular changes of surgeons, which occur through a day of operating, can be monitored using Electromyography (EMG) signals. The multi-channel EMG signals are measured at different muscles in the upper arm of surgeons. The dependence between EMG signals has been examined to test the hypothesis that they are coupled with, and dependent on, each other. The results demonstrated that EMG signals collected from different channels while mimicking an operating posture are independent. Consequently, single-channel fatigue analysis has been performed. In measuring hand tremor, a new method for determining the maximum tremor amplitude using Principal Component Analysis (PCA) and a new technique to detrend acceleration signals using the Empirical Mode Decomposition (EMD) algorithm were introduced. This tremor determination method is more representative for surgeons and it is suggested as an alternative fatigue measure. This was combined with the complexity analysis method, and applied to surgically captured data to determine if operating has an effect on a surgeon's fatigue and tremor levels. It was found that surgical tremor and fatigue develop throughout a day of operating and that this could be determined based solely on their initial values. Finally, several Nonlinear AutoRegressive with eXogenous inputs (NARX) neural networks were evaluated. The results suggest that it is possible to monitor surgeon tremor variations during surgery from their EMG fatigue measurements.
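The PCA step of the tremor-amplitude method can be sketched as follows. Only the PCA projection is shown; EMD detrending is replaced here by plain mean removal, and the 9 Hz tremor, its direction and the noise level are all assumed for illustration.

```python
import numpy as np

# Synthetic 3-axis accelerometer record: a hand tremor of amplitude 0.8 along
# one fixed spatial direction, plus sensor noise on each axis.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 500)
tremor = 0.8 * np.sin(2 * np.pi * 9.0 * t)
axis = np.array([0.6, 0.7, 0.39])
axis /= np.linalg.norm(axis)
acc = np.outer(tremor, axis) + 0.05 * rng.normal(size=(t.size, 3))

# PCA via SVD of the centred data: the first principal axis recovers the
# dominant tremor direction, and the maximum tremor amplitude is read from
# the scores along that axis.
acc -= acc.mean(axis=0)              # stand-in for EMD detrending
_, _, vt = np.linalg.svd(acc, full_matrices=False)
projected = acc @ vt[0]              # scores along the first principal axis
max_amplitude = float(np.max(np.abs(projected)))   # near the injected 0.8
```

Projecting onto the first principal component rather than a fixed sensor axis is what makes the measure independent of how the accelerometer happens to be oriented on the surgeon's hand.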

Relevance: 100.00%

Abstract:

The pattern of correlation between two sets of variables can be tested using canonical variate analysis (CVA). CVA, like principal components analysis (PCA) and factor analysis (FA) (Statnote 27, Hilton & Armstrong, 2011b), is a multivariate analysis. Essentially, as in PCA/FA, the objective is to determine whether the correlations between two sets of variables can be explained by a smaller number of ‘axes of correlation’ or ‘canonical roots’.
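The canonical roots can be computed directly. This is a minimal sketch using the standard QR/SVD formulation on synthetic data in which the two variable sets share a single latent factor, so one strong axis of correlation is expected.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two variable sets observed on the same
    samples: the singular values of Qx.T @ Qy, where Qx and Qy are
    orthonormal bases of the column-centred data matrices."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.clip(np.linalg.svd(Qx.T @ Qy, compute_uv=False), 0.0, 1.0)

# Two synthetic variable sets driven by one shared latent factor:
rng = np.random.default_rng(0)
z = rng.normal(size=(200, 1))
X = z + 0.1 * rng.normal(size=(200, 3))
Y = z + 0.1 * rng.normal(size=(200, 2))
roots = canonical_correlations(X, Y)   # first root near 1, the rest small
```

The number of roots equals the smaller of the two set sizes; finding only the first root large is exactly the "smaller number of axes of correlation" outcome the abstract describes.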

Relevance: 100.00%

Abstract:

This study subdivides the Weddell Sea, Antarctica, into seafloor regions using multivariate statistical methods. These regions serve as categories for comparing, contrasting and quantifying biogeochemical processes and biodiversity, both between ocean regions geographically and for regions under development within the scope of global change. The division obtained is characterized by the dominating components and interpreted in terms of the prevailing environmental conditions. The analysis uses 28 environmental variables for the sea surface, 25 variables for the seabed and 9 variables for the combined surface/bottom analysis. The data were taken during the years 1983-2013; some data were interpolated. The statistical errors of several interpolation methods (e.g. IDW, Indicator, Ordinary and Co-Kriging) with changing settings have been compared to identify the most suitable method. The multivariate mathematical procedures used are regionalized classification via k-means cluster analysis, canonical-correlation analysis and multidimensional scaling. Canonical-correlation analysis identifies the influencing factors in the different parts of the study area. Several methods for the identification of the optimum number of clusters have been tested. For the seabed, 8 and 12 clusters were identified as reasonable numbers for clustering the Weddell Sea; for the sea surface, the numbers 8 and 13, and for the top/bottom analysis, 8 and 3 were identified, respectively. Additionally, the results of 20 clusters are presented for the three alternatives, offering the first small-scale environmental regionalization of the Weddell Sea. Especially the results of 12 clusters identify marine-influenced regions which can be clearly separated from those determined by the geological catchment area and from those dominated by river discharge.
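The regionalized classification step can be sketched with a plain k-means loop. The two-variable "environmental space" and its two well-separated regions below are illustrative synthetic data, and the seeding (one starting point per presumed region, for a deterministic outcome) is an assumption of this sketch.

```python
import numpy as np

def kmeans(X, init_idx, iters=50):
    """Plain k-means on X (n_samples, n_vars), seeded with the rows listed
    in init_idx; returns (labels, centroids)."""
    centroids = X[np.asarray(init_idx)].copy()
    for _ in range(iters):
        # Assign each sample to its nearest centroid, then recompute means.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Two well-separated synthetic "regions" in a two-variable space:
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
               rng.normal(5.0, 0.3, size=(50, 2))])
labels, _ = kmeans(X, init_idx=[0, 50])
```

In the study the same idea runs on 25-28 standardized environmental variables per grid cell, and choosing k (8, 12, 13, ...) is itself assessed with separate optimum-cluster-number criteria.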