987 results for Generalized cross correlations


Relevance: 80.00%

Abstract:

Calculating the potentials on the heart's epicardial surface from the body surface potentials constitutes one form of the inverse problem of electrocardiography (ECG). Since these problems are ill-posed, one approach is to use zero-order Tikhonov regularization, where the squared norms of both the residual and the solution are minimized, with a relative weight determined by the regularization parameter. In this paper, we used three different methods to choose the regularization parameter in inverse solutions of ECG: the L-curve, generalized cross-validation (GCV) and the discrepancy principle (DP). Among them, the GCV method has received less attention in ECG inverse problems than the other two. Since the DP approach requires knowledge of the noise norm, we used a model function to estimate it. The performance of the methods was compared using a concentric-sphere model and a real-geometry heart-torso model, with a distribution of current dipoles placed inside the heart model as the source. Gaussian measurement noise was added to the body surface potentials. The results show that all three methods produce good inverse solutions when noise is low; but, as the noise increases, the DP approach produces better results than the L-curve and GCV methods, particularly in the real-geometry model. Both the GCV and L-curve methods perform well in low- to medium-noise situations.

Relevance: 80.00%

Abstract:

In vivo, neurons of the globus pallidus (GP) and subthalamic nucleus (STN) resonate independently at around 70 Hz. However, upon the loss of dopamine, as in Parkinson's disease, there is a switch to a lower firing frequency with increased bursting and synchronization of activity. In vitro, type A neurons of the GP, identified by the presence of Ih and rebound depolarizations, fire at frequencies up to 80 Hz in response to glutamate pressure ejection designed to mimic STN input. The profile of this frequency response was unaltered by bath application of the GABAA antagonist bicuculline (10 μM), indicating the lack of involvement of a local GABA neuronal network, while cross-correlations of neuronal pairs revealed either uncorrelated activity or phase-locked activity with a variable phase delay, consistent with each GP neuron acting as an independent oscillator. This autonomy of firing appears to arise from intrinsic voltage- and sodium-dependent subthreshold membrane oscillations. GABAA inhibitory postsynaptic potentials are able to disrupt this tonic activity while promoting a rebound depolarization and action potential firing. This rebound can reset the phase of the intrinsic oscillation and provides a mechanism for promoting coherent firing in ensembles of GP neurons that may ultimately contribute to pathological disorders of movement.
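The phase-reset mechanism described above can be caricatured with two detuned phase oscillators (hypothetical rates, not a biophysical GP model): left alone they drift apart, while a shared inhibitory rebound that resets both phases realigns them.

```python
import numpy as np

# Two independent oscillators at slightly different rates (Hz).
freqs = np.array([70.0, 72.0])
dt = 1e-4
phase = np.zeros(2)

for _ in range(5000):                 # 0.5 s of free running
    phase += 2 * np.pi * freqs * dt
drift_free = phase[1] - phase[0]      # lag built up without inhibition

phase[:] = 0.0                        # shared GABAergic rebound resets both
for _ in range(100):                  # 10 ms after the reset
    phase += 2 * np.pi * freqs * dt
drift_reset = phase[1] - phase[0]     # near zero: firing is coherent again
```

After 0.5 s the 2 Hz detuning has accumulated roughly a full cycle of lag, whereas just after the common reset the two units are nearly in phase.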

Relevance: 80.00%

Abstract:

An increasing number of neuroimaging studies are concerned with identifying interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also determines the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a given level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. The results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases; for instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications follow for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
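A back-of-the-envelope version of the data-length argument, using a Bartlett-type variance approximation for the sample cross-correlation of two AR(1) processes (an illustrative stand-in for the paper's analytical expressions, not the expressions themselves):

```python
import numpy as np

def n_required(r, phi1, phi2, z=1.96):
    """Samples needed for a true cross-correlation r between two AR(1)
    signals (lag-1 coefficients phi1, phi2) to exceed the z-sigma
    significance threshold.  Bartlett-type approximation:
    var(r_hat) ~ (1/N) * (1 + phi1*phi2) / (1 - phi1*phi2)."""
    var_factor = (1 + phi1 * phi2) / (1 - phi1 * phi2)
    return int(np.ceil((z / r) ** 2 * var_factor))

# White (broadband) signals versus strongly autocorrelated
# (narrow-band, low-frequency) signals, for a target r = 0.05.
n_white = n_required(0.05, 0.0, 0.0)
n_narrow = n_required(0.05, 0.9, 0.9)
```

For uncorrelated samples about 1,500 samples are needed; with strong autocorrelation (phi = 0.9) the requirement grows roughly tenfold, mirroring the abstract's point that narrow-band, low-frequency signals demand longer recordings.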

Relevance: 80.00%

Abstract:

We investigate two numerical procedures for the Cauchy problem in linear elasticity, involving the relaxation of either the given boundary displacements (Dirichlet data) or the prescribed boundary tractions (Neumann data) on the over-specified boundary, in the alternating iterative algorithm of Kozlov et al. (1991). The two mixed direct (well-posed) problems associated with each iteration are solved using the method of fundamental solutions (MFS) in conjunction with Tikhonov regularization, while the optimal value of the regularization parameter is chosen via the generalized cross-validation (GCV) criterion. An efficient regularizing stopping criterion, which terminates the iterative procedure at the point where the accumulation of noise becomes dominant and the errors in predicting the exact solution begin to increase, is also presented. The MFS-based iterative algorithms with relaxation are tested on Cauchy problems for isotropic linear elastic materials in various geometries to confirm the numerical convergence, stability, accuracy and computational efficiency of the proposed method.
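The GCV-regularized MFS solve at the core of each iteration can be sketched on a scalar analogue: the 2-D Laplace equation on the unit disk rather than linear elasticity (geometry, fictitious-source radius and noise level are all assumptions of this sketch):

```python
import numpy as np

# MFS for Laplace on the unit disk: approximate u by a combination of
# fundamental solutions -log|x - y_j|/(2 pi) with sources y_j on a
# fictitious circle of radius R > 1; pick the Tikhonov parameter by GCV.
rng = np.random.default_rng(1)
m, n, R = 80, 40, 2.0
t_col = 2 * np.pi * np.arange(m) / m
t_src = 2 * np.pi * np.arange(n) / n
col = np.c_[np.cos(t_col), np.sin(t_col)]        # boundary collocation points
src = R * np.c_[np.cos(t_src), np.sin(t_src)]    # fictitious source points

def fundsol(x, y):
    return -np.log(np.linalg.norm(x - y, axis=-1)) / (2 * np.pi)

A = fundsol(col[:, None, :], src[None, :, :])
u_exact = col[:, 0] * col[:, 1]                  # harmonic test field u = x*y
b = u_exact + 1e-3 * rng.standard_normal(m)      # noisy Dirichlet data

def gcv_solve(A, b, lams):
    """Tikhonov-regularized least squares, lambda chosen by GCV."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    out_of_range = np.sum(b**2) - np.sum(beta**2)    # residual outside range
    best = None
    for lam in lams:
        f = s**2 / (s**2 + lam**2)
        g = (np.sum(((1 - f) * beta) ** 2) + out_of_range) \
            / (len(b) - np.sum(f)) ** 2
        if best is None or g < best[0]:
            best = (g, Vt.T @ (f * beta / s))
    return best[1]

c = gcv_solve(A, b, 10.0 ** np.linspace(-10, 0, 100))
# Evaluate the recovered field at the interior point (0.3, 0.4); u = 0.12.
u_int = fundsol(np.array([0.3, 0.4])[None, :], src).ravel() @ c
```

The MFS matrix is severely ill-conditioned, which is exactly why the abstract pairs it with Tikhonov regularization and a GCV-chosen parameter.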

Relevance: 80.00%

Abstract:

We investigate the evolution of magnetohydrodynamic (or hydromagnetic, as coined by Chandrasekhar) perturbations in the presence of stochastic noise in rotating shear flows. Particular emphasis is placed on flows whose angular velocity decreases, but whose specific angular momentum increases, with increasing radial coordinate. Such flows are Rayleigh stable, yet they must be turbulent in order to explain observed astrophysical data; hence there is a mismatch between linear theory and observations and experiments. The mismatch seems to have been resolved, at least in certain regimes, in the presence of a weak magnetic field, which reveals the magnetorotational instability. The present work explores the effects of stochastic noise on such magnetohydrodynamic flows in order to resolve the above mismatch generically for hot flows. We concentrate on a small section of such a flow, which is essentially a plane shear flow supplemented by the Coriolis effect, mimicking a small section of an astrophysical accretion disk around a compact object. It is found that such stochastically driven flows exhibit large temporal and spatial autocorrelations and cross-correlations of the perturbations and, hence, large energy dissipation of the perturbations, which generates instability. Interestingly, the autocorrelations and cross-correlations appear independent of the background angular velocity profiles, which are Rayleigh stable, indicating their universality. This work initiates our attempt to understand the evolution of three-dimensional hydromagnetic perturbations in rotating shear flows in the presence of stochastic noise. © 2013 American Physical Society.
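The qualitative point, that noise forcing sustains large perturbation correlations even in a linearly stable sheared system, can be illustrated with a toy non-normal linear system driven by white noise (not the paper's equations; the matrix and the shear value are hypothetical):

```python
import numpy as np

# A linearly stable but non-normal "shear" operator versus its normal
# counterpart, each driven by white noise via Euler-Maruyama.
rng = np.random.default_rng(2)
S = 20.0                                        # shear strength (illustrative)
A_shear = np.array([[-1.0, S], [0.0, -1.0]])    # stable, non-normal
A_plain = np.array([[-1.0, 0.0], [0.0, -1.0]])  # stable, normal

def stationary_var(A, dt=2e-3, steps=100_000):
    """Time-averaged squared perturbation amplitude under white noise."""
    x = np.zeros(2)
    acc = 0.0
    for _ in range(steps):
        x = x + dt * (A @ x) + np.sqrt(dt) * rng.standard_normal(2)
        acc += x @ x
    return acc / steps

v_shear = stationary_var(A_shear)
v_plain = stationary_var(A_plain)
```

The sheared system sustains a variance orders of magnitude above the unsheared one (the Lyapunov solution gives roughly 100 versus 1 here), echoing the abstract's large noise-driven correlations and energy dissipation in Rayleigh-stable flows.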

Relevance: 80.00%

Abstract:

Exchange rate economics has developed substantially over the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues with the aim of improving our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. The results support the contention that nominal exchange rates are unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model in which agents face a cash-in-advance constraint and set prices to the local market, while the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can arise endogenously in a new open economy macroeconomic model. Thus, the model has the potential to rationalize the Uncovered Interest Parity puzzle.
Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction to relative consumption.
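The random-walk benchmark of Chapter Two can be illustrated on simulated data: on a driftless random walk, a simple in-sample-fitted AR(1) competitor (a stand-in here, not the dissertation's FIGARCH model) fails to improve on the naive random-walk forecast out of sample.

```python
import numpy as np

# Simulate a driftless log exchange rate (a pure random walk).
rng = np.random.default_rng(3)
log_p = np.cumsum(0.01 * rng.standard_normal(2000))
r = np.diff(log_p)                       # daily log returns
r_fit, r_test = r[:1000], r[1000:]

# Fit AR(1) on the first half; forecast one step ahead on the second.
phi = np.dot(r_fit[1:], r_fit[:-1]) / np.dot(r_fit[:-1], r_fit[:-1])
pred_ar = phi * r_test[:-1]              # AR(1) return forecasts
pred_rw = np.zeros_like(pred_ar)         # random walk: forecast zero return

rmse_ar = np.sqrt(np.mean((r_test[1:] - pred_ar) ** 2))
rmse_rw = np.sqrt(np.mean((r_test[1:] - pred_rw) ** 2))
```

The fitted autocorrelation is statistically indistinguishable from zero and the two out-of-sample RMSEs are essentially identical, illustrating why beating the random walk is hard when the data-generating process is one.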

Relevance: 80.00%

Abstract:

Ambient seismic noise has traditionally been considered an unwanted perturbation in seismic data acquisition that "contaminates" the clean recording of earthquakes. Over the last decade, however, it has been demonstrated that consistent information about subsurface structure can be extracted from cross-correlations of ambient seismic noise. In this context, the roles are reversed: the ambient seismic noise becomes the desired seismic signal, while earthquakes become the unwanted perturbation that needs to be removed. At periods shorter than 30 s, the spectrum of ambient seismic noise is dominated by microseism, which originates from distant atmospheric perturbations over the oceans. Microseism is the most continuous seismic signal and can be classified as primary (observed in the range 10-20 s) or secondary (observed in the range 5-10 s). The Green's function of the propagating medium between two receivers (seismic stations) can be reconstructed by cross-correlating seismic noise recorded simultaneously at the receivers. The reconstructed Green's function is generally proportional to the surface-wave portion of the seismic wavefield, as microseismic energy travels mostly as surface waves. In this work, 194 Green's functions obtained by stacking one month of daily cross-correlations of ambient seismic noise, recorded on the vertical component of several pairs of broadband seismic stations in Northeast Brazil, are presented. The daily cross-correlations were stacked using a time-frequency phase-weighted scheme that enhances weak coherent signals by reducing incoherent noise. The cross-correlations show that, as expected, the emerging signal is dominated by Rayleigh waves, with dispersion velocities reliably measured for periods between 5 and 20 s.
Both permanent stations from a monitoring seismic network and temporary stations from past passive experiments in the region are considered, resulting in a combined network of 33 stations separated by distances of approximately 60 to 1311 km. The Rayleigh-wave dispersion velocity measurements are then used to develop tomographic images of group velocity variation for the Borborema Province of Northeast Brazil. The tomographic maps satisfactorily delineate buried structural features in the region. At short periods (~5 s) the images reflect shallow crustal structure, clearly delineating intra-continental and marginal sedimentary basins, as well as portions of important shear zones traversing the Borborema Province. At longer periods (10-20 s) the images are sensitive to deeper structure in the upper crust, and most of the shallower anomalies fade away; interestingly, some of them do persist. The deep anomalies correlate with neither the location of the Cenozoic volcanism and uplift that marked the evolution of the Borborema Province nor available maps of surface heat flow, and their origin remains enigmatic.
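The core idea, that the inter-station Green's function emerges from stacked noise cross-correlations, can be sketched with synthetic traces. This uses a plain linear stack of 30 "days" (the thesis uses a time-frequency phase-weighted stack); the delay, sample counts and noise levels are illustrative.

```python
import numpy as np

# Two stations record the same random wavefield, offset by a 15-sample
# propagation delay, plus independent local noise.  Stacking daily
# cross-correlations makes the delay emerge as the lag of the peak.
rng = np.random.default_rng(4)
delay, n, days = 15, 2048, 30
stack = np.zeros(2 * n - 1)
for _ in range(days):
    wave = rng.standard_normal(n + delay)          # shared noise wavefield
    sta_a = wave[delay:] + 0.5 * rng.standard_normal(n)
    sta_b = wave[:n] + 0.5 * rng.standard_normal(n)
    stack += np.correlate(sta_a, sta_b, mode="full")
lag = np.argmax(stack) - (n - 1)                   # signed lag of the peak
```

The magnitude of the peak lag recovers the inter-station travel time (its sign depends on the correlation convention), which is exactly the information used to measure dispersion between station pairs.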

Relevance: 80.00%

Abstract:

This thesis presents and discusses the results of ambient seismic noise correlation for two different environments: an intraplate setting and the Mid-Atlantic Ridge. The coda wave interferometry method has also been tested on the intraplate data. Ambient noise correlation is a method that makes it possible to retrieve the structural response between two receivers from ambient noise records, as if one of the stations were a virtual source. It has been widely used in seismology to image the subsurface and to monitor structural changes, mostly associated with volcanic eruptions and large earthquakes. In the intraplate study, we were able to detect localized structural changes related to a small earthquake swarm, whose main event was mR 3.7, in Northeast Brazil. We also showed that 1-bit normalization and spectral whitening result in the loss of waveform detail, and that the phase auto-correlation, which is amplitude-unbiased, seems to be more sensitive and robust for our analysis of a small earthquake swarm. The analysis of 6 months of data using cross-correlations detects clear medium changes soon after the main event, while the auto-correlations detect changes only after about 1 month. This could be explained by fluid-pressure redistribution, initiated by hydromechanical changes, with pathways opened to shallower depth levels by later earthquakes. In the Mid-Atlantic Ridge study, we investigate structural changes associated with an mb 4.9 earthquake in the region of the Saint Paul transform fault. The data were recorded by a single broadband seismic station located less than 200 km from the Mid-Atlantic Ridge. The results of the phase auto-correlation for a 5-month period show a strong co-seismic medium change followed by a relatively fast post-seismic recovery. This medium change is likely related to the damage caused by the earthquake's ground shaking.
The healing process (the filling of the new cracks) lasted 60 days and can be decomposed into two phases: a fast recovery (70% in ~30 days) in the early post-seismic stage and a relatively slow recovery later (30% in ~30 days). In the coda wave interferometry study, we monitor temporal changes of the subsurface caused by the small intraplate earthquake swarm mentioned previously. The method was first validated with synthetic data, for which we were able to detect a change of 2.5% in the source position and a 15% decrease in the number of scatterers. Then, in the real data, we observed a rapid decorrelation of the seismic coda after the mR 3.7 event, indicating a rapid earthquake-induced change of the subsurface in the fault region.
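A coda-wave-interferometry flavoured sketch of why late-coda windows decorrelate after a medium change: a small velocity change stretches the coda in time, so late windows lose correlation with a reference trace while early windows stay similar (synthetic traces; the stretch factor standing in for the medium change is hypothetical).

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 50.0, 5001)                 # 50 s at 100 Hz
amps = rng.standard_normal(400)
phases = 2 * np.pi * rng.random(400)

def synth(tt):
    """Random-phase synthetic coda (3-7 Hz) with a decaying envelope."""
    wave = np.zeros_like(tt)
    for k in range(400):
        wave += amps[k] * np.cos(2*np.pi*(3.0 + 0.01*k)*tt + phases[k])
    return wave * np.exp(-tt / 30.0)

ref = synth(t)
mon = synth(t * (1 + 2e-3))       # 0.2% stretch: delay grows with lapse time

def win_cc(a, b, i0, i1):
    """Zero-lag correlation coefficient over a lapse-time window."""
    a, b = a[i0:i1] - a[i0:i1].mean(), b[i0:i1] - b[i0:i1].mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

cc_early = win_cc(ref, mon, 100, 1100)    # ~1-11 s lapse time
cc_late = win_cc(ref, mon, 3900, 4900)    # ~39-49 s lapse time
```

Because the time shift is proportional to lapse time, the early window remains highly correlated while the late window decorrelates; this lapse-time dependence is what distinguishes a bulk velocity change from, say, a clock error.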

Relevance: 80.00%

Abstract:

This thesis deals with tensor completion for the solution of multidimensional inverse problems. We study the problem of reconstructing an approximately low-rank tensor from a small number of noisy linear measurements. New recovery guarantees, numerical algorithms, non-uniform sampling strategies, and parameter selection algorithms are developed. We derive a fixed-point continuation algorithm for tensor completion and prove its convergence. A restricted isometry property (RIP) based tensor recovery guarantee is proved. Probabilistic recovery guarantees are obtained for sub-Gaussian measurement operators and for measurements obtained by non-uniform sampling from a Parseval tight frame. We show how tensor completion can be used to solve multidimensional inverse problems arising in NMR relaxometry. Algorithms are developed for regularization parameter selection, including accelerated k-fold cross-validation and generalized cross-validation; these methods are validated on experimental and simulated data. We also derive condition number estimates for nonnegative least squares problems. Tensor recovery promises to significantly accelerate N-dimensional NMR relaxometry and related experiments, enabling previously impractical measurements. Our methods could also be applied to other inverse problems arising in machine learning, image processing, signal processing, computer vision, and other fields.
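The k-fold cross-validation selection of a regularization parameter can be sketched on a plain ridge (Tikhonov) problem, a scalar stand-in for the tensor-completion setting above (sizes, sparsity and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 120, 30
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:5] = 1.0                          # a few active coefficients
y = X @ w_true + 0.5 * rng.standard_normal(n)

def ridge(X, y, lam):
    """Tikhonov/ridge solution of min ||X w - y||^2 + lam ||w||^2."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def kfold_cv_error(X, y, lam, k=5):
    """Mean held-out squared error over k folds."""
    folds = np.array_split(np.arange(len(y)), k)
    err = 0.0
    for f in folds:
        mask = np.ones(len(y), bool)
        mask[f] = False
        w = ridge(X[mask], y[mask], lam)
        err += np.sum((y[f] - X[f] @ w) ** 2)
    return err / len(y)

lams = 10.0 ** np.linspace(-3, 3, 25)
cv = [kfold_cv_error(X, y, l) for l in lams]
lam_best = lams[int(np.argmin(cv))]
```

The CV-selected parameter avoids both extremes of the grid: almost no regularization (overfitting the noise) and heavy shrinkage (biasing the solution toward zero).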

Relevance: 40.00%

Abstract:

We consider stock market contagion as a significant increase in cross-market linkages after a shock to one country or group of countries. Under this definition, we study whether contagion spread from the U.S. financial crisis to the other major stock markets in the world, using the adjusted (unconditional) correlation coefficient approach of Forbes and Rigobon (2002), which consists of testing whether average cross-market correlations increase significantly during the relevant period of turmoil. We would not reject the null hypothesis of interdependence in favour of contagion if the increase in correlation merely reflects a continuation of high linkages in all states of the world. Moreover, if contagion occurred, this would justify the intervention of the IMF and the sudden portfolio restructuring observed during the period under study.
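The Forbes-Rigobon adjustment can be sketched on simulated returns: with an unchanged transmission mechanism, higher source-market variance during turmoil mechanically raises the measured correlation, and the adjusted coefficient removes that bias (beta, variances and sample sizes are illustrative):

```python
import numpy as np

# Unchanged transmission y = beta*x + e across calm and turmoil periods;
# only the variance of the source market x changes.
rng = np.random.default_rng(7)
beta, N = 0.5, 20_000
x_calm = rng.standard_normal(N)
x_turm = 2.0 * rng.standard_normal(N)            # crisis: higher variance
y_calm = beta * x_calm + rng.standard_normal(N)
y_turm = beta * x_turm + rng.standard_normal(N)

r_calm = np.corrcoef(x_calm, y_calm)[0, 1]
r_turm = np.corrcoef(x_turm, y_turm)[0, 1]       # inflated by volatility

# Forbes-Rigobon adjustment: rho* = rho / sqrt(1 + delta*(1 - rho^2)),
# where delta is the relative increase in the source-market variance.
delta = x_turm.var() / x_calm.var() - 1.0
r_adj = r_turm / np.sqrt(1.0 + delta * (1.0 - r_turm**2))
```

The raw turmoil correlation rises well above the calm-period value even though nothing in the transmission changed, while the adjusted coefficient falls back to the calm-period level, which is why the test is run on the adjusted statistic.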

Relevance: 40.00%

Abstract:

We consider the distribution of cross sections of clusters and the density-density correlation functions for the A+B→0 reaction. We solve the reaction-diffusion equations numerically for random initial distributions of reactants. When both reactant species have the same diffusion coefficients the distribution of cross sections and the correlation functions scale with the diffusion length and obey superuniversal laws (independent of dimension). For different diffusion coefficients the correlation functions still scale, but the scaling functions depend on the dimension and on the diffusion coefficients. Furthermore, we display explicitly the peculiarities of the cluster-size distribution in one dimension.
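A minimal 1-D numerical solve of the A+B→0 reaction-diffusion equations with random initial densities (grid size, rate and diffusion constants are illustrative), showing the segregation into A-rich and B-rich clusters, visible as anti-correlated local densities:

```python
import numpy as np

# da/dt = D d2a/dx2 - k a b,   db/dt = D d2b/dx2 - k a b
rng = np.random.default_rng(8)
L, D, k, dt, steps = 2048, 1.0, 10.0, 0.02, 5000
a = rng.random(L)                      # random initial density of A
b = rng.random(L)                      # random initial density of B

def lap(u):                            # periodic 1-D Laplacian (dx = 1)
    return np.roll(u, 1) - 2.0 * u + np.roll(u, -1)

for _ in range(steps):                 # explicit Euler (D*dt << 0.5: stable)
    r = k * a * b * dt                 # local annihilation A + B -> 0
    a, b = a + dt * D * lap(a) - r, b + dt * D * lap(b) - r

corr_ab = np.corrcoef(a, b)[0, 1]      # negative: A- and B-rich clusters
mean_a = a.mean()                      # strongly depleted from 0.5
```

Because the reaction rapidly consumes whichever species is locally in the minority, the surviving reactants sort into alternating single-species clusters whose size grows with the diffusion length, the regime in which the abstract's scaling laws apply.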

Relevance: 40.00%

Abstract:

We present an analysis of trace gas correlations in the lowermost stratosphere. In-situ aircraft measurements of CO, N2O, NOy and O3, obtained during the STREAM 1997 winter campaign, have been used to investigate the role of cross-tropopause mass exchange on tracer-tracer relations. At altitudes several kilometers above the local tropopause, undisturbed stratospheric air was found with NOy/NOy* ratios close to unity, NOy/O3 about 0.003-0.006 and CO mixing ratios as low as 20 ppbv (NOy* is a proxy for total reactive nitrogen derived from NOy-N2O relations measured in the stratosphere). Mixing of tropospheric air into the lowermost stratosphere has been identified by enhanced ratios of NOy/NOy* and NOy/O3, and from scatter plots of CO versus O3. The enhanced NOy/O3 ratio in the lowermost stratospheric mixing zone points to a reduced efficiency of O3 formation from aircraft NOx emissions.