998 results for Correlation (Statistics)


Relevance: 30.00%

Abstract:

We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^(-p) dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent p. The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour. However, models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. If similar mode-switching dynamics occur within earthquake faults, then intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
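The abstract describes the model only in outline. As a rough illustration of the kind of update rule involved, the following minimal sketch (our own construction, not the authors' code; the lattice size, strain-drop rule, and loading scheme are all assumptions) implements threshold rupture with an r^(-p) strain redistribution kernel:

```python
import numpy as np

def simulate(n=32, p=3.0, threshold=1.0, steps=500, seed=0):
    """Toy lattice fault: uniform loading, threshold rupture,
    and long-range r^-p strain redistribution (illustrative only)."""
    rng = np.random.default_rng(seed)
    strain = rng.uniform(0.0, threshold, size=(n, n))
    ii, jj = np.indices((n, n))
    energy_release = []
    for _ in range(steps):
        # Tectonic loading: advance all cells until the first rupture.
        strain += threshold - strain.max()
        released = 0.0
        failing = strain >= threshold
        while failing.any():                      # cascade of ruptures
            for i, j in zip(*np.nonzero(failing)):
                r = np.hypot(ii - i, jj - j)
                r[i, j] = np.inf                  # no self-interaction
                kernel = r ** (-p)
                kernel /= kernel.sum()            # conserve transferred strain
                drop = strain[i, j]
                strain[i, j] = 0.0                # local strain drop
                strain += drop * kernel           # long-range redistribution
                released += drop
            failing = strain >= threshold
        energy_release.append(released)
    return np.array(energy_release)
```

Small p makes the interaction effectively long-range, large p localizes it; that is the knob the abstract's experiments turn.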

Relevance: 30.00%

Abstract:

It is well known that dichotomizing continuous data decreases statistical power when the goal is to test for a statistical association between two variables. Modern researchers, however, focus not only on statistical significance but also on an estimation of the "effect size" (i.e., the strength of association between the variables) to judge whether a significant association is also clinically relevant. In this article, we are interested in the consequences of dichotomizing continuous data on the value of an effect size in some classical settings. It turns out that the conclusions will not be the same whether using a correlation or an odds ratio to summarize the strength of association between the variables: whereas the value of a correlation is typically decreased by a factor of pi/2 after each dichotomization, the value of an odds ratio is at the same time raised to the power 2. From a descriptive statistical point of view, it is thus not clear whether dichotomizing continuous data leads to a decrease or to an increase in the effect size, as illustrated using a data set to investigate the relationship between motor and intellectual functions in children and adolescents.
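The competing effects described above are easy to reproduce numerically. A minimal sketch (our own; the sample size, true correlation of 0.5, and median splits are assumptions) contrasting the attenuated correlation with the odds ratio of the resulting 2x2 table:

```python
import numpy as np

rng = np.random.default_rng(42)
n, rho = 100_000, 0.5        # assumed sample size and true correlation

# Draw a bivariate normal sample with correlation rho.
x, y = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T

# Dichotomize both variables at their medians.
xb = (x > np.median(x)).astype(int)
yb = (y > np.median(y)).astype(int)

r_cont = np.corrcoef(x, y)[0, 1]     # close to the true rho
r_dich = np.corrcoef(xb, yb)[0, 1]   # attenuated (phi coefficient)

# Odds ratio of the 2x2 table built from the dichotomized pair.
a = np.sum((xb == 1) & (yb == 1)); b = np.sum((xb == 1) & (yb == 0))
c = np.sum((xb == 0) & (yb == 1)); d = np.sum((xb == 0) & (yb == 0))
odds_ratio = (a * d) / (b * c)

print(f"r (continuous) = {r_cont:.3f}, r (dichotomized) = {r_dich:.3f}, "
      f"odds ratio = {odds_ratio:.2f}")
```

The dichotomized correlation comes out markedly smaller than the continuous one, while the 2x2 odds ratio remains a large-looking number, which is the descriptive ambiguity the abstract points to.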

Relevance: 30.00%

Abstract:

Over the past decade, significant interest has been expressed in relating the spatial statistics of surface-based reflection ground-penetrating radar (GPR) data to those of the imaged subsurface volume. A primary motivation for this work is that changes in the radar wave velocity, which largely control the character of the observed data, are expected to be related to corresponding changes in subsurface water content. Although previous work has indeed indicated that the spatial statistics of GPR images are linked to those of the water content distribution of the probed region, a viable method for quantitatively analyzing the GPR data and solving the corresponding inverse problem has not yet been presented. Here we address this issue by first deriving a relationship between the 2-D autocorrelation of a water content distribution and that of the corresponding GPR reflection image. We then show how a Bayesian inversion strategy based on Markov chain Monte Carlo sampling can be used to estimate the posterior distribution of subsurface correlation model parameters that are consistent with the GPR data. Our results indicate that if the underlying assumptions are valid and we possess adequate prior knowledge regarding the water content distribution, in particular its vertical variability, this methodology allows not only for the reliable recovery of lateral correlation model parameters but also for estimates of parameter uncertainties. In the case where prior knowledge regarding the vertical variability of water content is not available, the results show that the methodology still reliably recovers the aspect ratio of the heterogeneity.
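The first step of the workflow described above is the 2-D autocorrelation of the reflection image. A minimal sketch of how such an autocorrelation is typically computed, via the Wiener-Khinchin relation (the function name and normalization are our own choices, not the paper's software):

```python
import numpy as np

def autocorr2d(img):
    """Normalized 2-D autocorrelation via the Wiener-Khinchin theorem.

    Note: FFT-based, hence circular; zero-pad the input if a linear
    (non-wrapping) autocorrelation is required.
    """
    f = img - img.mean()                 # remove the mean first
    power = np.abs(np.fft.fft2(f)) ** 2  # power spectrum
    acf = np.fft.ifft2(power).real       # inverse FFT -> autocorrelation
    acf = np.fft.fftshift(acf)           # zero lag at the array centre
    return acf / acf.max()               # normalize to 1 at zero lag
```

Correlation-model parameters (e.g., lateral and vertical correlation lengths of an assumed anisotropic covariance model) would then be sampled with Markov chain Monte Carlo so that the modeled autocorrelation matches this empirical one.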

Relevance: 30.00%

Abstract:

We report a case series of three children with solid pseudopapillary tumor of the pancreas (SPT) in which a complete radiological work-up, including ultrasound, computed tomography scans, and MRI, has been carried out. The aim of this article is to highlight the characteristic imaging findings of SPT in the pediatric age group and to establish a correlation with typical histopathological findings of the lesion.

Relevance: 30.00%

Abstract:

Background: Coronary microvascular dysfunction (CMD) is associated with cardiovascular events in type 2 diabetes mellitus (T2DM). Optimal glycaemic control does not always preclude future events. We sought to assess the effect of the current HbA1c target on coronary microcirculatory function and to identify predictive factors for CMD in T2DM patients. Methods: We studied 100 patients with T2DM and 214 patients without T2DM. All had a history of chest pain, non-obstructive angiograms, and a direct assessment of the coronary blood flow increase in response to intracoronary infusion of adenosine and acetylcholine, for evaluation of endothelium-independent and endothelium-dependent CMD, respectively. Patients with T2DM were categorized as having optimal (HbA1c < 7 %) vs. suboptimal (HbA1c ≥ 7 %) glycaemic control at the time of catheterization. Results: Baseline characteristics and coronary endothelial function parameters differed significantly between T2DM patients and the control group. The prevalence of endothelium-independent CMD (29.8 vs. 39.6 %, p = 0.40) and endothelium-dependent CMD (61.7 vs. 62.2 %, p = 1.00) was similar in patients with optimal vs. suboptimal glycaemic control. Age (OR 1.10; 95 % CI 1.04-1.18; p < 0.001) and female gender (OR 3.87; 95 % CI 1.45-11.4; p < 0.01) were significantly associated with endothelium-independent CMD, whereas glomerular filtration rate (OR 0.97; 95 % CI 0.95-0.99; p < 0.05) was significantly associated with endothelium-dependent CMD. Optimal glycaemic control was not associated with endothelium-independent (OR 0.60; 95 % CI 0.23-1.46; p = 0.26) or endothelium-dependent CMD (OR 0.99; 95 % CI 0.43-2.24; p = 0.98). Conclusions: The current HbA1c target does not predict better coronary microcirculatory function in T2DM patients. The appropriate strategy for prevention of CMD in T2DM patients remains to be addressed. Keywords: Endothelial dysfunction; Diabetes mellitus; Coronary microcirculation

Relevance: 30.00%

Abstract:

Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available.
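The key insight above, that multivariate statistics can be recovered from single-variant summary statistics plus the correlation matrix of the test statistics, is easiest to see for a weighted burden-style test. A minimal sketch (our own, not the paper's software; it assumes signed per-variant z-scores have already been derived from the p values and effect directions):

```python
import numpy as np
from scipy.stats import norm

def burden_from_summary(z, R, w=None):
    """Gene-level burden test from single-variant summary statistics.

    z : per-variant z-scores (signed)
    R : correlation matrix of the single-variant test statistics,
        e.g. estimated from one participating study or a reference panel
    w : optional per-variant weights (default: equal weights)
    """
    z = np.asarray(z, dtype=float)
    w = np.ones_like(z) if w is None else np.asarray(w, dtype=float)
    t = w @ z / np.sqrt(w @ R @ w)     # standard normal under the null
    return t, 2 * norm.sf(abs(t))      # statistic and two-sided p value
```

The denominator w'Rw is exactly the piece that would otherwise require individual participant data; as the abstract notes, R can instead come from one participating study or a publicly available database.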

Relevance: 30.00%

Abstract:

We prove the algebraic equality between Jennrich's (1970) asymptotic chi-squared test for the equality of correlation matrices and a Wald test statistic derived from Neudecker and Wesselman's (1990) expression of the asymptotic variance matrix of the sample correlation matrix.
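For reference, Jennrich's two-sample statistic can be written as follows (our transcription from the literature; the abstract itself does not reproduce it, so treat the details as needing verification against the original paper). For p x p sample correlation matrices R_1, R_2 with sample sizes n_1, n_2:

```latex
% Jennrich (1970) test of H0: P1 = P2
\[
  \bar{R} = \frac{n_1 R_1 + n_2 R_2}{n_1 + n_2}, \qquad
  Z = \sqrt{\tfrac{n_1 n_2}{n_1 + n_2}}\,\bar{R}^{-1}(R_1 - R_2),
\]
\[
  \chi^2 = \tfrac{1}{2}\operatorname{tr}(Z^2)
         - \operatorname{dg}(Z)^{\top} S^{-1} \operatorname{dg}(Z),
  \qquad S = I + \bar{R} \circ \bar{R}^{-1},
\]
% asymptotically chi-squared with p(p-1)/2 degrees of freedom, where
% dg(Z) is the vector of diagonal elements of Z and \circ denotes the
% Hadamard (elementwise) product.
```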

Relevance: 30.00%

Abstract:

A method is offered that makes it possible to apply generalized canonical correlation analysis (CANCOR) to two or more matrices of different row and column order. The new method optimizes the generalized canonical correlation analysis objective by considering only the observed values. This is achieved by employing selection matrices. We present and discuss fit measures to assess the quality of the solutions. In a simulation study we assess the performance of our new method and compare it to an existing procedure called GENCOM, proposed by Green and Carroll. We find that our new method outperforms the GENCOM algorithm both with respect to model fit and recovery of the true structure. Moreover, as our new method does not require any type of iteration, it is easier to implement and requires less computation. We illustrate the method by means of an example concerning the relative positions of the political parties in the Netherlands based on provincial data.

Relevance: 30.00%

Abstract:

PURPOSE: To compare the diagnostic performance of multi-detector CT arthrography (CTA) and 1.5-T MR arthrography (MRA) in detecting hyaline cartilage lesions of the shoulder, with arthroscopic correlation. PATIENTS AND METHODS: CTA and MRA images prospectively obtained in 56 consecutive patients following the same arthrographic procedure were independently evaluated for glenohumeral cartilage lesions (modified Outerbridge grade ≥2 and grade 4) by two musculoskeletal radiologists. The cartilage surface was divided into 18 anatomical areas. Arthroscopy was taken as the reference standard. Diagnostic performance of CTA and MRA was compared using ROC analysis. Interobserver and intraobserver agreement was determined by κ statistics. RESULTS: Sensitivity and specificity of CTA varied from 46.4 to 82.4 % and from 89.0 to 95.9 %, respectively; sensitivity and specificity of MRA varied from 31.9 to 66.2 % and from 91.1 to 97.5 %, respectively. Diagnostic performance of CTA was statistically significantly better than that of MRA for both readers (all p ≤ 0.04). Interobserver agreement for the evaluation of cartilage lesions was substantial with CTA (κ = 0.63) and moderate with MRA (κ = 0.54). Intraobserver agreement was almost perfect with both CTA (κ = 0.94-0.95) and MRA (κ = 0.83-0.87). CONCLUSION: The diagnostic performance of CTA and MRA for the detection of glenohumeral cartilage lesions is moderate, although statistically significantly better with CTA. KEY POINTS: • CTA has moderate diagnostic performance for detecting glenohumeral cartilage substance loss. • MRA has moderate diagnostic performance for detecting glenohumeral cartilage substance loss. • CTA is more accurate than MRA for detecting cartilage substance loss.

Relevance: 30.00%

Abstract:

Using a scaling assumption, we propose a phenomenological model aimed at describing the joint probability distribution of two magnitudes A and T that characterize the spatial and temporal scales of a set of avalanches. The model also describes the correlation function of a sequence of such avalanches. As an example we study the joint distribution of amplitudes and durations of the acoustic emission signals observed in martensitic transformations [Vives et al., preceding paper, Phys. Rev. B 52, 12 644 (1995)].

Relevance: 30.00%

Abstract:

A theory is presented to explain the statistical properties of the growth of dye-laser radiation. Results are in agreement with recent experimental findings. The different roles of pump-noise intensity and correlation time are elucidated.

Relevance: 30.00%

Abstract:

Numerous sources of evidence point to the fact that heterogeneity within the Earth's deep crystalline crust is complex and hence may be best described through stochastic rather than deterministic approaches. As seismic reflection imaging arguably offers the best means of sampling deep crustal rocks in situ, much interest has been expressed in using such data to characterize the stochastic nature of crustal heterogeneity. Previous work on this problem has shown that the spatial statistics of seismic reflection data are indeed related to those of the underlying heterogeneous seismic velocity distribution. As of yet, however, the nature of this relationship has remained elusive due to the fact that most of the work was either strictly empirical or based on incorrect methodological approaches. Here, we introduce a conceptual model, based on the assumption of weak scattering, that allows us to quantitatively link the second-order statistics of a 2-D seismic velocity distribution with those of the corresponding processed and depth-migrated seismic reflection image. We then perform a sensitivity study in order to investigate what information regarding the stochastic model parameters describing crustal velocity heterogeneity might potentially be recovered from the statistics of a seismic reflection image using this model. Finally, we present a Monte Carlo inversion strategy to estimate these parameters and we show examples of its application at two different source frequencies and using two different sets of prior information. Our results indicate that the inverse problem is inherently non-unique and that many different combinations of the vertical and lateral correlation lengths describing the velocity heterogeneity can yield seismic images with the same 2-D autocorrelation structure. The ratio of all of these possible combinations of vertical and lateral correlation lengths, however, remains roughly constant which indicates that, without additional prior information, the aspect ratio is the only parameter describing the stochastic seismic velocity structure that can be reliably recovered.
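A common parameterization in this literature (an assumption on our part; the abstract does not specify its covariance family) is an anisotropic exponential or von Kármán autocorrelation, in which case the aspect ratio discussed above is simply the ratio of the two correlation lengths:

```python
import numpy as np

def exp_acf(dx, dz, ax, az):
    """Anisotropic exponential autocorrelation model with lateral (ax)
    and vertical (az) correlation lengths; aspect ratio = ax / az."""
    return np.exp(-np.hypot(dx / ax, dz / az))
```

The reported non-uniqueness then means that many (ax, az) pairs reproduce the image autocorrelation equally well once the seismic wavelet has filtered the vertical axis, while the ratio ax/az remains identifiable.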

Relevance: 30.00%

Abstract:

A calculation of passage-time statistics is reported for the laser switch-on problem, under the influence of colored noise, when the net gain is continuously swept from below to above threshold. Cases of fast and slow sweeping are considered. In the weak-noise limit, asymptotic results are given for small and large correlation times of the noise. The mean first passage time increases with the correlation time of the noise. This effect is more important for fast sweeping than for slow sweeping.
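As a rough illustration of the setup described above (our own toy construction, not the paper's calculation: a linearized switch-on variable with a linearly swept net gain and Ornstein-Uhlenbeck colored noise; all parameter values are assumptions):

```python
import numpy as np

def mean_passage_time(tau=1.0, sweep=0.1, D=1e-4, x_ref=1.0,
                      dt=1e-3, t_max=100.0, n_traj=200, seed=0):
    """Euler-Maruyama estimate of the mean first-passage time for
    dx/dt = a(t) x + eps(t), with net gain a(t) = -1 + sweep*t swept
    through threshold and eps(t) an OU process with correlation time
    tau and intensity D. Illustrative and slow (pure Python loop)."""
    rng = np.random.default_rng(seed)
    times = np.empty(n_traj)
    for k in range(n_traj):
        x, eps, t = 0.0, 0.0, 0.0
        while abs(x) < x_ref and t < t_max:
            a = -1.0 + sweep * t                         # swept net gain
            x += (a * x + eps) * dt
            # OU update: d(eps) = -eps/tau dt + sqrt(2D)/tau dW
            eps += -eps / tau * dt + np.sqrt(2 * D * dt) / tau * rng.standard_normal()
            t += dt
        times[k] = t
    return times.mean()
```

Re-running with larger tau (at fixed sweep rate) is the kind of experiment that probes the abstract's claim that the mean first-passage time grows with the noise correlation time.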

Relevance: 30.00%

Abstract:

The final year project came to us as an opportunity to get involved in a topic which appeared attractive during the learning process of majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand these days and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and the public sector. For these reasons, the essence of this project is the study of a statistical instrument valid for the analysis of large datasets and directly related to computer science: partial correlation networks.

The structure of the project has been determined by our objectives as the work developed. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analyzing a relatively large data set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether partial correlation network analysis is an effective, useful instrument that allows finding valuable results in Big Data.

The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a very valid tool to represent cross-sectional interconnections between elements in large data sets.

The scope of this project is however limited, as there are some sections in which deeper analysis would have been appropriate. Considering intertemporal connections between elements, the choice of the tuning parameter lambda, and a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended.

To sum up, the analyzed statistical tool has proved to be a very useful instrument to find the relationships that connect the elements present in a large data set. Partial correlation networks allow the owner of such a data set to observe and analyze existing linkages that could otherwise have been overlooked.
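The SPACE estimator of Peng et al. (2009) is not part of the standard Python toolkits. As a rough stand-in for readers who want to experiment, the following sketch builds a partial correlation network with scikit-learn's graphical lasso, a related but distinct sparse estimator (the function name and thresholding are our own choices):

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

def partial_corr_network(X, edge_threshold=1e-3):
    """Sparse partial-correlation network (graphical-lasso stand-in
    for the SPACE estimator).

    X : (n_samples, n_features) data matrix
    Returns the matrix of partial correlations; entries near zero mean
    no direct (conditional) link between the two variables.
    """
    model = GraphicalLassoCV().fit(X)
    P = model.precision_                       # estimated inverse covariance
    d = np.sqrt(np.diag(P))
    pcorr = -P / np.outer(d, d)                # standardize the precision matrix
    np.fill_diagonal(pcorr, 1.0)
    pcorr[np.abs(pcorr) < edge_threshold] = 0  # prune negligible edges
    return pcorr
```

The nonzero off-diagonal entries define the edges of the network, which is exactly the cross-sectional linkage structure the project sets out to visualize.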

Relevance: 30.00%

Abstract:

Background: In longitudinal studies where subjects experience recurrent incidents over a period of time, such as respiratory infections, fever or diarrhea, statistical methods are required to take into account the within-subject correlation. Methods: For repeated events data with censored failure times, the independent increment (AG), marginal (WLW) and conditional (PWP) models are three multiple-failure models that generalize Cox's proportional hazards model. In this paper, we assess the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum number of possible recurrences and sample size. We also study the methods' performance on a real dataset from a cohort study with bronchial obstruction. Results: We find substantial differences between the methods, and no single method is optimal. AG and PWP seem preferable to WLW for low correlation levels, but the situation reverses for high correlations. Conclusions: All methods are robust to censoring, worsen with increasing recurrence levels, and share a bias problem which, among other consequences, means that asymptotic normal confidence intervals are not fully reliable, although they are well developed theoretically.
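The practical difference between the three models lies in how the risk intervals are laid out before fitting a Cox-type model. A small hypothetical example (one subject with events at days 20 and 45 and censoring at day 60; PWP shown in its total-time form):

```python
import pandas as pd

# AG (Andersen-Gill): counting-process intervals on the calendar time
# scale; each interval ends at an event (status=1) or censoring (status=0).
ag = pd.DataFrame({
    "start":  [0, 20, 45],
    "stop":   [20, 45, 60],
    "status": [1, 1, 0],
})

# PWP (conditional): the same intervals as AG, but stratified by event
# number, so a subject is only at risk for event k after event k-1.
pwp = ag.assign(stratum=[1, 2, 3])

# WLW (marginal): every subject appears in every stratum and is at risk
# for each event from study entry, with time measured from time 0.
wlw = pd.DataFrame({
    "stratum": [1, 2, 3],
    "time":    [20, 45, 60],
    "status":  [1, 1, 0],
})
```

Because the risk sets differ, the three layouts answer slightly different questions about the recurrence process, which is one source of the substantial between-method differences the abstract reports.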