43 results for Twitter Financial Market Pearson cross correlation
in Aston University Research Archive
Abstract:
The origin of linear instability in rotating sheared accretion flows has long remained a controversial subject. While some explanations of such non-normal transient growth of disturbances in the Rayleigh-stable limit were available for magnetized accretion flows, similar instabilities in the absence of magnetic perturbations remained unexplained. This dichotomy was resolved in two recent publications by Chattopadhyay and co-workers [Mukhopadhyay and Chattopadhyay, J. Phys. A 46, 035501 (2013), doi:10.1088/1751-8113/46/3/035501; Nath et al., Phys. Rev. E 88, 013010 (2013), doi:10.1103/PhysRevE.88.013010], where it was shown that such instabilities, especially for nonmagnetized accretion flows, were introduced through the interaction of the inherent stochastic noise in the system (even a "cold" accretion flow at 3000 K is too "hot" in statistical parlance and is capable of inducing strong thermal modes) with the underlying Taylor-Couette flow profiles. Both studies, however, excluded the additional energy influx (or efflux) that could result from a nonzero cross correlation of a noise perturbing the velocity flow, say, with the noise driving the vorticity flow (or, equivalently, the magnetic field and magnetic vorticity flow dynamics). Through the introduction of such a time-symmetry-violating effect, in this article we show that nonzero noise cross correlations essentially renormalize the strength of the temporal correlations. Apart from an overall boost in the energy rate (both for spatial and temporal correlations, and hence in the ensemble-averaged energy spectra), this results in mutual competition in the growth rates of the affected variables, often resulting in suppression of oscillating Alfvén waves at small times while leading to faster saturation at relatively longer time scales. The effects are more pronounced with magnetic field fluxes, where the noise cross correlation magnifies the strength of the field concerned. Another remarkable feature, noted specifically for the autocorrelation functions, is the removal of energy degeneracy in the temporal profiles of fast-growing non-normal modes, leading to faster saturation with minimal oscillations. These results, together with those presented in the previous two publications, now convincingly explain subcritical transition to turbulence in the linear limit for all possible situations, and could serve as a benchmark for nonlinear stability studies of Keplerian accretion disks.
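As a notational sketch (not taken from the paper itself): for white Gaussian noise sources $\xi_u$ and $\xi_\zeta$ driving the velocity and vorticity dynamics, the nonzero cross correlation discussed above can be written as

$$\langle \xi_u(\mathbf{k},t)\,\xi_u(\mathbf{k}',t')\rangle = D_u\,\delta(\mathbf{k}+\mathbf{k}')\,\delta(t-t'), \qquad \langle \xi_u(\mathbf{k},t)\,\xi_\zeta(\mathbf{k}',t')\rangle = D_\times\,\delta(\mathbf{k}+\mathbf{k}')\,\delta(t-t'),$$

where $D_\times \neq 0$ is the cross-correlation strength; setting $D_\times = 0$ recovers the uncorrelated-noise setting of the two earlier studies.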
Abstract:
Typical Double Auction (DA) models assume that trading agents are one-way traders. With this limitation, they cannot directly reflect the fact that individual traders in financial markets (the most popular application of the double auction) choose their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Based on experiments under both static and dynamic settings, we find that the allocative efficiency of a static continuous BDA market comes from the rational selection of trading directions and is negatively related to the intelligence of the trading strategies. Moreover, we introduce the Kernel trading strategy, designed for general DA markets on the basis of probability density estimation. Our experiments show that it outperforms some intelligent DA market trading strategies.
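The abstract does not spell out the details of the Kernel strategy; the following is a minimal illustrative sketch in Python of the general idea of using probability density estimation over recent transaction prices to choose a trading direction (all names and numbers here are hypothetical):

```python
import numpy as np

def kde(samples, bandwidth):
    """Gaussian kernel density estimate built from observed trade prices."""
    samples = np.asarray(samples, dtype=float)
    norm = len(samples) * bandwidth * np.sqrt(2.0 * np.pi)
    def pdf(x):
        z = (np.asarray(x, dtype=float)[:, None] - samples[None, :]) / bandwidth
        return np.exp(-0.5 * z ** 2).sum(axis=1) / norm
    return pdf

# A two-way trader estimates where transactions are likely to clear,
# then picks a direction relative to its private value.
prices = np.array([99.5, 100.1, 100.4, 99.8, 100.0, 100.6])  # hypothetical trades
pdf = kde(prices, bandwidth=0.3)
grid = np.linspace(prices.min() - 1.0, prices.max() + 1.0, 400)
likely_clearing_price = grid[np.argmax(pdf(grid))]

private_value = 100.9
direction = "buy" if private_value > likely_clearing_price else "sell"
print(f"expected clearing price {likely_clearing_price:.2f} -> {direction}")
```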
Abstract:
Pearson's correlation coefficient ('r') is one of the most widely used of all statistics. Nevertheless, care is needed in interpreting the results because, with large numbers of observations, quite small values of 'r' become significant and the X variable may account for only a small proportion of the variance in Y. Hence, 'r squared' should always be calculated and included in any discussion of the significance of 'r'. The use of 'r' also assumes that the data follow a bivariate normal distribution (see Statnote 17), and this assumption should be examined prior to the study. If the data do not conform to such a distribution, the use of a non-parametric correlation coefficient should be considered. A significant correlation should not be interpreted as indicating 'causation', especially in observational studies, in which the two variables may be correlated because of their mutual correlations with other confounding variables.
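The point about 'r' versus 'r squared' is easy to demonstrate; a minimal sketch in Python on hypothetical simulated data, using scipy.stats.pearsonr:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 0.2 * x + rng.normal(size=n)      # deliberately weak true relationship

r, p = stats.pearsonr(x, y)
print(f"r = {r:.3f}, p = {p:.5f}, r squared = {r * r:.3f}")
# With n = 500 even r around 0.2 is highly significant, yet r squared
# is only about 0.04: x accounts for roughly 4% of the variance in y.
```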
Abstract:
1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant, and the X variable may account for only a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in any discussion of the significance of r.
2. The use of r assumes that a bivariate normal distribution is present, and this assumption should be examined prior to the study. If Pearson's r is not appropriate, a non-parametric correlation coefficient such as Spearman's rs may be used, as in the sketch after this list.
3. A significant correlation should not be interpreted as indicating causation, especially in observational studies, in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables.
4. In studies of measurement error, there are problems in using r as a test of reliability, and the 'intra-class correlation coefficient' should be used as an alternative.
A correlation test provides only limited information about the relationship between two variables. Fitting a regression line to the data using the method known as 'least squares' provides much more information; the methods of regression and their application in optometry will be discussed in the next article.
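A brief sketch of point 2, comparing Pearson's r with Spearman's rs on clearly non-normal, monotonically related data (hypothetical simulation, using scipy.stats):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.exponential(size=100)                  # skewed, non-normal
y = x ** 2 + 0.1 * rng.exponential(size=100)   # monotone but non-linear

r, _ = stats.pearsonr(x, y)
rs, _ = stats.spearmanr(x, y)
print(f"Pearson r = {r:.3f}, Spearman rs = {rs:.3f}")
# The rank-based rs captures the monotone association more faithfully
# when the bivariate-normal assumption behind Pearson's r fails.
```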
Abstract:
This thesis examines the effect of rights issue announcements on the stock prices of companies listed on the Kuala Lumpur Stock Exchange (KLSE) between 1987 and 1996. The emphasis is on establishing whether the KLSE is semi-strongly efficient with respect to the announcement of rights issues, and on checking whether the implications of corporate finance theories for the effect of an event can be supported in the context of an emerging market. Once the effect is established, potential determinants of abnormal returns identified by previous empirical work and corporate finance theory are analysed. By examining 70 companies making clean rights issue announcements, this thesis sheds light on some important issues in long-term corporate financing. Event study analysis is used to check the efficiency of the Malaysian stock market, while cross-sectional regression analysis is executed to identify possible explanatory variables for the effect of the rights issue announcements. To ensure that the results presented are not contaminated, econometric and statistical issues raised in both analyses have been taken into account. Given the small amount of empirical research conducted in this part of the world, the results of this study should be of use to investors, security analysts, corporate financial managers, regulators and policy makers, as well as those interested in capital-market-based research on emerging markets. It is found that the Malaysian stock market is not semi-strongly efficient, since there exists a persistent non-zero abnormal return. This finding is not consistent with the hypothesis that security returns adjust rapidly to reflect new information. The result may be influenced by the sample, which consists mainly of below-average-size companies that tend to be thinly traded; nevertheless, these issues have been addressed. Another important finding is that there is some evidence to suggest that insider trading activity existed in this market. In addition, when the effect of rights issue announcements is compared with the implications of corporate finance theories for the sign of abnormal returns, the signalling model, the asymmetric information model, the perfect substitution hypothesis and Scholes' information hypothesis cannot be supported.
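The thesis's exact specification is not given in the abstract; a common market-model formulation of event study abnormal returns, sketched in Python with hypothetical window indices and simulated data:

```python
import numpy as np

def event_study(stock_ret, market_ret, est_slice, event_slice):
    """Market-model event study: fit alpha and beta over the estimation
    window, then compute (cumulative) abnormal returns around the event."""
    beta, alpha = np.polyfit(market_ret[est_slice], stock_ret[est_slice], 1)
    expected = alpha + beta * market_ret[event_slice]
    ar = stock_ret[event_slice] - expected       # abnormal returns
    return ar, np.cumsum(ar)                     # AR and CAR

# Hypothetical daily returns: 200-day estimation window, 11-day event window.
rng = np.random.default_rng(2)
market = rng.normal(0.0005, 0.01, 250)
stock = 0.001 + 1.2 * market + rng.normal(0.0, 0.015, 250)
ar, car = event_study(stock, market, slice(0, 200), slice(220, 231))
print("CAR over event window:", f"{car[-1]:.4f}")
```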
Abstract:
Purpose - Anterior segment optical coherence tomography (AS-OCT) is used to further examine previous reports that ciliary muscle thickness (CMT) is increased in myopic eyes. With reference to temporal and nasal CMT, interrelationships between biometric and morphological characteristics of the anterior and posterior segments are analysed for British-White and British-South-Asian adults with and without myopia. Methods - Data are presented for the right eyes of 62 subjects (British-White n = 39, British-South-Asian n = 23, aged 18-40 years) with a range of refractive error (mean spherical error (MSE (D)) -1.74 ± 3.26; range -10.06 to +4.38), separated into myopes (MSE (D) < -0.50; range -10.06 to -0.56; n = 30) and non-myopes (MSE (D) ≥ -0.50; range -0.50 to +4.38; n = 32). Temporal and nasal ciliary muscle cross-sections were imaged using a Visante AS-OCT. Using Visante software, manual measures of nasal and temporal CMT (NCMT and TCMT respectively) were taken in successive posterior 1 mm steps from the scleral spur over a 3 mm distance (designated NCMT1, TCMT1 et seq.). Measures of axial length (AL) and anterior chamber depth were taken with an IOLMaster biometer. MSE and corneal curvature (CC) measurements were taken with a Shin-Nippon auto-refractor. Magnetic resonance imaging was used to determine total ocular volume (OV) for 31 of the original subject group. Statistical comparisons and analyses were made using mixed repeated-measures ANOVAs, Pearson's correlation coefficient and stepwise forward multiple linear regression. Results - MSE was significantly associated with CMT, with thicker CMT2 and CMT3 being found in the myopic eyes (p = 0.002). In non-myopic eyes TCMT1, TCMT2, NCMT1 and NCMT2 correlated significantly with MSE, AL and OV (p < 0.05). In contrast, myopic eyes generally failed to exhibit a significant correlation between CMT, MSE and axial length, but notably retained a significant correlation between OV and TCMT2, TCMT3, NCMT2 and NCMT3 (p < 0.05). OV was found to be a significantly better predictor of TCMT2 and TCMT3 than AL, by approximately a factor of two (p < 0.001). Anterior chamber depth was significantly associated with both temporal and nasal CMT2 and CMT3; TCMT1 correlated positively with CC. Ethnicity had no significant effect on differences in CMT. Conclusions - Increased CMT is associated with myopia. We speculate that the lack of correlation in myopic subjects between CMT and axial length, but not between CMT and OV, is evidence that disrupted feedback between the fovea and the ciliary apparatus occurs in myopia development.
Abstract:
Many papers claim that a Log Periodic Power Law (LPPL) model fitted to financial market bubbles preceding large market falls, or 'crashes', contains parameters confined within certain ranges. It is further claimed that the underlying model is based on influence percolation and a martingale condition. This paper examines these claims and their validity for capturing large price falls in the Hang Seng stock market index over the period 1970 to 2008. The fitted LPPLs have parameter values within the ranges specified post hoc by Johansen and Sornette (2001) for only seven of the 11 crashes considered. Interestingly, the LPPL fit could have predicted the substantial fall in the Hang Seng index during the recent global downturn. Overall, the mechanism posited as underlying the LPPL model does not in fact underlie it, and the data used to support the fit of the LPPL model to bubbles do so only partially.
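For reference, the LPPL model referred to here is commonly written (in notation we adopt for illustration) as

$$\ln p(t) = A + B\,(t_c - t)^{m}\left[1 + C\cos\!\big(\omega \ln(t_c - t) + \phi\big)\right],$$

where $t_c$ is the critical (crash) time, $B < 0$ with $0 < m < 1$ gives the super-exponential acceleration of the log-price toward $t_c$, and $\omega$ and $\phi$ set the frequency and phase of the log-periodic oscillations; the post hoc ranges mentioned above constrain parameters such as $m$ and $\omega$.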
Abstract:
The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and the FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and of its chairman in particular, over a period of about three years, both before and after the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is Interactive Data. We adopt insights from ANT, and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p. 16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p. 17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we consider the rhetorical and discursive moves by which the SEC and others have presented such developments as providing clearer reporting and accountability standards and as serving the interests of this constructed and largely unknown group, the non-professional investor.
Abstract:
This study examines the influence of corporate governance structures on the levels of compliance with IFRSs disclosure requirements by companies listed on the stock exchanges of two leading MENA (Middle East and North Africa) countries, Egypt and Jordan. The study employs a cross-sectional analysis of a sample of non-financial companies listed on the two stock exchanges for the fiscal year 2007. Using an unweighted disclosure index, it measures the levels of compliance by companies listed on the two stock exchanges investigated. Univariate and multivariate regression analyses are used to estimate the relationships proposed in the hypotheses. In addition, the study uses semi-structured interviews to supplement the interpretation of the findings of the quantitative analyses. An innovative theoretical foundation is deployed, in which compliance is interpretable through three lenses: institutional isomorphism theory, secrecy versus transparency (one of Gray's accounting sub-cultural values), and financial economics theories. The study extends the financial reporting literature, the cross-national comparative financial disclosure literature, and the emerging markets disclosure literature by carrying out one of the first comparative studies of the above-mentioned stock exchanges. The results provide evidence of a lack of de facto compliance (i.e., actual compliance) with IFRSs disclosure requirements in the scrutinised MENA countries. The impact of corporate governance mechanisms for best practice on enhancing the extent of compliance with mandatory IFRSs is absent in the stock exchanges in question. The limited impact of corporate governance best practice is mainly attributed to the novelty of corporate governance in the region, a finding which lends support to the applicability of the proposed theoretical foundation to the MENA context. Finally, the study provides recommendations for improving de facto compliance with IFRSs disclosure requirements and corporate governance best practice in the MENA region, and suggests areas for future research.
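Unweighted disclosure indices of this kind are typically computed as (a standard form, stated here for illustration rather than taken from the study)

$$C_j = \frac{\sum_{i=1}^{n_j} d_{ij}}{n_j},$$

where $d_{ij} = 1$ if company $j$ discloses applicable item $i$ and $0$ otherwise, and $n_j$ is the number of items applicable to company $j$; 'unweighted' means every item counts equally.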
Abstract:
Most traditional methods for extracting the relationships between two time series are based on cross-correlation. In a non-linear non-stationary environment, these techniques are not sufficient. We show in this paper how to use hidden Markov models (HMMs) to identify the lag (or delay) between different variables for such data. We first present a method using maximum likelihood estimation and propose a simple algorithm which is capable of identifying associations between variables. We also adopt an information-theoretic approach and develop a novel procedure for training HMMs to maximise the mutual information between delayed time series. Both methods are successfully applied to real data. We model the oil drilling process with HMMs and estimate a crucial parameter, namely the lag for return.
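The HMM training procedures cannot be reconstructed from the abstract alone; as a simple baseline for the same task, the sketch below scans candidate lags and scores each with a plug-in (histogram) mutual information estimate in Python (all names hypothetical):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def best_lag(x, y, max_lag):
    """Delay of y relative to x that maximises the estimated MI."""
    scores = [(k, mutual_information(x[:len(x) - k], y[k:]))
              for k in range(max_lag + 1)]
    return max(scores, key=lambda s: s[1])[0]

# Hypothetical example: y is a noisy *squared* copy of x delayed by
# 7 samples, so cross-correlation with x is near zero, yet the
# dependence (and hence the lag) is clearly visible to MI.
rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = np.roll(x, 7) ** 2 + 0.1 * rng.normal(size=2000)
print("estimated lag:", best_lag(x, y, max_lag=20))
```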
Abstract:
Most traditional methods for extracting the relationships between two time series are based on cross-correlation. In a non-linear non-stationary environment, these techniques are not sufficient. We show in this paper how to use hidden Markov models to identify the lag (or delay) between different variables for such data. Adopting an information-theoretic approach, we develop a procedure for training HMMs to maximise the mutual information (MMI) between delayed time series. The method is used to model the oil drilling process. We show that cross-correlation gives no information and that the MMI approach outperforms maximum likelihood.
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. The results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases; for instance, approximately three times as much data are necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
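As an illustration of why autocorrelation inflates the data requirement, a hedged Python sketch using a Bartlett-style effective sample size (a standard approximation, not necessarily the authors' exact analytical expressions):

```python
import numpy as np

def autocorr(x, max_lag):
    """Sample autocorrelation at lags 1..max_lag."""
    x = x - x.mean()
    denom = float((x * x).sum())
    return np.array([(x[:len(x) - k] * x[k:]).sum() / denom
                     for k in range(1, max_lag + 1)])

def effective_n(x, y, max_lag=100):
    """Bartlett-style effective number of independent samples for the
    cross-correlation of two autocorrelated signals."""
    rho = autocorr(x, max_lag) * autocorr(y, max_lag)
    return len(x) / (1.0 + 2.0 * rho.sum())

# Smoothing white noise induces autocorrelation, shrinking N_eff well
# below the raw sample count.
rng = np.random.default_rng(5)
kernel = np.ones(50) / 50
x, y = (np.convolve(row, kernel, mode="valid")
        for row in rng.normal(size=(2, 20000)))
print(f"N = {len(x)}, N_eff ~ {effective_n(x, y):.0f}")
# Detecting |r| = 0.05 at the two-sided 5% level needs roughly
# (1.96 / 0.05)^2 ~ 1540 *effectively independent* samples, so
# autocorrelation lengthens the required recording accordingly.
```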
Abstract:
This thesis discusses the need for nondestructive testing and highlights some of the limitations of present-day techniques. Special attention is given to ultrasonic examination techniques and the problems encountered when they are applied to thick welded plates, and some suggestions are made using signal processing methods. Chapter 2 treats the need for nondestructive testing in the light of economy and safety; a short review of present-day techniques in nondestructive testing is also given. The special problems of using ultrasonic techniques on welded structures are discussed in Chapter 3, with some examples of elastic wave propagation in welded steel. The limitations in applying sophisticated signal processing techniques to ultrasonic NDT are mainly found in the transducers generating or receiving the ultrasound; Chapter 4 deals with the different transducers used. One of the difficulties with ultrasonic testing is the interpretation of the signals encountered. Similar problems are found in SONAR/RADAR techniques, and Chapter 5 draws some analogies between SONAR/RADAR and ultrasonic nondestructive testing. This chapter also includes a discussion of some of the techniques used in signal processing in general. A signal processing technique found especially useful is cross-correlation detection, and this technique is treated in Chapter 6. Electronic digital computers have made signal processing techniques easier to implement; Chapter 7 discusses the use of digital computers in ultrasonic NDT. Experimental equipment used to test cross-correlation detection of ultrasonic signals is described in Chapter 8. Chapter 9 summarises the conclusions drawn during this investigation.
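The principle of cross-correlation detection treated in Chapter 6 can be sketched briefly in Python (an illustrative simulation, not the thesis's experimental system; all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 10_000                                  # hypothetical sample rate, Hz
n = 400                                      # length of received trace

# Known transmitted probe: a short Hanning-windowed tone burst.
k = np.arange(80)
pulse = np.sin(2 * np.pi * 1_000 * k / fs) * np.hanning(80)

# Received trace: one echo at an unknown delay, buried in noise.
true_delay = 155
trace = rng.normal(scale=0.5, size=n)
trace[true_delay:true_delay + 80] += pulse

# Cross-correlation detection: correlate with the known pulse and take
# the peak as the echo arrival; the processing gain of the correlation
# pulls the echo out of noise that obscures it in the raw trace.
corr = np.correlate(trace, pulse, mode="valid")
print("estimated delay:", int(np.argmax(corr)), "samples; true:", true_delay)
```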
Abstract:
We have simulated, optically, the performance of various apertures used in Coded Aperture Imaging. Coded pictures of extended and continuous-tone planar objects from the Annulus, Twin Annulus, Fresnel Zone Plate and the Uniformly Redundant Array have been decoded using a noncoherent correlation process. We have compared the tomographic capabilities of the Twin Annulus with those of Uniformly Redundant Arrays based on quadratic residues and m-sequences. We discuss ways of reducing the 'd.c.' background of the various apertures used. The non-ideal system point-spread function inherent in a noncoherent optical correlation process produces artifacts in the reconstruction. Artifacts are also introduced as a result of unwanted cross-correlation terms from out-of-focus planes. We find that the URA based on m-sequences exhibits good spatial resolution and out-of-focus behaviour when imaging extended objects.
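The decoding step can be illustrated in one dimension with a quadratic-residue (Legendre) sequence, the 1-D analogue of the quadratic-residue URA (an illustrative sketch, not the optical set-up used in the work above):

```python
import numpy as np

p = 31                                        # prime array length
qr = {(i * i) % p for i in range(1, p)}       # quadratic residues mod p
a = np.array([1 if i in qr else 0 for i in range(p)])  # aperture: 1 = open
g = 2 * a - 1                                 # balanced +1/-1 decoding array

# Cyclic cross-correlation of aperture and decoder: a sharp peak at zero
# shift with flat sidelobes, which is why URA reconstructions avoid the
# strong 'd.c.' background of, e.g., annular apertures.
corr = np.array([np.roll(a, k) @ g for k in range(p)])
print(corr)          # peak (p - 1)/2 = 15 at k = 0, -1 at every other shift
```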