370 results for long memory
at Queensland University of Technology - ePrints Archive
Abstract:
In this work, we investigate an alternative bootstrap approach based on a result of Ramsey [F.L. Ramsey, Characterization of the partial autocorrelation function, Ann. Statist. 2 (1974), pp. 1296-1301] and on the Durbin-Levinson algorithm to obtain a surrogate series from linear Gaussian processes with long-range dependence. We compare this bootstrap method with other existing procedures in an extensive Monte Carlo experiment by estimating, parametrically and semi-parametrically, the memory parameter d. We consider Gaussian and non-Gaussian processes to assess the robustness of the method to deviations from normality. The approach is also useful for estimating confidence intervals for the memory parameter d, improving the coverage level of the interval.
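A minimal sketch of the core idea follows, assuming only that the surrogate is drawn from the Gaussian process implied by the sample autocovariances of the observed series; the function names and the biased autocovariance estimator are illustrative, not the authors' implementation.

    import numpy as np

    def sample_autocov(x, max_lag):
        """Biased sample autocovariances gamma(0), ..., gamma(max_lag)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        xc = x - x.mean()
        return np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(max_lag + 1)])

    def durbin_levinson_surrogate(x, rng=None):
        """Draw one Gaussian surrogate series whose autocovariance matches
        the sample autocovariance of x, via the Durbin-Levinson recursion."""
        rng = np.random.default_rng() if rng is None else rng
        n = len(x)
        gamma = sample_autocov(x, n - 1)
        surrogate = np.empty(n)
        surrogate[0] = rng.normal(0.0, np.sqrt(gamma[0]))
        phi = np.zeros(0)   # AR coefficients of the previous order
        v = gamma[0]        # one-step prediction error variance
        for k in range(1, n):
            # Partial autocorrelation at lag k (the quantity characterised by Ramsey)
            pacf_k = (gamma[k] - np.dot(phi, gamma[k - 1:0:-1])) / v
            phi = np.append(phi - pacf_k * phi[::-1], pacf_k)
            v *= 1.0 - pacf_k ** 2
            # Conditional draw of the next value given the surrogate generated so far
            mean_k = np.dot(phi, surrogate[k - 1::-1])
            surrogate[k] = mean_k + rng.normal(0.0, np.sqrt(max(v, 0.0)))
        return surrogate

Repeating the draw gives the bootstrap replicates on which the memory parameter d can be re-estimated.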
Abstract:
Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for detecting memory and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series for the five states of Australia are also found to possess long memory, and heavy tails are pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory results in the dynamics of the solution. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the AMEX stock prices, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence as established by MF-DFA.

The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in two- and three-dimensional spaces of the data sets and then use cross-validation to verify the discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodogram, which is based on second-order properties, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
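As a rough illustration of the MF-DFA step used in Parts I and IV, the sketch below computes generalised Hurst exponents h(q) from the scaling of the q-th-order fluctuation functions; the detrending order, scale grid and function names are assumptions, not the thesis code, and for a stationary long-memory series d ≈ h(2) - 0.5.

    import numpy as np

    def mfdfa(x, scales, q_values, order=1):
        """Multifractal detrended fluctuation analysis: return h(q),
        the slope of log F_q(s) versus log s, for each q in q_values."""
        x = np.asarray(x, dtype=float)
        profile = np.cumsum(x - x.mean())
        n = len(profile)
        fq = np.empty((len(q_values), len(scales)))
        for si, s in enumerate(scales):
            n_seg = n // s
            # Segments taken from both ends so that no data are discarded
            segments = [profile[v * s:(v + 1) * s] for v in range(n_seg)]
            segments += [profile[n - (v + 1) * s:n - v * s] for v in range(n_seg)]
            t = np.arange(s)
            var = []
            for seg in segments:
                coeffs = np.polyfit(t, seg, order)      # local polynomial trend
                resid = seg - np.polyval(coeffs, t)
                var.append(np.mean(resid ** 2))         # detrended variance
            var = np.array(var)
            for qi, q in enumerate(q_values):
                if q == 0:
                    fq[qi, si] = np.exp(0.5 * np.mean(np.log(var)))
                else:
                    fq[qi, si] = np.mean(var ** (q / 2.0)) ** (1.0 / q)
        log_s = np.log(np.asarray(scales, dtype=float))
        return np.array([np.polyfit(log_s, np.log(fq[qi]), 1)[0]
                         for qi in range(len(q_values))])

For example, mfdfa(returns, scales=[16, 32, 64, 128, 256], q_values=[-4, -2, 2, 4]) returns one exponent per q; a spread of h(q) across q suggests multifractality, while q = 2 recovers standard DFA.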
Abstract:
Throughout a lifetime of operation, a mobile service robot needs to acquire, store and update its knowledge of a working environment. This includes the ability to identify and track objects in different places, as well as using this information for interaction with humans. This paper introduces a long-term updating mechanism, inspired by the modal model of human memory, to enable a mobile robot to maintain its knowledge of a changing environment. The memory model is integrated with a hybrid map that represents the global topology and local geometry of the environment, as well as the respective 3D location of objects. We aim to enable the robot to use this knowledge to help humans by suggesting the most likely locations of specific objects in its map. An experiment using omni-directional vision demonstrates the ability to track the movements of several objects in a dynamic environment over an extended period of time.
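A highly simplified sketch of the kind of store such an updating mechanism implies is given below, assuming a short-term/long-term split with illustrative promotion and forgetting thresholds; the paper's actual update rules, hybrid map and 3D localisation are not reproduced here.

    import time
    from dataclasses import dataclass, field

    @dataclass
    class ObjectMemory:
        """Modal-model-inspired store of object locations (illustrative only)."""
        promote_after: int = 3        # observations before an entry becomes long-term
        forget_after: float = 3600.0  # seconds without re-observation before decay
        short_term: dict = field(default_factory=dict)
        long_term: dict = field(default_factory=dict)

        def observe(self, obj_id, location):
            """Record a fresh observation of obj_id at a map location (x, y, z)."""
            count, _, _ = self.short_term.get(obj_id, (0, None, 0.0))
            self.short_term[obj_id] = (count + 1, location, time.time())
            if count + 1 >= self.promote_after:
                self.long_term[obj_id] = (location, time.time())

        def most_likely_location(self, obj_id):
            """Prefer recent short-term evidence, fall back to the long-term entry."""
            now = time.time()
            if obj_id in self.short_term and now - self.short_term[obj_id][2] < self.forget_after:
                return self.short_term[obj_id][1]
            if obj_id in self.long_term:
                return self.long_term[obj_id][0]
            return None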
Abstract:
Double-strand breaks represent an extremely cytolethal form of DNA damage and thus pose a serious threat to the preservation of genetic and epigenetic information. Though it is well known that double-strand breaks such as those generated by ionising radiation are among the principal causative factors behind mutations, chromosomal aberrations, genetic instability and carcinogenesis, significantly less is known about the epigenetic consequences of double-strand break formation and repair for carcinogenesis. Double-strand break repair is a highly coordinated process that requires the unravelling of the compacted chromatin structure to facilitate access by the repair machinery and then restoration of the original undamaged chromatin state. Recent experimental findings have pointed to a potential mechanism for double-strand break-induced epigenetic silencing. This review will discuss some of the key epigenetic regulatory processes involved in double-strand break (DSB) repair and how incomplete or incorrect restoration of chromatin structure can leave a DSB-induced epigenetic memory of damage with potentially pathological repercussions.
Abstract:
Jack's Bay (the architecturalisation of memory) is a key work of the author's exhibition Lightsite, which toured Western Australian galleries from February 2006 to November 2007. It is a photographic image, made with a five-minute-long exposure, captured inside a purpose-built, room-sized pinhole camera which is demountable and does not have a floor. The work depicts octogenarian Jack Morris, who for forty years held the professional salmon fishing license in the hamlet of Bremer Bay, on the SE coast of Western Australia. The pinhole camera-room is sited within sand dunes near Jack's now-demolished beachside camp. Three generations of Jack's descendants stand outside the room, from his daughter to his great-grandchildren. The light from this exterior landscape is 'projected' inside the camera-room and illuminates the interior scene, which includes the part of the sand dune upon which the floorless room is erected, along with Jack, who is sitting inside. The image evokes the temporality of light. Here, light itself is portrayed as the primary medium through which we both perceive and describe landscape. In this way it is through the agency of light that we construct our connectivity to landscape.
Abstract:
Free association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist cuing, primed free association, intralist cuing, and single-item recognition tasks. The findings also show that when a related word is presented to cue the recall of a studied word, the cue activates it in an array of related words that distract and reduce the probability of its selection. The activation of the semantic network produces priming benefits during encoding and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distracter strength and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. We evaluate spreading activation and quantum-like entanglement explanations for the priming effect produced by neighborhood density.
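One illustrative way to combine such measures into a single predictor is sketched below; the review combines four network measures, though not necessarily with a logistic model, and the function and variable names are assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def fit_recall_model(cue_to_target, target_to_cue, density, cue_to_distracter, recalled):
        """Fit a model predicting cued recall (0/1 per trial) from four
        free-association network measures; the fitted coefficient signs can
        then be checked against the expected positive/negative contributions."""
        X = np.column_stack([cue_to_target, target_to_cue, density, cue_to_distracter])
        return LogisticRegression().fit(X, np.asarray(recalled, dtype=int))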
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources because, typically, only a few gene sequences can be stored in primary memory at a time. The standard practice in such computation is to use frequent input/output (I/O) operations; therefore, minimizing the number of these operations yields much faster run-times. This paper develops an approach for the faster and scalable computation of large correlation matrices through the full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
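A minimal sketch of the blocking idea follows, assuming a user-supplied loader and pairwise correlation function; both stand in for the paper's I/O layer and similarity measure.

    import numpy as np

    def blocked_correlation(load_block, n_seq, block_size, correlate):
        """Compute an n_seq x n_seq correlation matrix when only a few
        sequences fit in memory at once.
        load_block(i, j) -> list of sequences with indices in [i, j)
        correlate(a, b)  -> scalar similarity between two sequences"""
        C = np.zeros((n_seq, n_seq))
        starts = range(0, n_seq, block_size)
        for bi in starts:
            block_i = load_block(bi, min(bi + block_size, n_seq))
            for bj in starts:
                if bj < bi:
                    continue                     # matrix is symmetric
                block_j = block_i if bj == bi else load_block(bj, min(bj + block_size, n_seq))
                for a, seq_a in enumerate(block_i):
                    for b, seq_b in enumerate(block_j):
                        i, j = bi + a, bj + b
                        if j < i:
                            continue
                        C[i, j] = C[j, i] = correlate(seq_a, seq_b)
        return C

Because the outer block stays resident while the inner blocks stream past it, each sequence is read roughly n_seq / block_size times rather than once per pairwise comparison.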
Abstract:
Here, we investigate the genetic basis of human memory in healthy individuals and the potential role of two polymorphisms previously implicated in memory function. We have explored aspects of retrospective and prospective memory, including semantic, short-term, working and long-term memory, in conjunction with brain-derived neurotrophic factor (BDNF) and tumor necrosis factor-alpha (TNF-alpha). Memory scores for healthy individuals in the population were obtained for each memory type, and the population was genotyped via restriction fragment length polymorphism for the BDNF rs6265 (Val66Met) SNP and via pyrosequencing for the TNF-alpha rs113325588 SNP. Using univariate ANOVA, a significant association of the BDNF polymorphism with visual and spatial memory retention was observed, as was a significant association of the TNF-alpha polymorphism with spatial memory retention. In addition, a significant interactive effect between the BDNF and TNF-alpha polymorphisms was observed for spatial memory retention. In practice, visual memory involves spatial information and the two memory systems work together; however, our data demonstrate that individuals with the Val/Val BDNF genotype have poorer visual memory but higher spatial memory retention, indicating a level of interaction between TNF-alpha and BDNF in spatial memory retention. This is the first study to use genetic analysis to determine the interaction between BDNF and TNF-alpha in relation to memory in normal adults, and it provides important information regarding the effect of genetic determinants and gene interactions on human memory.
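For concreteness, the analysis described corresponds to a univariate two-way ANOVA with an interaction term; a minimal sketch is shown below, where the data frame layout and column names are assumptions rather than the study's analysis script.

    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    def genotype_anova(df):
        """Two-way ANOVA testing main effects of the BDNF and TNF-alpha
        genotypes and their interaction on spatial memory retention.
        df has one row per participant with columns 'bdnf', 'tnf'
        (genotype labels) and 'spatial_retention' (memory score)."""
        model = ols('spatial_retention ~ C(bdnf) * C(tnf)', data=df).fit()
        return sm.stats.anova_lm(model, typ=2)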
Abstract:
The generation of a correlation matrix for a set of genomic sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. Each sequence may be millions of bases long, and there may be thousands of such sequences to compare, so not all sequences may fit into main memory at the same time. Each sequence needs to be compared with every other sequence, so some sequences will generally need to be paged in and out more than once. To minimize execution time, this I/O must be minimized. This paper develops an approach for faster and scalable computation of large correlation matrices through maximal exploitation of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different bioinformatics problems with different correlation matrix sizes. The significant performance improvement of the approach over previous work is demonstrated through benchmark examples.
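A back-of-envelope comparison of the I/O implied by naive versus blocked scheduling, under the simplifying assumption that memory holds mem_capacity sequences and that one sequence load is the unit cost (the paper's scheduler is more refined than this model):

    from math import ceil

    def sequence_loads(n_seq, mem_capacity):
        """Estimate how many sequence loads an all-pairs comparison needs.
        mem_capacity must be at least 2 (one resident block plus one
        streaming slot)."""
        naive = n_seq * (n_seq - 1)                       # reload both sequences per pair
        outer_blocks = ceil(n_seq / (mem_capacity - 1))   # number of resident blocks
        blocked = outer_blocks * n_seq                    # one streaming pass per resident block
        return naive, blocked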
Abstract:
It is well established that the coordinated regulation of activity-dependent gene expression by the histone acetyltransferase (HAT) family of transcriptional coactivators is crucial for the formation of contextual fear and spatial memory, and for hippocampal synaptic plasticity. However, no studies have examined the role of this epigenetic mechanism within the infralimbic prefrontal cortex (ILPFC), an area of the brain that is essential for the formation and consolidation of fear extinction memory. Here we report that a postextinction training infusion of a combined p300/CBP inhibitor (Lys-CoA-Tat), directly into the ILPFC, enhances fear extinction memory in mice. Our results also demonstrate that the HAT p300 is highly expressed within pyramidal neurons of the ILPFC and that the small-molecule p300-specific inhibitor (C646) infused into the ILPFC immediately after weak extinction training enhances the consolidation of fear extinction memory. C646 infused 6 h after extinction had no effect on fear extinction memory, nor did an immediate postextinction training infusion into the prelimbic prefrontal cortex. Consistent with the behavioral findings, inhibition of p300 activity within the ILPFC facilitated long-term potentiation (LTP) under stimulation conditions that do not evoke long-lasting LTP. These data suggest that one function of p300 activity within the ILPFC is to constrain synaptic plasticity, and that a reduction in the function of this HAT is required for the formation of fear extinction memory.