232 results for Time-memory attacks
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada, in June 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate the cerebral mechanisms responsible for new learning. The next 4 articles address this directly: the first (Little and Thulborn) is an overview of data from fMRI studies of category learning, and the second, from the same laboratory (Little, Shin, Siscol, and Thulborn), is an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining the effects of stimulus modality during verbal memory encoding. Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer-than-usual period, achieved by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information-processing models of cognition.
Abstract:
Minimal perfect hash functions are used for memory-efficient storage and fast retrieval of items from static sets. We present an infinite family of efficient and practical algorithms for generating order-preserving minimal perfect hash functions. We show that almost all members of the family construct space- and time-optimal order-preserving minimal perfect hash functions, and we identify the one with minimum constants. Members of the family generate a hash function in two steps. First, a special kind of function into an r-graph is computed probabilistically. Then this function is refined deterministically into a minimal perfect hash function. We give strong theoretical evidence that the first step runs in linear random time. The second step runs in linear deterministic time. The family not only has theoretical importance, but also offers the fastest known method for generating perfect hash functions.
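The two-step construction can be illustrated with a small, hedged sketch of the r = 2 case (a CHM-style construction in Python; the vertex-count constant, the hash seeding, and the retry logic below are illustrative assumptions, not the paper's optimal choices). The re-seeding loop is the probabilistic first step; the single traversal that assigns the vertex labels is the deterministic second step.

```python
import random

def build_opmphf(keys, max_tries=1000):
    """Sketch of an order-preserving minimal perfect hash for the r = 2 case.

    Step 1 (probabilistic): map each key to an edge of a random graph on
    m > 2n vertices using two seeded hash functions; re-seed until the
    induced constraints are consistent.
    Step 2 (deterministic): traverse each connected component once and
    assign vertex labels g so that g[h1(k)] + g[h2(k)] == index(k) (mod n).
    """
    n = len(keys)
    m = int(2.1 * n) + 1                       # illustrative constant, not the optimal one
    for _ in range(max_tries):
        s1, s2 = random.getrandbits(32), random.getrandbits(32)
        h1 = lambda k, s=s1: hash((s, k)) % m
        h2 = lambda k, s=s2: hash((s, k)) % m
        adj = {v: [] for v in range(m)}
        ok = True
        for i, k in enumerate(keys):
            u, v = h1(k), h2(k)
            if u == v:                         # self-loop: give up on this seeding
                ok = False
                break
            adj[u].append((v, i))
            adj[v].append((u, i))
        g = {}
        for start in range(m):
            if not ok:
                break
            if start in g or not adj[start]:
                continue
            g[start] = 0                       # root of this component
            stack = [start]
            while stack and ok:
                u = stack.pop()
                for v, i in adj[u]:
                    if v not in g:
                        g[v] = (i - g[u]) % n  # force g[u] + g[v] = i (mod n)
                        stack.append(v)
                    elif (g[u] + g[v]) % n != i:
                        ok = False             # conflicting edge: re-seed and retry
                        break
        if ok:
            return lambda k: (g[h1(k)] + g[h2(k)]) % n
    raise RuntimeError("could not build a perfect hash function")

# Usage: each key hashes to its position in the input list (order preserving).
keys = ["alpha", "beta", "gamma", "delta", "epsilon"]
h = build_opmphf(keys)
assert [h(k) for k in keys] == list(range(len(keys)))
```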
Abstract:
Colonius suggests that, in using standard set theory as the language in which to express our computational-level theory of human memory, we would need to violate the axiom of foundation in order to express meaningful memory bindings in which a context is identical to an item in the list. We circumvent Colonius's objection by allowing that a list item may serve as a label for a context without being identical to that context. This debate serves to highlight the value of specifying memory operations in set theoretic notation, as it would have been difficult if not impossible to formulate such an objection at the algorithmic level.
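A minimal sketch of the label-based reading (hypothetical Python classes, not taken from the paper): a list item supplies the label for a context, but the context object is never identical to the item and never a member of itself, so no well-foundedness issue arises.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Item:
    name: str

@dataclass(frozen=True)
class Context:
    label: str                                  # derived from an item, not the item itself
    contents: frozenset = frozenset()

def bind(item: Item, context: Context) -> frozenset:
    """Represent an (item, context) binding as a set of two distinct elements."""
    return frozenset({item, context})

apple = Item("apple")
ctx = Context(label=apple.name)                 # the item labels the context...
assert ctx != apple                             # ...but is not identical to it
assert ctx not in ctx.contents                  # and nothing here contains itself
binding = bind(apple, ctx)
```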
Abstract:
A number of theoretical and experimental investigations have been made into the nature of purlin-sheeting systems over the past 30 years. These systems commonly consist of cold-formed zed- or channel-section purlins connected to corrugated sheeting. They have proven difficult to model due to the complexity of both the purlin deformation and the restraint provided to the purlin by the sheeting. Part 1 of this paper presented a non-linear elasto-plastic finite element model which, by incorporating both the purlin and the sheeting in the analysis, allowed the interaction between the two components of the system to be modelled. This paper presents a simplified version of the first model, with considerably reduced requirements in terms of computer memory, running time and data preparation. The Simplified Model includes only the purlin but allows for the sheeting's shear and rotational restraints by modelling these effects as springs located at the purlin-sheeting connections. Two accompanying programs determine the stiffness of these springs numerically. As in the Full Model, the Simplified Model is able to account for the cross-sectional distortion of the purlin, the shear and rotational restraining effects of the sheeting, and failure of the purlin by local buckling or yielding. The model requires no experimental or empirical input, and its validity is shown by its good correlation with experimental results. (C) 1997 Elsevier Science Ltd.
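As a deliberately simplified illustration of the spring idea only (a linear, one-dimensional beam sketch in Python/NumPy, nothing like the non-linear elasto-plastic shell model in the paper; the element type, DOF layout and stiffness values are my assumptions), the sheeting's shear and rotational restraints can be introduced as diagonal spring terms at the connection DOFs:

```python
import numpy as np

def beam_element_stiffness(EI, L):
    """4x4 Euler-Bernoulli beam element stiffness (DOFs: w1, theta1, w2, theta2)."""
    c = EI / L**3
    return c * np.array([[ 12.0,   6*L, -12.0,   6*L],
                         [  6*L, 4*L*L,  -6*L, 2*L*L],
                         [-12.0,  -6*L,  12.0,  -6*L],
                         [  6*L, 2*L*L,  -6*L, 4*L*L]])

def assemble_purlin(n_elems, EI, span, k_shear, k_rot, connection_nodes):
    """Global stiffness of a purlin idealised as a simple beam, with the
    sheeting's restraint represented by a translational (shear) spring and a
    rotational spring at each purlin-sheeting connection node. The spring
    stiffnesses are treated as externally supplied inputs."""
    L = span / n_elems
    n_nodes = n_elems + 1
    K = np.zeros((2 * n_nodes, 2 * n_nodes))
    ke = beam_element_stiffness(EI, L)
    for e in range(n_elems):                    # assemble beam elements
        dofs = [2*e, 2*e + 1, 2*e + 2, 2*e + 3]
        K[np.ix_(dofs, dofs)] += ke
    for node in connection_nodes:               # springs add directly to the diagonal
        K[2*node, 2*node]         += k_shear    # shear (translational) restraint
        K[2*node + 1, 2*node + 1] += k_rot      # rotational restraint
    return K

# Usage sketch with arbitrary illustrative numbers:
K = assemble_purlin(n_elems=10, EI=2.0e5, span=6.0, k_shear=5.0e3, k_rot=2.0e2,
                    connection_nodes=range(0, 11, 2))
print(K.shape)                                  # (22, 22) stiffness including the springs
```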
Abstract:
It has been claimed that the symptoms of post-traumatic stress disorder (PTSD) can be ameliorated by eye-movement desensitization-reprocessing therapy (EMD-R), a procedure that involves the individual making saccadic eye-movements while imagining the traumatic event. We hypothesized that these eye-movements reduce the vividness of distressing images by disrupting the function of the visuospatial sketchpad (VSSP) of working memory, and that by doing so they reduce the intensity of the emotion associated with the image. This hypothesis was tested by asking non-PTSD participants to form images of neutral and negative pictures under dual task conditions. Their images were less vivid with concurrent eye-movements and with a concurrent spatial tapping task that did not involve eye-movements. In the first three experiments, these secondary tasks did not consistently affect participants' emotional responses to the images. However, Expt 4 used personal recollections as stimuli for the imagery task, and demonstrated a significant reduction in emotional response under the same dual task conditions. These results suggest that, if EMD-R works, it does so by reducing the vividness and emotiveness of traumatic images via the VSSP of working memory. Other visuospatial tasks may also be of therapeutic value.
Abstract:
When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
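A minimal sketch of the augmentation idea (Python/NumPy; the matrix names, dimensions, and the small pseudo-observation variance used for numerical stability are my assumptions rather than the paper's exact formulation): each time-varying constraint R_t alpha_t = r_t is stacked onto the observation equation as a near-noiseless extra observation before running the standard Kalman recursions.

```python
import numpy as np

def constrained_kalman_filter(y, Z, h, T, Q, R, r, a0, P0, eps=1e-10):
    """Kalman filter with time-varying linear constraints R[t] @ alpha_t = r[t],
    imposed by augmenting the observation equation with near-noiseless
    pseudo-observations.

    y : (n,) observations            Z : (n, 1, k) observation vectors
    h : observation noise variance   T, Q : (k, k) transition / state noise
    R : (n, c, k) constraint matrices, r : (n, c) constraint values
    """
    a, P = np.asarray(a0, float), np.asarray(P0, float)
    out = []
    for t in range(len(y)):
        Zt = np.vstack([Z[t], R[t]])                 # (1 + c, k) augmented design
        yt = np.concatenate([[y[t]], r[t]])          # (1 + c,) augmented observation
        Ht = np.diag([h] + [eps] * len(r[t]))        # tiny variance on the constraint rows
        a_pred = T @ a                               # prediction step
        P_pred = T @ P @ T.T + Q
        v = yt - Zt @ a_pred                         # update with the augmented observation
        F = Zt @ P_pred @ Zt.T + Ht
        K = P_pred @ Zt.T @ np.linalg.inv(F)
        a = a_pred + K @ v
        P = (np.eye(len(a)) - K @ Zt) @ P_pred
        out.append(a)
    return np.array(out)

# Usage sketch: two time-varying regression coefficients constrained to sum to 1.
n, k = 50, 2
rng = np.random.default_rng(0)
states = constrained_kalman_filter(y=rng.normal(size=n), Z=rng.normal(size=(n, 1, k)),
                                   h=1.0, T=np.eye(k), Q=0.01 * np.eye(k),
                                   R=np.ones((n, 1, k)), r=np.ones((n, 1)),
                                   a0=np.zeros(k), P0=np.eye(k))
assert np.allclose(states.sum(axis=1), 1.0, atol=1e-4)
```

In practice the unknown hyperparameters (here h and Q) would be chosen by the numerical optimisation routines mentioned in the abstract; the sketch only shows the filtering pass for fixed values.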
Abstract:
The dispersion model with mixed boundary conditions uses a single parameter, the dispersion number, to describe the hepatic elimination of xenobiotics and endogenous substances. An implicit a priori assumption of the model is that the transit time density of intravascular indicators is approximated by an inverse Gaussian distribution. This approximation is limited in that the model poorly describes the tail part of the hepatic outflow curves of vascular indicators. A sum of two inverse Gaussian functions is proposed as an alternative, more flexible empirical model for transit time densities of vascular references. This model suggests that a more accurate description of the tail portion of vascular reference curves yields an elimination rate constant (or intrinsic clearance) which is 40% less than predicted by the dispersion model with mixed boundary conditions. The results emphasize the need to accurately describe outflow curves in using them as a basis for determining pharmacokinetic parameters using hepatic elimination models. (C) 1997 Society for Mathematical Biology.
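For illustration, a hedged sketch of the proposed density (Python/NumPy; the mean/shape parameterisation of the inverse Gaussian and the mixing-weight form are my assumptions, not necessarily the paper's). The slower, more dispersed second component is what lets the mixture capture the tail of the outflow curve that a single inverse Gaussian misses.

```python
import numpy as np

def inverse_gaussian_pdf(t, mu, lam):
    """Inverse Gaussian density with mean mu and shape parameter lam."""
    t = np.asarray(t, dtype=float)
    return np.sqrt(lam / (2 * np.pi * t**3)) * np.exp(-lam * (t - mu)**2 / (2 * mu**2 * t))

def two_ig_transit_density(t, w, mu1, lam1, mu2, lam2):
    """Weighted sum of two inverse Gaussian densities (0 <= w <= 1), used as a
    flexible empirical model for the transit time density of a vascular
    reference; a larger mu2 and smaller lam2 give the heavier tail."""
    return (w * inverse_gaussian_pdf(t, mu1, lam1)
            + (1 - w) * inverse_gaussian_pdf(t, mu2, lam2))

# Usage sketch with arbitrary illustrative parameters:
t = np.linspace(0.1, 300.0, 3000)               # seconds
f = two_ig_transit_density(t, w=0.8, mu1=10.0, lam1=40.0, mu2=25.0, lam2=15.0)
print(np.trapz(f, t))                           # approximately 1: a proper density
```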
Abstract:
This paper offers a defense of backwards-in-time causation models in quantum mechanics. Particular attention is given to Cramer's transactional account, which is shown to have the threefold virtue of solving the Bell problem, explaining the complex conjugate aspect of the quantum mechanical formalism, and explaining various quantum mysteries such as Schrödinger's cat. The question is therefore asked, why has this model not received more attention from physicists and philosophers? One objection given by physicists in assessing Cramer's theory was that it is not testable. This paper seeks to answer this concern by utilizing an argument that backwards causation models entail a fork theory of causal direction. From the backwards causation model together with the fork theory one can deduce empirical predictions. Finally, the objection that this strategy is questionable because of its appeal to philosophy is deflected.