38 results for Novel of memory
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources as, typically, few gene sequences can be simultaneously stored in primary memory. The standard practice in such computation is to use frequent input/output (I/O) operations. Therefore, minimizing the number of these operations will yield much faster run-times. This paper develops an approach for the faster and scalable computing of large-size correlation matrices through the full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over the existing approaches is demonstrated through benchmark examples.
Abstract:
The adsorption of low-energy C20 isomers on the diamond (0 0 1)–(2×1) surface was investigated by molecular dynamics simulation using the Brenner potential. The energy dependence of the chemisorption characteristics was studied. We found that there exists an energy threshold for chemisorption of C20 to occur. Between 10 and 20 eV, the C20 fullerene has a high probability of chemisorption and the adsorbed cage retains its original structure, which supports the experimental observations of memory effects. However, the structures of the adsorbed bowl and ring C20 were different from their original ones. In this case, the local order in cluster-assembled films would differ from that of the free clusters.
Abstract:
Introduction Road safety researchers rely heavily on self-report data to explore the aetiology of crash risk. However, researchers consistently acknowledge a range of limitations associated with this methodological approach (e.g., self-report bias), which has been hypothesised to reduce the predictive efficacy of scales. Although well researched in other areas, one important factor often neglected in road safety studies is the fallibility of human memory. Given that accurate recall is a key assumption in many studies, the validity and consistency of self-report data warrant investigation. The aim of the current study was to examine the consistency of self-reported crash history and details of the most recent reported crash on two separate occasions. Materials & Method A repeated measures design was utilised to examine the self-reported crash involvement history of 214 general motorists over a two-month period. Results A number of interesting discrepancies were noted in relation to the number of lifetime crashes reported by the participants and the descriptions of their most recent crash across the two occasions. Of the 214 participants who reported having been involved in a crash, 35 (22.3%) reported a lower number of lifetime crashes at Time 2 than at Time 1. Of the 88 drivers who reported no change in number of lifetime crashes, 10 (11.4%) described a different most recent crash. Additionally, of the 34 reporting an increase in the number of lifetime crashes, 29 (85.3%) described the same crash on both occasions. Assessed as a whole, at least 47.1% of participants made a confirmed mistake at Time 1 or Time 2. Conclusions These results raise some doubt regarding the accuracy of memory recall across time. Given that self-reported crash involvement is the predominant dependent variable used in the majority of road safety research, this issue warrants further investigation.
Replication of the study with a larger sample size that includes multiple recall periods would enhance understanding of the significance of this issue for road safety methodology.
Abstract:
This thesis is a study of how the contents of volatile memory on the Windows operating system can be better understood and utilised for the purposes of digital forensic investigations. It proposes several techniques to improve the analysis of memory, with a focus on improving the detection of unknown code such as malware. These contributions allow the creation of a more complete reconstruction of the state of a computer at acquisition time, including whether or not the computer has been infected by malicious code.
Abstract:
The study of memory in most behavioral paradigms, including emotional memory paradigms, has focused on the feedforward components that underlie Hebb’s first postulate, associative synaptic plasticity. Hebb’s second postulate argues that activated ensembles of neurons reverberate in order to provide temporal coordination of different neural signals, and thereby facilitate coincidence detection. Recent evidence from our groups has suggested that the lateral amygdala (LA) contains recurrent microcircuits and that these may reverberate. Additionally, this reverberant activity is precisely timed, with latencies that would facilitate coincidence detection between cortical and subcortical afferents to the LA. Thus, recent data at the microcircuit level in the amygdala provide some physiological evidence in support of the second Hebbian postulate.
Abstract:
In this paper we present a cryptanalysis of a new 256-bit hash function, FORK-256, proposed by Hong et al. at FSE 2006. This cryptanalysis is based on some unexpected differentials existing for the step transformation. We show their possible uses in different attack scenarios by giving a 1-bit (resp. 2-bit) near-collision attack against the full compression function of FORK-256 with complexity 2^125 (resp. 2^120) and negligible memory, and by exhibiting a 22-bit near pseudo-collision. We also show that we can find collisions for the full compression function with a small amount of memory with complexity not exceeding 2^126.6 hash evaluations. We further show how to reduce this complexity to 2^109.6 hash computations by using 2^73 memory words. Finally, we show that this attack can be extended with no additional cost to find collisions for the full hash function, i.e. with the predefined IV.
Abstract:
The generation of a correlation matrix for a set of genomic sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. Each sequence may be millions of bases long and there may be thousands of such sequences which we wish to compare, so not all sequences may fit into main memory at the same time. Each sequence needs to be compared with every other sequence, so we will generally need to page some sequences in and out more than once. In order to minimize execution time we need to minimize this I/O. This paper develops an approach for faster and scalable computing of large-size correlation matrices through the maximal exploitation of available memory and reducing the number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different bioinformatics problems with different correlation matrix sizes. The significant performance improvement of the approach over previous work is demonstrated through benchmark examples.
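The paging problem this abstract describes can be sketched generically. The following is an illustrative blocked (tiled) scheme, not the paper's algorithm; `load_sequence` and `correlate` are hypothetical placeholders for disk reads and pairwise scoring, and `capacity` is the number of sequences that fit in memory at once:

```python
# Illustrative blocked strategy for an all-pairs correlation matrix when
# only `capacity` sequences fit in memory. Half the memory holds a
# resident block; the other half streams the remaining blocks past it,
# so each sequence is loaded O(n/block) times instead of O(n).
def blocked_correlation(n, capacity, load_sequence, correlate):
    """Compute the upper triangle of an n x n correlation matrix,
    holding at most `capacity` sequences in memory at a time."""
    assert capacity >= 2
    block = capacity // 2
    result = {}
    loads = 0  # count of sequence loads (the I/O we want to minimize)
    for i0 in range(0, n, block):
        # resident block: loaded once, reused against every later block
        rows = {i: load_sequence(i) for i in range(i0, min(i0 + block, n))}
        loads += len(rows)
        # pairs entirely within the resident block
        ids = sorted(rows)
        for a in range(len(ids)):
            for b in range(a + 1, len(ids)):
                result[(ids[a], ids[b])] = correlate(rows[ids[a]], rows[ids[b]])
        # stream each remaining block past the resident one
        for j0 in range(i0 + block, n, block):
            cols = {j: load_sequence(j) for j in range(j0, min(j0 + block, n))}
            loads += len(cols)
            for i in rows:
                for j in cols:
                    result[(i, j)] = correlate(rows[i], cols[j])
    return result, loads
```

For example, with n = 5 sequences and capacity for 4, all 10 pairs are produced with 9 loads, versus the 20 loads of a naive scheme that reloads both sequences for every pair; larger blocks reduce I/O further, which is the trade-off the paper exploits.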
Abstract:
The hippocampus is an anatomically distinct region of the medial temporal lobe that plays a critical role in the formation of declarative memories. Here we show that a computer simulation of simple compartmental cells organized with basic hippocampal connectivity is capable of producing stimulus-intensity-sensitive wide-band fluctuations of spectral power similar to those seen in real EEG. While previous computational models have been designed to assess the viability of the putative mechanisms of memory storage and retrieval, they have generally been too abstract to allow comparison with empirical data. Furthermore, while the anatomical connectivity and organization of the hippocampus is well defined, many questions regarding the mechanisms that mediate large-scale synaptic integration remain unanswered. For this reason we focus less on the specifics of changing synaptic weights and more on the population dynamics. Spectral power in four distinct frequency bands was derived from simulated field potentials of the computational model and found to depend on the intensity of a random input. The majority of power occurred in the lowest frequency band (3-6 Hz) and was greatest for the lowest-intensity stimulus condition (1% maximal stimulus). In contrast, higher frequency bands ranging from 7-45 Hz showed an increase in power directly related to an increase in stimulus intensity. This trend continues up to a stimulus level of 15% to 20% of the maximal input, above which power falls dramatically. These results suggest that the relative power of intrinsic network oscillations is dependent upon the level of activation and that above threshold levels all frequencies are damped, perhaps due to overactivation of inhibitory interneurons.
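The band-power measure used above is standard; a minimal sketch (not the authors' model) of extracting power in fixed frequency bands from a simulated field potential, using NumPy's FFT, with the band edges taken from the abstract (3-6 Hz and 7-45 Hz) and a synthetic test signal standing in for the model output:

```python
import numpy as np

def band_power(signal, fs, bands):
    """Return total spectral power per (low, high) frequency band in Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {b: psd[(freqs >= b[0]) & (freqs <= b[1])].sum() for b in bands}

fs = 250.0                          # sampling rate in Hz (an assumption)
t = np.arange(0, 10, 1.0 / fs)      # 10 s of simulated signal
rng = np.random.default_rng(0)
# stand-in field potential: a strong 4 Hz component plus broadband noise
lfp = 2.0 * np.sin(2 * np.pi * 4 * t) + 0.3 * rng.standard_normal(t.size)
powers = band_power(lfp, fs, [(3, 6), (7, 45)])
```

With this synthetic input, power in the 3-6 Hz band dominates the 7-45 Hz band, mirroring the low-frequency dominance the abstract reports for weak stimuli.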
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to the hash-then-sign digital signature schemes such as DSS and RSA in order to free their reliance on the collision resistance property of the hash functions. They have shown that to forge a RMX-hash-then-sign signature scheme, one has to solve a cryptanalytical task which is related to finding second preimages for the hash function. In this article, we will show how to use Dean’s method of finding expandable messages for finding a second preimage in the Merkle-Damgård hash function to existentially forge a signature scheme based on a t-bit RMX-hash function which uses the Davies-Meyer compression functions (e.g., MD4, MD5, SHA family) in 2^(t/2) chosen messages plus 2^(t/2) + 1 off-line operations of the compression function and a similar amount of memory. This forgery attack also works on the signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
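The structure the attack exploits is the Merkle-Damgård iteration, where each message block updates a chaining value; expandable messages are families of different-length messages that all reach the same chaining value. A toy sketch of that iteration (the compression function here is an arbitrary stand-in built from SHA-256, not a real Davies-Meyer construction, and length padding is omitted for brevity):

```python
# Toy Merkle-Damgard iteration: h_i = f(h_{i-1}, m_i). Dean-style
# expandable messages exploit exactly this chaining structure.
import hashlib

def toy_compress(chaining_value, block):
    # stand-in compression function, truncated to a small chaining value
    return hashlib.sha256(chaining_value + block).digest()[:8]

def md_hash(message, iv=b"\x00" * 8, block_size=8):
    # pad to a whole number of blocks (real length padding omitted)
    if len(message) % block_size:
        message += b"\x00" * (block_size - len(message) % block_size)
    h = iv
    for i in range(0, len(message), block_size):
        h = toy_compress(h, message[i:i + block_size])
    return h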
Abstract:
Peggy Shaw’s RUFF (USA 2013) and Queensland Theatre Company’s collaboration with Queensland University of Technology, Total Dik! (Australia 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of Chroma Key to reveal the tensions in their production and add layers to their performances. In doing so they offer invaluable insight into where the filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer’s New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices in light of increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key ‘studio’ which facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, use of non-composited chroma-key technique as a theatrical device and the work’s taciturn revelation of the production process during performance play a central role in its exploration of the juxtaposition between its reconstructed form and content. In contrast, Total Dik! uses real-time green screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work’s examination of an isolated Dictator. The ‘studio’ is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out.
Building on RUFF, and other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern’s God’s Beard (2012), this work blends a convergence of mobile technologies, models, and green screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as metaphor and challenge to the language of theatre. It becomes, or rather acts, as a ‘sign’ that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik!, it is also a place where, as a mode of production and subsequent reveal, it adds weight to performance. These works are informed by Auslander (1999) and Giesekam (2007) and speak to and echo Lehmann’s Postdramatic Theatre (2006). This paper’s consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combinatory use in a live performance environment.
Abstract:
Memory T cells develop early during the preclinical stages of autoimmune diseases and have traditionally been considered resistant to tolerance induction. As such, they may represent a potent barrier to the successful immunotherapy of established autoimmune diseases. It was recently shown that memory CD8+ T cell responses are terminated when Ag is genetically targeted to steady-state dendritic cells. However, under these conditions, inactivation of memory CD8+ T cells is slow, allowing transiently expanded memory CD8+ T cells to exert tissue-destructive effector function. In this study, we compared different Ag-targeting strategies and show, using an MHC class II promoter to drive Ag expression in a diverse range of APCs, that CD8+ memory T cells can be rapidly inactivated by MHC class II+ hematopoietic APCs through a mechanism that involves a rapid and sustained downregulation of the TCR; as a result, the effector response of CD8+ memory cells is rapidly truncated and destruction of Ag-expressing target tissue is prevented. Our data provide the first demonstration that genetically targeting Ag to a broad range of MHC class II+ APC types is a highly efficient way to terminate memory CD8+ T cell responses, preventing tissue-destructive effector function and potentially established autoimmune diseases. Copyright © 2010 by The American Association of Immunologists, Inc.
Abstract:
Review of Memory and Gender in Medieval Europe, 900-1200 by Elizabeth van Houts (Toronto UP, 1999).
Abstract:
For Bakhtin, it is always important to know from where one speaks. The place from which I speak is that of a person who grew up in Italy during the economic miracle (pre-1968) in a working class family, watching film matinees on television during school holidays. All sorts of films and genres were shown: from film noir to westerns, to Jean Renoir's films, German expressionism, Italian neorealism and Italian comedy. Cinema has come to represent over time a sort of memory extension that supplements lived memory of events, and one which, especially, mediates the intersection of many cultural discourses. When later in life I moved to Australia and started teaching in film studies, my choice of a film that was emblematic of neorealism went naturally to Roma città aperta (hereafter Open City) by Roberto Rossellini (1945), and not to Paisan or Sciuscià or Bicycle Thieves. My choice was certainly grounded in my personal memory - especially those aspects transmitted to me by my parents, who lived through the war and maintained that Open City had truly made them cry. With a mother who voted for the Christian Democratic Party and a father who was a unionist, I thought that this was normal in Italian families and society. In the early 1960s, the Resistance still offered a narrative of suffering and redemption, shared by Catholics and Communists. This construction of psychological realism is what I believe Open City continues to offer in time.
Abstract:
From an ecological perspective, knowledge signifies the degree of fit between a performer and his/her environment. From this viewpoint, the role of training is to enhance this degree of fit between a specific athlete and the performance environment, rather than to enrich memory in the performer. In this regard, ecological psychology distinguishes between perceptual knowledge, or "knowledge of" the environment, and symbolic knowledge, or "knowledge about" the environment. This distinction elucidates how knowing how to act (knowing of) as well as knowing how to verbalise memorial representations (e.g., a verbal description of performance) (knowing about) are both rooted in perception. In this chapter we demonstrate these types of knowledge in decision-making behaviour and exemplify how they can be presented in 1 v 1 practice task constraints in basketball.
Abstract:
Following Youngjohn, Lees-Haley, and Binder's (1999) comment on Johnson and Lesniak-Karpiak's (1997) study that warnings lead to more subtle malingering, researchers have sought to better understand warning effects. However, such studies have been largely atheoretical and may have confounded warning and coaching. This study examined the effect on malingering of a warning that was based on criminological-sociological concepts derived from the rational choice model of deterrence theory. A total of 78 participants were randomly assigned to a control group, an unwarned simulator group, or one of two warned simulator groups. The warning groups comprised low- and high-level conditions depending on warning intensity. Simulator participants received no coaching about how to fake tests. Outcome variables were scores derived from the Test of Memory Malingering and Wechsler Memory Scale-III. When the rate of malingering was compared across the four groups, a high-level warning effect was found such that warned participants were significantly less likely to exaggerate than unwarned simulators. In an exploratory follow-up analysis, the warned groups were divided into those who reported malingering and those who did not report malingering, and the performance of these groups was compared to that of unwarned simulators and controls. Using this approach, results showed that participants who were deterred from malingering by warning performed no worse than controls. However, on a small number of tests, self-reported malingerers in the low-level warning group appeared less impaired than unwarned simulators. This pattern was not observed in the high-level warning condition. Although cautious interpretation of findings is necessitated by the exploratory nature of some analyses, overall results suggest that using a carefully designed warning may be useful for reducing the rate of malingering. 
Despite low power and the small size of some groups, the noteworthy effect sizes observed suggest that further investigation is needed to determine the effects of warnings more fully.