119 results for Graphic Memory
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. The computation is not only intensive but also demands significant memory resources because, typically, only a few gene sequences can be stored in primary memory at once. The standard practice in such computation is to use frequent input/output (I/O) operations, so minimizing the number of these operations yields much faster run-times. This paper develops an approach for the faster and scalable computation of large correlation matrices through full use of the available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and applied to problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
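The core idea, blocking the computation so that only a bounded number of sequence vectors resides in memory at once, can be sketched as follows. This is a minimal illustration (assuming sequences already encoded as numeric NumPy vectors), not the paper's actual algorithm:

```python
# Block-wise Pearson correlation matrix: only `block_size` standardised rows
# are combined at a time, mimicking limited primary memory. In a true
# out-of-core setting, each block slice would be one disk read.
import numpy as np

def blocked_correlation(data, block_size):
    """Full correlation matrix of the rows of `data` (n x m), tile by tile."""
    n = data.shape[0]
    # Standardise rows once (zero mean, unit norm) so correlation
    # reduces to a dot product of standardised rows.
    z = data - data.mean(axis=1, keepdims=True)
    z /= np.linalg.norm(z, axis=1, keepdims=True)
    corr = np.empty((n, n))
    for i in range(0, n, block_size):
        bi = z[i:i + block_size]              # "read" block i
        for j in range(i, n, block_size):     # upper triangle of tiles only
            bj = z[j:j + block_size]          # "read" block j
            c = bi @ bj.T                     # correlations for this tile
            corr[i:i + block_size, j:j + block_size] = c
            corr[j:j + block_size, i:i + block_size] = c.T  # use symmetry
    return corr

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 100))
assert np.allclose(blocked_correlation(X, 2), np.corrcoef(X))
```

Exploiting symmetry halves the number of tile computations, which is one way the number of block reads (and hence I/O operations) can be reduced.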
Abstract:
Educational reforms currently being enacted in Kuwaiti Family and Consumer Sciences (FCS) in response to contemporary demands for increased student-centred teaching and learning are challenging for FCS teachers due to their limited experience with student-centred learning tools such as Graphic Organisers (GOs). To adopt these reforms, Kuwaiti teachers require a better understanding of, and competency in, promoting the cognitive learning processes that maximise student-centred learning approaches. This study followed the experiences of four Grade 6 FCS Kuwaiti teachers as they undertook a Professional Development (PD) program specifically designed to advance their understanding of the use of GOs, and then as they implemented what they had learned in their Grade 6 FCS classrooms. The PD program developed for this study was informed by Nasseh's competency PD model as well as Piaget's and Ausubel's cognitive theories. This model enabled an assessment and evaluation of the development of the teachers' competencies as an outcome of the PD program, in terms of their adoption of GOs in particular, and their capacity to use GOs to engage students in personalised, in-depth learning through critical thinking and understanding. The research revealed that the PD program was influential in reforming the teachers' learning, understanding of, and competency in, cognitive and visual theories of learning, so that they facilitated student-centred teaching and learning processes that enabled students to adopt and adapt GOs in constructivist learning. The implementation of five GOs (Flow Chart, Concept Map, K-W-L Chart, Fishbone Diagram and Venn Diagram) as learning tools in classrooms was investigated to determine whether changes in pedagogical approach, supporting conceptual learning through cognitive information processing, would reduce students' cognitive workload and produce better learning approaches.
The study, as evidenced by the participant teachers' responses and classroom observations, showed a marked increase in student interest, participation, critical thought and problem-solving skills as a result of using GOs, compared with traditional teaching and learning methods. A theoretical model was developed from the study based on the premise that teachers' knowledge of the subject, pedagogy and student learning precedes the implementation of student-centred learning reform, plays an important role in that implementation, and brings about change in teaching practice. The model affirmed that the observed change in teaching practice included aspects of teachers' beliefs, as well as their confidence and its effect on the workplace and on student learning, including engagement, understanding, critical thinking and problem solving. The model assumed that change in teaching practice is inseparable from teachers' lifelong PD needs related to knowledge, understanding, skills and competency. These findings produced a set of preliminary guidelines for establishing student-centred constructivist strategies in Kuwaiti education while retaining Kuwait's cultural uniqueness.
Abstract:
The world is rapidly ageing. Against this backdrop, increasing incidences of dementia are being reported worldwide, with Alzheimer's disease (AD) the most common form of dementia in the elderly. It is estimated that AD affects almost 4 million people in the US and costs the US economy more than 65 million dollars annually. There is currently no cure for AD, but various therapeutic agents have been employed in attempts to slow the progression of the illness, one of which is oestrogen. Over recent decades, scientists have focused mainly on the roles of oestrogen in the prevention and treatment of AD. Newer evidence suggests that testosterone might also be involved in the pathogenesis of AD. Although the exact mechanisms by which androgens might affect AD are still largely unknown, testosterone is known to act directly via androgen receptor-dependent mechanisms or indirectly by converting to oestrogen to exert this effect. Clinical trials need to be conducted to ascertain the putative role of androgen replacement in Alzheimer's disease.
Abstract:
In this paper, the deposition of C-20 fullerenes on a diamond (001)-(2x1) surface and the fabrication of a C-20 thin film at 100 K were investigated by molecular dynamics (MD) simulation using the many-body Brenner bond-order potential. First, we found that the collision dynamics of a single C-20 fullerene on a diamond surface depended strongly on its impact energy. Within the energy range 10-45 eV, the C-20 fullerene chemisorbed on the surface and retained its free-cage structure. This is consistent with the experimental observation, where it was called the memory effect in "C-20-type" films [P. Melinon et al., Int. J. Mod. Phys. B 9, 339 (1995); P. Milani et al., Cluster Beam Synthesis of Nanostructured Materials (Springer, Berlin, 1999)]. Next, more than one hundred C-20 fullerenes (10-25 eV) were deposited one after another onto the surface. The initial growth stage of the C-20 thin film was observed to follow the three-dimensional island mode. The randomly deposited C-20 fullerenes stacked on the diamond surface and acted as building blocks, forming a polymerlike structure. The assembled film was also highly porous due to cluster-cluster interaction. The bond-angle distribution and the neighbor-atom-number distribution of the film presented a well-defined local order of sp(3) hybridization character, the same as that of a free C-20 cage. These simulation results are again in good agreement with the experimental observation. Finally, the deposited C-20 film showed high stability even when the temperature was raised to 1500 K.
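The time-stepping machinery underlying such an MD simulation can be shown in miniature. The sketch below integrates a single pair of atoms under a simple Lennard-Jones potential with the velocity-Verlet scheme; the Brenner bond-order potential used in the paper is a far more involved many-body form, so the potential, parameters and units here are assumed stand-ins for illustration only:

```python
# Velocity-Verlet integration of a Lennard-Jones dimer (1-D separation r).
# Illustrates the MD update cycle only; not the Brenner potential.
import numpy as np

def lj_force(r, eps=1.0, sigma=1.0):
    """Radial force F(r) = -dU/dr for U = 4*eps*((sigma/r)^12 - (sigma/r)^6)."""
    return 24 * eps * (2 * (sigma / r)**12 - (sigma / r)**6) / r

def velocity_verlet(r, v, dt, steps, m=1.0):
    f = lj_force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * (f / m) * dt**2   # position update
        f_new = lj_force(r)                   # force at new position
        v += 0.5 * (f + f_new) / m * dt       # velocity update (averaged force)
        f = f_new
    return r, v

# Start near the LJ minimum (r = 2**(1/6)*sigma) with a small kick:
# the dimer oscillates about the minimum rather than dissociating.
r, v = velocity_verlet(2**(1/6), 0.05, dt=0.005, steps=2000)
```

Velocity Verlet is the standard symplectic integrator in MD because it conserves energy well over long runs; a production deposition simulation would apply the same cycle to thousands of atoms with a many-body force routine and a thermostat holding the substrate at 100 K.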
Abstract:
There has been a renewal of interest in memory studies in recent years, particularly in the Western world. This chapter considers aspects of personal memory followed by the concept of cultural memory. It then examines how the Australian cultural memory of the Anzac Legend is represented in a number of recent picture books.
Abstract:
Children are encountering more and more graphic representations of data in their learning and everyday life. Much of this data occurs in quantitative forms as different forms of measurement are incorporated into the graphics during their construction. In their formal education, children are required to learn to use a range of these quantitative representations in subjects across the school curriculum. Previous research that focuses on the use of information processing and traditional approaches to cognitive psychology concludes that the development of an understanding of such representations of data is a complex process. An alternative approach is to investigate the experiences of children as they interact with graphic representations of quantitative data in their own life-worlds. This paper demonstrates how a phenomenographic approach may be used to reveal the qualitatively different ways in which children in Australian primary and secondary education understand the phenomenon of graphic representations of quantitative data. Seven variations of the children’s understanding were revealed. These have been described interpretively in the article and confirmed through the words of the children. A detailed outcome space demonstrates how these seven variations are structurally related.
Abstract:
Human memory is a complex neurocognitive process. By combining psychological and molecular genetics expertise, we examined the APOE ε4 allele, a known risk factor for Alzheimer's disease, and the COMT Val158 polymorphism, previously implicated in schizophrenia, for association with lowered memory functioning in healthy adults. To assess memory type, we used a range of memory tests covering both retrospective and prospective memory. Genotypes were determined using RFLP analysis and compared with mean memory scores using univariate ANOVAs. Despite a modest sample size (n=197), our study found a significant effect of the APOE ε4 polymorphism on prospective memory. Supporting our hypothesis, a significant difference was demonstrated between genotype groups in mean Comprehensive Assessment of Prospective Memory total score (p=0.036; ε4 alleles=1.99; all other alleles=1.86). In addition, we demonstrate a significant interactive effect between the APOE ε4 and COMT polymorphisms on semantic memory. This is the first study to investigate both APOE and COMT genotypes in relation to memory in non-pathological adults, and it provides important information regarding the effect of genetic determinants on human memory.
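The style of analysis described, comparing mean memory scores between genotype groups with a univariate ANOVA, can be sketched with SciPy. The scores below are invented for illustration and are not the study's data:

```python
# One-way ANOVA comparing hypothetical prospective-memory totals between
# an APOE epsilon-4 carrier group and a non-carrier group.
from scipy.stats import f_oneway

e4_carriers  = [1.90, 2.10, 2.00, 1.95, 2.05]  # hypothetical CAPM totals
non_carriers = [1.80, 1.90, 1.85, 1.88, 1.82]  # hypothetical CAPM totals

f_stat, p_value = f_oneway(e4_carriers, non_carriers)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

With two groups, a one-way ANOVA is equivalent to an independent-samples t-test (F = t²); testing an interaction between two polymorphisms, as in the semantic-memory result, would instead use a two-way (factorial) ANOVA.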
Abstract:
This practice-led project has two outcomes: a collection of short stories titled 'Corkscrew Section', and an exegesis. The short stories combine written narrative with visual elements such as images and typographic devices, while the exegesis analyses the function of these graphic devices within adult literary fiction. My creative writing explores a variety of genres and literary styles, but almost all of the stories are concerned with fusing verbal and visual modes of communication. The exegesis adopts the interpretive paradigm of multimodal stylistics, which aims to analyse graphic devices with the same level of detail as linguistic analysis. Within this framework, the exegesis compares and extends previous studies to develop a systematic method for analysing how the interactions between language, images and typography create meaning within multimodal literature.
Abstract:
A key question in neuroscience is how memory is selectively allocated to neural networks in the brain. This question remains a significant research challenge, in rodent models and humans alike, because of the inherent difficulty of tracking and deciphering the large, high-dimensional neuronal ensembles that support memory (i.e., the engram). In a previous study we showed that consolidation of a new fear memory is allocated to a common topography of amygdala neurons. When a consolidated memory is retrieved, it may enter a labile state, requiring reconsolidation for it to persist. What is not known is whether the original spatial allocation of a consolidated memory changes during reconsolidation. Knowledge about the spatial allocation of a memory during consolidation and reconsolidation provides fundamental insight into its core physical structure (i.e., the engram). Using design-based stereology, we operationally define reconsolidation by showing a nearly identical quantity of neurons in the dorsolateral amygdala (LAd) expressing a plasticity-related protein, phosphorylated mitogen-activated protein kinase, following both memory acquisition and retrieval. Next, we confirm that Pavlovian fear conditioning recruits a stable, topographically organized population of activated neurons in the LAd. When the stored fear memory was briefly reactivated in the presence of the relevant conditioned stimulus, a similar topography of activated neurons was uncovered. In addition, we found evidence of activated neurons allocated to new regions of the LAd. These findings provide the first insight into the spatial allocation of a fear engram in the LAd during its consolidation and reconsolidation phases.
Abstract:
Pavlovian fear conditioning is a robust technique for examining behavioral and cellular components of fear learning and memory. In fear conditioning, the subject learns to associate a previously neutral stimulus with an inherently noxious co-stimulus. The learned association is reflected in the subjects' behavior upon subsequent re-exposure to the previously neutral stimulus or the training environment. Using fear conditioning, investigators can obtain a large amount of data that describe multiple aspects of learning and memory. In a single test, researchers can evaluate functional integrity in fear circuitry, which is both well characterized and highly conserved across species. Additionally, the availability of sensitive and reliable automated scoring software makes fear conditioning amenable to high-throughput experimentation in the rodent model; thus, this model of learning and memory is particularly useful for pharmacological and toxicological screening. Due to the conserved nature of fear circuitry across species, data from Pavlovian fear conditioning are highly translatable to human models. We describe equipment and techniques needed to perform and analyze conditioned fear data. We provide two examples of fear conditioning experiments, one in rats and one in mice, and the types of data that can be collected in a single experiment. © 2012 Springer Science+Business Media, LLC.
Abstract:
Pavlovian fear conditioning, also known as classical fear conditioning, is an important model in the study of the neurobiology of normal and pathological fear. Progress in the neurobiology of Pavlovian fear also enhances our understanding of disorders such as posttraumatic stress disorder (PTSD) and assists in developing effective treatment strategies. Here we describe how Pavlovian fear conditioning is a key tool for understanding both the neurobiology of fear and the mechanisms underlying variations in fear-memory strength observed across different phenotypes. First, we discuss how Pavlovian fear models aspects of PTSD. Second, we describe the neural circuits of Pavlovian fear and the molecular mechanisms within these circuits that regulate fear memory. Finally, we show how fear-memory strength is heritable, and describe genes that are specifically linked both to changes in Pavlovian fear behavior and to its underlying neural circuitry. These emerging data begin to define the essential genes, cells and circuits that contribute to normal and pathological fear.
Abstract:
This thesis is a study of how the contents of volatile memory on the Windows operating system can be better understood and utilised for the purposes of digital forensic investigations. It proposes several techniques to improve the analysis of memory, with a focus on improving the detection of unknown code such as malware. These contributions allow the creation of a more complete reconstruction of the state of a computer at acquisition time, including whether or not the computer has been infected by malicious code.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the ways in which the event itself, and the place of the event, has the potential to create its own stories. Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an 'embedded' journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means of collecting more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling. [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia.