958 results for memory access complexity
Abstract:
The use of n-tuple or weightless neural networks as pattern recognition devices is well known (Aleksander and Stonham, 1979). They have some significant advantages over the more common and biologically plausible networks, such as multi-layer perceptrons: n-tuple networks have been used for a variety of tasks, most popularly real-time pattern recognition, and they can be implemented easily in hardware because they use standard random access memories. In operation, a series of images of an object is shown to the network, each being suitably processed and effectively stored in a memory called a discriminator. When another image is shown to the system, it is processed in a similar manner and the system reports whether it recognises the image, that is, whether the image is sufficiently similar to one already taught. If the system is to recognise and discriminate between m objects, it must contain m discriminators, which can require a great deal of memory. This paper describes various ways in which memory requirements can be reduced, including a novel method for multiple-discriminator n-tuple networks used for pattern recognition. With this method, the memory normally required to handle m objects can be used to recognise and discriminate between 2^m − 2 objects.
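The discriminator scheme described above can be sketched in a few lines. This is an illustrative toy, not the paper's method: the class structure, tuple size, and the use of Python sets to emulate 1-bit RAMs are my assumptions.

```python
import random

class Discriminator:
    """One n-tuple discriminator: each RAM node stores the n-bit
    addresses seen during training for its fixed tuple of input bits."""
    def __init__(self, input_bits, n, seed=0):
        rng = random.Random(seed)
        order = list(range(input_bits))
        rng.shuffle(order)                        # random input-to-tuple mapping
        self.tuples = [order[i:i + n] for i in range(0, input_bits, n)]
        self.rams = [set() for _ in self.tuples]  # sets emulate 1-bit RAMs

    def _addresses(self, pattern):
        for tup, ram in zip(self.tuples, self.rams):
            yield ram, tuple(pattern[i] for i in tup)

    def train(self, pattern):
        for ram, addr in self._addresses(pattern):
            ram.add(addr)                         # write a 1 at this address

    def score(self, pattern):
        # number of RAMs that respond with a stored 1
        return sum(addr in ram for ram, addr in self._addresses(pattern))

# One discriminator per class; recognition picks the highest score.
d = Discriminator(input_bits=16, n=4)
d.train([1, 0] * 8)
print(d.score([1, 0] * 8))  # perfect match: all 4 RAMs respond -> 4
```

Recognising m classes means keeping m such discriminators, which is exactly the memory cost the paper sets out to reduce.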
Abstract:
The use of n-tuple or weightless neural networks as pattern recognition devices has been well documented. They have a significant advantage over more common network paradigms, such as the multilayer perceptron, in that they can be easily implemented in digital hardware using standard random access memories. To date, n-tuple networks have predominantly been used as fast pattern classification devices. The paper describes how n-tuple techniques can be used in the hardware implementation of a general auto-associative network.
Abstract:
Recall in many types of verbal memory task is reliably disrupted by the presence of auditory distracters, with verbal distracters frequently proving the most disruptive (Beaman, 2005). A multinomial processing tree model (Schweickert, 1993) is applied to the effects on free recall of background speech from a known or an unknown language. The model reproduces the free recall curve and the impact on memory of verbal distracters for which a lexical entry exists (i.e., verbal items from a known language). The effect of semantic relatedness of distracters within a language is found to depend upon a redintegrative factor thought to reflect the contribution of the speech-production system. The differential impacts of known and unknown languages cannot be accounted for in this way, but the same effects of distraction are observed amongst bilinguals, regardless of distracter language.
Abstract:
Cluttering is a rate-based disorder of fluency, the scope of whose diagnostic criteria currently remains unclear. This paper reports preliminary findings from a larger study which aims to determine whether cluttering can be associated with language disturbances as well as motor and rate-based ones. Subtests from the Mt Wilga High Level Language Test (MWHLLT) were used to determine whether people who clutter (PWC) have word-finding difficulties and use significantly more maze behaviours compared to controls during story re-telling and simple sequencing tasks. Independent t-tests showed that PWC were significantly slower than control participants in lexical access and sentence completion tasks, but returned mixed findings when PWC were required to name items within a semantic category. PWC produced significantly more maze behaviour than controls in a task where participants were required to explain how to undertake commonly performed actions, but no difference in use of maze behaviour was found between the two groups when retelling a story from memory. The implications of these findings are discussed.
Abstract:
Our established understanding of lymphocyte migration suggests that naive and memory T cells travel throughout the body via divergent pathways; naive T cells circulate between blood and lymph, whereas memory T cells additionally migrate through non-lymphoid organs. Evidence is now gradually emerging which suggests such disparate pathways between naive and memory T cells may not strictly hold, and that naive T cells gain access to the non-lymphoid environment in numbers approaching those of memory T cells. We discuss here the evidence for naive T-cell traffic into the non-lymphoid environment, compare and contrast this movement with what is known of memory T cells, and finally discuss the functional importance of why naive T cells might access the parenchymal tissues.
Abstract:
Decades of research attest that memory processes suffer under conditions of auditory distraction. What is less well understood, however, is whether people are able to modify how their memory processes are deployed in order to compensate for the disruptive effects of distraction. The metacognitive approach to memory describes a variety of ways people can exert control over their cognitive processes to optimize performance. Here we describe our recent investigations into how these control processes change under conditions of auditory distraction. We specifically looked at control of encoding, in the form of decisions about how long to study a word when it is presented, and control of memory reporting, in the form of decisions whether to volunteer or withhold retrieved details. Regarding control of encoding, we expected that people would compensate for the disruptive effects of distraction by extending study time under noise. Our results revealed, however, that when exposed to irrelevant speech, people curtail rather than extend study. Regarding control of memory reporting, we expected that people would compensate for the loss of access to memory records by volunteering responses held with lower confidence. Our results revealed, however, that people's reporting strategies do not differ when the memory task is performed in silence or under auditory distraction, although distraction seriously undermines people's confidence in their own responses. Together, our studies reveal novel avenues for investigating the psychological effects of auditory distraction within a metacognitive framework.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
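The final step, feeding benchmark measurements into a model that interpolates between measured problem sizes, might look like this minimal sketch. The benchmark numbers and the choice of linear interpolation are illustrative assumptions, not the paper's data or method.

```python
import bisect

# (local grid points, measured seconds per timestep) from application-specific
# benchmarks; the numbers below are invented for illustration.
bench = [(64, 0.002), (128, 0.009), (256, 0.040), (512, 0.170)]

def predict(points):
    """Predict per-timestep compute cost for an unmeasured local domain size
    by linear interpolation between benchmarked sizes (clamped at the ends)."""
    sizes = [s for s, _ in bench]
    times = [t for _, t in bench]
    if points <= sizes[0]:
        return times[0]
    if points >= sizes[-1]:
        return times[-1]
    i = bisect.bisect_left(sizes, points)
    s0, s1 = sizes[i - 1], sizes[i]
    t0, t1 = times[i - 1], times[i]
    return t0 + (t1 - t0) * (points - s0) / (s1 - s0)

# A deployment scenario: decompose a 1024-point domain over 4 tasks,
# then look up the predicted cost of the loop-based phase for each task.
local = 1024 // 4
print(predict(local))
```

A real model would combine a prediction like this for the array-update phase with a separate one for the halo-exchange phase, parameterised by the task-to-core mapping.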
Abstract:
Background: Depression is highly prevalent among individuals diagnosed with schizophrenia and is associated with an increased risk of suicide. There are currently no evidence-based treatments for low mood within this group. The specific targeting of co-morbid conditions within complex mental health problems lends itself to the development of short-term structured interventions which are relatively easy to disseminate within health services. A brief cognitive intervention, based on a competitive memory theory of depression, is being evaluated in terms of its effectiveness in reducing depression within this group. Methods/Design: This is a single-blind, intention-to-treat, multi-site, randomized controlled trial comparing Positive Memory Training plus Treatment as Usual with Treatment as Usual alone. Participants will be recruited from two NHS Trusts in Southern England. To be eligible, participants must have a DSM-V diagnosis of schizophrenia or schizo-affective disorder and exhibit at least a mild level of depression. Following baseline assessment, eligible participants will be randomly allocated to either the Positive Memory Training plus Treatment as Usual group or the Treatment as Usual group. Outcome will be assessed at the end of treatment (3 months) and at 6 and 9 months post-randomization by assessors blind to group allocation. The primary outcome will be level of depression; secondary outcomes will be severity of psychotic symptoms and cost-effectiveness. Semi-structured interviews will be conducted with all participants allocated to the treatment group so as to explore the acceptability of the intervention. Discussion: Cognitive behaviour therapy is recommended for individuals diagnosed with schizophrenia. However, the number of sessions and length of training required to deliver this intervention have limited its availability.
The current trial will evaluate a short-term structured protocol which targets a co-morbid condition often considered of primary importance by service users. If successful, the intervention will be an important addition to current initiatives aimed at increasing access to psychological therapies for people diagnosed with severe mental health problems. Trial registration: Current Controlled Trials. ISRCTN99485756. Registered 13 March 2014.
Abstract:
Voluntary physical activity improves memory and learning ability in rodents, whereas status epilepticus has been associated with memory impairment. Physical activity and seizures have been associated with enhanced hippocampal expression of BDNF, indicating that this protein may have a dual role in epilepsy. The influence of voluntary physical activity on memory and BDNF expression has been poorly studied in experimental models of epilepsy. In this paper, we have investigated the effect of voluntary physical activity on memory and BDNF expression in mice with pilocarpine-induced epilepsy. Male Swiss mice were assigned to four experimental groups: pilocarpine sedentary (PS), pilocarpine runners (PRs), saline sedentary (SS) and saline runners (SRs). Two days after pilocarpine-induced status epilepticus, the affected mice (PR) and their running controls (SR) were housed with access to a running wheel for 28 days. After that, spatial memory and the expression of the precursor and mature forms of hippocampal BDNF were assessed. PR mice performed better than PS mice in the water maze test. In addition, PR mice had a higher amount of mature BDNF (14 kDa) relative to the total BDNF (14 kDa + 28 kDa + 32 kDa forms) content when compared with PS mice. These results show that voluntary physical activity improved the spatial memory and increased the hippocampal content of mature BDNF of mice with pilocarpine-induced status epilepticus. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Searching a dataset for elements that are similar to a given query element is a core problem in applications that manage complex data, and has been aided by metric access methods (MAMs). A growing number of applications require indices that must be built faster and repeatedly, while also providing faster responses to similarity queries. The increase in main memory capacity and its falling cost also motivate the use of memory-based MAMs. In this paper, we propose the Onion-tree, a new and robust dynamic memory-based MAM that slices the metric space into disjoint subspaces to provide quick indexing of complex data. It introduces three major characteristics: (i) a partitioning method that controls the number of disjoint subspaces generated at each node; (ii) a replacement technique that can change the leaf node pivots in insertion operations; and (iii) range and k-NN extended query algorithms to support the new partitioning method, including a new visit order of the subspaces in k-NN queries. Performance tests with both real-world and synthetic datasets showed that the Onion-tree is very compact. Comparisons of the Onion-tree with the MM-tree and a memory-based version of the Slim-tree showed that the Onion-tree was always faster to build the index. The experiments also showed that the Onion-tree significantly improved range and k-NN query processing performance and was the most efficient MAM, followed by the MM-tree, which in turn outperformed the Slim-tree in almost all the tests. (C) 2010 Elsevier B.V. All rights reserved.
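The pivot-based slicing and triangle-inequality pruning that such metric access methods rely on can be sketched generically. This is a single-pivot "ring" partition for illustration only, not the Onion-tree's actual algorithm; the metric, data, and ring width are invented.

```python
# Sketch of pivot-based partitioning with triangle-inequality pruning.
# A single pivot splits the space into concentric "rings" by distance;
# a range query skips whole rings that provably cannot contain a hit.
def range_query(data, dist, pivot, query, radius, ring_width=1.0):
    # Group each element by its distance-to-pivot ring (done at build time
    # in a real index; recomputed here for brevity).
    rings = {}
    for x in data:
        rings.setdefault(int(dist(x, pivot) / ring_width), []).append(x)
    dq = dist(query, pivot)
    hits = []
    for ring, members in rings.items():
        lo, hi = ring * ring_width, (ring + 1) * ring_width
        # Triangle inequality: any x in this ring has lo <= dist(x, pivot) <= hi,
        # so the ring is safe to skip if that interval misses [dq-r, dq+r].
        if lo > dq + radius or hi < dq - radius:
            continue
        hits.extend(x for x in members if dist(x, query) <= radius)
    return hits

d1 = lambda a, b: abs(a - b)        # a 1-D metric, purely for illustration
pts = [0.5, 1.2, 2.7, 3.3, 9.9]
print(sorted(range_query(pts, d1, pivot=0.0, query=3.0, radius=0.5)))  # -> [2.7, 3.3]
```

The Onion-tree refines this idea with multiple subspaces per node, pivot replacement on insertion, and extended range/k-NN algorithms, as the abstract describes.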
Abstract:
The present work proposes an investigation of the treatment given to memory in Pinter’s latest play, Ashes to Ashes, and of its function in the development of Pinter’s work. In order to do that, different aspects of the construction of meaning in the theatre are analysed, so that the specificity of its reception is determined. A survey of techniques used to present information, time and space in the theatre is made. The analytical drama, the history drama, and the theatre of the absurd are defined. After that, the evolution of the author’s work is analysed to determine what characterises Pinter’s work, while at the same time determining how his treatment of themes like menace, memory, and political oppression of the individual has evolved. Finally, a detailed survey of the apparently disconnected elements that are mentioned in Ashes to Ashes is made. The intertextual analysis allied to a study of the analytical form as used in this play enables the discovery of several layers of meaning. Through the connection established between the Holocaust and man’s fall followed by expulsion from Eden, Pinter examines the use of memory as a way of dealing with personal and collective responsibility and guilt. It is through the recovery of memory (also through writing) that the present can establish a critical and responsible relation with the past.
Abstract:
This doctoral dissertation analyzes two novels by the American novelist Robert Coover as examples of hypertextual writing on the book-bound page, as tokens of hyperfiction. The complexity displayed in the novels, John's Wife and The Adventures of Lucky Pierre, integrates the cultural elements that characterize the contemporary condition of capitalism and the technologized practices that have fostered a different subjectivity evidenced in hypertextual writing and reading: the posthuman subjectivity. The models that account for the complexity of each novel are drawn from the concept of strange attractors in Chaos Theory and from the concept of rhizome in Nomadology. The transformations the characters undergo in the degree of their corporeality set the plane on which to discuss turbulence and posthumanity. The notions of dynamic patterns and strange attractors, along with the concepts of the Body without Organs and the Rhizome, are interpreted, leading to a revision of narratology and to analytical categories appropriate to the study of the novels. The reading exercised throughout this dissertation enacts Daniel Punday's corporeal reading. The changes in the characters' degree of materiality are associated with the stages of order, turbulence and chaos in the story, bearing on the constitution of subjectivity within and along the reading process. Coover's inscription of planes of consistency to counter linearity and accommodate hypertextual features to paper-supported narratives describes the characters' trajectory as rhizomatic. The study led to the conclusion that narrative today stands more as a regime in a rhizomatic relation with other regimes in cultural practice than as an exclusively literary form and genre. Besides this, posthuman subjectivity emerges as a class identity, holding hypertextual novels as its literary form of choice.
Abstract:
With ever increasing demands for high-complexity consumer electronic products, market pressures demand faster product development at lower cost. SoC-based design can provide the required design flexibility and speed by allowing the use of IP cores. However, testing costs in the SoC environment can reach a substantial percentage of the total production cost. Analog testing costs may dominate the total test cost, as testing of analog circuits usually requires functional verification of the circuit and special testing procedures. For RF analog circuits commonly used in wireless applications, testing is further complicated by the high frequencies involved. In summary, reducing analog test cost is of major importance in the electronics industry today. BIST techniques for analog circuits, though potentially able to solve the analog test cost problem, have some limitations. Some techniques are circuit dependent, requiring reconfiguration of the circuit being tested, and are generally not usable in RF circuits. In the SoC environment, as processing and memory resources are available, they could be used in the test. However, the overhead of adding additional AD and DA converters may be too costly for most systems, and analog routing of signals may not be feasible and may introduce signal distortion. In this work a simple and low-cost digitizer is used instead of an ADC in order to enable analog testing strategies to be implemented in a SoC environment. Thanks to the low analog area overhead of the converter, multiple analog test points can be observed and specific analog test strategies can be enabled. As the digitizer is always connected to the analog test point, it is not necessary to include muxes and switches that would degrade the signal path. For RF analog circuits this is especially useful, as the circuit impedance is fixed and the influence of the digitizer can be accounted for in the design phase.
Thanks to the simplicity of the converter, it can reach higher frequencies and enables the implementation of low-cost RF test strategies. The digitizer has been applied successfully in the testing of both low-frequency and RF analog circuits. Also, as testing is based on frequency-domain characteristics, nonlinear characteristics such as intermodulation products can also be evaluated. Specifically, practical results were obtained for prototyped baseband filters and a 100 MHz mixer. The application of the converter to noise figure evaluation was also addressed, and experimental results for low-frequency amplifiers using conventional opamps were obtained. The proposed method is able to enhance the testability of current mixed-signal designs, being suitable for the SoC environment used in many industrial products today.
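A frequency-domain intermodulation check of the kind described can be sketched on digitized samples. The tone frequencies, the cubic nonlinearity model, and the sample counts below are illustrative assumptions, not the thesis's circuits or measurements.

```python
import numpy as np

# Two-tone intermodulation sketch: drive a mildly nonlinear stage with
# tones f1 and f2, then look for third-order products in the spectrum.
fs, n = 1024, 1024                     # sample rate and length -> 1 Hz bins
t = np.arange(n) / fs
f1, f2 = 100, 110                      # exact-bin tones, so no leakage
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
y = x + 0.05 * x**3                    # mild cubic nonlinearity (illustrative)

# Normalised magnitude spectrum of the "digitized" output.
spectrum = np.abs(np.fft.rfft(y)) / n

# Third-order intermodulation products fall at 2*f1 - f2 and 2*f2 - f1
# (here 90 Hz and 120 Hz), close to the fundamentals and hard to filter.
for f in (f1, f2, 2 * f1 - f2, 2 * f2 - f1):
    print(f, round(spectrum[f], 4))
```

In the SoC test setting described above, `y` would come from the on-chip digitizer rather than a simulated nonlinearity, but the spectral analysis is the same.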
Abstract:
OSAN, R., TORT, A. B. L., AMARAL, O. B. A mismatch-based model for memory reconsolidation and extinction in attractor networks. PLoS ONE, v. 6, p. e23113, 2011.