888 results for invention and memory
Abstract:
How we get from transistors to logic gates, to ALUs and memory, to the stored program and the fetch-execute cycle, and on to machine code and high-level languages. Inspired by Tanenbaum's approach in "Structured Computer Organization".
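The path from stored program to fetch-execute cycle can be made concrete in a few lines. Below is a minimal sketch of a toy accumulator machine in Python; the opcodes, instruction format and example program are illustrative assumptions, not anything taken from Tanenbaum's book.

    # Minimal sketch of the fetch-decode-execute cycle for a toy
    # accumulator machine. Opcodes and layout are illustrative only.
    LOAD, ADD, STORE, JMP, HALT = range(5)

    def run(memory):
        """Execute instructions of the form (opcode, operand) until HALT."""
        acc = 0   # accumulator register
        pc = 0    # program counter
        while True:
            opcode, operand = memory[pc]   # fetch
            pc += 1                        # advance to next instruction
            if opcode == LOAD:             # decode + execute
                acc = memory[operand]
            elif opcode == ADD:
                acc += memory[operand]
            elif opcode == STORE:
                memory[operand] = acc
            elif opcode == JMP:
                pc = operand
            elif opcode == HALT:
                return acc

    # Program and data share one memory, as in the stored-program model:
    # compute memory[6] + memory[7] and leave the result in memory[8].
    mem = [(LOAD, 6), (ADD, 7), (STORE, 8), (HALT, 0), None, None, 2, 3, 0]
    print(run(mem))   # -> 5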
Abstract:
This article is about the politics of landscape ideas, and the relationship between landscape, identity and memory. It explores these themes through the history of the Victoria Falls, and the tourist resort that developed around the waterfall after 1900. Drawing on oral and archival sources, including popular natural history writing and tourist guides, it investigates African and European ideas about the waterfall, and the ways that these interacted and changed in the course of colonial appropriations of the Falls area. The tourist experience of the resort and the landscape ideas promoted through it were linked to Edwardian notions of Britishness and empire, ideas of whiteness and settler identities that transcended new colonial borders, and to the subject identities accommodated or excluded. Cultures of colonial authority did not develop by simply overriding local ideas; they involved fusions, exchanges and selective appropriations of them. The two main African groups I am concerned with here are the Leya, who lived in small groups around the Falls under a number of separate chiefs, and the powerful Lozi rulers, to whom they paid tribute in the nineteenth century. The article highlights colonial authorities' celebration of aspects of the Lozi aristocracy's relationship with the river, and their exclusion of the Leya people, who had a longer and closer relationship with the waterfall. It also touches on the politics of recent attempts to reverse this exclusion, and the controversial rewriting of history this has involved.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, available at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy, which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data-intensive and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widened access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
A full assessment of para-virtualization is important, because without knowledge about the various overheads, users cannot understand whether using virtualization is a good idea or not. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as on para-virtualization. The idea is to see what the overheads of para-virtualization are, as well as to look at the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). In order to assess these virtualization systems, we run the benchmarks on bare metal, then on para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) used by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different platforms: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by its privileged components. A virtualization system can host multiple guest operating systems, each running in its own domain, and the system schedules virtual CPUs and memory within each virtual machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly. Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application; that is, the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines; these guest operating systems are aware that they are running on a virtual machine, and provide near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is an OS-assisted virtualization, in which some modifications are made to the guest operating system to enable better performance. In this kind of virtualization, the guest operating system is aware that it is running on virtualized hardware rather than on bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed.
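The measurement procedure the paper describes reduces to timing the same workload on each configuration and expressing each time as a percentage overhead over the bare-metal baseline. The following is a minimal sketch of that comparison in Python; the kernel, the system labels and the timing values are illustrative assumptions, not the paper's Netlib benchmarks or measured results.

    import time

    def time_workload(workload, repeats=3):
        """Best-of-N wall-clock time for one benchmark run; taking the
        minimum damps scheduler noise, which matters inside a busy VM."""
        best = float("inf")
        for _ in range(repeats):
            start = time.perf_counter()
            workload()
            best = min(best, time.perf_counter() - start)
        return best

    def kernel(n=120):
        """Stand-in compute kernel (naive dense matrix multiply), used
        here in place of the actual Netlib benchmarks."""
        a = [[1.0] * n for _ in range(n)]
        b = [[2.0] * n for _ in range(n)]
        return [[sum(a[i][k] * b[k][j] for k in range(n))
                 for j in range(n)] for i in range(n)]

    # Run this once per configuration (bare metal, each hypervisor, then
    # again with monitoring/logging enabled) and collect the times:
    print(f"this system: {time_workload(kernel):.2f}s")

    # Collected times in seconds (placeholder values, not measured data):
    results = {"bare metal": 4.10, "Xen": 4.35, "KVM": 4.52}
    baseline = results["bare metal"]
    for system, t in results.items():
        print(f"{system}: {t:.2f}s ({100 * (t - baseline) / baseline:+.1f}% vs bare metal)")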
It has been shown [0] that para-virtualization does not impose significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The “apparent” improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation. In order to support this hypothesis, it is first necessary to define exactly what is meant by a “class” of application, and secondly it is necessary to observe application performance, both within a virtual machine and when executing on bare hardware. A further potential complication is associated with the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
Abstract:
Acute doses of Ginkgo biloba have been shown to improve attention and memory in young, healthy participants, but there has been a lack of investigation into possible effects on executive function. In addition, only one study has investigated the effects of chronic treatment in young volunteers. This study was conducted to compare the effects of ginkgo after acute and chronic treatment on tests of attention, memory and executive function in healthy university students. Using a placebo-controlled double-blind design, in experiment 1, 52 students were randomly allocated to receive a single dose of ginkgo (120 mg, n=26) or placebo (n=26), and were tested 4h later. In experiment 2, 40 students were randomly allocated to receive ginkgo (120 mg/day; n=20) or placebo (n=20) for a 6-week period and were tested at baseline and after 6 weeks of treatment. In both experiments, participants underwent tests of sustained attention, episodic and working memory, mental flexibility and planning, and completed mood rating scales. The acute dose of ginkgo significantly improved performance on the sustained-attention task and pattern-recognition memory task; however, there were no effects on working memory, planning, mental flexibility or mood. After 6 weeks of treatment, there were no significant effects of ginkgo on mood or any of the cognitive tests. In line with the literature, after acute administration ginkgo improved performance in tests of attention and memory. However, there were no effects after 6 weeks, suggesting that tolerance develops to the effects in young, healthy participants.
Abstract:
Between 8 and 40% of Parkinson disease (PD) patients will have visual hallucinations (VHs) during the course of their illness. Although cognitive impairment has been identified as a risk factor for hallucinations, more specific neuropsychological deficits underlying such phenomena have not been established. Research in psychopathology has converged to suggest that hallucinations are associated with confusion between internal representations of events and real events (i.e. impaired source monitoring). We evaluated three groups: 17 Parkinson's patients with visual hallucinations, 20 Parkinson's patients without hallucinations and 20 age-matched controls, using tests of visual imagery, visual perception and memory, including tests of source monitoring and recollective experience. The study revealed that Parkinson's patients with hallucinations appear to have intact visual imagery processes and spatial perception. However, there were impairments in object perception and recognition memory, and poor recollection of the encoding episode in comparison to both non-hallucinating Parkinson's patients and healthy controls. Errors were especially likely to occur when encoding and retrieval cues were in different modalities. The findings raise the possibility that visual hallucinations in Parkinson's patients could stem from a combination of faulty perceptual processing of environmental stimuli and less detailed recollection of experience, combined with intact image generation.
Abstract:
Dual-system models suggest that English past tense morphology involves two processing routes: rule application for regular verbs and memory retrieval for irregular verbs (Pinker, 1999). In second language (L2) processing research, Ullman (2001a) suggested that both verb types are retrieved from memory, but more recently Clahsen and Felser (2006) and Ullman (2004) argued that past tense rule application can be automatised with experience by L2 learners. To address this controversy, we tested highly proficient Greek-English learners with naturalistic or classroom L2 exposure, compared to native English speakers, in a self-paced reading task involving past tense forms embedded in plausible sentences. Our results suggest that, irrespective of the type of exposure, proficient L2 learners with extended L2 exposure apply rule-based processing.
Abstract:
Cognitive control mechanisms—such as inhibition—decrease the likelihood that goal-directed activity is ceded to irrelevant events. Here, we use the action of auditory distraction to show how retrieval from episodic long-term memory is affected by competitor inhibition. Typically, a sequence of to-be-ignored spoken distracters drawn from the same semantic category as a list of visually-presented to-be-recalled items impairs free recall performance. In line with competitor inhibition theory (Anderson, 2003), free recall was worse for items on a probe trial if they were a repeat of distracter items presented during the previous (prime) trial (Experiment 1). This effect was only produced when the distracters were dominant members of the same category as the to-be-recalled items on the prime. For prime trials in which distracters were low-dominant members of the to-be-remembered item category or were unrelated to that category—and hence not strong competitors for retrieval—positive priming was found (Experiments 2 & 3). These results are discussed in terms of inhibitory approaches to negative priming and memory retrieval.
Abstract:
This book examines to what extent the invention and first use of nuclear weapons was a turning point in the history of warfare and strategy (to what extent was it a mere continuation or perfection of air power strategy? Were the casualty numbers really unprecedented?) and in the ethics of war (was this form of war against civilians unprecedented?), and it asks whether it was an expression of total war or whether it created total war.
Abstract:
In this article, we explore whether cross-linguistic differences in grammatical aspect encoding may give rise to differences in memory and cognition. We compared native speakers of two languages that encode aspect differently (English and Swedish) in four tasks that examined verbal descriptions of stimuli, online triads matching, and memory-based triads matching with and without verbal interference. Results showed between-group differences in verbal descriptions and in memory-based triads matching. However, no differences were found in online triads matching and in memory-based triads matching with verbal interference. These findings need to be interpreted in the context of the overall pattern of performance, which indicated that both groups based their similarity judgments on common perceptual characteristics of motion events. These results show for the first time a cross-linguistic difference in memory as a function of differences in grammatical aspect encoding, but they also contribute to the emerging view that language fine-tunes rather than shapes perceptual processes that are likely to be universal and unchanging.
Abstract:
Our established understanding of lymphocyte migration suggests that naive and memory T cells travel throughout the body via divergent pathways; naive T cells circulate between blood and lymph, whereas memory T cells additionally migrate through non-lymphoid organs. Evidence is now gradually emerging which suggests that such disparate pathways between naive and memory T cells may not strictly be true, and that naive T cells gain access to the non-lymphoid environment in numbers approaching those of memory T cells. We discuss here the evidence for naive T-cell traffic into the non-lymphoid environment, compare and contrast this movement with what is known of memory T cells, and finally discuss why naive T cells might access the parenchymal tissues and the functional importance of this traffic.
Abstract:
As people get older, they tend to remember more positive than negative information. This age-by-valence interaction has been called the “positivity effect.” The current study addressed the hypotheses that baseline functional connectivity at rest is predictive of older adults' brain activity when learning emotional information and their positivity effect in memory. Using fMRI, we examined the relationship among resting-state functional connectivity, subsequent brain activity when learning emotional faces, and individual differences in the positivity effect (the relative tendency to remember faces expressing positive vs. negative emotions). Consistent with our hypothesis, older adults with a stronger positivity effect had increased functional coupling between amygdala and medial PFC (MPFC) during rest. In contrast, younger adults did not show the association between resting connectivity and memory positivity. A similar age-by-memory positivity interaction was also found when learning emotional faces. That is, memory positivity in older adults was associated with (a) enhanced MPFC activity when learning emotional faces and (b) increased negative functional coupling between amygdala and MPFC when learning negative faces. In contrast, memory positivity in younger adults was related to neither enhanced MPFC activity to emotional faces, nor MPFC–amygdala connectivity to negative faces. Furthermore, stronger MPFC–amygdala connectivity during rest was predictive of subsequent greater MPFC activity when learning emotional faces. Thus, emotion–memory interaction in older adults depends not only on the task-related brain activity but also on the baseline functional connectivity.
Abstract:
The present study addressed the hypothesis that emotional stimuli relevant to survival or reproduction (biologically emotional stimuli) automatically affect cognitive processing (e.g., attention, memory), while those relevant to social life (socially emotional stimuli) require elaborative processing to modulate attention and memory. Results of our behavioral studies showed that (1) biologically emotional images hold attention more strongly than do socially emotional images, (2) memory for biologically emotional images was enhanced even with limited cognitive resources, but (3) memory for socially emotional images was enhanced only when people had sufficient cognitive resources at encoding. Neither images’ subjective arousal nor their valence modulated these patterns. A subsequent functional magnetic resonance imaging study revealed that biologically emotional images induced stronger activity in the visual cortex and greater functional connectivity between the amygdala and visual cortex than did socially emotional images. These results suggest that the interconnection between the amygdala and visual cortex supports enhanced attention allocation to biological stimuli. In contrast, socially emotional images evoked greater activity in the medial prefrontal cortex (MPFC) and yielded stronger functional connectivity between the amygdala and MPFC than did biological images. Thus, it appears that emotional processing of social stimuli involves elaborative processing requiring frontal lobe activity.
Abstract:
Purpose – The purpose of this paper is to demonstrate how strategy is developed and implemented within a subsidiary of a global organization, the relationship between the subsidiary and headquarters, and the need for continuous change and adaptation to remain relevant. Furthermore, this case study describes a successful process of invention and adoption. Design/methodology/approach – The paper draws on documentary evidence and a semi-structured interview with Jill McDonald, CEO and President of McDonald’s Northern Europe Division, with responsibility for the UK, Sweden, Finland, Denmark, Norway and the Republic of Ireland. Management research rarely captures the views of the top executive, yet top executives have a broad picture and are key strategic decision makers. Findings – The case study and interview offer a unique insight into factors contributing to McDonald’s unprecedented success (it has paid an increased dividend for each of the past 37 years). It also sheds light on its successful internationalization strategy. Originality/value – The case study draws on published material and augments this with an in-depth interview with the Chief Executive. Very few case studies offer insight into the thinking of a Chief Executive managing a subsidiary of a global organization. Its value lies in the lessons that managers and students of management can draw from the approach adopted by a highly successful global organization.
Abstract:
Research on invention has focused on business invention, and little work has been conducted on the process and capability required of the individual inventor, or on the capabilities required for a device to be considered an invention. This paper synthesises the results of an empirical survey of ten inventor case studies with current research on invention and recent capability affordance research to develop an integrated capability process model of the human capabilities for invention and the specific capabilities of an invented device. We identify eight necessary human effectivities required for individual invention capability and six key functional activities using these effectivities to deliver the functional capability of invention. We also identify key differences between invention and general problem-solving processes. Results suggest that inventive step capability relies on a unique application of principles relating to a new combination of an affordance chain with a new mechanism and/or space-time (affordance) path, representing the novel way the device works, in conjunction with defined critical affordance operating factors that are the subject of the patent claims.