864 results for Last in last out memory


Relevance: 100.00%

Abstract:

This purely theoretical thesis covers aspects of two contemporary research fields: non-equilibrium dynamics in quantum systems and the electronic properties of three-dimensional topological insulators. In the first part we investigate non-equilibrium dynamics in closed quantum systems. Thanks to recent technologies, especially from the field of ultracold quantum gases, it is possible to realize such systems in the laboratory. The focus is on the influence of hydrodynamic slow modes on the thermalization process. Generic systems, either classical or quantum, are described in equilibrium by thermodynamics, characterized by an ensemble of maximal entropy constrained by macroscopically conserved quantities. We show that these conservation laws slow down thermalization and that the final equilibrium state is approached only algebraically in time. When the conservation laws are violated, thermalization instead takes place exponentially in time. In a different study we calculate probability distributions of projective quantum measurements; newly developed quantum microscopes provide the opportunity to realize measurement protocols which go far beyond conventional measurements of correlation functions. The second part of this thesis is dedicated to a new class of materials known as three-dimensional topological insulators. Here, too, new experimental techniques have made it possible to fabricate these materials at high enough quality that their topological nature is revealed. However, their transport properties are not yet fully understood. Motivated by unusual experimental results for the optical conductivity, we have investigated the formation and thermal destruction of spatially localized electron- and hole-doped regions. These are caused by charged impurities which are introduced into the material in order to make the bulk insulating. Our theoretical results agree with the experiment and explain it semi-quantitatively. Furthermore, we study emergent length scales in the bulk as well as close to the conducting surface.
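Written schematically, the two relaxation regimes contrasted above take the following forms; the notation is ours, and the exponent \(\alpha\) and time scale \(\tau\) are generic placeholders rather than values quoted from the thesis:

\[
\delta O(t) \sim t^{-\alpha} \quad \text{(conservation laws present: algebraic relaxation)}, \qquad
\delta O(t) \sim e^{-t/\tau} \quad \text{(conservation laws violated: exponential relaxation)},
\]

where \(\delta O(t)\) is the deviation of an observable from its final thermal value.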

Relevance: 100.00%

Abstract:

In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and programmers use it to speed up their applications thanks to its low latency. Prior work by the authors proposed lightweight hardware TM (HTM) support based on the local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, in this work we gather a number of proposals aimed at improving the mechanisms with the highest impact on performance. Firstly, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Secondly, the conflict detection mechanism is optimized according to application characteristics, such as the read/write sets, the probability of conflict between transactions, and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which ones matter most for a given application. This work includes a discussion of the analysis needed to choose the best configuration.
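As a concrete illustration of the bookkeeping such a mechanism performs, here is a minimal Python sketch of conflict detection over read/write sets; the Transaction class and conflicts_with function are invented for illustration and model neither the paper's hardware nor its exact policy.

class Transaction:
    # Illustrative software stand-in for a hardware transaction.
    def __init__(self, tid):
        self.tid = tid
        self.read_set = set()   # addresses read inside the transaction
        self.write_set = set()  # addresses written inside the transaction

    def read(self, addr):
        self.read_set.add(addr)

    def write(self, addr):
        self.write_set.add(addr)

def conflicts_with(a, b):
    # Two transactions conflict when one writes an address the other has
    # read or written; read-only transactions never conflict with each
    # other, one of the application characteristics the abstract mentions.
    return bool(a.write_set & (b.read_set | b.write_set)) or \
           bool(b.write_set & a.read_set)

t1, t2 = Transaction(1), Transaction(2)
t1.read(0x10); t1.write(0x20)
t2.read(0x20)                    # t2 reads an address t1 wrote
print(conflicts_with(t1, t2))    # True: one of the two must be serialized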

Relevance: 100.00%

Abstract:

Has the 1998 prediction of a well-known contact lens researcher – that rigid contact lenses would be obsolete by the year 2010 – come to fruition? This eulogy to RGPs will demonstrate why it has. A recent survey of international contact lens prescribing trends shows that rigid lenses constituted less than 5% of all contact lenses prescribed in 16 of the 27 nations surveyed. This compares with rigid lenses representing 100% of all lenses prescribed in 1965 and about 40% in 1990. With the wide range of sophisticated soft lens materials available today, including super-permeable silicone hydrogels, and designs capable of correcting astigmatism and presbyopia, there is now no need to fit cosmetic patients with rigid lenses, with their associated intractable problems of rigid lens-induced ptosis, 3 and 9 o'clock staining, lens binding, corneal warpage and adaptation discomfort. Orthokeratology is largely a fringe application of marginal efficacy, and the notion that rigid lenses arrest myopia progression is flawed. That last bastion of rigid lens practice – fitting patients with severely distorted corneas, as in keratoconus – is about to crumble in view of demonstrations by a number of independent research groups of the efficacy of custom-designed, wavefront-corrected soft contact lenses for the correction of keratoconus. It is concluded that rigid contact lenses now have no place in modern contact lens practice.

Relevance: 100.00%

Abstract:

The generation of a correlation matrix for a set of genomic sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. Each sequence may be millions of bases long and there may be thousands of such sequences to compare, so not all sequences may fit into main memory at the same time. Each sequence needs to be compared with every other sequence, so some sequences will generally need to be paged in and out more than once, and minimizing execution time requires minimizing this I/O. This paper develops an approach for faster and scalable computation of large correlation matrices through maximal exploitation of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and can be applied to different bioinformatics problems with different correlation matrix sizes. The significant performance improvement of the approach over previous work is demonstrated through benchmark examples.
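The abstract does not give the algorithm itself, so the following is a minimal Python sketch of the standard blocked strategy for this kind of problem, under the assumption that only block_size sequences fit in memory at once; load_sequence and correlate are hypothetical placeholders for the disk I/O and the pairwise scoring routine, not the paper's actual code.

def blocked_correlation_matrix(n, block_size, load_sequence, correlate):
    # The n x n result is assumed small enough to keep in memory,
    # unlike the sequences themselves.
    matrix = [[0.0] * n for _ in range(n)]
    for i0 in range(0, n, block_size):
        rows = [load_sequence(i) for i in range(i0, min(i0 + block_size, n))]
        # Compare the resident block against itself.
        for a, sa in enumerate(rows):
            for b, sb in enumerate(rows):
                matrix[i0 + a][i0 + b] = correlate(sa, sb)
        # Stream every later block past the resident one, so each pair of
        # blocks is co-resident exactly once, avoiding repeated page-ins.
        for j0 in range(i0 + block_size, n, block_size):
            cols = [load_sequence(j) for j in range(j0, min(j0 + block_size, n))]
            for a, sa in enumerate(rows):
                for b, sb in enumerate(cols):
                    score = correlate(sa, sb)
                    matrix[i0 + a][j0 + b] = score
                    matrix[j0 + b][i0 + a] = score  # correlation is symmetric

    return matrix

With n sequences and blocks of size B, each sequence is read on the order of n/B times instead of n times, so exploiting all available memory (a larger B) directly reduces I/O, which is the effect the paper targets.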

Relevance: 100.00%

Abstract:

Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory, yet in other situations where people are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is the retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking. In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings; thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. Another study demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized, because access is too slow.
These findings imply a change in thinking about the design of interfaces. Several novel design principles are presented, based on the idea of supporting the deployment of LTWM in the main task.

Relevance: 100.00%

Abstract:

Solid state chemistry was in its infancy when the author became interested in the subject. In this article, the author outlines the manner in which the subject has grown over the last four decades, citing representative examples from his own contributions to its different facets. The various aspects covered include synthesis, structure, defects, phase transitions, transition metal oxides, catalysts, superconductors, metal clusters and fullerenes. In an effort to demonstrate the breadth and vitality of the subject, the author shares his own experiences and aspirations and gives expression to the agony and ecstasy of carrying out experimental research in such a frontier area in India.

Relevance: 100.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics

Relevance: 100.00%

Abstract:

den Dunnen et al. [den Dunnen, W.F.A., Brouwer, W.H., Bijlard, E., Kamphuis, J., van Linschoten, K., Eggens-Meijer, E., Holstege, G., 2008. No disease in the brain of a 115-year-old woman. Neurobiol. Aging] had the opportunity to follow the cognitive functioning of one of the world's oldest women during the last 3 years of her life. They performed two neuropsychological evaluations, at ages 112 and 115, which revealed a striking preservation of immediate recall abilities and orientation. In contrast, working memory, retrieval from semantic memory, and mental arithmetic performance declined after age 112. Overall, only a one-point decrease in MMSE score occurred (from 27 to 26), reflecting the remarkable preservation of cognitive abilities. The neuropathological assessment showed few neurofibrillary tangles (NFT) in the hippocampal formation, compatible with Braak stage II, an absence of amyloid deposits and other types of neurodegenerative lesions, and preservation of neuron numbers in the locus coeruleus. This finding was related to a striking paucity of Alzheimer disease (AD)-related lesions in the hippocampal formation. The present report parallels early descriptions of rare "supernormal" centenarians, supporting the dissociation between brain aging and AD processes. In conjunction with recent stereological analyses of cases aged 90 to 102 years, it also points to the marked resistance of the hippocampal formation to the degenerative process in this age group and a possible dissociation between the occurrence of slight cognitive deficits and the development of AD-related pathologic changes in neocortical areas. This work is discussed in the context of current efforts to identify the biological and genetic parameters of human longevity.

Relevance: 100.00%

Abstract:

Sports-related concussions are a growing concern. It is estimated that in the United States about five percent of all athletes will sustain a concussion. In the majority of cases a concussion is considered a transient injury. In the field of sports concussion, the increased risk observed in athletes with a history of previous concussions is well documented, which calls the transient nature of the injury into question. Functional imaging techniques offer great potential for understanding this pathology, notably by showing functional differences in participants who have sustained a mild traumatic brain injury in the absence of behavioural findings. Functional alterations likely persist beyond the post-symptom recovery phase. Electrophysiology, and in particular cognitive event-related potentials (ERPs), is a tool of choice for studying this question because of its sensitivity and the functional measure it provides. Cognitive event-related potentials consist of an average electrical brain response generated during the performance of a task. Different components can be identified in an ERP waveform; these components are associated with different aspects of cerebral electrical activity during perceptual and cognitive processing. The scientific articles included in this thesis examine the effects of multiple concussions in athletes several months after the last concussion. First, the temporal aspect is assessed by measuring the P3a and P3b in different groups of athletes. These components are linked to memory and attention processes. The results suggest that, despite normal functioning, concussed athletes show probable persistent subclinical cognitive changes, reflected in attenuated P3a and P3b responses. Alterations also appear to be present several years after the last concussion, though in a more subtle form. The second study examines the electrophysiological processes involved in maintaining information in visual working memory in athletes who have sustained multiple concussions. The measure used is the SPCN (sustained posterior contralateral negativity), an ERP component specific to the cognitive process under study. The results show not only an attenuated component in athletes with three or more concussions, but also a modulation of the component as a function of the number of concussions sustained. These results could help explain the increased risk of subsequent concussions observed in these athletes. In line with the literature, these findings could be explained by the presence of subclinical cognitive deficits or by the deployment of compensatory mechanisms. Finally, these results call for great caution in the management of concussion cases and for stronger educational efforts among young athletes, so that they can make the best decisions about their future.
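For readers unfamiliar with the technique, the averaging that defines an event-related potential can be sketched in a few lines of Python; the signal shape, noise level, and trial count below are arbitrary toy values, not data from the thesis.

import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 500, 512
epochs = rng.normal(0.0, 3.0, (n_trials, n_samples))  # background EEG noise
evoked = np.exp(-((np.arange(n_samples) - 300) ** 2) / 500.0)  # toy stimulus-locked peak
epochs += evoked                  # the same evoked response hides in every noisy trial
erp = epochs.mean(axis=0)         # trial average: the event-related potential
print(np.argmax(erp))             # peak latency recovered, close to sample 300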

Relevance: 100.00%

Abstract:

Deep learning is a rapidly growing research area within machine learning that has achieved impressive results on tasks ranging from image classification to speech, including language modelling. Recurrent neural networks, a subclass of deep architectures, are particularly promising. Recurrent networks can capture the temporal structure in data. They potentially have the capacity to learn correlations between events far apart in time and to store information indefinitely in their internal memory. In this work we first attempt to understand why depth is useful. In line with other work in the literature, our results show that deep models can represent certain families of functions more efficiently than shallow ones. Unlike that work, we carry out our theoretical analysis on deep feed-forward (acyclic) networks with piecewise-linear activation functions, since this type of model is currently the state of the art on various classification tasks. The second part of this thesis concerns the learning process. We analyse several recently proposed optimization techniques, such as Hessian-free optimization, natural gradient descent, and Krylov subspace descent. We propose the theoretical framework of generalized trust-region methods and show that several of these recently developed algorithms can be viewed from this perspective. We argue that some members of this family of approaches may be better suited than others to non-convex optimization. The last part of this document focuses on recurrent neural networks. We first study the concept of memory and attempt to answer the following questions: can recurrent networks exhibit unbounded memory? Can this behaviour be learned? We show that it is possible if hints are provided during training. We then explore two problems specific to training recurrent networks, namely the vanishing and the exploding gradient. Our analysis ends with a solution to the exploding-gradient problem that involves bounding the norm of the gradient. We also propose a regularization term designed specifically to reduce the vanishing-gradient problem. On a synthetic dataset, we show empirically that these mechanisms can allow recurrent networks to learn autonomously to memorize information for an indefinite period of time. Finally, we explore the notion of depth in recurrent neural networks. Compared with feed-forward networks, the definition of depth in recurrent networks is often ambiguous. We propose several ways of adding depth to recurrent networks and evaluate these proposals empirically.
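One standard way to implement such a bound on the gradient norm is to rescale the gradient whenever its norm exceeds a threshold; the Python sketch below uses an arbitrary illustrative threshold, not a value from the thesis.

import numpy as np

def clip_gradient_norm(grad, threshold):
    # Rescale grad so its L2 norm never exceeds threshold;
    # the direction of the update is preserved.
    norm = np.linalg.norm(grad)
    if norm > threshold:
        grad = grad * (threshold / norm)
    return grad

g = np.array([30.0, 40.0])           # "exploding" gradient, norm 50
print(clip_gradient_norm(g, 5.0))    # [3. 4.] -> rescaled to norm 5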

Relevance: 100.00%

Abstract:

The relationship between working memory (WM) and attention is highly interdependent, with evidence that attention determines the state in which items in WM are retained. Through focusing of attention, an item might be held in a more prioritized state, commonly termed the focus of attention (FOA). The remaining items, although still retrievable, are considered to be in a different representational state. One means to bring an item into the FOA is to use retrospective cues (‘retro-cues’) which direct attention to one of the objects retained in WM. Alternatively, an item can enter a privileged state once attention is directed towards it through bottom-up influences (e.g. a recency effect) or by performing an action on one of the retained items (‘incidental’ cueing). In all these cases, the item in the FOA is recalled with better accuracy than the other items in WM. Far less is known about the nature of the other items in WM and whether they can be flexibly manipulated in and out of the FOA. We present data from three types of experiments, as well as transcranial magnetic stimulation to early visual cortex, to manipulate the item inside the FOA. Taken together, our results suggest that the context in which items are retained in WM matters. When an item remains behaviourally relevant, despite not being inside the FOA, re-focusing attention upon it can increase its recall precision. This suggests that a non-FOA item can be held in a state in which it can later be retrieved. However, if an item is rendered behaviourally unimportant because it is very unlikely to be probed, it cannot be brought back into the FOA, nor recalled with high precision. Under such conditions, some information appears to be irretrievably lost from WM. These findings, obtained with several different methods, demonstrate the considerable flexibility with which items in WM can be represented depending upon context. They have important consequences for emerging state-dependent models of WM.

Relevance: 100.00%

Abstract:

In Sweden the number of rural food shops has been decreasing for more than 50 years. The closing of a village shop is often supposed to affect migration patterns in the area it has been serving. However, according to this study, the closing affects neither in- nor out-migration in the surrounding area. The migration deficits typical of such areas are established at least 10-12 years before the closing year; thus, the typical closing takes place subsequent to a long-term population decline. On the other hand, localities hosting a shop that survived the study period 1990-2004 have a larger total population and show tendencies towards a decreasing migration deficit at any potential closing year. These statistical results are supported by interviews carried out in three villages where the last shop has closed. They indicate that the shop has already lost its importance as a supplier by the time it closes; by then the village shop is primarily used as a complement to nearby towns or shopping centres. Each of the two studies reported here points to the relative unimportance of the village shop as a service point at closing time. However, as it often offers the last public space in the village, the shop serves a key function as a meeting point for some households. Once the shop has closed, the village holds private homes only, a situation that increases loneliness for some inhabitants.

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Since the financial and economic crisis began to affect the real economy and spread throughout the world, the region's economies have been faced with a situation where data on employment and labour reflect the real stories of millions of women and men for whom the future has become uncertain. When these problems began to appear, the International Labour Organization (ILO) warned that the world faced a global employment crisis whose consequences could lead to a social recession. As the Economic Commission for Latin America and the Caribbean (ECLAC) has pointed out, the outbreak of the crisis put an end to a five-year period of sustained growth and falling unemployment. As early as the second half of 2008, the figures began to reflect slowing economic growth, while a downward slide began in the labour market. This initial bulletin, produced jointly by ECLAC and ILO, seeks to review the ways in which the crisis is affecting the region's labour markets. Amidst a situation characterized by shocks and uncertainty, governments and social partners must have the inputs needed to design public policies that increase the population's levels of employment and well-being. Two further bulletins are planned by January 2010, in order to measure the impact of the crisis on employment and provide an input to the process of defining the best public policies to reverse its consequences. The bulletin reviews the most recent available indicators and analyses them in order to establish trends and detect variations. It provides statistics for the first quarter, estimates for the rest of 2009, and a review of policies announced by governments. In 2008, the last year of the growth cycle, the region's urban unemployment stood at 7.5%. According to economic growth forecasts for 2009, the average annual urban unemployment rate for the region will increase to between 8.7% and 9.1%; in other words, between 2.8 million and 3.9 million additional people will swell the ranks of the unemployed. Data for the first quarter of 2009 already confirm that the crisis is hitting employment in the region: compared with the first quarter of 2008, the urban unemployment rate was up by 0.6 percentage points, representing over a million people. Work will continue until September 2009 on the preparation of a new report on the employment situation, using data updated to the first half of 2009. This will provide a picture of the region's employment situation, so that growth and employment projections can be adjusted for 2009 as a whole. Strategies for dealing with the crisis must have jobs and income protection as their central goals. Policies are moving in that direction in Latin America and the Caribbean and, if they are effective, an even greater worsening of the situation may be avoided. Labour produces wealth, generates consumption, keeps economies functioning and is a key factor in seeking out the way to more sustainable and equitable growth once the crisis is past.