932 results for MEMORY SYSTEMS INTERACTION
Abstract:
Phase-locked loops (PLLs) are a crucial component in modern communications systems. Comprising a phase detector, linear filter, and controllable oscillator, they are widely used in radio receivers to retrieve the information content from remote signals. As such, they are capable of signal demodulation, phase and carrier recovery, frequency synthesis, and clock synchronization. Continuous-time PLLs are a mature area of study and have been covered in the literature since the early classical work by Viterbi [1] in the 1950s. Discrete-time digital PLLs (DPLLs), which have risen with the growth of computing in recent decades, are a more recent discipline; most of the published literature dates from the 1990s onwards. Gardner [2] is a pioneer in this area. Our aim in this work is to address the difficulties encountered by Gardner [3] in his investigation of the DPLL output phase jitter when additive noise on the input signal is combined with frequency quantization in the local oscillator. The model we use in our novel analysis of the system is also applicable to another of the cases examined by Gardner, namely the DPLL with a delay element integrated in the loop. This gives us the opportunity to examine that system in more detail, our analysis providing some unique insights into the variance `dip' observed by Gardner in [3]. We initially provide background on probability theory and stochastic processes, the branches of mathematics that form the basis for the study of noisy analogue and digital PLLs. We give an overview of classical analogue PLL theory as well as background on both the digital PLL and the circle map, referencing the model proposed by Teplinsky et al. [4, 5]. For our novel work, the case of combined frequency quantization and noisy input from [3] is investigated first numerically, and then analytically as a Markov chain via its Chapman-Kolmogorov equation. The resulting delay equation for the steady-state jitter distribution is treated using two separate asymptotic analyses to obtain approximate solutions. The variance obtained in each case is shown to match the numerical results well. Other properties of the output jitter, such as the mean, are also investigated. In this way, we arrive at a more complete understanding of the interaction between quantization and input noise in the first-order DPLL than is possible using simulation alone. We also perform an asymptotic analysis of a particular case of the noisy first-order DPLL with delay, previously investigated by Gardner [3]. We show that a unique feature of the simulation results, namely the variance `dip' seen for certain levels of input noise, is explained by this analysis. Finally, we examine the second-order DPLL with additive noise, using numerical simulations to study the effects of low levels of noise on the limit cycles. We show how these effects are similar to those seen in the noise-free loop with non-zero initial conditions.
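For readers unfamiliar with the setup, the following is a minimal sketch, in C, of the kind of numerical experiment the abstract refers to: a first-order DPLL phase-error recursion driven by additive Gaussian input noise, with the oscillator correction quantized to a fixed frequency step, and the steady-state jitter mean and variance estimated from the trajectory. The update rule, gain, quantization step and noise level are illustrative assumptions, not the model analysed in the thesis.

```c
/* Minimal sketch (not the authors' exact model): a first-order DPLL phase-error
 * recursion with additive Gaussian input noise and a quantized oscillator
 * correction. Gain K, quantization step Q and noise level SIGMA are assumed
 * values, chosen only to show how jitter statistics can be estimated numerically. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static double gauss(void)                 /* Box-Muller standard normal deviate */
{
    double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(6.283185307179586 * u2);
}

int main(void)
{
    const double K     = 0.25;     /* loop gain (assumed)                          */
    const double Q     = 0.01;     /* oscillator frequency quantization step (assumed) */
    const double SIGMA = 0.05;     /* std. dev. of additive input phase noise (assumed) */
    const long   N     = 1000000;  /* total iterations                             */
    const long   SKIP  = 10000;    /* transient samples discarded                  */

    double phi = 0.0, sum = 0.0, sum2 = 0.0;
    for (long n = 0; n < N; n++) {
        double noisy_error = phi + SIGMA * gauss();                /* phase detector sees noisy error  */
        double correction  = Q * floor(K * noisy_error / Q + 0.5); /* correction rounded to step Q     */
        phi -= correction;                                         /* first-order loop recursion       */
        if (n >= SKIP) { sum += phi; sum2 += phi * phi; }
    }
    double mean = sum / (N - SKIP);
    double var  = sum2 / (N - SKIP) - mean * mean;
    printf("steady-state jitter: mean = %g, variance = %g\n", mean, var);
    return 0;
}
```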
Abstract:
To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data-driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally relevant event sequences.
Abstract:
This special issue of Cortex focuses on the relative contribution of different neural networks to memory and the interaction of 'core' memory processes with other cognitive processes. In this article, we examine both. Specifically, we identify cognitive processes other than encoding and retrieval that are thought to be involved in memory; we then examine the consequences of damage to brain regions that support these processes. This approach forces a consideration of the roles of brain regions outside of the frontal, medial-temporal, and diencephalic regions that form a central part of neurobiological theories of memory. Certain kinds of damage to visual cortex or lateral temporal cortex produced impairments of visual imagery or semantic memory; these patterns of impairment were associated with a pattern of amnesia distinctly different from that associated with medial-temporal trauma. On the other hand, damage to language regions, auditory cortex, or parietal cortex produced impairments of language, auditory imagery, or spatial imagery; however, these impairments were not associated with amnesia. Therefore, a full model of autobiographical memory must consider cognitive processes that are not generally considered 'core processes,' as well as the brain regions upon which these processes depend.
Abstract:
All of us are taxed with juggling our inner mental lives with immediate external task demands. For many years, the temporary maintenance of internal information was considered to be handled by a dedicated working memory (WM) system. It has recently become increasingly clear, however, that such short-term internal activation interacts with attention focused on external stimuli. Exactly why these two interact, at what level of processing, and to what degree remains unclear. Because our internal maintenance and external attention processes co-occur, the manner of their interaction has vast implications for functioning in daily life. The work described here has employed original experimental paradigms combining WM and attention task elements, functional magnetic resonance imaging (fMRI) to illuminate the associated neural processes, and transcranial magnetic stimulation (TMS) to clarify the causal substrates of attentional brain function. These studies have examined a mechanism that might explain why (and when) the content of WM can involuntarily capture visual attention. They have, furthermore, tested whether fundamental attentional selection processes operate within WM, and whether they are reciprocal with attention. Finally, they have illuminated the neural consequences of competing attentional demands. The findings indicate that WM shares representations, operating principles, and cognitive resources with externally-oriented attention.
Abstract:
This chapter discusses a code parallelization environment in which a number of tools addressing the main tasks, such as code parallelization, debugging, and optimization, are available. The parallelization tools include ParaWise and CAPO, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. The chapter discusses the use of ParaWise and CAPO to transform the original serial code into an equivalent parallel code that contains appropriate OpenMP directives. Additionally, as user involvement can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform near-automatic relative debugging of an OpenMP program that has been parallelized either using the tools or manually. For these tools to be effective in parallelizing a range of applications, a high-quality, fully interprocedural dependence analysis, together with user interaction, is vital both to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for parallelized NASA codes are discussed and show the benefits of using the environment.
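ParaWise and CAPO operate on Fortran application codes, but the essence of the transformation they automate, inserting OpenMP directives once dependence analysis has shown a loop's iterations to be independent, can be sketched with a simple C analogue. The loop body, array names and sizes below are hypothetical illustrations, not output of the tools.

```c
/* Illustrative C analogue of the serial-to-OpenMP transformation described above.
 * The arrays and loop are hypothetical; the point is the directive that a tool
 * (or the user) adds once the dependence analysis confirms the iterations are
 * independent. */
#include <omp.h>
#include <stdio.h>

#define N 1000000

static double a[N], b[N], c[N];

int main(void)
{
    /* serial form: each iteration writes a distinct a[i], so there is no
     * loop-carried dependence and the loop is a parallelization candidate */
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * b[i] + c[i];

    /* parallel form: the inserted OpenMP directive; the loop index is private
     * to each thread and the arrays are shared by default */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * b[i] + c[i];

    printf("threads available: %d\n", omp_get_max_threads());
    return 0;
}
```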
Abstract:
The parallelization of real-world compute-intensive Fortran application codes is generally not a trivial task. If the time to complete the parallelization is to be significantly reduced, then an environment is needed that will assist the programmer in the various tasks of code parallelization. In this paper the authors present a code parallelization environment in which a number of tools that address the main tasks, such as code parallelization, debugging and optimization, are available. The ParaWise and CAPO parallelization tools are discussed, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. As user involvement in the parallelization process can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform nearly automatic relative debugging of a program that has been parallelized using the tools. A high-quality interprocedural dependence analysis as well as user-tool interaction are also highlighted, both being vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for parallelized benchmark and real-world application codes are presented and show the benefits of using the environment.
Abstract:
This paper describes work towards the deployment of flexible self-management into real-time embedded systems. A challenging project which focuses specifically on the development of a dynamic, adaptive automotive middleware is described, and the specific self-management requirements of this project are discussed. These requirements have been identified through the refinement of a wide-ranging set of use cases requiring context-sensitive behaviours. A sample of these use cases is presented to illustrate the extent of the demands for self-management. The strategy that has been adopted to achieve self-management, based on the use of policies, is presented. The embedded and real-time nature of the target system brings the constraints that dynamic adaptation capabilities must not require changes to the run-time code (except during hot update of complete binary modules), that adaptation decisions must have low latency, and, because the target platforms are resource-constrained, that the self-management mechanism must have low resource requirements (especially in terms of processing and memory). Policy-based computing is thus an ideal candidate for achieving self-management, because the policy itself is loaded at run-time and can be replaced or changed in the future in the same way that a data file is loaded. Policies represent a relatively low-complexity and low-risk means of achieving self-management, with low run-time costs. Policies can be stored internally in ROM (such as default policies) as well as externally to the system. The architecture of a designed-for-purpose, powerful yet lightweight policy library is described. A suitable evaluation platform, supporting the whole life-cycle of feasibility analysis, concept evaluation, development, rigorous testing and behavioural validation, has been devised and is described.
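As a rough illustration of the policy approach described above, the sketch below treats a policy as a plain data table that can sit in ROM as a default or be swapped at run time like a data file, evaluated by a small, allocation-free routine. The rule format, field names and functions are hypothetical and are not the API of the policy library described in the paper.

```c
/* Minimal sketch of policy-based self-management: a policy is plain data that
 * can be held in ROM as a default or replaced at run time without changing the
 * run-time code. All names and the rule format are hypothetical illustrations. */
#include <stdio.h>

typedef struct {
    int sensor_id;    /* which context input the rule watches            */
    int threshold;    /* trigger level                                   */
    int action_id;    /* adaptation to perform when the rule fires       */
} policy_rule_t;

/* default policy; on the target this table could be placed in ROM */
static const policy_rule_t default_policy[] = {
    { 0, 90, 1 },     /* e.g. CPU load above 90%   -> shed optional task  */
    { 1, 75, 2 },     /* e.g. memory use above 75% -> switch compact mode */
};

/* evaluate a policy table against current context values; no dynamic memory
 * and a single pass over the rules, suiting a resource-constrained target */
static int evaluate_policy(const policy_rule_t *rules, int n_rules,
                           const int *context)
{
    for (int i = 0; i < n_rules; i++)
        if (context[rules[i].sensor_id] > rules[i].threshold)
            return rules[i].action_id;   /* first matching rule wins */
    return 0;                            /* 0 = no adaptation needed */
}

int main(void)
{
    int context[2] = { 95, 40 };         /* hypothetical sensor readings */
    int action = evaluate_policy(default_policy, 2, context);
    printf("selected adaptation action: %d\n", action);
    return 0;
}
```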
Abstract:
In Higher Education, web-based course support systems are essential for supporting flexible learning environments. They provide tools to enable the interaction between student and tutor and to reinforce the transfer of theory into understanding, particularly in an academic environment. This paper therefore examines issues associated with the use of curriculum and learning resources within web-based course support systems and the effectiveness of the resulting flexible learning environments. The paper is a general discussion of flexible learning and, in this case, how it was applied to a level-one undergraduate course. The first section introduces what flexible learning is and its importance in Higher Education, followed by a description of the course and why flexible learning concepts are important in such a course and, finally, how the flexibility was useful in this particular instance.
Abstract:
A multi-sensor satellite approach based on ocean colour, sunglint and Synthetic Aperture Radar imagery is used to study the impact of interacting internal tidal (IT) waves on near-surface chlorophyll-a distribution, in the central Bay of Biscay. Satellite imagery was initially used to characterize the internal solitary wave (ISW) field in the study area, where the “local generation mechanism” was found to be associated with two distinct regions of enhanced barotropic tidal forcing. IT beams formed at the French shelf-break, and generated from critical bathymetry in the vicinity of one of these regions, were found to be consistent with “locally generated” ISWs. Representative case studies illustrate the existence of two different axes of IT propagation originating from the French shelf-break, which intersect close to 46°N, −7°E, where strong IT interaction has been previously identified. Evidence of constructive interference between large IT waves is then presented and shown to be consistent with enhanced levels of chlorophyll-a concentration detected by means of ocean colour satellite sensors. Finally, the results obtained from satellite climatological mean chlorophyll-a concentration from late summer (i.e. September, when ITs and ISWs can meet ideal propagation conditions) suggest that elevated IT activity plays a significant role in phytoplankton vertical distribution, and therefore influences the late summer ecology in the central Bay of Biscay.
Abstract:
Laboratory studies were conducted to investigate the interactions of nanoparticles (NPs) formed via simulated cloud processing of mineral dust with seawater under environmentally relevant conditions. The effects of sunlight and of the presence of exopolymeric substances (EPS) were assessed on: (1) the colloidal stability of the nanoparticle aggregates (i.e. size distribution, zeta potential, polydispersity); (2) their micromorphology; and (3) Fe dissolution from the particles. We have demonstrated that: (i) synthetic nano-ferrihydrite has distinct aggregation behaviour from NPs formed from mineral dusts, in that its average hydrodynamic diameter remained unaltered upon dispersion in seawater (~1500 nm), whilst all dust-derived NPs increased about three-fold in aggregate size; (ii) relatively stable and monodisperse aggregates of NPs formed during simulated cloud processing of mineral dust become more polydisperse and unstable in contact with seawater; (iii) EPS forms stable aggregates with both the ferrihydrite and the dust-derived NPs, whose hydrodynamic diameter remains unchanged in seawater over 24 h; (iv) the dissolved Fe concentration from NPs, measured here as the <3 kDa filter fraction, is consistently >30% higher in seawater in the presence of EPS, and the effect is even more pronounced in the absence of light; (v) the micromorphology of nanoparticles from mineral dusts closely resembles that of synthetic ferrihydrite in MQ water, but in seawater with EPS they form less compact aggregates, highly variable in size, possibly due to EPS-mediated steric and electrostatic interactions. The larger-scale implications for real systems of the EPS solubilising effect on Fe and other metals, together with the accompanying enhancement of the colloidal stability of the resulting aggregates, are discussed.
Abstract:
The authors are concerned with the development of computer systems that are capable of using information from faces and voices to recognise people's emotions in real-life situations. The paper addresses the nature of the challenges that lie ahead, and provides an assessment of the progress that has been made in the areas of signal processing and analysis techniques (with regard to speech and face), and the psychological and linguistic analyses of emotion. Ongoing developmental work by the authors in each of these areas is described.
Abstract:
Closing feedback loops over an IEEE 802.11b ad hoc wireless communication network incurs many challenges: sensitivity to varying channel conditions and lower physical transmission rates tend to limit the bandwidth of the communication channel. Given that bandwidth usage and control performance are linked, a method of adapting the sampling interval based on an 'a priori', static sampling policy has been proposed and, more significantly, stability in the mean-square sense is assured using discrete-time Markov jump linear system theory. Practical issues, including current limitations of the 802.11b protocol, the sampling policy and stability, are highlighted. Simulation results on a cart-mounted inverted pendulum show that closed-loop stability can be improved using sample-rate adaptation and that the control design criteria can be met in the presence of channel errors and severe channel contention.
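A minimal sketch of the 'a priori', static sampling-policy idea follows: the controller selects its sampling interval from a table precomputed off-line and indexed by the observed channel state, so the run-time decision is a low-latency lookup. The channel states, interval values and packet-error-rate thresholds below are illustrative assumptions; the mean-square stability guarantee in the paper comes from the discrete-time Markov jump linear system analysis of the switching, not from this code.

```c
/* Minimal sketch of a static sampling policy for a control loop closed over a
 * lossy 802.11b link: the sampling interval is a table lookup keyed by the
 * observed channel state. States, intervals and thresholds are assumed values. */
#include <stdio.h>

enum channel_state { CH_GOOD = 0, CH_CONTENDED = 1, CH_POOR = 2 };

/* 'a priori' sampling policy: interval (ms) chosen off-line for each channel state */
static const int sample_interval_ms[] = { 10, 20, 40 };

/* classify the channel from a recent packet-error-rate estimate (assumed thresholds) */
static enum channel_state classify_channel(double packet_error_rate)
{
    if (packet_error_rate < 0.05) return CH_GOOD;
    if (packet_error_rate < 0.20) return CH_CONTENDED;
    return CH_POOR;
}

int main(void)
{
    /* hypothetical sequence of measured packet error rates over the wireless link */
    double per_trace[] = { 0.01, 0.03, 0.12, 0.30, 0.18, 0.02 };

    for (int k = 0; k < 6; k++) {
        enum channel_state s = classify_channel(per_trace[k]);
        printf("step %d: PER = %.2f -> state %d, sample every %d ms\n",
               k, per_trace[k], s, sample_interval_ms[s]);
    }
    return 0;
}
```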