974 results for Main Memory


Relevance:

20.00%

Publisher:

Abstract:

This chapter discusses a code parallelization environment in which a number of tools that address the main tasks, such as code parallelization, debugging, and optimization, are available. The parallelization tools include ParaWise and CAPO, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. The chapter discusses the use of ParaWise and CAPO to transform the original serial code into an equivalent parallel code that contains appropriate OpenMP directives. Additionally, as user involvement can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform near-automatic relative debugging of an OpenMP program that has been parallelized either with the tools or manually. For these tools to be effective across a range of applications, a high-quality, fully interprocedural dependence analysis and user interaction are vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for parallelized NASA codes are discussed and show the benefits of using the environment.
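
The relative-debugging idea described in the abstract can be sketched in a few lines: run the reference (serial) version and the parallel version of a computation and check whether their results diverge beyond floating-point noise. The function names and the chunked "parallel" reduction below are hypothetical, for illustration only; a real tool such as P2d2 compares many intermediate values across the two runs.

```python
def serial_sum_squares(xs):
    """Reference (serial) computation."""
    total = 0.0
    for x in xs:
        total += x * x
    return total

def parallel_sum_squares(xs, chunks=4):
    """Simulated parallel version: partial sums per chunk, then a reduction.
    Floating-point reassociation can make it differ slightly from the serial run."""
    n = len(xs)
    partials = []
    for c in range(chunks):
        lo = c * n // chunks
        hi = (c + 1) * n // chunks
        partials.append(sum(x * x for x in xs[lo:hi]))
    return sum(partials)

def relative_debug(ref, test, rel_tol=1e-12):
    """Flag a divergence larger than expected rounding differences."""
    diff = abs(ref - test)
    return diff <= rel_tol * max(1.0, abs(ref)), diff

xs = [0.1 * i for i in range(1000)]
ok, diff = relative_debug(serial_sum_squares(xs), parallel_sum_squares(xs))
```

Here both versions agree to within rounding, so the check passes; a genuine parallelization error (e.g. a missed privatization) would typically produce a divergence far above the tolerance.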

Relevance:

20.00%

Publisher:

Abstract:

Book review of: Eisen, M.L., Quas, J.A. & Goodman, G.S. (Eds.) (2002). Memory and suggestibility in the forensic interview. Mahwah, NJ: Lawrence Erlbaum Associates. ISBN: 0-8058-3080-4/$55.00 Special Prepaid Price

Relevance:

20.00%

Publisher:

Abstract:

Code parallelization using OpenMP for shared-memory systems is generally easier than using message passing for distributed-memory systems. Despite this, it is still a challenge to use OpenMP to parallelize application codes in a way that yields effective scalable performance when executed on a shared-memory parallel system. We describe an environment that assists the programmer in the various tasks of code parallelization, greatly reducing both the time required and the level of skill needed. The parallelization environment includes a number of tools that address the main tasks of parallelism detection, OpenMP source code generation, debugging, and optimization. These tools include a high-quality, fully interprocedural dependence analysis with user-interaction capabilities to facilitate the generation of efficient parallel code, an automatic relative debugging tool to identify erroneous user decisions in that interaction, and performance profiling to identify bottlenecks. Finally, experiences of parallelizing some NASA application codes are presented to illustrate some of the benefits of using the evolving environment.
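
Parallelism detection of the kind the abstract describes hinges on proving the absence of loop-carried (cross-iteration) dependences. The sketch below is a minimal, hypothetical version of that test: a loop is described by functions giving the set of locations each iteration writes and reads, and the check reports whether any location written in one iteration is touched by a different one. Real dependence analysis works symbolically and interprocedurally; this enumerates iterations only to make the idea concrete.

```python
def has_loop_carried_dependence(writes, reads, n):
    """writes/reads: functions mapping iteration i to the set of array
    locations written/read in that iteration. Returns True if some location
    is written in one iteration and accessed in a different one."""
    writers = {}  # location -> set of iterations that write it
    for i in range(n):
        for loc in writes(i):
            writers.setdefault(loc, set()).add(i)
    for i in range(n):
        for loc in writes(i) | reads(i):
            if writers.get(loc, set()) - {i}:  # touched by another iteration
                return True
    return False

# a(i) = b(i) + c(i)  -> iterations independent, safe to parallelize
independent = has_loop_carried_dependence(
    writes=lambda i: {("a", i)},
    reads=lambda i: {("b", i), ("c", i)},
    n=100)

# a(i) = a(i-1) + b(i)  -> recurrence, not parallelizable as written
recurrence = has_loop_carried_dependence(
    writes=lambda i: {("a", i)},
    reads=lambda i: {("a", i - 1), ("b", i)},
    n=100)
```

The first loop passes the test (each iteration touches only its own elements), while the recurrence fails it because iteration i reads the value iteration i-1 wrote.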

Relevance:

20.00%

Publisher:

Abstract:

The parallelization of real-world, compute-intensive Fortran application codes is generally not a trivial task. If the time to complete the parallelization is to be significantly reduced, an environment is needed that assists the programmer in the various tasks of code parallelization. In this paper the authors present a code parallelization environment in which a number of tools that address the main tasks, such as code parallelization, debugging, and optimization, are available. The ParaWise and CAPO parallelization tools are discussed, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. As user involvement in the parallelization process can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform nearly automatic relative debugging of a program that has been parallelized using the tools. A high-quality interprocedural dependence analysis and user-tool interaction are also highlighted; both are vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results from parallelizing benchmark and real-world application codes are presented and show the benefits of using the environment.
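
The output of a source-to-source parallelizer of the kind described above can be illustrated with a toy transformation: given a Fortran loop already proven independent, wrap it in OpenMP work-sharing directives. The directive text is standard OpenMP Fortran; the analysis verdict and the helper name are assumed for illustration, since tools like CAPO derive the clause lists from the dependence analysis rather than taking them as arguments.

```python
def annotate_parallel_loop(loop_src, private_vars=("i",)):
    """Wrap an (assumed independent) Fortran loop in OpenMP directives,
    the way a source-to-source parallelizer emits its result."""
    private = ", ".join(private_vars)
    return "\n".join([
        f"!$OMP PARALLEL DO PRIVATE({private})",
        loop_src.rstrip(),
        "!$OMP END PARALLEL DO",
    ])

serial_loop = "      do i = 1, n\n         a(i) = b(i) + c(i)\n      end do"
parallel_loop = annotate_parallel_loop(serial_loop)
```

The serial statements are left untouched; only the `!$OMP` sentinel lines are added, which is what lets the same source still compile as serial code when OpenMP is disabled.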

Relevance:

20.00%

Publisher:

Abstract:

This paper explores how new media environments represent and create collective memories of trauma; how creative digital practice can be a key methodology for memory studies and the potential of digital interfaces for representing and reconciling collective memories of trauma, particularly in the context of Cyprus. My project MemoryBank will be used as a model to discuss the potential role of creative digital media practice in both community arts and the formal education process in order to enable participants to engage with the process of peace and reconciliation in Cyprus and circumvent and negotiate politically ossified collective memory narratives and chauvinistic histories. [From the Author]

Relevance:

20.00%

Publisher:

Abstract:

Culloden (BBC, 1964)
The Great War (BBC, 1964)
1914-18 (BBC/KCET, 1996)
Haig: the Unknown Soldier (BBC, 1996)
Veterans: the Last Survivors of the Great War (BBC, 1998)
1900s House (Channel 4, 1999)
The Western Front (BBC, 1999)
History of Britain (BBC, 2000)
1940s House (Channel 4, 2001)
The Ship (BBC, 2002)
Surviving the Iron Age (BBC, 2001)
The Trench (BBC, 2002)
Frontier House (Channel 4, 2002)
Lad's Army (BBC, 2002)
Edwardian Country House (Channel 4, 2002)
Spitfire Ace (Channel 4, 2003)
World War One in Colour (Channel 5, 2003)
1914: the War Revolution (BBC, 2003)
The First World War (Channel 4, 2003)
Dunkirk (BBC, 2004)
Dunkirk: The Soldier's Story (BBC, 2004)
D-Day to Berlin (BBC, 2004)
Bad Lad's Army (ITV, 2004)
Destination D-Day: Raw Recruits (BBC, 2004)
Bomber Crew (Channel 4, 2004)
Battlefield Britain (BBC, 2004)
The Last Battle (ARTE/ZDF, 2005)
Who Do You Think You Are? (BBC, 2004, 2006)
The Somme (Channel 4, 2005)
[From the Publisher]

Relevance:

20.00%

Publisher:

Abstract:

We study information rates of time-varying flat-fading channels (FFC) modeled as finite-state Markov channels (FSMC). FSMCs have two main applications for FFCs: modeling channel error bursts and decoding at the receiver. Our main finding in the first application is that receiver observation noise can more adversely affect higher-order FSMCs than lower-order FSMCs, resulting in lower capacities. This is despite the fact that the underlying higher-order FFC and its corresponding FSMC are more predictable. Numerical analysis shows that at low to medium SNR conditions (SNR ≲ 12 dB) and at medium to fast normalized fading rates (0.01 ≲ f_D T ≲ 0.10), FSMC information rates are non-increasing functions of memory order. We conclude that BERs obtained by low-order FSMC modeling can provide optimistic results. To explain the capacity behavior, we present a methodology that enables analytical comparison of FSMC capacities with different memory orders. We establish sufficient conditions that predict higher/lower capacity of a reduced-order FSMC, compared to its original high-order FSMC counterpart. Finally, we investigate the achievable information rates in FSMC-based receivers for FFCs. We observe that high-order FSMC modeling at the receiver side results in a negligible information rate increase for normalized fading rates f_D T ≲ 0.01.
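
The simplest FSMC of the kind the abstract studies is the two-state (first-order) Gilbert-Elliott model: a "good" state with a low bit-error probability and a "bad" state with a high one, with Markov transitions between them producing error bursts. The sketch below simulates such a chain and compares the empirical error rate with the stationary prediction; all parameter values are illustrative, not taken from the paper.

```python
import random

def simulate_gilbert_elliott(n, p_gb, p_bg, e_good, e_bad, seed=1):
    """Simulate n bits through a 2-state Markov channel.
    p_gb: P(good->bad), p_bg: P(bad->good); e_*: bit-error prob per state."""
    rng = random.Random(seed)
    state = "good"
    errors = 0
    for _ in range(n):
        e = e_good if state == "good" else e_bad
        if rng.random() < e:
            errors += 1
        # Markov state transition for the next bit
        if state == "good":
            if rng.random() < p_gb:
                state = "bad"
        else:
            if rng.random() < p_bg:
                state = "good"
    return errors / n

p_gb, p_bg = 0.02, 0.2
pi_bad = p_gb / (p_gb + p_bg)                      # stationary P(bad)
analytic = (1 - pi_bad) * 0.001 + pi_bad * 0.1     # stationary error rate
empirical = simulate_gilbert_elliott(200_000, p_gb, p_bg, 0.001, 0.1)
```

Because errors cluster in the bad-state dwell times, the empirical rate converges to the stationary mixture while still exhibiting the burstiness that memoryless models miss.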

Relevance:

20.00%

Publisher:

Abstract:

Phytoplankton observation is the product of a number of trade-offs related to sampling processes, required level of diversity and size spectrum analysis capabilities of the techniques involved. Instruments combining the morphological and high-frequency analysis for phytoplankton cells are now available. This paper presents an application of the automated high-resolution flow cytometer Cytosub as a tool for analysing phytoplanktonic cells in their natural environment. High resolution data from a temporal study in the Bay of Marseille (analysis every 30 min over 1 month) and a spatial study in the Southern Indian Ocean (analysis every 5 min at 10 knots over 5 days) are presented to illustrate the capabilities and limitations of the instrument. Automated high-frequency flow cytometry revealed the spatial and temporal variability of phytoplankton in the size range 1 to ~50 μm that could not be resolved otherwise. Due to some limitations (instrumental memory, volume analysed per sample), recorded counts could be statistically too low. By combining high-frequency consecutive samples, it is possible to decrease the counting error, following Poisson's law, and to retain the main features of phytoplankton variability. With this technique, the analysis of phytoplankton variability combines adequate sampling frequency and effective monitoring of community changes.
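
The counting-error argument in the abstract follows directly from Poisson statistics: a count with mean N has standard deviation √N, so its relative error is 1/√N, and pooling k consecutive samples each with mean count λ reduces the relative error to 1/√(kλ). A minimal numeric check, with illustrative values for the per-sample count:

```python
import math

def poisson_relative_error(mean_count):
    """Relative counting error of a Poisson count: sigma/mean = 1/sqrt(mean)."""
    return 1.0 / math.sqrt(mean_count)

lam = 25.0                                     # mean cells counted per sample (illustrative)
single = poisson_relative_error(lam)           # 1/sqrt(25) = 20%
pooled = poisson_relative_error(10 * lam)      # 10 samples pooled: 1/sqrt(250), ~6.3%
```

Pooling ten consecutive analyses thus cuts the counting error by a factor of √10 ≈ 3.2, at the cost of coarsening the time (or space) resolution by the same factor, which is exactly the trade-off the abstract describes.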