957 results for Graphic Processing Units, GPUs
Abstract:
Illustrations are an integral part of many dictionaries, but the selection, placing, and sizing of illustrations are often highly conservative, and can appear to reflect the editorial concerns and technological constraints of previous eras. We might start with the question ‘why not illustrate?’, especially when we consider the ability of an illustration to simplify the definition of technical terms. How do illustrations affect the reader’s view of a dictionary as objective, and how do they reinforce the pedagogic aims of the dictionary? By their graphic nature, illustrations stand out from the field of text against which they are set, and they can immediately indicate to the reader the level of seriousness or popularity of the book’s approach, or the age-range it is intended for. Illustrations are also expensive to create and can add to printing costs, so it is not surprising that there is much direct and indirect copying from dictionary to dictionary, and simple re-use. This article surveys developments in illustrating dictionaries, considering the difference between distributing individual illustrations through the text of the dictionary and grouping them into larger synoptic illustrations; the graphic style of illustrations; and the role of illustrations in ‘feature-led’ dictionary marketing.
Abstract:
The work in graphic communication carried out by Otto Neurath and his associates – now commonly known simply as Isotype – has been the subject of much interest in recent years. Conceived and developed in the 1920s as ‘the Vienna method of pictorial statistics’, this approach to designing information had from its inception the power to grow and spread internationally. Political developments in Europe played their part in its development, and production moved to the Netherlands (1934) and to England (1940), where the Isotype Institute continued to produce work until 1971. Bringing together the latest research, this book is the first comprehensive, detailed account of its subject. The Austrian, Dutch, and English years of Isotype are described here freshly and extensively. There are chapters on the notable extensions of Isotype to Soviet Russia, the USA, and Africa. Isotype work in film and in designing for children is fully documented and discussed. Between these main chapters the book presents interludes documenting Isotype production visually. Three appendices reprint key documents. In its international coverage and its extensions into the wider terrain of history, this book opens a new vista in graphic design.
Abstract:
A survey of the techniques, uses, and meanings of colour overprinting as employed by printers, graphic arts technicians, and graphic designers, principally in the twentieth century.
Abstract:
Much is made of the viscerally disturbing qualities embedded in The Texas Chain Saw Massacre – human bodies are traumatised, mutilated and distorted – and the way these are matched by close and often intense access to the performers involved. Graphic violence focused on the body specifically marks the film as a key contemporary horror text. Yet, for all this closeness to the performers, it soon becomes clear in undertaking close analysis of the film that access to them is equally characterised by extreme distance, both spatially and cognitively. The issue of distance is particularly striking, not least because of its ramifications for engagement, which throws up various aesthetic and methodological questions concerning performers’ expressive authenticity. This article considers the lack of access to performance in The Texas Chain Saw Massacre, paying particular attention to how this fits in with contemporaneous presentations of performance more generally, as seen in films such as Junior Bonner (Sam Peckinpah, 1972). As part of this investigation I consider the effect of such a severe disruption of access on engagement with, and discussion of, performance. At the heart of this investigation lie methodological considerations of the place of performance analysis in the post-studio period. How can we perceive anything of a character’s interior life, and therefore engage with performers to whom we fundamentally lack access? Does such an apparently significant difference in the way performers and their embodiment are treated mean that they can even be thought of as delivering a performance?
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between measured results as necessary.
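As a rough illustration of how such a two-part, benchmark-driven model can be assembled, the Python sketch below interpolates separately measured compute and halo-exchange timings and sums them to predict the cost of a candidate domain decomposition. The benchmark numbers, function names and parameters are invented for the example and are not taken from the paper.

```python
import numpy as np

# Hypothetical benchmark tables: per-rank compute time (s) for a range of
# local subdomain edge lengths, and halo-exchange time (s) for a range of
# message sizes. The values are illustrative, not figures from the paper.
compute_points = np.array([64, 128, 256, 512])           # local grid edge length
compute_times  = np.array([0.002, 0.009, 0.038, 0.155])  # measured time per step

halo_points = np.array([512, 1024, 2048, 4096])           # halo message size (bytes)
halo_times  = np.array([1.2e-5, 1.8e-5, 3.1e-5, 5.6e-5])

def predict_step_time(global_nx, global_ny, px, py, bytes_per_cell=8, halo_width=1):
    """Predict one timestep's runtime for a px-by-py domain decomposition.

    The compute and halo-exchange costs are interpolated independently from
    the benchmark tables above and then summed, mirroring the two-part model.
    """
    local_nx, local_ny = global_nx / px, global_ny / py
    # Use the larger local edge as the lookup key for the compute benchmark.
    compute = np.interp(max(local_nx, local_ny), compute_points, compute_times)
    # Each rank exchanges halos with up to four neighbours.
    msg_bytes = halo_width * max(local_nx, local_ny) * bytes_per_cell
    halo = 4 * np.interp(msg_bytes, halo_points, halo_times)
    return compute + halo

# Compare two candidate decompositions of a 1024x1024 grid over 16 ranks.
for px, py in [(4, 4), (16, 1)]:
    print(f"{px}x{py}: {predict_step_time(1024, 1024, px, py):.4f} s/step")
```

Under these assumed numbers the model simply ranks deployment options by predicted step time; the same structure could be extended with per-scenario benchmarks for core affinity or node under-population.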
Abstract:
Empirical mode decomposition (EMD) is a data-driven method used to decompose data into oscillatory components. This paper examines to what extent the EMD algorithm, as defined, might be sensitive to the floating-point format of the data. Two key issues with EMD are its stability and its computational speed. This paper shows that, for a given signal, there is no significant difference between results obtained with single-precision (binary32) and double-precision (binary64) floating-point formats. This implies that there is no benefit in increasing floating-point precision when performing EMD on devices optimised for the single-precision format, such as graphics processing units (GPUs).
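A minimal way to probe this kind of precision sensitivity is to run the same sifting step on float32 and float64 copies of a signal and compare the resulting component. The Python sketch below extracts a first intrinsic mode function with a basic cubic-spline sifting loop; it is a simplified stand-in for a full EMD implementation, and the test signal, tolerance and iteration limit are illustrative assumptions rather than the paper's setup.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_first_imf(x, max_iter=50, tol=1e-8):
    """Extract the first IMF by basic sifting (subtracting the envelope mean)."""
    h = x.copy()
    t = np.arange(len(x), dtype=x.dtype)
    for _ in range(max_iter):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 4 or len(minima) < 4:
            break  # too few extrema to fit spline envelopes
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        mean_env = ((upper + lower) / 2).astype(x.dtype)
        h_new = h - mean_env
        if np.mean((h_new - h) ** 2) < tol:  # simple stopping criterion
            h = h_new
            break
        h = h_new
    return h

# Two-tone test signal; compare the IMF obtained in each precision.
t = np.linspace(0.0, 1.0, 4096)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

imf64 = sift_first_imf(signal.astype(np.float64))
imf32 = sift_first_imf(signal.astype(np.float32))

print("max |IMF32 - IMF64|:", np.max(np.abs(imf32.astype(np.float64) - imf64)))
```

If the maximum discrepancy stays near single-precision rounding error, the decomposition is effectively insensitive to the format, which is the kind of comparison the paper performs.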
Abstract:
This article traces the intertextual relationships between Anya Ulinich’s graphic novel Lena Finkle’s Magic Barrel, Bernard Malamud’s short story ‘The Magic Barrel’ and a number of works by Philip Roth. Through these relationships and her construction of a number of variations on what Miriam Libicki has called a ‘gonzo self’ Ulinich explores the tensions between life and art, fact and fiction, and autobiography and the novel, mediating the aesthetic imperatives of what Roth has called the ‘written world’ and the ethical obligations of the ‘unwritten world’ in order to arrive at an authentic sense of herself as an artist and writer.
Abstract:
The Tcl/Tk scripting language has become the de facto standard for EDA tools. This paper explains how to start working with Tcl/Tk using simple examples. Two complete applications are presented to show the capabilities of the language in more detail. One script automates the measurement of the average power consumption of a digital system. A second script creates a virtual display driven by the simulation of a graphics card.
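The paper's scripts are written in Tcl/Tk itself; as a rough, language-neutral illustration of the virtual-display idea, the sketch below uses Python's tkinter module (which wraps the same Tk toolkit) to paint a small frame buffer onto a canvas. The resolution, scaling and test pattern are invented for the example; a real script would instead consume pixel data produced by the graphics-card simulation.

```python
import tkinter as tk

# Hypothetical frame-buffer dimensions and on-screen scaling factor.
WIDTH, HEIGHT, SCALE = 64, 48, 8

root = tk.Tk()
root.title("Virtual display (sketch)")
canvas = tk.Canvas(root, width=WIDTH * SCALE, height=HEIGHT * SCALE, bg="black")
canvas.pack()

def draw_pixel(x, y, colour):
    """Draw one frame-buffer pixel as a SCALE x SCALE rectangle."""
    canvas.create_rectangle(x * SCALE, y * SCALE,
                            (x + 1) * SCALE, (y + 1) * SCALE,
                            fill=colour, outline=colour)

# Simple test pattern standing in for pixel values from the simulator.
for y in range(HEIGHT):
    for x in range(WIDTH):
        draw_pixel(x, y, "#%02x%02x%02x" % (x * 4 % 256, y * 5 % 256, (x ^ y) % 256))

root.mainloop()
```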
Abstract:
The present work describes a new tool that helps bidders improve their competitive bidding strategies. It takes the form of an easy-to-use graphical tool that brings more complex decision-analysis techniques within reach in the field of competitive bidding. The graphical tool described here tries to move away from previous bidding models, which attempt to describe the result of an auction or a tender process by modelling each possible bidder with probability density functions. As an illustration, the tool is applied to three practical cases. Theoretical and practical conclusions on the tool's potentially broad range of application are also presented.
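For context, the earlier style of model the abstract refers to treats each competitor's bid as a random variable: given assumed probability density functions for the competitors, the probability of winning with a given bid is the probability that every competitor bids higher, and the markup is chosen to maximise expected profit. The Python sketch below illustrates that classical idea with invented lognormal bid distributions and a normalised cost; none of the numbers come from the article.

```python
import numpy as np
from scipy import stats

# Our estimated project cost, normalised to 1.0, and assumed competitor bid
# distributions (both purely illustrative).
estimated_cost = 1.0
competitors = [stats.lognorm(s=0.10, scale=1.12),
               stats.lognorm(s=0.15, scale=1.20)]

def win_probability(bid):
    """P(win) = product over competitors of P(their bid exceeds our bid)."""
    return np.prod([1.0 - c.cdf(bid) for c in competitors])

def expected_profit(bid):
    return (bid - estimated_cost) * win_probability(bid)

# Scan candidate bids and report the markup that maximises expected profit.
bids = np.linspace(1.0, 1.4, 81)
best = bids[np.argmax([expected_profit(b) for b in bids])]
print(f"markup maximising expected profit: {100 * (best - estimated_cost):.1f}%")
```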