747 results for Unreliable narrator


Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a design paradigm for energy-efficient and variation-aware operation of next-generation multicore heterogeneous platforms. The main idea behind the proposed approach lies in the observation that not all operations are equally important in shaping the output quality of various applications and of the overall system. Based on this observation, we suggest that all levels of the software design stack, including the programming model, compiler, operating system (OS) and run-time system, should identify the critical tasks and ensure their correct operation by assigning them to dynamically adjusted reliable cores/units. Specifically, based on error rates and operating conditions identified by a sense-and-adapt (SeA) unit, the OS selects and sets the right mode of operation for the overall system. The run-time system identifies the critical and less-critical tasks based on special directives and schedules them to the appropriate units, which are dynamically adjusted for highly accurate or approximate operation by tuning their voltage/frequency. Units that execute less significant operations can, if required, operate at voltages below what is needed for fully correct operation and thus consume less power, since such tasks, unlike the critical ones, do not need to be exact at all times. Such a scheme can lead to energy-efficient and reliable operation while reducing the design cost and overheads of conventional circuit/micro-architecture-level techniques.
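As a rough illustration of the scheduling step described above, the following Python sketch maps directive-marked critical tasks to reliable cores and the remaining tasks to low-voltage, approximate cores. The task names, core parameters, and the simple round-robin policy are illustrative assumptions, not the paper's actual interfaces.

    # Minimal sketch of criticality-aware task-to-core mapping (assumed interfaces).
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        critical: bool   # would be set via programmer directives in the proposed model

    @dataclass
    class Core:
        name: str
        reliable: bool   # True: nominal voltage/frequency, exact operation
        vdd: float       # operating voltage in volts (illustrative values only)

    def schedule(tasks, cores):
        """Map critical tasks to reliable cores; send the rest to approximate cores."""
        reliable = [c for c in cores if c.reliable]
        approximate = [c for c in cores if not c.reliable]
        mapping = {}
        for i, t in enumerate(tasks):
            pool = reliable if t.critical else (approximate or reliable)
            mapping[t.name] = pool[i % len(pool)].name
        return mapping

    cores = [Core("big0", True, 0.9), Core("little0", False, 0.6), Core("little1", False, 0.6)]
    tasks = [Task("control_loop", True), Task("pixel_filter", False), Task("feature_extract", False)]
    print(schedule(tasks, cores))  # {'control_loop': 'big0', 'pixel_filter': 'little1', 'feature_extract': 'little0'}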

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we investigate the impact of circuit misbehavior due to parametric variations and voltage scaling on the performance of wireless communication systems. Our study reveals the inherent error resilience of such systems and argues that sufficiently reliable operation can be maintained even in the presence of unreliable circuits and manufacturing defects. We further show how selective application of more robust circuit design techniques is sufficient to deal with high defect rates at low overhead and to improve energy efficiency with negligible degradation in system performance.

Relevance:

20.00%

Publisher:

Abstract:

Future digital signal processing (DSP) systems must provide robustness at the algorithm and application level against the reliability issues that accompany implementations in modern semiconductor process technologies. In this paper, we address this issue by investigating the impact of unreliable memories on general DSP systems. In particular, we propose a novel framework to characterize the effects of unreliable memories, which enables us to devise novel methods to mitigate the associated performance loss. We propose to deploy specifically designed data representations, which can substantially improve system reliability compared to the conventional data representations used in digital integrated circuits, such as 2's-complement or sign-magnitude number formats. To demonstrate the efficacy of the proposed framework, we analyze the impact of unreliable memories on coded communication systems and show that the deployment of optimized data representations substantially improves the error-rate performance of such systems.
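To illustrate why the choice of data representation matters, the toy Python example below compares the worst-case value distortion caused by a single bit flip in an 8-bit word under two conventional formats. The optimized representations proposed in the paper are not reproduced here; the stored value and word width are assumptions for illustration only.

    # Worst-case distortion of a stored value under a single bit flip (toy example).
    def twos_complement_decode(word, bits=8):
        return word - (1 << bits) if word >= (1 << (bits - 1)) else word

    def sign_magnitude_decode(word, bits=8):
        sign = -1 if (word >> (bits - 1)) & 1 else 1
        return sign * (word & ((1 << (bits - 1)) - 1))

    def worst_case_error(word, decode, bits=8):
        """Largest change in the decoded value caused by any single bit flip."""
        ref = decode(word, bits)
        return max(abs(decode(word ^ (1 << b), bits) - ref) for b in range(bits))

    # Store the value -2: word 0xFE in 2's complement, word 0x82 in sign-magnitude.
    print(worst_case_error(0xFE, twos_complement_decode))  # 128 (an MSB flip turns -2 into +126)
    print(worst_case_error(0x82, sign_magnitude_decode))   # 64 (the worst flip turns -2 into -66)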

Relevance:

20.00%

Publisher:

Abstract:

Embedded memories account for a large fraction of the overall silicon area and power consumption in modern SoCs. While embedded memories are typically realized with SRAM, alternative solutions, such as embedded dynamic memories (eDRAM), can provide higher density and/or reduced power consumption. One major challenge that impedes the widespread adoption of eDRAM is the need for frequent refresh, which can reduce the availability of the memory in periods of high activity and consumes a significant amount of power. Reducing the refresh rate lowers this power overhead, but refreshes that are not performed in a timely manner can cause some cells to lose their content, potentially resulting in memory errors. In this paper, we consider extending the refresh period of gain-cell-based dynamic memories beyond the worst-case point of failure, assuming that the resulting errors can be tolerated when the use cases are in the domain of inherently error-resilient applications. For example, we observe that for various data mining applications, a large number of memory failures can be accepted with tolerable imprecision in output quality. In particular, our results indicate that by allowing as many as 177 errors in a 16 kB memory, the maximum loss in output quality is 11%. We use this failure limit to study the impact of relaxing reliability constraints on memory availability and retention power for different technologies.
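For context, the failure budget quoted above corresponds to a raw bit-error rate on the order of 0.1%, assuming the 16 kB figure refers to 16 x 1024 bytes of single-bit cells:

    # Quick arithmetic check of the tolerated failure rate (assumptions as stated above).
    bits = 16 * 1024 * 8            # 131,072 bit-cells in a 16 kB array
    tolerated_errors = 177
    print(tolerated_errors / bits)  # ~0.00135, i.e. roughly a 0.14% raw bit-error rate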

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we introduce a statistical data-correction framework that aims at improving DSP system performance in the presence of unreliable memories. The proposed signal processing framework implements best-effort error mitigation for signals that are corrupted by defects in unreliable storage arrays, using a statistical correction function extracted from the signal statistics, a data-corruption model, and an application-specific cost function. An application example in communication systems demonstrates the efficacy of the proposed approach.
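A minimal sketch of how such a correction function could be instantiated is shown below, assuming a prior over stored 4-bit words as the signal statistics, independent bit flips as the data-corruption model, and squared error as the application-specific cost (which makes the posterior mean the optimal estimate). The paper's actual correction function is not reproduced here.

    # Statistical data correction as a posterior-mean estimate (assumed toy setup).
    import numpy as np

    BITS, P_FLIP = 4, 0.05
    values = np.arange(2 ** BITS)

    # Signal statistics: an assumed prior concentrated on small magnitudes.
    prior = np.exp(-0.5 * values)
    prior /= prior.sum()

    def p_read_given_stored(read, stored):
        """Corruption model: each of the BITS bits flips independently with prob. P_FLIP."""
        flips = bin(read ^ stored).count("1")
        return (P_FLIP ** flips) * ((1 - P_FLIP) ** (BITS - flips))

    def correct(read):
        """Return the posterior-mean estimate, which minimizes expected squared error."""
        likelihood = np.array([p_read_given_stored(read, int(s)) for s in values])
        posterior = likelihood * prior
        posterior /= posterior.sum()
        return float(posterior @ values)

    print(correct(12))  # a large read value is pulled back toward the small prior mean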

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we investigate the impact of faulty memory bit-cells on the performance of LDPC and Turbo channel decoders based on realistic memory failure models. Our study investigates the inherent error resilience of such codes to potential memory faults affecting the decoding process. We develop two mitigation mechanisms that reduce the impact of memory faults rather than correcting every single error. We show how protecting only a few bit-cells is sufficient to deal with high defect rates. In addition, we show how the use of repair iterations specifically helps to mitigate the impact of faults that occur inside the decoder itself.
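As a rough illustration of the selective-protection idea, the Python sketch below hardens only the most significant bit-cells of a stored word and lets faults hit the remaining cells. The word width, fault rate, and the "protect the top k bits" policy are illustrative assumptions rather than the paper's exact mechanism.

    # Toy model of selective bit-cell protection for decoder memories (assumed parameters).
    import random

    WIDTH = 6          # bits per stored soft value (assumed)
    FAULT_RATE = 0.05  # per-bit-cell fault probability (assumed)

    def store_and_read(word, protected_msbs, rng):
        """Flip each unprotected (low-order) bit with probability FAULT_RATE."""
        out = word
        for b in range(WIDTH - protected_msbs):
            if rng.random() < FAULT_RATE:
                out ^= 1 << b
        return out

    rng = random.Random(0)
    llr = 0b101101
    print(bin(store_and_read(llr, protected_msbs=0, rng=rng)))  # any bit may flip
    print(bin(store_and_read(llr, protected_msbs=2, rng=rng)))  # the two MSBs always survive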

Relevance:

20.00%

Publisher:

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning, and data analytics provide opportunities for relaxing reliability requirements and thereby reducing the overhead incurred by conventional error-correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories that meets a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations of lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality against power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area when applied to various data mining applications in a 28 nm process technology.
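The bit-shuffling idea lends itself to a compact sketch: if the faulty cell positions of a word are known (e.g. from a memory test), the data bits can be permuted so that the least significant bits end up in the faulty cells. The Python below illustrates this at per-bit granularity; the fault-map handling and the specific permutation are assumptions made for this example.

    # Sketch of bit-shuffling that steers known-faulty cells to low-significance data bits.
    def make_permutation(faulty_cells, width):
        """Return perm[data_bit] = cell index; low-order data bits go to faulty cells."""
        faulty = sorted(faulty_cells)
        healthy = [c for c in range(width) if c not in faulty]
        return faulty + healthy

    def shuffle(word, perm, width):
        """Scatter data bits into memory cells according to perm (write path)."""
        out = 0
        for data_bit in range(width):
            if (word >> data_bit) & 1:
                out |= 1 << perm[data_bit]
        return out

    def unshuffle(stored, perm, width):
        """Gather memory cells back into data-bit order (read path)."""
        out = 0
        for data_bit in range(width):
            if (stored >> perm[data_bit]) & 1:
                out |= 1 << data_bit
        return out

    WIDTH, FAULTY = 8, [7]                  # suppose the MSB cell of this word is faulty
    perm = make_permutation(FAULTY, WIDTH)
    value = 0b1011_0111
    stored = shuffle(value, perm, WIDTH)
    stored &= ~(1 << 7)                     # model a stuck-at-0 fault in cell 7
    print(bin(unshuffle(stored, perm, WIDTH)))  # 0b10110110: only the data LSB is corrupted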

Relevance:

20.00%

Publisher:

Abstract:

Thesis (Ph. D.)--University of Washington, 1987

Relevance:

20.00%

Publisher:

Abstract:

We distilled research findings on sources of unreliable testimony from children into four principles that capture how the field of forensic developmental psychology conceptualizes this topic. The studies selected to illustrate these principles address three major questions: (a) how do young children perform in eyewitness studies, (b) why are some children less accurate than others, and (c) what phenomena generate unreliable testimony? Throughout our research, our focus is on factors other than lying that produce inaccurate or seemingly inconsistent autobiographical reports. Collectively, this research has shown that (a) children’s eyewitness accuracy is highly dependent on context, (b) neurological immaturity makes children vulnerable to errors under some circumstances, and (c) some children are more swayed by external influences than others. Finally, the diversity of factors that can influence the reliability of children’s testimony dictates that (d) analyzing children’s testimony as if they were adults (i.e., with adult abilities, sensibilities, and motivations) will lead to frequent misunderstandings. It takes considerable knowledge of development, including information about developmental psycholinguistics, memory development, and the gradual emergence of cognitive control, to work with child witnesses and to analyze cases, as there are many sources of unreliable testimony.

Relevance:

20.00%

Publisher:

Abstract:

Subjective and personal forms of nonfiction writing are currently enjoying exponential popularity in English-language publishing, as an interested public engages with ‘true’ stories of society and culture. Yet a paradox exists at the centre of this form of writing. As readers, we want to know who the writer is and what she has to tell us. Yet as writers, we use a persona, a constructed character, a narrator who is only partially the writer, to deliver the narrative. How is a writer able to convey ‘true’ stories that are inherently reliant on memory, within a constructed narrative persona? We find a ‘gap’ between the writer and the narrator/protagonist on the page, an empowered creative space in which composition occurs, facilitating a balance between the facts and lived experiences from which ‘true’ stories are crafted, and the acknowledged fallibility of human memory. While the gap between writer and writer-as-narrator provides an enabling space for creative composition, it also creates space for the perception of unreliability. The width of this gap, we argue, is crucial. Only if the gap is small, if writer and writer-as-narrator share a set of passionately held values, can the writer-as-narrator become a believable entity, satisfying the reader with the ‘truth’ of their story.