44 results for false memories


Relevance: 20.00%

Abstract:

Embedded memories account for a large fraction of the overall silicon area and power consumption of modern SoCs. While embedded memories are typically realized with SRAM, alternative solutions, such as embedded dynamic memories (eDRAM), can provide higher density and/or reduced power consumption. One major challenge impeding the widespread adoption of eDRAM is its need for frequent refresh, which reduces the availability of the memory in periods of high activity and consumes a significant amount of power. Lowering the refresh rate reduces this power overhead, but if refresh is not performed in time, some cells lose their content, potentially resulting in memory errors. In this paper, we consider extending the refresh period of gain-cell-based dynamic memories beyond the worst-case point of failure, assuming that the resulting errors can be tolerated when the use cases lie in the domain of inherently error-resilient applications. For example, we observe that for various data-mining applications, a large number of memory failures can be accepted with tolerable imprecision in output quality. In particular, our results indicate that by allowing as many as 177 errors in a 16 kB memory, the maximum loss in output quality is 11%. We use this failure limit to study the impact of relaxed reliability constraints on memory availability and retention power for different technologies.
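The trade-off described above can be illustrated with a small Monte Carlo sketch. The log-normal retention-time distribution and its parameters below are assumptions for illustration only, not the paper's measured data; just the 16 kB memory size and the 177-error budget come from the abstract.

```python
import random

MEM_BITS = 16 * 1024 * 8       # 16 kB memory, as in the abstract
ERROR_BUDGET = 177             # tolerable number of failing cells (abstract)

def simulate_failures(refresh_period_ms, rng):
    """Count cells whose retention time is shorter than the refresh period.
    Retention times are drawn from an ASSUMED log-normal distribution
    (median ~20 ms) -- purely illustrative, not measured data."""
    failures = 0
    for _ in range(MEM_BITS):
        retention_ms = rng.lognormvariate(3.0, 0.5)  # assumed parameters
        if retention_ms < refresh_period_ms:
            failures += 1
    return failures

rng = random.Random(0)
for period in (3.0, 4.5, 6.0):
    f = simulate_failures(period, rng)
    print(f"refresh period {period} ms -> {f} failing cells "
          f"({'within' if f <= ERROR_BUDGET else 'over'} budget)")
```

Under these assumed parameters, stretching the refresh period trades a growing (but initially tiny) failure count against proportionally fewer refresh cycles.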

Relevance: 20.00%

Abstract:

In the United Kingdom (UK), the centenary commemoration of the First World War has been driven by a combination of central government direction (and funding) and a multitude of local and community initiatives, with a particular focus on 4 August 2014, 1 July 2016 (the beginning of the Battle of the Somme) and 11 November 2018. ‘National’ ceremonies on these dates have been and will be supplemented by projects commemorating micro-stories and by government-funded opportunities for schoolchildren to visit Great War battlefields, the latter clearly aimed at reinforcing a contemporary sense of civic and national obligation and service. This article explores the problematic nature of this approach, together with the issues raised by the multi-national nature of the UK state itself.

Relevance: 20.00%

Abstract:

Current variation-aware design methodologies, tuned for worst-case scenarios, are becoming increasingly pessimistic from the perspective of power and performance. A good example of such pessimism is setting the refresh rate of DRAMs according to worst-case access statistics, resulting in very frequent refresh cycles that are responsible for the majority of the standby power consumption of these memories. However, such a high refresh rate may not be required, either because the worst case occurs with extremely low probability, or because many applications are inherently error resilient and can tolerate a certain number of potential failures. In this paper, we exploit and quantify the opportunities that exist in dynamic memory design by shifting to the so-called approximate computing paradigm, in order to save power and enhance yield at no cost. The statistical characteristics of retention time in dynamic memories were revealed by studying a fabricated 2 kb CMOS-compatible embedded DRAM (eDRAM) array based on gain cells. Measurements show that up to 73% of the retention power can be saved by setting the refresh time such that a small number of failures is allowed. We show that these savings can be further increased by utilizing known circuit techniques, such as body biasing, which can help not only extend but also preferably shape the retention time distribution. Our approach is one of the first attempts to assess the data-integrity and energy trade-offs achievable in eDRAMs used for error-resilient applications, and can prove helpful in the anticipated shift to approximate computing.
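A minimal sketch of the refresh-relaxation arithmetic: retention power scales with refresh frequency (i.e. 1/period), so letting the few weakest cells fail lengthens the refresh period and cuts power. The log-normal retention distribution and the error budget below are illustrative assumptions, not the measured silicon data from the 2 kb array.

```python
import random

# Draw per-cell retention times from an ASSUMED log-normal distribution
# for a 2 kb array (illustrative stand-in for the measured distribution).
rng = random.Random(42)
retention = sorted(rng.lognormvariate(3.0, 0.5) for _ in range(2048))

t_worst = retention[0]                   # refresh fast enough for the weakest cell
allowed_failures = 8                     # hypothetical error budget
t_relaxed = retention[allowed_failures]  # the 8 weakest cells may now fail

# Refresh (retention) power scales with refresh frequency, i.e. 1/period.
saving = 1.0 - t_worst / t_relaxed
print(f"worst-case period {t_worst:.2f} ms, relaxed {t_relaxed:.2f} ms")
print(f"retention-power saving ~ {saving:.0%}")
```

Because the left tail of a log-normal distribution is long, even a tiny failure budget moves the refresh period well away from the single weakest cell, which is the effect the abstract quantifies at up to 73%.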

Relevance: 20.00%

Abstract:

In this paper, we introduce a statistical data-correction framework that aims to improve DSP system performance in the presence of unreliable memories. The proposed signal-processing framework implements best-effort error mitigation for signals corrupted by defects in unreliable storage arrays, using a statistical correction function extracted from the signal statistics, a data-corruption model, and an application-specific cost function. An application example from communication systems demonstrates the efficacy of the proposed approach.
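As one possible reading of such a framework, the sketch below implements a posterior-mean ("best-effort") correction of an 8-bit value read from a memory with independent bit flips. The Gaussian prior, the flip probability, and the implicit squared-error cost are illustrative assumptions standing in for the signal statistics, corruption model, and cost function named in the abstract.

```python
import math

P_FLIP = 0.05   # ASSUMED per-bit failure probability of the memory
SIGMA = 20.0    # ASSUMED Gaussian signal prior width (8-bit data, mean 128)

def prior(x):
    """Unnormalized Gaussian prior over stored 8-bit values."""
    return math.exp(-(x - 128) ** 2 / (2 * SIGMA ** 2))

def likelihood(read, stored):
    """Probability of reading `read` when `stored` was written, under an
    independent bit-flip corruption model."""
    flips = bin(read ^ stored).count("1")
    return (P_FLIP ** flips) * ((1 - P_FLIP) ** (8 - flips))

def mmse_correct(read):
    """Best-effort correction: posterior-mean estimate of the stored value
    (minimizes expected squared-error cost)."""
    num = den = 0.0
    for x in range(256):
        w = prior(x) * likelihood(read, x)
        num += w * x
        den += w
    return num / den

# A read of 250 is implausible under the prior; the corrector attributes it
# to a likely MSB-region flip and pulls the estimate back toward the prior.
print(round(mmse_correct(250)))
```

The design point worth noting is that nothing is decoded exactly: the corrector trades occasional residual error for zero storage redundancy, which matches the "best-effort" framing of the abstract.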

Relevance: 20.00%

Abstract:

The area and power consumption of low-density parity-check (LDPC) decoders are typically dominated by embedded memories. To alleviate these high memory costs, this paper exploits the fact that all internal memories of an LDPC decoder are frequently updated with new data. These unique memory access statistics are taken advantage of by replacing all static standard-cell based memories (SCMs) of a prior-art LDPC decoder implementation with dynamic SCMs (D-SCMs), which are designed to retain data just long enough to guarantee reliable operation. The use of D-SCMs leads to a 44% reduction in the silicon area of the LDPC decoder compared to static SCMs. The low-power LDPC decoder architecture with refresh-free D-SCMs was implemented in a 90 nm CMOS process, and silicon measurements show full functionality and an information-bit throughput of up to 600 Mbps (as required by the IEEE 802.11n standard).
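The refresh-free correctness condition behind D-SCMs can be stated in a few lines: every address must be rewritten before its retention window expires. The retention value and access schedule below are assumptions for illustration, not the paper's measured figures.

```python
# Sketch of the D-SCM correctness check: a dynamic cell holds data only for
# RETENTION_CYCLES, so a refresh-free design is valid iff the decoder's
# write schedule revisits every address within that window.
RETENTION_CYCLES = 64   # ASSUMED data-retention time of a D-SCM cell

def max_write_gap(schedule, num_addrs):
    """Longest gap (in cycles) between consecutive writes to any address,
    for a decoder that writes one address per cycle."""
    last = {a: None for a in range(num_addrs)}
    worst = 0
    for t, addr in enumerate(schedule):
        if last[addr] is not None:
            worst = max(worst, t - last[addr])
        last[addr] = t
    return worst

# A decoder sweeping 32 addresses round-robin rewrites each one every 32
# cycles -- comfortably inside the assumed retention window, so no refresh
# logic is needed.
schedule = [t % 32 for t in range(256)]
print(max_write_gap(schedule, 32) <= RETENTION_CYCLES)
```

This is why the technique hinges on the decoder's access statistics: the guarantee comes from the algorithm's own update pattern rather than from dedicated refresh circuitry.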

Relevance: 20.00%

Abstract:

In this paper, we investigate the impact of faulty memory bit-cells on the performance of LDPC and turbo channel decoders, based on realistic memory failure models. Our study examines the inherent resilience of such codes to potential memory faults affecting the decoding process. We develop two mitigation mechanisms that reduce the impact of memory faults rather than correcting every single error, and show that protecting only a few bit-cells is sufficient to deal with high defect rates. In addition, we show how the use of repair iterations specifically helps mitigate the impact of faults that occur inside the decoder itself.

Relevance: 20.00%

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning, and data analytics provide opportunities for relaxing reliability requirements, thereby reducing the overhead incurred by conventional error-correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories that meets a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults in bit locations of lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area, and timing overhead. Compared to error-correction codes, it can reduce overhead by as much as 83% in read power, 77% in read access time, and 89% in area, when applied to various data-mining applications in a 28 nm process technology.
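The bit-shuffling idea can be sketched in a few lines: remap logical bit positions so that known-faulty physical cells hold only the least significant logical bits. The mapping scheme below is an illustrative simplification, not the paper's hardware implementation.

```python
def shuffle_map(faulty_bits, width=8):
    """Map logical bit positions to physical ones so that faulty physical
    cells end up holding the least significant logical bits (illustrative
    sketch of the bit-shuffling idea)."""
    healthy = [b for b in range(width) if b not in faulty_bits]
    order = healthy + sorted(faulty_bits)
    # Pair logical bits MSB-first with healthy physical cells first.
    return {logical: physical
            for logical, physical in zip(range(width - 1, -1, -1), order)}

def store(value, mapping):
    """Permute a value's bits into their assigned physical positions."""
    word = 0
    for logical, physical in mapping.items():
        if (value >> logical) & 1:
            word |= 1 << physical
    return word

def load(word, mapping):
    """Inverse permutation: recover the logical value from a physical word."""
    value = 0
    for logical, physical in mapping.items():
        if (word >> physical) & 1:
            value |= 1 << logical
    return value

m = shuffle_map({7})            # suppose physical bit 7 is a faulty cell
w = store(201, m)
w &= ~(1 << 7)                  # the fault forces physical bit 7 to 0
print(load(w, m))               # prints 200: error confined to the logical LSB
```

Without shuffling, a stuck physical MSB could cost up to 128 in output magnitude; with it, the same fault costs at most 1, which is the error-magnitude skew the abstract describes.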

Relevance: 20.00%

Abstract:

Illegal, Unreported and Unregulated (IUU) fishing has played a major role in the overexploitation of global fish populations. In response, international regulations have been imposed and many fisheries have been 'eco-certified' by consumer organizations, but methods for independent control of catch certificates and eco-labels are urgently needed. Here we show that, by using gene-associated single nucleotide polymorphisms, individual marine fish can be assigned back to their population of origin with unprecedented precision. By applying highly differentiated single nucleotide polymorphism assays to four commercial marine fish species on a pan-European scale, we find that 93-100% of individuals could be correctly assigned to origin in policy-driven case studies. We show how case-targeted single nucleotide polymorphism assays can be created and forensically validated using a centrally maintained and publicly available database. Our results demonstrate how the application of gene-associated markers is likely to revolutionize origin assignment and provide highly valuable tools for fighting illegal fishing and mislabelling worldwide.

Relevance: 20.00%

Abstract:

Connor, M.C., Fairley, D.J., Marks, N.J. and McGrath, J.W. (2016) Clostridium difficile Ribotype 023 lacks the ability to hydrolyse esculin, leading to false negative results on chromogenic agar. Letters in Applied Microbiology.

Relevance: 20.00%

Abstract:

OBJECTIVE/BACKGROUND: Many associations between abdominal aortic aneurysm (AAA) and genetic polymorphisms have been reported. It is unclear which are genuine and which may be caused by type 1 errors, biases, and flexible study design. The objectives of the study were to identify associations supported by current evidence and to investigate the effect of study design on reporting associations.

METHODS: Data sources were MEDLINE, Embase, and Web of Science. Reports were dual-reviewed for relevance and inclusion against predefined criteria (studies of genetic polymorphisms and AAA risk). Study characteristics and data were extracted using an agreed tool, and reports were assessed for quality. Heterogeneity was assessed using I², and fixed- and random-effects meta-analyses were conducted for variants that were reported at least twice, if any report had found an association. Strength of evidence was assessed using a standard guideline.

RESULTS: Searches identified 467 unique articles, of which 97 were included. Of these 97 studies, 63 reported at least one association. Of the 92 studies that conducted multiple tests, only 27% corrected their analyses. In total, 263 genes were investigated, and associations were reported for polymorphisms in 87 genes. Associations in CDKN2BAS, SORT1, LRP1, IL6R, MMP3, AGTR1, ACE, and APOA1 were supported by meta-analyses.

CONCLUSION: Uncorrected multiple testing and flexible study design (particularly testing many inheritance models and subgroups, and failure to check for Hardy-Weinberg equilibrium) contributed to apparently false associations being reported. Heterogeneity, possibly due to the case mix, geographical, temporal, and environmental variation between different studies, was evident. Polymorphisms in nine genes had strong or moderate support on the basis of the literature at this time. Suggestions are made for improving AAA genetics study design and conduct.

Relevance: 20.00%

Abstract:

In Northern Ireland, decades of religious and political unrest led to the marginalization not only of the rights but also of the experiences and voices of those who identify as Lesbian, Gay, Bisexual, Trans and/or Queer (LGBTQ). The peace process has arguably created space in which sexual minorities can voice their experiences and articulate counter-memories to those that tend to dominate ethno-nationalist commemorations of the conflict. This essay explores two productions of Northern Ireland’s first publicly funded gay theatre company, TheatreofplucK, led by artistic director Niall Rea: D.R.A.G (Divided, Radical and Gorgeous), first performed in 2011, explores the personal experiences of a Belfast drag queen in the form of a personal testimonial monologue. The forthcoming (November 2015) performed archive installation, Trouble, by Shannon Yee, assembles true-life testimonies of the LGBTQ community in Northern Ireland during and after the Troubles. I will explore how performed and performative memories have the potential to ‘queer’ remembrance of the Troubles.

Relevance: 20.00%

Abstract:

Memory is thought to be about the past. The past is a problem in conflict transformation. This lecture suggests memory can also be about the future. It introduces the notion of remembering forwards, which is contrasted with remembering backwards. The distinction between these two forms of remembering defines the burden of memory in post-conflict societies generally, and specifically in Ireland. In societies emerging out of conflict, where divided memories in part constituted the conflict, social memory privileges remembering backwards. Collective and personal memories elide within social memory to perpetuate divided group identities and contested personal narratives. Above all, social memory works to arbitrate the future by predisposing an extreme memory culture that locks people into the past. Forgetting the past is impossible and undesirable. What is needed in societies emerging out of conflict is release from the hold that oppressive and haunting memories have over people. This lecture suggests that such release is found in the idea of remembering forwards. This is not the same as forgetting. It does not involve non-remembrance but active remembering: remembering to cease to remember the past. While the past lives in us always, remembering forwards assists us in not living in the past. Remembering forwards thus allows us to live in tolerance in the future despite the reality that divided memories endure and live on. The lecture further argues that these enduring divided memories need to be reimagined through the application of truth, tolerance, togetherness and trajectory. It is through remembering forwards with truth, tolerance, togetherness and trajectory that people in post-conflict societies can inherit the future despite their divided pasts and live in tolerance in the midst of contested memories.