935 results for Memory-based


Relevance:

30.00%

Publisher:

Abstract:

Multiple Table Lookup architectures in Software Defined Networking (SDN) open the door for exciting new network applications. The development of the OpenFlow protocol supported the SDN paradigm. However, the first version of the OpenFlow protocol specified a single table lookup model with the associated constraints in flow entry numbers and search capabilities. With the introduction of multiple table lookup in OpenFlow v1.1, flexible and efficient search to support SDN application innovation became possible. However, implementation of multiple table lookup in hardware to meet high performance requirements is non-trivial. One possible approach involves the use of multi-dimensional lookup algorithms. A high lookup performance can be achieved by using embedded memory for flow entry storage. A detailed study of OpenFlow flow filters for multi-dimensional lookup is presented in this paper. Based on a proposed multiple table lookup architecture, the memory consumption and update performance using parallel single field searches are evaluated. The results demonstrate an efficient multi-table lookup implementation with minimum memory usage.
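The parallel single-field search evaluated above can be sketched in software as a decomposition-style lookup: each header field is searched in its own table (in parallel, in hardware) and the matching flow entries are the intersection of the per-field results. The field names, wildcard handling and rules below are illustrative assumptions, not details from the paper.

```python
class FieldTable:
    """Single-field exact-match table: value -> set of matching rule ids."""
    def __init__(self):
        self.entries = {}
        self.wildcard = set()  # rules whose filter leaves this field unspecified

    def add(self, value, rule_id):
        if value is None:          # wildcard entry for this field
            self.wildcard.add(rule_id)
        else:
            self.entries.setdefault(value, set()).add(rule_id)

    def search(self, value):
        # Matches are the exact hits plus every wildcard rule
        return self.entries.get(value, set()) | self.wildcard


class FlowTable:
    """Flow filters decomposed into one single-field table per header field."""
    def __init__(self, fields):
        self.tables = {f: FieldTable() for f in fields}

    def add_rule(self, rule_id, **field_values):
        for f, tbl in self.tables.items():
            tbl.add(field_values.get(f), rule_id)

    def lookup(self, **packet):
        # Each field is searched independently; the matching rules are
        # the intersection of the per-field result sets.
        result = None
        for f, tbl in self.tables.items():
            matches = tbl.search(packet[f])
            result = matches if result is None else result & matches
        return result


table = FlowTable(["ip_src", "tcp_dst"])
table.add_rule(1, ip_src="10.0.0.1", tcp_dst=80)
table.add_rule(2, ip_src="10.0.0.1")  # tcp_dst left as wildcard
```

A real implementation would replace the exact-match dictionaries with range or prefix search structures per field; the intersection step is what a hardware design would realise with bit vectors.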

Relevance:

30.00%

Publisher:

Abstract:

Background The use of technology in healthcare settings is on the increase and may represent a cost-effective means of delivering rehabilitation. Reductions in treatment time, and delivery in the home, are also thought to be benefits of this approach. Children and adolescents with brain injury often experience deficits in memory and executive functioning that can negatively affect their school work, social lives, and future occupations. Effective interventions that can be delivered at home, without the need for high-cost clinical involvement, could provide a means to address a current lack of provision. We have systematically reviewed studies examining the effects of technology-based interventions for the rehabilitation of deficits in memory and executive functioning in children and adolescents with acquired brain injury. Objectives To assess the effects of technology-based interventions compared to placebo intervention, no treatment, or other types of intervention, on the executive functioning and memory of children and adolescents with acquired brain injury. Search methods We ran the search on the 30 September 2015. We searched the Cochrane Injuries Group Specialised Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Ovid MEDLINE(R), Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations, Ovid MEDLINE(R) Daily and Ovid OLDMEDLINE(R), EMBASE Classic + EMBASE (OvidSP), ISI Web of Science (SCI-EXPANDED, SSCI, CPCI-S, and CPSI-SSH), CINAHL Plus (EBSCO), two other databases, and clinical trials registers. We also searched the internet, screened reference lists, and contacted authors of included studies. Selection criteria Randomised controlled trials comparing the use of a technological aid for the rehabilitation of children and adolescents with memory or executive-functioning deficits with placebo, no treatment, or another intervention. 
Data collection and analysis Two review authors independently reviewed titles and abstracts identified by the search strategy. Following retrieval of full-text manuscripts, two review authors independently performed data extraction and assessed the risk of bias. Main results Four studies (involving 206 participants) met the inclusion criteria for this review. Three studies, involving 194 participants, assessed the effects of online interventions to target executive functioning (that is monitoring and changing behaviour, problem solving, planning, etc.). These studies, which were all conducted by the same research team, compared online interventions against a 'placebo' (participants were given internet resources on brain injury). The interventions were delivered in the family home with additional support or training, or both, from a psychologist or doctoral student. The fourth study investigated the use of a computer program to target memory in addition to components of executive functioning (that is attention, organisation, and problem solving). No information on the study setting was provided, however a speech-language pathologist, teacher, or occupational therapist accompanied participants. Two studies assessed adolescents and young adults with mild to severe traumatic brain injury (TBI), while the remaining two studies assessed children and adolescents with moderate to severe TBI. Risk of bias We assessed the risk of selection bias as low for three studies and unclear for one study. Allocation bias was high in two studies, unclear in one study, and low in one study. Only one study (n = 120) was able to conceal allocation from participants, therefore overall selection bias was assessed as high. One study took steps to conceal assessors from allocation (low risk of detection bias), while the other three did not do so (high risk of detection bias). 
Primary outcome 1: Executive functioning: Technology-based intervention versus placebo Results from meta-analysis of three studies (n = 194) comparing online interventions with a placebo for children and adolescents with TBI, favoured the intervention immediately post-treatment (standardised mean difference (SMD) -0.37, 95% confidence interval (CI) -0.66 to -0.09; P = 0.62; I2 = 0%). (As there is no 'gold standard' measure in the field, we have not translated the SMD back to any particular scale.) This result is thought to represent only a small to medium effect size (using Cohen’s rule of thumb, where 0.2 is a small effect, 0.5 a medium one, and 0.8 or above is a large effect); this is unlikely to have a clinically important effect on the participant. The fourth study (n = 12) reported differences between the intervention and control groups on problem solving (an important component of executive functioning). No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. The quality of evidence for this outcome according to GRADE was very low. This means future research is highly likely to change the estimate of effect. Primary outcome 2: Memory One small study (n = 12) reported a statistically significant difference in improvement in sentence recall between the intervention and control group following an eight-week remediation programme. No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. Secondary outcomes Two studies (n = 158) reported on anxiety/depression as measured by the Child Behavior Checklist (CBCL) and were included in a meta-analysis. We found no evidence of an effect with the intervention (mean difference -5.59, 95% CI -11.46 to 0.28; I2 = 53%). The GRADE quality of evidence for this outcome was very low, meaning future research is likely to change the estimate of effect. A single study sought to record adverse events and reported none. 
Two studies reported on use of the intervention (range 0 to 13 and 1 to 24 sessions). One study reported on social functioning/social competence and found no effect. The included studies reported no data for other secondary outcomes (that is quality of life and academic achievement). Authors' conclusions This review provides low-quality evidence for the use of technology-based interventions in the rehabilitation of executive functions and memory for children and adolescents with TBI. As all of the included studies contained relatively small numbers of participants (12 to 120), our findings should be interpreted with caution. The involvement of a clinician or therapist, rather than use of the technology, may have led to the success of these interventions. Future research should seek to replicate these findings with larger samples, in other regions, using ecologically valid outcome measures, and reduced clinician involvement.
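The standardised mean difference used in the meta-analysis above divides the difference in group means by the pooled standard deviation, so results measured on different scales can be combined. A minimal sketch follows; the group means, SDs and sizes are made-up illustrative numbers, not data from the included studies (which pooled to SMD -0.37).

```python
import math

def standardised_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d: difference in group means divided by the pooled SD."""
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    )
    return (m1 - m2) / pooled_sd

# Illustrative numbers only: intervention group scores 4 points lower
# (lower = fewer executive-function problems) on a scale with SD 10.
d = standardised_mean_difference(48.0, 10.0, 60, 52.0, 10.0, 60)
```

By Cohen's rule of thumb quoted above, the resulting d of -0.4 would sit between a small (0.2) and medium (0.5) effect, mirroring the review's interpretation of -0.37.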

Relevance:

30.00%

Publisher:

Abstract:

Background
The use of multiple medicines (polypharmacy) is increasingly common in older people. Ensuring that patients receive the most appropriate combinations of medications (appropriate polypharmacy) is a significant challenge. The quality of evidence to support the effectiveness of interventions to improve appropriate polypharmacy is low. Systematic identification of mediators of behaviour change, using the Theoretical Domains Framework (TDF), provides a theoretically robust evidence base to inform intervention design. This study aimed to (1) identify key theoretical domains that were perceived to influence the prescribing and dispensing of appropriate polypharmacy to older patients by general practitioners (GPs) and community pharmacists, and (2) map domains to associated behaviour change techniques (BCTs) to include as components of an intervention to improve appropriate polypharmacy in older people in primary care.

Methods
Semi-structured interviews were conducted with members of each healthcare professional (HCP) group using tailored topic guides based on TDF version 1 (12 domains). Questions covering each domain explored HCPs’ perceptions of barriers and facilitators to ensuring the prescribing and dispensing of appropriate polypharmacy to older people. Interviews were audio-recorded and transcribed verbatim. Data analysis involved the framework method and content analysis. Key domains were identified and mapped to BCTs based on established methods and discussion within the research team.

Results
Thirty HCPs were interviewed (15 GPs, 15 pharmacists). Eight key domains were identified, perceived to influence prescribing and dispensing of appropriate polypharmacy: ‘Skills’, ‘Beliefs about capabilities’, ‘Beliefs about consequences’, ‘Environmental context and resources’, ‘Memory, attention and decision processes’, ‘Social/professional role and identity’, ‘Social influences’ and ‘Behavioural regulation’. Following mapping, four BCTs were selected for inclusion in an intervention for GPs or pharmacists: ‘Action planning’, ‘Prompts/cues’, ‘Modelling or demonstrating of behaviour’ and ‘Salience of consequences’. An additional BCT (‘Social support or encouragement’) was selected for inclusion in a community pharmacy-based intervention in order to address barriers relating to interprofessional working that were encountered by pharmacists.

Conclusions
Selected BCTs will be operationalised in a theory-based intervention to improve appropriate polypharmacy for older people, to be delivered in GP practice and community pharmacy settings. Future research will involve development and feasibility testing of this intervention.

Relevance:

30.00%

Publisher:

Abstract:

The worsening of process variations, and the consequent increased spreads in circuit performance and consumed power, hinder the satisfaction of the targeted budgets and lead to yield loss. Corner-based design and the adoption of design guardbands may limit the yield loss. However, in many cases such methods cannot capture the real effects, which may be far better than the predicted ones, leading to increasingly pessimistic designs. The situation is even more severe in memories, which consist of substantially different individual building blocks, further complicating the accurate analysis of the impact of variations at the architecture level and leaving many potential issues uncovered and opportunities unexploited. In this paper, we develop a framework for capturing non-trivial statistical interactions among all the components of a memory/cache. The developed tool is able to find the optimum memory/cache configuration under various constraints, allowing designers to make the right choices early in the design cycle and consequently improve performance, energy, and especially yield. Our results indicate that considering the architectural interactions between the memory components allows the pessimistic access times predicted by existing techniques to be relaxed.
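Why statistical analysis relaxes corner-based access times can be seen with a Monte Carlo sketch: a corner-based bound assumes every sub-block of the access path is simultaneously at its 3-sigma worst case, whereas independent variations rarely all coincide. The block count, delay mean and sigma below are illustrative assumptions, not figures from the paper.

```python
import random

random.seed(0)

MEAN, SIGMA, N_BLOCKS = 1.0, 0.1, 4   # illustrative per-block delay (ns)

# Corner-based bound: every block assumed at its 3-sigma worst case at once.
corner_bound = N_BLOCKS * (MEAN + 3 * SIGMA)

# Statistical bound: 99.9th percentile of the sampled access-path delay,
# with independent per-block variation.
samples = sorted(
    sum(random.gauss(MEAN, SIGMA) for _ in range(N_BLOCKS))
    for _ in range(100_000)
)
stat_bound = samples[int(0.999 * len(samples))]
```

With these numbers the statistical 99.9% bound lands well below the corner bound of 5.2 ns, which is the kind of slack the framework exposes (real memories add correlated variation and heterogeneous blocks, which is exactly what the proposed tool models).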

Relevance:

30.00%

Publisher:

Abstract:

The area and power consumption of low-density parity check (LDPC) decoders are typically dominated by embedded memories. To alleviate such high memory costs, this paper exploits the fact that all internal memories of an LDPC decoder are frequently updated with new data. These unique memory access statistics are taken advantage of by replacing all static standard-cell based memories (SCMs) of a prior-art LDPC decoder implementation with dynamic SCMs (D-SCMs), which are designed to retain data just long enough to guarantee reliable operation. The use of D-SCMs leads to a 44% reduction in the silicon area of the LDPC decoder compared to the use of static SCMs. The low-power LDPC decoder architecture with refresh-free D-SCMs was implemented in a 90 nm CMOS process, and silicon measurements show full functionality and an information bit throughput of up to 600 Mbps (as required by the IEEE 802.11n standard).
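The refresh-free condition behind the D-SCM idea can be stated in a few lines: a dynamic cell only needs to hold its value until the decoder next overwrites it, so no refresh logic is needed as long as the worst-case write interval is shorter than the cell's retention time. The clock rate, iteration length and retention figure below are illustrative assumptions, not measurements from the paper.

```python
# Illustrative parameters (NOT from the paper)
CLOCK_MHZ = 400
CYCLES_PER_ITERATION = 50   # every memory word rewritten once per iteration
RETENTION_US = 1.0          # worst-case data retention of a dynamic SCM cell

# Longest time any stored value must survive between writes
update_interval_us = CYCLES_PER_ITERATION / CLOCK_MHZ

# Refresh-free operation is safe when updates outpace charge leakage
refresh_free = update_interval_us <= RETENTION_US
```

The area saving then comes for free: the latch/keeper circuitry of a static SCM cell is dropped, and correctness rests on this timing inequality holding across process, voltage and temperature corners.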

Relevance:

30.00%

Publisher:

Abstract:

Realising memory-intensive applications such as image and video processing on FPGA requires the creation of complex, multi-level memory hierarchies to achieve real-time performance; however, commercial High Level Synthesis tools are unable to automatically derive such structures and hence cannot meet the demanding bandwidth and capacity constraints of these applications. Current approaches to solving this problem can only derive either single-level memory structures or very deep, highly inefficient hierarchies, leading in either case to high implementation cost, low performance, or both. This paper presents an enhancement to an existing MC-HLS synthesis approach which solves this problem; it exploits and eliminates data duplication at multiple levels of the generated hierarchy, leading to a reduction in the number of levels and ultimately to higher-performance, lower-cost implementations. When applied to the synthesis of C-based Motion Estimation, Matrix Multiplication and Sobel Edge Detection applications, this enables reductions in Block RAM and Look Up Table (LUT) cost of up to 25%, whilst simultaneously increasing throughput.
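The kind of duplication being eliminated can be illustrated with a classic window-operator hierarchy (as used in Sobel filtering): a naive two-level design stores full image rows at one level and then stores the window pixels again at the next, even though those pixels are already held in the row buffers. The image width and window size are illustrative assumptions, not figures from the paper.

```python
# Storage (in pixels) for a K x K sliding-window operator over rows
# of WIDTH pixels -- illustrative sizes, not the paper's benchmarks.
WIDTH, K = 640, 3

# Naive two-level hierarchy: level 1 buffers K full rows, level 2 holds a
# K x K window whose pixels duplicate data already present in level 1.
naive = K * WIDTH + K * K

# Duplication-aware hierarchy: the window registers are fed directly from
# the row buffers, so only K - 1 rows plus the window need distinct storage.
optimised = (K - 1) * WIDTH + K * K

saving = naive - optimised   # one full row of duplicated pixels removed
```

Scaled across several hierarchy levels and larger frames, removing such duplicated buffers is what yields the Block RAM and LUT reductions reported above.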

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the position of planning practices operated under precise guidelines for displaying modernity. Cultivating the spatial qualities of Cairo since the 1970s has unveiled centralised ideologies and systems of governance and economic incentives. I present a discussion of the wounds that result from the inadequate upgrading ventures in Cairo which, I argue, created scars as enduring evidence of unattainable planning methods and processes that undermined its locales. In this process, the paper focuses on the consequences of eviction rather than the planning methods in one of the city’s traditional districts. Empirical work is based on interdisciplinary research, public media reports and archival maps that document actions and procedures put in place to alter the visual, urban, and demographic characteristics of Cairo’s older neighbourhoods against a backdrop of decay, to shift towards a global spectacular. The paper builds a conversation about the power and fate these spaces were subject to during hostile transformations that ended with their being disused. Their existence became associated with sores on the souls of their ex-inhabitants, as outward signs of inward scars showcasing a lack of equality and social justice in a context where it was much needed.

Relevance:

30.00%

Publisher:

Abstract:

With the availability of a wide range of cloud Virtual Machines (VMs), it is difficult to determine which VMs can maximise the performance of an application. Benchmarking is commonly used to this end to capture the performance of VMs. Most cloud benchmarking techniques are heavyweight: time-consuming processes that have to benchmark the entire VM in order to obtain accurate benchmark data. Such benchmarks cannot be used in real time on the cloud and incur extra costs even before an application is deployed.

In this paper, we present lightweight cloud benchmarking techniques that execute quickly and can be used in near real-time on the cloud. The exploration of lightweight benchmarking techniques is facilitated by the development of DocLite - Docker Container-based Lightweight Benchmarking. DocLite is built on the Docker container technology, which allows a user-defined portion (such as memory size and the number of CPU cores) of the VM to be benchmarked. DocLite operates in two modes: in the first, containers are used to benchmark a small portion of the VM to generate performance ranks; in the second, historic benchmark data is used along with the first mode as a hybrid to generate VM ranks. The generated ranks are evaluated against three scientific high-performance computing applications. The proposed techniques are up to 91 times faster than a heavyweight technique which benchmarks the entire VM. It is observed that the first mode can generate ranks with over 90% and 86% accuracy for sequential and parallel execution of an application, respectively. The hybrid mode improves the correlation slightly, but the first mode is sufficient for benchmarking cloud VMs.
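The container-based restriction described above corresponds to Docker's per-container resource limits (the `docker run --memory` and `--cpus` options), so only a slice of the VM is exercised. The two ranking modes can then be sketched as below; the VM names, scores and the equal-weight hybrid combination are illustrative assumptions, not the paper's exact data or weighting.

```python
def rank(scores):
    """Return VM names ordered best-first by benchmark score."""
    return sorted(scores, key=scores.get, reverse=True)

# Illustrative benchmark scores (higher is better), not measured data
native   = {"vm.small": 1.0, "vm.medium": 1.8, "vm.large": 2.1}  # mode 1
historic = {"vm.small": 1.1, "vm.medium": 2.0, "vm.large": 1.9}  # archived runs

# Mode 1: rank directly from the container benchmark
native_rank = rank(native)

# Mode 2 (hybrid): combine native and historic scores -- a simple
# average here, which is an assumed weighting
hybrid = {vm: (native[vm] + historic[vm]) / 2 for vm in native}
hybrid_rank = rank(hybrid)
```

In this toy data both modes produce the same ordering, loosely mirroring the finding that the hybrid mode only changes the ranking quality slightly.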

Relevance:

30.00%

Publisher:

Abstract:

Existing benchmarking methods are time-consuming processes, as they typically benchmark the entire Virtual Machine (VM) in order to generate accurate performance data, making them less suitable for real-time analytics. The research in this paper aims to surmount this challenge by presenting DocLite - a Docker Container-based Lightweight benchmarking tool. DocLite explores lightweight cloud benchmarking methods for rapidly executing benchmarks in near real-time. DocLite is built on the Docker container technology, which allows a user-defined memory size and number of CPU cores of the VM to be benchmarked. The tool incorporates two benchmarking methods: the first, referred to as the native method, employs containers to benchmark a small portion of the VM and generate performance ranks; the second uses historic benchmark data along with the native method as a hybrid to generate VM ranks. The proposed methods are evaluated on three use-cases and are observed to be up to 91 times faster than benchmarking the entire VM. In both methods, small containers provide the same quality of rankings as a large container. The native method generates ranks with over 90% and 86% accuracy for sequential and parallel execution of an application, compared against benchmarking the whole VM. The hybrid method did not improve the quality of the rankings significantly.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to study the properties of the resistive switching effect in bistable resistive memories fabricated in the form of Al2O3/polymer diodes, and to contribute to the elucidation of resistive switching mechanisms. Resistive memories were characterized using a variety of electrical techniques, including current-voltage measurements, small-signal impedance, and electrical noise based techniques. All the measurements were carried out over a large temperature range. Fast voltage ramps were used to elucidate the dynamic response of the memory to rapidly varying electric fields. The temperature dependence of the current provided insight into the role of trapped charges in resistive switching. The analysis of fast current fluctuations using electric noise techniques contributed to the elucidation of the kinetics involved in filament formation/rupture, the filament size and the corresponding current-carrying capabilities. The results reported in this thesis provide insight into a number of issues, namely: (i) The fundamental limitations on the speed of operation of a bi-layer resistive memory are the time and voltage dependences of the switch-on mechanism. (ii) The results explain the wide spread in switching times reported in the literature and the apparently anomalous behaviour of the high conductance state, namely the disappearance of the negative differential resistance region at high voltage scan rates, which is commonly attributed to a “dead time” phenomenon that had remained elusive since it was first reported in the ‘60s. (iii) Assuming that the current is filamentary, COMSOL simulations were performed and used to explain the observed dynamic properties of the current-voltage characteristics. Furthermore, the simulations suggest that filaments can interact with each other. (iv) The current-voltage characteristics have been studied as a function of temperature.
The findings indicate that the creation and annihilation of filaments is controlled by filling and neutralizing traps localized at the oxide/polymer interface. (v) Resistive switching was also studied in small-molecule OLEDs. It was shown that the degradation that leads to a loss of light output during operation is caused by the presence of a resistive switching layer. A diagnostic tool that predicts premature failure of OLEDs was devised and proposed. Resistive switching is a property of oxides. These layers can grow in a number of devices, including organic light-emitting diodes (OLEDs), spin-valve transistors and photovoltaic devices fabricated in different types of material. Under strong electric fields the oxides can undergo dielectric breakdown and become resistive switching layers. Resistive switching strongly modifies the charge injection, causing a number of deleterious effects and eventually device failure. In this respect, the findings in this thesis are relevant to understanding reliability issues in devices across a very broad field.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2013

Relevance:

30.00%

Publisher:

Abstract:

To increase the amount of logic available to users in SRAM-based FPGAs, manufacturers are using nanometric technologies to boost logic density and reduce costs, making their use more attractive. However, these technological improvements also make FPGAs particularly vulnerable to configuration memory bit-flips caused by power fluctuations, strong electromagnetic fields and radiation. This issue is particularly sensitive because of the increasing number of configuration memory cells needed to define their functionality. A short survey of the most recent publications is presented to support the options assumed during the definition of a framework for implementing circuits immune to bit-flip induction mechanisms in memory cells, based on a customized redundant infrastructure and on a detection-and-fix controller.
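One common shape for such a redundant infrastructure is 2-of-3 majority voting over replicated memory words: a single bit-flip in one copy is out-voted by the other two, and the disagreement identifies which copy the detection-and-fix controller should repair. This is a minimal behavioural sketch of that idea, not the framework's actual circuit.

```python
def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority over three redundant copies of a word."""
    return (a & b) | (a & c) | (b & c)

def detect_faulty(copies):
    """Indices of copies disagreeing with the voted value --
    what a detection-and-fix controller would schedule for repair."""
    voted = majority_vote(*copies)
    return [i for i, v in enumerate(copies) if v != voted]

word = 0b1011
copies = [word, word ^ 0b0100, word]   # single bit-flip injected in copy 1
```

In hardware the voter is combinational and the "fix" is a rewrite of the faulty copy from the voted value, so the fault never propagates to the circuit's functionality.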

Relevance:

30.00%

Publisher:

Abstract:

The last decade has witnessed a major shift towards the deployment of embedded applications on multi-core platforms. However, real-time applications have not been able to fully benefit from this transition, as the computational gains offered by multi-cores are often offset by performance degradation due to shared resources, such as main memory. To efficiently use multi-core platforms for real-time systems, it is hence essential to tightly bound the interference when accessing shared resources. Although there has been much recent work in this area, a remaining key problem is to address the diversity of memory arbiters in the analysis to make it applicable to a wide range of systems. This work handles diverse arbiters by proposing a general framework to compute the maximum interference caused by the shared memory bus and its impact on the execution time of the tasks running on the cores, considering different bus arbiters. Our novel approach clearly demarcates the arbiter-dependent and independent stages in the analysis of these upper bounds. The arbiter-dependent phase takes the arbiter and the task memory-traffic pattern as inputs and produces a model of the availability of the bus to a given task. Then, based on the availability of the bus, the arbiter-independent phase determines the worst-case request-release scenario that maximizes the interference experienced by the tasks due to the contention for the bus. We show that the framework addresses the diversity problem by applying it to a memory bus shared by a fixed-priority arbiter, a time-division multiplexing (TDM) arbiter, and an unspecified work-conserving arbiter using applications from the MediaBench test suite. We also experimentally evaluate the quality of the analysis by comparison with a state-of-the-art TDM analysis approach, consistently showing a considerable reduction in maximum interference.
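For the TDM case, the arbiter-dependent bus-availability model is easy to state: with one slot per core, a request that just misses its own slot must wait for the remainder of the frame before being served. The sketch below gives that classical per-request bound and a naive task-level bound; the slot length and core count are illustrative, and the framework in the paper derives tighter bounds from the task's actual memory-traffic pattern.

```python
# Illustrative TDM parameters, not from the paper's experiments
N_CORES, SLOT_CYCLES = 4, 10   # one slot per core, slot length in cycles

def worst_case_wait():
    """Cycles a single request may wait: it arrives just after its slot
    starts (losing SLOT_CYCLES - 1 cycles of it) and then must sit out
    the other N_CORES - 1 slots of the frame."""
    return (N_CORES - 1) * SLOT_CYCLES + (SLOT_CYCLES - 1)

def worst_case_interference(n_requests):
    """Naive upper bound on bus interference for a task issuing
    n_requests memory requests, assuming every request is maximally
    delayed -- the pessimism the arbiter-independent phase reduces."""
    return n_requests * worst_case_wait()
```

Swapping the arbiter (fixed-priority, work-conserving) changes only the availability function, which is exactly the demarcation the framework exploits.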

Relevance:

30.00%

Publisher:

Abstract:

While fractional calculus (FC) is as old as integer calculus, its application has mainly been restricted to mathematics. However, many real systems are better described by FC equations than by integer-order models. FC is a suitable tool for describing systems characterised by their fractal nature, long-term memory and chaotic behaviour. It is a promising methodology for failure analysis and modelling, since the behaviour of a failing system depends on factors that increase the model’s complexity. This paper explores the proficiency of FC in modelling complex behaviour by tuning only a few parameters. This work proposes a novel two-step strategy for diagnosis: first, modelling common failure conditions and, second, comparing these models with real machine signals and using the difference to feed a computational classifier. Our proposal is validated using an electrical motor coupled with a mechanical gear reducer.
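The "long-term memory" property mentioned above is visible directly in the standard Grünwald-Letnikov discretisation of a fractional derivative: the order-alpha derivative at time t is a weighted sum over the signal's entire past, with weights generated by a binomial recurrence. This is a generic numerical sketch of that definition, not the modelling strategy of the paper; the step size and history length are arbitrary choices.

```python
def gl_fractional_derivative(f, alpha, t, h=1e-3, n=2000):
    """Grünwald-Letnikov approximation of the order-alpha derivative of f
    at t, truncated to n history terms of step h. The sum over past
    samples f(t - k*h) is the 'long-term memory' of the operator."""
    w, acc = 1.0, f(t)
    for k in range(1, n + 1):
        w *= 1.0 - (alpha + 1.0) / k   # recurrence for (-1)^k * C(alpha, k)
        acc += w * f(t - k * h)
    return acc / h ** alpha
```

For alpha = 1 the weights collapse to (1, -1, 0, 0, ...) and the formula reduces to the ordinary first difference quotient, while non-integer alpha interpolates between identity and differentiation with only the single parameter alpha to tune.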

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: Background. In India, prevalence rates of dementia and prodromal amnestic Mild Cognitive Impairment (MCI) are 3.1% and 4.3% respectively. Most Indians refer to the full spectrum of cognitive disorders simply as ‘memory loss.’ Barring prevention or cure, these conditions will rise rapidly with population aging. Evidence-based policies and practices can improve the lives of affected individuals and their caregivers, but will require timely and sustained uptake. Objectives. Framed by social cognitive theories of health behavior, this study explores the knowledge, attitudes and practices concerning cognitive impairment and related service use by older adults who screen positive for MCI, their primary caregivers, and health providers. Methods. I used the Montreal Cognitive Assessment to screen for cognitive impairment in memory camps in Mumbai. To achieve sampling diversity, I used maximum variation sampling. Ten adults aged 60+ who had no significant functional impairment but screened positive for MCI and their caregivers participated in separate focus groups. Four other such dyads and six doctors/ traditional healers completed in-depth interviews. Data were translated from Hindi or Marathi to English and analyzed in Atlas.ti using Framework Analysis. Findings. Knowledge and awareness of cognitive impairment and available resources were very low. Physicians attributed the condition to disease-induced pathology while lay persons blamed brain malfunction due to normal aging. Main attitudes were that this condition is not a disease, is not serious and/or is not treatable, and that it evokes stigma toward and among impaired persons, their families and providers. Low knowledge and poor attitudes impeded help-seeking. Conclusions. Cognitive disorders of aging will take a heavy toll on private lives and public resources in developing countries. 
Early detection, accurate diagnosis, systematic monitoring and quality care are needed to compress the period of morbidity and promote quality of life. Key stakeholders provide essential insights into how scientific and indigenous knowledge and sociocultural attitudes affect use and provision of resources.