37 results for offloading


Relevance:

20.00%

Publisher:

Abstract:

The ability of cloud computing to provide almost unlimited storage, backup and recovery, and quick deployment contributes to its widespread attention and implementation. Cloud computing has also become an attractive choice for mobile users. Due to the limitations of mobile devices, such as scarce battery power and the inability to handle computation-intensive tasks, selected computations need to be outsourced to resourceful cloud servers. However, many challenges need to be addressed in computation offloading for mobile cloud computing, such as communication cost, connectivity maintenance and incurred latency. This paper presents a taxonomy of computation offloading approaches that aim to address these challenges. The taxonomy provides guidelines for identifying research scopes in computation offloading for mobile cloud computing. We also outline directions and anticipated trends for future research.
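
As a complement to the abstract, a minimal latency-only sketch of the offload-or-not trade-off such approaches weigh; all task sizes, link rates and times below are invented for illustration and are not taken from the paper:

```python
# Hypothetical latency-based offloading check: every parameter is an
# illustrative assumption, not a value from the surveyed approaches.

def should_offload(local_time_s: float,
                   input_bytes: int,
                   output_bytes: int,
                   uplink_bps: float,
                   downlink_bps: float,
                   remote_time_s: float,
                   rtt_s: float = 0.05) -> bool:
    """Return True if offloading is expected to finish sooner than
    running the task locally (latency criterion only)."""
    transfer_s = input_bytes * 8 / uplink_bps + output_bytes * 8 / downlink_bps
    return remote_time_s + transfer_s + rtt_s < local_time_s

if __name__ == "__main__":
    # Example: 2 s local task, 500 kB input, 10 kB result,
    # 5 Mbit/s uplink, 20 Mbit/s downlink, 0.2 s on the cloud server.
    print(should_offload(2.0, 500_000, 10_000, 5e6, 20e6, 0.2))  # True
```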

Relevance:

20.00%

Publisher:

Abstract:

The publish/subscribe paradigm has lately received much attention. In publish/subscribe systems, a specialized event-based middleware delivers notifications of events created by producers (publishers) to consumers (subscribers) interested in that particular event. It is considered a good approach for implementing Internet-wide distributed systems as it provides full decoupling of the communicating parties in time, space and synchronization. One flavor of the paradigm is content-based publish/subscribe, which allows subscribers to express their interests very accurately. In order to implement a content-based publish/subscribe middleware in a way suitable for Internet scale, its underlying architecture must be organized as a peer-to-peer network of content-based routers that take care of forwarding event notifications to all interested subscribers. A communication infrastructure that provides such a service is called a content-based network, which is an application-level overlay network. Unfortunately, the expressiveness of the content-based interaction scheme comes with a price: compiling and maintaining the content-based forwarding and routing tables is very expensive when the number of nodes in the network is large. The routing tables are usually partially ordered set (poset) based data structures. In this work, we present an algorithm that aims to improve scalability in content-based networks by offloading some of the content routing cost from the routers to clients, and we provide experimental results on the performance of the algorithm. Additionally, we give an introduction to the publish/subscribe paradigm and content-based networking, and discuss alternative ways of improving scalability in content-based networks. ACM Computing Classification System (CCS): C.2.4 [Computer-Communication Networks]: Distributed Systems - Distributed applications
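
To illustrate the poset-based routing tables mentioned above, the sketch below shows a covering check between content-based subscription filters; the filter representation (per-attribute numeric ranges) and the example attributes are assumptions for illustration, not the thesis's actual data structures:

```python
# Hypothetical covering relation for content-based filters: a filter is a set
# of per-attribute inclusive ranges, and filter A covers filter B if every
# notification matching B also matches A. Attribute names are invented.

Filter = dict  # attribute name -> (low, high) inclusive range

def covers(a: Filter, b: Filter) -> bool:
    """True if a is more general than (or equal to) b."""
    for attr, (a_lo, a_hi) in a.items():
        if attr not in b:          # b leaves attr unconstrained, so b is wider there
            return False
        b_lo, b_hi = b[attr]
        if b_lo < a_lo or b_hi > a_hi:
            return False
    return True

def matches(f: Filter, notification: dict) -> bool:
    return all(attr in notification and lo <= notification[attr] <= hi
               for attr, (lo, hi) in f.items())

broad = {"price": (0, 100)}
narrow = {"price": (10, 20), "quantity": (1, 5)}

assert covers(broad, narrow)      # broad subsumes narrow in the poset
assert not covers(narrow, broad)
assert matches(narrow, {"price": 15, "quantity": 3})
```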

Relevance:

20.00%

Publisher:

Abstract:

Smartphones and other Internet-enabled devices are now common in everyday life, so, unsurprisingly, a current trend is to adapt desktop PC applications to execute on them. However, since most of these applications have quality-of-service (QoS) requirements, their execution on resource-constrained mobile devices presents several challenges. One solution to support more stringent applications is to offload some of the applications' services to surrogate devices nearby. Therefore, in this paper, we propose an adaptable offloading mechanism which takes into account the QoS requirements of the application being executed (particularly its real-time requirements), whilst allowing services to be offloaded to several surrogate nodes. We also show how the proposed computing model can be implemented in an Android environment.
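
As a rough illustration of deadline-aware surrogate selection of the kind described above, the sketch below picks a nearby surrogate whose estimated completion time meets a real-time deadline; the Surrogate fields and all numbers are hypothetical, not the paper's mechanism:

```python
# Hypothetical deadline-aware surrogate selection: choose the surrogate node
# whose estimated transfer + execution time meets the deadline, preferring the
# earliest completion. All node figures are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Surrogate:
    name: str
    bandwidth_bps: float   # link rate to this surrogate
    est_exec_s: float      # estimated execution time on this surrogate

def pick_surrogate(surrogates: List[Surrogate],
                   state_bytes: int,
                   deadline_s: float) -> Optional[Surrogate]:
    def completion(s: Surrogate) -> float:
        return state_bytes * 8 / s.bandwidth_bps + s.est_exec_s
    feasible = [s for s in surrogates if completion(s) <= deadline_s]
    return min(feasible, key=completion, default=None)

nodes = [Surrogate("tablet", 10e6, 0.30), Surrogate("laptop", 50e6, 0.05)]
print(pick_surrogate(nodes, state_bytes=200_000, deadline_s=0.25))  # laptop wins
```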

Relevance:

20.00%

Publisher:

Abstract:

Experimentation with vehicular communications, with an overview of VANETs, IEEE 802.11p and the GCDC. Description of offloading from the cellular network. Experimental results.

Relevance:

20.00%

Publisher:

Abstract:

Overview of MPEG-DASH and TVWS, description of the implementation of a multi-interface middleware for adaptive video streaming, and evaluation tests of the work carried out.
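
For context, adaptive streaming clients of the MPEG-DASH kind typically pick the highest representation that fits the measured throughput; a minimal, hypothetical sketch follows (bitrates and safety margin are invented, not taken from the thesis):

```python
# Hypothetical throughput-based representation selection, the kind of
# adaptation a DASH client performs; all numbers are illustrative.

from typing import List

def pick_representation(available_bps: float,
                        representations_bps: List[int],
                        margin: float = 0.8) -> int:
    """Choose the highest bitrate not exceeding a fraction of measured throughput."""
    usable = margin * available_bps
    candidates = [r for r in sorted(representations_bps) if r <= usable]
    return candidates[-1] if candidates else min(representations_bps)

print(pick_representation(4_000_000, [500_000, 1_200_000, 2_500_000, 5_000_000]))
# -> 2500000: the highest representation below 80% of 4 Mbit/s
```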

Relevance:

10.00%

Publisher:

Abstract:

Background: Diabetic foot complications are recognised as the most common reason for diabetes-related hospitalisation and lower extremity amputations. Multi-faceted strategies to reduce diabetic foot hospitalisation and amputation rates have been successful. However, most diabetic foot ulcers are managed in ambulatory settings where data availability is poor and studies are limited. The project aimed to develop and evaluate strategies to improve the management of diabetic foot complications in three diverse ambulatory settings and to measure the subsequent impact on hospitalisation and amputation. Methods: Multi-faceted strategies were implemented in 2008, including multi-disciplinary teams, clinical pathways and training, clinical indicators, telehealth support and surveys. A retrospective audit of consecutive patient records from July 2006 to June 2007 determined baseline clinical indicators (n = 101). A clinical pathway teleform was implemented as a clinical record and clinical indicator analyser in all sites in 2008 (n = 327) and followed up in 2009 (n = 406). Results: Prior to the intervention, clinical pathways were not used and multi-disciplinary teams were limited. There was an absolute improvement of 15% in treating according to risk in 2009, and of 34% and 19% in surveillance of the high-risk population in 2008 and 2009 respectively (p < 0.001). Improvements of 13–66% (p < 0.001) were recorded in 2008 for individual clinical activities, reaching performance > 92% in perfusion, ulcer depth, infection assessment and management, offloading and education. Hospitalisation impacts included reductions of up to 64% in amputation rates per 100,000 population (p < 0.001) and 24% in average length of stay (p < 0.001). Conclusion: These findings support the use of multi-faceted strategies in diverse ambulatory services to standardise practice, improve the management of diabetic foot complications and positively impact hospitalisation outcomes. As of October 2010, these strategies had been rolled out to over 25 ambulatory sites, representing 66% of Queensland Health districts, managing 1,820 patients and 13,380 occasions of service, including 543 healed ulcer patients. It is expected that this number will rise dramatically as an incentive payment for the use of the teleform is expanded.

Relevance:

10.00%

Publisher:

Abstract:

Background: Foot ulcers are a leading cause of avoidable hospital admissions and lower extremity amputations. However, large clinical studies describing foot ulcer presentations in the ambulatory setting are limited. The aim of this descriptive observational paper is to report the characteristics of ambulatory foot ulcer patients managed across 13 of 17 Queensland Health & Hospital Services. Methods: Data on all foot ulcer patients registered with a Queensland High Risk Foot Form (QHRFF) were collected at their first consult in 2012. Data are automatically extracted from each QHRFF into a Queensland high-risk foot database. Descriptive statistics display age, sex, ulcer types and co-morbidities. Statewide clinical indicators of foot ulcer management are also reported. Results: Overall, 2,034 people presented with a foot ulcer in 2012. Mean age was 63 (±14) years and 67.8% were male. Co-morbidities included diabetes (85%), hypertension (49.7%), dyslipidaemia (39.2%), cardiovascular disease (25.6%), kidney disease (13.7%) and smoking (12.2%). Foot ulcer types included neuropathic (51.6%), neuro-ischaemic (17.8%), ischaemic (7.2%), post-surgical (6.6%) and other (16.8%); 31% of ulcers were infected. Clinical indicator results revealed that 98% of patients had their wound categorised, 51% received non-removable offloading, median ulcer healing time was 6 weeks and 37% had ulcer recurrence. Conclusion: This paper details the largest foot ulcer database reported in Australia. People presenting with foot ulcers appear to be predominantly older males with several co-morbidities. Encouragingly, it appears most patients are receiving best-practice care. These results may be a factor in the significant reduction of Queensland's diabetes foot-related hospitalisations and amputations recently reported.

Relevance:

10.00%

Publisher:

Abstract:

Background: High-risk foot complications such as neuropathy, ischaemia, deformity, infections, ulcers and amputations consume considerable health care resources and typically result from chronic diseases. This study aimed to develop and test the validity and reliability of a Queensland High Risk Foot Form (QHRFF) tool. Methods: Phase one involved developing the QHRFF using an existing diabetes high-risk foot tool, a literature search, an expert panel and several state-wide stakeholder groups. Phase two tested the criterion-related validity along with the inter- and intra-rater reliability of the final QHRFF. Three cohorts of patients (n = 94) and four clinicians, representing different levels of expertise, were recruited. Validity was determined by calculating sensitivity, specificity and positive predictive values (PPV). Kappa and intra-class correlation (ICC) statistics were used to establish reliability. Results: A QHRFF tool containing 46 items across seven domains was developed and endorsed. The majority of QHRFF items achieved moderate-to-perfect validity (PPV = 0.71–1) and reliability (Kappa/ICC = 0.41–1). Items with weak validity and/or reliability included those identifying health professionals previously attending the patient, other (non-listed) co-morbidities, previous foot ulcer, foot deformity, optimum offloading and optimum footwear. Conclusions: The QHRFF had moderate-to-perfect validity and reliability across the majority of items, particularly those identifying individual co-morbidities and foot complications. Items with weak validity or reliability need to be re-defined or removed. Overall, the QHRFF appears to be a valid and reliable tool for assessing, collecting and measuring clinical data pertaining to high-risk foot complications for clinical or research purposes.

Relevance:

10.00%

Publisher:

Abstract:

Foot problems complicating diabetes are a source of major patient suffering and societal costs. Investing in evidence-based, internationally appropriate diabetic foot care guidance is likely among the most cost-effective forms of healthcare expenditure, provided it is goal-focused and properly implemented. The International Working Group on the Diabetic Foot (IWGDF) has been publishing and updating international Practical Guidelines since 1999. The 2015 updates are based on systematic reviews of the literature, and recommendations are formulated using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system. As such, we changed the name from 'Practical Guidelines' to 'Guidance'. In this article we describe the development of the 2015 IWGDF Guidance documents on the prevention and management of foot problems in diabetes. This Guidance consists of five documents, prepared by five working groups of international experts, which provide guidance related to foot complications in persons with diabetes on: prevention; footwear and offloading; peripheral artery disease; infections; and wound healing interventions. Based on these five documents, the IWGDF Editorial Board produced a summary guidance for daily practice. The result of this process, after review of all documents by the Editorial Board and by international IWGDF members, is an evidence-based global consensus on the prevention and management of foot problems in diabetes. Plans are already under way to implement this Guidance. We believe that following the recommendations of the 2015 IWGDF Guidance will almost certainly result in improved management of foot problems in persons with diabetes and a subsequent worldwide reduction in the tragedies caused by these foot problems.

Relevance:

10.00%

Publisher:

Abstract:

Workstation clusters equipped with a high-performance interconnect with programmable network processors offer interesting opportunities to enhance the performance of parallel applications run on them. In this paper, we propose schemes in which certain application-level processing in parallel database query execution is performed on the network processor. We evaluate the performance of TPC-H queries executing on a high-end cluster where all tuple processing is done on the host processor, using a timed Petri net model, and find that tuple-processing costs on the host processor dominate the execution time. These results are validated using a small cluster. We therefore propose four schemes in which certain tuple-processing activity is offloaded to the network processor. The first two schemes offload the tuple-splitting activity, i.e., the computation that identifies the node on which to process each tuple, resulting in an execution-time speedup of 1.09 relative to the base scheme, but with the I/O bus becoming the bottleneck resource. In the third scheme, in addition to offloading tuple-processing activity, the disk and network interface are combined to avoid the I/O bus bottleneck, which results in speedups of up to 1.16, but with high host-processor utilization. Our fourth scheme, in which the network processor also performs a part of the join operation along with the host processor, gives a speedup of 1.47 along with balanced system resource utilization. Further, we observe that the proposed schemes perform equally well even in a scaled architecture, i.e., when the number of processors is increased from 2 to 64.
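
The tuple-splitting step referred to above decides which cluster node should process each tuple; a minimal sketch follows, assuming hash partitioning on the join key (the paper's actual splitting function may differ):

```python
# Illustrative tuple-splitting sketch: map each tuple's join key to the node
# that will process it. Hash partitioning is assumed here for illustration.

import zlib

def split_node(join_key: bytes, num_nodes: int) -> int:
    """Return the index of the cluster node that should process this tuple."""
    return zlib.crc32(join_key) % num_nodes

tuples = [b"ORDERKEY=1", b"ORDERKEY=2", b"ORDERKEY=3"]
partitions = {t: split_node(t, num_nodes=4) for t in tuples}
print(partitions)  # each tuple is routed to one of the 4 nodes
```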

Relevance:

10.00%

Publisher:

Abstract:

Following rising demand for GPS positioning, low-cost receivers are becoming widely available, but their energy demands are still too high. For energy-efficient GPS sensing in delay-tolerant applications, the possibility of offloading a few milliseconds of raw signal samples and leveraging the greater processing power of the cloud for obtaining a position fix is being actively investigated. In an attempt to reduce the energy cost of this data offloading operation, we propose Sparse-GPS: a new computing framework for GPS acquisition via sparse approximation. Within the framework, GPS signals can be efficiently compressed by random ensembles. The sparse acquisition information, pertaining to the visible satellites embedded within these limited measurements, can subsequently be recovered by our proposed representation dictionary. Through extensive empirical evaluations, we demonstrate the acquisition quality and energy gains of Sparse-GPS. We show that it is twice as energy efficient as offloading uncompressed data and has 5–10 times lower energy costs than standalone GPS, with a median positioning accuracy of 40 m.
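
Sparse-GPS's specific random ensembles and representation dictionary are not reproduced here; the generic compressed-sensing sketch below only illustrates the underlying idea of compressing with a random matrix and recovering a sparse vector (the dimensions and the orthogonal-matching-pursuit recovery are illustrative assumptions):

```python
# Generic compressed-sensing sketch (not the Sparse-GPS dictionary itself):
# a sparse vector is compressed by a random Gaussian matrix and recovered
# with orthogonal matching pursuit. All dimensions are illustrative.

import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 3                      # signal length, measurements, sparsity

x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)  # sparse signal

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement ensemble
y = Phi @ x                                      # compressed measurements

def omp(Phi, y, k):
    """Recover a k-sparse vector from y = Phi @ x by orthogonal matching pursuit."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))   # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print(np.allclose(x_hat, x, atol=1e-6))   # expected to print True for these sizes
```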

Relevance:

10.00%

Publisher:

Abstract:

The mobile cloud computing model promises to address the resource limitations of mobile devices, but effectively implementing this model is difficult. Previous work on mobile cloud computing has required the user to have a continuous, high-quality connection to the cloud infrastructure. This is undesirable and possibly infeasible, as the energy required on the mobile device to maintain a connection and transfer sizeable amounts of data is large, and the bandwidth tends to be quite variable and low on cellular networks. The cloud deployment itself also needs to allocate scalable resources to the user efficiently. In this paper, we formulate best practices for efficiently managing the resources required for the mobile cloud model, namely energy, bandwidth and cloud computing resources. These practices can be realised with our mobile cloud middleware project, featuring the Cloud Personal Assistant (CPA). We compare this with other approaches in the area to highlight the importance of minimising the usage of these resources and thereby ensuring successful adoption of the model by end users. Based on results from experiments performed with mobile devices, we develop a no-overhead decision model for task and data offloading to a user's CPA, which provides efficient management of mobile cloud resources.
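
As an illustration of the kind of decision such an offloading model makes, the sketch below compares device-side energy for local execution against offloading; the power and timing figures are invented for the example and are not the paper's measurements or its actual decision model:

```python
# Hypothetical energy-based offload check in the spirit of a local-vs-cloud
# decision model; the power figures below are illustrative assumptions only.

def offload_saves_energy(local_time_s: float,
                         cpu_power_w: float,
                         payload_bytes: int,
                         link_bps: float,
                         radio_power_w: float,
                         remote_wait_s: float,
                         idle_power_w: float) -> bool:
    """Compare estimated device-side energy for local execution vs offloading."""
    e_local = cpu_power_w * local_time_s
    transfer_s = payload_bytes * 8 / link_bps
    e_offload = radio_power_w * transfer_s + idle_power_w * remote_wait_s
    return e_offload < e_local

# Example: 3 s local task at 2 W vs sending 1 MB at 8 Mbit/s (1.5 W radio)
# and idling for 0.5 s (0.3 W) while the cloud computes.
print(offload_saves_energy(3.0, 2.0, 1_000_000, 8e6, 1.5, 0.5, 0.3))  # True
```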

Relevance:

10.00%

Publisher:

Abstract:

FastFlow is a programming framework specifically targeting cache-coherent shared-memory multi-cores. It is implemented as a stack of C++ template libraries built on top of lock-free (and memory-fence-free) synchronization mechanisms. Its philosophy is to combine programmability with performance. In this paper, a new FastFlow programming methodology is presented, aimed at supporting the parallelization of existing sequential code by offloading work onto a dynamically created software accelerator. The new methodology has been validated using a set of simple micro-benchmarks and some real applications. © 2011 Springer-Verlag.
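
FastFlow's accelerator interface is C++ and is not reproduced here; the Python sketch below only mimics the offloading pattern described, in which existing sequential code submits tasks to a dynamically created pool of workers and collects the results later:

```python
# Analogy only, not the FastFlow API: sequential code offloads loop iterations
# onto a dynamically created "software accelerator" (a worker pool) and then
# collects the results.

from concurrent.futures import ThreadPoolExecutor

def expensive(x: int) -> int:          # stand-in for an existing sequential kernel
    return x * x

def run_with_accelerator(items):
    with ThreadPoolExecutor(max_workers=4) as accel:         # create the accelerator
        futures = [accel.submit(expensive, x) for x in items]  # offload tasks
        # ... the caller could continue with other sequential work here ...
        return [f.result() for f in futures]                  # collect results

print(run_with_accelerator(range(8)))   # [0, 1, 4, 9, 16, 25, 36, 49]
```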

Relevance:

10.00%

Publisher:

Abstract:

In-Memory Databases (IMDBs), such as SAP HANA, enable new levels of database performance by removing the disk bottleneck and by compressing data in memory. A consequence of this improved performance is that reports and analytic queries can now be processed on demand. The goal, therefore, is to provide near real-time responses to compute- and data-intensive analytic queries. To facilitate this, much work has investigated the use of acceleration technologies within the database context. While current research into the application of these technologies has yielded positive results, it has tended to focus on single database tasks or on isolated single-user requests. This paper uses SHEPARD, a framework for managing accelerated tasks across shared heterogeneous resources, to introduce acceleration into an IMDB. Results show how, using SHEPARD, multiple simultaneous user queries all receive speed-ups from a shared pool of accelerators. Results also show that offloading analytic tasks onto accelerators can have indirect benefits for other database workloads by reducing contention for CPU resources.