844 results for computation- and data-intensive applications


Relevance: 100.00%

Abstract:

OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.

DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.

SETTING: Primary and secondary care.

PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).

INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.

MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).

RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
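
For context, the incremental cost-effectiveness ratio compares two pathways as the extra cost divided by the extra QALYs gained, and £30,000 per QALY is a commonly cited NICE willingness-to-pay threshold. A worked example with hypothetical figures (not taken from the model):

```python
# Illustrative ICER calculation with hypothetical figures (not study data).
# ICER = (cost_new - cost_comparator) / (QALYs_new - QALYs_comparator)

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio in pounds per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical comparison: biennial monitoring vs. a 'treat all' pathway
ratio = icer(cost_new=5200.0, cost_old=3900.0, qaly_new=12.45, qaly_old=12.41)
print(f"ICER: £{ratio:,.0f} per QALY")   # 1300 / 0.04 = £32,500 per QALY
print("Above £30,000 threshold" if ratio > 30000 else "Below threshold")
```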

LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon, which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.

CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.

FUNDING: The National Institute for Health Research Health Technology Assessment Programme.

Relevance: 100.00%

Abstract:

Recent advances in hardware development, coupled with the rapid adoption and broad applicability of cloud computing, have introduced widespread heterogeneity in data centers, significantly complicating the management of cloud applications and data center resources. This paper presents the CACTOS approach to cloud infrastructure automation and optimization, which addresses heterogeneity by combining in-depth analysis of application behavior with insights from commercial cloud providers. The aim of the approach is threefold: to model applications and data center resources, to simulate applications and resources for planning and operation, and to optimize application deployment and resource use in an autonomic manner. The approach is based on case studies from the areas of business analytics, enterprise applications, and scientific computing.
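
As a purely illustrative sketch of the first and third aims (modelling resources and optimising deployment), the fragment below models heterogeneous nodes and application demands and applies a simple greedy placement; the names and the placement heuristic are assumptions for illustration, not part of CACTOS:

```python
# Illustrative sketch (not the CACTOS toolkit): model heterogeneous nodes and
# application demands, then pick the placement with the most headroom.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    cpu: float      # available cores
    mem_gb: float   # available memory

@dataclass
class App:
    name: str
    cpu: float      # required cores
    mem_gb: float   # required memory

def best_node(app: App, nodes: list[Node]) -> Node:
    """Greedy placement: choose the feasible node with the most spare capacity."""
    feasible = [n for n in nodes if n.cpu >= app.cpu and n.mem_gb >= app.mem_gb]
    if not feasible:
        raise RuntimeError(f"no node can host {app.name}")
    return max(feasible, key=lambda n: (n.cpu - app.cpu) + (n.mem_gb - app.mem_gb))

nodes = [Node("hdd-box", 8, 32), Node("gpu-box", 16, 128), Node("small", 4, 8)]
print(best_node(App("analytics-job", cpu=6, mem_gb=24), nodes).name)  # gpu-box
```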

Relevance: 100.00%

Abstract:

In security and surveillance there is an increasing need to process image data efficiently and effectively, either at source or in large data networks. Whilst Field Programmable Gate Arrays (FPGAs) have been seen as a key enabling technology, they are typically programmed using high-level and/or hardware description language synthesis approaches; this is a major disadvantage in terms of the time needed to design or program them and to verify correct operation, and it considerably reduces the programmability of any technique based on this technology. The work here proposes a different approach: optimised soft-core processors that can be programmed in software. In particular, the paper proposes a design tool chain for programming such processors that uses the CAL Actor Language as a starting point for describing an image processing algorithm and targets its implementation at these custom-designed soft-core processors on FPGA. The main purpose is to exploit task and data parallelism in order to achieve the same parallelism as a previous HDL implementation while avoiding the design time, verification and debugging steps associated with such approaches.
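
The tool chain itself targets soft-core processors, but the dataflow style it starts from can be pictured in any language: the algorithm is expressed as actors that exchange tokens over FIFOs, making task and data parallelism explicit. The sketch below is a generic Python analogue (hypothetical actor names; not CAL and not the proposed tool chain):

```python
# Illustrative dataflow-actor decomposition of a row-wise image filter: each
# actor consumes/produces tokens on FIFOs, so actors can run in parallel
# (on an FPGA, each would map to its own soft-core processor).
from queue import Queue
from threading import Thread

def row_splitter(image, outs):
    """Task parallelism: distribute rows of the image across worker FIFOs."""
    for i, row in enumerate(image):
        outs[i % len(outs)].put((i, row))
    for q in outs:
        q.put(None)                          # end-of-stream token

def blur_actor(inp, out):
    """Data-parallel worker: 3-tap moving average over one row at a time."""
    while (tok := inp.get()) is not None:
        i, row = tok
        smoothed = []
        for j in range(len(row)):
            win = row[max(0, j - 1):j + 2]
            smoothed.append(sum(win) // len(win))
        out.put((i, smoothed))
    out.put(None)

image = [[j for j in range(8)] for _ in range(4)]
fifos = [Queue() for _ in range(2)]
result = Queue()
workers = [Thread(target=blur_actor, args=(q, result)) for q in fifos]
for w in workers: w.start()
row_splitter(image, fifos)
for w in workers: w.join()
```

The smoothed rows can then be collected from `result` in any order using the row index carried with each token, which is what makes the data parallelism safe to exploit.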

Relevance: 100.00%

Abstract:

Background: Skeletal muscle wasting and weakness are significant complications of critical illness, associated with the degree of illness severity and periods of reduced mobility during mechanical ventilation. They contribute to the profound physical and functional deficits observed in survivors. These impairments may persist for many years following discharge from the intensive care unit (ICU) and may markedly influence health-related quality of life. Rehabilitation is a key strategy in the recovery of patients following critical illness. Exercise-based interventions are aimed at targeting this muscle wasting and weakness. Physical rehabilitation delivered during ICU admission has been systematically evaluated and shown to be beneficial. However, its effectiveness when initiated after ICU discharge has yet to be established.

Objectives: To assess the effectiveness of exercise rehabilitation programmes, initiated after ICU discharge, on functional exercise capacity and health-related quality of life in adult ICU survivors who have been mechanically ventilated for more than 24 hours.

Search methods: We searched the following databases: the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), OvidSP MEDLINE, OvidSP EMBASE, and CINAHL via EBSCOhost, to 15 May 2014. We used a specific search strategy for each database, including synonyms for ICU and critical illness, exercise training and rehabilitation. We searched the reference lists of included studies and contacted primary authors to obtain further information regarding potentially eligible studies. We also searched major clinical trials registries (Clinical Trials and Current Controlled Trials) and the personal libraries of the review authors. We applied no language or publication restriction. We reran the search in February 2015 and will deal with any studies of interest when we update the review.

Selection criteria: We included randomized controlled trials (RCTs), quasi-RCTs, and controlled clinical trials (CCTs) that compared an exercise intervention initiated after ICU discharge with any other intervention, a control, or a 'usual care' programme in adult (≥ 18 years) survivors of critical illness.

Data collection and analysis: We used standard methodological procedures expected by The Cochrane Collaboration.

Main results: We included six trials (483 adult ICU participants). Exercise-based interventions were delivered on the ward in two studies; both on the ward and in the community in one study; and in the community in three studies. The duration of the intervention varied according to the length of stay in hospital following ICU discharge (up to a fixed duration of 12 weeks). Risk of bias was variable for all domains across all trials. High risk of bias was evident in all studies for performance bias, although blinding of participants and personnel in therapeutic rehabilitation trials can be pragmatically challenging. Low risk of bias was at least 50% for all other domains across all trials, although high risk of bias was present in one study for random sequence generation (selection bias), incomplete outcome data (attrition bias) and other sources. Risk of bias was unclear for the remaining studies across these domains. All six studies measured effect on the primary outcome of functional exercise capacity, although there was wide variability in the nature of intervention, outcome measures and associated metrics, and data reporting. Overall quality of the evidence was very low.
Only two studies, which used the same outcome measure for functional exercise capacity, had the potential for pooling of data and assessment of heterogeneity. On statistical advice, it was considered inappropriate to perform this analysis, and study findings were therefore described qualitatively. Individually, three studies reported positive results in favour of the intervention. A small benefit (versus control) was evident in anaerobic threshold in one study (mean difference, MD (95% confidence interval, CI) 1.8 ml O2/kg/min (0.4 to 3.2), P value = 0.02), although this effect was short term; in a second study, both incremental (MD 4.7 Watts (95% CI 1.69 to 7.75), P value = 0.003) and endurance (MD 4.12 minutes (95% CI 0.68 to 7.56), P value = 0.021) exercise testing demonstrated improvement. Finally, self-reported physical function increased significantly following a rehabilitation manual (P value = 0.006). The remaining studies found no effect of the intervention. Similar variability was evident in findings for the primary outcome of health-related quality of life. Only two studies evaluated this outcome. Following statistical advice, these data were again considered inappropriate for pooling to determine overall effect and for assessment of heterogeneity, and a qualitative description of findings was therefore undertaken. Individually, neither study reported differences between intervention and control groups in health-related quality of life as a result of the intervention. Overall quality of the evidence was very low. Mortality was reported by all studies, ranging from 0% to 18.8%. Only one non-mortality adverse event was reported across all patients in all studies (a minor musculoskeletal injury). Withdrawals, reported in four studies, ranged from 0% to 26.5% in control groups and 8.2% to 27.6% in intervention groups. Loss to follow-up, reported in all studies, ranged from 0% to 14% in control groups and 0% to 12.5% in intervention groups.

Authors' conclusions: We are unable, at this time, to determine an overall effect on functional exercise capacity, or on health-related quality of life, of an exercise-based intervention initiated after ICU discharge in survivors of critical illness. Meta-analysis of findings was not appropriate because of the small number of studies and limited data, and individual study findings were inconsistent. Some studies reported a beneficial effect of the intervention on functional exercise capacity, and others did not. No effect was reported on health-related quality of life. Methodological rigour was lacking across a number of domains, influencing the quality of the evidence. There was also wide variability in the characteristics of interventions, outcome measures and associated metrics, and data reporting. If further trials are identified, we may be able to determine the effect of exercise-based interventions following ICU discharge on functional exercise capacity and health-related quality of life in survivors of critical illness.

Relevance: 100.00%

Abstract:

INTRODUCTION: Acute respiratory distress syndrome (ARDS) is a common clinical syndrome with high mortality and long-term morbidity. To date there is no effective pharmacological therapy. Aspirin therapy has recently been shown to reduce the risk of developing ARDS, but the effect of aspirin on established ARDS is unknown.

METHODS: In a single large regional medical and surgical ICU between December 2010 and July 2012, all patients with ARDS were prospectively identified and demographic, clinical, and laboratory variables were recorded retrospectively. Aspirin usage, both pre-hospital and during intensive care unit (ICU) stay, was included. The primary outcome was ICU mortality. We used univariate and multivariate logistic regression analyses to assess the impact of these variables on ICU mortality.
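
For illustration, the kind of multivariate logistic regression described here can be sketched with synthetic data; the covariates below are stand-ins for the variables mentioned (aspirin exposure, vasopressor use, APACHE II score), and the data and any resulting odds ratios are invented, not the study's:

```python
# Sketch of a multivariate logistic regression for ICU mortality
# (synthetic data and hypothetical effect sizes; not the study's dataset).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "aspirin":     rng.integers(0, 2, n),   # pre-hospital or ICU aspirin
    "vasopressor": rng.integers(0, 2, n),
    "apache_ii":   rng.normal(20, 6, n),
})
# Synthetic outcome loosely tied to the predictors
lin = -2.5 - 0.9 * df.aspirin + 0.7 * df.vasopressor + 0.07 * df.apache_ii
df["icu_death"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

X = sm.add_constant(df[["aspirin", "vasopressor", "apache_ii"]])
fit = sm.Logit(df["icu_death"], X).fit(disp=False)
odds_ratios = np.exp(fit.params)            # OR per predictor
ci = np.exp(fit.conf_int())                 # 95% CI for each OR
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```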

RESULTS: In total, 202 patients with ARDS were included; 56 (28%) of these received aspirin either pre-hospital, in the ICU, or both. Using multivariate logistic regression analysis, aspirin therapy, given either before or during hospital stay, was associated with a reduction in ICU mortality (odds ratio (OR) 0.38 (0.15 to 0.96) P = 0.04). Additional factors that predicted ICU mortality for patients with ARDS were vasopressor use (OR 2.09 (1.05 to 4.18) P = 0.04) and APACHE II score (OR 1.07 (1.02 to 1.13) P = 0.01). There was no effect upon ICU length of stay or hospital mortality.

CONCLUSION: Aspirin therapy was associated with a reduced risk of ICU mortality. These data are the first to demonstrate a potential protective role for aspirin in patients with ARDS. Clinical trials to evaluate the role of aspirin as a pharmacological intervention for ARDS are needed.

Relevance: 100.00%

Abstract:

Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January - April and August - December inclusive, and typically has allocations of 10 nights per month. 

Aims. We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1). 

Methods. PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å over the range 3345-9995 Å. A subset of the brighter science targets is selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm, resolutions 23-33 Å) and imaging with broadband JHKs filters. 

Results. This first data release (SSDR1) contains flux-calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and we released calibrated optical spectra and classifications publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 replace those previously released spectra: they have more reliable and quantifiable flux calibrations, are corrected for telluric absorption, and are made available in standard ESO Phase 3 formats. We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ∼15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available which, in principle, allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high-accuracy absolute spectrophotometry, but synthetic photometry with the accompanying JHKs imaging can improve this. Whenever possible, reduced SOFI images are provided to allow this. 
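
To illustrate how synthetic photometry against accompanying imaging can refine an absolute flux scale, the sketch below computes a filter-weighted synthetic flux for a spectrum and rescales it to match an imaging magnitude; the filter curve and zero point are hypothetical, and this is not the PESSTO pipeline:

```python
# Generic sketch (not the PESSTO pipeline): rescale a spectrum so that its
# synthetic photometry through a filter matches an independently measured
# magnitude from imaging. Filter curve and zero point are hypothetical.
import numpy as np

def synthetic_flux(wave, flux, filt_wave, filt_trans):
    """Filter- and wavelength-weighted mean flux density of the spectrum."""
    t = np.interp(wave, filt_wave, filt_trans, left=0.0, right=0.0)
    return np.average(flux, weights=t * wave)

def rescale_to_photometry(wave, flux, filt_wave, filt_trans, mag_obs, zp_flux):
    """Scale the spectrum so its synthetic flux matches the imaging photometry."""
    f_syn = synthetic_flux(wave, flux, filt_wave, filt_trans)
    f_obs = zp_flux * 10 ** (-0.4 * mag_obs)   # flux implied by the imaging magnitude
    return flux * (f_obs / f_syn)

# Toy inputs: flat spectrum, box-shaped J-band-like filter, hypothetical zero point
wave = np.linspace(10000.0, 14000.0, 500)      # Angstrom
flux = np.full_like(wave, 2.0e-17)             # erg/s/cm^2/A
filt_wave = np.array([11500.0, 11600.0, 13200.0, 13300.0])
filt_trans = np.array([0.0, 1.0, 1.0, 0.0])
scaled = rescale_to_photometry(wave, flux, filt_wave, filt_trans,
                               mag_obs=16.3, zp_flux=3.1e-10)
```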

Conclusions. Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification, and access to reliable pipeline-processed data products, have enabled early science papers within the first few months of the survey.

Relevance: 100.00%

Abstract:

This paper presents a new approach to speech enhancement from single-channel measurements involving both noise and channel distortion (i.e., convolutional noise), and demonstrates its applications for robust speech recognition and for improving noisy speech quality. The approach is based on finding longest matching segments (LMS) from a corpus of clean, wideband speech. The approach adds three novel developments to our previous LMS research. First, we address the problem of channel distortion as well as additive noise. Second, we present an improved method for modeling noise for speech estimation. Third, we present an iterative algorithm which updates the noise and channel estimates of the corpus data model. In experiments using speech recognition as a test with the Aurora 4 database, the use of our enhancement approach as a preprocessor for feature extraction significantly improved the performance of a baseline recognition system. In a further comparison against conventional enhancement algorithms for noisy speech enhancement, both the PESQ and segmental SNR ratings of the LMS algorithm were superior to those of the other methods.
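
The longest-matching-segments idea can be pictured independently of the full system: noisy feature segments are matched against a clean corpus, and the closest clean segments serve as the estimate. The sketch below is a heavily simplified toy version (additive noise only; none of the paper's channel modelling, noise modelling, or iterative updates):

```python
# Toy illustration of corpus-based segment matching (greatly simplified; the
# paper's LMS system also models additive noise and channel distortion).
import numpy as np

def best_matching_segments(noisy, corpus, seg_len):
    """For each noisy segment, find the closest clean corpus segment
    (Euclidean distance) and return the stitched clean estimate."""
    est = []
    for start in range(0, len(noisy) - seg_len + 1, seg_len):
        target = noisy[start:start + seg_len]
        best, best_d = None, np.inf
        for utt in corpus:                               # clean corpus utterances
            for s in range(len(utt) - seg_len + 1):
                cand = utt[s:s + seg_len]
                d = np.sum((cand - target) ** 2)
                if d < best_d:
                    best, best_d = cand, d
        est.append(best)
    return np.vstack(est)

rng = np.random.default_rng(1)
corpus = [rng.normal(size=(200, 13)) for _ in range(3)]  # clean feature frames
clean = corpus[0][40:80]
noisy = clean + rng.normal(scale=0.3, size=clean.shape)  # additive noise only
estimate = best_matching_segments(noisy, corpus, seg_len=10)
```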

Relevance: 100.00%

Abstract:

Changes in the economic climate and in the delivery of health care require that pre-operative information programmes are effective and efficiently implemented. In order to be effective, the pre-operative programme must meet the information needs of intensive care unit (ICU) patients and their relatives. Efficiency can be achieved through a structured pre-operative programme which provides a framework for teaching. Developing an ICU information booklet in a large teaching hospital in Northern Ireland has become essential to provide relevant information and to improve the quality of service for patients and relatives, as set out in the White Paper ‘Working for Patients’ (DoH, 1989). The first step in establishing a patient education programme was to ascertain patients' and relatives' informational needs. A ‘needs assessment’ identified the pre-operative information needs of ICU patients and their relatives (McGaughey, 1994), and the findings were used to plan and publish an information booklet. The ICU booklet provides a structure for pre-operative visits to ensure that patients' and relatives' information needs are met.

Relevance: 100.00%

Abstract:

Aim: The aim of the study is to evaluate factors that enable or constrain the implementation and service delivery of early warning systems and acute care training in practice.

Background: To date there is limited evidence to support the effectiveness of acute care initiatives (early warning systems, acute care training, outreach) in reducing the number of adverse events (cardiac arrest, death, unanticipated intensive care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reasons posited are that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme without considering the social factors (the organisation, the people, external influences) that may have affected the process of implementation and hence the measured end-points. Further research which considers these social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4].

Method: The design is a multiple case study approach of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships and institutional balance and infrastructures in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period, to ascertain whether ALERT principles were followed.

Discussion & progress to date: This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings from this study will provide evidence for policy-makers to make informed decisions regarding the direction of strategic and service planning of acute care services, to improve the level of care provided to acutely ill patients in hospital.

References
1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidain A, et al. Investigating the effectiveness of Critical Care Outreach Services: a systematic review. Intensive Care Medicine 2006; 32: 1713-1721.
2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com
3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: a systematic review. Critical Care Medicine 2007; 35(5): 1238-43.
4. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.

Relevance: 100.00%

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning and data analytics provide opportunities for relaxing reliability requirements, and thereby reducing the overhead incurred by conventional error correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories to meet a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations of lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables quality to be traded off against power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area, when applied to various data mining applications in a 28 nm process technology.
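
As a toy software model of the bit-shuffling idea (not the paper's hardware scheme), the sketch below rotates each word before storage so that a known faulty cell holds the least significant bit; undoing the rotation on read keeps the output error magnitude small:

```python
# Toy model of bit shuffling for an unreliable data memory (illustrative only,
# not the paper's circuit): rotate each word before storing so that a known
# faulty cell holds a low-significance bit, then undo the rotation on read.
WORD = 16

def rotl(x, r, w=WORD):
    return ((x << r) | (x >> (w - r))) & ((1 << w) - 1)

def rotr(x, r, w=WORD):
    return rotl(x, w - r, w)

def store(value, faulty_bit, stuck_at=0):
    """Choose a rotation that maps logical bit 0 (LSB) onto the faulty cell."""
    r = faulty_bit                     # single-bit shuffling granularity in this toy
    word = rotl(value, r)
    word = (word & ~(1 << faulty_bit)) | (stuck_at << faulty_bit)   # inject the fault
    return word, r

def load(word, r):
    return rotr(word, r)

value = 0b1011_0110_1110_0101
corrupted, r = store(value, faulty_bit=13)   # without shuffling the error would be 2**13
print(bin(load(corrupted, r)), "error =", abs(load(corrupted, r) - value))   # error = 1
```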

Relevance: 100.00%

Abstract:

Realising memory-intensive applications such as image and video processing on FPGA requires the creation of complex, multi-level memory hierarchies to achieve real-time performance; however, commercial High-Level Synthesis tools are unable to derive such structures automatically and hence cannot meet the demanding bandwidth and capacity constraints of these applications. Current approaches to solving this problem can derive only single-level memory structures or very deep, highly inefficient hierarchies, leading in either case to high implementation cost, low performance, or both. This paper presents an enhancement to an existing MC-HLS synthesis approach which solves this problem; it exploits and eliminates data duplication at multiple levels of the generated hierarchy, leading to a reduction in the number of levels and ultimately to higher-performance, lower-cost implementations. When applied to the synthesis of C-based Motion Estimation, Matrix Multiplication and Sobel Edge Detection applications, this enables reductions in Block RAM and Look Up Table (LUT) cost of up to 25%, whilst simultaneously increasing throughput.
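
One concrete instance of the duplication being eliminated is the overlap between neighbouring windows in a sliding-window kernel: rather than re-fetching the same pixels for every window, a small line buffer holds the rows currently in use. The sketch below illustrates that reuse pattern in plain software; it is a generic example, not the MC-HLS synthesis flow itself:

```python
# Generic line-buffer sketch for a 3x3 window (illustrative; not the MC-HLS
# tool): each input pixel is read from the full image exactly once, and the
# three rows currently needed are reused from small local buffers.
def horizontal_gradient(image):
    h, w = len(image), len(image[0])
    line_buf = [image[0][:], image[1][:]]        # two buffered rows
    out = []
    for y in range(2, h):
        row = image[y]                           # only the new row is fetched
        win_rows = (line_buf[0], line_buf[1], row)
        out_row = []
        for x in range(1, w - 1):
            # Sobel-style horizontal-gradient kernel over the buffered window
            gx = (win_rows[0][x+1] + 2*win_rows[1][x+1] + win_rows[2][x+1]
                  - win_rows[0][x-1] - 2*win_rows[1][x-1] - win_rows[2][x-1])
            out_row.append(gx)
        out.append(out_row)
        line_buf = [line_buf[1], row]            # slide the buffer down one row
    return out

img = [[(x * y) % 7 for x in range(8)] for y in range(6)]
print(horizontal_gradient(img))
```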

Relevance: 100.00%

Abstract:

How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. It is addressed by proposing a six-step benchmarking methodology in which a user provides a set of four weights indicating how important each of four resource groups (memory, processor, computation and storage) is to the application that needs to be executed on the cloud. The weights, along with cloud benchmarking data, are used to generate a ranking of VMs that can maximise performance of the application. The rankings are validated through an empirical analysis using two case study applications, a financial risk application and a molecular dynamics simulation, both of which are representative of workloads that can benefit from execution on the cloud. Both case studies validate the feasibility of the methodology and highlight that maximum performance can be achieved on the cloud by selecting the top-ranked VMs produced by the methodology.
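
The weighting and ranking step can be illustrated with a simple weighted sum over normalised benchmark scores; the VM names, scores and scoring rule below are toy assumptions rather than the paper's benchmark data or exact methodology:

```python
# Toy illustration of weighting benchmark groups to rank VM types
# (hypothetical VM names and scores, not the paper's benchmark data).
WEIGHTS = {"memory": 0.2, "processor": 0.4, "computation": 0.3, "storage": 0.1}

VMS = {   # higher normalised benchmark score = better, per resource group
    "m3.large":  {"memory": 0.9, "processor": 0.5, "computation": 0.6, "storage": 0.4},
    "c3.xlarge": {"memory": 0.6, "processor": 0.9, "computation": 0.8, "storage": 0.5},
    "i2.xlarge": {"memory": 0.7, "processor": 0.6, "computation": 0.5, "storage": 0.9},
}

def rank(vms, weights):
    score = lambda bench: sum(weights[g] * bench[g] for g in weights)
    return sorted(vms, key=lambda vm: score(vms[vm]), reverse=True)

print(rank(VMS, WEIGHTS))   # a processor-heavy weighting ranks 'c3.xlarge' first
```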

Relevance: 100.00%

Abstract:

The risks associated with zoonotic infections transmitted by companion animals are a serious public health concern: controlling the incidence of zoonoses in domestic dogs, both owned and stray, is therefore important to protect human health. Integrated dog population management (DPM) programs, based on the availability of information systems providing reliable data on the structure and composition of the existing dog population in a given area, are fundamental for making realistic plans for any disease surveillance and action system. Traceability systems, based on the compulsory electronic identification of dogs and their registration in a computerised database, are one of the most effective ways to ensure the usefulness of DPM programs. Although this approach provides many advantages, several areas for improvement have emerged in countries where it has been applied. In Italy, every region hosts its own dog register, but these registers are not compatible with one another. This paper shows the advantages of a web-based application for improving the data management of regional dog registers. The approach used for building this system was inspired by farm animal traceability schemes, and it relies on a network of services that allows multi-channel access by different devices and data exchange via the web with other existing applications, without changing the pre-existing platforms. Today the system manages a database of over 300,000 dogs registered in three different Italian regions. By integrating multiple Web Services, this approach could be the solution to gathering data at national and international levels at reasonable cost, creating a large-scale, cross-border traceability system that can be used for disease surveillance and the development of population management plans.
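
As a minimal sketch of the kind of web service such a traceability network might expose, the fragment below implements a lookup of a registration record by microchip number; the endpoint, fields and data are hypothetical and do not reflect the actual Italian registers' API:

```python
# Hypothetical sketch of a lookup endpoint for a dog traceability register
# (endpoint path, fields and data are invented; not the actual system's API).
from flask import Flask, jsonify, abort

app = Flask(__name__)

REGISTER = {   # microchip number -> registration record (toy data)
    "380260000000001": {"region": "Lazio", "owner_id": "A123", "status": "owned"},
    "380260000000002": {"region": "Umbria", "owner_id": None, "status": "stray"},
}

@app.get("/dogs/<microchip>")
def get_dog(microchip):
    record = REGISTER.get(microchip)
    if record is None:
        abort(404)
    return jsonify({"microchip": microchip, **record})

if __name__ == "__main__":
    app.run()
```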