97 results for ARM Linux


Relevance:

10.00%

Abstract:

The end of Dennard scaling has turned power consumption into a first-order concern for current systems, on par with performance. As a result, near-threshold voltage computing (NTVC) has been proposed as a potential means to tackle the limited cooling capacity of CMOS technology. Hardware operating in the NTV regime consumes significantly less power, at the cost of lower frequency and thus reduced performance, as well as increased error rates. In this paper, we investigate whether a low-power system-on-chip built around ARM's asymmetric big.LITTLE technology can be an alternative to conventional high-performance multicore processors in terms of power/energy in an unreliable scenario. For our study, we use the Conjugate Gradient solver, an algorithm representative of the computations performed by a large range of scientific and engineering codes.
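For reference, the core of this workload is the classic conjugate gradient iteration for a symmetric positive-definite system. The sketch below is a minimal, illustrative Python version of that algorithm, not the instrumented solver evaluated in the paper.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A (illustrative sketch)."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                # residual
    p = r.copy()                 # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # converged
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate search direction
        rs_old = rs_new
    return x

# Example: solve a small SPD system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Each iteration is dominated by a matrix-vector product, which makes the solver memory-bandwidth-bound on large sparse problems and hence a natural stress test for asymmetric cores.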

Relevance:

10.00%

Abstract:

Modifying induction therapy in AML may improve the remission rate and reduce the risk of relapse, thereby improving survival. Escalation of the daunorubicin dose to 90 mg/m² has shown benefit for some patient subgroups when compared with a dose of 45 mg/m², and has been recommended as a standard of care. However, 60 mg/m² is widely used and has never been directly compared with 90 mg/m². As part of the UK NCRI AML17 trial, 1206 adults with untreated AML or high-risk MDS, mostly under 60 years of age, were randomised to a first induction course of chemotherapy delivering either 90 mg/m² or 60 mg/m² of daunorubicin on days 1, 3, and 5, combined with cytosine arabinoside. All patients then received a second course including daunorubicin 50 mg/m² on days 1, 3, and 5. There was no overall difference in complete remission (CR) rate (73% vs 75%; OR 1.07 (0.83-1.39); p=0.6) or in any recognised subgroup. Sixty-day mortality was increased in the 90 mg/m² arm (10% vs 5%; HR 1.98 (1.30-3.02); p=0.001), resulting in no difference in overall 2-year survival (59% vs 60%; HR 1.16 (0.95-1.43); p=0.15). In exploratory subgroup analysis, no subgroup showed significant benefit, although there was a significant interaction by FLT3 ITD mutation status. The trial is registered at www.isrctn.com as ISRCTN55675535.

Relevance:

10.00%

Abstract:

The development of new treatments for older patients with acute myeloid leukemia is an active area, but has met with limited success. Vosaroxin, a quinolone-derived intercalating agent, has several properties that could prove beneficial. Initial clinical studies showed it to be well tolerated in older patients with relapsed/refractory disease, and in vitro data suggested synergy with cytarabine (Ara-C). To evaluate vosaroxin, we performed 2 randomized comparisons within the "Pick a Winner" program: a total of 104 patients were randomized to vosaroxin vs low-dose Ara-C (LDAC) and 104 to vosaroxin + LDAC vs LDAC. When comparing vosaroxin with LDAC, neither response rate (complete remission [CR]/CR with incomplete count recovery [CRi], 26% vs 30%; odds ratio [OR], 1.16 [0.49-2.72]; P = .7) nor 12-month survival (12% vs 31%; hazard ratio [HR], 1.94 [1.26-3.00]; P = .003) showed benefit for vosaroxin. Likewise, in the vosaroxin + LDAC vs LDAC comparison, neither response rate (CR/CRi, 38% vs 34%; OR, 0.83 [0.37-1.84]; P = .6) nor survival (33% vs 37%; HR, 1.30 [0.81-2.07]; P = .3) was improved. A major reason for this lack of benefit was excess early mortality in the vosaroxin + LDAC arm, most obviously in the second month following randomization. At its first interim analysis, the Data Monitoring and Ethics Committee recommended closure of the vosaroxin-containing trial arms because a clinically relevant benefit was unlikely.

Relevance:

10.00%

Abstract:

Background: Serious case reviews and research studies have indicated weaknesses in the risk assessments conducted by child protection social workers: social workers are adept at gathering information but struggle with the analysis and assessment of risk. The Department for Education wants to know whether the use of a structured decision-making tool can improve child protection risk assessments.

Methods/design: This multi-site, cluster-randomised trial will assess the effectiveness of the Safeguarding Children Assessment and Analysis Framework (SAAF), a structured decision-making tool that aims to improve social workers' assessments of harm, of future risk, and of parents' capacity to change. The comparator is management as usual.

Inclusion criteria: Children's Services Departments (CSDs) in England willing to make relevant teams available to be randomised, and willing to meet the trial's training and data collection requirements.

Exclusion criteria: CSDs where there were concerns about performance; where a major organisational restructuring was planned or under way; or where other risk assessment tools were in use.

Six CSDs are participating in this study. Social workers in the experimental arm will receive 2 days' training in SAAF, together with a range of support materials and access to limited telephone consultation post-training. The primary outcome is child maltreatment, assessed using data collected nationally on two key performance indicators: the number of children in a year who have been subject to a second Child Protection Plan (CPP), and the number of re-referrals of children because of related concerns about maltreatment. Secondary outcomes are: i) the quality of assessments judged against a schedule of quality criteria, and ii) the relationship between the three assessments required by the structured decision-making tool (level of harm, risk of (re)abuse, and prospects for successful intervention).

Discussion: This is the first study to examine the effectiveness of SAAF. It will add to a very limited literature on what structured decision-making tools can contribute to risk assessment and case planning in child protection, and on what their effective implementation involves.

Relevance:

10.00%

Abstract:

PURPOSE: Active surveillance is increasingly accepted as a treatment option for favorable-risk prostate cancer. Long-term follow-up has been lacking. In this study, we report the long-term outcome of a large active surveillance protocol in men with favorable-risk prostate cancer.

PATIENTS AND METHODS: In a prospective single-arm cohort study carried out at a single academic health sciences center, 993 men with favorable- or intermediate-risk prostate cancer were managed with an initial expectant approach. Intervention was offered for a prostate-specific antigen (PSA) doubling time of less than 3 years, Gleason score progression, or unequivocal clinical progression. Main outcome measures were overall and disease-specific survival, rate of treatment, and PSA failure rate in the treated patients.

RESULTS: Among the 819 survivors, the median follow-up time from the first biopsy is 6.4 years (range, 0.2 to 19.8 years). One hundred forty-nine (15%) of 993 patients died, and 844 patients are alive (censored rate, 85.0%). There were 15 deaths (1.5%) from prostate cancer. The 10- and 15-year actuarial cause-specific survival rates were 98.1% and 94.3%, respectively. An additional 13 patients (1.3%) developed metastatic disease and are alive with confirmed metastases (n = 9) or have died of other causes (n = 4). At 5, 10, and 15 years, 75.7%, 63.5%, and 55.0% of patients remained untreated and on surveillance. The cumulative hazard ratio for nonprostate-to-prostate cancer mortality was 9.2:1.

CONCLUSION: Active surveillance for favorable-risk prostate cancer is feasible and seems safe in the 15-year time frame. In our cohort, 2.8% of patients have developed metastatic disease, and 1.5% have died of prostate cancer. This mortality rate is consistent with expected mortality in favorable-risk patients managed with initial definitive intervention.

Relevance:

10.00%

Abstract:

We present a rigorous methodology and new metrics for fair comparison of server and microserver platforms. Deploying our methodology and metrics, we compare a microserver with ARM cores against two servers with x86 cores running the same real-time financial analytics workload. We define workload-specific but platform-independent performance metrics for platform comparison, targeting both datacenter operators and end users. Our methodology establishes that a server based on the Xeon Phi co-processor delivers the highest performance and energy efficiency. However, by scaling out energy-efficient microservers, we achieve competitive or better energy efficiency than a power-equivalent server with two Sandy Bridge sockets, despite the microserver's slower cores. Using a new iso-QoS metric, we find that the ARM microserver scales enough to meet market throughput demand, that is, a 100% QoS in terms of timely option pricing, with as little as 55% of the energy consumed by the Sandy Bridge server.
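To make the flavor of an iso-QoS comparison concrete, the sketch below ranks platforms by energy efficiency (options priced per joule) among only those runs that meet a common QoS target. All names and numbers in it are illustrative assumptions, not measurements or definitions from the paper.

```python
# Illustrative sketch of a QoS-aware energy-efficiency comparison.
# All names and numbers below are hypothetical, not figures from the paper.
from dataclasses import dataclass

@dataclass
class PlatformRun:
    name: str
    options_priced: int      # work completed in the measurement window
    deadlines_met: int       # option prices delivered on time
    energy_joules: float     # energy consumed over the window

def qos(run: PlatformRun) -> float:
    """Fraction of option prices delivered within their deadline."""
    return run.deadlines_met / run.options_priced

def efficiency(run: PlatformRun) -> float:
    """Work per unit energy: options priced per joule."""
    return run.options_priced / run.energy_joules

def iso_qos_ranking(runs, qos_target=1.0):
    """Compare efficiency only among runs that meet the same QoS target."""
    eligible = [r for r in runs if qos(r) >= qos_target]
    return sorted(eligible, key=efficiency, reverse=True)

# Hypothetical example: a scaled-out ARM microserver vs a two-socket x86 server.
runs = [
    PlatformRun("arm-microserver-x8", options_priced=10_000,
                deadlines_met=10_000, energy_joules=5_500.0),
    PlatformRun("sandy-bridge-2s", options_priced=10_000,
                deadlines_met=10_000, energy_joules=10_000.0),
]
for r in iso_qos_ranking(runs):
    print(f"{r.name}: {efficiency(r):.3f} options/J at {qos(r):.0%} QoS")
```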

Relevance:

10.00%

Abstract:

Energy efficiency is an essential requirement for all contemporary computing systems. We thus need tools to measure the energy consumption of computing systems and to understand how workloads affect it. Significant recent research effort has targeted direct power measurements on production computing systems using on-board sensors or external instruments. These direct methods have in turn guided studies of software techniques to reduce energy consumption via workload allocation and scaling. Unfortunately, direct energy measurements are hampered by the low sampling frequency of power sensors. This coarse granularity of power sensing limits our understanding of how power is allocated in systems and our ability to optimize energy efficiency via workload allocation.
We present ALEA, a tool to measure power and energy consumption at the granularity of basic blocks, using a probabilistic approach. ALEA provides fine-grained energy profiling via statistical sampling, which overcomes the limitations of power-sensing instruments. Compared to state-of-the-art energy measurement tools, ALEA provides finer granularity without sacrificing accuracy. ALEA achieves low-overhead energy measurements with mean error rates between 1.4% and 3.5% in 14 sequential and parallel benchmarks tested on both Intel and ARM platforms. The sampling method caps execution time overhead at approximately 1%. ALEA is thus suitable for online energy monitoring and optimization. Finally, ALEA is a user-space tool with a portable, machine-independent sampling method. We demonstrate two use cases of ALEA, where we reduce the energy consumption of a k-means computational kernel by 37% and an ocean modelling code by 33%, compared to high-performance execution baselines, by varying the power optimization strategy between basic blocks.
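A minimal sketch of the general idea behind sampling-based energy attribution follows. It is not ALEA's implementation: sample_block() and read_energy_joules() are hypothetical hooks standing in for a program-counter sampler and an on-board energy counter, and the randomized sampling interval is one common way to avoid aliasing with periodic code.

```python
# Minimal sketch of probabilistic energy attribution to basic blocks,
# in the spirit of (but not identical to) ALEA. The two callbacks are
# hypothetical stand-ins for platform-specific instrumentation.
import random
import threading
import time
from collections import Counter

def attribute_energy(sample_block, read_energy_joules, workload,
                     mean_interval_s=0.001):
    """Apportion measured energy across basic blocks by sampled time share."""
    counts = Counter()
    done = threading.Event()

    def sampler():
        while not done.is_set():
            # Exponentially distributed intervals avoid lock-step with loops.
            time.sleep(random.expovariate(1.0 / mean_interval_s))
            counts[sample_block()] += 1   # which basic block is executing now?

    e0 = read_energy_joules()
    t = threading.Thread(target=sampler)
    t.start()
    workload()                 # run the code under measurement
    done.set()
    t.join()
    energy = read_energy_joules() - e0
    n = sum(counts.values()) or 1
    # Attribute total energy in proportion to each block's observed share of time.
    return {block: energy * c / n for block, c in counts.items()}
```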