850 results for Significance
Abstract:
In the pursuit of highly efficient direct ethanol fuel cells (DEFCs), promoting CO2 selectivity remains a key unsolved issue. Some advances have been made; for example, with bimetallic electrocatalysts, Rh has been found experimentally to be an efficient additive to platinum for achieving high CO2 selectivity. In this work, the mechanism of ethanol electrooxidation is investigated using first-principles methods. It is found that CH3CHOH* is the key intermediate during ethanol electrooxidation and that the activity of β-dehydrogenation is the rate-determining factor affecting the completeness of ethanol oxidation. In addition, a series of transition metals (Ru, Rh, Pd, Os and Ir) are alloyed into the top layer of Pt(111) in order to analyze their effects. The elementary steps of α- and β-C-H bond and C-C bond dissociation are calculated on these bimetallic M/Pt(111) surfaces, and the formation potential of OH* from water dissociation is also calculated. We find that the active metals increase the activity of β-dehydrogenation but lower the OH* formation potential, resulting in the active sites being blocked. By considering both β-dehydrogenation and OH* formation, Ru, Os and Ir are identified as unsuitable for promoting CO2 selectivity, and only Rh is able to increase the selectivity toward CO2 in DEFCs.
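A minimal sketch of the two-criterion screening logic implied by this abstract: a surface metal is kept only if it lowers the β-C-H dehydrogenation barrier relative to Pt(111) while keeping the OH* formation potential above a site-blocking threshold. All thresholds and per-metal values below are hypothetical placeholders, not the paper's DFT results.

```python
# Illustrative screening of M/Pt(111) candidates on the two criteria named
# in the abstract. Every number is a hypothetical placeholder, NOT a
# calculated DFT value from the paper.

PT_BETA_BARRIER = 1.00     # placeholder beta-C-H barrier on pure Pt(111), eV
MIN_OH_POTENTIAL = 0.60    # placeholder potential below which OH* blocks the sites, V

candidates = {
    # metal: (beta-C-H barrier / eV, OH* formation potential / V) -- placeholders
    "Ru": (0.70, 0.40),
    "Rh": (0.75, 0.70),
    "Pd": (1.10, 0.85),
    "Os": (0.65, 0.35),
    "Ir": (0.72, 0.45),
}

suitable = [metal for metal, (barrier, u_oh) in candidates.items()
            if barrier < PT_BETA_BARRIER and u_oh > MIN_OH_POTENTIAL]
print(suitable)  # with these placeholders, only Rh satisfies both criteria
```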
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) exhibits fundamental limitations as a method to reduce energy consumption in computing systems. In the HPC domain, where performance is of highest priority and codes are heavily optimized to minimize idle time, DVFS has limited opportunity to achieve substantial energy savings. This paper explores whether operating processors near the transistor threshold voltage (NTV) is a better alternative to DVFS for breaking the power wall in HPC. NTV presents challenges, since it compromises both performance and reliability to reduce power consumption. We present a first-of-its-kind study of a significance-driven execution paradigm that selectively uses NTV and algorithmic error tolerance to reduce energy consumption in performance-constrained HPC environments. Using an iterative algorithm as a use case, we present an adaptive execution scheme that switches between near-threshold execution on many cores and above-threshold execution on one core, as the computational significance of iterations in the algorithm evolves over time. Using this scheme on state-of-the-art hardware, we demonstrate energy savings ranging from 35% to 67%, while compromising neither correctness nor performance.
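A minimal sketch of the switching idea described above, using a toy Jacobi solver: iterations judged significant run in "reliable" mode, while less significant ones run in a mode where a small noise term stands in for near-threshold faults. The solver, the noise model, and the heuristic that early iterations are the significant ones are illustrative assumptions, not the paper's criterion or implementation.

```python
# Toy model of significance-driven switching between reliable (above-threshold,
# one core) and approximate (near-threshold, many cores) execution modes.
import random

def jacobi_step(x, A, b, noise=0.0):
    new_x = []
    for i in range(len(b)):
        s = sum(A[i][j] * x[j] for j in range(len(b)) if j != i)
        val = (b[i] - s) / A[i][i]
        # Perturbation standing in for near-threshold timing errors.
        perturbation = random.gauss(0.0, noise) if noise > 0 else 0.0
        new_x.append(val + perturbation)
    return new_x

def residual(x, A, b):
    return max(abs(sum(A[i][j] * x[j] for j in range(len(b))) - b[i])
               for i in range(len(b)))

def adaptive_solve(A, b, iters=100, tol=1e-6):
    x = [0.0] * len(b)
    for k in range(iters):
        # Assumed heuristic: early, high-residual iterations are significant
        # and run reliably; later refinements tolerate approximation.
        significant = k < 10 or residual(x, A, b) > 1e-2
        x = jacobi_step(x, A, b, noise=0.0 if significant else 1e-4)
        if residual(x, A, b) < tol:
            break
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(adaptive_solve(A, b))
```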
Abstract:
An extension of approximate computing, significance-based computing exploits applications' inherent error resiliency and offers a new structural paradigm that strategically relaxes full computational precision to provide significant energy savings with minimal performance degradation.
Abstract:
In this paper we present a design methodology for algorithm/architecture co-design of a voltage-scalable, process-variation-aware motion estimator based on significance-driven computation. The fundamental premise of our approach is that not all computations are equally significant in shaping the output response of video systems. We use a statistical technique to intelligently identify these significant/not-so-significant computations at the algorithmic level and subsequently change the underlying architecture such that the significant computations are carried out error-free under voltage over-scaling. Furthermore, our design includes an adaptive quality compensation (AQC) block which "tunes" the algorithm and architecture depending on the magnitude of voltage over-scaling and the severity of process variations. Simulation results show average power savings of approximately 33% for the proposed architecture compared with a conventional implementation in 90 nm CMOS technology. The maximum output quality loss in terms of Peak Signal-to-Noise Ratio (PSNR) was approximately 1 dB, without incurring any throughput penalty.
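A minimal sketch of the general significance-driven idea in a motion-estimation kernel: the high-order part of each sum-of-absolute-differences (SAD) accumulation is kept exact, while low-order bits may be sacrificed under voltage over-scaling. The bit split and the truncation model are illustrative assumptions, not the paper's circuit design.

```python
# Toy significance-split SAD: protect the significant high-order bits,
# allow the less significant low-order bits to be dropped when over-scaled.

LSB_BITS = 4  # hypothetical number of low-order bits sacrificed under over-scaling

def approx_sad(block_a, block_b, over_scaled=False):
    sad = 0
    for a, b in zip(block_a, block_b):
        diff = abs(a - b)
        if over_scaled:
            # Mask out the less significant low-order bits; the high-order
            # part of the accumulation stays exact.
            diff &= ~((1 << LSB_BITS) - 1)
        sad += diff
    return sad

# Hypothetical 4x4 blocks flattened to lists of pixel intensities.
cur = [120, 122, 119, 121] * 4
ref = [118, 125, 117, 126] * 4
print(approx_sad(cur, ref), approx_sad(cur, ref, over_scaled=True))
```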
Abstract:
In this paper, we propose a design paradigm for energy-efficient and variation-aware operation of next-generation multicore heterogeneous platforms. The main idea behind the proposed approach lies in the observation that not all operations are equally important in shaping the output quality of various applications and of the overall system. Based on this observation, we suggest that all levels of the software design stack, including the programming model, compiler, operating system (OS) and run-time system, should identify the critical tasks and ensure correct operation of such tasks by assigning them to dynamically adjusted reliable cores/units. Specifically, based on error rates and operating conditions identified by a sense-and-adapt (SeA) unit, the OS selects and sets the appropriate mode of operation for the overall system. The run-time system identifies critical and less-critical tasks based on special directives and schedules them to the appropriate units, which are dynamically adjusted for highly accurate or approximate operation by tuning their voltage/frequency. Units that execute less significant operations can, if required, operate at voltages lower than those needed for fully correct operation and thus consume less power, since such tasks, unlike the critical ones, do not always need to be exact. Such a scheme can lead to energy-efficient and reliable operation, while reducing the design cost and overheads of conventional circuit/micro-architecture-level techniques.
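A minimal sketch of the scheduling idea described above: tasks tagged as critical are placed on units kept at nominal voltage/frequency, while less-critical tasks may go to units tuned for approximate, low-power operation. The task and unit structures and the tagging flag are illustrative assumptions, not the paper's programming-model directives.

```python
# Toy criticality-aware placement of tasks onto reliable vs. relaxed units.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Task:
    run: Callable[[], float]
    critical: bool          # assumed to come from a programmer directive

@dataclass
class Unit:
    name: str
    reliable: bool          # True: nominal V/f; False: scaled-down, approximate

def schedule(tasks: List[Task], units: List[Unit]):
    reliable = [u for u in units if u.reliable]
    relaxed = [u for u in units if not u.reliable] or reliable
    placement = []
    for i, task in enumerate(tasks):
        pool = reliable if task.critical else relaxed
        placement.append((task, pool[i % len(pool)]))
    return placement

units = [Unit("big0", True), Unit("little0", False), Unit("little1", False)]
tasks = [Task(lambda: 1.0, critical=True), Task(lambda: 2.0, critical=False)]
for task, unit in schedule(tasks, units):
    print("critical" if task.critical else "relaxed", "->", unit.name)
```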
Abstract:
The excavated north Antrim sites of Doonmore and Drumadoon are compared and attention is drawn to a group of like monuments in the same barony, here termed fortified outcrops. These are argued to be a type influenced by settlements in Argyll and introduced by the Dál Riata to Ulster through ‘counterstream migration’.
Abstract:
Measuring inconsistency is crucial to effective inconsistency management in software development. A complete measurement of inconsistency should consider not only the degree but also the significance of the inconsistency. However, most available approaches take only the degree of inconsistency into account; the significance of inconsistency has not yet received the consideration it needs. This paper presents an approach for measuring the significance of inconsistency arising from different viewpoints in the Viewpoints framework. We call an individual set of requirements belonging to different viewpoints a combined requirements collection. We argue that the significance of inconsistency arising in a combined requirements collection is closely associated with the global priority levels of the requirements involved in the inconsistency. Here we assume that the global priority level of an individual requirement captures the relative importance of every viewpoint including this requirement, as well as the local priority level of the requirement within each viewpoint. We then use the synthesis of the global priority levels of all the requirements in a combined collection to measure the significance of the collection. Following this, we present a scoring matrix function to measure the significance of inconsistency in an inconsistent combined requirements collection, which describes the contribution made by each subset of the requirements collection to the significance of the set of requirements involved in the inconsistency. An ordering relationship between the inconsistencies of two combined requirements collections, termed "more significant than", is also defined by comparing their significance scoring matrix functions. Finally, these techniques were implemented in a prototype tool called IncMeasurer, which we developed as a proof of concept.
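A minimal sketch of the scoring idea described above. The concrete synthesis chosen here (global priority = viewpoint weight times local priority, and subset significance = the sum of its members' global priorities) is an illustrative assumption; the paper defines its own scoring matrix function.

```python
# Toy significance scoring for an inconsistent subset of a combined
# requirements collection. The synthesis functions are assumptions.

def global_priority(viewpoint_weight: float, local_priority: float) -> float:
    # Combines the relative importance of the viewpoint with the
    # requirement's local priority inside that viewpoint.
    return viewpoint_weight * local_priority

def significance(subset, priorities) -> float:
    # priorities maps requirement id -> global priority level.
    return sum(priorities[r] for r in subset)

# Hypothetical combined requirements collection from two viewpoints.
priorities = {
    "R1": global_priority(0.7, 0.9),   # important viewpoint, high local priority
    "R2": global_priority(0.3, 0.5),
    "R3": global_priority(0.7, 0.2),
}
inconsistent_subset = {"R1", "R2"}     # e.g. R1 and R2 cannot both hold
print(significance(inconsistent_subset, priorities))
```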
Abstract:
Objectives
A P-value <0.05 is one metric used to evaluate the results of a randomized controlled trial (RCT). We wondered how often statistically significant results in RCTs may be lost with small changes in the numbers of outcomes.
Study Design and Setting
A review of RCTs in high-impact medical journals that reported a statistically significant result for at least one dichotomous or time-to-event outcome in the abstract. In the group with the smallest number of events, we changed the status of patients without an event to an event until the P-value exceeded 0.05. We labeled this number the Fragility Index; smaller numbers indicated a more fragile result.
Results
The 399 eligible trials had a median sample size of 682 patients (range: 15-112,604) and a median of 112 events (range: 8-5,142); 53% reported a P-value <0.01. The median Fragility Index was 8 (range: 0-109); 25% had a Fragility Index of 3 or less. In 53% of trials, the Fragility Index was less than the number of patients lost to follow-up.
Conclusion
The statistically significant results of many RCTs hinge on small numbers of events. The Fragility Index complements the P-value and helps identify less robust results.
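A minimal sketch of the Fragility Index procedure described in this abstract, assuming a 2x2 events/non-events table per arm and Fisher's exact test for the recomputed P-value (the abstract does not name the test, so that choice is an assumption); the trial numbers in the usage line are hypothetical.

```python
# Fragility Index sketch: in the arm with fewer events, convert non-events
# to events one at a time until the P-value is no longer < 0.05.
from scipy.stats import fisher_exact

def fragility_index(events_a, total_a, events_b, total_b, alpha=0.05):
    # Work on the arm with the smaller number of events, as in the abstract.
    if events_b < events_a:
        events_a, total_a, events_b, total_b = events_b, total_b, events_a, total_a
    flips = 0
    while events_a + flips <= total_a:
        table = [[events_a + flips, total_a - events_a - flips],
                 [events_b, total_b - events_b]]
        _, p = fisher_exact(table)
        if p >= alpha:
            return flips   # number of status changes needed to lose significance
        flips += 1
    return flips

# Hypothetical trial: 10/200 events vs. 25/200 events.
print(fragility_index(10, 200, 25, 200))
```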
Abstract:
Aims: The utility of p53 as a prognostic assay has been elusive. The aims of this study were to describe a novel, reproducible scoring system and assess the relationship between differential p53 immunohistochemistry (IHC) expression patterns, TP53 mutation status and patient outcomes in breast cancer.
Methods and Results: Tissue microarrays were used to study p53 IHC expression patterns: expression was defined as extreme positive (EP), extreme negative (EN), and non-extreme (NE; intermediate patterns). Overall survival (OS) was used to define patient outcome. A representative subgroup (n = 30) showing the various p53 immunophenotypes was analysed for TP53 hotspot mutation status (exons 4-9). Extreme expression of any type occurred in 176 of 288 (61%) cases. As compared with NE expression, EP expression was significantly associated (P = 0.039) with poorer OS. In addition, as compared with NE expression, EN expression was associated (P = 0.059) with poorer OS. Combining cases showing either EP or EN expression better predicted OS than either pattern alone (P = 0.028). This combination immunophenotype was significant in univariate but not multivariate analysis. In subgroup analysis, six substitution exon mutations were detected, all corresponding to extreme IHC phenotypes. Five missense mutations corresponded to EP staining, and the nonsense mutation corresponded to EN staining. No mutations were detected in the NE group.
Conclusions: Patients with extreme p53 IHC expression have a worse OS than those with NE expression. Accounting for EN as well as EP expression improves the prognostic impact. Extreme expression positively correlates with nodal stage and histological grade, and negatively with hormone receptor status. Extreme expression may relate to specific mutational status.
Abstract:
A role for the minichromosome maintenance (MCM) proteins in cancer initiation and progression is slowly emerging. Functioning as a complex to ensure a single chromosomal replication per cell cycle, the six family members have been implicated in several neoplastic disease states, including breast cancer. Our study aimed to investigate the prognostic significance of these proteins in breast cancer. We studied the expression of the MCMs in various datasets and the associations of their expression with clinicopathological parameters. When considered alone, high-level MCM4 overexpression was only weakly associated with shorter survival in the combined breast cancer patient cohort (n = 1441; Hazard Ratio = 1.31; 95% Confidence Interval = 1.11-1.55; p = 0.001). On the other hand, when we considered all six components of the MCM complex, we found that overexpression of all MCMs was strongly associated with shorter survival in the same cohort (n = 1441; Hazard Ratio = 1.75; 95% Confidence Interval = 1.31-2.34; p < 0.001), suggesting that these MCM proteins may cooperate to promote breast cancer progression. Indeed, their expression levels were significantly correlated with each other in these cohorts. In addition, we found that an increasing number of overexpressed MCMs was associated with negative ER status as well as with treatment response. Our findings were reproducible across seven independent breast cancer cohorts comprising 1441 patients, and suggest that MCM profiling could potentially be used to predict treatment response and prognosis in breast cancer patients.
Abstract:
Approximate execution is a viable technique for energy-constrained environments, provided that applications have the mechanisms to produce outputs of the highest possible quality within the given energy budget.
We introduce a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows users to express the relative importance of computations for the quality of the end result, as well as minimum quality requirements. The significance-aware runtime system uses an application-specific analytical energy model to identify the degree of concurrency and approximation that maximizes quality while meeting user-specified energy constraints. Evaluation on a dual-socket 8-core server shows that the proposed framework predicts the optimal configuration with high accuracy, enabling energy-constrained executions that result in significantly higher quality compared to loop perforation, a compiler approximation technique.
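A minimal sketch of the configuration search described above: enumerate (cores, approximation ratio) pairs, estimate energy with an analytical model, and keep the highest-quality configuration that meets the budget and the minimum quality. The energy and quality models and all parameter values here are illustrative assumptions, not the paper's calibrated application-specific models.

```python
# Toy significance-aware configuration selection under an energy budget.
from itertools import product

def energy_model(cores, approx_ratio, work=1.0):
    # Assumed toy model: approximation skips a fraction of the dynamic work,
    # parallelism amortizes a static-energy share.
    dynamic = work * (1.0 - approx_ratio)
    static = 0.2 * work / cores
    return dynamic + static

def quality_model(approx_ratio):
    # Assumed toy model: quality degrades as more work is dropped.
    return 1.0 - approx_ratio ** 2

def best_config(budget, min_quality=0.8, max_cores=16):
    best = None
    for cores, ratio in product(range(1, max_cores + 1), [i / 10 for i in range(10)]):
        q, e = quality_model(ratio), energy_model(cores, ratio)
        if e <= budget and q >= min_quality and (best is None or q > best[0]):
            best = (q, e, cores, ratio)
    return best  # (quality, energy, cores, approximation ratio) or None

print(best_config(budget=0.9))
```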
Abstract:
We introduce a task-based programming model and runtime system that exploit the observation that not all parts of a program are equally significant for the accuracy of the end result, in order to trade off the quality of program outputs for increased energy efficiency. This is done in a structured and flexible way, allowing for easy exploitation of different points in the quality/energy space, without adversely affecting application performance. The runtime system can apply a number of different policies to decide whether it will execute less-significant tasks accurately or approximately.
The experimental evaluation indicates that our system can achieve an energy reduction of up to 83% compared with a fully accurate execution and up to 35% compared with an approximate version employing loop perforation. At the same time, our approach always results in graceful quality degradation.
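A minimal sketch of one possible runtime policy of the kind described above: each task carries a significance value plus an accurate and an approximate version, and the policy runs the top fraction of tasks (by significance) accurately and the rest approximately. The task structure, the policy, and the "accurate ratio" knob are illustrative assumptions, not the system's API.

```python
# Toy significance-aware task runtime with a ratio-based accuracy policy.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SigTask:
    significance: float
    accurate: Callable[[], float]
    approximate: Callable[[], float]

def run_with_ratio(tasks: List[SigTask], accurate_ratio: float) -> List[float]:
    # Rank tasks by significance and execute only the top fraction accurately.
    ranked = sorted(tasks, key=lambda t: t.significance, reverse=True)
    n_accurate = int(round(accurate_ratio * len(tasks)))
    accurate_ids = {id(t) for t in ranked[:n_accurate]}
    return [t.accurate() if id(t) in accurate_ids else t.approximate()
            for t in tasks]

# Hypothetical tasks: the approximate version does slightly less work.
tasks = [SigTask(s, (lambda s=s: s * 2.0), (lambda s=s: s * 1.8))
         for s in (0.9, 0.5, 0.1, 0.7)]
print(run_with_ratio(tasks, accurate_ratio=0.5))
```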