916 results for Fault tolerant computing


Relevance: 20.00%

Abstract:

Velvetgrass (Holcus lanatus L.), also known as Yorkshire fog grass, has evolved tolerance to high levels of arsenate, an adaptation that involves reduced accumulation of arsenate through suppression of the high-affinity phosphate-arsenate uptake system. To determine the role of P nutrition in arsenate tolerance, inhibition kinetics of arsenate influx by phosphate were determined. The inhibitor concentration required to reduce maximum influx (V(max)) by 50%, the K(i) of phosphate inhibition of arsenate influx, was 0.02 mol m-3 in both tolerant and nontolerant clones. This was compared with the concentration at which influx is 50% of maximum, the K(m) for arsenate influx, of 0.6 mol m-3 for tolerant clones and 0.025 mol m-3 for nontolerant clones; phosphate was therefore much more effective at inhibiting arsenate influx in tolerant genotypes. Because the high-affinity phosphate uptake system is inducible under low plant phosphate status, increasing plant phosphate status should increase tolerance by decreasing arsenate influx. Root extension in arsenate solutions of tolerant and nontolerant tillers grown under differing phosphate nutritional regimes showed that increased plant P status indeed increased the arsenate tolerance of both tolerant and nontolerant clones. That plant P status increased tolerance again argues that P nutrition has a critical role in arsenate tolerance. To determine whether short-term flux and solution culture studies were relevant to As and P accumulation in soils, soil and plant material from a range of As-contaminated sites were analyzed. As predicted from the short-term competition studies, P was accumulated preferentially to As in arsenate-tolerant clones growing on mine spoil soils, even when acid-extractable arsenate in the soils was much greater than acid-extractable phosphate. Though phosphate was much more efficient at competing with arsenate for uptake, plants growing on arsenate-contaminated land still accumulated considerable amounts of As.
Plants from the differing habitats showed large variation in plant phosphate status, pasture plants having much higher P levels than plants growing on the most contaminated mine spoil soils. The selectivity of the phosphate-arsenate uptake system for phosphate over arsenate, coupled with the suppression of this uptake system, enabled tolerant clones of velvetgrass to grow on soils that were highly contaminated with arsenate and deficient in phosphate.

Relevance: 20.00%

Abstract:

Methane-derived authigenic carbonate (MDAC) mound features at the Codling Fault Zone (CFZ), located in shallow waters (50-120 m) of the western Irish Sea, were investigated and provide a comparison to deep-sea MDAC settings. Carbonates consisted of aragonite as the major mineral phase, with δ13C depletion to -50‰ and δ18O enrichment to ~2‰. These isotope signatures, together with the co-precipitation of framboidal pyrite, confirm that anaerobic oxidation of methane (AOM) is an important process mediating methane release to the water column and the atmosphere in this region. The 18O enrichment could be a result of MDAC precipitation with seawater in colder-than-present-day conditions, or precipitation with 18O-enriched water transported from deep petroleum sources. The 13C depletion of bulk carbonate and sampled gas (-70‰) suggests a biogenic source, but significant mixing of thermogenic gas and depletion of the original isotope signature cannot be ruled out. Active seepage was recorded from one mound and, together with extensive areas of reduced sediment, confirms that seepage is ongoing. The mounds appear to be composed of stacked pavements that are largely covered by sand and extensively eroded. The CFZ mounds are colonized by abundant Sabellaria polychaetes and possible Nemertesia hydroids, which benefit indirectly from the available hard substrate. In contrast to deep-sea MDAC settings, where seep-related macrofauna are commonly reported, seep-specialist fauna appear to be lacking at the CFZ. In addition, unlike MDAC in deep waters, where organic carbon input from photosynthesis is limited, lipid biomarkers and isotope signatures related to marine planktonic production (e.g. sterols, alkanols) were most abundant. Evidence for microbes involved in AOM was limited in the samples taken, possibly due to this dilution effect from photic-zone organic matter, and will require further investigation.

Relevance: 20.00%

Abstract:

Electric vehicles (EVs) and hybrid electric vehicles (HEVs) can reduce greenhouse gas emissions, and the switched reluctance motor (SRM) is one of the most promising motors for such applications. This paper presents a novel SRM fault-diagnosis and fault-tolerant operation solution. Building on the traditional asymmetric half-bridge topology for SRM driving, the center-tapped windings of the SRM in a modular half-bridge configuration are introduced to provide fault-diagnosis and fault-tolerance functions; these remain idle under normal conditions. Fault diagnosis is achieved by detecting the characteristics of the excitation and demagnetization currents. An SRM fault-tolerant operation strategy is also realized by the proposed topology, which compensates for the missing phase torque under an open-circuit fault and reduces the unbalanced phase current caused by the uncontrolled faulty phase under a short-circuit fault. Furthermore, current sensor placement is discussed, and two placement methods are given for low cost or modular structure. Simulation results in MATLAB/Simulink and experiments on a 750-W SRM validate the effectiveness of the proposed strategy, which may have significant implications for and improve the reliability of EVs/HEVs.

Relevance: 20.00%

Abstract:

This research presents a fast algorithm for projected support vector machines (PSVM): a basis vector set (BVS) is selected for the kernel-induced feature space, and the training points are projected onto the subspace spanned by the selected BVS. A standard linear support vector machine (SVM) is then produced in the subspace with the projected training points. As the dimension of the subspace is determined by the size of the selected basis vector set, the size of the produced SVM expansion can be specified. A two-stage algorithm is derived that selects and refines the basis vector set, achieving a locally optimal model. The model expansion coefficients and bias are updated recursively as the basis set and support vector set grow and shrink. The condition for a point to be classed as outside the span of the current basis vectors, and hence selected as a new basis vector, is derived and embedded in the recursive procedure; this guarantees the linear independence of the produced basis set. The proposed algorithm is tested and compared with an existing sparse primal SVM (SpSVM) and a standard SVM (LibSVM) on seven public benchmark classification problems. The new algorithm is designed for the application area of human activity recognition using smart devices and embedded sensors, where sometimes limited memory and processing resources must be exploited to the full, and where more robust and accurate classification means a more satisfied user. Experimental results demonstrate the effectiveness and efficiency of the proposed algorithm. This work builds upon a previously published algorithm created for activity recognition within mobile applications for the EU Haptimap project [1]. The algorithms detailed in this paper are more memory- and resource-efficient, making them suitable for larger data sets and more easily trained SVMs.
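The linear-independence test at the core of such basis selection can be sketched as a kernel-space projection-residual check: a point joins the basis only when it is not (numerically) in the span of the current basis in feature space. This is an illustrative reconstruction, not the paper's exact two-stage algorithm; the RBF kernel, greedy scan, and all parameter values are assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix between row sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_basis(X, tol=1e-3, gamma=0.5):
    """Greedily keep points whose kernel-space projection residual onto
    the current basis exceeds `tol`, guaranteeing linear independence."""
    basis = [0]                       # start with the first training point
    for i in range(1, len(X)):
        B = X[basis]
        Kbb = rbf_kernel(B, B, gamma)
        kbi = rbf_kernel(B, X[i:i + 1], gamma).ravel()
        # squared residual of phi(x_i) projected onto span(phi(basis))
        alpha = np.linalg.solve(Kbb, kbi)
        resid = rbf_kernel(X[i:i + 1], X[i:i + 1], gamma)[0, 0] - kbi @ alpha
        if resid > tol:
            basis.append(i)
    return basis

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
basis = select_basis(X)
```

The size of `basis` bounds the dimension of the subspace, and hence the size of the final SVM expansion, which is what makes the model size controllable.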

Relevance: 20.00%

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning and data analytics provide opportunities for relaxing reliability requirements, thereby reducing the overhead incurred by conventional error-correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories that meets a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations of lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area when applied to various data mining applications in a 28 nm process technology.
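The idea can be illustrated with a toy word-level model: if the faulty cells of a memory word are known, a shuffle map assigns the least significant logical bits to them, bounding the error magnitude. The stuck-at-0 fault model and all names below are illustrative assumptions, not the paper's circuit-level mechanism.

```python
def shuffle_map(faulty_cells, width=8):
    """Assign the least significant logical bits to the known-faulty
    physical cells, so a fault only corrupts low-order bits."""
    healthy = [c for c in range(width) if c not in faulty_cells]
    order = sorted(faulty_cells) + healthy   # faulty cells receive LSBs
    return {logical: cell for logical, cell in enumerate(order)}

def store(value, mapping, stuck_at_zero, width=8):
    """Write `value` through the shuffle map; faulty cells hold 0."""
    word = 0
    for logical in range(width):
        bit = (value >> logical) & 1
        cell = mapping[logical]
        if cell in stuck_at_zero:
            bit = 0                          # stuck-at-0 fault model
        word |= bit << cell
    return word

def load(word, mapping, width=8):
    """Read a word back through the inverse of the shuffle map."""
    inv = {cell: logical for logical, cell in mapping.items()}
    return sum(((word >> cell) & 1) << inv[cell] for cell in range(width))
```

With a stuck-at-0 fault in physical cell 7, storing 171 loses only the logical LSB (an error of at most 1), whereas an unshuffled layout would lose bit 7 (an error of 128).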

Relevance: 20.00%

Abstract:

In the reinsurance market, the risks that natural catastrophes pose to portfolios of properties must be quantified so that they can be priced and insurance offered. The analysis of such risks at the portfolio level requires a simulation of up to 800,000 trials with an average of 1000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils, including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge, riverine flooding, and wildfire. Such simulations are both computation- and data-intensive, making the application of high-performance computing techniques desirable.

In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units and Intel Xeon Phi many-core accelerators.
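A minimal sketch of how primary and secondary uncertainty enter such a simulation: primary uncertainty is a Bernoulli draw for event occurrence, secondary uncertainty a draw on the loss level given occurrence. The event catalogue, rates, and the exponential loss distribution here are invented toy values, not the models used in the paper.

```python
import random

def simulate_trial(events, rng):
    """One simulated year: primary uncertainty decides which events
    occur, secondary uncertainty draws the loss level for each one."""
    loss = 0.0
    for rate, mean_loss in events:
        if rng.random() < rate:                 # primary: does it occur?
            # secondary: loss level drawn around the modelled mean
            loss += rng.expovariate(1.0 / mean_loss)
    return loss

def probable_maximum_loss(events, trials=10_000, quantile=0.99, seed=1):
    """Estimate the PML as an upper quantile of the trial loss
    distribution."""
    rng = random.Random(seed)
    losses = sorted(simulate_trial(events, rng) for _ in range(trials))
    return losses[int(quantile * trials) - 1]

# toy event catalogue: (annual occurrence probability, mean loss)
catalogue = [(0.05, 2.0e6), (0.2, 4.0e5), (0.01, 1.0e7)]
pml = probable_maximum_loss(catalogue)
```

In practice each trial draws from a year event table shared across hundreds of thousands of trials, which is why fast lookup structures and parallelism across trials matter.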

Relevance: 20.00%

Abstract:

Approximate execution is a viable technique for environments with energy constraints, provided that applications are given the mechanisms to produce outputs of the highest possible quality within the available energy budget. This paper introduces a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows developers to structure the computation in different tasks, and to express the relative importance of these tasks for the quality of the end result. For non-significant tasks, the developer can also supply less costly, approximate versions. The target energy consumption for a given execution is specified when the application is launched. A significance-aware runtime system employs an application-specific analytical energy model to decide how many cores to use for the execution, the operating frequency for these cores, and the degree of task approximation, so as to maximize the quality of the output while meeting the user-specified energy constraints. Evaluation on a dual-socket 16-core Intel platform using 9 benchmark kernels shows that the proposed framework picks the optimal configuration with high accuracy. A comparison with loop perforation, a well-known compile-time approximation technique, shows that the proposed framework achieves significantly higher quality for the same energy budget.
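The runtime's decision can be caricatured as a greedy budget allocator: exact versions run for the most significant tasks, and cheaper approximate versions substitute when the remaining budget is insufficient. This sketch, with invented task names, costs, and significance values, illustrates the idea rather than the paper's actual analytical energy model (which also chooses core counts and frequencies).

```python
def run_under_budget(tasks, budget):
    """Greedy sketch: most significant tasks get their exact version;
    fall back to the approximate version when the remaining energy
    budget cannot cover the exact cost."""
    results, spent = {}, 0.0
    for name, _sig, (exact_fn, exact_cost), (approx_fn, approx_cost) in \
            sorted(tasks, key=lambda t: -t[1]):
        if spent + exact_cost <= budget:
            results[name] = exact_fn()
            spent += exact_cost
        elif spent + approx_cost <= budget:
            results[name] = approx_fn()
            spent += approx_cost
        # otherwise the non-significant task is dropped entirely
    return results, spent

# (name, significance, (exact version, energy cost), (approx version, cost))
tasks = [
    ("fft",    1.0, (lambda: "exact-fft",    5.0), (lambda: "approx-fft",    1.0)),
    ("filter", 0.5, (lambda: "exact-filter", 4.0), (lambda: "approx-filter", 1.0)),
    ("viz",    0.1, (lambda: "exact-viz",    3.0), (lambda: "approx-viz",    0.5)),
]
results, spent = run_under_budget(tasks, budget=7.0)
```

With a budget of 7.0 the most significant task runs exactly and the other two degrade gracefully to their approximate versions instead of being dropped.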

Relevance: 20.00%

Abstract:

This paper outlines a means of improving the employability skills of first-year university students through a closely integrated model of employer engagement within computer science modules. The outlined approach illustrates how employability skills, including communication, teamwork and time management, can be contextualised in a manner that directly relates to student learning but can still be linked forward into employment. The paper tests the premise that developing employability skills early within the curriculum will result in improved student engagement and learning within later modules. The paper concludes that embedding employer participation within first-year modules can help turn a distant notion of employability into something of more immediate relevance in terms of how students can best approach learning. Further, by enhancing employability skills early within the curriculum, it becomes possible to improve academic attainment within later modules.

Relevance: 20.00%

Abstract:

The circumstances in Colombo, Sri Lanka, and in Belfast, Northern Ireland, which led to a) the generalization of luminescent PET (photoinduced electron transfer) sensing/switching as a design tool, b) the construction of a market-leading blood electrolyte analyzer and c) the invention of molecular logic-based computation as an experimental field, are delineated. Efforts to extend the philosophy of these approaches into issues of small object identification, nanometric mapping, animal visual perception and visual art are also outlined.

Relevance: 20.00%

Abstract:

Partially ordered preferences generally lead to choices that do not abide by standard expected utility guidelines; often such preferences are revealed by imprecision in probability values. We investigate five criteria for strategy selection in decision trees with imprecision in probabilities: "extensive" Γ-maximin and Γ-maximax, interval dominance, maximality and E-admissibility. We present algorithms that generate strategies for all these criteria; our main contribution is an algorithm for E-admissibility that runs over admissible strategies rather than over sets of probability distributions.
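Of the five criteria, interval dominance is the simplest to illustrate: a strategy is rejected when some other strategy's lower expected utility already exceeds its upper expected utility. A minimal Python sketch under toy, invented utility intervals (the strategy names and numbers are illustrative, not from the paper):

```python
def interval_dominance(strategies):
    """Keep the strategies that are not interval-dominated: `s` is
    rejected when another strategy's lower expected utility exceeds
    the upper expected utility of `s`."""
    keep = []
    for name, (lo, hi) in strategies.items():
        dominated = any(other_lo > hi
                        for other, (other_lo, other_hi) in strategies.items()
                        if other != name)
        if not dominated:
            keep.append(name)
    return keep

# expected-utility intervals induced by imprecise probabilities (toy values)
strategies = {"a": (1.0, 3.0), "b": (2.5, 4.0), "c": (0.2, 0.9)}
admissible = interval_dominance(strategies)
```

Here "c" is rejected because its upper bound (0.9) falls below the lower bound of "a" (1.0), while "a" and "b" have overlapping intervals and both survive; the other criteria prune this surviving set in different, generally stricter, ways.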

Relevance: 20.00%

Abstract:

The paper is concerned with the role of art and design in the history and philosophy of computing. It offers insights arising from research into a period in the 1960s and 70s, particularly in the UK, when computing became more available to artists and designers, focusing on John Lansdown (1929-1999) and Bruce Archer (1922-2005) in London. Models of computing interacted with conceptualisations of art, design and related creative activities in important ways.

Relevance: 20.00%

Abstract:

This paper presents the application of the on-load exciting current Extended Park's Vector Approach for diagnosing incipient turn-to-turn winding faults in operating power transformers. Experimental and simulated test results demonstrate the effectiveness of the proposed technique, which is based on the spectral analysis of the AC component of the on-load exciting current Park's Vector modulus.
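The Park's Vector underlying the approach maps three phase currents to a two-dimensional vector whose modulus is constant for a healthy, balanced condition; a winding fault introduces an AC component in the modulus, whose spectrum is then analyzed. A toy numeric sketch using the standard amplitude-invariant transform and an illustrative balanced waveform (not the paper's transformer measurements):

```python
import math

def park_vector(ia, ib, ic):
    """Amplitude-invariant Park (Clarke) transform of three currents."""
    i_d = math.sqrt(2 / 3) * ia - ib / math.sqrt(6) - ic / math.sqrt(6)
    i_q = (ib - ic) / math.sqrt(2)
    return i_d, i_q

def park_modulus(samples):
    """|i_d + j*i_q| per sample: constant for a healthy balanced system;
    a fault adds an AC component, whose spectrum is then analyzed."""
    return [math.hypot(*park_vector(ia, ib, ic)) for ia, ib, ic in samples]

# balanced 50 Hz unit-amplitude currents sampled at 1 kHz over 20 ms
w = 2 * math.pi * 50
samples = [(math.cos(w * t),
            math.cos(w * t - 2 * math.pi / 3),
            math.cos(w * t + 2 * math.pi / 3))
           for t in (k / 1000 for k in range(20))]
mods = park_modulus(samples)
```

For this balanced input the modulus is the constant sqrt(3/2); any periodic deviation from a constant modulus is exactly the AC component the proposed technique inspects spectrally.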

Relevance: 20.00%

Abstract:

Doctoral thesis, Electronic Engineering and Computing - Signal Processing, Universidade do Algarve, 2008