932 results for computation- and data-intensive applications


Relevance:

100.00%

Publisher:

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning and data analytics provide opportunities for relaxing reliability requirements, and thereby reducing the overhead incurred by conventional error correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories to meet a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations of lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area when applied to various data mining applications in a 28 nm process technology.
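
The shuffling idea can be sketched compactly: if the faulty cell positions of a word are known (e.g. from a memory test), a permutation that places the least-significant logical bits in those cells bounds the error any single fault can cause. The sketch below is only a conceptual illustration assuming 16-bit words and one known fault per word, not the paper's memory circuit or its granularity-controlled shuffle.

```python
# Illustrative sketch of the bit-shuffling idea only, assuming 16-bit words and
# a single known faulty cell per word; the paper's hardware mechanism and its
# granularity control are not reproduced here.

WORD_BITS = 16

def shuffle_map(faulty_cells, width=WORD_BITS):
    """Map logical bits (LSB first) to physical cells, putting faulty cells first."""
    faulty = sorted(set(faulty_cells))
    healthy = [c for c in range(width) if c not in faulty]
    return faulty + healthy  # low-significance logical bits land in the faulty cells

def store(value, mapping, width=WORD_BITS):
    """Write a value using the shuffled bit placement."""
    word = 0
    for logical in range(width):
        if (value >> logical) & 1:
            word |= 1 << mapping[logical]
    return word

def load(word, mapping, width=WORD_BITS):
    """Read a shuffled word back into its logical value."""
    value = 0
    for logical in range(width):
        if (word >> mapping[logical]) & 1:
            value |= 1 << logical
    return value

# A stuck-at-0 fault in physical cell 14 would normally corrupt bit 14
# (error magnitude 2**14); after shuffling it only disturbs logical bit 0.
mapping = shuffle_map([14])
stored = store(0b1100_0000_0000_0001, mapping)
corrupted = stored & ~(1 << 14)        # model the stuck-at-0 cell
print(bin(load(corrupted, mapping)))   # differs from the original by just 2**0
```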

Relevance:

100.00%

Publisher:

Abstract:

Realising memory-intensive applications such as image and video processing on FPGA requires the creation of complex, multi-level memory hierarchies to achieve real-time performance; however, commercial High-Level Synthesis tools are unable to derive such structures automatically and hence cannot meet the demanding bandwidth and capacity constraints of these applications. Current approaches to this problem derive either single-level memory structures or very deep, highly inefficient hierarchies, leading in either case to high implementation cost, low performance, or both. This paper presents an enhancement to an existing MC-HLS synthesis approach which solves this problem: it exploits and eliminates data duplication at multiple levels of the generated hierarchy, leading to a reduction in the number of levels and ultimately to higher-performance, lower-cost implementations. When applied to the synthesis of C-based Motion Estimation, Matrix Multiplication and Sobel Edge Detection applications, this enables reductions in Block RAM and Look Up Table (LUT) cost of up to 25%, whilst simultaneously increasing throughput.
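
A rough sketch of the kind of duplication being removed (this shows the generic line-buffer technique, not the paper's MC-HLS transformation): a 3x3 windowed kernel such as Sobel can be fed from two reused row buffers, so each pixel enters on-chip storage once rather than being held again in every overlapping window copy.

```python
# Conceptual sketch of line-buffer reuse: a 3x3 window generator fed from two
# reused row buffers, so each pixel is brought on chip once instead of being
# duplicated across up to nine overlapping window copies.
def windows_3x3(image):
    width = len(image[0])
    above2 = [0] * width                   # row y-2 for columns not yet visited
    above1 = [0] * width                   # row y-1 for columns not yet visited
    for y, row in enumerate(image):
        win = [[0] * 3 for _ in range(3)]  # 3x3 window shift register
        for x, pixel in enumerate(row):
            column = (above2[x], above1[x], pixel)   # read before overwriting
            above2[x], above1[x] = above1[x], pixel  # shift the column upwards
            for r in range(3):                       # slide the window right
                win[r] = win[r][1:] + [column[r]]
            if y >= 2 and x >= 2:
                yield x, y, [cols[:] for cols in win]

# e.g. count the windows produced for a 6x8 test image
print(sum(1 for _ in windows_3x3([[i + 8 * j for i in range(8)] for j in range(6)])))  # 24
```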

Relevance:

100.00%

Publisher:

Abstract:

How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. It is addressed here by proposing a six-step benchmarking methodology in which a user provides a set of four weights indicating how important each of four attribute groups (memory, processor, computation and storage) is to the application that needs to be executed on the cloud. The weights, along with cloud benchmarking data, are used to generate a ranking of VMs that can maximise performance of the application. The rankings are validated through an empirical analysis using two case study applications, a financial risk application and a molecular dynamics simulation, both representative of workloads that can benefit from execution on the cloud. Both case studies validate the feasibility of the methodology and highlight that maximum performance can be achieved on the cloud by selecting the top-ranked VMs produced by the methodology.
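
A minimal sketch of the weight-based ranking step, assuming benchmark scores normalised per attribute group; the VM names and figures below are hypothetical, and the paper's full six-step methodology is not reproduced:

```python
# Minimal sketch of weight-based VM ranking. Benchmark scores are assumed to be
# normalised so that higher is better; all data below are hypothetical.
def rank_vms(vm_scores, weights):
    """vm_scores: {vm: {"memory": s, "processor": s, "computation": s, "storage": s}}
       weights:   {"memory": w1, "processor": w2, "computation": w3, "storage": w4}"""
    def weighted(scores):
        return sum(weights[group] * scores[group] for group in weights)
    return sorted(vm_scores, key=lambda vm: weighted(vm_scores[vm]), reverse=True)

benchmarks = {
    "m3.xlarge": {"memory": 0.8, "processor": 0.6, "computation": 0.5, "storage": 0.4},
    "c3.xlarge": {"memory": 0.5, "processor": 0.9, "computation": 0.9, "storage": 0.3},
    "i2.xlarge": {"memory": 0.6, "processor": 0.5, "computation": 0.4, "storage": 0.9},
}
# A compute-bound application weights processor and computation most heavily.
print(rank_vms(benchmarks, {"memory": 1, "processor": 4, "computation": 4, "storage": 1}))
# -> ['c3.xlarge', 'm3.xlarge', 'i2.xlarge']
```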

Relevance:

100.00%

Publisher:

Abstract:

The risks associated with zoonotic infections transmitted by companion animals are a serious public health concern: controlling the incidence of zoonoses in domestic dogs, both owned and stray, is hence important to protect human health. Integrated dog population management (DPM) programs, based on the availability of information systems providing reliable data on the structure and composition of the existing dog population in a given area, are fundamental for making realistic plans for any disease surveillance and action system. Traceability systems, based on the compulsory electronic identification of dogs and their registration in a computerised database, are one of the most effective ways to ensure the usefulness of DPM programs. Although this approach provides many advantages, several areas for improvement have emerged in countries where it has been applied. In Italy, every region hosts its own dog register, but these are not compatible with one another. This paper shows the advantages of a web-based application to improve data management of regional dog registers. The approach used for building this system was inspired by farm animal traceability schemes and relies on a network of services that allows multi-channel access by different devices and data exchange via the web with other existing applications, without changing the pre-existing platforms. Today the system manages a database of over 300,000 dogs registered in three different Italian regions. By integrating multiple Web Services, this approach could be the solution to gathering data at national and international levels at reasonable cost, creating a traceability system on a large scale and across borders that can be used for disease surveillance and the development of population management plans. © 2012 Elsevier B.V.
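
As a purely illustrative sketch of the kind of data exchange such a network of services enables (the field names, example regions and merge rule are invented for the example, not taken from the paper), regional records keyed on the electronic identifier can be combined into a larger register without touching the regional platforms:

```python
# Hypothetical interchange format for records shared between regional dog
# registers; none of these field names come from the paper.
import json

def registration_record(microchip, region, owner_id, registered_on):
    """Build a minimal interchange record keyed on the electronic identifier."""
    return {
        "microchip": microchip,        # compulsory electronic identification
        "region": region,              # regional register of origin
        "owner_id": owner_id,
        "registered_on": registered_on,
    }

def merge_registers(*regions):
    """Combine records from several regional registers, keyed on microchip."""
    combined = {}
    for region in regions:
        for record in region:
            combined[record["microchip"]] = record   # last write wins
    return combined

region_a = [registration_record("380260000000001", "RegionA", "OWN-1", "2011-05-02")]
region_b = [registration_record("380260000000002", "RegionB", "OWN-2", "2012-01-15")]
print(json.dumps(merge_registers(region_a, region_b), indent=2))
```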

Relevance:

100.00%

Publisher:

Abstract:

Background: The European badger (Meles meles) is involved in the maintenance of bovine tuberculosis infection and onward spread to cattle. However, little is known about how transmission occurs. One possible route could be through direct contact between infected badgers and cattle. It is also possible that indirect contact between cattle and infected badger excretory products such as faeces or urine may occur either on pasture or within and around farm buildings. A better understanding of behaviour patterns in wild badgers may help to develop biosecurity measures to minimise direct and indirect contact between badgers and cattle. However, monitoring the behaviour of free-ranging badgers can be logistically challenging and labour intensive due to their nocturnal and semi-fossorial nature. We trialled a GPS and tri-axial accelerometer-equipped collar on a free-ranging badger to assess its potential value in elucidating behaviour-time budgets and functional habitat use. Results: During the recording period between 16:00 and 08:00 on a single night, resting was the most commonly identified behaviour (67.4%), followed by walking (20.9%), snuffling (9.5%) and trotting (2.3%). When examining accelerometer data associated with each GPS fix and habitat type (occurring 2 min 30 s before and after), walking was the most common behaviour in woodland (40.3%) and arable habitats (53.8%), while snuffling was the most common behaviour in pasture (61.9%). Several nocturnal resting periods were also observed. The total distance travelled was 2.28 km. Conclusions: In the present report, we demonstrate proof of principle in the application of a combined GPS and accelerometer device to collect detailed quantitative data on wild badger behaviour. Behaviour-time budgets allow us to investigate how badgers allocate energy to different activities and how this might change with disease status. Such information could be useful in the development of measures to reduce opportunities for onward transmission of bovine tuberculosis from badgers to cattle.
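
A small sketch of how a behaviour-time budget like the one reported can be tallied from classified accelerometer windows; the 30 s window length and the label counts are assumptions chosen only to land near the reported shares over a 16 h night, and the behaviour classifier itself is not shown:

```python
# Tally a behaviour-time budget from a sequence of classified windows.
# Window length and label counts are hypothetical.
from collections import Counter

def time_budget(window_labels, window_seconds=30):
    """Return each behaviour's share of recording time (%) and total hours."""
    counts = Counter(window_labels)
    total = sum(counts.values())
    shares = {b: round(100 * n / total, 1) for b, n in counts.most_common()}
    return shares, total * window_seconds / 3600

labels = (["resting"] * 1294 + ["walking"] * 401
          + ["snuffling"] * 182 + ["trotting"] * 44)
shares, hours = time_budget(labels)
print(shares, f"over {hours:.0f} h")
# -> {'resting': 67.4, 'walking': 20.9, 'snuffling': 9.5, 'trotting': 2.3} over 16 h
```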

Relevance:

100.00%

Publisher:

Abstract:

Displacement of fossil fuel-based power through biomass co-firing could reduce the greenhouse gas (GHG) emissions from fossil fuels. In this study, data-intensive techno-economic models were developed to evaluate different co-firing technologies as well as the configurations of these technologies. The models were developed to study 60 different scenarios involving various biomass feedstocks (wood chips, wheat straw, and forest residues) co-fired either with coal in a 500 MW subcritical pulverized coal (PC) plant or with natural gas in a 500 MW natural gas combined cycle (NGCC) plant, to determine their technical potential and costs as well as their environmental benefits. The results reveal that a fully paid-off coal-fired power plant co-fired with forest residues is the most attractive option, with a levelized cost of electricity (LCOE) of $53.12–$54.50/MWh and a CO2 abatement cost of $27.41–$31.15/tCO2. When whole forest chips are co-fired with coal in a fully paid-off plant, the LCOE and CO2 abatement costs range from $54.68 to $56.41/MWh and $35.60 to $41.78/tCO2, respectively. The LCOE and CO2 abatement costs for straw range from $54.62 to $57.35/MWh and $35.07 to $38.48/tCO2, respectively.
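
As a back-of-the-envelope illustration of how the two headline figures relate, a CO2 abatement cost divides the LCOE premium of the co-firing scenario over a reference plant by the emissions avoided per MWh; the numbers below are placeholders, not the study's inputs:

```python
# Abatement cost in $/tCO2 avoided, from LCOEs in $/MWh and emission
# intensities in tCO2/MWh. All figures below are placeholders.
def abatement_cost(lcoe_cofire, lcoe_reference, emissions_reference, emissions_cofire):
    """Cost premium per tonne of CO2 avoided."""
    return (lcoe_cofire - lcoe_reference) / (emissions_reference - emissions_cofire)

# e.g. if co-firing raises LCOE from $50.0 to $53.5/MWh while cutting the
# plant's emission intensity from 0.95 to 0.83 tCO2/MWh:
print(round(abatement_cost(53.5, 50.0, 0.95, 0.83), 2))   # -> 29.17 ($/tCO2)
```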

Relevance:

100.00%

Publisher:

Abstract:

Inelastic electron scattering from light atomic species is of fundamental importance and has significant applications in fusion-plasma modeling. Therefore, it is of interest to apply advanced nonperturbative, close-coupling methods to the determination of electron-impact excitation for these atoms. Here we present the results of R-matrix with pseudostates (RMPS) calculations of electron-impact excitation cross sections through the n=4 terms in Be, Be+, Be2+, and Be3+. In order to determine the effects of coupling of the bound states to the target continuum in these species, we compare the RMPS results with those from standard R-matrix calculations. In addition, we have performed time-dependent close-coupling calculations for excitation from the ground and metastable terms of Be+ and the metastable term of Be3+. In general, these results are found to agree with those from our RMPS calculations. The full set of data resulting from this work is now available on the Oak Ridge National Laboratory Controlled Fusion Atomic Data Center web site, and will be employed for collisional-radiative modeling of Be in magnetically confined plasmas.

Relevance:

100.00%

Publisher:

Abstract:

Trends and foci of interest in atomic modelling and data are identified in connection with recent observations and experiments in fusion and astrophysics. In the fusion domain, spectral observations are included of core, beam-penetrated and divertor plasma. The helium beam experiments at JET and the studies with very heavy species at ASDEX and JET are noted. In the astrophysics domain, illustrations are given from the SOHO and CHANDRA spacecraft, which span from the solar upper atmosphere, through soft X-rays from comets, to supernova remnants. It is shown that non-Maxwellian, dynamic and possibly optically thick regimes must be considered. The generalized collisional-radiative model properly describes the collisional regime of most astrophysical and laboratory fusion plasmas and yields self-consistent derived data for spectral emission, power balance and ionization state studies. The tuning of this method to routine analysis of the spectral observations is described. A forward look is taken as to how such atomic modelling, and the atomic data which underpin it, ought to evolve to deal with the extended conditions and novel environments of the illustrations. It is noted that atomic physics influences most aspects of fusion and astrophysical plasma behaviour, but the effectiveness of analysis depends on the quality of the bi-directional pathway from fundamental data production through atomic/plasma model development to the confrontation with experiment. The principal atomic data capability at JET, and at other fusion and astrophysical laboratories, is supplied via the Atomic Data and Analysis Structure (ADAS) Project. The close ties between the various experiments and ADAS have helped in this path of communication.

Relevance:

100.00%

Publisher:

Abstract:

Exascale computation is the next target of high performance computing. In the push to create exascale computing platforms, simply increasing the number of hardware devices is not an acceptable option, given the limitations of power consumption, heat dissipation, and programming models designed for current hardware platforms. Instead, new hardware technologies, coupled with improved programming abstractions and more autonomous runtime systems, are required to achieve this goal. This position paper presents the design of a new runtime for a new heterogeneous hardware platform being developed to explore energy-efficient, high performance computing. By combining a number of different technologies, this framework will both simplify the programming of current and future HPC applications and automate the scheduling of data and computation across this new hardware platform. In particular, this work explores the use of FPGAs to achieve both the power and performance goals of exascale, as well as utilising the runtime to automatically effect dynamic configuration and reconfiguration of these platforms.
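
One way to picture the kind of automated placement such a runtime could perform (an illustrative sketch only; the device names, cost figures and reconfiguration penalty are invented and this is not the position paper's design):

```python
# Pick, for each task, the device whose simple energy model is cheapest,
# charging FPGAs an extra cost whenever a different kernel must be configured.
def schedule(tasks, devices, reconfig_energy_j=8.0):
    """tasks: [(name, kernel)]; devices: {device: {kernel: energy_in_joules}}."""
    configured = {d: None for d in devices}            # kernel currently loaded
    plan = []
    for name, kernel in tasks:
        def cost(device):
            energy = devices[device].get(kernel, float("inf"))
            if device.startswith("fpga") and configured[device] != kernel:
                energy += reconfig_energy_j            # pay for reconfiguration
            return energy
        best = min(devices, key=cost)
        configured[best] = kernel
        plan.append((name, best))
    return plan

devices = {
    "cpu0":  {"fft": 12.0, "stencil": 9.0},
    "fpga0": {"fft": 2.5,  "stencil": 1.5},            # cheap once configured
}
print(schedule([("t1", "fft"), ("t2", "fft"), ("t3", "stencil")], devices))
# -> [('t1', 'fpga0'), ('t2', 'fpga0'), ('t3', 'cpu0')]
```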

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To assess the efficiency of alternative monitoring services for people with ocular hypertension (OHT), a glaucoma risk factor.

DESIGN: Discrete event simulation model comparing five alternative care pathways: treatment at OHT diagnosis with minimal monitoring; biennial monitoring (primary and secondary care) with treatment if baseline predicted 5-year glaucoma risk is ≥6%; monitoring and treatment aligned to National Institute for Health and Care Excellence (NICE) glaucoma guidance (conservative and intensive).

SETTING: UK health services perspective.

PARTICIPANTS: Simulated cohort of 10 000 adults with OHT (mean intraocular pressure (IOP) 24.9 mm Hg (SD 2.4)).

MAIN OUTCOME MEASURES: Costs, glaucoma detected, quality-adjusted life years (QALYs).

RESULTS: Treating at diagnosis was the least costly and least effective in avoiding glaucoma and progression. Intensive monitoring following NICE guidance was the most costly and effective. However, considering a wider cost-utility perspective, biennial monitoring was less costly and provided more QALYs than NICE pathways, but was unlikely to be cost-effective compared with treating at diagnosis (£86 717 per additional QALY gained). The findings were robust to risk thresholds for initiating monitoring but were sensitive to treatment threshold, National Health Service costs and treatment adherence.

CONCLUSIONS: For confirmed OHT, glaucoma monitoring more frequently than every 2 years is unlikely to be efficient. Primary treatment and minimal monitoring (assessing treatment responsiveness via IOP) could be considered; however, further data to refine glaucoma risk prediction models and to value patient preferences for treatment are needed. Consideration of innovative and affordable service redesign focused on treatment responsiveness rather than more glaucoma testing is recommended.
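
A sketch of the cost-utility arithmetic behind the quoted £86 717 figure: an incremental cost-effectiveness ratio (ICER) divides the extra cost of a pathway by the extra QALYs it delivers; the inputs below are placeholders, not the model's outputs.

```python
# Incremental cost-effectiveness ratio (ICER) from placeholder per-person
# lifetime costs and QALYs; these are not the simulation's actual results.
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost (GBP) per additional QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# e.g. a monitoring pathway costing an extra £260 per person while adding
# 0.003 QALYs comes out around £87 000 per QALY, well above the usual UK
# willingness-to-pay range of £20 000-£30 000 per QALY.
print(round(icer(1260.0, 1000.0, 12.803, 12.800)))   # -> 86667
```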

Relevance:

100.00%

Publisher:

Abstract:

As a leading facility in laser-driven nuclear physics, ELI-NP will develop innovative research in the fields of materials behavior in extreme environments and radiobiology, with applications in the development of accelerator components, new materials for next-generation fusion and fission reactors, shielding solutions for equipment and human crew in long-term space missions, and new biomedical technologies. The specific properties of the laser-driven radiation produced with two 1 PW lasers, each at a pulse repetition rate of 1 Hz, are an ultra-short time scale, a relatively broadband spectrum and the possibility to provide several types of radiation simultaneously. Complex, cosmic-like radiation will be produced in a ground-based laboratory, allowing comprehensive investigations of its effects on materials and biological systems. The expected maximum energy and intensity of the radiation beams are 19 MeV with 10^9 photons/pulse for photon radiation, 2 GeV with 10^8 electrons/pulse for electron beams, 60 MeV with 10^12 protons/pulse for proton and ion beams, and 60 MeV with 10^7 neutrons/pulse for a neutron source. Research efforts will also be directed towards radioprotection measurements of the prompt and activated dose, as a function of laser and target characteristics, and towards the development and testing of various dosimetric methods and equipment.

Relevance:

100.00%

Publisher:

Abstract:

The IfBB – Institute for Bioplastics and Biocomposites is a research institute within the Hochschule Hannover, University of Applied Sciences and Arts, which was established in 2011 to respond to the growing need for expert knowledge in the area of bioplastics. With its practice-oriented research and its collaboration with industrial partners, the IfBB is able to support the market for bioplastics and, in addition, foster unbiased public awareness and understanding of the topic. As an independent, research-led expert institution for bioplastics, the IfBB is willing to share its expertise, research findings and data with any interested party via the Internet, online and offline publications, or at fairs and conferences. Continuing these efforts, substantial information on market trends, processes and resource needs for bioplastics is presented here in a concise format, complementing the more detailed and comprehensive publication “Engineering Biopolymers”. One of our main concerns is to furnish a more rational basis for discussing bioplastics and to support fact-based arguments in the public discourse. Furthermore, “Biopolymers – facts and statistics” aims to provide specific, qualified answers easily and quickly, in particular for decision-makers from public administration and the industrial sector. This publication is therefore laid out like a set of rules and standards and largely foregoes textual detail. It offers extensive market-relevant and technical facts presented in graphs and charts, which makes the information much easier to grasp. The reader can expect comparative market figures for various materials, regions, applications, process routes, agricultural land use or resource consumption, production capacities, geographic distribution, etc.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2013

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2014