8 results for computation- and data-intensive applications

in WestminsterResearch - UK


Relevance: 100.00%

Abstract:

Data registration refers to a series of techniques for matching or bringing similar objects or datasets together into alignment. These techniques enjoy widespread use in a diverse variety of applications, such as video coding, tracking, object and face detection and recognition, surveillance and satellite imaging, medical image analysis and structure from motion. Registration methods are as numerous as their manifold uses, ranging from pixel-level and block- or feature-based methods to Fourier domain methods. This book focuses on providing algorithms and image and video techniques for registration and quality performance metrics. The authors provide various assessment metrics for measuring registration quality alongside analyses of registration techniques, introducing and explaining both familiar and state-of-the-art registration methodologies used in a variety of targeted applications.
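For illustration only (not an excerpt from the book), the Fourier domain family mentioned above can be sketched in a few lines of Python/NumPy as classical phase correlation, which recovers the translation between two images from the peak of their normalised cross-power spectrum:

import numpy as np

def phase_correlation_shift(reference, moving):
    # Fourier-domain phase correlation: the phase of the cross-power spectrum
    # transforms back to a delta located at the images' relative shift.
    F_ref = np.fft.fft2(reference)
    F_mov = np.fft.fft2(moving)
    cross_power = np.conj(F_ref) * F_mov
    cross_power /= np.abs(cross_power) + 1e-12   # keep phase only, avoid divide-by-zero
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Convert wrap-around peak indices into signed (row, col) shifts
    shape = np.array(correlation.shape)
    shifts = np.array(peak, dtype=float)
    wrap = shifts > shape / 2
    shifts[wrap] -= shape[wrap]
    return shifts

# A circularly shifted copy of an image is recovered as roughly (5, -3)
image = np.random.rand(128, 128)
shifted = np.roll(np.roll(image, 5, axis=0), -3, axis=1)
print(phase_correlation_shift(image, shifted))

The sketch assumes a pure circular translation; practical registration pipelines add windowing, sub-pixel interpolation and robustness steps, and pixel-level or feature-based methods follow different pipelines altogether.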

Relevance: 100.00%

Abstract:

This paper investigates the inherent radio frequency analog challenges associated with near field communication systems. Furthermore, the paper presents a digital-based sigma-delta modulator for near field communication transmitter implementations. The proposed digital transmitter architecture is designed to best support data-intensive applications requiring higher data rates and complex modulation schemes. An NFC transmitter based on a single-bit sigma-delta DAC is introduced, and then the multi-bit extension, together with the necessary simulation results, is presented to confirm the suitability of the architecture for high-speed near field communication applications.
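As a point of reference only (this is not the architecture proposed in the paper, which is digital, higher-order and extended to multi-bit operation), the principle behind a single-bit sigma-delta DAC can be sketched as a generic first-order loop in Python:

import numpy as np

def sigma_delta_1bit(x):
    # Generic first-order, single-bit sigma-delta loop: integrate the difference
    # between the input (scaled to [-1, 1]) and the fed-back 1-bit output, then
    # quantise the integrator state to +/-1.
    integrator = 0.0
    feedback = 0.0
    bits = np.empty(len(x))
    for n, sample in enumerate(x):
        integrator += sample - feedback        # accumulate the quantisation error
        bits[n] = 1.0 if integrator >= 0 else -1.0
        feedback = bits[n]                     # single-bit DAC in the feedback path
    return bits

# A slow, heavily oversampled tone survives a crude moving-average decimation filter
t = np.arange(8192)
tone = 0.5 * np.sin(2 * np.pi * t / 1024.0)
bits = sigma_delta_1bit(tone)
recovered = np.convolve(bits, np.ones(64) / 64, mode="same")  # low-pass approximation of the tone

A low-pass or decimation filter after the loop recovers the signal, so the oversampling ratio and loop order limit the usable bandwidth; higher-order loops and the multi-bit extension described in the paper relax that trade-off for high-data-rate NFC modes.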

Relevance: 100.00%

Abstract:

This paper is based on the novel use of a very high fidelity decimation filter chain for Electrocardiogram (ECG) signal acquisition and data conversion. The multiplier-free and multi-stage structure of the proposed filters lowers the power dissipation while minimizing the circuit area, which are crucial design constraints for wireless noninvasive wearable health monitoring products, given the scarce operational resources available to their electronic implementation. The presented filter chain has a decimation ratio of 128 and works in tandem with a 1-bit 3rd-order Sigma-Delta (ΣΔ) modulator, achieving 0.04 dB passband ripple and -74 dB stopband attenuation. The work reported here investigates the non-linear phase effects of the proposed decimation filters on the ECG signal by carrying out a comparative study after phase correction. It concludes that enhanced phase linearity is not crucial for ECG acquisition and data conversion applications, since the distortion of the acquired signal due to phase non-linearity is insignificant for both the original and the phase-compensated filters. To the best of the authors' knowledge, freedom from signal distortion is essential, as distortion might lead to misdiagnosis, as reported in the state of the art. This article demonstrates that, with their minimal power consumption and minimal signal distortion, the proposed decimation filters can effectively be employed in biosignal data processing units.
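The abstract does not spell out the filter topology, so the sketch below uses a textbook multiplier-free, multi-stage structure of the same general kind (a cascaded integrator-comb, or CIC, decimator); it is assumed purely for illustration and is not claimed to be the authors' filter chain:

import numpy as np

def cic_decimate(bits, R=128, N=3):
    # N-stage cascaded integrator-comb (CIC) decimator: only adders, delays and
    # a decimation switch, hence multiplier-free. Differential delay M = 1.
    x = np.asarray(bits, dtype=np.int64)
    for _ in range(N):                 # integrators at the oversampled input rate
        x = np.cumsum(x)
    x = x[::R]                         # decimate by R
    for _ in range(N):                 # combs (first differences) at the output rate
        x = np.diff(x, prepend=0)
    return x / float(R ** N)           # normalise by the DC gain R**N

# Example: decimate a +/-1 oversampled bitstream by 128 down to the ECG sampling rate
bitstream = np.where(np.random.rand(128 * 512) > 0.5, 1, -1)
decimated = cic_decimate(bitstream, R=128, N=3)

In hardware the integrators rely on two's-complement wrap-around rather than wide accumulators, and later FIR stages compensate the CIC passband droop; none of these details are claimed to match the specific design evaluated in the paper.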

Relevance: 100.00%

Abstract:

Bioelectrochemical systems could have potential for bioremediation of contaminants either in situ or ex situ. The treatment of a mixture of phenanthrene and benzene using two different tubular microbial fuel cells (MFCs), designed for either in situ or ex situ applications in aqueous systems, was investigated over long operational periods (up to 155 days). For in situ deployments, simultaneous removal of the petroleum hydrocarbons (>90% in terms of degradation efficiency) and of bromate, used as the catholyte (up to 79%), with concomitant biogenic electricity generation (peak power density up to 6.75 mW m−2) was obtained at a hydraulic retention time (HRT) of 10 days. The tubular MFC could be operated successfully at copiotrophic (100 ppm phenanthrene, 2000 ppm benzene, HRT 30 days) and oligotrophic (phenanthrene and benzene, 50 ppb each, HRT 10 days) substrate conditions, suggesting its effectiveness and robustness at extreme substrate concentrations in anoxic environments. In the MFC designed for ex situ deployments, optimum performance was obtained at an HRT of 30 h, giving COD removal and maximum power output of approximately 77% and 6.75 mW m−2, respectively. The MFC exhibited the ability to resist organic shock loadings and could maintain stable performance. The results of this study suggest the potential use of MFC technology for in situ/ex situ treatment of hydrocarbon-contaminated groundwater or clean-up of refinery effluents, even at extreme contaminant levels.

Relevance: 100.00%

Abstract:

Cloud computing offers massive scalability and elasticity required by many scientific and commercial applications. Combining the computational and data handling capabilities of clouds with parallel processing also has the potential to tackle Big Data problems efficiently. Science gateway frameworks and workflow systems enable application developers to implement complex applications and make these available for end-users via simple graphical user interfaces. The integration of such frameworks with Big Data processing tools on the cloud opens new opportunities for application developers. This paper investigates how workflow systems and science gateways can be extended with Big Data processing capabilities. A generic approach based on infrastructure aware workflows is suggested and a proof of concept is implemented based on the WS-PGRADE/gUSE science gateway framework and its integration with the Hadoop parallel data processing solution based on the MapReduce paradigm in the cloud. The provided analysis demonstrates that the methods described to integrate Big Data processing with workflows and science gateways work well in different cloud infrastructures and application scenarios, and can be used to create massively parallel applications for scientific analysis of Big Data.
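As a generic illustration of the kind of MapReduce job such an infrastructure-aware workflow node would launch (not code taken from the WS-PGRADE/gUSE integration itself), a minimal Hadoop Streaming word-count pair in Python looks like this:

#!/usr/bin/env python3
# Generic Hadoop Streaming word-count pair (mapper + reducer). Streaming feeds
# records on stdin and collects key<TAB>value lines from stdout.
import sys

def mapper():
    # Emit one "<word>\t1" record per token
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Keys arrive sorted, so counts for the same word are contiguous
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    # Role chosen by argument: "map" or "reduce"
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    mapper() if role == "map" else reducer()

Submitted through Hadoop Streaming with the usual -mapper and -reducer arguments, such a pair runs unchanged on any cloud-deployed Hadoop cluster, which is consistent with the portability across cloud infrastructures that the paper reports for its workflow-embedded Hadoop jobs.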

Relevance: 100.00%

Abstract:

Researchers want to analyse Health Care data, which may require large pools of compute and data resources; obtaining these means accessing Distributed Computing Infrastructures (DCIs), and using such infrastructures requires expertise which researchers may not have. Workflows can hide the infrastructure, but the many existing workflow systems are not interoperable, and learning a workflow system and creating workflows in it may require significant effort. Considering this effort, it is not reasonable to expect that researchers will learn new workflow systems just to run workflows built in other workflow systems. As a result, the lack of interoperability prevents workflow sharing, and a vast amount of research effort is wasted. The FP7 Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs (SHIWA) project developed the Coarse-Grained Interoperability (CGI) approach to enable workflow sharing, and created the SHIWA Simulation Platform (SSP) to support CGI as a production-level service. The paper describes how the CGI approach can be used for analysis and simulation in Health Care.
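A simplified way to picture Coarse-Grained Interoperability is that the host workflow system runs a foreign workflow as one opaque job, delegating execution to that workflow's own engine. The sketch below is hypothetical Python with made-up names and arguments, intended only to convey the black-box idea; the actual SHIWA Simulation Platform services are considerably more involved:

import subprocess
from pathlib import Path

def run_foreign_workflow(engine_cmd, workflow_bundle, inputs, workdir):
    # Hypothetical coarse-grained wrapper: stage the inputs, call the foreign
    # workflow engine's own command-line runner as a black box, collect outputs.
    # All names, arguments and the "output_*" convention are illustrative only.
    workdir = Path(workdir)
    workdir.mkdir(parents=True, exist_ok=True)
    for name, src in inputs.items():                 # stage input files
        (workdir / name).write_bytes(Path(src).read_bytes())
    result = subprocess.run(                         # delegate execution to the foreign engine
        list(engine_cmd) + [str(workflow_bundle)],
        cwd=workdir, capture_output=True, text=True, check=True,
    )
    outputs = {p.name: p for p in workdir.glob("output_*")}   # collect produced files
    return outputs, result.stdout

Because only the inputs, the invocation command and the outputs need to be described, researchers can reuse workflows from other systems without learning those systems, which is the sharing scenario the paper targets for Health Care analysis and simulation.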