424 results for DSP


Relevance: 10.00%

Abstract:

Despite ethical and technical concerns, the in vivo method, more commonly referred to as the mouse bioassay (MBA), is employed globally as a reference method for phycotoxin analysis in shellfish, particularly for paralytic shellfish poisoning (PSP) and emerging toxin monitoring. A high-performance liquid chromatography method with fluorescence detection (HPLC-FLD) has been developed for PSP toxin analysis, but owing to difficulties and limitations in the method, this procedure has not been fully implemented as a replacement. Detection of the diarrhetic shellfish poisoning (DSP) toxins has moved towards LC-mass spectrometry (MS) analysis, whereas the analysis of the amnesic shellfish poisoning (ASP) toxin domoic acid is performed by HPLC. Although alternative methods of detection to the MBA have been described, each procedure is specific to a particular toxin and its analogues, with each group of toxins requiring separate analysis using different extraction procedures and analytical equipment. In addition, when replacing the MBA, consideration must be given to the detection of unregulated and emerging toxins. The ideal scenario for the monitoring of phycotoxins in shellfish and seafood would be to evolve towards multiple toxin detection on a single bioanalytical sensing platform, i.e. an 'artificial mouse'. Immunologically based techniques, and in particular surface plasmon resonance (SPR) technology, have been shown to be highly promising bioanalytical tools offering rapid, real-time detection while requiring minimal quantities of toxin standards. A Biacore Q and a prototype multiplex SPR biosensor were evaluated for their fitness for purpose for the simultaneous detection of key regulated phycotoxin groups and the emerging toxin palytoxin. The prototype, deemed more applicable because of its separate flow channels, achieved detection limits (IC20) for the domoic acid, okadaic acid, saxitoxin and palytoxin calibration curves in shellfish of 4,000, 36, 144 and 46 μg/kg of mussel, respectively. A one-step extraction procedure demonstrated recoveries greater than 80 % for all toxins. For validation of the method at the 95 % confidence limit, the decision limits (CCα) determined from an extracted matrix curve were calculated to be 450, 36 and 24 μg/kg, and the detection capabilities (CCβ) as a screening method were ≤10 mg/kg, ≤160 μg/kg and ≤400 μg/kg for domoic acid, okadaic acid and saxitoxin, respectively.
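
As an aside on how an IC20 detection limit is typically obtained from an inhibition calibration curve, the Python sketch below fits a four-parameter logistic to made-up SPR-style data and inverts it at 20 % inhibition. The concentrations, responses and starting parameters are invented for illustration only and are not the calibration data reported above.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = zero-dose response, d = saturated response,
    c = IC50, b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic, okadaic-acid-style calibration data (made-up numbers).
conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)   # µg/kg
resp = np.array([980, 950, 870, 640, 330, 150, 80], dtype=float)

(a, b, c, d), _ = curve_fit(four_pl, conc, resp, p0=[1000, 1, 50, 50])

# IC20 = concentration causing 20 % inhibition of the zero-dose response,
# obtained by inverting the fitted curve: x = c * 0.25**(1/b).
ic20 = c * 0.25 ** (1.0 / b)
print(f"fitted IC50 = {c:.1f} µg/kg, IC20 = {ic20:.1f} µg/kg")
```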

Relevance: 10.00%

Abstract:

Future digital signal processing (DSP) systems must provide robustness at the algorithm and application level against the reliability issues that accompany implementations in modern semiconductor process technologies. In this paper, we address this issue by investigating the impact of unreliable memories on general DSP systems. In particular, we propose a novel framework to characterize the effects of unreliable memories, which enables us to devise novel methods to mitigate the associated performance loss. We propose to deploy specifically designed data representations, which can substantially improve system reliability compared with the conventional data representations used in digital integrated circuits, such as 2's-complement or sign-magnitude number formats. To demonstrate the efficacy of the proposed framework, we analyze the impact of unreliable memories on coded communication systems, and we show that the deployment of optimized data representations substantially improves the error-rate performance of such systems.
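
To illustrate why the stored number format matters on an unreliable memory, the short Python sketch below measures the average squared value error caused by random bit flips under two conventional representations. The word width, flip probability and uniform data are assumptions for the illustration; the optimized representations proposed in the paper are not reproduced here.

```python
import numpy as np

WORD_BITS = 8       # assumed word width for the illustration
P_FLIP = 1e-2       # assumed independent bit-flip probability per stored bit

def twos_complement(bits):
    """Decode an 8-bit pattern (0..255) as a 2's-complement value."""
    return bits - (1 << WORD_BITS) if bits & (1 << (WORD_BITS - 1)) else bits

def sign_magnitude(bits):
    """Decode an 8-bit pattern as sign-magnitude: MSB is the sign bit."""
    mag = bits & ((1 << (WORD_BITS - 1)) - 1)
    return -mag if bits & (1 << (WORD_BITS - 1)) else mag

def mean_squared_flip_error(decode, trials=20_000, seed=0):
    """Average squared value error caused by random bit flips, for uniformly
    distributed stored words under the given number format."""
    rng = np.random.default_rng(seed)
    words = rng.integers(0, 1 << WORD_BITS, size=trials)
    flips = rng.random((trials, WORD_BITS)) < P_FLIP
    masks = (flips * (1 << np.arange(WORD_BITS))).sum(axis=1)
    errors = [decode(int(w ^ m)) - decode(int(w)) for w, m in zip(words, masks)]
    return float(np.mean(np.asarray(errors, dtype=float) ** 2))

print("2's complement :", mean_squared_flip_error(twos_complement))
print("sign-magnitude :", mean_squared_flip_error(sign_magnitude))
```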

Relevance: 10.00%

Abstract:

The design and VLSI implementation of two key components of the class-IV partial response maximum likelihood (PR-IV) channel, the adaptive filter and the Viterbi decoder, are described. These blocks are implemented using parameterised VHDL modules from a library of common digital signal processing (DSP) and arithmetic functions. Design studies, based on 0.6 micron 3.3 V standard cell processes, indicate that worst-case sampling rates of 49 mega-samples per second are achievable for this system, with proportionally higher sampling rates for full-custom designs and smaller-geometry processes. Significant increases in the sampling rate, from 49 MHz to approximately 180 MHz, can be achieved by operating four filter modules in parallel, and this implementation has 50% lower power consumption than a pipelined filter operating at the same speed.
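
To make the parallelisation idea concrete, here is a minimal Python sketch of a FIR filter that computes four output samples per "clock" from a shared delay line, which is the principle behind running four filter modules in parallel. The tap values and word-level details are placeholders, not the adaptive filter described above.

```python
import numpy as np

def fir_block4(x, taps):
    """FIR filter computing four output samples per iteration, mimicking four
    parallel filter modules fed from a shared delay line."""
    L = len(taps)
    padded = np.concatenate([np.zeros(L - 1), x])   # zero history for the first samples
    y = np.zeros(len(x))
    for n0 in range(0, len(x), 4):                  # one iteration = one parallel "clock"
        for k in range(4):                          # the four parallel modules
            n = n0 + k
            if n >= len(x):
                break
            window = padded[n : n + L][::-1]        # most recent sample first
            y[n] = np.dot(window, taps)
    return y

# Sanity check against a serial reference implementation.
rng = np.random.default_rng(1)
x = rng.standard_normal(100)
taps = np.array([0.25, 0.5, 0.25])                  # placeholder coefficients
assert np.allclose(fir_block4(x, taps), np.convolve(x, taps)[: len(x)])
```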

Relevance: 10.00%

Abstract:

The Field Programmable Gate Array (FPGA) implementation of the commonly used Histogram of Oriented Gradients (HOG) algorithm is explored. The HOG algorithm is employed to extract features for object detection. A key focus has been to explore the use of a new FPGA-based processor, the IPPro, which is targeted at image processing. The paper gives details of the mapping and scheduling factors that influence performance, and of the stages undertaken to deploy the algorithm on FPGA hardware whilst taking into account the specific IPPro architecture features. We show that multi-core IPPro performance can exceed that of state-of-the-art FPGA designs by up to 3.2 times, with reduced design and implementation effort and increased flexibility, all on a low-cost Zynq programmable system.
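
For reference, the core of the HOG feature computation that such an implementation maps onto hardware is sketched below in Python: per-cell, magnitude-weighted orientation histograms. Block normalisation and bin interpolation, which a full HOG pipeline also performs, are omitted, and the cell size and bin count are the usual defaults rather than values taken from the paper.

```python
import numpy as np

def hog_cell_histograms(image, cell=8, bins=9):
    """Per-cell gradient-orientation histograms, the core of the HOG descriptor.

    Returns an array of shape (rows//cell, cols//cell, bins) of
    magnitude-weighted histograms over unsigned orientations (0-180 degrees).
    """
    image = np.asarray(image, dtype=float)
    # Centred [-1, 0, 1] gradients, the standard HOG derivative mask.
    gx = np.zeros_like(image)
    gy = np.zeros_like(image)
    gx[:, 1:-1] = image[:, 2:] - image[:, :-2]
    gy[1:-1, :] = image[2:, :] - image[:-2, :]

    magnitude = np.hypot(gx, gy)
    orientation = np.rad2deg(np.arctan2(gy, gx)) % 180.0    # unsigned orientation

    rows, cols = image.shape
    hists = np.zeros((rows // cell, cols // cell, bins))
    bin_width = 180.0 / bins
    for r in range(rows // cell):
        for c in range(cols // cell):
            mag = magnitude[r*cell:(r+1)*cell, c*cell:(c+1)*cell].ravel()
            ori = orientation[r*cell:(r+1)*cell, c*cell:(c+1)*cell].ravel()
            idx = np.minimum((ori // bin_width).astype(int), bins - 1)
            np.add.at(hists[r, c], idx, mag)                # magnitude-weighted vote
    return hists
```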

Relevance: 10.00%

Abstract:

Fully Homomorphic Encryption (FHE) is a recently developed cryptographic technique which allows computations on encrypted data. There are many interesting applications for this encryption method, especially within cloud computing. However, its computational complexity is such that it is not yet practical for real-time applications. This work proposes optimised hardware architectures for the encryption step of an integer-based FHE scheme with the aim of improving its practicality. A low-area design and a high-speed parallel design are proposed and implemented on a Xilinx Virtex-7 FPGA, targeting the available DSP slices, which offer high-speed multiplication and accumulation. Both use the Comba multiplication scheduling method to manage the large multiplications required with unevenly sized multiplicands and to minimise the number of read and write operations to RAM. Results show that speed-up factors of 3.6 and 10.4 can be achieved for the encryption step with medium-sized security parameters for the low-area and parallel designs respectively, compared with the benchmark software implementation on an Intel Core2 Duo E8400 platform running at 3 GHz.
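
For readers unfamiliar with Comba scheduling, the Python sketch below shows the column-wise accumulation it is based on: partial products are summed one output column at a time, so each result word is written exactly once, which is what keeps RAM read/write traffic low in designs like the ones above. The 32-bit limb size and test values are arbitrary choices for the illustration.

```python
def comba_multiply(a, b, word_bits=32):
    """Column-wise (Comba) multi-precision multiplication.

    a, b: little-endian lists of limbs, each limb < 2**word_bits.
    Returns the product as a little-endian list of len(a) + len(b) limbs.
    """
    mask = (1 << word_bits) - 1
    result = [0] * (len(a) + len(b))
    acc = 0                                   # wide column accumulator
    for k in range(len(a) + len(b) - 1):      # one pass per output column
        lo, hi = max(0, k - len(b) + 1), min(k, len(a) - 1)
        for i in range(lo, hi + 1):           # all partial products of column k
            acc += a[i] * b[k - i]
        result[k] = acc & mask                # write the column exactly once
        acc >>= word_bits                     # carry into the next column
    result[-1] = acc & mask
    return result

# Quick check against Python's arbitrary-precision integers.
def limbs(x, n, word_bits=32):
    return [(x >> (word_bits * i)) & ((1 << word_bits) - 1) for i in range(n)]

x, y = 0x1234_5678_9ABC_DEF0_1111, 0xFEDC_BA98_7654_3210
prod = comba_multiply(limbs(x, 3), limbs(y, 2))
assert sum(p << (32 * i) for i, p in enumerate(prod)) == x * y
```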

Relevance: 10.00%

Abstract:

The need for fast-response demand side participation (DSP) has never been greater, owing to increased wind power penetration. Suppliers of domestic white goods are currently developing a 'smart' chip for a range of appliances (e.g. refrigeration units, tumble dryers and storage heaters) to support the home as a DSP unit in future power systems. This paper presents an aggregated population-based model of a single-compressor fridge-freezer. Two scenarios (i.e. energy efficiency class and size) for valley filling and peak shaving are examined to quantify and value DSP savings in 2020. The analysis shows that potential peak reductions of 40 MW to 55 MW are achievable in the Single Electricity Market of Ireland (the test system), along with valley demand increases of up to 30 MW. The study also shows the importance of the control strategy start time and the staggering of the devices to obtain the desired filling or shaving effect.
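
The sketch below is a deliberately simplified, population-level toy of the staggering idea: a population of duty-cycled compressors is held off during a peak-shaving event and then released with randomised restart offsets so that the rebound is spread out. Every device count, power rating and timing below is a made-up placeholder, not a parameter of the model above.

```python
import numpy as np

N_DEVICES    = 100_000     # population of smart fridge-freezers (assumed)
P_COMPRESSOR = 0.00012     # MW drawn by one running compressor (120 W, assumed)
DUTY_CYCLE   = 0.35        # fraction of the cycle the compressor is on (assumed)
CYCLE_MIN    = 60          # compressor cycle length in minutes (assumed)
EVENT_START, EVENT_LEN, STAGGER = 18 * 60, 30, 30    # peak-shaving event (minutes)

rng = np.random.default_rng(0)
phase = rng.integers(0, CYCLE_MIN, N_DEVICES)        # random cycle phases
restart = EVENT_START + EVENT_LEN + rng.integers(0, STAGGER, N_DEVICES)

minutes = np.arange(24 * 60)
baseline = np.zeros(minutes.size)
with_dsp = np.zeros(minutes.size)
for t in minutes:
    on = ((t + phase) % CYCLE_MIN) < DUTY_CYCLE * CYCLE_MIN
    baseline[t] = on.sum() * P_COMPRESSOR
    if EVENT_START <= t < EVENT_START + EVENT_LEN:
        with_dsp[t] = 0.0                            # all compressors held off
    elif EVENT_START + EVENT_LEN <= t < EVENT_START + EVENT_LEN + STAGGER:
        # devices are released with staggered offsets to avoid a synchronised
        # rebound peak when the event ends
        with_dsp[t] = (on & (restart <= t)).sum() * P_COMPRESSOR
    else:
        with_dsp[t] = baseline[t]

shaved = baseline[EVENT_START:EVENT_START + EVENT_LEN].mean()
print(f"average demand shaved during the event: {shaved:.2f} MW (toy numbers)")
```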

Relevance: 10.00%

Abstract:

In this paper, we introduce a statistical data-correction framework that aims at improving DSP system performance in the presence of unreliable memories. The proposed signal processing framework implements best-effort error mitigation for signals that are corrupted by defects in unreliable storage arrays, using a statistical correction function derived from the signal statistics, a data-corruption model, and an application-specific cost function. An application example from communication systems demonstrates the efficacy of the proposed approach.
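
As a concrete, deliberately simple instance of such a correction function, the sketch below builds a posterior-mean look-up table for a small word width: a signal prior, an independent bit-flip corruption model and a mean-squared-error cost together determine the corrected value for every possible read word. The word width, flip probability and prior are assumptions; the paper's framework is more general than this example.

```python
import numpy as np

WORD_BITS = 4      # small word width so the whole alphabet can be enumerated
P_FLIP = 0.05      # assumed independent bit-flip probability of the memory

def posterior_mean_table(prior):
    """Look-up table mapping every possible read word to the value that
    minimises the mean-squared error, given the signal prior and an
    independent bit-flip corruption model."""
    values = np.arange(1 << WORD_BITS)
    # hamming[r, v] = number of bits in which read word r and stored value v differ
    hamming = np.array([[bin(r ^ v).count("1") for v in values] for r in values])
    likelihood = (P_FLIP ** hamming) * ((1 - P_FLIP) ** (WORD_BITS - hamming))
    posterior = likelihood * prior[None, :]
    posterior /= posterior.sum(axis=1, keepdims=True)
    return posterior @ values              # MMSE estimate for each read word

# Example prior: a signal whose samples are concentrated near zero.
prior = np.exp(-0.5 * np.arange(1 << WORD_BITS))
prior /= prior.sum()
table = posterior_mean_table(prior)
print("a read value of 8 is corrected to", round(table[8], 2))
```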

Relevance: 10.00%

Abstract:

Harmful algal blooms (HABs) are a natural global phenomenon growing in severity and extent, and incidents have many economic, ecological and human health impacts. Monitoring and providing early warning of toxic HABs are critical for protecting public health. Current monitoring programmes include measuring the number of toxic phytoplankton cells in the water and biotoxin levels in shellfish tissue. As these efforts are demanding and labour intensive, methods which improve their efficiency are essential. This study compares the use of a multitoxin surface plasmon resonance (multitoxin SPR) biosensor with enzyme-linked immunosorbent assay (ELISA) and with analytical methods such as high-performance liquid chromatography with fluorescence detection (HPLC-FLD) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) for toxic HAB monitoring efforts in Europe. Seawater samples (n = 256) from European waters, collected 2009-2011, were analysed for the biotoxins saxitoxin and analogues, okadaic acid and dinophysistoxins 1/2 (DTX1/DTX2), and domoic acid, responsible for paralytic shellfish poisoning (PSP), diarrheic shellfish poisoning (DSP) and amnesic shellfish poisoning (ASP), respectively. Biotoxins were detected mainly in samples from Spain and Ireland; France and Norway appeared to have the lowest number of toxic samples. Both the multitoxin SPR biosensor and the RNA microarray were more sensitive at detecting toxic HABs than standard light-microscopy phytoplankton monitoring. Agreement between the detection methods, assessed from statistical 2 × 2 comparison tables, ranged between 32% and 74% across the three toxin families, illustrating that no single testing method is an ideal solution on its own. An efficient early-warning monitoring system for the detection of toxic HABs could therefore be achieved by combining the multitoxin SPR biosensor and the RNA microarray.
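
For clarity on how agreement between two testing platforms can be summarised from a 2 × 2 comparison table, here is a small Python sketch computing overall (percent) agreement, together with Cohen's kappa as an optional chance-corrected measure; the counts in the example are invented and are not the study's data.

```python
import numpy as np

def agreement_stats(table):
    """Percent agreement and Cohen's kappa for a 2 x 2 comparison table.

    table[i][j]: number of samples called i (0 = non-toxic, 1 = toxic) by
    method A and j by method B.
    """
    t = np.asarray(table, dtype=float)
    n = t.sum()
    observed = np.trace(t) / n                                 # overall agreement
    expected = (t.sum(axis=1) * t.sum(axis=0)).sum() / n ** 2  # chance agreement
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

obs, kappa = agreement_stats([[150, 30],    # both negative / A negative, B positive
                              [20, 56]])    # A positive, B negative / both positive
print(f"overall agreement: {obs:.0%}, Cohen's kappa: {kappa:.2f}")
```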

Relevance: 10.00%

Abstract:

This thesis investigates the characterization (and modelling) of devices that interface between the digital and analogue domains, such as the output buffers of integrated circuits (ICs). Today's wireless terminals are being developed with the software-defined radio concept introduced by Mitola in mind. Ideally, this architecture takes advantage of powerful processors and extends the operation of the digital blocks as close to the antenna as possible. It is therefore not surprising that there is growing concern within the scientific community regarding the characterization of the blocks that interface between the analogue and digital domains, the digital-to-analogue and analogue-to-digital converters being two good examples of such circuits. Within high-speed digital circuits, such as Flash memories, a similar role is played by the output buffers. These interface between the digital domain (the logic core) and the analogue domain (the IC package and the parasitics associated with the transmission lines), determining the integrity of the transmitted signal. To speed up signal integrity analysis during IC design, it is essential to have models that are both computationally efficient and accurate. Typically, the extraction/validation of output buffer models is carried out using data obtained from simulation of a detailed (transistor-level) model or from experimental results. The latter approach does not raise intellectual property issues; however, it is rarely mentioned in the literature on output buffer characterization. Accordingly, this doctoral thesis focuses on the development of a new measurement setup for the characterization and modelling of high-speed output buffers, with a natural extension to RF-CMOS switched-mode amplifier devices. Based on a well-defined experimental procedure, a state-of-the-art model is extracted and validated. The measurement setup developed addresses the integrity not only of the output signals but also of the power supply bus. To determine the sensitivity of the estimated quantities (voltage and current) to the errors present in the various variables associated with the experimental procedure, an uncertainty analysis is also presented.

Relevance: 10.00%

Abstract:

Semi-autonomous avatars should be both realistic and believable. The goal is to learn from and reproduce the behaviours of the user-controlled input, enabling semi-autonomous avatars to interact plausibly with their human-controlled counterparts. A powerful tool for embedding autonomous behaviour is learning by imitation. Hence, in this paper an ensemble of fuzzy inference systems clusters the user input data to identify natural groupings within the data and so describe the user's movement and actions in a more abstract way. Multiple clustering algorithms are investigated along with a neuro-fuzzy classifier, and an ensemble of fuzzy systems is evaluated.
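
Fuzzy c-means is one of the standard fuzzy clustering algorithms that could be used for such a grouping of user-input features; a minimal NumPy sketch is given below. It is a generic illustration, not the specific ensemble or neuro-fuzzy classifier evaluated in the paper.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Basic fuzzy c-means clustering.

    X: (n_samples, n_features) array of user-input feature vectors.
    m: fuzzifier (> 1); larger values give softer memberships.
    Returns (centers, memberships) with memberships of shape (n_clusters, n_samples).
    """
    rng = np.random.default_rng(seed)
    U = rng.random((n_clusters, X.shape[0]))
    U /= U.sum(axis=0, keepdims=True)              # columns sum to one
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)                      # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))              # standard FCM membership update
        U_new = inv / inv.sum(axis=0, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Example: cluster 2-D "movement" samples into three behaviour prototypes.
X = np.random.default_rng(1).standard_normal((300, 2))
centers, memberships = fuzzy_c_means(X, n_clusters=3)
```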

Relevance: 10.00%

Abstract:

Just as readers feel immersed when a story line adheres to their experiences, users will more easily feel immersed in a virtual environment if the behavior of the characters in that environment adheres to their expectations, based on their lifelong observations of the real world. This paper introduces a framework that allows authors to establish natural, human-like behavior, physical interaction and emotional engagement of characters living in a virtual environment. By representing people as realistic virtual characters, this framework allows them to feel immersed in an Internet-based virtual world in which they can meet and share experiences in a natural way, just as they would in real life. Rather than simply being visualized in a 3D space, the virtual characters (autonomous agents as well as avatars representing users) in the immersive environment facilitate social interaction and multi-party collaboration, mixing the virtual with the real.

Relevance: 10.00%

Abstract:

Master's dissertation, Biological Engineering, Faculdade de Engenharia de Recursos Naturais, Universidade do Algarve, 2009

Relevance: 10.00%

Abstract:

In this paper we carry out a detailed performance analysis of a novel blind source separation (BSS) based DSP algorithm that tackles the carrier phase synchronization error problem. The results indicate that the mismatch can be effectively compensated during normal operation as well as in rapidly changing environments. Since the compensation is carried out before any modulation-specific processing, the proposed method works with all standard modulation formats and lends itself to efficient real-time custom integrated hardware or software implementations.
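
For context on what carrier-phase compensation involves, the sketch below uses the classical fourth-power (non-data-aided) estimator for QPSK to estimate and remove a phase offset. It is a textbook baseline, not the BSS-based algorithm analysed in the paper, and unlike that algorithm it is specific to QPSK and only resolves the offset modulo 90 degrees.

```python
import numpy as np

def fourth_power_phase_estimate(rx):
    """Blind carrier-phase estimate for QPSK: raising the received symbols to
    the 4th power strips the data modulation, so the mean of rx**4 points at
    four times the phase offset (resolved modulo 90 degrees)."""
    return np.angle(np.mean(rx ** 4)) / 4.0

# Demonstration on synthetic QPSK ({1, j, -1, -j}) with a 20-degree phase error.
rng = np.random.default_rng(0)
symbols = np.exp(1j * np.pi / 2 * rng.integers(0, 4, 5000))
noisy = symbols * np.exp(1j * np.deg2rad(20)) + 0.05 * (
    rng.standard_normal(5000) + 1j * rng.standard_normal(5000))

phase_hat = fourth_power_phase_estimate(noisy)
corrected = noisy * np.exp(-1j * phase_hat)          # derotate before demodulation
print(f"estimated offset: {np.rad2deg(phase_hat):.1f} degrees")
```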

Relevance: 10.00%

Abstract:

This paper explores the benefits of compensating transmitter gain and phase imbalances in the receiver for quadrature communication systems. It is assumed that the gain and phase imbalances are introduced at the transmitter only. A simple non-data-aided DSP algorithm is used at the receiver to compensate for the imbalances. Computer simulations have been performed to study a coherent QPSK communication system.
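
One well-known non-data-aided way to correct an IQ gain/phase imbalance at the receiver is the Gram-Schmidt orthogonalisation procedure, sketched below on a synthetic QPSK signal. The imbalance values and signal model are assumptions for the demonstration, and this is a standard baseline rather than the specific algorithm studied in the paper.

```python
import numpy as np

def gsop_compensate(rx):
    """Blind IQ-imbalance compensation by Gram-Schmidt orthogonalisation: the
    quadrature branch is de-correlated from the in-phase branch and both
    branches are rescaled to unit power, without any training data."""
    i, q = rx.real, rx.imag
    i_out = i / np.sqrt(np.mean(i ** 2))
    q_tmp = q - (np.mean(i * q) / np.mean(i ** 2)) * i     # remove I-Q correlation
    q_out = q_tmp / np.sqrt(np.mean(q_tmp ** 2))
    return i_out + 1j * q_out

# Demonstration: QPSK distorted by a transmitter gain/phase imbalance.
rng = np.random.default_rng(0)
sym = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 10_000)))
g, phi = 1.2, np.deg2rad(10)                               # assumed imbalance values
i_tx = sym.real
q_tx = g * (sym.imag * np.cos(phi) + sym.real * np.sin(phi))   # imbalanced Q branch
distorted = i_tx + 1j * q_tx
restored = gsop_compensate(distorted)

# |E[x^2]| (improperness) is near zero for a balanced QPSK signal.
print(f"improperness |E[x^2]|: {abs(np.mean(distorted**2)):.3f} before, "
      f"{abs(np.mean(restored**2)):.3f} after")
```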