5 results for Minimal Set

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance:

60.00%

Publisher:

Abstract:

The electroencephalogram (EEG) is a medical technology that is used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up time, and is suitably unobtrusive and mobile to allow continuous monitoring of the patient, either in clinical or domestic environments. Consequently, the EEG is the current tool-of-choice with which to continuously monitor the brain where temporal resolution, ease-of-use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade the classification performance of automatic neurological event detection. This thesis therefore contributes to the further improvement of automated neurological event detection systems, by identifying some of the major obstacles in deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients. In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science.
The first body of work outlined in this thesis proposes a system to automatically detect head-movement artefacts in ambulatory EEG, and utilises supervised machine learning classifiers to do so. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional sensor signals, in the form of gyroscopes, are used to detect head movements and, in doing so, bring additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection which compares favourably with other state-of-the-art systems is achieved.
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner: blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefact from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so, the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world clinical domain one step closer.
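To illustrate the classifier-based artefact detection and EEG/gyroscope fusion described above, the following is a minimal sketch, assuming synthetic epochs and simple hand-picked features (EEG line length and variance, gyroscope energy); the feature set, window length and data are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch (not the thesis code): fusing EEG and gyroscope features
# with a supervised SVM classifier to flag head-movement artefact epochs.
# Features, window length and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def epoch_features(eeg, gyro):
    """Per-epoch features: EEG line length and variance, gyroscope energy."""
    return np.array([
        np.sum(np.abs(np.diff(eeg))),    # EEG line length (tracks abrupt swings)
        np.var(eeg),                     # EEG variance
        np.sum(gyro ** 2),               # gyroscope energy (head-movement proxy)
    ])

# Synthetic 2 s epochs at 256 Hz EEG / 100 Hz gyroscope, half contaminated.
n_epochs, fs_eeg, fs_gyro = 200, 256, 100
X, y = [], []
for i in range(n_epochs):
    artefact = i % 2
    eeg = rng.normal(0, 10 + 40 * artefact, 2 * fs_eeg)      # microvolts
    gyro = rng.normal(0, 0.1 + 1.5 * artefact, 2 * fs_gyro)  # rad/s
    X.append(epoch_features(eeg, gyro))
    y.append(artefact)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```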

Relevance:

60.00%

Publisher:

Abstract:

The contribution of buildings towards total worldwide energy consumption in developed countries is between 20% and 40%. Heating, Ventilation and Air Conditioning (HVAC), and more specifically Air Handling Unit (AHU), energy consumption accounts on average for 40% of a typical medical device manufacturing or pharmaceutical facility's energy consumption. Studies have indicated that 20–30% energy savings are achievable by recommissioning HVAC systems, and more specifically AHU operations, to rectify faulty operation. Automated Fault Detection and Diagnosis (AFDD) is a process concerned with partially or fully automating the commissioning process through the detection of faults. An expert system is a knowledge-based system which employs Artificial Intelligence (AI) methods to replicate the knowledge of a human subject matter expert in a particular field, such as engineering, medicine, finance or marketing, to name a few. This thesis details the research and development work undertaken in the development and testing of a new AFDD expert system for AHUs, which can be installed in minimal set-up time on a large cross-section of AHU types in a manner that is vendor neutral with respect to the building management system. Both simulated and extensive field testing were undertaken against a widely available and industry-known expert set of rules, the Air Handling Unit Performance Assessment Rules (APAR) (and a later, more developed version known as APAR_extended), in order to prove its effectiveness. Specifically, in tests against a dataset of 52 simulated faults, this new AFDD expert system identified all 52 derived issues, whereas the APAR ruleset identified just 10. In tests using actual field data from 5 operating AHUs in 4 manufacturing facilities, the newly developed AFDD expert system for AHUs was shown to identify four individual fault case categories that the APAR method did not, as well as showing improvements in the area of fault diagnosis.
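To make the rule-based AFDD idea concrete, here is a minimal sketch of two APAR-style temperature-consistency checks on a single AHU sample; the sensor names, tolerances and fault logic are illustrative assumptions, not the thesis expert system or the published APAR ruleset.

```python
# Illustrative sketch of APAR-style rule checks for one AHU sample
# (assumed sensor names and tolerances, not the thesis ruleset).
from dataclasses import dataclass

@dataclass
class AhuSample:
    t_supply: float       # supply air temperature, deg C
    t_mixed: float        # mixed air temperature, deg C
    t_return: float       # return air temperature, deg C
    t_outside: float      # outside air temperature, deg C
    heating_valve: float  # heating coil valve position, 0..1

def rule_heating_coil(sample: AhuSample, tol: float = 1.0) -> bool:
    """Fault if the heating valve is open but supply air is not warmer
    than mixed air by more than the tolerance."""
    if sample.heating_valve > 0.1:
        return sample.t_supply < sample.t_mixed + tol
    return False

def rule_mixed_air_bounds(sample: AhuSample, tol: float = 1.0) -> bool:
    """Fault if mixed air temperature lies outside the range spanned by
    return and outside air temperatures (plus a tolerance)."""
    lo = min(sample.t_return, sample.t_outside) - tol
    hi = max(sample.t_return, sample.t_outside) + tol
    return not (lo <= sample.t_mixed <= hi)

sample = AhuSample(t_supply=18.0, t_mixed=19.5, t_return=22.0,
                   t_outside=5.0, heating_valve=0.6)
for rule in (rule_heating_coil, rule_mixed_air_bounds):
    print(rule.__name__, "->", "FAULT" if rule(sample) else "ok")
```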

Relevance:

20.00%

Publisher:

Abstract:

Advanced sensory systems address a number of major obstacles to the provision of cost-effective and proactive rehabilitation. Many of these systems employ technologies such as high-speed video or motion capture to generate quantitative measurements. However, these solutions are accompanied by some major limitations, including extensive set-up and calibration, restriction to indoor use, high cost and time-consuming data analysis. Additionally, many do not quantify improvement in a rigorous manner, for example gait analysis for 5 minutes as opposed to 24-hour ambulatory monitoring. This work addresses these limitations using low-cost, wearable, wireless inertial measurement as a mobile and minimal-infrastructure alternative. In cooperation with healthcare professionals, the goal is to design and implement a reconfigurable and intelligent movement capture system. A key component of this work is an extensive benchmark comparison with the 'gold standard' VICON motion capture system.
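As a rough indication of how wearable inertial sensing can stand in for camera-based capture, the sketch below fuses gyroscope and accelerometer samples with a simple complementary filter to estimate a tilt angle; the filter gain, sample rate and synthetic movement are assumptions for illustration and do not represent the thesis system or its VICON benchmarking protocol.

```python
# Minimal sketch: estimating a tilt/joint angle from a single wearable IMU
# with a complementary filter (assumed gain and sample rate, illustrative only).
import math
import numpy as np

np.random.seed(0)

def complementary_filter(gyro_rate, accel_x, accel_z, dt=0.01, alpha=0.98):
    """Fuse gyroscope rate (rad/s) with accelerometer-derived tilt (rad)."""
    angle = 0.0
    estimates = []
    for g, ax, az in zip(gyro_rate, accel_x, accel_z):
        accel_angle = math.atan2(ax, az)                       # gravity-referenced tilt
        angle = alpha * (angle + g * dt) + (1 - alpha) * accel_angle
        estimates.append(angle)
    return np.array(estimates)

# Synthetic slow flexion movement with sensor noise.
t = np.arange(0, 5, 0.01)
true_angle = 0.5 * np.sin(0.5 * np.pi * t)                     # rad
gyro = np.gradient(true_angle, t) + np.random.normal(0, 0.05, t.size)
ax = np.sin(true_angle) + np.random.normal(0, 0.02, t.size)
az = np.cos(true_angle) + np.random.normal(0, 0.02, t.size)

est = complementary_filter(gyro, ax, az)
rms = np.sqrt(np.mean((est - true_angle) ** 2))
print(f"RMS error vs. ground truth: {rms:.3f} rad")
```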

Relevance:

20.00%

Publisher:

Abstract:

With the proliferation of mobile wireless communication and embedded systems, energy efficiency becomes a major design constraint. The dissipated energy is often referred to as the product of power dissipation and the input-output delay. Most electronic design automation techniques focus on optimising only one of these parameters, either power or delay. Industry-standard design flows integrate systematic methods of optimising either area or timing, while for power consumption optimisation one often employs heuristics which are characteristic of a specific design. In this work we answer three questions in our quest to provide a systematic approach to joint power and delay optimisation. The first question of our research is: how can a design flow be built which incorporates academic and industry-standard design flows for power optimisation? To address this question, we use a reference design flow provided by Synopsys and integrate into this flow academic tools and methodologies. The proposed design flow is used as a platform for analysing some novel algorithms and methodologies for optimisation in the context of digital circuits. The second question we answer is: is it possible to apply a systematic approach to power optimisation in the context of combinational digital circuits? The starting point is the selection of a suitable data structure which can easily incorporate information about delay, power and area, and which then allows optimisation algorithms to be applied. In particular, we address the implications of systematic power optimisation methodologies and the potential degradation of other (often conflicting) parameters such as the area or the delay of the implementation. Finally, the third question which this thesis attempts to answer is: is there a systematic approach to multi-objective optimisation of delay and power? Delay-driven power optimisation and power-driven delay optimisation are proposed in order to obtain balanced delay and power values. This implies that each power optimisation step is constrained not only by the decrease in power but also by the increase in delay. Similarly, each delay optimisation step is governed not only by the decrease in delay but also by the increase in power. The goal is multi-objective optimisation of digital circuits where the two conflicting objectives are power and delay. The logic synthesis and optimisation methodology is based on AND-Inverter Graphs (AIGs), which represent the functionality of the circuit. The switching activities and arrival times of circuit nodes are annotated onto an AND-Inverter Graph under both zero-delay and non-zero-delay models. We then introduce several reordering rules which are applied to the AIG nodes to minimise switching power or the longest-path delay of the circuit at the pre-technology-mapping level. The academic Electronic Design Automation (EDA) tool ABC is used for the manipulation of AND-Inverter Graphs. We have implemented various combinatorial optimisation algorithms often used in Electronic Design Automation, such as Simulated Annealing and Uniform Cost Search. Simulated Annealing (SA) is a probabilistic metaheuristic for locating a good approximation to the global optimum of a given function in a large search space. We used SA to decide probabilistically between moving from one optimised solution to another, such that the dynamic power is optimised under given delay constraints and the delay is optimised under given power constraints.
A good approximation to the global optimum under the given energy constraints is obtained. Uniform Cost Search (UCS) is a search algorithm used for traversing a weighted tree or graph. We have used UCS to search within the AIG network for a specific AIG node order on which to apply the reordering rules. After the reordering rules are applied, the AIG network is mapped to an AIG netlist using specific library cells. Our approach combines network restructuring, AIG node reordering, dynamic power and longest-path delay estimation and optimisation, and finally technology mapping to an AIG netlist. A set of MCNC benchmark circuits and large combinational circuits of up to 100,000 gates has been used to validate our methodology. Comparisons for power and delay optimisation are made with the best synthesis scripts used in ABC. Reductions of 23% in power and 15% in delay are achieved with minimal overhead, compared to the best known ABC results. Our approach is also implemented on a number of processors with combinational and sequential components, and significant savings are achieved.
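To illustrate the delay-constrained simulated-annealing step described above, the following is a minimal sketch in which a toy node ordering and toy power/delay cost functions stand in for the AIG reordering rules and their annotated costs: only moves that respect a delay limit are considered, and uphill moves in power are accepted with a temperature-dependent probability. All names and cost models here are assumptions, not the thesis implementation or ABC's API.

```python
# Hedged sketch of delay-constrained simulated annealing for power minimisation.
# A permutation of "nodes" and the toy cost functions stand in for the AIG
# reordering rules; they are illustrative assumptions only.
import math
import random

random.seed(0)

N = 30
order = list(range(N))          # toy stand-in for an AIG node ordering

def power(order):               # assumed toy cost: earlier high-activity nodes cost more
    return sum((N - i) * ((n * 37) % 7) for i, n in enumerate(order))

def delay(order):               # assumed toy cost: position of the last "critical" node
    return max(i for i, n in enumerate(order) if n % 5 == 0)

DELAY_LIMIT = delay(order)      # constraint: never exceed the starting delay
best = list(order)
temp = 100.0
for step in range(5000):
    cand = list(order)
    i, j = random.sample(range(N), 2)
    cand[i], cand[j] = cand[j], cand[i]        # candidate reordering move
    if delay(cand) > DELAY_LIMIT:
        continue                               # reject: violates the delay constraint
    d_power = power(cand) - power(order)
    if d_power < 0 or random.random() < math.exp(-d_power / temp):
        order = cand                           # accept downhill, or uphill with probability
        if power(order) < power(best):
            best = list(order)
    temp *= 0.999                              # geometric cooling schedule

print("power before/after:", power(list(range(N))), power(best))
```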

Relevance:

20.00%

Publisher:

Abstract:

Petrochemical plastics/polymers are a common feature of day-to-day living, occurring in packaging, furniture, mobile phones, computers, construction equipment and so on. However, these materials are produced from non-renewable resources and are resistant to microbial degradation in the environment. Considerable research has therefore been carried out into the production of sustainable, biodegradable polymers, amenable to microbial catabolism to CO2 and H2O. A key group of microbial polyesters, widely considered as optimal replacement polymers, are the polyhydroxyalkanoates (PHAs). Primary research in this area has focused on using recombinant pure cultures to optimise PHA yields; however, despite considerable success, the high costs of pure-culture fermentation have thus far hindered the commercial viability of PHAs thus produced. In more recent years, work has begun to focus on mixed cultures for the optimisation of PHA production, with the incorporation of waste substrates offering the greatest reduction in production costs. The scale of dairy processing in Ireland, and the high-organic-load wastewaters generated, represent an excellent potential substrate for bioconversion to PHAs in a mixed-culture system. The current study sought to investigate the potential for such bioconversion in a laboratory-scale biological system and to establish key operational and microbial characteristics of same. Two sequencing batch reactors were set up and operated along the lines of an enhanced biological phosphate removal (EBPR) system, which has PHA accumulation as a key step within repeated rounds of anaerobic/aerobic cycling. Influents to the reactors varied only in the carbon sources provided. Reactor 1 received artificial wastewater with acetate alone, which is known to be readily converted to PHA in the anaerobic step of EBPR. Reactor 2 wastewater influent contained acetate and skim milk to imitate a dairy processing effluent. Chemical monitoring of nutrient remediation within the reactors was applied continuously, and performances consistent with EBPR were observed. Qualitative analysis of the sludge was carried out using fluorescence microscopy with the lipophilic stain Nile Blue A, and PHA production was confirmed in both reactors. Quantitative analysis via HPLC detection of crotonic acid derivatives revealed the fluorescing polymer to be short-chain-length polyhydroxybutyrate, with biomass dry weight accumulations of 11% and 13% being observed in reactors 1 and 2, respectively. Gas chromatography-mass spectrometry for medium-chain-length methyl ester derivatives revealed the presence of hydroxyoctanoic, -decanoic and -dodecanoic acids in reactor 1. Similar analyses in reactor 2 revealed monomers of 3-hydroxydodecenoic and 3-hydroxytetradecanoic acids. Investigation of the microbial ecology of both reactors was conducted in an attempt to identify key species potentially contributing to reactor performance. Culture-dependent investigations indicated that quite different communities were present in the two reactors. Reactor 1 isolates demonstrated the following species distribution: Pseudomonas (82%), Delftia acidovorans (3%), Acinetobacter sp. (5%), Aminobacter sp. (3%), Bacillus sp. (3%), Thauera sp. (3%) and Cytophaga sp. (3%). Relative species distributions among the reactor 2 profiled isolates were more evenly spread: Pseudoxanthomonas (32%), Thauera sp. (24%), Acinetobacter (24%), Citrobacter sp. (8%), Lactococcus lactis (5%), Lysinibacillus (5%) and Elizabethkingia (2%).
In both reactors, Gammaproteobacteria dominated the cultured isolates. Culture-independent 16S rRNA gene analyses revealed differing profiles for the two reactors. Reactor 1 clone distribution was as follows: Zoogloea resiniphila (83%), Zoogloea oryzae (2%), Pedobacter composti (5%), Neisseriaceae sp. (2%), Rhodobacter sp. (2%), Runella defluvii (3%) and Streptococcus sp. (3%). RFLP-based species distribution among the reactor 2 clones was as follows: Runella defluvii (50%), Zoogloea oryzae (20%), Flavobacterium sp. (9%), Simplicispira sp. (6%), uncultured Sphingobacteria sp. (6%), Arcicella (6%) and Leadbetterella byssophila (3%). Betaproteobacteria dominated the 16S rRNA gene clones identified in both reactors. FISH analysis with Nile Blue dual staining resolved these divergent findings, identifying the Betaproteobacteria as the dominant PHA accumulators within the reactor sludges, although species- or strain-specific allocations could not be made. GC analysis of the sludge had indicated the presence of both medium-chain-length and short-chain-length PHAs accumulating in both reactors. In addition, the cultured isolates from the reactors had previously been identified as mcl- and scl-PHA producers, respectively. Characterisations of the PHA monomer profiles of the individual isolates were therefore performed to screen for potentially novel scl-mcl PHAs. Nitrogen-limitation-driven PHA accumulation in E2 minimal media revealed a greater propensity among isolates for mcl-PHA production. HPLC analysis indicated that PHB production was not a major feature of the reactor isolates, and this was supported by the low presence of scl phaC1 genes among PCR-screened isolates. A high percentage distribution of phaC2 mcl-PHA synthase genes was recorded, with the majority sharing high percentage homology with class II synthases from Pseudomonas sp. The common presence of a phaC2 homologue was not reflected in the production of a common polymer; considerable variation was noted in both the monomer composition and ratios following GC analysis. While co-polymer production could not be demonstrated, potentially novel synthase substrate specificities were noted which could be exploited further in the future.