9 results for Regulation-based classification system
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
Case-Based Reasoning (CBR) uses past experiences to solve new problems. The quality of the past experiences, which are stored as cases in a case base, is a major factor in the performance of a CBR system. The system's competence may be improved by adding problems to the case base after they have been solved and their solutions verified to be correct. However, from time to time, the case base may have to be refined to reduce redundancy and to remove any noisy cases that may have been introduced. Many case base maintenance algorithms have been developed to delete noisy and redundant cases. However, different algorithms work well in different situations, and it may be difficult for a knowledge engineer to know which one is best to use for a particular case base. In this thesis, we investigate ways to combine algorithms to produce better deletion decisions than the decisions made by individual algorithms, and ways to choose which algorithm is best for a given case base at a given time. We analyse five of the most commonly used maintenance algorithms in detail and show how the different algorithms perform better on different datasets. This motivates us to develop a new approach: maintenance by a committee of experts (MACE). MACE allows us to combine maintenance algorithms to produce a composite algorithm which exploits the merits of each of the algorithms that it contains. By combining different algorithms in different ways we can also define algorithms that have different trade-offs between accuracy and deletion. While MACE allows us to define an infinite number of new composite algorithms, we still face the problem of choosing which algorithm to use. To make this choice, we need to be able to identify properties of a case base that are predictive of which maintenance algorithm is best. We examine a number of measures of dataset complexity for this purpose. These provide a numerical way to describe a case base at a given time. We use the numerical description to develop a meta-case-based classification system. This system uses previous experience about which maintenance algorithm was best to use for other case bases to predict which algorithm to use for a new case base. Finally, we give the knowledge engineer more control over the deletion process by creating incremental versions of the maintenance algorithms. These incremental algorithms suggest one case at a time for deletion rather than a group of cases, which allows the knowledge engineer to decide whether each case in turn should be deleted or kept. We also develop incremental versions of the complexity measures, allowing us to create an incremental version of our meta-case-based classification system. Since the case base changes after each deletion, the best algorithm to use may also change. The incremental system allows us to choose which algorithm is best to use at each point in the deletion process.
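As a toy illustration of the committee idea described in this abstract (a minimal sketch, not the thesis's actual MACE design; the expert functions and the majority-vote rule are hypothetical stand-ins for real maintenance algorithms such as CNN or ICF):

```python
# Hypothetical sketch: combining case-base maintenance algorithms by majority vote.
# The expert functions and the voting rule are illustrative, not the MACE design itself.

def cnn_style_flags(case_base):
    """Placeholder: case ids a redundancy-reducing algorithm would delete."""
    return {c["id"] for c in case_base if c.get("redundant")}

def icf_style_flags(case_base):
    """Placeholder: case ids a noise-removing algorithm would delete."""
    return {c["id"] for c in case_base if c.get("noisy")}

def rc_style_flags(case_base):
    """Placeholder: case ids a third algorithm would delete."""
    return {c["id"] for c in case_base if c.get("noisy") or c.get("redundant")}

def committee_deletions(case_base, experts, quorum=2):
    """Delete a case only if at least `quorum` experts flag it.

    Raising the quorum deletes less and more cautiously; lowering it deletes
    more aggressively, mirroring the accuracy/deletion trade-off in the abstract.
    """
    votes = {}
    for expert in experts:
        for case_id in expert(case_base):
            votes[case_id] = votes.get(case_id, 0) + 1
    return {case_id for case_id, n in votes.items() if n >= quorum}

if __name__ == "__main__":
    case_base = [{"id": 1, "noisy": True}, {"id": 2, "redundant": True}, {"id": 3}]
    print(committee_deletions(case_base, [cnn_style_flags, icf_style_flags, rc_style_flags]))
```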
Abstract:
Background: Many European countries, including Ireland, lack high-quality, ongoing, population-based estimates of maternal behaviours and experiences during pregnancy. PRAMS is a CDC surveillance program established in the United States in 1987 to generate high-quality, population-based data to reduce infant mortality rates and improve maternal and infant health. PRAMS is the only ongoing population-based surveillance system of maternal behaviours and experiences that occur before, during and after pregnancy worldwide. Methods: The objective of this study was to adapt, test and evaluate a modified CDC PRAMS methodology in Ireland. The birth certificate file, which is the standard approach to sampling for PRAMS in the United States, was not available for the PRAMS Ireland study. Consequently, delivery record books for the period between 3 and 5 months before the study start date at a large urban obstetric hospital [8,900 births per year] were used to randomly sample 124 women. Name, address, maternal age, infant sex, gestational age at delivery, delivery method, APGAR score and birth weight were manually extracted from records. Stillbirths and early neonatal deaths were excluded using APGAR scores and hospital records. Women were sent a letter of invitation to participate, including an option to opt out, followed by a modified PRAMS survey, a reminder letter and a final survey. Results: The response rate for the pilot was 67%. Two per cent of women refused the survey, 7% opted out of the study and 24% did not respond. Survey items were at least 88% complete for all 82 respondents. Prevalence estimates of socially undesirable behaviours such as alcohol consumption during pregnancy were high [>50%] and comparable with international estimates. Conclusion: PRAMS is a feasible and valid method of collecting information on maternal experiences and behaviours during pregnancy in Ireland. With further work, PRAMS may offer a solution to data deficits in maternal health behaviour indicators in Ireland. This study is important to researchers in Europe and elsewhere who may be interested in new ways of tailoring an established CDC methodology to their unique settings to resolve data deficits in maternal health.
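For readers checking the quoted figures, a small sketch (assuming the published percentages are rounded and that the 124 sampled women form the denominator; the pilot's exact eligibility rules are not given in the abstract):

```python
# Rough consistency check of the pilot's headline figures (assumed denominator of 124).
sampled = 124
respondents = 82
refused_pct, opted_out_pct, no_response_pct = 2, 7, 24

print(f"Crude response rate: {respondents / sampled:.0%}")                      # ~66%
print(f"Implied response rate: {100 - refused_pct - opted_out_pct - no_response_pct}%")  # 67%
```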
Abstract:
Adequate hand-washing has been shown to be a critical activity in preventing the transmission of infections such as MRSA in health-care environments. Hand-washing guidelines published by various health-care-related institutions recommend a technique incorporating six hand-washing poses that ensure all areas of the hands are thoroughly cleaned. In this paper, an embedded wireless vision system (VAMP) capable of accurately monitoring hand-washing quality is presented. The VAMP system hardware consists of a low-resolution CMOS image sensor and an FPGA processor, which are integrated with a microcontroller and a ZigBee-standard wireless transceiver to create a wireless sensor network (WSN) based vision system that can be retargeted at a variety of health-care applications. The device captures and processes images locally in real time, determines whether hand-washing procedures have been correctly undertaken, and then passes the resulting high-level data over a low-bandwidth wireless link. The paper outlines the hardware and software mechanisms of the VAMP system and illustrates that it offers an easy-to-integrate sensor solution to adequately monitor and improve hand hygiene quality. Future work to develop a miniaturized, low-cost system capable of being integrated into everyday products is also discussed.
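A rough sketch of the design idea of processing frames locally and sending only compact, high-level results over the low-bandwidth ZigBee link (Python for readability; the real VAMP pipeline runs on a CMOS sensor, FPGA and microcontroller, and capture_frame, classify_pose and zigbee_send below are hypothetical interfaces, not the actual firmware API):

```python
# Illustrative sketch of "process locally, transmit only high-level results".
# capture_frame, classify_pose and zigbee_send are hypothetical stand-ins for the
# device's image sensor, on-FPGA classifier and ZigBee transceiver interfaces.

REQUIRED_POSES = range(6)      # the six recommended hand-washing poses
MIN_FRAMES_PER_POSE = 30       # e.g. ~3 s per pose at 10 fps (assumed figure)

def monitor_hand_wash(capture_frame, classify_pose, zigbee_send, max_frames=3000):
    counts = {pose: 0 for pose in REQUIRED_POSES}
    for _ in range(max_frames):
        if all(counts[p] >= MIN_FRAMES_PER_POSE for p in REQUIRED_POSES):
            break
        frame = capture_frame()          # local image capture
        pose = classify_pose(frame)      # local, on-device pose classification
        if pose in counts:
            counts[pose] += 1
    complete = all(counts[p] >= MIN_FRAMES_PER_POSE for p in REQUIRED_POSES)
    # Only a few bytes cross the wireless link, never the image data itself.
    zigbee_send({"event": "hand_wash_result", "complete": complete, "counts": counts})
```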
Abstract:
As a by-product of the ‘information revolution’ which is currently unfolding, lifetimes of man (and indeed computer) hours are being allocated to the automated and intelligent interpretation of data. This is particularly true in medical and clinical settings, where research into machine-assisted diagnosis of physiological conditions gains momentum daily. Of the conditions which have been addressed, however, automated classification of allergy has not been investigated, even though the number of allergic persons is rising and undiagnosed allergies are the most likely to elicit fatal consequences. On the basis of the observations of allergists who conduct oral food challenges (OFCs), activity-based analyses of allergy tests were performed. Algorithms were investigated and validated by a pilot study which verified that accelerometer-based measurement of human movements is particularly well suited to objective appraisal of activity. However, when these analyses were applied to OFCs, accelerometer-based investigations were found to provide very poor separation between allergic and non-allergic persons, and it was concluded that the avenues explored in this thesis are inadequate for the classification of allergy. Heart rate variability (HRV) analysis is known to provide very significant diagnostic information for many conditions. Owing to this, electrocardiograms (ECGs) were recorded during OFCs for the purpose of assessing the effect that allergy induces on HRV features. It was found that, with appropriate analysis, excellent separation between allergic and non-allergic subjects can be obtained. These results were, however, obtained with manual QRS annotations, which are not a viable methodology for real-time diagnostic applications. Even so, this was the first work to categorically correlate changes in HRV features with the onset of allergic events, and the manual annotations provide undeniable affirmation of this. Encouraged by the successful results obtained with manual classifications, automatic QRS detection algorithms were investigated to facilitate the fully automated classification of allergy. The results obtained by this process are very promising. Most importantly, the work presented in this thesis did not produce any false positive classifications. This is a most desirable result for OFC classification, as it allows complete confidence to be attributed to classifications of allergy. Furthermore, these results could be particularly advantageous in clinical settings, as machine-based classification can detect the onset of allergy, allowing early termination of OFCs. Consequently, machine-based monitoring of OFCs has in this work been shown to possess the capacity to significantly and safely advance the current clinical state of the art in allergy diagnosis.
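A minimal sketch of the kind of HRV feature extraction the abstract refers to (Python; assumes R-peak locations are already available, e.g. from manual QRS annotation or an automatic detector, and computes two standard time-domain features rather than the thesis's specific feature set):

```python
import numpy as np

def hrv_time_domain(r_peak_samples, fs=250.0):
    """Compute SDNN and RMSSD (in ms) from R-peak sample indices.

    `r_peak_samples` could come from manual QRS annotation or an automatic
    QRS detector; `fs` is the ECG sampling rate in Hz (250 Hz assumed here).
    """
    r_peaks = np.asarray(r_peak_samples, dtype=float)
    rr_ms = np.diff(r_peaks) / fs * 1000.0          # RR intervals in milliseconds
    sdnn = np.std(rr_ms, ddof=1)                    # overall RR variability
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))   # beat-to-beat variability
    return {"sdnn_ms": sdnn, "rmssd_ms": rmssd}

# Example: roughly 800 ms beats with small jitter
print(hrv_time_domain([0, 200, 405, 600, 810], fs=250.0))
```

Features such as these, computed over windows of the OFC recording, are the sort of inputs a classifier could use to separate allergic from non-allergic responses.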
Abstract:
The Internet and World Wide Web have had, and continue to have, an incredible impact on our civilization. These technologies have radically influenced the way that society is organised and the manner in which people around the world communicate and interact. The structure and function of individual, social, organisational, economic and political life begin to resemble the digital network architectures upon which they are increasingly reliant. It is increasingly difficult to imagine how our ‘offline’ world would look or function without the ‘online’ world; it is becoming less meaningful to distinguish between the ‘actual’ and the ‘virtual’. Thus, the major architectural project of the twenty-first century is to “imagine, build, and enhance an interactive and ever changing cyberspace” (Lévy, 1997, p. 10). Virtual worlds are at the forefront of this evolving digital landscape. Virtual worlds have “critical implications for business, education, social sciences, and our society at large” (Messinger et al., 2009, p. 204). This study focuses on the possibilities of virtual worlds in terms of communication, collaboration, innovation and creativity. The concept of knowledge creation is at the core of this research. The study shows that scholars increasingly recognise that knowledge creation, as a socially enacted process, goes to the very heart of innovation. However, efforts to build upon these insights have struggled to escape the influence of the information processing paradigm of old and have failed to move beyond the persistent but problematic conceptualisation of knowledge creation in terms of tacit and explicit knowledge. Based on these insights, the study leverages extant research to develop the conceptual apparatus necessary to carry out an investigation of innovation and knowledge creation in virtual worlds. The study derives and articulates a set of definitions (of virtual worlds, innovation, knowledge and knowledge creation) to guide research. The study also leverages a number of extant theories in order to develop a preliminary framework to model knowledge creation in virtual worlds. Using a combination of participant observation and six case studies of innovative educational projects in Second Life, the study yields a range of insights into the process of knowledge creation in virtual worlds and into the factors that affect it. The study's contributions to theory are expressed as a series of propositions and findings and are represented as a revised and empirically grounded theoretical framework of knowledge creation in virtual worlds. These findings highlight the importance of prior related knowledge and intrinsic motivation in shaping and stimulating knowledge creation in virtual worlds. At the same time, they highlight the importance of meta-knowledge (knowledge about knowledge) in guiding the knowledge creation process, whilst revealing the diversity of behavioural approaches actually used to create knowledge in virtual worlds. This theoretical framework is itself one of the chief contributions of the study, and the analysis explores how it can be used to guide further research in virtual worlds and on knowledge creation. The study's contributions to practice are presented as an actionable guide to stimulating knowledge creation in virtual worlds. This guide utilises a theoretically based classification of four knowledge-creator archetypes (the sage, the lore master, the artisan and the apprentice) and derives an actionable set of behavioural prescriptions for each archetype. The study concludes with a discussion of its implications for future research.
Abstract:
With the rapid growth of the Internet and digital communications, the volume of sensitive electronic transactions being transferred and stored over and on insecure media has increased dramatically in recent years. The growing demand for cryptographic systems to secure this data, across a multitude of platforms ranging from large servers to small mobile devices and smart cards, has necessitated research into low-cost, flexible and secure solutions. As constraints on architectures such as area, speed and power become key factors in choosing a cryptosystem, methods for speeding up the development and evaluation process are necessary. This thesis investigates flexible hardware architectures for the main components of a cryptographic system. Dedicated hardware accelerators can provide significant performance improvements when compared to implementations on general-purpose processors. Each of the proposed designs is analysed in terms of speed, area, power, energy and efficiency. Field Programmable Gate Arrays (FPGAs) are chosen as the development platform due to their fast development time and reconfigurable nature. Firstly, a reconfigurable architecture for performing elliptic curve point scalar multiplication on an FPGA is presented. Elliptic curve cryptography is one such method to secure data, offering similar security levels to traditional systems, such as RSA, but with smaller key sizes, translating into lower memory and bandwidth requirements. The architecture is implemented using different underlying algorithms and coordinates for dedicated Double-and-Add algorithms, twisted Edwards algorithms and SPA-secure algorithms, and its power consumption and energy are measured on an FPGA. Hardware implementation results for these new algorithms are compared against their software counterparts, and the best choices for minimum area-time and area-energy circuits are then identified and examined for larger key and field sizes. Secondly, implementation methods for another component of a cryptographic system, namely hash functions, developed in the recently concluded SHA-3 hash competition are presented. Various designs from the three rounds of the NIST-run competition are implemented on FPGA, along with an interface to allow fair comparison of the different hash functions when operating in a standardised and constrained environment. Different methods of implementation for the designs and their subsequent performance are examined in terms of throughput, area and energy costs using various constraint metrics. Comparing many different implementation methods and algorithms is non-trivial. Another aim of this thesis is the development of generic interfaces used both to reduce implementation and test time and to enable fair baseline comparisons of different algorithms when operating in a standardised and constrained environment. Finally, a hardware-software co-design cryptographic architecture is presented. This architecture is capable of supporting multiple types of cryptographic algorithms and is described through an application for performing public key cryptography, namely the Elliptic Curve Digital Signature Algorithm (ECDSA). This architecture makes use of the elliptic curve architecture and the hash functions described previously. These components, along with a random number generator, provide hardware acceleration for a MicroBlaze-based cryptographic system. The trade-off between performance and flexibility is discussed using dedicated software and hardware-software co-design implementations of the elliptic curve point scalar multiplication block. Results are then presented in terms of the overall cryptographic system.
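A minimal sketch of the Double-and-Add point scalar multiplication mentioned above (Python, on a tiny toy prime-field curve with assumed parameters; the thesis's hardware designs target practical field sizes, projective and twisted Edwards coordinates, and SPA-hardened variants, none of which this readability-oriented sketch attempts):

```python
# Toy double-and-add scalar multiplication on y^2 = x^3 + a*x + b over GF(p).
# Affine coordinates and a tiny curve for readability only; not SPA-resistant.

p, a, b = 97, 2, 3          # assumed toy parameters, not a standardised curve

def point_add(P, Q):
    """Add two affine points; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope (doubling)
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def scalar_mult(k, P):
    """Left-to-right double-and-add: one doubling per bit, an extra add per 1-bit."""
    R = None
    for bit in bin(k)[2:]:
        R = point_add(R, R)            # double
        if bit == "1":
            R = point_add(R, P)        # add
    return R

# Example with a point on the toy curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
P = (3, 6)
print(scalar_mult(13, P))
```

The data-dependent add in the loop is exactly what SPA-secure variants (e.g. Montgomery-ladder-style formulations) are designed to avoid in hardware.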
Abstract:
Cancer represents a leading cause of death in the developed world, inflicting tremendous suffering and plundering billions from health budgets. The traditional treatment approaches of surgery, radiotherapy and chemotherapy have achieved little in terms of cure for this deadly disease. Instead, life is prolonged for many, with dubious quality of life, only for disease to reappear with the inevitable fatal outcome. “Blue sky” thinking is required to tackle this disease and improve outcomes. The realisation and acceptance of the intrinsic role of the immune system in cancer pathogenesis, pathophysiology and treatment represented such a “blue sky” thought. Moreover, the embrace of immunotherapy, the concept of targeting immune cells rather than the tumour cells themselves, represents a paradigm shift in the approach to cancer therapy. The harnessing of immunotherapy demands radical and innovative therapeutic endeavours, such as gene and cell therapies and RNA interference, which two decades ago existed as mere concepts. This thesis straddles the frontiers of fundamental tumour immunobiology and novel therapeutic discovery, design and delivery. The work undertaken focused on two distinct immune cell populations known to undermine the immune response to cancer: suppressive T cells and macrophages. Novel RNAi mediators were designed, validated and incorporated into clinically relevant gene therapy vectors, involving a traditional lentiviral vector approach and a novel bacterial vector strategy. Chapter 2 deals with the design of novel RNAi mediators against FOXP3, a crucial regulator of the immunosuppressive regulatory T cell population. Two mediators were tested and validated, and the superior mediator was taken forward as part of the work in Chapter 3. Chapter 3 deals with transposing the RNA sequence from Chapter 2 into a DNA-based construct and its subsequent incorporation into a lentiviral-based vector system. The lentiviral vector was shown to mediate gene delivery in vitro, and functional RNAi was achieved against FOXP3. Proof of gene delivery was further confirmed in vivo in tumour-bearing animals. Chapter 4 focuses on a different immune cell population: tumour-associated macrophages. Non-invasive bacteria were explored as a specific means of delivering gene therapy to this phagocytic cell type. Proof of delivery was shown in vitro and in vivo. Moreover, in vivo delivery of a gene by this method achieved the desired immune response in terms of cytokine profile. Overall, the data presented here advance exploration within the field of cancer immunotherapy, introduce novel delivery and therapeutic strategies, and demonstrate pre-clinically the potential for such novel anti-cancer therapies.
Abstract:
This thesis is a study of military memorials and commemoration with a focus on Anglo-American practice. The main question is: how has history defined military memorials and commemoration, and how have they changed since the 19th century? In an effort to resolve this, the work examines both historic and contemporary forms of memorials and commemoration and establishes that remembrance in sites of collective memory has been influenced by politics, conflicts and religion. Much has been written since the Great War about remembrance and memorialization; however, there is no common lexicon throughout the literature. In order to better explain and understand this complex subject, the work includes an up-to-date literature review and, for the first time, terminologies are properly explained and defined. Particular attention is placed on recognizing important military legacies, being familiar with spiritual influences and identifying classic and new signs of remembrance. The thesis contends that commemoration is composed of three key principles (recognition, respect and reflection) that are inextricably linked to the fabric of memorials. It also argues that it is time for the study of memorials to come of age and proposes Memorialogy as an interdisciplinary field of study of memorials and associated commemorative practices. Moreover, a more modern, adaptive General Classification System is presented as a means of identifying and re-defining memorials according to certain groups, types and forms. Lastly, this thesis examines how peacekeeping and peace support operations are being memorialized and how the tragic American events of 11 September 2001 and the war in Afghanistan have forever changed the nature of memorials and commemoration within Canada and elsewhere. This work goes beyond what has been studied and written about over the last century and provides a deeper level of analysis and a fresh approach to understanding the field of Memorialogy.
Abstract:
Mercury is a potent neurotoxin even at low concentrations. The unoxidised metal has a high vapour pressure and can circulate through the atmosphere, but when oxidised it can deposit and be accumulated through the food chain. This work aims to investigate the oxidation processes of atmospheric Hg⁰(g). The first part describes efforts to make a portable Hg sensor based on Cavity Enhanced Absorption Spectroscopy (CEAS). The detection limit achieved was 66 ng m⁻³ for a 10-second averaging time. The second part of this work describes experiments carried out in a temperature-controlled atmospheric simulation chamber at the Desert Research Institute, Reno, Nevada, USA. The chamber was built around an existing Hg CRDS system that could measure Hg concentrations in the chamber of < 100 ng m⁻³ at 1 Hz, enabling reactions to be followed. The main oxidant studied was bromine, which was quantified across the chamber with an LED-based CEAS system. Hg oxidation in the chamber was found to be mostly too slow for current models to explain. A seven-reaction model was developed and tested to find which parameters were capable of explaining the deviation. The model was overdetermined and no unique solution could be found. The most likely possibility was that the first oxidation step, Hg + Br → HgBr, was slower than the preferred literature value by a factor of two. However, if the more uncertain data at low [Br2] were included, then the only parameter that could explain the experiments was a fast, temperature-independent dissociation of HgBr some hundreds of times faster than predicted thermolysis or photolysis rates. Overall, this work concluded that to quantitatively understand the reaction of Hg with Br2, the intermediates HgBr and Br must be measured. This conclusion will help to guide the planning of future studies of atmospheric Hg chemistry.
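A minimal sketch of how a reduced Hg/Br reaction scheme of this kind can be integrated numerically (Python with SciPy; the three reactions below are a simplified subset, and the rate constants and initial concentrations are illustrative placeholders, not the values fitted in the chamber study):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Reduced, illustrative Hg/Br scheme; all parameters are assumed placeholders.
k1 = 1.0e-12   # Hg + Br   -> HgBr    (cm^3 molecule^-1 s^-1, assumed)
k2 = 1.0e-2    # HgBr      -> Hg + Br (s^-1, assumed dissociation rate)
k3 = 3.0e-11   # HgBr + Br -> HgBr2   (cm^3 molecule^-1 s^-1, assumed)

def rates(t, y):
    hg, br, hgbr, hgbr2 = y
    r1 = k1 * hg * br
    r2 = k2 * hgbr
    r3 = k3 * hgbr * br
    return [-r1 + r2,        # d[Hg]/dt
            -r1 + r2 - r3,   # d[Br]/dt
            r1 - r2 - r3,    # d[HgBr]/dt
            r3]              # d[HgBr2]/dt

# Initial concentrations in molecule cm^-3 (illustrative chamber-like levels).
y0 = [3.0e8, 1.0e9, 0.0, 0.0]
sol = solve_ivp(rates, (0.0, 3600.0), y0, method="LSODA", rtol=1e-8)
print(f"Hg remaining after 1 h: {sol.y[0, -1] / y0[0]:.1%}")
```

Varying k1 (the first oxidation step) and k2 (HgBr dissociation) in such a model shows how both parameters pull the simulated Hg decay in the same direction, which is why measuring the HgBr and Br intermediates is needed to separate them.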