951 results for "Specific inhalation challenge test"


Relevance: 30.00%

Publisher:

Abstract:

Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These "infeasible" test cases threaten the integrity of the entire model-based suite, as well as any context coverage the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier.
To support this study, I also develop a number of tools and infrastructure components for the scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
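A minimal sketch of the kind of supervised feasibility classifier described above, using a hand-rolled logistic regression over one-hot event-ID features. The event IDs, labels and toy data here are invented for illustration; the actual feature-extraction pipeline operates on MBT tool outputs.

```python
import math

def featurize(test_case, vocab):
    """One-hot encode the unique event IDs appearing in a test case."""
    x = [0.0] * len(vocab)
    for event_id in test_case:
        if event_id in vocab:
            x[vocab[event_id]] = 1.0
    return x

def train_logistic(X, y, epochs=200, lr=0.5):
    """Plain per-sample gradient descent on the logistic log-loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi  # gradient of the log-loss w.r.t. z
            b -= lr * g
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, x):
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy data: test cases as sequences of event IDs; label 1 = infeasible.
# Here, any test case containing the (hypothetical) blocked event "e3"
# is infeasible, so the data is linearly separable on that feature.
cases = [["e1", "e2"], ["e1", "e3"], ["e2", "e4"], ["e3", "e4"],
         ["e2", "e3"], ["e1", "e4"]]
labels = [0, 1, 0, 1, 1, 0]

vocab = {e: i for i, e in enumerate(sorted({e for c in cases for e in c}))}
X = [featurize(c, vocab) for c in cases]
w, b = train_logistic(X, labels)

print([predict(w, b, featurize(c, vocab)) for c in cases])  # expect [0, 1, 0, 1, 1, 0]
```

With one-hot event-ID features, the learned weights directly indicate which events are predictive of infeasibility, which mirrors the finding above that unique event IDs are key features.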

Relevance: 30.00%

Publisher:

Abstract:

New psychoactive substances (NPSs) have appeared on the recreational drug market at an unprecedented rate in recent years. Many are not new drugs but failed products of the pharmaceutical industry. The speed and variety of drugs entering the market pose a complex new challenge for the forensic toxicology community. The detection of these substances in biological matrices can be difficult, as the exact compounds of interest may not be known; many NPSs are sold under the same brand name, and therefore users themselves may not know what substances they have ingested. The majority of analytical methods for the detection of NPSs tend to focus on a specific class of compounds rather than a wide variety. In response to this, a robust and sensitive method was developed for the analysis of various NPSs by solid phase extraction (SPE) with gas chromatography-mass spectrometry (GC-MS). Sample preparation and derivatisation were optimised by testing a range of SPE cartridges and derivatising agents, as well as derivatisation incubation times and temperatures. The final GC-MS method was validated in accordance with SWGTOX 2013 guidelines over a wide concentration range, for 23 analytes in blood and 25 in urine; this included the validation of 8 NBOMe compounds in blood and 10 in urine. The GC-MS method was then applied to 8 authentic samples, with concentrations compared to those originally identified by NMS laboratories. The rapid influx of NPSs has resulted in the re-analysis of samples, making the stability of these substances crucial information. The stability of mephedrone was therefore investigated, examining the effect of storage temperatures and preservatives on analyte stability daily for 1 week and then weekly for 10 weeks. Several laboratories have identified NPS use through the cross-reactivity of these substances with existing screening protocols such as ELISA.
The application of Immunalysis ketamine, methamphetamine and amphetamine ELISA kits for the detection of NPSs was evaluated. The aim of this work was to determine whether any cross-reactivity from NPS substances was observed, and whether these existing kits would identify NPS use in biological samples. The cross-reactivity of methoxetamine, 3-MeO-PCE and 3-MeO-PCP with different commercial point-of-care tests (POCT) was also assessed in urine. One of the newest groups of compounds to appear on the NPS market is the NBOMe series. These drugs pose a serious threat to public health due to their high potency, with fatalities already reported in the literature. These compounds are falsely marketed as LSD, which increases the chance of adverse effects given the potency differences between the two substances. A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was validated in accordance with SWGTOX 2013 guidelines for the detection of 25B-, 25C- and 25I-NBOMe in urine and hair. Long-Evans rats were administered 25B-, 25C- and 25I-NBOMe at doses ranging from 30-300 µg/kg over a period of 10 days. Tail-flick tests were then carried out on the rats to determine whether any analgesic effects resulted from dosing. Rats were also shaved prior to their first dose and re-shaved after the 10-day period. Hair was separated by colour (black and white) and analysed using the validated LC-MS/MS method, assessing the impact of hair colour on the incorporation of these drugs. Urine was collected from the rats, analysed using the validated LC-MS/MS method, and screened for potential metabolites using both LC-MS/MS and quadrupole time-of-flight (QToF) instrumentation.

Relevance: 30.00%

Publisher:

Abstract:

The Graphical User Interface (GUI) is an integral component of contemporary computer software, and a stable and reliable GUI is necessary for the correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other, and it is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the GUI's specification, its visible behaviors, and static analysis of its program code. GUIs are typically dynamic in nature: their user-visible state is guided by the underlying program code and dynamic program state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based application and its underlying program code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI test cases can be created by examining the interactions between the GUI of a GUI-based application and its program code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed to construct long GUI test cases. These test cases are able to drive the GUI into states that were not reachable using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
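As background to the model-based approach described above, a minimal sketch of test-case generation from an event-flow graph, where an edge u -> v means event v can follow event u. The dialog events here are hypothetical, invented for illustration.

```python
def generate_testcases(efg, start_events, length):
    """Enumerate event sequences of a given length by walking the
    event-flow graph (edge u -> v means v can follow u)."""
    seqs = [[e] for e in start_events]
    for _ in range(length - 1):
        # Extend every partial sequence by each event that may follow its last event.
        seqs = [s + [nxt] for s in seqs for nxt in efg.get(s[-1], [])]
    return seqs

# Hypothetical event-flow graph for a small save dialog:
efg = {
    "open":   ["type", "cancel"],
    "type":   ["save", "cancel"],
    "save":   [],
    "cancel": [],
}
cases = generate_testcases(efg, ["open"], 3)
print(cases)
# [['open', 'type', 'save'], ['open', 'type', 'cancel']]
```

Models built from visible GUI structure alone enumerate sequences like these; the extension described above additionally examines event-handler code so that long, useful sequences can be selected.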

Relevance: 30.00%

Publisher:

Abstract:

Cancer and cardiovascular diseases are the leading causes of death worldwide. Caused by systemic genetic and molecular disruptions in cells, these disorders are the manifestation of profound disturbances of normal cellular homeostasis. People suffering from, or at high risk of, these disorders need early diagnosis and personalized therapeutic intervention. Successful implementation of such clinical measures can significantly improve global health. However, development of effective therapies is hindered by the challenges in identifying the genetic and molecular determinants of disease onset; and in cases where therapies already exist, the main challenge is to identify the molecular determinants that drive resistance to those therapies. Owing to progress in sequencing technologies, access to large-scale genome-wide biological data now extends far beyond a few experimental labs to the global research community. The unprecedented availability of data has revolutionized the capabilities of computational researchers, enabling them to collaboratively address these long-standing problems from many different perspectives. This thesis tackles two such public-health challenges using data-driven approaches. Numerous association studies have been proposed to identify genomic variants that determine disease. However, their clinical utility remains limited by their inability to distinguish causal variants from merely associated variants. In this thesis, we first propose a simple scheme that improves association studies in a supervised fashion and has shown its applicability in identifying genomic regulatory variants associated with hypertension. Next, we propose a coupled Bayesian regression approach, eQTeL, which leverages epigenetic data to estimate regulatory and gene-interaction potential, and identifies combinations of regulatory genomic variants that explain gene expression variance.
On human heart data, eQTeL not only explains a significantly greater proportion of expression variance in samples, but also predicts gene expression more accurately than other methods. We demonstrate by simulation that eQTeL accurately detects causal regulatory SNPs, particularly those with small effect sizes. Using various functional data, we show that SNPs detected by eQTeL are enriched for allele-specific protein binding and histone modifications, which potentially disrupt the binding of core cardiac transcription factors and are spatially proximal to their targets. eQTeL SNPs capture a substantial proportion of the genetic determinants of expression variance, and we estimate that 58% of these SNPs are putatively causal. The challenge of identifying the molecular determinants of cancer resistance could, until now, only be addressed through labor-intensive and costly experimental studies, and in the case of experimental drugs such studies are infeasible. Here we take a fundamentally different, data-driven approach to understand the evolving landscape of emerging resistance. We introduce a novel class of genetic interactions in cancer termed synthetic rescues (SRs): a functional interaction between two genes in which a change in the activity of one gene (the vulnerable gene, which may be the target of a cancer drug) is lethal, but subsequently altered activity of its partner (the rescuer gene) restores cell viability. We then describe a comprehensive computational framework, termed INCISOR, for identifying the SRs underlying cancer resistance. Applying INCISOR to mine The Cancer Genome Atlas (TCGA), a large collection of cancer patient data, we identified the first pan-cancer SR networks, composed of interactions common to many cancer types. We experimentally test and validate a subset of these interactions involving the master regulator gene mTOR. We find that rescuer genes become increasingly activated as breast cancer progresses, testifying to pervasive ongoing rescue processes.
We show that SRs can be utilized to successfully predict patients' survival and response to the majority of current cancer drugs and, importantly, to predict the emergence of drug resistance from the initial tumor biopsy. Our analysis suggests a potential new strategy for enhancing the effectiveness of existing cancer therapies: targeting their rescuer genes to counteract resistance. The thesis provides statistical frameworks that can harness ever-increasing high-throughput genomic data to address challenges in determining the molecular underpinnings of hypertension, cardiovascular disease and cancer resistance. We discover novel molecular mechanistic insights that will advance progress in early disease prevention and personalized therapeutics. Our analyses shed light on the fundamental biology of gene regulation and interaction, and open up exciting avenues for translational applications in risk prediction and therapeutics.
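As a small illustration of the quantity being explained above, the proportion of expression variance explained by a regulatory variant can be computed, in the simplest single-SNP case, as the r² of an ordinary least squares fit. The genotype dosages and expression values below are invented; eQTeL itself is a coupled Bayesian regression over many variants with epigenetic priors, not this toy calculation.

```python
def variance_explained(genotypes, expression):
    """r^2 of a single-SNP ordinary least squares fit: the proportion of
    expression variance explained by genotype dosage (0, 1 or 2 alt alleles)."""
    n = len(genotypes)
    mx = sum(genotypes) / n
    my = sum(expression) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(genotypes, expression))
    sxx = sum((x - mx) ** 2 for x in genotypes)
    syy = sum((y - my) ** 2 for y in expression)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical data: genotype dosages and expression of one gene in six samples
g = [0, 0, 1, 1, 2, 2]
e = [1.0, 1.2, 2.1, 1.9, 3.0, 3.2]
print(round(variance_explained(g, e), 3))  # -> 0.982
```

A value near 1 means the variant's dosage accounts for almost all of the expression variance in these samples; eQTeL searches for combinations of regulatory variants that jointly maximise this kind of explained variance.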

Relevance: 30.00%

Publisher:

Abstract:

Dinoflagellates of the genus Alexandrium are known producers of paralytic shellfish toxins that regularly impact the shellfish aquaculture industry and fisheries. Accurate detection of Alexandrium, including A. minutum, is crucial for environmental monitoring and sanitary issues. In this study, we first developed a quantitative lateral flow immunoassay (LFIA) using super-paramagnetic nanobeads for A. minutum whole cells. This dipstick assay relies on two distinct monoclonal antibodies, used in a sandwich format and directed against surface antigens of this organism. No sample preparation is required, and either frozen or live cells can be detected and quantified. Specificity and sensitivity were assessed using phytoplankton cultures and field samples spiked with known amounts of cultured A. minutum cells. The LFIA is shown to be highly specific for A. minutum and able to reproducibly detect 10^5 cells/L within 30 min. The test was applied to environmental samples already characterized by light-microscopy counting, and no significant difference was observed between the cell densities obtained by the two methods. This handy super-paramagnetic lateral flow immunoassay biosensor can greatly assist water quality monitoring programs as well as ecological research.

Relevance: 30.00%

Publisher:

Abstract:

We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so could we not build a similar IDS for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood about the precise nature and correlation of these signals, and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area.
Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
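A deliberately simplified sketch of the danger-signal idea described above: several normalized host signals are combined, and an alarm is raised only when enough weighted signals co-occur. The signal names, weights and threshold here are all hypothetical; real danger-theory AIS correlate signals over time and ground them to a specific attacker.

```python
def danger_score(signals, weights):
    """Weighted combination of normalized danger signals (each in [0, 1])."""
    return sum(weights[name] * value for name, value in signals.items())

def classify(signals, weights, threshold=0.6):
    """Raise an alarm only when the combined danger evidence crosses a threshold."""
    return "attack" if danger_score(signals, weights) >= threshold else "normal"

# Hypothetical host signals, each normalized to [0, 1]
weights = {"cpu_spike": 0.2, "packet_anomaly": 0.5, "syscall_novelty": 0.3}

benign  = {"cpu_spike": 0.7, "packet_anomaly": 0.1, "syscall_novelty": 0.0}
hostile = {"cpu_spike": 0.6, "packet_anomaly": 0.9, "syscall_novelty": 0.8}

print(classify(benign, weights))   # normal (score 0.19)
print(classify(hostile, weights))  # attack (score 0.81)
```

The point of the example is that no single signal triggers the alarm: a lone CPU spike stays below the threshold, while correlated anomalies across several signals exceed it, which is the behaviour the Danger Theory contributes over simple self-nonself matching.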

Relevance: 30.00%

Publisher:

Abstract:

This dissertation traces the ways in which nineteenth-century fictional narratives of white settlement represent "family" as, on the one hand, an abstract theoretical model for a unified and relatively homogenous British settler empire and, on the other, a fundamental challenge to ideas about imperial integrity and transnational Anglo-Saxon racial identification. I argue that representations of transoceanic white families in nineteenth-century fictions about Australian settler colonialism negotiate the tension between the bounded domesticity of an insular English nation and the kind of kinship that spans oceans and continents as a result of mass emigration from the British Isles to the United States, Canada, New Zealand, and the Australian colonies. As such, these fictions construct productive analogies between the familial metaphors and affective language in the political discourse of "Greater Britain"—a transoceanic imagined community of British settler colonies and their "mother country" united by race and language—and ideas of family, gender, and domesticity as they operate within specific bourgeois families. Concerns over the disruption of transoceanic families bear testament to contradictions between the idea of a unified imperial identity (both British and Anglo-Saxon), the proliferation of fractured local identities (such as settlers' English, Irish Catholic, and Australian nationalisms), and the conspicuous absence of indigenous families from narratives of settlement. I intervene at the intersection of postcolonial literary criticism and gender theory by examining the strategic deployments of heteronormative kinship metaphors and metonymies in the rhetorical consolidation of settler colonial space. Settler colonialism was distinct from the "civilizing" domination of subject peoples in South Asia in that it depended on the rhetorical construction of colonial territory as empty space or as land occupied by nearly extinct "primitive" races.
This dissertation argues that political rhetoric, travel narratives, and fiction used the image of white female bourgeois reproductive power and sentimental attachment as a technology for settler colonial success, embodying this technology both in the benevolent figure of the metropolitan "mother country" (the paternalistic female counter to the material realities of patriarchal and violent settler colonial practices) and in fictional juxtapositions of happy, fecund white settler families with the solitary, self-extinguishing figure of the black aboriginal "savage." Yet even in narratives where the continuity and coherence of families across imperial space—and "Greater Britain" itself—are questioned, domesticity and heteronormative familial relations effectively rewrite settler space as white, Anglo-Saxon and bourgeois, and the sentimentalism of troubled European families masks the presence and genocide of indigenous aboriginal peoples. I analyze a range of novels and political texts, canonical and non-canonical, metropolitan and colonial. My introductory chapter examines the discourse on a "Greater Britain" in the travel narratives of J.A. Froude, Charles Wentworth Dilke, and Anthony Trollope and in the Oxbridge lectures of Herman Merivale and J.R. Seeley. These writers make arguments for an imperial economy of affect circulating between Britain and the settler colonies that reinforces political connections and at times surpasses the limits of political possibility, relying on the language of sentiment and feeling to build a transoceanic "Greater British" community.
Subsequent chapters show how metropolitan and colonial fiction writers, including Charles Dickens, Anthony Trollope, Marcus Clarke, Henry Kingsley, and Catherine Helen Spence, test the viability of this "Greater British" economy of affect by presenting transoceanic family connections and structures straining under the weight of forces including the vast distances between the colonies and the "mother country," settler violence, and the transportation system.

Relevance: 30.00%

Publisher:

Abstract:

Providing high-quality clinical experiences to prepare students for the complexities of the current health-care system has become a challenge for nurse educators, and there are concerns that the current model of clinical practice is suboptimal. Consequently, nursing programs have explored the partial replacement of traditional in-hospital clinical experiences with a simulated clinical experience. Despite research demonstrating numerous benefits to students following participation in simulation activities, insufficient research conducted within Québec exists to convince the governing bodies (Ordre des infirmières et des infirmiers du Québec, OIIQ; Ministère de l'Éducation supérieur, de la Recherche, de la Science et de la Technologie) to fully embrace simulation as part of nurse training. The purpose of this study was to examine the use of a simulated clinical experience (SCE) as a viable, partial pedagogical substitute for traditional clinical experience, by examining the effects of a SCE on CEGEP nursing students' perceptions of self-efficacy (confidence) and their ability to achieve course objectives. The findings will contribute new information to the current body of research in simulation. The specific case of obstetrical practice was examined. Drawn from two sections of the Nursing III-Health and Illness (180-30K-AB) course, the sample comprised 65 students (thirty-one from section 0001 and thirty-four from section 0002) with a mean age of 24.8 years. With two sections of the course available, comparison between them was possible. A triangulation mixed-methods design was used. An adapted version of Ravert's (2004) Nursing Skills for Evaluation tool was used to collect data on students' perceptions of confidence related to the nursing skills required for the care of mothers and their newborns.
Students' performance and achievement of course objectives were measured through an Objective Structured Clinical Examination (OSCE) consisting of three marked stations designed to test the theoretical and clinical aspects of course content. The OSCE was administered at the end of the semester, following completion of the traditional clinical experience. Students' qualitative comments on the post-test survey, along with journal entries, served to support the quantitative scale evaluation. Two of the twelve days (15 hours) allocated for obstetrical clinical experience were replaced by a SCE (17%) over the course of the semester. Students participated in various simulation activities developed to address a range of cognitive, psychomotor and critical-thinking skills. Scenarios incorporating the use of human patient simulators, designed using the Jeffries Framework (2005), exposed students to the care of families and infants during the perinatal period, both reflecting and building upon class and course content in achievement of course objectives and program competencies. Active participation in all simulation activities exposed students to Bandura's four main sources of experience (mastery experiences, vicarious experiences, social persuasion, and physiologic/emotional responses) to enhance the development of students' self-efficacy. Pre-test and post-test summative scores revealed a statistically significant increase in student confidence in performing skills related to maternal and newborn care (p < .0001) following participation in the SCE. Confidence pre-test and post-test scores were not affected by the students' section. Skills related to the care of the post-partum mother following vaginal or Caesarean delivery showed the greatest change in confidence ratings. OSCE results showed a mean total class score (both sections) of 57.4 (70.0%) with a normal distribution.
Mean scores were 56.5 (68.9%) for section 0001 and 58.3 (71.1%) for section 0002; total scores were similar between sections (p = 0.342) based on pairwise comparison. Analysis of OSCE scores compared to students' final course grades revealed similar distributions. Finally, qualitative analysis identified how students perceived the SCE: they cited gains in knowledge, development of psychomotor skills and improved clinical judgement following participation in simulation activities, and attributed these to the "hands-on" practice obtained from working in small groups and to a safe, authentic learning environment in which they could make and correct mistakes, factors they saw as having the greatest impact on learning through simulation.
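The pre/post confidence comparison reported above rests on standard paired statistics. A minimal sketch of the paired t statistic on invented ratings (not the study's data; the study's own analysis and sample are as described in the abstract):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre/post scores (positive t = improvement).
    t = mean(d) / sqrt(var(d) / n), where d are the per-student differences."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical confidence ratings (1-5 scale) for six students, before and after a SCE
pre  = [2.1, 2.8, 3.0, 2.5, 3.2, 2.9]
post = [3.4, 3.6, 3.9, 3.1, 4.0, 3.8]
print(round(paired_t(pre, post), 2))  # -> 9.34
```

Because every student improves by a similar amount, the differences have a large mean relative to their spread, which is what produces a large t and hence a very small p-value of the kind reported above.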

Relevance: 30.00%

Publisher:

Abstract:

Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies—paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.

Relevance: 30.00%

Publisher:

Abstract:

Vertebrate genomes are organised into a variety of nuclear environments and chromatin states that have profound effects on the regulation of gene transcription. This variation presents a major challenge to the expression of transgenes for experimental research, genetic therapies and the production of biopharmaceuticals. The majority of transgenes succumb to transcriptional silencing by their chromosomal environment when they are randomly integrated into the genome, a phenomenon known as chromosomal position effect (CPE). It is not always feasible to target transgene integration to transcriptionally permissive "safe harbour" loci that favour transgene expression, so there remains an unmet need to identify gene regulatory elements that can be added to transgenes to protect them against CPE. Dominant regulatory elements (DREs) with chromatin barrier (or boundary) activity have been shown to protect transgenes from CPE. The HS4 element from the chicken beta-globin locus and the A2UCOE element from a human housekeeping gene locus have been shown to function as DRE barriers in a wide variety of cell types and species. Despite rapid advances in the profiling of transcription factor binding, chromatin states and chromosomal looping interactions, progress towards functionally validating the many candidate barrier elements in vertebrates has been very slow, largely due to the lack of a tractable and efficient assay for chromatin barrier activity. In this study, I have developed the RGBarrier assay system to test the chromatin barrier activity of candidate DREs at pre-defined isogenic loci in human cells. The RGBarrier assay consists of a Flp-based RMCE reaction that integrates an expression construct carrying candidate DREs into a pre-characterised chromosomal location. The RGBarrier system tracks red, green and blue fluorescent proteins by flow cytometry to monitor on-target versus off-target integration and transgene expression.
Analysing reporter (GFP) expression over several weeks gives a measure of each candidate element's ability to protect against chromosomal silencing. The assay can be scaled up to test tens of new putative barrier elements in the same chromosomal context in parallel. The defined chromosomal contexts of the RGBarrier assay will allow detailed mechanistic studies of chromosomal silencing and DRE barrier element action. Understanding these mechanisms will be of paramount importance for the design of specific solutions to overcome chromosomal silencing in particular transgenic applications.
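A toy sketch of the kind of three-colour gating logic described above. The intensity cutoff and the role assigned to each fluorophore are assumptions made for illustration, not the assay's actual gating scheme.

```python
def classify_cell(red, green, blue, cutoff=1000.0):
    """Gate a single flow-cytometry event by fluorescence intensity.
    Hypothetical roles: red marks on-target integration, green the GFP
    reporter, blue off-target integration; cutoff is an arbitrary gate."""
    if blue >= cutoff:
        return "off-target"
    if red >= cutoff:
        return "on-target, expressing" if green >= cutoff else "on-target, silenced"
    return "untransfected"

# Four invented events: (red, green, blue) intensities
events = [(5000, 4000, 10), (5000, 50, 10), (20, 30, 3000), (15, 12, 9)]
print([classify_cell(*e) for e in events])
```

Tracking the fraction of "on-target, expressing" versus "on-target, silenced" events over several weeks is the kind of readout that quantifies a candidate element's barrier activity.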

Relevance: 30.00%

Publisher:

Abstract:

This thesis describes the development and correlation of a thermal model that forms the foundation of a thermal-capacitance spacecraft propellant load estimator. Specific details are presented of creating the thermal model in ANSYS for the diaphragm propellant tank used on NASA's Magnetospheric Multiscale (MMS) spacecraft, along with the correlation process implemented. The thermal model was correlated to within ±3 °C of the thermal vacuum test data and was determined sufficient for making future propellant predictions on MMS. The model was also found to be relatively sensitive to uncertainties in the applied heat flux and in the mass knowledge of the tank. More work is needed to improve temperature predictions in the upper hemisphere of the propellant tank, where predictions were found to be 2 to 2.5 °C lower than the test data. A road map for applying the model to predict propellant loads on the actual MMS spacecraft in 2017-2018 is also presented.
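The thermal-capacitance idea behind such an estimator can be illustrated with a simple energy balance: if a known heater power warms the tank and propellant with negligible losses, the measured temperature rise yields the total heat capacity, and subtracting the tank structure's contribution gives the propellant mass. The numbers below are purely illustrative, not MMS values.

```python
def propellant_mass(power_w, duration_s, delta_t_k,
                    tank_heat_cap_j_per_k, prop_specific_heat_j_per_kg_k):
    """Thermal-capacitance estimate, assuming all heater energy goes into
    warming the tank and propellant (no losses):
      C_total = Q / dT = P * t / dT   [J/K]
      m_prop  = (C_total - C_tank) / c_prop
    """
    total_cap = power_w * duration_s / delta_t_k
    return (total_cap - tank_heat_cap_j_per_k) / prop_specific_heat_j_per_kg_k

# Illustrative numbers: 40 W heater for 2 hours, 3 K rise,
# 5 kJ/K tank structure, hydrazine c_p ~ 3100 J/(kg K)
m = propellant_mass(40.0, 7200.0, 3.0, 5000.0, 3100.0)
print(round(m, 1))  # -> 29.4 (kg)
```

In practice the correlated thermal model stands in for this idealised energy balance, because real tanks lose heat to their surroundings and warm non-uniformly, which is exactly why the model's sensitivity to applied heat flux and tank mass knowledge matters.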

Relevance: 30.00%

Publisher:

Abstract:

Background: Prostate cancer is a major cause of cancer death in Nigerian men. Attempts to reduce mortality from prostate cancer have focused mainly on early detection of the disease through PSA testing. Given the increased incidence of prostate cancer in Nigeria despite the widespread availability of testing facilities, it became pertinent to understand the salient factors that prompt Nigerian men to go for prostate cancer testing. Objective: This study explores the factors that influence a group of Nigerian men's decision to go for Prostate Specific Antigen (PSA) testing. Methods: Following ethical approval, semi-structured interviews were conducted with a group of 10 men who had a PSA test at the University of Benin Teaching Hospital between July and August 2010, following consultation with their doctor about signs and symptoms. Interview transcripts were analysed using the steps proposed by Colaizzi (1978). Results: Five themes were identified: the symptoms experienced, the influence of friends and relatives, older age associated with increased awareness, accessibility of testing services, and knowledge of the PSA test. Conclusion: The study revealed that there continues to be a considerable lack of awareness and knowledge about prostate cancer and screening.


Relevance: 30.00%

Publisher:

Abstract:

The presence and concentrations of radionuclides can result from both natural and human activities. This study examined the associations and differences among the soil, sediment and water specific activities of long-lived radioactive elements (LLRE). Gamma spectroscopy was used to measure the concentrations of LLRE along the Mini Okoro/Oginigba Creek, Port Harcourt, and the specific activities of three selected LLRE were derived. Correlation analysis was carried out to examine associations among the specific activities across the different substrates. Strong, significant negative correlations exist between the specific activities of water ⁴⁰K and soil ²³²Th (r = -0.721, p < 0.05), water ²³⁸U and soil ²³⁸U (r = -0.717, p < 0.05), and water ⁴⁰K and sediment ²³⁸U (r = -0.69, p < 0.05). Comparison using the Mann-Whitney U test shows that soil and sediment are similar in their specific activities, with Z values of -0.408, -1.209 and -1.021 (p > 0.05) for ⁴⁰K, ²³²Th and ²³⁸U respectively, while the concentrations in the solid samples (soil and sediment) differ from those in the liquid (water) samples. These associations can be attributed to specific underlying factors, and understanding them will require further study.
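A small sketch of the correlation step on invented station data, producing a strong negative Pearson r qualitatively like the associations reported above (the station values below are hypothetical, not the study's measurements):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical specific activities at six sampling stations:
# water 40K (Bq/L) falls roughly as soil 232Th (Bq/kg) rises
water_40k  = [12.0, 9.5, 10.8, 8.1, 7.4, 6.0]
soil_232th = [30.2, 36.5, 33.1, 40.0, 45.3, 42.8]
print(round(pearson_r(water_40k, soil_232th), 2))  # -> -0.94
```

A negative r of this kind says that stations with higher soil ²³²Th tend to show lower water ⁴⁰K; establishing significance, as in the study, additionally requires a p-value for the observed r given the sample size.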