898 results for detection systems
Abstract:
On-site detection of inoculum of polycyclic plant pathogens could potentially contribute to management of disease outbreaks. A 6-min, in-field competitive immunochromatographic lateral flow device (CLFD) assay was developed for detection of Alternaria brassicae (the cause of dark leaf spot in brassica crops) in air sampled above the crop canopy. Reading the test result by eye gives a detection threshold of approximately 50 dark leaf spot conidia; assessment using a portable reader improved test sensitivity. In combination with a weather-driven infection model, CLFD assays were evaluated as part of an in-field risk assessment to identify periods when brassica crops were at risk from A. brassicae infection. The weather-driven model overpredicted A. brassicae infection. An automated 7-day multivial cyclone air sampler combined with a daily in-field CLFD assay detected A. brassicae conidia in air samples from above the crops. Integration of information from an in-field detection system (CLFD) with weather-driven mathematical models predicting pathogen infection has the potential for use within disease management systems.
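To make the integration concrete, the decision logic might be sketched as below. This is a minimal sketch assuming a hypothetical weather rule: only the ~50-conidia visual detection threshold comes from the abstract, while the leaf-wetness and temperature conditions, values, and function names are illustrative, not the authors' model.

```python
# A minimal sketch of combining an in-field detection threshold with a
# weather-driven infection model. The 50-conidia visual detection threshold
# comes from the abstract; the weather rule (leaf wetness and temperature)
# and all names here are illustrative assumptions, not the authors' model.

VISUAL_DETECTION_THRESHOLD = 50  # approx. conidia needed for a by-eye CLFD read

def infection_risk(conidia_count: int, leaf_wetness_h: float, temp_c: float) -> bool:
    """Flag a day as high-risk only when inoculum is detected AND the
    weather model predicts conditions suitable for infection."""
    inoculum_present = conidia_count >= VISUAL_DETECTION_THRESHOLD
    # Hypothetical infection-conducive conditions for A. brassicae
    weather_conducive = leaf_wetness_h >= 6.0 and 15.0 <= temp_c <= 27.0
    return inoculum_present and weather_conducive

# Example: a day with detectable spores but dry conditions is not flagged,
# which is how gating the weather model on detection curbs overprediction.
print(infection_risk(conidia_count=120, leaf_wetness_h=2.0, temp_c=22.0))  # False
print(infection_risk(conidia_count=120, leaf_wetness_h=8.0, temp_c=22.0))  # True
```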
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
The FIREDASS (FIRE Detection And Suppression Simulation) project is concerned with the development of fine water mist systems as a possible replacement for the halon fire suppression system currently used in aircraft cargo holds. The project is funded by the European Commission, under the BRITE EURAM programme. The FIREDASS consortium is made up of a combination of Industrial, Academic, Research and Regulatory partners. As part of this programme of work, a computational model has been developed to help engineers optimise the design of the water mist suppression system. This computational model is based on Computational Fluid Dynamics (CFD) and is composed of the following components: fire model; mist model; two-phase radiation model; suppression model; and detector/activation model. The fire model - developed by the University of Greenwich - uses prescribed release rates for heat and gaseous combustion products to represent the fire load. Typical release rates have been determined through experimentation conducted by SINTEF. The mist model - developed by the University of Greenwich - is a Lagrangian particle tracking procedure that is fully coupled to both the gas phase and the radiation field. The radiation model - developed by the National Technical University of Athens - is based on a six-flux formulation. The suppression model - developed by SINTEF and the University of Greenwich - is based on an extinguishment criterion that relies on oxygen concentration and temperature. The detector/activation model - developed by Cerberus - allows the configuration of many different detector and mist configurations to be tested within the computational model. These sub-models have been integrated by the University of Greenwich into the FIREDASS software package. The model has been validated using data from the SINTEF/GEC test campaigns and it has been found that the computational model gives good agreement with these experimental results. The best agreement is obtained at the ceiling, which is where the detectors and misting nozzles would be located in a real system. In this paper the model is briefly described and some results from the validation of the fire and mist model are presented.
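To illustrate the shape of the suppression model's decision step, the following is a minimal sketch of an oxygen-and-temperature extinguishment criterion applied per CFD cell; the limiting values are illustrative assumptions, not the SINTEF/University of Greenwich constants.

```python
# A minimal sketch of an extinguishment criterion based on local oxygen
# concentration and temperature, in the spirit of the suppression model
# described above. Threshold values are illustrative assumptions only;
# the actual FIREDASS criterion and constants are not reproduced here.

def is_extinguished(o2_fraction: float, temp_k: float,
                    o2_limit: float = 0.13, temp_limit_k: float = 600.0) -> bool:
    """Return True if combustion can no longer be sustained in a cell:
    either oxygen has fallen below a limiting concentration or the local
    gas temperature has dropped below a sustaining value."""
    return o2_fraction < o2_limit or temp_k < temp_limit_k

# Applied per CFD cell each time step, e.g.:
print(is_extinguished(o2_fraction=0.11, temp_k=900.0))  # True: oxygen-starved
print(is_extinguished(o2_fraction=0.20, temp_k=900.0))  # False: still burning
```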
Abstract:
Images acquired from unmanned aerial vehicles (UAVs) can provide data with unprecedented spatial and temporal resolution for three-dimensional (3D) modeling. Solutions developed for this purpose are mainly based on photogrammetric concepts, namely UAV-Photogrammetry Systems (UAV-PS). Such systems are used in applications where both geospatial and visual information of the environment is required. These applications include, but are not limited to, natural resource management such as precision agriculture, military and police-related services such as traffic-law enforcement, precision engineering such as infrastructure inspection, and health services such as epidemic emergency management. UAV-photogrammetry systems can be differentiated based on their spatial characteristics in terms of accuracy and resolution. That is, some applications, such as precision engineering, require high-resolution and high-accuracy information of the environment (e.g. 3D modeling with less than one centimeter accuracy and resolution), while in other applications lower levels of accuracy might be sufficient (e.g. wildlife management needing a few decimeters of resolution). However, even in those applications, the specific characteristics of UAV-PSs should be carefully considered during both system development and application in order to yield satisfactory results. In this regard, this thesis presents a comprehensive review of the applications of unmanned aerial imagery, where the objective was to determine the challenges that remote-sensing applications of UAV systems currently face. This review also made it possible to recognize the specific characteristics and requirements of UAV-PSs, which are mostly ignored or not thoroughly assessed in recent studies. Accordingly, the focus of the first part of this thesis is on exploring the methodological and experimental aspects of implementing a UAV-PS. The developed system was extensively evaluated for precise modeling of an open-pit gravel mine and performing volumetric-change measurements. This application was selected for two main reasons. Firstly, this case study provided a challenging environment for 3D modeling, in terms of scale changes, terrain relief variations as well as structure and texture diversities. Secondly, open-pit-mine monitoring demands high levels of accuracy, which justifies our efforts to improve the developed UAV-PS to its full capacity. The hardware of the system consisted of an electric-powered helicopter, a high-resolution digital camera, and an inertial navigation system. The software of the system included in-house programs specifically designed for camera calibration, platform calibration, system integration, onboard data acquisition, flight planning and ground control point (GCP) detection. The detailed features of the system are discussed in the thesis, and solutions are proposed in order to enhance the system and its photogrammetric outputs. The accuracy of the results was evaluated under various mapping conditions, including direct georeferencing and indirect georeferencing with different numbers, distributions and types of ground control points. Additionally, the effects of imaging configuration and network stability on modeling accuracy were assessed. The second part of this thesis concentrates on improving the techniques of sparse and dense reconstruction.
The proposed solutions are alternatives to traditional aerial photogrammetry techniques, properly adapted to the specific characteristics of unmanned, low-altitude imagery. Firstly, a method was developed for robust sparse matching and epipolar-geometry estimation. The main achievement of this method was its capacity to handle a very high percentage of outliers (errors among corresponding points) with remarkable computational efficiency compared to state-of-the-art techniques. Secondly, a block bundle adjustment (BBA) strategy was proposed based on the integration of intrinsic camera calibration parameters as pseudo-observations in the Gauss-Helmert model. The principal advantage of this strategy was controlling the adverse effect of unstable imaging networks and noisy image observations on the accuracy of self-calibration. A sparse implementation of this strategy was also performed, which allowed its application to data sets containing large numbers of tie points. Finally, the concepts of intrinsic curves were revisited for dense stereo matching. The proposed technique achieved a high level of accuracy and efficiency by searching only through a small fraction of the whole disparity search space, as well as internally handling occlusions and matching ambiguities. These photogrammetric solutions were extensively tested using synthetic data, close-range images and the images acquired from the gravel-pit mine. Achieving an absolute 3D mapping accuracy of 11±7 mm illustrates the success of this system for high-precision modeling of the environment.
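For context, the conventional baseline for robust sparse matching and epipolar-geometry estimation, which the thesis's method improves upon in outlier tolerance and efficiency, can be sketched with stock OpenCV routines. This is not the thesis's algorithm; it assumes the standard SIFT + ratio test + RANSAC pipeline.

```python
# A sketch of the conventional pipeline for robust sparse matching and
# epipolar-geometry estimation (SIFT + ratio test + RANSAC), i.e. the
# baseline against which outlier-robust methods like the thesis's are
# compared. This is NOT the thesis's algorithm; it uses stock OpenCV.
import cv2
import numpy as np

def estimate_fundamental(img1_path: str, img2_path: str):
    img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Lowe's ratio test discards ambiguous correspondences
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.75 * n.distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # RANSAC rejects remaining outliers while fitting the fundamental
    # matrix; its cost grows quickly with the outlier ratio, which is
    # what motivates more efficient robust estimators.
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC,
                                            ransacReprojThreshold=1.0,
                                            confidence=0.999)
    return F, inlier_mask
```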
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the user's session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make each user unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. The tool is also able to detect outliers in role-based user behavior with strong performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems, such as mobile devices and the analysis of network traffic.
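A minimal sketch of the underlying idea, an n-gram model over a user's action stream scored for deviation, is shown below; the action names, add-one smoothing and thresholding scheme are illustrative assumptions rather than the Intruder Detector implementation.

```python
# A minimal sketch of n-gram modeling of user actions for continuous
# authentication, in the spirit of the approach described above. The
# action names, smoothing, and threshold are illustrative assumptions,
# not the Intruder Detector implementation.
from collections import Counter
import math

def ngrams(actions, n=3):
    return [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]

class UserModel:
    def __init__(self, training_sessions, n=3):
        self.n = n
        self.counts = Counter(g for s in training_sessions for g in ngrams(s, n))
        self.total = sum(self.counts.values())

    def score(self, session):
        """Average negative log-likelihood of a session's n-grams;
        large values indicate deviation from this user's normal behavior."""
        grams = ngrams(session, self.n)
        if not grams:
            return float("inf")
        vocab = len(self.counts) + 1
        nll = 0.0
        for g in grams:
            # Add-one smoothing so unseen n-grams are penalized, not fatal
            p = (self.counts[g] + 1) / (self.total + vocab)
            nll -= math.log(p)
        return nll / len(grams)

# Usage: flag the session if the score exceeds a threshold calibrated on
# held-out sessions of the legitimate user.
train = [["login", "search", "view", "edit", "save", "logout"]] * 5
model = UserModel(train)
print(model.score(["login", "search", "view", "edit", "save", "logout"]))  # low
print(model.score(["login", "export", "export", "export", "logout"]))      # high
```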
Abstract:
Invasive species pose a major threat to aquatic ecosystems. Their impact can be particularly severe in tropical regions, like those in northern Australia, where >20 invasive fish species are recorded. In temperate regions, environmental DNA (eDNA) technology is gaining momentum as a tool to detect aquatic pests, but the technology's effectiveness has not been fully explored in tropical systems with their unique climatic challenges (i.e. high turbidity, temperatures and ultraviolet light). In this study, we modified conventional eDNA protocols for use in tropical environments using the invasive Mozambique tilapia (Oreochromis mossambicus) as a detection model. We evaluated the effects of high water temperatures and fish density on the detection of tilapia eDNA, using filters with larger pores to facilitate filtration. Large-pore filters (20 μm) were effective in filtering turbid waters and retaining sufficient eDNA, whilst achieving filtration times of 2-3 min per 2-L sample. High water temperatures, often experienced in the tropics (23, 29, 35 °C), did not affect eDNA degradation rates, although high temperatures (35 °C) did significantly increase fish eDNA shedding rates. We established a minimum detection limit for tilapia (1 fish per 0.4 megalitres, after 4 days) and found that low water flow (3.17 L/s) into ponds with high fish density (>16 fish/0.4 megalitres) did not affect eDNA detection. These results demonstrate that eDNA technology can be effectively used in tropical ecosystems to detect invasive fish species.
Abstract:
The use of artificial immune systems in intrusion detection is an appealing concept for two reasons. Firstly, the human immune system provides the human body with a high level of protection from invading pathogens, in a robust, self-organised and distributed manner. Secondly, current techniques used in computer security are not able to cope with the dynamic and increasingly complex nature of computer systems and their security. It is hoped that biologically inspired approaches in this area, including the use of immune-based systems, will be able to meet this challenge. Here we collate the algorithms used, the development of the systems and the outcomes of their implementation. This review provides an introduction to the key developments within this field, in addition to making suggestions for future research.
Abstract:
Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system. Research into this family of cells has revealed that they perform the role of coordinating T-cell based immune responses, both reactive responses and the generation of tolerance. We have derived an algorithm based on the functionality of these cells, using their signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.
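A minimal sketch of the signal-fusion step such a dendritic cell inspired algorithm might use is given below; the weight matrix, migration threshold and classification rule are illustrative assumptions in the spirit of the abstract, not the authors' published values.

```python
# A minimal sketch of dendritic-cell-inspired signal fusion: input signals
# are combined into costimulation (csm), semi-mature and mature outputs,
# and a cell classifies its sampled antigen as anomalous when the mature
# signal dominates at migration time. The weight matrix and threshold are
# illustrative assumptions, not the authors' values.

# Columns: PAMP, danger, safe. Signs reflect that safe signals suppress
# maturation while PAMP/danger promote it.
WEIGHTS = {
    "csm":  (2.0, 1.0, 2.0),
    "semi": (0.0, 0.0, 3.0),
    "mat":  (2.0, 1.0, -3.0),
}
MIGRATION_THRESHOLD = 10.0  # illustrative

def classify(signal_stream):
    """signal_stream: iterable of (pamp, danger, safe) triples sampled
    while the artificial cell collects antigen."""
    csm = semi = mat = 0.0
    for pamp, danger, safe in signal_stream:
        csm  += WEIGHTS["csm"][0] * pamp + WEIGHTS["csm"][1] * danger + WEIGHTS["csm"][2] * safe
        semi += WEIGHTS["semi"][2] * safe
        mat  += WEIGHTS["mat"][0] * pamp + WEIGHTS["mat"][1] * danger + WEIGHTS["mat"][2] * safe
        if csm >= MIGRATION_THRESHOLD:  # cell migrates and presents its context
            return "anomalous" if mat > semi else "normal"
    return "undecided"  # not enough signal accumulated to migrate

print(classify([(1.0, 2.0, 0.1)] * 5))  # mostly PAMP/danger -> "anomalous"
print(classify([(0.0, 0.1, 2.0)] * 5))  # mostly safe -> "normal"
```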
Abstract:
The role of T-cells within the immune system is to confirm and assess anomalous situations and then either respond to or tolerate the source of the effect. To illustrate how these mechanisms can be harnessed to solve real-world problems, we present the blueprint of a T-cell inspired algorithm for computer security worm detection. We show how the three central T-cell processes, namely T-cell maturation, differentiation and proliferation, naturally map into this domain and further illustrate how such an algorithm fits into a complete immune inspired computer security system and framework.
Abstract:
This thesis focuses on digital equalization of nonlinear fiber impairments for coherent optical transmission systems. Building from well-known physical models of signal propagation in single-mode optical fibers, novel nonlinear equalization techniques are proposed, numerically assessed and experimentally demonstrated. The structure of the proposed algorithms is strongly driven by the optimization of the performance versus complexity tradeoff, with a view to near-future practical application in commercial real-time transceivers. The work is initially focused on the mitigation of intra-channel nonlinear impairments relying on the concept of digital backpropagation (DBP) associated with Volterra-based filtering. After a comprehensive analysis of the third-order Volterra kernel, a set of critical simplifications is identified, culminating in the development of reduced-complexity nonlinear equalization algorithms formulated both in time and frequency domains. The implementation complexity of the proposed techniques is analytically described in terms of computational effort and processing latency, by determining the number of real multiplications per processed sample and the number of serial multiplications, respectively. The equalization performance is numerically and experimentally assessed through bit error rate (BER) measurements. Finally, the problem of inter-channel nonlinear compensation is addressed within the context of 400 Gb/s (400G) superchannels for long-haul and ultra-long-haul transmission. Different superchannel configurations and nonlinear equalization strategies are experimentally assessed, demonstrating that inter-subcarrier nonlinear equalization can provide an enhanced signal reach while requiring only marginal added complexity.
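For orientation, plain split-step digital backpropagation, the starting point that reduced-complexity Volterra-based equalizers refine, can be sketched as follows. Fiber parameters are typical standard single-mode fiber values chosen for illustration, and the sign convention is stated in the comments; this is not the thesis's formulation.

```python
# A minimal sketch of single-channel digital backpropagation (DBP) using an
# asymmetric split-step method: the received samples are run through a fiber
# model with loss, dispersion and Kerr nonlinearity inverted. Parameter
# values are typical SSMF numbers chosen for illustration; this is the plain
# split-step DBP baseline, not the thesis's Volterra-based equalizers.
import numpy as np

def dbp(rx, fs, n_spans=10, span_km=80.0, steps_per_span=4,
        beta2=-21.7e-27,   # s^2/m, group-velocity dispersion
        gamma=1.3e-3,      # 1/(W*m), nonlinear coefficient
        alpha_db_km=0.2):  # dB/km, fiber loss
    a = np.asarray(rx, dtype=complex)
    omega = 2 * np.pi * np.fft.fftfreq(a.size, d=1.0 / fs)   # rad/s
    dz = span_km * 1e3 / steps_per_span                      # step size, m
    alpha = alpha_db_km * np.log(10) / 10 / 1e3              # power loss, 1/m
    l_eff = (1.0 - np.exp(-alpha * dz)) / alpha              # effective NL length
    # Inverse dispersion operator for one step; the forward operator is its
    # complex conjugate (flip the sign if your convention differs).
    d_inv = np.exp(0.5j * beta2 * omega**2 * dz)
    for _ in range(n_spans * steps_per_span):
        a *= np.exp(0.5 * alpha * dz)                        # undo attenuation
        a = np.fft.ifft(np.fft.fft(a) * d_inv)               # undo dispersion
        a *= np.exp(-1j * gamma * l_eff * np.abs(a) ** 2)    # undo SPM phase
    return a
```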
Abstract:
The immune system provides an ideal metaphor for anomaly detection in general and computer security in particular. Based on this idea, artificial immune systems have been used for a number of years for intrusion detection, unfortunately so far with little success. However, these previous systems were largely based on immunological theory from the 1970s and 1980s, and over the last decade our understanding of immunological processes has vastly improved. In this paper we present two new immune-inspired algorithms based on the latest immunological discoveries, such as the behaviour of Dendritic Cells. The resultant algorithms are applied to real-world intrusion problems and show encouraging results. Overall, we believe there is a bright future for these next-generation artificial immune algorithms.
Abstract:
Poultry colibacillosis due to Avian Pathogenic Escherichia coli (APEC) is responsible for several extra-intestinal pathological conditions, leading to serious economic damage in poultry production. The most commonly associated pathologies are airsacculitis, colisepticemia, and cellulitis in broiler chickens, and salpingitis and peritonitis in broiler breeders. In this work a total of 66 strains isolated from dead broiler breeders affected with colibacillosis and 61 strains from healthy broilers were studied. Strains from broiler breeders were typed as serogroups O2, O18, and O78, which are mainly associated with disease. The serogroup O78 was the most prevalent (58%). All the strains were checked by multiplex PCR for the presence of 11 virulence genes: i) arginine succinyltransferase A (astA); ii) E. coli heme utilization protein A (chuA); iii) colicin V A/B (cvaA/B); iv) mannose-binding type 1 fimbriae (fimC); v) ferric yersiniabactin uptake A (fyuA); vi) iron-repressible high-molecular-weight protein 2 (irp2); vii) increased serum survival (iss); viii) iron-uptake system of E. coli D (iucD); ix) pyelonephritis-associated pili C (papC); x) temperature-sensitive haemagglutinin (tsh); and xi) vacuolating autotransporter toxin (vat). The results showed that all genes are present in both commensal and pathogenic E. coli strains. The iron uptake-related genes and the serum survival gene were more prevalent among APEC. The adhesin genes, except tsh, and the toxin genes, except astA, were also more prevalent among APEC isolates. Except for astA and tsh, APEC strains harbored the majority of the virulence-associated genes studied, and fimC was the most prevalent gene, detected in 96.97 and 88.52% of APEC and AFEC strains, respectively. Possession of more than one iron transport system seems to play an important role in APEC survival.
Abstract:
The use of artificial immune systems in intrusion detection is an appealing concept for two reasons. Firstly, the human immune system provides the human body with a high level of protection from invading pathogens, in a robust, self-organised and distributed manner. Secondly, current techniques used in computer security are not able to cope with the dynamic and increasingly complex nature of computer systems and their security. It is hoped that biologically inspired approaches in this area, including the use of immune-based systems, will be able to meet this challenge. Here we review the algorithms used, the development of the systems and the outcomes of their implementation. We provide an introduction and analysis of the key developments within this field, in addition to making suggestions for future research.