875 results for Intrusion Detection Systems
Abstract:
Oxidised biomolecules in aged tissue could potentially be used as biomarkers for age-related diseases; however, it is still unclear whether they contribute causatively to ageing or are consequences of the ageing process. To assess the potential of protein oxidation as a marker of ageing, mass spectrometry (MS) was employed for the identification and quantification of oxidative modifications in obese (ob/ob) mice. Lean muscle mass and strength are reduced in obesity, providing a sarcopenic model in which the levels of oxidation can be evaluated for different muscular systems, including calcium homeostasis, metabolism and contractility. Several oxidised residues were identified by tandem MS (MS/MS) in both muscle homogenate and isolated sarcoplasmic reticulum (SR), an organelle that regulates intracellular calcium levels in muscle. These modifications include oxidation of methionine, cysteine, tyrosine, and tryptophan in several proteins, such as sarcoplasmic reticulum calcium ATPase (SERCA), glycogen phosphorylase, and myosin. Once modifications had been identified, multiple reaction monitoring MS (MRM) was used to quantify the percentage modification of oxidised residues within the samples. Preliminary data suggest that proteins in ob/ob mice are more oxidised than in the controls. For example, SERCA, which constitutes 60-70% of SR protein, showed an approximately 2-fold increase in cysteine trioxidation of Cys561 in the obese model compared to the control. Other obese muscle proteins showed a similar increase in oxidation at various residues. Further analysis with complex protein mixtures will determine the potential diagnostic use of MRM experiments for analysing protein oxidation in small biological samples such as muscle needle biopsies.
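As a concrete illustration of the MRM quantification step, the percentage modification of a residue can be estimated from the chromatographic peak areas of the modified and unmodified forms of the same peptide; a minimal sketch follows, with hypothetical peak-area values rather than data from the study.

```python
# Illustrative MRM quantification: percentage modification of a residue
# from the peak areas of the oxidised and unmodified forms of a peptide.
# Peak-area values are hypothetical, not data from the study.

def percent_modification(area_oxidised, area_unmodified):
    """Percentage of the residue carrying the oxidative modification."""
    total = area_oxidised + area_unmodified
    if total == 0:
        raise ValueError("both peak areas are zero")
    return 100.0 * area_oxidised / total

# Hypothetical peak areas illustrating a roughly 2-fold difference.
control = percent_modification(1.2e5, 2.3e6)   # ~5.0%
obese = percent_modification(2.5e5, 2.2e6)     # ~10.2%
print(f"control: {control:.1f}%  obese: {obese:.1f}%")
```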
Abstract:
This chapter explores ways in which rigorous mathematical techniques, termed formal methods, can be employed to improve the predictability and dependability of autonomic computing. Model checking, formal specification, and quantitative verification are presented in the contexts of conflict detection in autonomic computing policies and of the implementation of goal and utility-function policies in autonomic IT systems. Each technique is illustrated with a detailed case study and analysed to establish its merits and limitations. The analysis is then used as a basis for discussing the challenges and opportunities of this endeavour to move the development of autonomic IT systems from the current practice of ad-hoc methods and heuristics towards a more principled approach. © 2012, IGI Global.
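To make the policy distinction concrete, here is a toy sketch: a goal policy accepts any configuration meeting a fixed target, whereas a utility-function policy ranks every configuration by a scalar utility. The configurations and the utility function are invented for illustration and are far simpler than the chapter's case studies.

```python
# Toy sketch of a utility-function policy for an autonomic IT system.
# The configurations and the utility function are invented; the chapter's
# case studies are considerably richer than this.

configurations = [
    {"servers": 2, "response_ms": 180, "cost": 20},
    {"servers": 4, "response_ms": 90, "cost": 40},
    {"servers": 8, "response_ms": 60, "cost": 80},
]

def utility(cfg):
    # Reward low response time, penalise cost (weights are arbitrary).
    return 1000.0 / cfg["response_ms"] - 0.1 * cfg["cost"]

# A goal policy would accept any configuration with response_ms <= 100;
# a utility-function policy ranks all configurations instead.
best = max(configurations, key=utility)
print("chosen configuration:", best)
```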
Abstract:
Insulated gate bipolar transistor (IGBT) modules are important safety-critical components in electrical power systems. Bond wire lift-off, a plastic deformation between a wire bond and the adjacent layers of a device caused by repeated power/thermal cycles, is the most common failure mechanism in IGBT modules. For the early detection and characterization of such failures, it is important to continuously detect or monitor the health state of IGBT modules, and the state of bond wires in particular. This paper introduces eddy current pulsed thermography (ECPT), a nondestructive evaluation technique, for the detection and characterization of bond wire lift-off in IGBT modules. After the introduction of the experimental ECPT system, numerical simulation work is reported. The presented simulations are based on the 3-D electromagnetic-thermal coupling finite-element method and analyze the transient temperature distribution within the bond wires. The paper illustrates the thermal patterns of bond wires under inductive heating for different wire statuses (lifted-off or well bonded) and two excitation conditions: nonuniform and uniform magnetic field excitation. Experimental results show that uniform excitation of healthy bond wires, using a Helmholtz coil, produces the same eddy currents in each wire, while different eddy currents are seen in faulty wires. Both experimental and numerical results show that ECPT can be used for the detection and characterization of bond wires in power semiconductors through the analysis of the transient heating patterns of the wires. The main impact of this paper is that it is the first time electromagnetic induction thermography, so-called ECPT, has been employed on power electronic devices. Because of its capability for contactless inspection of multiple wires in a single pass, it opens a wide field of investigation in power electronic devices for failure detection, performance characterization, and health monitoring.
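For readers unfamiliar with the coupled formulation, electromagnetic-thermal simulations of this kind are typically built on two standard relations: the Joule heat generated by the induced eddy currents, and the transient heat conduction equation it drives. These are textbook equations, not reproduced from the paper, and the paper's specific boundary conditions are omitted:

```latex
% Joule heating from the induced eddy current density J in a conductor
% of electrical conductivity sigma
Q = \frac{|\mathbf{J}|^{2}}{\sigma}, \qquad \mathbf{J} = \sigma \mathbf{E}

% Transient temperature field in the bond wires (density rho, specific
% heat C_p, thermal conductivity k) driven by the eddy-current heating Q
\rho C_{p} \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T) + Q
```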
Abstract:
Phospholipid oxidation by adventitious damage generates a wide variety of products with potentially novel biological activities that can modulate inflammatory processes associated with various diseases. Understanding the biological importance of oxidised phospholipids (OxPLs) and their potential role as disease biomarkers requires precise information about the abundance of these compounds in cells and tissues. Many chemiluminescence and spectrophotometric assays are available for detecting oxidised phospholipids, but they all have limitations. Mass spectrometry coupled with liquid chromatography is a powerful and sensitive approach that can provide detailed information about the oxidative lipidome, but challenges remain. The aim of this work is to develop improved methods for the detection of OxPLs by optimising chromatographic separation, through testing several reverse-phase columns and solvent systems, and by using targeted mass spectrometry approaches. Initial experiments were carried out using oxidation products generated in vitro to optimise the chromatographic separation and mass spectrometry parameters. We evaluated the chromatographic separation of oxidised phosphatidylcholines (OxPCs) and oxidised phosphatidylethanolamines (OxPEs) using C8, C18 and C30 reverse-phase, polystyrene-divinylbenzene-based monolithic, and mixed-mode hydrophilic interaction (HILIC) columns interfaced with mass spectrometry. Our results suggest that the monolithic column was best able to separate short-chain OxPCs and OxPEs from long-chain oxidised and native PCs and PEs. However, variation in the charge of polar head groups and the extreme diversity of oxidised species make analysis of several classes of OxPLs within one analytical run impractical. We therefore evaluated and optimised the chromatographic separation of OxPLs by serially coupling a HILIC column and a monolithic column, which provided larger coverage of OxPL species in a single analytical run.
Abstract:
Raster-graphic ampelometric software was developed not only for the estimation of leaf area but also for the characterization of grapevine (Vitis vinifera L.) leaves. The software was written in the C++ programming language, using C++ Builder 2007, for Windows 95-XP and Linux operating systems. It handles desktop-scanned images. On the image analysed with the GRA.LE.D., the user has to mark 11 points. These points are then connected and the distances between them calculated. The GRA.LE.D. software supports standard ampelometric measurements such as leaf area, angles between the veins, and lengths of the veins. These measurements are recorded by the software and exported into plain ASCII text files for single or multiple samples. Twenty-two biometric data points of each leaf are identified by the GRA.LE.D. It provides the opportunity to statistically analyse experimental data, allows comparison of cultivars, and enables graphic reconstruction of leaves using the Microsoft Excel Chart Wizard. The GRA.LE.D. was thoroughly calibrated and compared to other widely used instruments and methods such as photo-gravimetry, the LiCor L0100, WinDIAS2.0 and ImageTool. In this comparison, the GRA.LE.D. produced the most accurate measurements of leaf area, but the LiCor L0100 and WinDIAS2.0 were faster, while the photo-gravimetric method proved the most time-consuming. The WinDIAS2.0 instrument was the least reliable. The GRA.LE.D. is uncomplicated, user-friendly, accurate, consistent, reliable and has wide practical application.
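To illustrate the kind of landmark-based computation such software performs, a minimal sketch follows: distances between user-marked points and the angle between two veins sharing the petiolar point. The coordinates are hypothetical pixel positions, and the functions are illustrative, not GRA.LE.D. code.

```python
# Minimal sketch of landmark-based ampelometric measurements: distances
# between marked points and the angle between two veins with a common
# origin. Coordinates are hypothetical pixel positions.
import math

def distance(p, q):
    return math.hypot(q[0] - p[0], q[1] - p[1])

def vein_angle(origin, tip_a, tip_b):
    """Angle (degrees) between two veins sharing a common origin point."""
    ax, ay = tip_a[0] - origin[0], tip_a[1] - origin[1]
    bx, by = tip_b[0] - origin[0], tip_b[1] - origin[1]
    cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

petiole = (120, 400)           # petiolar point
main_vein_tip = (125, 60)      # tip of the main vein
lateral_vein_tip = (300, 150)  # tip of a lateral vein
print(distance(petiole, main_vein_tip))            # vein length in pixels
print(vein_angle(petiole, main_vein_tip, lateral_vein_tip))
```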
Abstract:
The main challenges of multimedia data retrieval lie in the effective mapping between low-level features and high-level concepts, and in individual users' subjective perceptions of multimedia content. The objective of this dissertation is to develop an integrated multimedia indexing and retrieval framework that bridges the gap between semantic concepts and low-level features. To achieve this goal, a set of core techniques has been developed, including image segmentation, content-based image retrieval, object tracking, video indexing, and video event detection. These core techniques are integrated in a systematic way to enable semantic search for images and videos, and can be tailored to problems in other multimedia-related domains. In image retrieval, two new methods of bridging the semantic gap are proposed: (1) for general content-based image retrieval, a stochastic mechanism is utilized to enable long-term learning of high-level concepts from a set of training data, such as user access frequencies and access patterns of images; (2) in addition to whole-image retrieval, a novel multiple instance learning framework is proposed for object-based image retrieval, by which a user can more effectively search for images that contain multiple objects of interest. An enhanced image segmentation algorithm is developed to extract object information from images. This segmentation algorithm is further used in video indexing and retrieval, where a robust video shot/scene segmentation method is developed based on low-level visual feature comparison, object tracking, and audio analysis. Based on shot boundaries, a novel data mining framework is proposed to detect events in soccer videos, fully utilizing the multi-modality features and object information obtained through video shot/scene detection. Another contribution of this dissertation is the potential of the above techniques to be tailored and applied to other multimedia applications. This is demonstrated by their utilization in traffic video surveillance: the enhanced image segmentation algorithm, coupled with an adaptive background learning algorithm, improves the performance of vehicle identification, and a sophisticated object tracking algorithm is proposed to track individual vehicles, while the spatial and temporal relationships of vehicle objects are modeled by an abstract semantic model.
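The object-based retrieval above rests on the standard multiple-instance assumption: an image is a bag of segmented regions (instances), and a bag is relevant if at least one instance matches the concept. The sketch below illustrates only that assumption; the region scorer is a placeholder, not the dissertation's learned model.

```python
# Minimal sketch of the standard multiple-instance learning (MIL)
# assumption in object-based image retrieval: an image is a "bag" of
# segmented regions ("instances"), relevant if at least one region
# matches the concept. The region scorer is a placeholder.
import numpy as np

def score_region(region_features, concept):
    # Placeholder similarity: negative distance to a learned concept point.
    return -float(np.linalg.norm(region_features - concept))

def bag_score(regions, concept):
    # Max over instances: the image is as relevant as its best region.
    return max(score_region(r, concept) for r in regions)

concept = np.array([0.4, 0.7, 0.1])
image_regions = [np.array([0.9, 0.2, 0.3]), np.array([0.41, 0.69, 0.12])]
print(bag_score(image_regions, concept))  # near 0 => likely relevant
```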
Abstract:
Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings, and trees. In the past decade, LIDAR has attracted increasing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for automated extraction of geometrical information, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These are essential to numerous applications such as flood modeling, landslide prediction, and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region-growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points, and are further simplified and adjusted by several proposed operations to remove the noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations designed for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust the topology. The 2D snake algorithm consists of newly defined energy functions for topology adjusting and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrate that the proposed framework achieves very good performance.
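A simplified sketch of the progressive morphological filtering idea follows, operating on a gridded surface model: grey-scale openings with progressively larger windows remove above-ground objects whose elevation jump exceeds a threshold that grows with the window size. The window sizes, base threshold, and slope term are illustrative, and the published algorithm works from irregularly spaced points interpolated to a grid.

```python
# Simplified sketch of progressive morphological filtering on a gridded
# surface model: grey-scale openings with growing windows remove objects
# (vehicles, trees, buildings) whose height jump exceeds a window-size-
# dependent threshold. Parameters are illustrative only.
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(dsm, windows=(3, 9, 21),
                                     base_dh=0.3, slope=0.15, cell=1.0):
    ground = dsm.copy()
    nonground = np.zeros(dsm.shape, dtype=bool)
    for w in windows:
        opened = grey_opening(ground, size=(w, w))
        dh_max = base_dh + slope * (w // 2) * cell  # grows with window size
        nonground |= (ground - opened) > dh_max     # flag removed objects
        ground = opened
    return ground, nonground

dsm = np.random.rand(100, 100) * 0.2  # stand-in for an interpolated grid
dsm[40:50, 40:55] += 8.0              # a "building"
ground, nonground = progressive_morphological_filter(dsm)
print(nonground[45, 45], nonground[5, 5])  # True (building), False (ground)
```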
Abstract:
Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation's highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN)-based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and selections of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models achieved a high DR of 90-95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume contributed the least. The results from this research provide useful insights on the design of AID for arterial street applications.
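For context, the two evaluation measures above can be stated directly in code, together with a one-level Haar DWT of the kind commonly used for wavelet preprocessing. The interval counts and speed series below are invented, and one common definition of FAR (false alarms as a fraction of incident-free intervals) is assumed rather than taken from the dissertation.

```python
# The two evaluation measures for AID algorithms, plus a one-level Haar
# DWT as a simple wavelet preprocessing step. All numbers are invented.
import numpy as np

def haar_dwt(signal):
    """One-level Haar DWT: (approximation, detail) coefficient arrays."""
    s = np.asarray(signal, dtype=float)
    s = s[: len(s) // 2 * 2].reshape(-1, 2)
    return (s[:, 0] + s[:, 1]) / np.sqrt(2), (s[:, 0] - s[:, 1]) / np.sqrt(2)

def detection_rate(detected, total_incidents):
    return 100.0 * detected / total_incidents

def false_alarm_rate(false_alarms, incident_free_intervals):
    return 100.0 * false_alarms / incident_free_intervals

speeds = [52, 50, 51, 49, 23, 21, 22, 24]  # mph; the drop marks an incident
approx, detail = haar_dwt(speeds)
print(approx)                      # smoothed half-length trend keeps the drop
print(detection_rate(19, 20))      # 95.0 (% of incidents detected)
print(false_alarm_rate(35, 1000))  # 3.5 (% of clean intervals flagged)
```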
Abstract:
Fast-spreading unknown viruses have caused major damage to computer systems upon their initial release. Current detection methods lack the capability to detect unknown viruses quickly enough to avoid mass spreading and damage. This dissertation presents a behavior-based approach to detecting known and unknown viruses based on their attempts to replicate. Replication is the qualifying fundamental characteristic of a virus and is consistently present in all viruses, making this approach applicable to viruses belonging to many classes and executing under many conditions. A form of replication called self-reference replication (SR-replication) has been formalized as one main type of replication, in which a virus replicates by modifying or creating other files on a system to include the virus itself. This replication type was used to detect viruses attempting replication by referencing themselves, a necessary step for successful replication. The approach does not require a priori knowledge about known viruses. Detection was accomplished at runtime by monitoring currently executing processes attempting to replicate. Two implementation prototypes of the detection approach, called SRRAT, were created and tested on Microsoft Windows operating systems, focusing on the tracking of user-mode Win32 API system calls and kernel-mode system services. The results showed SR-replication capable of distinguishing between file-infecting viruses and benign processes with few or no false positives and false negatives.
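An illustrative user-mode sketch of the self-reference idea follows: a write is suspicious when the written file equals, or embeds, the writing process's own executable image. The real SRRAT prototypes hook Win32 API calls and kernel system services rather than scanning files after the fact; the paths in the usage comment are hypothetical.

```python
# Illustrative check for self-reference replication (SR-replication):
# a process is suspicious if a file it writes equals, or embeds, the
# process's own executable image. File paths are hypothetical.
from pathlib import Path

def looks_like_sr_replication(process_image: Path, written_file: Path) -> bool:
    """True if the written file reproduces the writing process's image."""
    self_bytes = process_image.read_bytes()
    written = written_file.read_bytes()
    # Exact copy of itself, or its image embedded in a host file
    # (e.g. a file-infecting virus prepending/appending itself).
    return written == self_bytes or self_bytes in written

# Hypothetical usage inside a monitor that intercepts file-write events:
# if looks_like_sr_replication(Path("C:/tmp/sample.exe"),
#                              Path("C:/apps/host.exe")):
#     print("possible SR-replication: quarantine the process")
```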
Abstract:
Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's prediction. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied in two different hydrological analyses to explore the capacity of a variable-density model to guide data collection. The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model. The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology. The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to be used to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can be used to give insight into submarine groundwater discharge and to guide data collection.
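For reference, SEAWAT couples flow and transport through a fluid density that depends on both solute concentration and temperature; a linear equation of state of the following general form is commonly used in such variable-density codes (this is the standard textbook form, with reference density ρ_f at reference concentration C_f and temperature T_f, not a formula quoted from this research):

```latex
\rho = \rho_f
     + \frac{\partial \rho}{\partial C}\,(C - C_f)
     + \frac{\partial \rho}{\partial T}\,(T - T_f)
```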
Abstract:
Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants and improvised mixtures of concentrated hydrogen peroxide/fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods which permit the identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives which employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method utilizing ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants, and post-burn and post-blast residues of these propellants were analyzed. It was determined that the ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants, as well as in the post-burn and post-blast residues. Degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows for the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues. Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD) and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide/fuel; hydrogen peroxide was detected on a variety of substrates. Hydrogen peroxide was also detected in the post-blast residues of the improvised explosives TATP and HMTD.
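The exact-mass capability mentioned above is conventionally reported as a mass error in parts per million, comparing the measured ion mass against the theoretical monoisotopic mass (a standard definition, not specific to this work):

```latex
\text{mass error (ppm)} =
  \frac{m_{\mathrm{measured}} - m_{\mathrm{theoretical}}}
       {m_{\mathrm{theoretical}}} \times 10^{6}
```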
Abstract:
The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high performance with flexible architectures that allow quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, with trade-offs among processing performance (to achieve specified frame rates while working on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This work therefore includes dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance. The contributions of this dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting the occurrence of occlusion during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure, and the method is analysed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gesture sets involved in different applications may vary; it is therefore highly desirable to keep the feature vector as small as possible while maintaining the same accuracy and performance.
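A minimal sketch of the background-subtraction idea behind such a system follows: an exponential running average models the background, and a single global, frame-adaptive threshold separates targets. This is written in the spirit of the RAMT approach; the exact RAMT formulation is not reproduced, and alpha and k are illustrative parameters.

```python
# Sketch of running-average background subtraction with a global adaptive
# threshold, in the spirit of the RAMT approach described above. The
# parameters alpha and k are illustrative, not the published values.
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running average of the scene background."""
    return (1 - alpha) * background + alpha * frame

def detect_targets(background, frame, k=2.5):
    diff = np.abs(frame.astype(float) - background)
    threshold = diff.mean() + k * diff.std()  # one global, frame-adaptive value
    return diff > threshold                   # boolean foreground mask

bg = np.full((240, 320), 100.0)               # stand-in for a learned background
frame = bg + np.random.randn(240, 320) * 2.0  # sensor noise
frame[100:140, 150:200] += 60.0               # a moving "target"
mask = detect_targets(bg, frame)
bg = update_background(bg, frame)
print(mask[120, 170], mask[10, 10])           # True (target), False (background)
```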
Abstract:
Advancements in micro- and nano-scale fabrication techniques have opened up new avenues for the development of portable, scalable and easier-to-use biosensors. Over the last few years, electrodes made of carbon have been widely used as sensing units in biosensors due to their attractive physiochemical properties. The aim of this research is to investigate different strategies to develop functionalized high-surface-area carbon micro/nano-structures for electrochemical and biosensing devices. High aspect ratio three-dimensional carbon microarrays were fabricated via the carbon microelectromechanical systems (C-MEMS) technique, which is based on pyrolyzing pre-patterned organic photoresist polymers. To further increase the surface area of the carbon microstructures, surface porosity was introduced by two strategies: (i) using F127 as a porogen and (ii) oxygen reactive ion etch (RIE) treatment. Electrochemical characterization showed that porous carbon thin-film electrodes prepared using F127 as a porogen had an increased effective surface area (Aeff, 185%) compared to the conventional carbon electrode. To achieve enhanced electrochemical sensitivity for C-MEMS-based functional devices, graphene was conformally coated onto high aspect ratio three-dimensional (3D) carbon micropillar arrays using the electrostatic spray deposition (ESD) technique. The amperometric response of the graphene/carbon micropillar electrode arrays exhibited higher electrochemical activity, improved charge transfer, and a linear response towards H2O2 detection between 250 µM and 5.5 mM. Furthermore, carbon structures with dimensions from the 50-nanometer to the micrometer level were fabricated by pyrolyzing a photo-nanoimprint-lithography-patterned organic resist polymer. The microstructure, elemental composition, and resistivity of the carbon nanostructures produced by this process were very similar to those of conventional photoresist-derived carbon. Surface functionalization of the carbon nanostructures was performed using a direct amination technique. Considering the need for requisite functional groups to covalently attach bioreceptors on the carbon surface for biomolecule detection, different oxidation techniques were compared to study the types of carbon-oxygen groups formed on the surface and their percentages with respect to different oxidation pretreatment times. Finally, a label-free detection strategy using a signaling aptamer/protein binding complex for platelet-derived growth factor oncoprotein detection on a functionalized three-dimensional carbon microarray platform was demonstrated. The sensor showed a near-linear relationship between the relative fluorescence difference and protein concentration, even in the sub-nanomolar range, with an excellent detection limit of 5 pmol.
Abstract:
The local area network (LAN) interconnecting computer systems and software can make a significant contribution to the hospitality industry. The author discusses the advantages and disadvantages of such systems.
Abstract:
Many systems and applications continuously produce events. These events are used to record the status of the systems and trace their behaviors. By examining these events, system administrators can check for potential problems. If the temporal dynamics of the systems are further investigated, underlying patterns can be discovered, and the uncovered knowledge can be leveraged to predict future system behaviors or to mitigate potential risks. Moreover, system administrators can utilize the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite the recent advances in data mining, its application to system event mining is still at a rudimentary stage: most works still focus on episode mining or frequent pattern discovery. These methods are unable to provide a brief yet comprehensible summary revealing the valuable information from a high-level perspective, and they provide little actionable knowledge to help system administrators better manage the systems. To make better use of the recorded events, more practical techniques are required. From the perspective of data mining, three correlated directions are considered helpful for system management: (1) provide concise yet comprehensive summaries of the running status of the systems; (2) make the systems more intelligent and autonomous; (3) effectively detect abnormal behaviors of the systems. Due to the richness of the event logs, all these directions can be pursued in a data-driven manner, enhancing the robustness of the systems and approaching the goal of autonomous management. This dissertation focuses on the foregoing directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficiency of the corresponding solutions.
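As a concrete illustration of the streaming anomaly detection direction, a minimal sketch follows using Welford's online mean/variance and a z-score rule over per-interval event counts; the threshold, warm-up length, and event stream are invented and are not the dissertation's algorithm.

```python
# Sketch of a streaming anomaly detector over per-interval event counts,
# using Welford's online mean/variance and a z-score rule. The threshold,
# warm-up length, and event stream are illustrative only.
class StreamingAnomalyDetector:
    def __init__(self, z_threshold=3.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def observe(self, x):
        """Update running statistics; return True if x looks anomalous."""
        anomalous = False
        if self.n >= 10:  # wait for a minimal history before flagging
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) / std > self.z_threshold
        # Welford update of mean and sum of squared deviations.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

detector = StreamingAnomalyDetector()
event_counts = [12, 11, 13, 12, 10, 14, 12, 11, 13, 12, 11, 95]  # burst at end
flags = [detector.observe(c) for c in event_counts]
print(flags)  # only the burst of 95 events should be flagged
```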