931 results for Rule-based techniques
Abstract:
The increasing interconnection of information and communication systems leads to a further rise in complexity and thus to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to provide adequate protection against intrusions into IT infrastructures. Intrusion Detection Systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behavior and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and in the development of an adaptive detection model that operates in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier, comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a normal network behavior model (NNB) and an update model. Within OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analyzed and converted into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied in depth and substantially extended. Several approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growth topology is increased by novel approaches for initializing the weight vectors and by strengthening the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously up to date. In addition, the main task of the NNB model is to further examine the unknown connections detected by the EGHSOM and to verify whether they are in fact normal. However, because of the concept drift phenomenon, network traffic data change constantly, which leads to non-stationary network data in real time. This phenomenon is handled by the update model. The EGHSOM model can effectively detect new anomalies, and the NNB model adapts optimally to changes in the network data. In the experimental evaluation the framework showed promising results. In the first experiment the framework was evaluated in offline mode. OptiFilter was assessed with offline, synthetic and realistic data. The adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment the framework was deployed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them accurately. A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all of them. This can be attributed to the following key points: the processing of the collected network data, the achievement of the best performance (e.g. overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
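The abstract gives no implementation detail, but the classification-confidence margin threshold can be illustrated with a minimal, hedged Python sketch: a connection vector is mapped to its best-matching map unit, and if the quantization error exceeds the margin learned for that unit the connection is treated as unknown and handed on for further checks. The names codebook, unit_labels and margins are assumptions made for the illustration, not the dissertation's actual interfaces.

import numpy as np

def classify_connection(x, codebook, unit_labels, margins):
    # codebook:    (n_units, n_features) SOM weight vectors
    # unit_labels: per-unit class label assigned during training
    # margins:     per-unit confidence-margin threshold on the quantization error
    dists = np.linalg.norm(codebook - x, axis=1)   # distance to every map unit
    bmu = int(np.argmin(dists))                    # best-matching unit
    if dists[bmu] > margins[bmu]:
        return "unknown"                           # outside the margin: candidate anomaly
    return unit_labels[bmu]                        # "normal" or a known attack class

# toy usage with random data
rng = np.random.default_rng(0)
codebook = rng.normal(size=(9, 4))
unit_labels = ["normal"] * 5 + ["attack"] * 4
margins = np.full(9, 1.5)
print(classify_connection(rng.normal(size=4), codebook, unit_labels, margins))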
Abstract:
This paper presents an image-based rendering system using algebraic relations between different views of an object. The system uses pictures of an object taken from known positions. Given three such images, it can generate "virtual" ones as the object would look from any position near those from which the input images were taken. The extrapolation from the example images can be up to about 60 degrees of rotation. The system is based on the trilinear constraints that bind any three views of an object. As a side result, we propose two new methods for camera calibration. We developed and used one of them. We implemented the system and tested it on real images of objects and faces. We also show experimentally that even when only two images taken from unknown positions are given, the system can be used to render the object from other viewpoints, as long as we have a good estimate of the internal parameters of the camera used and we are able to find good correspondences between the example images. In addition, we present the relation between these algebraic constraints and a factorization method for shape and motion estimation. As a result, we propose a method for motion estimation in the special case of orthographic projection.
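The trilinear constraints admit a compact point-transfer form: for a trifocal tensor with components T_i^{jk}, a point x in the first view and any non-degenerate line l' through the matching point x' in the second view determine the corresponding point in the third view via x''^k = x^i l'_j T_i^{jk}. The following short numpy sketch only illustrates that contraction; the tensor values and the particular choice of l' are assumptions for the illustration, not material from the paper.

import numpy as np

def transfer_point(T, x1, x2):
    # T:  trifocal tensor of shape (3, 3, 3), indexed as T[i, j, k]
    # x1: homogeneous point in view 1; x2: matching homogeneous point in view 2
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    l2 = np.array([x2[2], 0.0, -x2[0]])        # a line through x2 (assumed non-degenerate)
    x3 = np.einsum('i,j,ijk->k', x1, l2, T)    # x3^k = x1^i * l2_j * T[i, j, k]
    return x3 / x3[2]                          # normalize the homogeneous result

# toy usage with a random (not geometrically valid) tensor, only to show the shapes
rng = np.random.default_rng(1)
T = rng.normal(size=(3, 3, 3))
print(transfer_point(T, [100.0, 50.0, 1.0], [110.0, 48.0, 1.0]))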
Abstract:
We apply modern synchrotron-based structural techniques to the study of serine adsorbed on the pure and Au-modified intrinsically chiral Cu{531} surface. XPS and NEXAFS data in combination with DFT show that on the pure surface both enantiomers adsorb in μ4 geometries (with deprotonated β-OH groups) at low coverage and in μ3 geometries at saturation coverage. Significantly larger enantiomeric differences are seen for the μ4 geometries, which involve substrate bonds of three side groups of the chiral center, i.e. a three-point interaction. The μ3 adsorption geometry, where only the carboxylate and amino groups form substrate bonds, leads to smaller but still significant enantiomeric differences, both in the geometry and in the decomposition behavior. When Cu{531} is modified by the deposition of 1 and 2 ML Au, the orientations of serine at saturation coverage are significantly different from those on the clean surface. In all cases, however, a μ3 bond coordination involving different numbers of Au atoms is found at saturation, which leads to relatively small enantiomeric differences.
Abstract:
Government targets for CO2 reductions are being progressively tightened; the Climate Change Act set the UK target as an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and model technological changes. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals’ behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
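As a purely illustrative companion to the agent-based modelling recommendation (the adoption rule, the payback threshold and the peer-influence term below are assumptions, not content from the paper), a minimal ABM could give each household agent a simple rule combining perceived payback with neighbour adoption:

import random

class Household:
    # Toy agent: adopts a low-carbon technology under a simple behavioural rule.
    def __init__(self, payback_tolerance_years):
        self.payback_tolerance = payback_tolerance_years
        self.adopted = False

    def step(self, payback_years, neighbour_adoption_rate):
        # Illustrative rule: adopt if the payback looks acceptable, with
        # acceptance loosened as more neighbours have already adopted.
        effective_tolerance = self.payback_tolerance * (1 + neighbour_adoption_rate)
        if not self.adopted and payback_years <= effective_tolerance:
            self.adopted = True

def run(n_agents=1000, years=20, payback_years=8.0):
    agents = [Household(random.uniform(2, 10)) for _ in range(n_agents)]
    for _ in range(years):
        rate = sum(a.adopted for a in agents) / n_agents
        for a in agents:
            a.step(payback_years, rate)
    return sum(a.adopted for a in agents) / n_agents

random.seed(42)
print(f"adoption after 20 years: {run():.1%}")

Calibrating such rules against actual choices, stated preferences and opinion research is exactly the behavioural input the paper argues is currently missing.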
Abstract:
This paper considers the use of Association Rule Mining (ARM) and our proposed Transaction-based Rule Change Mining (TRCM) to identify the rule types present in tweets’ hashtags over a specific consecutive period of time and their linkage to real-life occurrences. Our novel algorithm is termed TRCM-RTI, in reference to Rule Type Identification. We created Time Frame Windows (TFWs) to detect evolvement statuses and calculate the lifespan of hashtags in online tweets. We link RTI to real-life events by monitoring and recording rule evolvement patterns in TFWs on the Twitter network.
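The abstract does not spell out the TFW bookkeeping, but the lifespan calculation can be sketched as follows (the window representation and function name are assumptions for illustration): each Time Frame Window is treated as the set of hashtags observed in it, and a hashtag's lifespan is the number of windows between its first and last appearance.

def hashtag_lifespans(windows):
    # windows: ordered list of Time Frame Windows, each a set of hashtags
    # returns: hashtag -> (first_window, last_window, lifespan_in_windows)
    first, last = {}, {}
    for idx, tags in enumerate(windows):
        for tag in tags:
            first.setdefault(tag, idx)
            last[tag] = idx
    return {t: (first[t], last[t], last[t] - first[t] + 1) for t in first}

# toy usage: three consecutive windows of observed hashtags
tfws = [{"#nfl", "#oscars"}, {"#nfl"}, {"#nfl", "#election"}]
print(hashtag_lifespans(tfws))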
Abstract:
This thesis presents the study and development of fault-tolerant techniques for programmable architectures, the well-known Field Programmable Gate Arrays (FPGAs), customizable by SRAM. FPGAs are becoming more valuable for space applications because of their high density, high performance, reduced development cost and re-programmability. In particular, SRAM-based FPGAs are very valuable for remote missions because they can be reprogrammed by the user as many times as necessary in a very short period. SRAM-based FPGAs and micro-controllers represent a wide range of components in space applications, and as a result are the focus of this work, more specifically the Virtex® family from Xilinx and the architecture of the 8051 micro-controller from Intel. Triple Modular Redundancy (TMR) with voters is a common high-level technique to protect ASICs against single event upsets (SEUs), and it can also be applied to FPGAs. The TMR technique was first tested in the Virtex® FPGA architecture by using a small design based on counters. Faults were injected in all sensitive parts of the FPGA, and a detailed analysis of the effect of a fault in a TMR design synthesized on the Virtex® platform was performed. Results from fault injection and from a radiation ground test facility showed the efficiency of TMR for the case-study circuit. Although TMR showed high reliability, the technique has some limitations, such as area overhead, three times more input and output pins and, consequently, a significant increase in power dissipation. Aiming to reduce TMR costs and improve reliability, an innovative high-level technique for designing fault-tolerant systems in SRAM-based FPGAs was developed, without modification of the FPGA architecture. This technique combines time and hardware redundancy to reduce overhead and to ensure reliability. It is based on duplication with comparison and concurrent error detection. The new technique proposed in this work was specifically developed for FPGAs to cope with transient faults in the user combinational and sequential logic, while also reducing pin count, area and power dissipation. The methodology was validated by fault injection experiments on an emulation board. The thesis presents comparative results in fault coverage, area and performance between the discussed techniques.
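To make the TMR idea concrete (this is a generic illustration of majority voting, not code from the thesis, where the voters are realized in the FPGA fabric), a bitwise 2-out-of-3 voter over three redundant module outputs can be sketched as:

def tmr_vote(a: int, b: int, c: int) -> int:
    # Bitwise 2-out-of-3 majority of three redundant module outputs:
    # a single event upset in any one copy is masked by the other two.
    return (a & b) | (a & c) | (b & c)

# toy usage: copy b suffers a single event upset in bit 2
golden = 0b1010
upset = golden ^ 0b0100
assert tmr_vote(golden, upset, golden) == golden
print(bin(tmr_vote(golden, upset, golden)))

The duplication-with-comparison approach described in the abstract replaces the third copy with comparison and concurrent error detection, which is how it reduces pin count, area and power dissipation.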
Abstract:
Erbium-activated silica-based planar waveguides were prepared by three different technological routes: RF sputtering, sol-gel and ion exchange. Various preparation parameters were varied in order to optimize the waveguides for operation in the NIR region. Particular attention was devoted to minimizing the losses and increasing the luminescence efficiency of the metastable ⁴I₁₃/₂ state of the Er³⁺ ion. Waveguide properties were determined by m-line spectroscopy and loss measurements. Waveguide Raman and luminescence spectroscopy were used to obtain information about the structure of the prepared films and about the dynamical processes related to the luminescence of the Er³⁺ ions.
Abstract:
An amperometric biosensor based on cholinesterase (ChE) has been used for the determination of selected carbamate insecticides in vegetable samples. The linear range of the biosensor for the N-methylcarbamates (aldicarb, carbaryl, carbofuran, methomyl and propoxur) varied from 5 × 10⁻⁵ to 50 mg kg⁻¹. Limits of detection were calculated on the basis that the ChE enzymes were 10% inhibited and varied, depending on the combination of ChE (as acetyl- or butyrylcholinesterase) vs. inhibitor (pesticide), from 1 × 10⁻⁴ to 3.5 mg kg⁻¹. The biosensor-based carbamate determination was compared to liquid chromatography/UV methods. Three vegetable samples were spiked with carbofuran and propoxur at 125 µg kg⁻¹ and processed by conventional procedures. Good correlations were observed for carbofuran in the vegetable extracts (79, 96 and 91% recoveries for potato, carrot and sweet pepper, respectively), whereas for propoxur unsatisfactory results were obtained. Potato and carrot samples were spiked with 10, 50 and 125 µg kg⁻¹ carbofuran, followed by direct determination with the amperometric biosensor. The fortified samples resulted in very high inhibition values, and recoveries were 28, 34 and 99% for potato, and 140, 90 and 101% for carrot, respectively, at these three fortification levels. (C) 1998 Elsevier B.V.
Abstract:
A major challenge in cancer radiotherapy is to deliver a lethal dose of radiation to the target volume while minimizing damage to the surrounding normal tissue. We have proposed a model of how treatment efficacy might be improved by interfering with biological responses to DNA damage, using exogenous electric fields as a strategy to drastically reduce radiation doses in cancer therapy. This approach is demonstrated at this laboratory through case studies with prokaryotic (bacteria) and eukaryotic (yeast) cells, in which cell-killing rates induced by both gamma radiation and exogenous electric fields were measured. It was found that when cells exposed to gamma radiation are immediately subjected to a weak electric field, cell death increases by more than an order of magnitude compared to the effect of radiation alone. This finding suggests, although does not prove, that DNA damage sites are reached and recognized by means of long-range electric DNA-protein interaction, and that exogenous electric fields could destructively interfere with this process. As a consequence, DNA repair is prevented, leading to massive cell death. Here we propose the use of this new technique for the design and construction of novel radiotherapy facilities associated with linac-generated gamma beams under controlled conditions of dose and beam intensity.
Abstract:
This paper presents the work in progress of an on-demand software deployment system based on application virtualization concepts, which eliminates the need for software installation and configuration on each computer. Several mechanisms were created, such as mapping of the resources used by the application to improve software distribution and startup; a virtualization middleware which provides all resources needed for the software execution; an asynchronous P2P transport used to optimize distribution over the network; and off-line support, whereby the user can execute the application even when the server is not available or the client is off the network. © Springer-Verlag Berlin Heidelberg 2010.
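As a rough illustration of the off-line support described above (the cache layout, server URL and function name are assumptions for this sketch, not the paper's design), a launcher can fall back to a locally cached application image whenever the deployment server is unreachable:

import pathlib
import urllib.error
import urllib.request

CACHE_DIR = pathlib.Path.home() / ".appcache"      # assumed local cache location
SERVER = "http://deploy.example.org/apps"          # hypothetical deployment server

def fetch_app_image(app_name: str) -> pathlib.Path:
    # Return a runnable application image, preferring the server but
    # falling back to the local cache when off-line (illustrative only).
    CACHE_DIR.mkdir(exist_ok=True)
    cached = CACHE_DIR / (app_name + ".img")
    try:
        with urllib.request.urlopen(SERVER + "/" + app_name + ".img", timeout=5) as resp:
            cached.write_bytes(resp.read())         # refresh the cache while on-line
    except (urllib.error.URLError, OSError):
        if not cached.exists():
            raise RuntimeError(app_name + " is not cached and the server is unreachable")
    return cached                                   # handed over to the virtualization middleware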
Abstract:
In businesses such as the software industry, which uses knowledge as a resource, activities are knowledge-intensive, requiring constant adoption of new technologies and practices. Another feature of this environment is that the industry is particularly susceptible to failure; with this in mind, the objective of this research is to analyze the integration of Knowledge Management techniques into the activity of risk management as it applies to software development projects of micro and small Brazilian incubated technology-based firms. The research method chosen was the multiple case study. The main risk factor for managers and developers is that scope or goals are often unclear or misinterpreted. For risk management, firms have found that Knowledge Management techniques of the combination conversion mode would be the most applicable; however, those most commonly used correspond to the internalization conversion mode. © 2013 Elsevier Ltd. APM and IPMA.
Abstract:
Let m and n be integers greater than 1. Given lattices A and B of dimensions m and n, respectively, a technique is introduced for constructing from them a lattice of dimension m+n-1. Furthermore, if A and B possess bases satisfying certain conditions, then a second technique yields a lattice of dimension m+n-2. The relevant parameters of the new lattices are given in terms of the respective parameters of A, B, and a lattice C isometric to a sublattice of A and B. Denser sphere packings than previously known ones are obtained in dimensions 52, 68, 84, 248, 520, and 4098. © 2012 Elsevier Inc. All rights reserved.
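For context on how "denser" is judged (these are the standard definitions for lattice sphere packings, not anything specific to the constructions in the paper): if an n-dimensional lattice L has minimum distance d(L) and covolume vol(R^n / L), its packing radius, packing density and center density are

\rho = \tfrac{1}{2}\, d(L), \qquad
\Delta(L) = \frac{V_n\, \rho^{\,n}}{\operatorname{vol}(\mathbb{R}^{n}/L)}, \qquad
\delta(L) = \frac{\rho^{\,n}}{\operatorname{vol}(\mathbb{R}^{n}/L)},

where V_n is the volume of the unit n-ball. Comparing the center density of a new construction with the best previously known value in the same dimension is what underlies claims of denser packings in dimensions such as 52, 68, 84, 248, 520 and 4098.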