912 results for Missile Attacks
Abstract:
The Internet has today become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, so that going back is beyond imagination. Transfer of critical information is also being carried out through the Internet. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field of crackers and intruders. The whole development in this area can become null and void if fool-proof security of the data is not ensured, without a chance of it being adulterated. It is hence a challenge for the professional community to develop systems that ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services like confidentiality, integrity and authentication of the data sent through the Internet. There are several such popular and dependable techniques which have been in wide use for quite a long time. This long-term exposure makes them vulnerable to successful or near-successful attacks. Hence it is the need of the hour to develop new algorithms with better security. Studies were therefore conducted on various types of algorithms being used in this area, focusing on the properties that impart security. Using the perception derived from these studies, new algorithms were designed. The performance of these algorithms was then studied, followed by necessary modifications, to yield an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320. Detailed analysis and comparison with existing popular schemes were also carried out to establish their security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services like confidentiality and authentication in SSL/TLS. But recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms were carried out; it has been observed that they are dependable alternatives.
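As a hedged illustration of the role a message authentication code plays in this setting, the sketch below uses Python's standard hmac module as a stand-in; the MAJE4 and MACJER-320 algorithms themselves are not specified in this abstract, and the key and message are made up for the example.

```python
# Minimal sketch of message authentication with a MAC, using Python's
# standard hmac module as a stand-in -- the MACJER-320 algorithm proposed
# in the abstract is not publicly specified here.
import hmac
import hashlib

key = b"shared-secret-key"               # hypothetical shared key
message = b"transfer 100 to account 42"  # hypothetical message

# Sender computes a tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the tag and compares in constant time;
# any tampering with the message changes the tag.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
if hmac.compare_digest(tag, expected):
    print("message authentic")
else:
    print("message tampered or wrong key")
```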
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems or anomalies arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent anomalous situations arising from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive, depending mainly on static code checking to take care of buffer overflow attacks. For protection, Stack Guards and Heap Guards are also used in wide varieties. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that the Bayesian network on frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
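A toy sketch of the frequency-based idea follows, under stated assumptions: the profile format, the absolute-deviation score and the example traces are illustrative stand-ins, not the dissertation's actual sequence-set representation or Bayesian model.

```python
# Toy sketch of frequency-based system-call anomaly scoring. The profile
# format and deviation measure below are illustrative assumptions.
from collections import Counter

def frequency_profile(trace):
    """Relative frequency of each system call in a trace."""
    counts = Counter(trace)
    total = len(trace)
    return {call: n / total for call, n in counts.items()}

def anomaly_score(profile, trace):
    """Sum of absolute deviations between observed and profiled frequencies."""
    observed = frequency_profile(trace)
    calls = set(profile) | set(observed)
    return sum(abs(profile.get(c, 0.0) - observed.get(c, 0.0)) for c in calls)

normal = frequency_profile(["open", "read", "read", "write", "close"] * 50)
attack = ["open"] + ["execve", "mprotect"] * 40  # e.g. shellcode after a buffer overflow
print(anomaly_score(normal, attack))             # a high score flags the deviation
```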
Abstract:
Concrete is a universal material in the construction industry. With natural resources like sand and aggregate fast depleting, it is time to look for alternative materials to substitute them in the process of making concrete. There are instances, such as exposure to solar radiation, fire, furnaces and nuclear reactor vessels, and special applications like missile launching pads, where concrete is exposed to temperature variations. In this research work, an attempt has been made to understand the behaviour of concrete when weathered laterite aggregate is used in both conventional and self-compacting normal strength concrete. The study has been extended to the thermal behaviour of both types of laterised concrete and to check their suitability as a fire protection material. A systematic study of laterised concrete considering parameters like the source of laterite aggregate, grades of Ordinary Portland Cement (OPC) and types of supplementary cementitious materials (fly ash and GGBFS) has been carried out to arrive at a feasible combination of the various ingredients in laterised concrete. A mix design methodology has been proposed for making normal strength laterised self-compacting concrete based on trial mixes, and the same has also been validated. The physical and mechanical properties of laterised concretes have been studied with respect to different variables like exposure temperature (200°C, 400°C and 600°C) and cooling environment (air cooled and water cooled). The behaviour of ferrocement elements with laterised self-compacting concrete has also been studied by varying the cover to mesh reinforcement (10 mm to 50 mm at intervals of 10 mm), exposure temperature and cooling environment.
Abstract:
Biometrics deals with the physiological and behavioural characteristics of an individual to establish identity. Fingerprint-based authentication is the most advanced biometric authentication technology. The minutiae-based fingerprint identification method offers a reasonable identification rate. The minutiae feature map consists of about 70-100 minutia points, and matching accuracy drops as the size of the database grows. Hence it is inevitable to make the fingerprint feature code as small as possible so that identification becomes much easier. In this research, a novel global-singularity-based fingerprint representation is proposed. The fingerprint baseline, which is the line between the distal and intermediate phalangeal joint lines in the fingerprint, is taken as the reference line. A polygon is formed from the singularities and the fingerprint baseline. The feature vectors are the polygonal angles, sides, area and type, and the ridge counts between the singularities. A 100% recognition rate is achieved with this method. The method is compared with the conventional minutiae-based recognition method in terms of computation time, receiver operating characteristics (ROC) and feature vector length. Speech is a behavioural biometric modality and can be used for speaker identification. In this work, MFCCs of text-dependent speech are computed and clustered using the k-means algorithm. A backpropagation-based artificial neural network is trained to identify the clustered speech code. The performance of the neural network classifier is compared with a VQ-based Euclidean minimum-distance classifier. Biometric systems that use a single modality are usually affected by problems like noisy sensor data, non-universality and/or lack of distinctiveness of the biometric trait, unacceptable error rates, and spoof attacks. Multi-finger feature-level fusion based fingerprint recognition is developed and its performance is measured in terms of the ROC curve. Score-level fusion of the fingerprint and speech based recognition systems is carried out, and 100% accuracy is achieved for a considerable range of matching thresholds.
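A minimal sketch of score-level fusion follows, assuming a common min-max normalisation and weighted-sum rule; the score ranges, weights and acceptance threshold are illustrative assumptions, and the thesis's exact fusion rule may differ.

```python
# Illustrative score-level fusion of a fingerprint matcher and a speaker
# matcher. Ranges, weights and threshold are hypothetical.

def min_max_normalize(score, lo, hi):
    """Map a raw matcher score into [0, 1] given its observed range."""
    return (score - lo) / (hi - lo)

def fused_score(fp_score, sp_score, w_fp=0.5, w_sp=0.5):
    fp = min_max_normalize(fp_score, lo=0.0, hi=100.0)   # hypothetical ranges
    sp = min_max_normalize(sp_score, lo=-10.0, hi=10.0)
    return w_fp * fp + w_sp * sp

THRESHOLD = 0.6   # would be tuned on validation data in practice
score = fused_score(fp_score=82.0, sp_score=4.5)
print("accept" if score >= THRESHOLD else "reject", round(score, 3))
```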
Abstract:
Extensive use of the Internet, coupled with the marvelous growth in e-commerce and m-commerce, has created a huge demand for information security. The Secure Socket Layer (SSL) protocol is the most widely used security protocol on the Internet that meets this demand. It provides protection against eavesdropping, tampering and forgery. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services like confidentiality and authentication in SSL. But recent attacks against RC4 and HMAC have shaken confidence in these algorithms. Hence two novel cryptographic algorithms, MAJE4 and MACJER-320, have been proposed as substitutes for them. The focus of this work is to demonstrate the performance of these new algorithms and to suggest them as dependable alternatives for the security services needed in SSL. The performance evaluation has been carried out by practical implementation.
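A hedged sketch of what evaluation by practical implementation can look like: the encrypt() function below is a placeholder repeating-key XOR, not MAJE4 (whose specification is not given in this abstract); only the timing harness is the point.

```python
# Sketch of throughput measurement via practical implementation.
# encrypt() is a stand-in, not a real stream cipher.
import os
import time

def encrypt(key, data):
    """Placeholder stream cipher: XOR with a repeating key (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
plaintext = os.urandom(1024 * 1024)   # 1 MiB test buffer

start = time.perf_counter()
ciphertext = encrypt(key, plaintext)
elapsed = time.perf_counter() - start

print(f"{len(plaintext) / elapsed / 1e6:.1f} MB/s")
```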
Abstract:
In today's complicated computing environment, managing data has become the primary concern of all industries. Information security is the greatest challenge, and it has become essential to secure enterprise system resources like databases and operating systems from attacks by unknown outsiders. Our approach plays a major role in detecting and managing vulnerabilities in complex computing systems. It allows enterprises to assess the two primary tiers through a single interface, as a vulnerability scanner tool that provides a secure system compatible with the security compliance requirements of the industry. It provides an overall view of the vulnerabilities in the database by automatically scanning them with minimum overhead, and gives a detailed view of the risks involved and their corresponding ratings. Based on these priorities, an appropriate mitigation process can be implemented to ensure a secure system. The results show that our approach can effectively optimize the time and cost involved when compared to existing systems.
Abstract:
This article is on the life and works of Dr. Kalam as a student, a teacher, a team leader, the President of India and, above all, a great visionary. It is also expected to be a sequel to the one entitled ‘A meeting with the missile man’.
Abstract:
With this document, we provide a compilation of in-depth discussions of some of the most current security issues in distributed systems. The six contributions were collected and presented at the 1st Kassel Student Workshop on Security in Distributed Systems (KaSWoSDS’08). We are pleased to present a collection of papers that not only shed light on the theoretical aspects of their topics, but are also accompanied by elaborate practical examples. In Chapter 1, Stephan Opfer discusses viruses, one of the oldest threats to system security. For years there has been an arms race between virus producers and anti-virus software providers, with no end in sight. Stefan Triller demonstrates in Chapter 2 how malicious code can be injected into a target process using a buffer overflow. Websites usually store their data and user information in databases. As with buffer overflows, unwary programmers leave open the possibility of SQL injection attacks targeting such databases; Stephan Scheuermann gives us a deeper insight into the mechanisms behind such attacks in Chapter 3. Cross-site scripting (XSS) is a method of inserting malicious code into websites viewed by other users; Michael Blumenstein explains this issue in Chapter 4. While code can be injected into other websites via XSS attacks in order to spy on the data of Internet users, spoofing subsumes all methods that directly involve taking on a false identity. In Chapter 5, Till Amma shows us different ways this can be done and how it is prevented. Last but not least, cryptographic methods are used to encode confidential data in such a way that even if it falls into the wrong hands, the culprits cannot decode it. Over the centuries, many different ciphers have been developed, applied, and finally broken. Ilhan Glogic sketches this history in Chapter 6.
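As a hedged companion to the SQL injection discussion, the following minimal sketch contrasts a vulnerable string-concatenated query with the standard parameterized fix, using Python's built-in sqlite3 module; the table and payload are made up for the example.

```python
# Minimal illustration of an SQL injection flaw and its standard fix,
# using Python's built-in sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"   # classic injection payload

# Vulnerable: user input is concatenated into the SQL string, so the
# payload rewrites the WHERE clause and leaks every row.
rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'").fetchall()
print("vulnerable query returned:", rows)

# Safe: a parameterized query treats the input as a value, never as SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print("parameterized query returned:", rows)
```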
Abstract:
In recent times, increased emphasis has been placed on diversifying the types of trees used to shade cacao (Theobroma cacao L.) and to achieve additional services. Agroforestry systems that include profitable and native timber trees are a viable alternative, but it is necessary to understand the growth characteristics of these species under different environmental conditions. Thus, timber tree species selection should be based on plant responses to biotic and abiotic factors. The aims of this study were (1) to evaluate growth rates and leaf area indices of four commercial timber species, Cordia thaisiana, Cedrela odorata, Swietenia macrophylla and Tabebuia rosea, in conjunction with the incidence of insect attacks, and (2) to compare the growth rates of four Venezuelan Criollo cacao cultivars planted under the shade of these four timber species during the first 36 months after establishment. The parameters monitored in the timber trees were survival rate, growth rate expressed as height and diameter at breast height, and leaf area index; in the four cacao cultivars, height and basal diameter. C. thaisiana and C. odorata had the fastest growth and the highest survival rates. Growth rates of the timber trees depend on their susceptibility to insect attacks as well as on total leaf area. All cacao cultivars showed higher growth rates under the shade of C. odorata. The growth rates of the timber trees and cacao cultivars suggest that combinations of cacao and timber trees are a feasible agroforestry strategy in Venezuela.
Abstract:
The increasing interconnection of information and communication systems leads to ever greater complexity and thus to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer protection against intrusions into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyse information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognise new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and in the development of an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behaviour (NNB) and an update model. In OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are analysed further and converted into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. Several approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growth topology is increased by novel approaches to the initialisation of the weight vectors and by strengthening the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously up to date. Furthermore, the main task of the NNB model is to examine further the unknown connections detected by the EGHSOM and to check whether they are normal. However, network traffic changes constantly because of the concept drift phenomenon, producing non-stationary network data in real time. This phenomenon is better controlled by the update model. The EGHSOM model detects new anomalies effectively, and the NNB model adapts optimally to changes in the network data. In the experimental evaluation the framework showed promising results. In the first experiment the framework was evaluated in offline mode: OptiFilter was evaluated with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross validation to estimate its accuracy.
In the second experiment the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely. A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all of them. This can be attributed to the following key points: processing of the collected network data, achievement of the best performance (such as overall accuracy), detection of unknown connections, and development of a real-time intrusion detection model.
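A toy sketch of the classification-confidence margin idea described above, under illustrative assumptions: the prototype vectors, Euclidean distance and margin threshold below are stand-ins, not the trained EGHSOM model from the dissertation.

```python
# Toy sketch: a connection vector is matched against trained prototype
# vectors and flagged "unknown" when even the best-matching unit is
# farther away than a confidence margin threshold.
import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

prototypes = {                  # hypothetical trained units: label -> weight vector
    "normal": [0.2, 0.1, 0.0],
    "dos":    [0.9, 0.8, 0.7],
}
MARGIN = 0.5                    # confidence threshold, tuned in validation

def classify(vector):
    label, dist = min(((l, distance(vector, w)) for l, w in prototypes.items()),
                      key=lambda t: t[1])
    # Low-confidence matches are handed off for further checking,
    # analogous to passing unknown connections to the NNB model.
    return label if dist <= MARGIN else "unknown"

print(classify([0.25, 0.15, 0.05]))   # -> normal
print(classify([0.5, 0.5, 0.5]))      # -> unknown (low confidence)
```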
Abstract:
In algebraic cryptanalysis, modern cryptosystems are represented as polynomial, non-linear systems of equations. Solving such systems is NP-hard; there is thus no algorithm that solves an arbitrary non-linear system of equations in polynomial time. Nevertheless, the systems of equations generated from modern cryptosystems have a great deal of structure: with suitable modelling they are quadratic and sparse, and thus far from arbitrary. Special algorithms exist for finding solutions of such systems. One example is the ElimLin algorithm, which uses linear equations to simplify the system iteratively. Building on this algorithm, the dissertation presents a new solver for quadratic, sparse systems of equations and uses it to attack two symmetric cryptosystems. The techniques used to model the ciphers are of decisive importance here, so new techniques for representing cryptosystems are developed. The idea for the model comes from cube attacks, which are particularly effective against stream ciphers. The thesis classifies the different variants of such attacks and presents possible extensions. The resulting model, however, can also be extended successfully to block ciphers and to other scenarios, requiring only minor changes to the model.
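A hedged toy sketch of the ElimLin idea over GF(2) follows: linear equations are found, solved for one variable, and substituted into the rest of the system, iteratively. The set-of-monomials representation and the three-equation example are illustrative assumptions, not the dissertation's solver.

```python
# Toy ElimLin sketch over GF(2). A polynomial is a set of monomials and a
# monomial a frozenset of variable names (frozenset() is the constant 1).
# Real systems derived from ciphers have thousands of such quadratic,
# sparse equations; this example is made up for illustration.

def substitute(poly, var, expr):
    """Replace var by the polynomial expr inside poly, over GF(2)."""
    result = set()
    for mono in poly:
        if var in mono:
            rest = mono - {var}
            for e in expr:
                result ^= {rest | e}   # symmetric difference = XOR of coefficients
        else:
            result ^= {mono}
    return result

def elimlin(system):
    """Iteratively eliminate variables using linear equations."""
    system = [set(p) for p in system]
    solved = {}
    while True:
        linear = next((p for p in system
                       if p and all(len(m) <= 1 for m in p)
                       and any(len(m) == 1 for m in p)), None)
        if linear is None:
            return system, solved
        mono = next(m for m in linear if len(m) == 1)
        var = next(iter(mono))
        expr = linear ^ {mono}          # rearrange: var = rest of the equation
        system.remove(linear)
        system = [substitute(p, var, expr) for p in system]
        solved = {v: substitute(e, var, expr) for v, e in solved.items()}
        solved[var] = expr

# System (each polynomial equals 0):  x*y + z = 0,  x + 1 = 0,  y + 1 = 0
system = [
    {frozenset({"x", "y"}), frozenset({"z"})},
    {frozenset({"x"}), frozenset()},
    {frozenset({"y"}), frozenset()},
]
remaining, solved = elimlin(system)
for v, e in sorted(solved.items()):
    print(v, "=", " + ".join(sorted("".join(sorted(m)) or "1" for m in e)) or "0")
```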
Abstract:
Vegetables are a main source of the micro-nutrients that can improve the health status of the malnourished poor of the world. Spinach (Spinacia oleracea L.) is a popular leafy vegetable in many countries and is rich in several important micro-nutrients, so consuming spinach helps to overcome micro-nutrient deficiencies. Pests and pathogens act as major yield constraints in food production. Root-knot nematodes, Meloidogyne species, constitute a large group of highly destructive plant pests, and spinach is found to be highly susceptible to their attacks. Though agricultural production has largely benefited from modern technologies and innovations, some important factors which could minimize yield losses have been neglected by most growers. The pre-plant, or initial, nematode density in the soil is a crucial biotic factor directly responsible for crop losses. Hence, information on pre-plant nematode densities and the corresponding damage is of vital importance for developing successful control procedures to enhance crop production. In the present study, the effect of seven initial densities of M. incognita, i.e. 156, 312, 625, 1250, 2500, 5000 and 10,000 infective juveniles (IJs) per plant (equivalent to 1000 cm³ of soil), on the growth and root infestation of potted spinach plants was determined in a screen house. To ensure high accuracy, root infestation was assessed by the number of galls formed, the percentage galled length of feeder roots and of galled feeder roots, and egg production per plant. Fifty days post-inoculation, shoot length, shoot weight and root length were suppressed even at the lowest IJ density. The pathogenic effect was, however, most pronounced at the highest density, at which reductions of 43%, 46% and 45% in shoot length, shoot weight and root length, respectively, were recorded. The highest reduction in root weight (26%) was detected at the second highest density. The number of galls and the percentage galled length of feeder roots per plant increased significantly and progressively with increasing IJ density, with highest mean values of 432.3 and 54%, respectively. The two shoot growth parameters and root length showed a significant inverse relationship with increasing gall formation. Moreover, shoot and root length were shown to be mutually dependent. Suppression of spinach shoot growth greatly affects the grower's economy; hence, control measures are essentially needed to ensure better production of spinach by reducing the pre-plant density below 0.156 IJs/cm³.
Abstract:
Memory errors are a common cause of incorrect software execution and security vulnerabilities. We have developed two new techniques that help software continue to execute successfully through memory errors: failure-oblivious computing and boundless memory blocks. The foundation of both techniques is a compiler that generates code to check accesses via pointers and detect out-of-bounds accesses. Instead of terminating or throwing an exception, the generated code takes another action that keeps the program executing without memory corruption. Failure-oblivious code simply discards invalid writes and manufactures values to return for invalid reads, enabling the program to continue on its normal execution path. Code that implements boundless memory blocks stores invalid writes away in a hash table to return as the values for corresponding out-of-bounds reads. The net effect is to (conceptually) give each allocated memory block unbounded size and to eliminate out-of-bounds accesses as a programming error. We have implemented both techniques and acquired several widely used open source servers (Apache, Sendmail, Pine, Mutt, and Midnight Commander). With standard compilers, all of these servers are vulnerable to buffer overflow attacks, as documented at security tracking web sites. Both failure-oblivious computing and boundless memory blocks eliminate these security vulnerabilities (as well as other memory errors). Our results show that our compiler enables the servers to execute successfully through buffer overflow attacks and continue to correctly service user requests without security vulnerabilities.
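A conceptual sketch of a boundless memory block follows, assuming nothing beyond the hash-table behaviour stated above; the real technique is a compiler transformation for C programs, and this Python class only mirrors the semantics.

```python
# Conceptual sketch: out-of-bounds writes are redirected to a hash table
# and returned for matching out-of-bounds reads, instead of corrupting
# adjacent memory.

class BoundlessBlock:
    def __init__(self, size):
        self._data = bytearray(size)
        self._overflow = {}            # hash table for out-of-bounds accesses

    def __setitem__(self, index, value):
        if 0 <= index < len(self._data):
            self._data[index] = value
        else:
            self._overflow[index] = value    # stored away, nothing corrupted

    def __getitem__(self, index):
        if 0 <= index < len(self._data):
            return self._data[index]
        return self._overflow.get(index, 0)  # manufactured default for unseen reads

buf = BoundlessBlock(4)
buf[10] = 0xFF           # would smash adjacent memory in unchecked C
print(buf[10], buf[11])  # 255 0 -- the program keeps executing
```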
Abstract:
Migraine is the best known primary headache syndrome. It is a disabling headache with a fluctuating course, with a considerable impact on patients' quality of life and high economic costs, both for patients, since it generally occurs during a person's most productive years, and for the health system. The goal of treatment is the suppression of attacks, so one of its cornerstones is attack prophylaxis. Many drugs have been studied for this purpose, and in recent years botulinum toxin has aroused enthusiasm. The objective of this work was to establish, through a systematic review of the literature, whether botulinum toxin is an effective drug for the prevention of migraine attacks.
Abstract:
The Arab-Israeli conflict is one of the oldest in the world, which is why it is important to understand some of the reasons why it has not been resolved. This study examines how the discourse of Hamas contributes to the prolongation of the conflict. To determine how this discourse exerts its influence, the work is divided into three chapters. The first chapter identifies the characteristic elements of the discourse: anti-Semitism, irredentism and nationalism, which are present in all of Hamas's speeches and in its founding charter. The second part seeks to determine how the discourse becomes reality, made tangible through terrorist attacks by Hamas's armed wing, Ezzedine Al-Qassam. The last chapter establishes how the discourse becomes a decisive factor in the prolongation of the conflict and how it affects both the Israeli and the Palestinian populations. The study also seeks to show how the discourse of Hamas operates at different levels (individual, communal, binational and international). Finally, it establishes how the discourse of Hamas and the life experiences of Israelis and Palestinians create the perfect scenario for the continuation of the conflict.