881 results for flash crowd attack


Relevance:

10.00%

Publisher:

Abstract:

Surveillance networks are typically monitored by a few people viewing several monitors displaying the camera feeds. It is therefore very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as a means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies. To recognise an event, people and objects must be tracked. Tracking also enhances the performance of tasks such as crowd analysis or human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or because the detection routines are unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, making it unnecessary for detection to be carried out separately except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter.
A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Évaluation du Traitement et de l'Interprétation de Séquences vidEO - Evaluation for video understanding) database, and significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, while the tracking system benefits from the particle filter's improved performance under the uncertain conditions arising from occlusion and noise. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either modality individually. Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore to improving security in areas under surveillance.
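The particle-filter mechanics described above (likelihood-weighted hypotheses re-evaluated in each frame, with occasional resampling) can be sketched minimally. The following is an illustrative bootstrap particle filter for a 1-D position, not the SCF itself; the motion model, noise parameters and measurement sequence are invented for the example:

```python
import numpy as np

def particle_filter_step(particles, weights, measurement,
                         motion_std=2.0, meas_std=5.0, rng=None):
    """One predict/update/resample cycle of a bootstrap particle filter
    tracking a 1-D object position (illustrative sketch, not the SCF)."""
    if rng is None:
        rng = np.random.default_rng()
    # Predict: diffuse particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: weight each particle by the likelihood of the measurement.
    weights = weights * np.exp(-0.5 * ((measurement - particles) / meas_std) ** 2)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses below half.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

rng = np.random.default_rng(0)
particles = rng.uniform(0.0, 100.0, 500)       # spread from an initial (manual) detection
weights = np.full(500, 1.0 / 500)
for z in [50.0, 52.0, 55.0, 57.0]:             # hypothetical noisy position measurements
    particles, weights = particle_filter_step(particles, weights, z, rng=rng)
estimate = float(np.sum(particles * weights))  # weighted-mean state estimate
```

The resampling threshold (half the particle count) is a common heuristic; as the abstract notes, such a filter on its own tracks a single mode and does not maintain object identity, which is what the SCF delegates to the surrounding tracking system.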

This research investigates wireless intrusion detection techniques for detecting attacks on IEEE 802.11i Robust Secure Networks (RSNs). Despite using a variety of comprehensive preventative security measures, RSNs remain vulnerable to a number of attacks. The failure of preventative measures to address all RSN vulnerabilities dictates the need for a comprehensive monitoring capability to detect all attacks on RSNs and also to proactively address potential security vulnerabilities by detecting security policy violations in the WLAN. This research proposes novel wireless intrusion detection techniques to address these monitoring requirements, and also studies correlation of the generated alarms across wireless intrusion detection system (WIDS) sensors and across the detection techniques themselves for greater reliability and robustness. The specific outcomes of this research are: • a comprehensive review of the outstanding vulnerabilities and attacks in IEEE 802.11i RSNs; • a comprehensive review of the wireless intrusion detection techniques currently available for detecting attacks on RSNs; • identification of the drawbacks and limitations of the currently available wireless intrusion detection techniques in detecting attacks on RSNs; • development of three novel wireless intrusion detection techniques for detecting RSN attacks and security policy violations in RSNs; • development of algorithms for each novel intrusion detection technique to correlate alarms across the distributed sensors of a WIDS; • development of an algorithm for automatic attack scenario detection using cross-detection-technique correlation; and • development of an algorithm to automatically assign priority to the detected attack scenario using cross-detection-technique correlation.
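The cross-sensor correlation idea behind the later outcomes can be illustrated very simply: alarms reporting the same attack type raised by different sensors within a short time window are merged into one incident. This is a hypothetical sketch, not the thesis's correlation algorithms; the sensor names, attack labels and window size are invented:

```python
from collections import defaultdict

def correlate_alarms(alarms, window=5.0):
    """Group WIDS alarms of the same attack type that occur within a
    time window of each other, regardless of which distributed sensor
    raised them (illustrative sketch)."""
    groups = defaultdict(list)
    for sensor, attack, ts in sorted(alarms, key=lambda a: a[2]):
        bucket = groups[attack]
        # Extend the most recent incident of this type if close in time,
        # otherwise start a new incident.
        if bucket and ts - bucket[-1][-1][2] <= window:
            bucket[-1].append((sensor, attack, ts))
        else:
            bucket.append([(sensor, attack, ts)])
    # Flatten: one correlated incident per group of co-occurring alarms.
    return [g for buckets in groups.values() for g in buckets]

alarms = [("sensor-A", "deauth-flood", 0.0),
          ("sensor-B", "deauth-flood", 1.2),
          ("sensor-A", "rogue-AP", 40.0)]
incidents = correlate_alarms(alarms)
# One deauth-flood incident seen by both sensors, one separate rogue-AP incident.
```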

Science has been under attack in the last thirty years, and recently a number of prominent scientists have been busy fighting back. Here, an argument is presented that the 'science wars' stem from an unreasonably strict adherence to the reductive method on the part of science, but that weakening this stance need not imply a lapse into subjectivity. One possible method for formalising the description of non-separable, contextually dependent complex systems is presented. This is based upon a quantum-like approach.

Background: Migraine is a common cause of disability. Many subjects (30-40%) do not respond to the 5-HT1B/1D agonists (the triptans) commonly used in the treatment of migraine attacks. Calcitonin gene-related peptide (CGRP) receptor antagonism is a new approach to the treatment of migraine attacks. Objectives/methods: This evaluation is of a Phase III clinical trial comparing telcagepant, an orally active CGRP receptor antagonist, with zolmitriptan in subjects during an attack of migraine. Results: Telcagepant 300 mg has a similar efficacy to zolmitriptan in relieving pain, phonophobia, photophobia, and nausea. Telcagepant was better tolerated than zolmitriptan. Conclusions: The initial Phase III clinical trial results with telcagepant are promising, but several further clinical trials are needed to determine the place of telcagepant in the treatment of migraine attacks.

Buffer overflow vulnerabilities continue to prevail and the sophistication of attacks targeting these vulnerabilities is continuously increasing. As a successful attack of this type has the potential to completely compromise the integrity of the targeted host, early detection is vital. This thesis examines generic approaches for detecting executable payload attacks, without prior knowledge of the implementation of the attack, in such a way that new and previously unseen attacks are detectable. Executable payloads are analysed in detail for attacks targeting the Linux and Windows operating systems executing on an Intel IA-32 architecture. The execution flow of attack payloads is analysed and a generic model of execution is examined. A novel classification scheme for executable attack payloads is presented which allows for characterisation of executable payloads and facilitates vulnerability and threat assessments, and intrusion detection capability assessments for intrusion detection systems. An intrusion detection capability assessment may be utilised to determine whether or not a deployed system is able to detect a specific attack and to identify requirements for intrusion detection functionality for the development of new detection methods. Two novel detection methods are presented capable of detecting new and previously unseen executable attack payloads. The detection methods are capable of identifying and enumerating the executable payload’s interactions with the operating system on the targeted host at the time of compromise. The detection methods are further validated using real world data including executable payload attacks.
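The thesis's methods identify a payload's operating-system interactions rather than fixed signatures. As a much simpler, purely illustrative contrast, a classic static check on IA-32 payloads is scanning a buffer for a NOP sled, the run of 0x90 bytes that often precedes the executable payload to widen the attacker's landing zone:

```python
def find_nop_sled(data: bytes, min_len: int = 16):
    """Return (offset, length) of the longest run of IA-32 NOP (0x90)
    bytes in a buffer, or (None, 0) if no run reaches min_len.
    A crude static heuristic, shown only as a contrast to the generic,
    behaviour-based detection described in the abstract."""
    best = (None, 0)
    run_start, run_len = 0, 0
    for i, b in enumerate(data):
        if b == 0x90:
            if run_len == 0:
                run_start = i
            run_len += 1
            if run_len > best[1]:
                best = (run_start, run_len)
        else:
            run_len = 0
    return best if best[1] >= min_len else (None, 0)

# Hypothetical buffer: padding, a 32-byte sled, then an int 0x80 opcode.
buf = b"\x00" * 8 + b"\x90" * 32 + b"\xcd\x80"
offset, length = find_nop_sled(buf)
```

Signature-free methods like those in the thesis exist precisely because such byte-level heuristics are trivially evaded (e.g. by polymorphic sleds).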

Dreaming of Amelia (2009) recounts a small group of HSC students’ final year of high school. Told from multiple perspectives, the novel focuses on shifting senses of self, maturity, and agency as the protagonists move from adolescence to adulthood. The central conflict of the novel results from two ‘bad kids from the bad crowd at bad Brookfield High’ (blurb) transferring to wealthy private school, Ashbury; Amelia and Riley are scholarship students who do not fit with Ashbury’s profile of 'normal student' as it is understood by the school’s students or staff, and their presence in the school community forces many people to reassess their understanding of individual value (or, at least, that’s what the novel claims happens). In the shifting of perceptions, allegiances, and relationships, each of the main characters achieves a stronger sense of their identity, and Dreaming of Amelia is thus firmly located within the tradition of Young Adult (YA) literature, with all its stereotypes of adolescence.

Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for developing more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is analysed in time, give early warning of new vulnerabilities or outbreaks of new automated malicious code, such as worms.
The outcomes of this research include: • Identification of repeated use of attack tools and attack processes through grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm; • Application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors; • Detection of new attacks in low-interaction honeypot traffic through the use of the principal component’s residual space and the square prediction error statistic; • Real-time detection of new attacks using recursive principal component analysis; • A proof of concept implementation for honeypot traffic analysis and real-time monitoring.
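The residual-space idea in the middle outcomes can be sketched as follows: learn a principal subspace from baseline honeypot traffic features, then score new observations by their squared prediction error (SPE) in the residual space, where anomalies that do not fit the learned structure show up. The feature dimensions and data below are synthetic and purely illustrative:

```python
import numpy as np

def spe_anomaly_scores(train, test, n_components=2):
    """Project traffic-feature vectors onto the principal subspace learned
    from baseline data and score each test vector by its squared
    prediction error (SPE) in the residual space (illustrative sketch)."""
    mean = train.mean(axis=0)
    # Principal directions from the SVD of the centred baseline data.
    _, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
    P = Vt[:n_components].T                    # retained principal subspace
    centred = test - mean
    resid = centred - centred @ P @ P.T        # component outside the subspace
    return np.sum(resid ** 2, axis=1)          # SPE per observation

rng = np.random.default_rng(1)
baseline = rng.normal(0.0, 1.0, (200, 5))      # synthetic "normal" feature vectors
probe = np.vstack([rng.normal(0.0, 1.0, (3, 5)),
                   np.full((1, 5), 20.0)])     # one clearly anomalous vector
scores = spe_anomaly_scores(baseline, probe)
# The anomalous vector should yield the largest SPE.
```

In practice an alarm threshold on the SPE is derived from its distribution under normal traffic; the research above additionally makes the decomposition recursive for real-time use.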

Introduction: The core business of public health is to protect and promote health in the population. Public health planning is the means to maximise these aspirations. Health professionals develop plans to address contemporary health priorities as the evidence about changing patterns of mortality and morbidity is presented. Officials are also alert to international trends in patterns of disease that have the potential to affect the health of Australians. Integrated planning and preparation is currently underway involving all emergency health services, hospitals and population health units to ensure Australia's quick and efficient response to any major infectious disease outbreak, such as avian influenza (bird flu). Public health planning for the preparations for the Sydney Olympics and Paralympic Games in 2000 took almost three years. ‘Its major components included increased surveillance of communicable disease; presentations to sentinel emergency departments; medical encounters at Olympic venues; cruise ship surveillance; environmental and food safety inspections; bioterrorism surveillance and global epidemic intelligence’ (Jorm et al 2003, 102). In other words, the public health plan was developed to ensure food safety, hospital capacity, safe crowd control, protection against infectious diseases, and an integrated emergency and disaster plan. We have national and state plans for vaccinating children against infectious diseases in childhood; plans to promote dental health for children in schools; and screening programs for cervical, breast and prostate cancer. An effective public health response to a change in the distribution of morbidity and mortality requires planning. All levels of government plan for the public’s health. Local governments (councils) ensure healthy local environments to protect the public’s health. 
They plan parks for recreation, construct traffic-calming devices near schools to prevent childhood accidents, build shade structures and walking paths, and even embed draughts/chess squares in tables for people to sit and play. Environmental Health officers ensure food safety in restaurants and measure water quality. These public health measures attempt to promote the quality of life of residents. Australian and state governments produce plans that protect and promote health through various policy and program initiatives and innovations. To be effective, program plans need to be evaluated. However, building an integrated evaluation plan into a program plan is often forgotten, as planning and evaluation are seen as two distinct entities. Consequently, it is virtually impossible to measure, with any confidence, the extent to which a program has achieved its goals and objectives. This chapter introduces you to the concepts of public health program planning and evaluation. Case studies and reflection questions are presented to illustrate key points. As various authors use different terminology to describe the same concepts/actions of planning and evaluation, the glossary at the back of this book will help you to clarify the terms used in this chapter.

Alzaid et al. proposed a forward & backward secure key management scheme in wireless sensor networks for Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems. The scheme, however, is still vulnerable to an attack called the sandwich attack that can be launched when the adversary captures two sensor nodes at times t1 and t2, and then reveals all the group keys used between times t1 and t2. In this paper, a fix to the scheme is proposed in order to limit the vulnerable time duration to an arbitrarily chosen time span while keeping the forward and backward secrecy of the scheme untouched. Then, the performance analysis for our proposal, Alzaid et al.’s scheme, and Nilsson et al.’s scheme is given.
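The general flavour of forward and backward secure group keys can be conveyed with a hash-chain construction. This is an illustrative sketch only, not Alzaid et al.'s exact scheme or the proposed fix: the key for epoch i combines element i of a forward chain (each element derived from the previous one, so past keys cannot be recomputed from a captured state) with an element of a backward chain that is released only as epochs pass (so future keys cannot be precomputed):

```python
import hashlib

def h(x: bytes) -> bytes:
    """One-way hash step (SHA-256) used to advance both chains."""
    return hashlib.sha256(x).digest()

def group_keys(seed_f: bytes, seed_b: bytes, n: int):
    """Derive n epoch group keys from a forward chain and a backward
    chain (illustrative sketch of the forward/backward key idea)."""
    forward = [seed_f]
    for _ in range(n - 1):
        forward.append(h(forward[-1]))
    backward_chain = [seed_b]
    for _ in range(n - 1):
        backward_chain.append(h(backward_chain[-1]))
    backward = backward_chain[::-1]            # disclosed in reverse order over time
    # Epoch key i mixes forward element i with backward element n-1-i.
    return [h(f + b) for f, b in zip(forward, backward)]

keys = group_keys(b"seed-forward", b"seed-backward", n=8)
```

The sandwich attack described above exploits the fact that capturing chain state at two points in time brackets every key in between; the paper's fix limits that exposed window to a chosen span, which in a sketch like this would correspond to periodically re-seeding the chains.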

The author's approach to the problems associated with building in bushfire-prone landscapes comes from 12 years of study of the biophysical and cultural landscapes in the Great Southern Region of Western Australia - research which resulted in the design and construction of the H-house at Bremer Bay. The house was developed using a 'ground up' approach whereby Dr Weir conducted topographical surveys and worked with a local botanist and a bushfire risk consultant to ascertain the level of threat that fire presented to this particular site. The intention from the outset, however, was not to design a bushfire-resistant house per se, but to develop a design which would place the owners in close proximity to the highly biodiverse heath vegetation of their site. The research aim was to find ways - through architectural design - to link the patterns of usage of the house with other site-specific conditions related to the prevailing winds, solar orientation and seasonal change. The H-house has a number of features which increase the level of bushfire safety. These include: fire-rated roller shutters (tested by the CSIRO for ember attack and radiant heat); fire-resistant double glazing (on windows not protected by the shutters); fibre-cement sheet cladding of the underside of the elevated timber floor structure; a manually operated high-pressure sprinkler system on exposed timber decks; a fire refuge (an enlarged laundry/shower area) within the house with a dedicated cabinet for firefighting equipment; and a low-pressure solar-powered domestic water supply system.

This special issue of Futures is concerned with community engagement strategies that help to inform medium and long-term futures studies in order to foster sustainable urban environments. Recent special issues of Futures, such as Human Extinction (41:10) and Utopian Thought (41:4), reflect the increasing significance of sustainability issues, which is why we present another crucial component of sustainability, community engagement. Responding to futurists’ long term concerns about climate change outlined in Futures 41(9) [1], Stevenson concluded that we can no longer support infinite growth, and that our goal should be to reshape the economy to let us live within our means. In the face of the continued and accelerated crisis in environmental, economic and social sustainability, a number of trends informed our call for papers on the possible role of community engagement in contributing to enhanced urban sustainability: • Changes in the public sphere in terms of participation, online deliberation systems, polity of urban futures; • The possible use of user-generated content for urban planning (paralleling the rise of user generated content elsewhere); • The related role of social networking, collective and civic intelligence, and crowd-sourcing in urban futures; • The rise of technologies such as wireless Internet and mobile applications, and the impact of neogeography, simulations and 3D virtual environments that reproduce and analyse complex social phenomena and city systems in urban futures, design and planning.

Virtual 3D models of long bones are increasingly being used for implant design and research applications. The current gold standard for the acquisition of such data is Computed Tomography (CT) scanning. Due to radiation exposure, CT is generally limited to the imaging of clinical cases and cadaver specimens. Magnetic Resonance Imaging (MRI) does not involve ionising radiation and therefore can be used to image selected healthy human volunteers for research purposes. The feasibility of MRI as an alternative to CT for the acquisition of morphological bone data of the lower extremity has been demonstrated in recent studies [1, 2]. Some of the current limitations of MRI are long scanning times and difficulties with image segmentation in certain anatomical regions due to poor contrast between bone and surrounding muscle tissues. Higher field strength scanners promise to offer faster imaging times or better image quality. In this study, image quality at 1.5T is quantitatively compared with that of images acquired at 3T. --------- The femora of five human volunteers were scanned using 1.5T and 3T MRI scanners from the same manufacturer (Siemens) with similar imaging protocols. A 3D FLASH sequence was used with TE = 4.66 ms, flip angle = 15° and voxel size = 0.5 × 0.5 × 1 mm. PA-Matrix and body matrix coils were used to cover the lower limb and pelvis respectively. Signal to noise ratio (SNR) [3] and contrast to noise ratio (CNR) [3] of the axial images from the proximal, shaft and distal regions were used to assess the quality of images from the 1.5T and 3T scanners. The SNR was calculated for the muscle and bone-marrow in the axial images. The CNR was calculated for the muscle to cortex and cortex to bone marrow interfaces, respectively. --------- Preliminary results (one volunteer) show that the SNR of muscle for the shaft and distal regions was higher in 3T images (11.65 and 17.60) than 1.5T images (8.12 and 8.11).
For the proximal region the SNR of muscles was higher in 1.5T images (7.52) than 3T images (6.78). The SNR of bone marrow was slightly higher in 1.5T images for both proximal and shaft regions, while it was lower in the distal region compared to 3T images. The CNR between muscle and bone of all three regions was higher in 3T images (4.14, 6.55 and 12.99) than in 1.5T images (2.49, 3.25 and 9.89). The CNR between bone-marrow and bone was slightly higher in 1.5T images (4.87, 12.89 and 10.07) compared to 3T images (3.74, 10.83 and 10.15). These results show that the 3T images generated higher contrast between bone and the muscle tissue than the 1.5T images. It is expected that this improvement of image contrast will significantly reduce the time required for the mainly manual segmentation of the MR images. Future work will focus on optimizing the 3T imaging protocol for reducing chemical shift and susceptibility artifacts.
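The SNR and CNR figures quoted above reduce to simple region statistics. One common formulation (the study follows its own reference [3], which may differ in detail) divides the mean intensity of a region of interest, or the difference between two region means, by the standard deviation of the background noise; the intensity samples and noise level below are hypothetical:

```python
import numpy as np

def snr(region, noise_std):
    """Signal-to-noise ratio: mean ROI intensity over background noise
    standard deviation (one common definition; illustrative only)."""
    return float(np.mean(region)) / noise_std

def cnr(region_a, region_b, noise_std):
    """Contrast-to-noise ratio between two tissue ROIs."""
    return abs(float(np.mean(region_a)) - float(np.mean(region_b))) / noise_std

# Hypothetical intensity samples for muscle and cortical-bone ROIs.
muscle = np.array([120.0, 118.0, 122.0, 121.0])
cortex = np.array([40.0, 42.0, 38.0, 41.0])
noise_std = 10.0
muscle_snr = snr(muscle, noise_std)            # mean 120.25 / 10 = 12.025
muscle_cortex_cnr = cnr(muscle, cortex, noise_std)
```

A higher muscle-to-cortex CNR is what makes the bone boundary easier to segment, which is the practical benefit the study attributes to the 3T images.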

CFD has been successfully used in the optimisation of aerodynamic surfaces for a given set of parameters such as Mach number and angle of attack. While carrying out a multidisciplinary design optimisation, one deals with situations where the parameters have some uncertainty attached. Any optimisation carried out for fixed values of input parameters gives a design which may be totally unacceptable under off-design conditions. The challenge is to develop a robust design procedure which takes into account the fluctuations in the input parameters. In this work, we attempt this using a modified Taguchi approach. This is incorporated into an evolutionary algorithm with many features developed in-house. The method is tested on a UCAV design which simultaneously handles aerodynamics, electromagnetics and maneuverability. Results demonstrate that the method has considerable potential.

Distributed Denial of Service (DDoS) attacks have become one of the biggest threats to resources on the Internet. The purpose of these attacks is to prevent servers from providing services to legitimate users. These attacks are also used to consume network bandwidth. Current intrusion detection systems can only detect attacks; they cannot prevent them or trace the location of intruders. Some schemes prevent attacks by simply discarding attack packets, which saves the victim from the attack, but network bandwidth is still wasted. In our opinion, DDoS requires a distributed solution to avoid this waste of resources. This paper presents a system that helps not only in detecting such attacks but also in tracing and blocking the multiple intruders (to save bandwidth as well) using Intelligent Software Agents. The system gives a dynamic response and can be integrated with existing network defense systems without disturbing the existing Internet model. We have implemented an agent-based network monitoring system for this purpose.
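The detect-and-block behaviour can be illustrated with a minimal per-source rate check. This is a hypothetical sketch only; the paper's system distributes detection across Intelligent Software Agents and traces intruders through the network rather than filtering at a single point:

```python
from collections import Counter

def detect_and_block(packets, rate_limit=100, blocked=None):
    """Flag sources exceeding rate_limit packets in the observation
    interval and discard only their traffic, preserving bandwidth and
    service for legitimate users (illustrative sketch)."""
    blocked = set(blocked or ())
    counts = Counter(src for src, _ in packets)
    blocked |= {src for src, n in counts.items() if n > rate_limit}
    delivered = [(src, payload) for src, payload in packets if src not in blocked]
    return blocked, delivered

# Hypothetical traffic: one flooding source, one legitimate client.
packets = [("10.0.0.1", b"req")] * 150 + [("10.0.0.2", b"req")] * 3
blocked, delivered = detect_and_block(packets)
```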

Today’s evolving networks are experiencing a large number of different attacks, ranging from system break-ins and infection by automatic attack tools such as worms, viruses and trojan horses, to denial of service (DoS). One important aspect of such attacks is that they are often indiscriminate and target Internet addresses without regard to whether they are bona fide allocated or not. Due to the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. The analysis of large repositories of such traffic can be used to extract useful information about both ongoing and new attack patterns and to unearth unusual attack behaviors. However, such an analysis is difficult due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on the statistical properties of the collected traffic in order to accurately and quickly detect new and ongoing network anomalies. Detection of network anomalies is based on the concept that an anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. In this dissertation, we use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces to unveil both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window-based non-parametric cumulative sum (CUSUM) change detection technique for identifying changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and also to detect when a particular change has ended.
Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
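The sliding-window non-parametric CUSUM with a dynamic threshold can be sketched as follows. This is an illustrative simplification of the approach described above, not the dissertation's algorithm; the drift, window and threshold parameters are invented:

```python
def cusum_changes(series, drift=0.5, threshold_factor=5.0, window=20):
    """One-sided CUSUM over a sliding baseline: accumulate positive
    deviations from the recent mean and raise an alarm when the statistic
    exceeds a threshold scaled to recent variability (illustrative)."""
    alarms, s = [], 0.0
    for i, x in enumerate(series):
        recent = series[max(0, i - window):i] or [x]
        mean = sum(recent) / len(recent)
        var = sum((r - mean) ** 2 for r in recent) / len(recent)
        # Dynamic threshold: tracks the variability of the recent window.
        threshold = threshold_factor * (var ** 0.5 + 1.0)
        s = max(0.0, s + (x - mean - drift))    # one-sided CUSUM recursion
        if s > threshold:
            alarms.append(i)
            s = 0.0                             # reset after an alarm
    return alarms

# Hypothetical per-interval packet counts on an unused address block:
# quiet baseline, then a sustained surge such as a worm outbreak.
traffic = [5, 6, 5, 4, 6, 5, 5, 6, 4, 5] + [30] * 10
alarms = cusum_changes(traffic)
```

A symmetric downward-looking statistic, which this sketch omits, is what lets the approach also detect when a change has ended.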