153 results for IP masquerading


Relevance: 10.00%

Abstract:

Objective: To assess the extent of coder agreement for external causes of injury using ICD-10-AM for injury-related hospitalisations in Australian public hospitals. Methods: A random sample of 4850 discharges from 2002 to 2004 was obtained from a stratified random sample of 50 hospitals across four states in Australia. On-site medical record reviews were conducted and external cause codes were assigned blinded to the original coded data. Code agreement levels were grouped into the following categories: block level, 3-character level, 4-character level, 5th-character level, and complete code level. Results: At the broad block level, code agreement was found in over 90% of cases for most mechanisms (e.g. transport, fall). Disagreement was 26.0% at the 3-character level; agreement for the complete external cause code was 67.6%. For activity codes, disagreement at the 3-character level was 7.3% and agreement for the complete activity code was 68.0%. For place of occurrence codes, disagreement at the 4-character level was 22.0%; agreement for the complete place code was 75.4%. Conclusions: With 68% agreement for complete codes and 74% agreement for 3-character codes, as well as variability in agreement levels across different code blocks, place codes and activity codes, researchers should be aware of the reliability of the specific data they rely on when undertaking trend analyses or selecting cases for particular causes.
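As an illustrative aside (not part of the study's methods), the agreement categories described above can be operationalised by comparing two ICD-10-AM external cause codes at successively longer prefixes. The helper and the example codes below are hypothetical; block-level matching would additionally need a lookup of ICD-10-AM blocks, which is omitted here.

```python
def agreement_level(original: str, audit: str) -> str:
    """Classify agreement between two ICD-10-AM external cause codes by
    comparing progressively shorter prefixes (5th-character, 4-character,
    then 3-character level). Block-level agreement is not modelled."""
    original = original.replace(".", "").upper()
    audit = audit.replace(".", "").upper()
    if original == audit:
        return "complete code"
    for length, label in ((5, "5th-character"), (4, "4-character"), (3, "3-character")):
        if original[:length] == audit[:length]:
            return f"{label} level"
    return "disagreement"

# Hypothetical example: a fall code recoded with a different 4th character.
print(agreement_level("W01.0", "W01.8"))  # -> "3-character level"
```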

Relevance: 10.00%

Abstract:

Abstract - Mobile devices in the near future will need to collaborate to fulfill their function, and that collaboration will be achieved through communication. We use the real-world example of robotic soccer to derive the structures required for robotic communication. A review of related work shows that no existing approach comes close to providing a robotic ad hoc network (RANET). The RANET we propose builds on existing structures drawn from wireless networking, peer-to-peer systems and software life-cycle management. Where gaps exist in these structures, we describe how to extend them to satisfy the design. The RANET design supports robot cooperation by exchanging messages, discovering skills that other robots on the network may possess, and transferring those skills. The network is built on top of a Bluetooth wireless network and uses JXTA to communicate and transfer skills; OSGi bundles form the skills that can be transferred. To test the final design, a reference implementation was produced. Deficiencies were found in some third-party software, specifically JXTA, JamVM and GNU Classpath. Lastly, we look at how to address these deficiencies by porting the JXTA C implementation to the target robotic platform and potentially eliminating the TCP/IP layer, using UDP instead of TCP, or using an adaptive TCP/IP stack. We also propose future areas of investigation: how to seed the configuration for the Bluetooth Personal Area Network (PAN) profile extension so that a Bluetooth TCP/IP link is formed more quickly, and using the STP to allow multi-hop messaging and transfer of skills.
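As a rough sketch of the kind of exchange the RANET design implies (a robot advertising a skill and peers discovering it before transfer), the snippet below models a skill advertisement and a matching query. The class names, fields and URI are invented for illustration and do not reflect the JXTA or OSGi APIs.

```python
from dataclasses import dataclass, field

@dataclass
class SkillAdvertisement:
    """Hypothetical advertisement a robot publishes on the ad hoc network."""
    robot_id: str
    skill_name: str
    version: str
    bundle_uri: str  # where the transferable skill bundle could be fetched

@dataclass
class SkillRegistry:
    """In-memory view of advertisements heard from peers on the network."""
    adverts: list = field(default_factory=list)

    def publish(self, advert: SkillAdvertisement) -> None:
        self.adverts.append(advert)

    def discover(self, skill_name: str) -> list:
        """Return peers claiming to offer the requested skill."""
        return [a for a in self.adverts if a.skill_name == skill_name]

registry = SkillRegistry()
registry.publish(
    SkillAdvertisement("striker-2", "ball-tracking", "1.2",
                       "btspp://striker-2/skills/ball-tracking")
)
for advert in registry.discover("ball-tracking"):
    print(f"{advert.robot_id} offers {advert.skill_name} {advert.version} at {advert.bundle_uri}")
```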

Relevance: 10.00%

Abstract:

Universities are increasingly encouraged to take a leading role in economic development, particularly through innovation. Simultaneously, economic development policy itself is increasingly focused on small and medium-sized enterprises (SMEs), creating overlapping interactions in the roles of government policy, universities and SMEs and the processes of innovation creation and dissemination. This paper examines issues arising from these developments and relating to the key stakeholders (industry, government and universities in particular), the enabling mechanisms (network governance, relevant education, training and learning, and suitable structures), and local and cross-local links. The authors then use quantitative analysis of 450 SMEs in the UK to begin to evaluate the roles of universities and highlight areas for further theoretical development.

Relevance: 10.00%

Abstract:

Cipher Cities was a practice-led research project developed in three stages between 2005 and 2007, resulting in the creation of a unique online community, ‘Cipher Cities’, that provides simple authoring tools and processes for individuals and groups to create their own mobile events and event journals, build community profiles and participate in other online community activities. Cipher Cities was created to revitalise people’s relationships with everyday places by giving them the opportunity and motivation to create and share complex digital stories in simple and engaging ways. To do so, we developed new design processes and methods for both the research team and the end user to appropriate web and mobile technologies, collaborating with ethnographers, designers and ICT researchers and developers. In teams we ran a series of workshops in a wide variety of Australian cities to refine an engagement process and to test a series of iteratively developed prototypes, refining the systems that supported community motivation and collaboration. The result of the research is twofold: (1) a sophisticated prototype for researchers and designers to further experiment with community engagement methodologies using existing and emerging communications technologies; and (2) a ‘human dimensions matrix’ that assists in the identification and modification of place-based interventions in the social, technical, spatial, cultural and pedagogical conditions of any given community. This matrix has since become an essential part of a number of subsequent projects and assists design collaborators to conceptualise, generate and evaluate interactive experiences. The research team employed practice-led action research methodologies involving a collaborative effort across the fields of interaction design and social science, in particular ethnography, in order to: (1) seek, contest and refine a design methodology that would maximise the successful application of a dynamic system to create new kinds of interactions between people, places and artefacts; and (2) design and deploy an application that intervenes in place-based and mobile technologies and offers people simple interfaces to create and share digital stories. Cipher Cities was awarded three separate CRC competitive grants (over $270,000 in total) to support the three stages of research covering the development of the ethnographic design methodologies, the development of the tools, and the testing and refinement of both the engagement models and the technologies. The resulting methodologies and tools are in the process of being commercialised by the Australasian CRC for Interaction Design.

Relevance: 10.00%

Abstract:

Monitoring unused or dark IP addresses offers opportunities to extract useful information about both ongoing and new attack patterns. In recent years, different techniques have been used to analyze such traffic, including sequential analysis, where a change in traffic behavior, for example a change in the mean, is used as an indication of malicious activity. Change points themselves say little about the detected change; further data processing is necessary to extract useful information and to identify the exact cause of the change, which is difficult given the size and nature of the observed traffic. In this paper, we address the problem of analyzing a large volume of such traffic by correlating change points identified in different traffic parameters. The significance of the proposed technique is twofold. First, it automatically extracts information related to change points by correlating change points detected across multiple traffic parameters. Second, it validates a detected change point through the simultaneous presence of another change point in a different parameter. Using a real network trace collected from unused IP addresses, we demonstrate that the proposed technique enables us not only to validate change points but also to extract useful information about their causes.
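A minimal sketch of the correlation step described above, assuming change points have already been detected per traffic parameter as lists of timestamps; the window size and parameter names are illustrative, not taken from the paper.

```python
def correlate_change_points(change_points: dict, window: float = 60.0) -> list:
    """Group change points from different traffic parameters that occur within
    `window` seconds of each other; a change point accompanied by a change in
    another parameter is treated as validated."""
    events = sorted(
        (t, param) for param, times in change_points.items() for t in times
    )
    groups, current = [], []
    for t, param in events:
        if current and t - current[0][0] > window:
            groups.append(current)
            current = []
        current.append((t, param))
    if current:
        groups.append(current)
    # Keep only groups in which at least two distinct parameters changed together.
    return [g for g in groups if len({p for _, p in g}) >= 2]

# Illustrative input: detection times (in seconds) per traffic parameter.
detections = {
    "packet_count": [120.0, 4000.0],
    "unique_sources": [130.0],
    "destination_ports": [3990.0],
}
for group in correlate_change_points(detections):
    print("correlated change points:", group)
```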

Relevance: 10.00%

Abstract:

Citrus canker is a disease of citrus and closely related species, caused by the bacterium Xanthomonas citri subsp. citri. This disease, previously exotic to Australia, was detected on a single farm [infested premise 1 (IP1); IP is the terminology used in official biosecurity protocols to describe a locality at which an exotic plant pest has been confirmed or is presumed to exist, and IPs are numbered sequentially as they are detected] in Emerald, Queensland in July 2004. During the following 10 months the disease was detected on two other farms (IP2 and IP3) within the same area, and studies indicated the disease first occurred on IP1 and spread to IP2 and IP3. The oldest naturally infected plant tissue observed on any of these farms indicated the disease had been present on IP1 for several months before detection and became established on IP2 and IP3 during the second quarter (i.e. autumn) of 2004. Transect studies on some IP1 blocks showed disease incidences ranging between 52 and 100% (trees infected). This contrasted with very low disease incidence, less than 4% of trees within a block, on IP2 and IP3. The mechanisms proposed for disease spread within blocks include weather-assisted dispersal of the bacterium (e.g. wind-driven rain) and movement of contaminated farm equipment, in particular by pivot irrigator towers via mechanical damage in combination with abundant water. Spread between blocks on IP2 was attributed to movement of contaminated farm equipment and/or people. The epidemiology results suggest: (i) successive surveillance rounds increase the likelihood of disease detection; (ii) surveillance sensitivity is affected by tree size; and (iii) individual destruction zones (for the purpose of eradication) could be determined using disease incidence and severity data rather than a predefined set area.

Relevance: 10.00%

Abstract:

The most interesting questions that arise in patent law are the ones that test the boundaries of patentable subject matter. One of those questions has been put forward recently in the United States in an argument in favour of patenting the plots of fictional stories. United States attorney Andrew F Knight has claimed that storylines are patentable subject matter and should be recognised as such. What he claims is patentable is not the copyrightable expression of a written story or even a written outline of a plot but the underlying plot of a story itself. The commercial application of ‘storyline patents’, as he describes them, is said to be their exclusive use in books and movies. This article analyses the claims made and argues that storylines are not patentable subject matter under Australian law. It also contends that policy considerations, as well as the very nature of creative works, weigh against recognition of ‘storyline patents’.

Relevance: 10.00%

Abstract:

Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the honeypot data analysis field. To date, most of the work on honeypots has been devoted to the design of new honeypots or optimizing the current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for developing more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or breakouts of new automated malicious codes, such as worms. The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes through grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal component’s residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
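As an illustrative sketch of the residual-space idea listed above (not the thesis's exact implementation), the code below projects traffic feature vectors onto a principal subspace and flags observations whose squared prediction error (SPE) exceeds a simple empirical threshold. The feature matrices and the threshold rule are assumptions made for the example.

```python
import numpy as np

def spe_detector(X_train: np.ndarray, X_new: np.ndarray,
                 n_components: int = 2, quantile: float = 0.99):
    """Fit a PCA model on baseline honeypot traffic features and flag new
    observations whose squared prediction error (residual energy outside the
    principal subspace) exceeds an empirical control limit."""
    mean = X_train.mean(axis=0)
    Xc = X_train - mean
    # Principal directions from the SVD of the centred baseline data.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                       # retained loadings
    residual = lambda X: (X - mean) - (X - mean) @ P @ P.T
    spe_train = np.sum(residual(X_train) ** 2, axis=1)
    threshold = np.quantile(spe_train, quantile)  # simple empirical limit
    spe_new = np.sum(residual(X_new) ** 2, axis=1)
    return spe_new > threshold, spe_new, threshold

# Illustrative use with random "traffic feature" matrices.
rng = np.random.default_rng(0)
baseline = rng.normal(size=(500, 6))
incoming = rng.normal(size=(10, 6))
incoming[0] += 8.0                                # inject an anomalous observation
flags, spe, limit = spe_detector(baseline, incoming)
print(flags)
```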

Relevance: 10.00%

Abstract:

This Report, prepared for Smart Service Queensland (“SSQ”), addresses legal issues, areas of risk and other factors associated with activities conducted on three popular online platforms—YouTube, MySpace and Second Life (which are referred to throughout this Report as the “Platforms”). The Platforms exemplify online participatory spaces and behaviours, including blogging and networking, multimedia sharing, and immersive virtual environments.

Relevance: 10.00%

Abstract:

Design talks LOUDLY!!! is a series of interactive presentations exploring issues and opportunities involving professional design. --------------- These seminars are organised by the Industrial Design Network Queensland (IDnetQLD) in coordination with the Design Institute of Australia (DIA). This event was held at the State Library of Queensland (SLQ) with invited public presentations by a panel of industry experts from the Australian Government – IP Australia. --------------- The first seminar, "Intellectual Property: designing 4 success", highlighted to design professionals how the various forms of intellectual property interact, what protections and pitfalls exist, and how these impact upon the work and responsibilities of designers. The overlaps, gaps and incongruencies in the various IP protection systems were highlighted by the expert line-up of speakers. --------------- The underlying message is that a clear understanding of all IP types is necessary in order to gain the best advantage from IP protection and thereby eliminate potential IP ownership issues before they become a problem.

Relevance: 10.00%

Abstract:

Objective: To demonstrate properties of the International Classification of External Causes of Injury (ICECI) as a tool for use in injury prevention research. Methods: The Childhood Injury Prevention Study (CHIPS) is a prospective longitudinal follow-up study of a cohort of 871 children aged 5–12 years, with a nested case-crossover component. The ICECI is the latest tool in the International Classification of Diseases (ICD) family and has been designed to improve the precision of coding injury events. The details of all injury events recorded in the study, as well as all measured injury-related exposures, were coded using the ICECI. This paper reports a substudy on the utility and practicability of using the ICECI in the CHIPS to record exposures. Interrater reliability was quantified for a sample of injured participants using the Kappa statistic to measure concordance between codes independently assigned by two research staff. Results: There were 767 diaries collected at baseline, with event details from 563 injuries and exposure details from the injury crossover periods. There were no event, location or activity details that could not be coded using the ICECI. Kappa statistics for concordance between raters within each of the dimensions ranged from 0.31 to 0.93 for the injury events, and were 0.94 and 0.97 for activity and location, respectively, in the control periods. Discussion: This study represents the first detailed account of the properties of the ICECI revealed by its use in a primary analytic epidemiological study of injury prevention. The results provide considerable support for the ICECI and its further use.
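For readers unfamiliar with the interrater statistic used above, the sketch below computes Cohen's kappa for two raters from their paired category assignments; the example activity labels are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters who have
    each assigned a category to the same set of items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    # Expected agreement under independent coding with the observed marginals.
    expected = sum(freq_a[c] * freq_b[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical activity codes assigned independently by two coders.
coder_1 = ["sports", "play", "play", "transport", "sports", "play"]
coder_2 = ["sports", "play", "sports", "transport", "sports", "play"]
print(round(cohens_kappa(coder_1, coder_2), 2))
```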

Relevance: 10.00%

Abstract:

GMPLS is a generalized form of MPLS (MultiProtocol Label Switching). MPLS is IP packet based and uses MPLS-TE for packet traffic engineering; GMPLS extends MPLS capabilities. It provides separation between the transmission, control and management planes. The control plane supports applications such as traffic engineering, service provisioning, and differentiated services. The GMPLS control plane architecture includes signaling (RSVP-TE, CR-LDP) and routing (OSPF-TE, ISIS-TE) protocols. This paper provides an overview of the signaling protocols, describes their main functionalities, and provides a general evaluation of both protocols.
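As a purely illustrative aid (not an implementation of either signaling protocol), the sketch below models a few of the well-known objects carried in an RSVP-TE Path message used to request a label switched path. The field names are simplified and no wire format is implied.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PathMessage:
    """Simplified, conceptual view of an RSVP-TE Path message requesting an LSP."""
    session: str                       # tunnel endpoint / tunnel identifier
    label_request: str                 # requested switching type, e.g. "packet"
    explicit_route: List[str] = field(default_factory=list)  # hops from the ERO
    setup_priority: int = 7            # from SESSION_ATTRIBUTE (0 = highest)
    bandwidth_mbps: float = 0.0        # from SENDER_TSPEC

# Hypothetical LSP setup request along a three-hop explicit route.
lsp_request = PathMessage(
    session="10.0.0.9/tunnel-1",
    label_request="packet",
    explicit_route=["10.0.0.2", "10.0.0.5", "10.0.0.9"],
    setup_priority=3,
    bandwidth_mbps=100.0,
)
print(lsp_request)
```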

Relevance: 10.00%

Abstract:

Emerging data streaming applications in Wireless Sensor Networks require reliable and energy-efficient transport protocols. Our recent Wireless Sensor Network deployment in the Burdekin delta, Australia, for water monitoring [T. Le Dinh, W. Hu, P. Sikka, P. Corke, L. Overs, S. Brosnan, Design and deployment of a remote robust sensor network: experiences from an outdoor water quality monitoring network, in: Second IEEE Workshop on Practical Issues in Building Sensor Network Applications (SenseApp 2007), Dublin, Ireland, 2007] is one such example. This application involves streaming sensed data such as pressure, water flow rate and salinity periodically from many scattered sensors to the sink node, which in turn relays it via an IP network to a remote site for archiving, processing and presentation. While latency is not a primary concern in this class of application (the sampling rate is usually in terms of minutes or hours), energy efficiency is. Continuous long-term operation and reliable delivery of the sensed data to the sink are also desirable. This paper proposes ERTP, an Energy-efficient and Reliable Transport Protocol for Wireless Sensor Networks. ERTP is designed for data streaming applications in which sensor readings are transmitted from one or more sensor sources to a base station (or sink). ERTP uses a statistical reliability metric which ensures that the number of data packets delivered to the sink exceeds the defined threshold. Our extensive discrete event simulations and experimental evaluations show that ERTP is significantly more energy-efficient than current approaches, reducing energy consumption by more than 45%. Consequently, sensor nodes are more energy-efficient and the lifespan of the unattended WSN is increased.
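The abstract does not give ERTP's reliability formula, but a minimal sketch of the kind of statistical reliability calculation it targets is shown below. It assumes independent per-hop loss and a bounded number of hop-by-hop retransmissions (both assumptions made here, not taken from the paper) and estimates the expected end-to-end delivery ratio against a target threshold.

```python
def end_to_end_delivery_ratio(hop_loss_rates, max_retx: int) -> float:
    """Expected fraction of packets reaching the sink when each hop may
    retransmit a lost packet up to `max_retx` times (losses assumed independent)."""
    ratio = 1.0
    for p_loss in hop_loss_rates:
        # Probability the packet survives this hop within the retry budget.
        ratio *= 1.0 - p_loss ** (max_retx + 1)
    return ratio

def retransmissions_needed(hop_loss_rates, target: float, max_search: int = 10) -> int:
    """Smallest per-hop retry budget whose expected delivery ratio meets the target."""
    for retx in range(max_search + 1):
        if end_to_end_delivery_ratio(hop_loss_rates, retx) >= target:
            return retx
    raise ValueError("target not reachable within the searched retry budget")

# Illustrative 5-hop path with 10% loss per hop and a 95% delivery target.
path = [0.1] * 5
print(retransmissions_needed(path, target=0.95))
```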

Relevance: 10.00%

Abstract:

Today’s evolving networks are experiencing a large number of different attacks, ranging from system break-ins and infection by automated attack tools such as worms, viruses and Trojan horses, to denial of service (DoS). One important aspect of such attacks is that they are often indiscriminate and target Internet addresses without regard to whether those addresses have been legitimately allocated. Due to the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. The analysis of large repositories of such traffic can be used to extract useful information about both ongoing and new attack patterns and to unearth unusual attack behaviors. However, such analysis is difficult due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on the statistical properties of the collected traffic in order to accurately and quickly detect new and ongoing network anomalies. Detection of network anomalies is based on the concept that an anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. In this dissertation, we use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces and so unveil both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window-based non-parametric cumulative sum (CUSUM) change detection technique for the identification of changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and also to detect when a particular change has ended. Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
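As a minimal sketch of the sliding-window non-parametric CUSUM idea described above (the dissertation's actual parameterisation may differ), the code below accumulates positive deviations from a windowed baseline and raises an alarm when the statistic crosses a threshold derived from recent variability; the drift and scale constants are illustrative assumptions.

```python
from collections import deque

def cusum_change_points(series, window: int = 50, drift: float = 0.5, scale: float = 5.0):
    """Non-parametric CUSUM over a sliding baseline window.

    The baseline mean and mean absolute deviation are re-estimated from the
    last `window` samples, `drift` discounts small fluctuations, and the alarm
    threshold is `scale` times the recent deviation (a dynamic threshold)."""
    baseline = deque(maxlen=window)
    statistic, alarms = 0.0, []
    for i, x in enumerate(series):
        if len(baseline) == window:
            mean = sum(baseline) / window
            mad = sum(abs(v - mean) for v in baseline) / window or 1.0
            statistic = max(0.0, statistic + (x - mean) - drift * mad)
            if statistic > scale * mad:
                alarms.append(i)   # change flagged at sample i; nearby samples
                statistic = 0.0    # may re-trigger until the baseline adapts
        baseline.append(x)
    return alarms

# Illustrative trace: low background traffic followed by a sustained jump.
trace = [5] * 200 + [25] * 50
print(cusum_change_points(trace))
```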