961 results for Forensics computer science
Abstract:
Secret-sharing schemes describe methods to securely share a secret among a group of participants. A properly constructed secret-sharing scheme guarantees that the share belonging to one participant reveals nothing about the shares of others or about the secret itself. Beyond their obvious use of distributing a secret, secret-sharing schemes have also been used in secure multi-party computation and in redundant residue number systems for error-correcting codes. In this paper, we propose that a secret-sharing scheme be used as a primitive in a Network-based Intrusion Detection System (NIDS) to detect attacks in encrypted networks. Encrypted networks such as Virtual Private Networks (VPNs) fully encrypt network traffic, which can include both malicious and non-malicious traffic, and a traditional NIDS cannot monitor encrypted traffic. Our work uses a combination of Shamir's secret-sharing scheme and randomised network proxies to enable a traditional NIDS to function normally in a VPN environment. We introduce a novel protocol that utilises a secret-sharing scheme to detect attacks in encrypted networks.
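As a rough illustration of the secret-sharing primitive this work builds on, the sketch below implements a minimal Shamir (k, n) split-and-reconstruct over a prime field in Python. The prime, parameter names and use of the standard `random` module are assumptions made for illustration only; they do not reflect the paper's actual protocol or its randomised proxies.

```python
# Minimal sketch of Shamir's (k, n) secret sharing over a prime field.
# Illustrative only: field size and randomness source are assumptions,
# not the paper's protocol.
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for a small secret

def split(secret, k, n, prime=PRIME):
    """Create n shares, any k of which reconstruct the secret."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = split(secret=123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```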
Abstract:
Thermogravimetric analysis-mass spectrometry, X-ray diffraction and scanning electron microscopy (SEM) were used to characterize eight kaolinite samples from China. The results show that the thermal decomposition occurs in three main steps: (a) desorption of water below 100 °C, (b) dehydration at about 225 °C, and (c) well-defined dehydroxylation at around 450 °C. Decarbonization also takes place at 710 °C due to the decomposition of the calcite impurity in the kaolin. The dehydroxylation temperature of kaolinite is found to be influenced by the degree of disorder of the kaolinite structure, and the gases evolved during decomposition vary with the amounts and kinds of impurities present. The mass spectra show that the interlayer carbonate from the calcite impurity and the organic carbon are released as CO2 at around 225, 350 and 710 °C in the kaolinite samples.
Abstract:
This paper describes the approach taken to the clustering task at INEX 2009 by a group at the Queensland University of Technology. The Random Indexing (RI) K-tree has been used with a representation that is based on the semantic markup available in the INEX 2009 Wikipedia collection. The RI K-tree is a scalable approach to clustering large document collections. This approach has produced quality clustering when evaluated using two different methodologies.
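As a sketch of the Random Indexing representation that feeds the K-tree clusterer, the Python fragment below builds sparse random index vectors for terms and sums them into document vectors. The dimensionality, number of non-zero seeds and toy corpus are assumptions for illustration, not the INEX 2009 configuration.

```python
# Minimal sketch of Random Indexing document vectors. Parameters and the
# toy corpus are assumptions, not the INEX 2009 setup.
import numpy as np

DIM, NONZERO = 1000, 10  # reduced dimensionality and sparse seed count
rng = np.random.default_rng(0)
index_vectors = {}

def index_vector(term):
    """Assign each term a fixed sparse random +/-1 vector."""
    if term not in index_vectors:
        v = np.zeros(DIM)
        pos = rng.choice(DIM, size=NONZERO, replace=False)
        v[pos] = rng.choice([-1.0, 1.0], size=NONZERO)
        index_vectors[term] = v
    return index_vectors[term]

def document_vector(tokens):
    """A document is the sum of its terms' index vectors."""
    return sum(index_vector(t) for t in tokens)

docs = [["random", "indexing", "scales"], ["k", "tree", "clustering"]]
matrix = np.vstack([document_vector(d) for d in docs])  # rows feed a clusterer
```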
Abstract:
This technical report is concerned with one aspect of environmental monitoring—the detection and analysis of acoustic events in sound recordings of the environment. Sound recordings offer ecologists the potential advantages of cheaper and increased sampling. An acoustic event detection algorithm is introduced that outputs a compact rectangular marquee description of each event. It can disentangle superimposed events, which are a common occurrence during morning and evening choruses. Next, three uses to which acoustic event detection can be put are illustrated. These tasks have been selected because they illustrate quite different modes of analysis: (1) the detection of diffuse events caused by wind and rain, which are a frequent contaminant of recordings of the terrestrial environment; (2) the detection of bird calls using the spatial distribution of their component events; and (3) the preparation of acoustic maps for whole ecosystem analysis. This last task utilises the temporal distribution of events over a daily, monthly or yearly cycle.
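For a generic picture of marquee-style event detection, the sketch below thresholds a spectrogram and reports each connected region as a rectangular bounding box in time and frequency. The global threshold rule and parameters are assumptions made for illustration; the report's actual algorithm, which also disentangles superimposed events, is more sophisticated.

```python
# Generic sketch: threshold a spectrogram and return rectangular event
# marquees. The thresholding rule is an assumption, not the report's method.
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import label, find_objects

def detect_events(audio, sample_rate, threshold_db=10.0):
    """Return (t_start, t_end, f_low, f_high) marquees for loud regions."""
    freqs, times, power = spectrogram(audio, fs=sample_rate)
    power_db = 10.0 * np.log10(power + 1e-12)
    mask = power_db > power_db.mean() + threshold_db   # simple global threshold
    labelled, _ = label(mask)                          # connected components
    events = []
    for f_slice, t_slice in find_objects(labelled):
        events.append((times[t_slice.start], times[t_slice.stop - 1],
                       freqs[f_slice.start], freqs[f_slice.stop - 1]))
    return events

# Example on synthetic data: a one-second 2 kHz tone embedded in noise.
sr = 16000
t = np.arange(sr) / sr
clip = 0.01 * np.random.randn(3 * sr)
clip[sr:2 * sr] += np.sin(2 * np.pi * 2000 * t)
print(detect_events(clip, sr))
```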
Abstract:
We describe the design and implementation of a public-key platform, secFleck, based on a commodity Trusted Platform Module (TPM) chip that extends the capability of a standard node. Unlike previous software public-key implementations, this approach provides e-commerce-grade security; is computationally fast and energy efficient; and has a low financial cost, all essential attributes for secure large-scale sensor networks. We describe the secFleck message security services, such as confidentiality, authenticity and integrity, and present performance results including computation time, energy consumption and cost. This is followed by examples, built on secFleck, of symmetric key management, secure RPC and secure software update.
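The sketch below illustrates, in software, the kind of authenticity/integrity service described: signing a message with an RSA private key and verifying it with the public key, using the Python `cryptography` package. secFleck offloads such operations to the TPM chip rather than running them in software, and the key size and padding here are assumptions for illustration only.

```python
# Software illustration of a sign/verify message-authenticity service.
# secFleck performs RSA on the TPM chip; key size and padding here are
# assumptions, not the platform's configuration.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"sensor reading: 23.4 C"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())

# Raises InvalidSignature if the message or signature has been tampered with.
public_key.verify(signature, message, pss, hashes.SHA256())
print("signature verified")
```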
Abstract:
Mining is the process of extracting mineral resources from the Earth for commercial value. It is an ancient human activity that can be traced back to Palaeolithic times (43 000 years ago), when, for example, the mineral hematite was mined to produce the red pigment ochre. The importance of many mined minerals is reflected in the names of the major milestones in human civilization: the Stone, Copper, Bronze, and Iron Ages. Much later, coal provided the energy that was critical to the Industrial Revolution and still underpins modern society, accounting for 38% of world energy generation today. Ancient mines used human and later animal labor and broke rock using stone tools, heat, and water, and later iron tools. Today's mines are heavily mechanized, with large diesel and electrically powered vehicles, and rock is broken with explosives or rock-cutting machines.
Abstract:
This paper describes an autonomous navigation system for a large underground mining vehicle. The control architecture is based on a robust reactive wall-following behaviour. To make it purposeful, we provide driving hints derived from an approximate nodal map. For most of the time, the vehicle is driven with weak localization (odometry). This need only be improved at intersections where decisions must be made – a technique we refer to as opportunistic localization. The paper briefly reviews absolute and relative navigation strategies, and describes an implementation of a reactive navigation system on a 30 tonne Load-Haul-Dump truck. This truck has achieved full-speed autonomous operation at an artificial test mine and, subsequently, at an operational underground mine.
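To make the reactive wall-following idea concrete, the sketch below implements a simple proportional controller that steers to hold a desired lateral offset from a wall measured by a range sensor. The gains, setpoint and sensor interface are assumptions for illustration, not the controller deployed on the Load-Haul-Dump truck.

```python
# Minimal sketch of a reactive wall-following behaviour: steer to hold a
# target offset from the wall. Gains and setpoint are assumptions, not the
# LHD truck's tuned controller.
from dataclasses import dataclass

@dataclass
class WallFollower:
    desired_offset_m: float = 2.0   # target distance from the wall
    k_dist: float = 0.6             # gain on distance error
    k_angle: float = 1.2            # gain on heading relative to the wall
    max_steer_rad: float = 0.5

    def steering(self, wall_distance_m: float, wall_angle_rad: float) -> float:
        """Steering command from measured range and relative wall angle."""
        error = wall_distance_m - self.desired_offset_m
        cmd = self.k_dist * error - self.k_angle * wall_angle_rad
        return max(-self.max_steer_rad, min(self.max_steer_rad, cmd))

follower = WallFollower()
print(follower.steering(wall_distance_m=2.5, wall_angle_rad=0.1))
```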
Abstract:
This paper describes technologies we have developed to perform autonomous large-scale off-world excavation. A scale dragline excavator, of a size similar to that required for lunar excavation, was made capable of autonomous control. Systems have been put in place to allow remote operation of the machine from anywhere in the world. Algorithms have been developed for complete autonomous digging and dumping of material, taking into account machine and terrain constraints and regolith variability. Experimental results are presented showing the ability to autonomously excavate and move large amounts of regolith and accurately place it at a specified location.
Abstract:
Draglines are massive machines commonly used in surface mining to strip overburden, revealing the targeted minerals for extraction. Automating some or all of the phases of operation of these machines offers the potential for significant productivity and maintenance benefits. The mining industry has a history of slow uptake of automation systems due to the challenges contained in the harsh, complex, three-dimensional (3D), dynamically changing mine operating environment. Robotics as a discipline is finally starting to gain acceptance as a technology with the potential to assist mining operations. This article examines the evolution of robotic technologies applied to draglines in the form of machine-embedded intelligent systems. Results from this work include a production trial in which 250,000 tons of material was moved autonomously, experiments demonstrating steps towards full autonomy, and tele-excavation experiments in which a dragline in Australia was tasked by an operator in the United States.
Abstract:
Starbug is an inexpensive, miniature autonomous underwater vehicle ideal for data collection and ecosystem surveys. Starbug is small enough to be launched by one person without the need for specialised equipment, such as cranes, and it operates with minimal to no human intervention. Starbug was one of the first autonomous underwater vehicles (AUVs) in the world where vision is the primary means of navigation and control. More details of Starbug can be found here: http://www.csiro.au/science/starbug.html
Abstract:
If mobile robots are to perform useful tasks in the real world, they will require a catalog of fundamental navigation competencies and a means to select between them. In this paper we describe our work on strongly vision-based competencies: road following, person or vehicle following, and pose and position stabilization. Results from experiments on an outdoor autonomous tractor, a car-like vehicle, are presented.
Abstract:
We present a novel vision-based technique for navigating an Unmanned Aerial Vehicle (UAV) through urban canyons. Our technique relies on both optic flow and stereo vision information. We show that the combination of stereo and optic flow (stereo-flow) is more effective at navigating urban canyons than either technique alone. Optic flow from a pair of sideways-looking cameras is used to stay centered in a canyon and initiate turns at junctions, while stereo vision from a forward-facing stereo head is used to avoid obstacles to the front. The technique was tested in full on an autonomous tractor at CSIRO and in part on the USC autonomous helicopter. Experimental results are presented from these two robotic platforms operating in outdoor environments. We show that the autonomous tractor can navigate urban canyons using stereo-flow, and that the autonomous helicopter can turn away from obstacles to the side using optic flow. In addition, preliminary results show that a single pair of forward-facing fisheye cameras can be used for both stereo and optic flow. The center portions of the fisheye images are used for stereo, while flow is measured in the periphery of the images.
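A minimal sketch of the flow-balancing idea behind canyon centering is shown below: if the flow magnitude from the left camera exceeds that from the right, the vehicle is closer to the left wall, so it should yaw away from it. The gain, sign convention and the way flow is summarised are assumptions for illustration, not the paper's controller.

```python
# Sketch of optic-flow balancing for canyon centering. Gain and sign
# convention are assumptions; positive yaw is taken as a left turn.
import numpy as np

def centering_yaw_rate(left_flow: np.ndarray, right_flow: np.ndarray,
                       gain: float = 0.01) -> float:
    """Yaw-rate command from sideways-looking flow-magnitude fields."""
    left_mag = np.mean(np.abs(left_flow))
    right_mag = np.mean(np.abs(right_flow))
    # Stronger flow on the left means the left wall is closer, so the
    # command becomes negative (yaw right, towards the open side).
    return gain * (right_mag - left_mag)

# Toy example: stronger flow on the left yields a negative (rightward) command.
left = np.full((10, 10), 4.0)
right = np.full((10, 10), 1.0)
print(centering_yaw_rate(left, right))
```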
Abstract:
The highly unstructured nature of coral reef environments makes them difficult for current robotic vehicles to efficiently navigate. Typical research and commercial platforms have limited autonomy within these environments and generally require tethers and significant external infrastructure. This paper outlines the development of a new robotic vehicle for underwater monitoring and surveying in highly unstructured environments and presents experimental results illustrating the vehicle's performance. The hybrid AUV design developed by the CSIRO robotic reef monitoring team realises a compromise between endurance, manoeuvrability and functionality. The vehicle represents a new era in AUV design specifically focused on providing a truly low-cost research capability that will progress environmental monitoring through unaided navigation, cooperative robotics, sensor network distribution and data harvesting.
Abstract:
The Dynamic Data eXchange (DDX) is our third-generation platform for building distributed robot controllers. DDX allows a coalition of programs to share data at run-time through an efficient shared-memory mechanism managed by a store. Further, stores on multiple machines can be linked by means of a global catalog, and data is moved between the stores on an as-needed basis by multicasting. Heterogeneous computer systems are handled. We describe the architecture of DDX and the standard clients we have developed that let us rapidly build complex control systems with minimal coding.
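The sketch below is a hypothetical, in-process stand-in for the store pattern described: independent client programs share named, timestamped variables through a store. The class and method names are invented for illustration; the real DDX clients use a shared-memory store, a global catalog and multicast between machines.

```python
# Hypothetical toy analogue of a DDX-style store: names and methods are
# invented for illustration, not the real DDX API.
import threading
import time

class Store:
    """Named, timestamped variables shared between client programs."""
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def write(self, name, value):
        with self._lock:
            self._data[name] = (value, time.time())

    def read(self, name):
        with self._lock:
            return self._data.get(name)  # (value, timestamp) or None

store = Store()

def sensor_client():           # one program publishes a variable
    store.write("laser_range", 4.2)

def controller_client():       # another program consumes it
    entry = store.read("laser_range")
    if entry is not None:
        value, stamp = entry
        print(f"laser_range = {value} (written at {stamp:.2f})")

sensor_client()
controller_client()
```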
Abstract:
A vast amount of research into autonomous underwater navigation has been, and is being, conducted around the world. However, typical research and commercial platforms have limited autonomy and are generally unable to navigate efficiently within coral reef environments without tethers and significant external infrastructure. This paper outlines the development of a new robotic vehicle for underwater monitoring and surveying in highly unstructured environments and presents experimental results evaluating its performance. The hybrid AUV design developed by the CSIRO robotic reef monitoring team realises a compromise between endurance, manoeuvrability and functionality. The vehicle represents a new era in AUV design specifically focused on providing a truly low-cost research capability that will progress environmental monitoring through unaided navigation, cooperative robotics, sensor network distribution and data harvesting.