388 results for HIPERPAV II (Computer file)


Relevance:

100.00%

Publisher:

Abstract:

Computer profiling is the automated forensic examination of a computer system in order to provide a human investigator with a characterisation of the activities that have taken place on that system. As part of this process, the logical components of the computer system (components such as users, files and applications) are enumerated, and the relationships between them discovered and reported. This information is enriched with traces of historical activity drawn from system logs and from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work examines the impact of temporal inconsistency in such information and discusses two types of temporal inconsistency that may arise (inconsistency arising out of the normal errant behaviour of a computer system, and inconsistency arising out of deliberate tampering by a suspect), along with techniques for dealing with inconsistencies of the latter kind. We examine the impact of deliberate tampering through experiments conducted with prototype computer profiling software. Based on the results of these experiments, we discuss techniques which can be employed in computer profiling to deal with such temporal inconsistencies.

Relevance:

100.00%

Publisher:

Abstract:

The construction of timelines of computer activity is a part of many digital investigations. These timelines of events are composed of traces of historical activity drawn from system logs and potentially from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work introduces a software tool (CAT Detect) for the detection of inconsistency within timelines of computer activity. We examine the impact of deliberate tampering through experiments conducted with our prototype software tool. Based on the results of these experiments, we discuss techniques which can be employed to deal with such temporal inconsistencies.
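As a toy illustration of the kind of temporal inconsistency such a tool looks for, the sketch below flags timeline events whose recorded timestamps contradict known causal ordering (for example, a file modified before it was created). The event names and constraint format are hypothetical assumptions for illustration, not CAT Detect's actual design.

```python
from datetime import datetime

def find_inconsistencies(events, constraints):
    """Return the causal constraints (a, b) whose recorded timestamps
    violate 'a precedes b'. A violation suggests either errant system
    behaviour or deliberate timestamp tampering."""
    times = {event_id: ts for event_id, ts, _ in events}
    return [(a, b) for a, b in constraints
            if a in times and b in times and times[a] > times[b]]

# Hypothetical timeline entries: (event_id, timestamp, description)
events = [
    ("file_created", datetime(2024, 5, 2, 10, 0), "report.doc created"),
    ("file_modified", datetime(2024, 5, 1, 9, 0), "report.doc modified"),
]
# Causal knowledge: creation must precede modification
constraints = [("file_created", "file_modified")]
```

Running the checker on these two events reports the single violated constraint, since the modification timestamp precedes the creation timestamp.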

Relevance:

30.00%

Publisher:

Abstract:

Digital forensic examiners often need to identify the type of a file or file fragment based only on the content of the file. Content-based file type identification schemes typically use a byte frequency distribution with statistical machine learning to classify file types. Most algorithms analyze the entire file content to obtain the byte frequency distribution, a technique that is inefficient and time-consuming. This paper proposes two techniques for reducing the classification time. The first technique selects a subset of features based on the frequency of occurrence. The second speeds classification by sampling several blocks from the file. Experimental results demonstrate that up to a fifteen-fold reduction in classification time can be achieved with limited impact on accuracy.
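The block-sampling idea described above can be sketched as follows: instead of scanning a whole file to build the byte frequency distribution used as a feature vector, only a few fixed-size blocks are sampled. The function names, block size and sample count below are illustrative assumptions, not the paper's actual parameters.

```python
import random

def byte_frequency(data):
    """Normalised 256-bin byte frequency distribution of a byte string."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    total = len(data) or 1
    return [c / total for c in counts]

def sampled_bfd(data, block_size=512, n_blocks=4, seed=0):
    """Estimate the byte frequency distribution from a few randomly
    sampled blocks rather than the entire file content."""
    if len(data) <= block_size * n_blocks:
        return byte_frequency(data)  # small file: just scan it all
    rng = random.Random(seed)
    starts = [rng.randrange(len(data) - block_size) for _ in range(n_blocks)]
    sample = b"".join(data[s:s + block_size] for s in starts)
    return byte_frequency(sample)
```

The resulting 256-element vector would then be fed to whatever statistical classifier is in use; the saving comes from reading a constant amount of data regardless of file size.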

Relevance:

30.00%

Publisher:

Abstract:

In the companion paper, a fourth-order element formulation in an updated Lagrangian formulation was presented to handle geometric non-linearities. The formulation of the present paper extends this to include material non-linearity by proposing a refined plastic hinge approach to analyse large steel framed structures with many members, for which contemporary algorithms based on the plastic zone approach can be computationally problematic. This concept is an advancement of conventional plastic hinge approaches, as the refined plastic hinge technique allows for gradual yielding (recognised as distributed plasticity across the element section) and a condition of full plasticity, as well as including strain hardening. It is founded on interaction yield surfaces specified analytically in terms of force resultants, and achieves accurate and rapid convergence for large frames in which geometric and material non-linearity are significant. The solutions are shown to be efficacious in terms of a balance of accuracy and computational expediency. In addition to its numerical efficiency, the present versatile approach is able to capture different kinds of material and geometric non-linearity in general applications of steel structures, and thereby offers an efficacious and accurate means of assessing the non-linear behaviour of such structures in engineering practice.

Relevance:

30.00%

Publisher:

Abstract:

Level design is often characterised as “where the rubber hits the road” in game development. It is a core area of games design, alongside the design of game rules and narrative. However, there is a lack of literature dedicated to documenting the teaching of games design, let alone the more specialised topic of level design. Furthermore, there is a lack of formal frameworks for best practice in level design, as professional game developers often rely on intuition and previous experience. As a result, there is little for games design teachers to draw on when presented with the opportunity to teach a level design unit. In this paper, we discuss the design and implementation of a games level design unit in which students use the StarCraft II Galaxy Editor. We report on two cycles of an action research project, reflecting upon our experiences with respect to student feedback and peer review, and outlining our plans for improving the unit in years to come.

Relevance:

30.00%

Publisher:

Abstract:

In this chapter we continue the exposition of crypto topics that was begun in the previous chapter. This chapter covers secret sharing, threshold cryptography, signature schemes, and finally quantum key distribution and quantum cryptography. As in the previous chapter, we have focused only on the essentials of each topic. We have selected in the bibliography a list of representative items, which can be consulted for further details. First we give a synopsis of the topics that are discussed in this chapter. Secret sharing is concerned with the problem of how to distribute a secret among a group of participating individuals, or entities, so that only predesignated collections of individuals are able to recreate the secret by collectively combining the parts of the secret that were allocated to them. There are numerous applications of secret-sharing schemes in practice. One example of secret sharing occurs in banking. For instance, the combination to a vault may be distributed in such a way that only specified collections of employees can open the vault by pooling their portions of the combination. In this way the authority to initiate an action, e.g., the opening of a bank vault, is divided for the purposes of providing security and for added functionality, such as auditing, if required. Threshold cryptography is a relatively recently studied area of cryptography. It deals with situations where the authority to initiate or perform cryptographic operations is distributed among a group of individuals. Many of the standard operations of single-user cryptography have counterparts in threshold cryptography. Signature schemes deal with the problem of generating and verifying electronic signatures for documents. A subclass of signature schemes is concerned with the shared generation and the shared verification of signatures, where a collaborating group of individuals are required to perform these actions.
A new paradigm of security has recently been introduced into cryptography with the emergence of the ideas of quantum key distribution and quantum cryptography. While classical cryptography employs various mathematical techniques to restrict eavesdroppers from learning the contents of encrypted messages, in quantum cryptography the information is protected by the laws of physics.
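The secret-sharing idea sketched above can be made concrete with a classic (t, n) threshold scheme, Shamir's, in which any t of n shares reconstruct the secret and fewer reveal nothing. The prime and parameters below are illustrative choices for a minimal sketch, not values taken from the chapter.

```python
import random

PRIME = 2**31 - 1  # a Mersenne prime; the field must be larger than the secret

def split_secret(secret, t, n, prime=PRIME):
    """Split `secret` into n shares; any t of them recover it.
    A random degree-(t-1) polynomial has the secret as constant term."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime)
            for x in range(1, n + 1)]

def recover_secret(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % prime
                den = den * (xi - xj) % prime
        # modular inverse of den via Fermat's little theorem
        secret = (secret + yi * num * pow(den, prime - 2, prime)) % prime
    return secret

shares = split_secret(1234, t=3, n=5)
```

In the bank-vault example above, the five shares would go to five employees, and any three of them pooling their shares could open the vault.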

Relevance:

30.00%

Publisher:

Abstract:

Macrophonics II presents new Australian work emerging from the leading edge of performance interface research. The program addresses the dialogue between traditional and emerging digital media, as well as dialogues across a broad range of musical traditions. Recent technological developments are prompting a complete re-evaluation of the relationships between media and genres in art, and Macrophonics II presents a cross-section of responses to this situation. Works in the program foreground an approach to performance that integrates sensors with novel performance control devices, and/or examine how machines can be made musical in performance. The program presents works by Australian artists Donna Hewitt, Julian Knowles and Wade Marynowsky, with choreography by Avril Huddy and dance performance by Lizzie and Zaimon Vilmanis. From sensor-based microphones and guitars, through performance a/v, to post-rock dronescapes, movement-inspired works and experimental electronica, Macrophonics II provides a broad and engaging survey of new performance approaches in mediatised environments. Initial R&D for the work was supported by a range of institutions internationally, including the Australia Council for the Arts, Arts Queensland, STEIM (Holland) and the Nes Artist Residency (Iceland).

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the nature of interfaces to support people in accessing their files at tabletop displays embedded in the environment. To do this, we designed a study comparing people's interaction with two very different classes of file system access interface: Focus, explicitly designed for tabletops, and the familiar hierarchical Windows Explorer. In our within-subjects double-crossover study, participants collaborated on 4 planning tasks. Based on video, logs, questionnaires and interviews, we conclude that both classes of interface have a place. Notably, Focus contributed to improved collaboration and more efficient use of the workspace than Explorer did. Our results inform a set of recommendations for future interfaces enabling this important class of interaction: supporting access to files for collaboration at tabletop devices embedded in a ubicomp environment.

Relevance:

30.00%

Publisher:

Abstract:

Multi-objective optimization is an active field of research with broad applicability in aeronautics. This report details a variant of the original NSGA-II software aimed at improving the performance of this widely used genetic algorithm in finding the optimal Pareto front of a multi-objective optimization problem, for use in UAV and aircraft design and optimization. The original NSGA-II works on a population of predetermined constant size, and its computational cost to evaluate one generation is O(mn^2), where m is the number of objective functions and n the population size. The basic idea motivating this work is to reduce the computational cost of the NSGA-II algorithm by making it work on a population of variable size, in order to obtain better convergence towards the Pareto front in less time. In this work, several test functions are evaluated with both the original NSGA-II and VPNSGA-II algorithms; each test is timed in order to obtain a measure of the computational cost of each trial, and the results are compared.
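The O(mn^2) per-generation cost mentioned above comes from NSGA-II's fast non-dominated sorting step, which compares every pair of the n individuals across all m objectives to partition the population into Pareto fronts. A minimal sketch of that step follows (minimisation convention assumed; this illustrates the standard algorithm, not the report's variable-population variant):

```python
def dominates(a, b):
    """a dominates b (minimisation): no worse in every objective,
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(pop):
    """Fast non-dominated sorting: the pairwise comparison of n
    individuals over m objectives gives the O(m n^2) cost."""
    n = len(pop)
    dominated_by_p = [[] for _ in range(n)]  # indices each individual dominates
    counts = [0] * n                         # how many individuals dominate p
    fronts = [[]]
    for p in range(n):
        for q in range(n):
            if dominates(pop[p], pop[q]):
                dominated_by_p[p].append(q)
            elif dominates(pop[q], pop[p]):
                counts[p] += 1
        if counts[p] == 0:
            fronts[0].append(p)  # p is on the Pareto front
    i = 0
    while fronts[i]:
        nxt = []
        for p in fronts[i]:
            for q in dominated_by_p[p]:
                counts[q] -= 1
                if counts[q] == 0:
                    nxt.append(q)
        i += 1
        fronts.append(nxt)
    return fronts[:-1]
```

Because this step is quadratic in n, letting the population size vary (as VPNSGA-II does) directly changes the cost of every generation, which is what makes the population size a useful lever for convergence speed.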