953 results for computer forensics tools
Abstract:
Digital forensics concerns the analysis of electronic artifacts to reconstruct events such as cyber crimes. This research produced a framework to support forensic analyses by identifying associations in digital evidence using metadata. It showed that metadata-based associations can help uncover the inherent relationships between heterogeneous digital artifacts, thereby aiding the reconstruction of past events by identifying artifact dependencies and time sequencing. It also showed that metadata-association-based analysis is amenable to automation by virtue of the ubiquitous nature of metadata across forensic disk images, files, system and application logs, and network packet captures. The results demonstrate that metadata-based associations can be used to extract meaningful relationships between digital artifacts, thus potentially benefiting real-life forensic investigations.
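The core idea of a metadata-based association can be sketched in a few lines: artifacts that share a value on some metadata field (owner, timestamp, host) are linked. The following is a minimal illustration, not the paper's actual framework; the artifact names and fields are hypothetical.

```python
from collections import defaultdict
from itertools import combinations

def metadata_associations(artifacts, fields):
    """Link artifacts that share a value on any of the given metadata fields."""
    index = defaultdict(list)
    for name, meta in artifacts.items():
        for field in fields:
            if field in meta:
                index[(field, meta[field])].append(name)
    pairs = set()
    for (_field, _value), names in index.items():
        for a, b in combinations(sorted(names), 2):
            pairs.add((a, b, _field))
    return pairs

# Toy evidence set: a document, a log entry and a packet-capture record.
artifacts = {
    "report.docx": {"owner": "alice", "mtime": "2021-03-01T10:05"},
    "auth.log#77": {"owner": "alice", "mtime": "2021-03-01T09:58"},
    "pkt#4021":    {"src_host": "ws12", "mtime": "2021-03-01T10:05"},
}
links = metadata_associations(artifacts, ["owner", "mtime"])
# links relate the document to the log entry (shared owner)
# and to the packet record (shared modification time).
```

In a real investigation the same grouping would run over metadata extracted from disk images, logs and packet captures, which is what makes the approach amenable to automation.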
Abstract:
Water management is vital for mine sites, both for production and for sustainability-related issues. Effective water management is a complex task since the role of water on mine sites is multifaceted. Computer models are tools that represent mine-site water interactions and can be used by mine sites to inform or evaluate their water management strategies. Several types of models can be used to represent mine-site water interactions. This paper presents three such models: an operational model, an aggregated systems model and a generic systems model. For each model the paper provides a description and example, followed by an analysis of its advantages and disadvantages. The paper hypothesizes that, since no model is optimal for all situations, each model should be applied in the situations where it is most appropriate based upon the scale of water interactions being investigated: unit (operational), inter-site (aggregated systems) or intra-site (generic systems).
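At its simplest, a systems-style water model is a mass balance stepped through time. The sketch below is a toy single-store balance under assumed inflow/outflow figures, not any of the three models described in the paper.

```python
def simulate_store(initial, steps):
    """Toy water balance for one mine-site store (volumes in ML).
    Each step: volume += total inflows - total outflows, floored at zero."""
    volume = initial
    history = [volume]
    for inflows, outflows in steps:
        volume = max(0.0, volume + sum(inflows) - sum(outflows))
        history.append(volume)
    return history

# Two time steps: (rainfall runoff, groundwater) in; (evaporation, plant demand) out.
history = simulate_store(100.0, [
    ((30.0, 5.0), (10.0, 40.0)),   # net change: -15 ML
    ((2.0, 5.0), (12.0, 40.0)),    # net change: -45 ML
])
# history == [100.0, 85.0, 40.0]
```

An aggregated systems model would chain many such stores and flows together; a generic systems model would let the site define the stores and connections itself.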
Abstract:
Molecular biology is a scientific discipline whose character has changed fundamentally over the past decade, coming to rely on large-scale datasets (public and locally generated) and their computational analysis and annotation. Undergraduate education of biologists must increasingly couple this domain context with a data-driven computational scientific method. Yet modern programming and scripting languages, and rich computational environments such as R and MATLAB, present significant barriers to those with limited exposure to computer science, and may require substantial tutorial assistance over an extended period if progress is to be made. In this paper we report our experience of undergraduate bioinformatics education using the familiar, ubiquitous spreadsheet environment of Microsoft Excel. We describe a configurable extension called QUT.Bio.Excel, a custom ribbon supporting a rich set of data sources, external tools and interactive processing within the spreadsheet, and a range of problems that demonstrate its utility and success in addressing the needs of students over their studies.
Abstract:
To provide cardholder authentication during electronic transactions conducted on mobile devices, VISA and MasterCard independently proposed two electronic payment protocols: Visa 3-D Secure and MasterCard SecureCode. The protocols use pre-registered passwords to provide cardholder authentication, Secure Socket Layer/Transport Layer Security (SSL/TLS) for data confidentiality over wired networks, and Wireless Transport Layer Security (WTLS) between a wireless device and a Wireless Application Protocol (WAP) gateway. The paper presents our analysis of the security properties of the proposed protocols using the formal-methods tools Casper and FDR2. We also highlight issues concerning payment security in the proposed protocols.
Abstract:
Semiconductor III-V quantum dots (QDs) are particularly enticing components for the integration of optically promising III-V materials with the silicon technology prevalent in the microelectronics industry. However, defects due to deviations from a stoichiometric composition [group III : group V = 1] may lead to impaired device performance. This paper investigates the initial stages of formation of InSb and GaAs QDs on Si(100) through hybrid numerical simulations. Three situations are considered: a neutral gas environment (NG), and two ionized gas environments, namely a localized ion source (LIS) and a background plasma (BP). It is shown that when the growth is conducted in an ionized gas environment, a stoichiometric composition may be obtained earlier in the QD than in a NG. Moreover, the stoichiometrization time, t_st, is shorter for the BP case than for the LIS scenario. A discussion of the effect of ion/plasma-based tools, as well as of a range of process conditions, on the final island size distribution is also included. Our results suggest a way to obtain a deterministic level of control over nanostructure properties (in particular, elemental composition and size) during the initial stages of growth, which is a crucial step towards achieving highly tailored QDs suitable for implementation in advanced technological devices.
Abstract:
Over roughly the last decade, people involved in game development have noted the need for more formal models and tools to support the design phase of games. It is argued that the present lack of such formal tools is hindering knowledge transfer among designers. Formal visual languages, on the other hand, can help to more effectively express, abstract and communicate game design concepts. Moreover, formal tools can assist in the prototyping phase, allowing designers to reason about and simulate game mechanics at an abstract level. In this paper we present an initial investigation into whether workflow patterns, which have already proven effective for modeling business processes, are a suitable way to model task succession in games. Our preliminary results suggest that workflow patterns show promise in this regard, but some limitations, especially regarding time constraints, currently restrict their potential.
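Task succession under workflow patterns such as sequence, parallel split and synchronization (AND-join) reduces to a simple rule: a task is enabled once all its predecessors have completed. The following is a minimal sketch with an invented quest graph, not the paper's model.

```python
def available(tasks, completed):
    """Tasks whose every predecessor is done (AND-join) and that are not yet done."""
    return {t for t, preds in tasks.items()
            if t not in completed and preds <= completed}

# Hypothetical quest graph: 'forge_key' synchronizes two parallel branches.
tasks = {
    "find_ore":   set(),
    "find_mould": set(),
    "forge_key":  {"find_ore", "find_mould"},
    "open_gate":  {"forge_key"},
}
assert available(tasks, set()) == {"find_ore", "find_mould"}        # parallel split
assert available(tasks, {"find_ore"}) == {"find_mould"}             # join still blocked
assert available(tasks, {"find_ore", "find_mould"}) == {"forge_key"}  # join fires
```

Time constraints (e.g. "forge the key before nightfall") are exactly what this enabling rule cannot express, which matches the limitation noted above.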
Abstract:
To prevent unauthorized access to protected trusted platform module (TPM) objects, authorization protocols such as the object-specific authorization protocol (OSAP) have been introduced by the Trusted Computing Group (TCG). Under OSAP, processes trying to gain access to protected TPM objects must prove their knowledge of the relevant authorization data before access can be granted. Chen and Ryan's 2009 analysis demonstrated OSAP's authentication vulnerability in sessions with shared authorization data. They also proposed the Session Key Authorization Protocol (SKAP), with fewer stages, as an alternative to OSAP, and their ProVerif analysis of SKAP proves its authentication property. The purpose of this paper is to examine the usefulness of Colored Petri Nets (CPN) and CPN Tools for security analysis. Using OSAP and SKAP as case studies, we construct intruder and authentication-property models in CPN. CPN Tools is used to verify the authentication property using a Dolev–Yao-based model. Verification of the authentication property in both models using the state space tool produces results consistent with those of Chen and Ryan.
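A state space tool, at its core, enumerates every reachable marking and checks a property in each. The sketch below is a drastically simplified illustration of that idea on an invented two-flag session model; it is not the paper's CPN model of OSAP or SKAP.

```python
from collections import deque

def reachable(initial, transitions):
    """Breadth-first enumeration of all reachable states, as a state space
    tool does for a Petri net's markings."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Toy session state: (knows_authdata, access_granted).
def transitions(state):
    knows, granted = state
    succ = []
    if not knows:
        succ.append((True, granted))   # legitimate process learns authdata
    if knows and not granted:
        succ.append((knows, True))     # proof of authdata accepted by the TPM
    return succ

states = reachable((False, False), transitions)
# Authentication property: access is never granted without knowing authdata.
violations = {s for s in states if s[1] and not s[0]}
assert violations == set()
```

In the actual analysis the states are CPN markings that include the intruder's (Dolev–Yao) knowledge, and the property check is performed over the full state space generated by CPN Tools.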
Abstract:
The term “human error” can simply be defined as an error made by a human. In fact, human error is an explanation for malfunctions and unintended consequences that arise from operating a system, and many factors can cause a person to err. The aim of this paper is to investigate the relationship between human error, as one contributing factor, and computer-related abuses. The paper begins by relating computer abuses to human errors, and then examines mechanisms for mitigating these errors from social and technical perspectives. We present the 25 techniques of computer crime prevention as a heuristic device to assist this analysis. A final section discusses ways of improving the adoption of security measures, followed by the conclusion.
Abstract:
Computer modelling has been used extensively in some processes in the sugar industry to achieve significant gains. This paper reviews the investigations carried out over approximately the last twenty-five years, including the successes but also the areas where problems and delays have been encountered. In that time the capabilities of both hardware and software have increased dramatically. For some processes, such as cane cleaning, cane billet preparation and sugar drying, the application of computer modelling towards improved equipment design and operation has been quite limited. A particular problem has been the large number of particles and particle interactions in these applications, which, if modelled individually, is computationally very intensive. Despite the problems, some attempts have already been made and knowledge gained on tackling these issues. Even where the detailed modelling is lacking, a model can still provide useful insights into the processes. One option for attacking these more intensive problems is the use of commercial software packages, which are usually very robust and allow the addition of user-supplied subroutines to adapt the software to particular problems. Suppliers of such software usually charge a fee per CPU licence, which is often problematic for large problems that require many CPUs. Another option is open-source software developed with the capability to access large parallel resources; such software has the added advantage of access to the full internal coding. This paper identifies and discusses in detail the software options with the potential capability to achieve improvements in the sugar industry.
Abstract:
Organisations have recently looked to design to become more customer oriented and to co-create a new kind of value and service offering. This requires changes in the organisational mindset, involving the entire company, its innovation processes and often its business model. One tool that has been successful in facilitating this is Osterwalder and Pigneur's (2010) Business Model Canvas and, more importantly, the design process that supports the use of this tool. The aim of this paper is to explore the role design tools play in the translation and facilitation of innovation in firms. Six Design Innovation Catalysts (Wrigley, 2013) were interviewed regarding their approach to and use of design tools to better facilitate innovation. Results highlight that the value of the tools extends beyond their intended use to include: facilitating communication, granting permission to think creatively, and enabling learning and teaching through visualisation. Findings from this research build upon the role of the Design Innovation Catalyst and provide additional implications for organisations.
Abstract:
This case study applied Weick's (1979) notion of sensemaking to support timely, quality doctoral completion. Taking a socio-cultural perspective, the paper explored how drivers can be applied to inform better fit (Durham, 1991). Global research themes, including growth in student numbers, timely completion, and the generation and distribution of research outcomes, are considered. It is argued that accessible and interactive web interfaces should be informed by quality assurance measures and key performance indicators. The contribution made is a better understanding of how these phenomena and contexts can be applied to generate quality management of research training environments and research outcomes in universities.
Abstract:
Interaction topologies in service-oriented systems are usually classified into two styles: choreographies and orchestrations. In a choreography, services interact in a peer-to-peer manner and no service plays a privileged role. In contrast, interactions in an orchestration occur between one particular service, the orchestrator, and a number of subordinated services. Each of these topologies has its trade-offs. This paper considers the problem of migrating a service-oriented system from a choreography style to an orchestration style. Specifically, the paper presents a tool chain for synthesising orchestrators from choreographies. Choreographies are initially represented as communicating state machines. Based on this representation, an algorithm is presented that synthesises the behaviour of an orchestrator, which is also represented as a state machine. Concurrent regions are then identified in the synthesised state machine to obtain a more compact representation in the form of a Petri net. Finally, it is shown how the resulting Petri nets can be transformed into notations supported by commercial tools, such as the Business Process Modelling Notation (BPMN).
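The essence of the synthesis step can be illustrated very simply: every peer-to-peer exchange in the choreography becomes a pair of mediated interactions through the orchestrator. The sketch below is a toy illustration of that mediation on an invented ordering scenario; the paper's actual algorithm works on communicating state machines and produces a state machine, later compacted into a Petri net.

```python
def synthesise_orchestrator(choreography):
    """Turn each direct message (sender, receiver, msg) into two mediated
    steps for the orchestrator: receive from sender, then forward to receiver."""
    steps = []
    for sender, receiver, msg in choreography:
        steps.append(("receive", sender, msg))
        steps.append(("send", receiver, msg))
    return steps

# Hypothetical choreography fragment: a Customer orders, the Supplier invoices.
choreography = [("Customer", "Supplier", "order"),
                ("Supplier", "Customer", "invoice")]
orchestrator = synthesise_orchestrator(choreography)
# [("receive", "Customer", "order"),   ("send", "Supplier", "order"),
#  ("receive", "Supplier", "invoice"), ("send", "Customer", "invoice")]
```

The hard part, which this sketch omits, is preserving the choreography's branching and concurrency: that is why the synthesised orchestrator is a state machine whose concurrent regions are then factored out into a Petri net.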
Abstract:
S. japonicum infection is believed to be endemic in 28 of the 80 provinces of the Philippines, and the most recent data on schistosomiasis prevalence show considerable variability between provinces. To improve the efficiency of parasitic disease control resource allocation in the country, we aimed to describe the small-scale spatial variation in S. japonicum prevalence across the Philippines, quantify the role of the physical environment in driving this spatial variation, and develop a predictive risk map of S. japonicum infection. Data on S. japonicum infection from 35,754 individuals across the country were geo-located at the barangay level and included in the analysis, which was then stratified geographically for Luzon, the Visayas and Mindanao. Zero-inflated binomial Bayesian geostatistical models of S. japonicum prevalence were developed, and diagnostic uncertainty was incorporated. The results show that in all three regions, males and individuals aged ≥20 years had significantly higher prevalence of S. japonicum than females and children <5 years. The role of the environmental variables differed between regions of the Philippines. S. japonicum infection was widespread in the Visayas, whereas it was much more focal in Luzon and Mindanao. This analysis revealed significant spatial variation in the prevalence of S. japonicum infection in the Philippines, suggesting that a spatially targeted approach to schistosomiasis interventions, including mass drug administration, is warranted. When financially possible, additional schistosomiasis surveys should be prioritized for areas identified as high risk but underrepresented in our dataset.
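The zero-inflated binomial likelihood used in such models mixes a point mass at zero (locations with no transmission) with an ordinary binomial for the number of positives among those tested. A minimal sketch of the probability mass function, with invented example parameters rather than the study's fitted values:

```python
from math import comb

def zib_pmf(k, n, p, pi):
    """Zero-inflated binomial: with probability pi the location is a
    structural zero; otherwise k positives among n tested ~ Binomial(n, p)."""
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    if k == 0:
        return pi + (1 - pi) * binom       # structural zero OR sampling zero
    return (1 - pi) * binom

# Hypothetical barangay: 10 tested, infection probability 0.2, zero-inflation 0.3.
p0 = zib_pmf(0, 10, 0.2, 0.3)
# p0 exceeds the plain binomial P(k=0) = 0.8**10, reflecting the excess zeros.
```

In the Bayesian geostatistical setting, p is further modelled on environmental covariates plus a spatial random effect, and diagnostic sensitivity/specificity are layered on top to incorporate diagnostic uncertainty.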