225 results for Forensics


Relevance: 20.00%

Abstract:

Digital forensics isn't commonly a part of an undergraduate university degree, but Deakin University in Australia recently introduced the subject as part of an IT security course. As instructors, we've found that digital forensics complements our other security offerings because it affords insights into why and how security fails. A basic part of this course is an ethics agreement signed by students and submitted to the unit instructor. This agreement, approved by Deakin University's legal office and consistent with Barbara Endicott-Popovsky's approach, requires students to maintain a professional and ethical attitude to the subject matter and its applications. Assignments regularly cast students in the role of forensic professional. Our teaching team emphasizes throughout the course that professional conduct establishes credibility with employers and customers as well as colleagues, and is required to perform the job effectively. This article describes our experiences with this course.

Relevance: 20.00%

Abstract:

Network forensics is a branch of digital forensics that has recently evolved into an important discipline for monitoring and analysing network traffic, particularly for the purposes of tracing intrusions and attacks. This paper presents an analysis of the tools and techniques used in network forensic analysis. It further examines the application of network forensics to vital areas such as malware and network attack detection; IP traceback and honeypots; and intrusion detection. Finally, the paper addresses new and emerging areas of network forensic development, including critical infrastructure forensics, wireless network forensics, and applications to social networking. © 2012 IEEE.
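As a toy illustration of the flow-level analysis such tools perform, the sketch below flags a port-scan-like pattern in hypothetical connection records; the record format, IP addresses and threshold are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical flow records: (src_ip, dst_ip, dst_port) tuples, e.g. exported
# from a packet capture. A burst of distinct destination ports from one source
# is a classic port-scan signature used in network forensic analysis.
flows = [
    ("10.0.0.5", "10.0.0.9", p) for p in range(20, 40)   # scanner probing ports
] + [
    ("10.0.0.7", "10.0.0.9", 443),                        # ordinary HTTPS client
    ("10.0.0.7", "10.0.0.9", 443),
]

def flag_port_scans(flows, threshold=10):
    """Return source IPs that contacted more than `threshold` distinct targets."""
    targets_per_src = defaultdict(set)
    for src, dst, dport in flows:
        targets_per_src[src].add((dst, dport))
    return [src for src, targets in targets_per_src.items()
            if len(targets) > threshold]

print(flag_port_scans(flows))  # ['10.0.0.5']
```

Real network forensic suites apply many such heuristics over captured traffic and correlate the hits, but the principle of aggregating per-source behaviour is the same.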

Relevance: 20.00%

Abstract:

Background: Illegal hunting is one of the major threats to vertebrate populations in tropical regions. This unsustainable practice has serious consequences not only for the target populations, but also for the dynamics and structure of tropical ecosystems. Generally, in cases of suspected illegal hunting, the only evidence available is pieces of meat, skin or bone. In these cases, species identification can only be reliably determined using molecular technologies. Here, we report an investigative study of three cases of suspected wildlife poaching in which molecular biology techniques were employed to identify the hunted species from remains of meat.

Findings: By applying cytochrome b (cyt-b) and cytochrome oxidase subunit I (COI) molecular markers, the suspected illegal poaching was confirmed through the identification of three wild species: capybara (Hydrochoerus hydrochaeris), Chaco Chachalaca (Ortalis canicollis) and Pampas deer (Ozotoceros bezoarticus). In Brazil, hunting is a criminal offense; based on this evidence, the defendants were found guilty and punished with fines, and may still be sentenced to prison for a period of 6 to 12 months.

Conclusions: The genetic analysis used in this investigative study was suitable for diagnosing the species killed and solving these criminal investigations. Molecular forensic techniques can therefore provide an important tool enabling local law enforcement agencies to apprehend illegal poachers. © 2012 Sanches et al.; licensee BioMed Central Ltd.
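The core of marker-based species identification is comparing a query sequence against reference barcodes and reporting the closest match above an identity threshold. The sketch below shows that idea in miniature; the sequences are invented stand-ins, as real casework aligns cyt-b/COI fragments against curated databases such as GenBank or BOLD:

```python
# Minimal sketch of DNA-barcode species identification on short, already
# aligned fragments. The reference sequences below are invented, not real
# cyt-b/COI data.
references = {
    "Hydrochoerus hydrochaeris": "ATGACCAACATTCGAAAAT",
    "Ozotoceros bezoarticus":    "ATGACTAACATCCGGAAAA",
}

def percent_identity(a, b):
    """Fraction of matching bases between two equal-length aligned sequences."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def identify(query, refs, threshold=0.95):
    """Return the best-matching reference species above the identity threshold."""
    best = max(refs, key=lambda sp: percent_identity(query, refs[sp]))
    score = percent_identity(query, refs[best])
    return (best, score) if score >= threshold else (None, score)

species, score = identify("ATGACCAACATTCGAAAAT", references)
print(species)  # Hydrochoerus hydrochaeris
```

The identity threshold matters in practice: a match below it should be reported as inconclusive rather than forced onto the nearest species.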

Relevance: 20.00%

Abstract:

This PhD thesis discusses the impact of Cloud Computing infrastructures on Digital Forensics in their twofold role as a target of investigations and as a helping hand to investigators. The Cloud offers cheap and almost limitless computing power and storage space, which can be leveraged to commit new or old crimes and to host related traces. Conversely, the Cloud can help forensic examiners find clues better and earlier than traditional analysis applications can, thanks to its dramatically improved evidence-processing capabilities. In both cases, a new arsenal of software tools needs to be made available. The development of this novel weaponry, and its technical and legal implications from the point of view of the repeatability of technical assessments, is discussed throughout the following pages and constitutes the unprecedented contribution of this work.

Relevance: 20.00%

Abstract:

The digital revolution of recent years has greatly improved our lives, giving us immediate access to services and vast amounts of information over the Internet. Nowadays users are increasingly asked to enter sensitive information online, leaving traces everywhere. But some categories of people cannot risk revealing their identities on the Internet. Although originally created to protect U.S. intelligence communications online, Tor is today the best-known low-latency network guaranteeing both the anonymity and the privacy of its users. The aim of this thesis project is to understand in depth how the Tor protocol works, not only by studying its theory but also by putting those concepts into practice, with particular attention to security. A virtual testing environment was configured to run a private Tor network that emulates the real one, allowing experiments to be conducted without putting the anonymity and privacy of real users at risk. We used a Tor patch that stores TLS and circuit keys, which are fed to a Tor dissector for Wireshark to obtain decrypted and decoded traffic. Observing cleartext traffic allowed us to verify the protocol outline and confirm the format of each cell. These tools also allowed us to identify a traffic pattern, which we used to conduct a traffic correlation attack that passively deanonymizes hidden service clients. An attacker controlling two nodes of the Tor network can link a request for a given hidden service to the client who made it, deanonymizing that client. The robustness of the traffic pattern and the attack's statistics, such as its true positive and false positive rates, are left as future work.
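The intuition behind such a traffic correlation attack can be sketched with per-interval packet counts: an attacker observing both a client-side node and a service-side node compares the two timing series, and a high correlation links the flows. The counts below are invented toy data; real attacks use far finer-grained features over many cells:

```python
# Toy sketch of the timing-correlation idea: compare per-interval cell counts
# seen at two attacker-controlled Tor nodes. A Pearson coefficient near 1
# suggests the two observations belong to the same circuit. All volumes here
# are invented for illustration.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

entry_counts   = [12, 0, 7, 30, 2, 15, 0, 9]    # cells/interval at entry node
service_counts = [11, 1, 8, 29, 2, 14, 0, 10]   # same circuit, other end
unrelated      = [3, 22, 0, 5, 17, 1, 8, 4]     # a different client's flow

print(round(pearson(entry_counts, service_counts), 3))  # close to 1.0
print(round(pearson(entry_counts, unrelated), 3))       # much lower
```

This is also why the thesis stresses true and false positive rates: with enough concurrent circuits, even a rarely-misfiring correlator produces false links, so thresholds must be validated statistically.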

Relevance: 20.00%

Abstract:

Non-invasive documentation methods such as surface scanning and radiological imaging are gaining in importance in the forensic field. These three-dimensional technologies provide digital 3D data, which are processed and handled on the computer. However, the sense of touch is lost in the virtual approach. A haptic device restores the sense of touch for handling and feeling digital 3D data. The multifunctional application of a haptic device for forensic purposes is evaluated and illustrated in three different cases: the non-invasive representation of bone fractures of the lower extremities caused by traffic accidents; the comparison of bone injuries with the presumed injury-inflicting instrument; and, in a gunshot case, the identification of the gun by its muzzle imprint and the reconstruction of the holding position of the gun. The 3D models of the bones are generated from Computed Tomography (CT) images. The 3D models of the exterior injuries, the injury-inflicting tools and the bone injuries, where higher resolution is necessary, are created by optical surface scanning. The haptic device is used in combination with the software FreeForm Modelling Plus to touch the surface of the 3D models, to feel minute injuries and the surface of tools, to reposition displaced bone parts and to compare an injury-causing instrument with an injury. The repositioning of 3D models in a reconstruction is easier, faster and more precise when using the sense of touch, together with user-friendly movement in 3D space. For representation purposes, the fracture lines of bones are coloured. This work demonstrates that the haptic device is a suitable and efficient tool in forensic science, offering a new way of handling digital data in virtual 3D space.
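Under the hood, repositioning a displaced fragment is a rigid transform of its mesh vertices, which the haptic arm drives interactively. A minimal sketch of that geometric core, with invented coordinates (the actual FreeForm workflow is of course far richer):

```python
import math

# Minimal sketch of the geometry behind repositioning a displaced bone
# fragment: apply a rigid transform (rotation + translation) to the fragment's
# 3D vertices. Coordinates below are invented toy values.
def rotate_z(p, angle):
    """Rotate point p = (x, y, z) about the z-axis by `angle` radians."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

def reposition(vertices, angle, offset):
    """Rigidly move a fragment: rotate about z, then translate by offset."""
    ox, oy, oz = offset
    return [(x + ox, y + oy, z + oz)
            for x, y, z in (rotate_z(v, angle) for v in vertices)]

fragment = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
moved = reposition(fragment, math.pi / 2, (5.0, 0.0, 0.0))
print(moved)  # [(5.0, 1.0, 0.0), (4.0, ~0.0, 0.0)]
```

The haptic device's contribution is not the transform itself but the force feedback that tells the operator when repositioned surfaces touch, which a mouse-driven interface cannot convey.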

Relevance: 20.00%

Abstract:

A handwritten list of forensic and syllogistic questions compiled between 1789 and 1791 on two pages, and a half-page list of the questions for June 1800.

Relevance: 20.00%

Abstract:

A handwritten list of the forensic questions delivered on Commencement and Exhibition Days between 1811 and 1822.

Relevance: 20.00%

Abstract:

A handwritten list of the forensic questions delivered on Commencement and Exhibition Days between 1823 and 1824.

Relevance: 20.00%

Abstract:

Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges. Some of the most prominent of these challenges are the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not aided by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reduced time and human effort, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture designed for an automated system that performs digital forensics in highly networked mobile and cloud environments.
Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition helps to improve the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition, and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
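The multi-TCP-connection acquisition idea can be sketched as a toy client/server pair: the examiner pulls byte ranges of an image over several parallel connections instead of streaming it through one. Everything here, including the wire format, chunk size and the in-memory "image", is invented for illustration; the actual LEIA prototype is far more involved:

```python
import socket
import threading

# Toy multi-connection remote acquisition: a 'device' serves a memory image,
# and the examiner fetches fixed-size byte ranges over parallel TCP sockets.
IMAGE = bytes(range(256)) * 64          # stand-in for evidence data (16 KiB)
CHUNK = len(IMAGE) // 4                 # four parallel range requests

def serve(sock):
    while True:
        conn, _ = sock.accept()
        with conn:
            offset = int(conn.recv(32).decode())          # request: byte offset
            conn.sendall(IMAGE[offset:offset + CHUNK])    # reply: one chunk

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen()
port = server.getsockname()[1]
threading.Thread(target=serve, args=(server,), daemon=True).start()

def fetch(offset, out):
    """Pull one chunk on its own TCP connection."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(str(offset).encode())
        buf = b""
        while len(buf) < CHUNK:
            data = c.recv(4096)
            if not data:
                break
            buf += data
        out[offset] = buf

chunks = {}
threads = [threading.Thread(target=fetch, args=(o, chunks))
           for o in range(0, len(IMAGE), CHUNK)]
for t in threads:
    t.start()
for t in threads:
    t.join()

acquired = b"".join(chunks[o] for o in sorted(chunks))
print(acquired == IMAGE)  # True
```

Beyond parallelism, the practical advantage found in the experiments is fault isolation: if one connection stalls or drops, only its range needs re-fetching, rather than restarting a single long-lived stream.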

Relevance: 20.00%

Abstract:

Current debate within forensic authorship analysis has tended to polarise the field between those who argue that analysis methods should reflect a strong cognitive theory of idiolect and those who see less need to look behind the stylistic variation of the texts they are examining. This chapter examines theories of idiolect and asks how useful or necessary they are to the practice of forensic authorship analysis. Taking a specific text messaging case, the chapter demonstrates that methodologically rigorous, theoretically informed authorship analysis need not appeal to cognitive theories of idiolect in order to be valid. By considering text messaging forensics, lessons are drawn which can contribute to wider debates on the role of theories of idiolect in forensic casework.
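A purely style-based comparison of the kind the chapter defends can be sketched without any cognitive assumptions: profile texts by relative frequencies of a few surface features and compare the profiles. The feature list and messages below are invented toy data, not the chapter's method or case material:

```python
import math
from collections import Counter

# Minimal stylometric sketch: profile texts by relative frequencies of a small
# set of function words / texting shorthand, then compare profiles by cosine
# similarity. Real casework uses far richer, validated feature sets.
FUNCTION_WORDS = ["the", "and", "of", "to", "a", "i", "u", "im"]

def profile(text):
    words = Counter(text.lower().split())
    total = max(sum(words.values()), 1)
    return [words[w] / total for w in FUNCTION_WORDS]

def cosine(p, q):
    dot = sum(x * y for x, y in zip(p, q))
    norm = math.sqrt(sum(x * x for x in p)) * math.sqrt(sum(y * y for y in q))
    return dot / norm if norm else 0.0

known = "im on the bus i will c u at the shop"       # known-author message
query = "im at the pub c u there i wont be long"     # disputed message
other = "I shall be delayed and will therefore arrive after the meeting"

print(cosine(profile(known), profile(query)) > cosine(profile(known), profile(other)))
```

Nothing in this comparison claims anything about the writer's mental lexicon; it measures observable stylistic variation only, which is precisely the methodological point at issue.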

Relevance: 20.00%

Abstract:

Passive samplers are not only a versatile tool for integrating environmental concentrations of pollutants, but also a way to avoid using live sentinel organisms for environmental monitoring. This study introduced magnetic silicone polymer composites (Fe-PDMS) as a passive sampling medium to pre-concentrate a wide range of analytes from environmental settings. The composite samplers were assessed for their accumulation properties in lab experiments with two model herbicides (Atrazine and Irgarol 1051) and evaluated for their uptake properties in environmental settings (waters and sediments). The Fe-PDMS composites showed good accumulation of herbicides and pesticides from both freshwater and saltwater settings, and the accumulation was positively correlated with the log Kow value of each analyte. Results from the studies show that these composites could easily be used for a wide range of applications, such as monitoring, cleanup and bioaccumulation modelling, and as a non-intrusive, non-destructive monitoring tool for environmental forensic purposes.
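The reported positive correlation between uptake and hydrophobicity is the kind of relationship one would quantify with a simple least-squares fit of log-accumulation against log Kow. The sketch below shows that calculation; the numeric values are invented for illustration and are not the study's data:

```python
# Minimal sketch: fit a least-squares line of log-accumulation vs. log Kow to
# check the sign and strength of the uptake-hydrophobicity relationship.
# All values below are invented stand-ins, not measurements from the study.
data = {                      # analyte: (log Kow, log accumulated amount)
    "Atrazine":     (2.6, 1.1),
    "Irgarol 1051": (3.9, 2.0),
    "Analyte C":    (4.5, 2.5),
    "Analyte D":    (5.2, 3.1),
}

def fit_line(points):
    """Ordinary least-squares slope and intercept for (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

slope, intercept = fit_line(list(data.values()))
print(slope > 0)  # positive slope: uptake rises with hydrophobicity
```

A positive slope is exactly what "positively correlated with log Kow" asserts: more hydrophobic analytes partition more strongly into the PDMS phase.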

Relevance: 20.00%

Abstract:

This work explores the development of MemTri, a memory forensics triage tool that can assess the likelihood of criminal activity in a memory image based on evidence artefacts generated by several applications. Fictitious illegal-suspect-activity scenarios were performed on virtual machines to generate 60 test memory images as input to MemTri. Four categories of applications (Internet browsers, instant messengers, an FTP client and document processors) are examined for data artefacts located through the use of regular expressions. The identified artefacts are then analysed using a Bayesian network to assess the likelihood that a seized memory image contains evidence of illegal activity. MemTri is currently under development, and this paper introduces only the basic concept and the components on which the application is built. A complete description of MemTri, together with extensive experimental results, is expected to be published in the first half of 2017.
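The two-stage pipeline described, regex extraction followed by probabilistic scoring, can be sketched in miniature. A naive Bayes update stands in here for the paper's Bayesian network, and the patterns, likelihood values and memory-image bytes are all invented for illustration:

```python
import re

# Toy sketch of MemTri's two stages: (1) regex extraction of artefacts from a
# memory image, (2) probabilistic scoring of illegal activity. The patterns
# and probabilities below are invented assumptions, not MemTri's actual model.
PATTERNS = {
    "browser_url": rb"https?://[\w./-]+",
    "ftp_cmd":     rb"(?:RETR|STOR) [\w./-]+",
}
# (P(artefact | illegal), P(artefact | legal)) -- assumed values
LIKELIHOODS = {"browser_url": (0.9, 0.6), "ftp_cmd": (0.7, 0.2)}

def score(image, prior=0.5):
    """Posterior probability of illegal activity given artefact hits."""
    p_illegal, p_legal = prior, 1 - prior
    for name, pat in PATTERNS.items():
        if re.search(pat, image):        # artefact found in the memory image
            li, ll = LIKELIHOODS[name]
            p_illegal *= li
            p_legal *= ll
    return p_illegal / (p_illegal + p_legal)

image = b"...GET https://drop.example/loot.zip ... STOR secrets.db ..."
print(round(score(image), 2))  # 0.84
```

A full Bayesian network additionally models dependencies between artefacts (e.g. a browser URL and a matching download artefact are not independent), which the naive product above deliberately ignores for brevity.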