19 results for digital forensic tool testing

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Our paper presents the work of the Cuneiform Digital Forensic Project (CDFP), an interdisciplinary project at The University of Birmingham concerned with the development of a multimedia database to support scholarly research into cuneiform, the wedge-shaped writing imprinted onto clay tablets that is the earliest real form of writing. We describe the evolutionary design process and the dynamic research and development cycles associated with the database. Unlike traditional publications, the electronic publication of resources offers the possibility of almost continuous revision, with the integration and support of new media and interfaces. However, if on-line resources are to win the favor and confidence of their respective communities, there must be a clear distinction between published, maintained resources and developmental content. Published material should, ideally, be supported via standard web-browser interfaces with fully integrated tools, so that users receive a reliable, homogeneous and intuitive flow of information and media relevant to their needs. We discuss the inherent dynamics of the design and publication of our on-line resource, starting with the basic design and maintenance aspects of the electronic database, which includes photographic instances of cuneiform signs. We then show how the continuous review process identifies areas for further research and development, for example the “sign processor” graphical search tool and three-dimensional content, the results of which then feed back into the maintained resource.
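
As an illustration only, the following sketch shows how a database of photographic sign instances, and a sign lookup of the kind the “sign processor” performs, might be organised; the schema, table names and sign name are hypothetical placeholders, not the CDFP's actual design.

```python
import sqlite3

# Hypothetical schema for photographic cuneiform sign instances;
# all names are illustrative, not the CDFP's actual database design.
conn = sqlite3.connect("cuneiform.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS tablet (
    tablet_id   INTEGER PRIMARY KEY,
    museum_ref  TEXT,          -- catalogue reference of the physical tablet
    provenance  TEXT
);
CREATE TABLE IF NOT EXISTS sign_instance (
    instance_id INTEGER PRIMARY KEY,
    tablet_id   INTEGER REFERENCES tablet(tablet_id),
    sign_name   TEXT,          -- conventional sign designation
    image_path  TEXT           -- photographic instance of the sign
);
""")

# A "sign processor"-style lookup: all photographed instances of one sign.
rows = conn.execute(
    "SELECT image_path FROM sign_instance WHERE sign_name = ?", ("AN",)
).fetchall()
print(rows)
```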

Relevance: 40.00%

Abstract:

The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
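
To illustrate the iso-lux representation described above, the sketch below builds a contour plot from a two-dimensional illuminance array; the beam model and every value in it are synthetic stand-ins for scanned photodetector data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical 2D illuminance grid (lux) over horizontal/vertical beam
# angles, standing in for the array a scanned photodetector would produce.
h = np.linspace(-40, 40, 161)          # horizontal angle, degrees
v = np.linspace(-15, 15, 61)           # vertical angle, degrees
H, V = np.meshgrid(h, v)
E = 400 * np.exp(-((H + 5) ** 2 / 200 + (V + 2) ** 2 / 20))  # toy beam model

# Iso-lux diagram: contours of constant illuminance.
plt.contour(H, V, E, levels=[1, 5, 20, 50, 100, 200])
plt.xlabel("horizontal angle (deg)")
plt.ylabel("vertical angle (deg)")
plt.title("Iso-lux diagram (illustrative data)")
plt.show()
```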

Relevance: 40.00%

Abstract:

It has been recognised that rural economies throughout the UK have significant potential for growth, yet despite this potential many rural businesses face barriers that prevent their expansion. In this study, we focus on one particular group of rural small- to medium-sized enterprises (SMEs): food and drink producers. Through user engagement activities, we identify the issues and needs associated with distributing products to market, in order to understand the main issues preventing rural food and drink SMEs from expanding, and to establish the requirements for a digital solution to this challenge.

Relevance: 40.00%

Abstract:

The Joint Research Centre (JRC) of the European Commission has developed, in consultation with many partners, the Digital Observatory for Protected Areas (DOPA) as a global reference information system to support decision making on protected areas (PAs) and biodiversity conservation. The DOPA brings together the World Database on Protected Areas with other reference datasets on species, habitats, ecoregions, threats and pressures, to deliver critical indicators at country level and PA level that can inform gap analyses, PA planning and reporting. These indicators are especially relevant to Aichi Targets 11 and 12, and have recently contributed to CBD country dossiers and capacity building on these targets. DOPA also includes eConservation, a new module that provides a means to share and search information on conservation projects, and thus allows users to see “who is doing what where”. So far over 5000 projects from the World Bank, GEF, CEPF, EU LIFE Programme, CBD LifeWeb Initiative and others have been included, and these projects can be searched in an interactive mapping interface based on criteria such as location, objectives, timeframe, budget, the organizations involved, target species, etc. This seminar will provide an introduction to DOPA and eConservation, highlight how these services are used by the CBD and others, and include ample time for discussion.
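
As a rough illustration of the criteria-based project search that eConservation offers, here is a hedged sketch; the endpoint URL, parameter names and response fields are hypothetical placeholders, not DOPA's documented API.

```python
import requests

# Hedged sketch of querying a project-search service like eConservation.
# The base URL, parameters and response fields below are hypothetical
# placeholders, not DOPA's actual API.
BASE = "https://example.org/econservation/api"  # placeholder URL

resp = requests.get(
    f"{BASE}/projects",
    params={
        "country": "KE",                       # location criterion
        "min_budget": 100_000,                 # budget criterion
        "target_species": "Loxodonta africana",
    },
    timeout=30,
)
resp.raise_for_status()
for project in resp.json():                    # assumed list-of-dicts response
    print(project["title"], project["organisation"])
```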

Relevance: 30.00%

Abstract:

The judicial interest in ‘scientific’ evidence has driven recent work to quantify results for forensic linguistic authorship analysis. Through a methodological discussion and a worked example, this paper examines the issues which complicate attempts to quantify results in this area of work. The solution suggested to some of the difficulties is a sampling and testing strategy which helps to identify potentially useful, valid and reliable markers of authorship. An important feature of the sampling strategy is that markers identified as being generally valid and reliable are retested for use in specific authorship analysis cases. The suggested approach for drawing quantified conclusions combines discriminant function analysis with Bayesian likelihood measures. The worked example starts with twenty comparison texts for each of three potential authors and then uses a progressively smaller comparison corpus, reducing to fifteen, ten, five and finally three texts per author. This demonstrates how reducing the amount of data affects the way conclusions can be drawn. With greater numbers of reference texts, quantified and safe attributions are shown to be possible, but as the number of reference texts reduces, the analysis shows that the conclusion which should be reached is that no attribution can be made. At no point does the testing process result in a misattribution.
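
A minimal sketch of the kind of analysis described, combining discriminant function analysis with class posteriors used as likelihood-style measures; the stylometric features are simulated, and this is not the paper's actual procedure or data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hedged sketch: discriminant function analysis over stylometric feature
# rates for three candidate authors, twenty comparison texts each.
# All feature values are simulated, not real data.
rng = np.random.default_rng(0)
n_texts, n_features = 20, 5
X = np.vstack([rng.normal(loc=mu, size=(n_texts, n_features))
               for mu in (0.0, 0.5, 1.0)])        # three candidate authors
y = np.repeat(["A", "B", "C"], n_texts)

lda = LinearDiscriminantAnalysis().fit(X, y)

# For a questioned text, compare posterior probabilities across authors;
# a ratio near 1 between the top two candidates argues against attribution.
questioned = rng.normal(loc=1.0, size=(1, n_features))
post = lda.predict_proba(questioned)[0]
print(dict(zip(lda.classes_, post.round(3))))
```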

Relevance: 30.00%

Abstract:

Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue length distribution for the M/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
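
A minimal sketch, under stated assumptions, of the DSP-style transform inversion referred to above: a probability generating function (z-transform of a distribution) is sampled on the unit circle and inverted with a discrete Fourier transform. The geometric PGF is a test case with a known answer, not the thesis's M/G^Y/1 transform.

```python
import numpy as np

# Invert a probability generating function G(z) = sum_n P(n) z^n by
# sampling on the unit circle and taking a DFT -- a standard numerical
# inversion technique. Aliasing is negligible if P(n) decays well
# within n_points terms.
def invert_pgf(G, n_points=1024):
    z = np.exp(2j * np.pi * np.arange(n_points) / n_points)  # unit-circle samples
    # P(n) ~ (1/N) * sum_k G(z_k) * z_k^(-n), i.e. a forward DFT / N
    return np.fft.fft(G(z)).real / n_points

p = 0.3                                   # test case: geometric, P(n) = (1-p) p^n
probs = invert_pgf(lambda z: (1 - p) / (1 - p * z))
print(probs[:5])                          # ~ [0.7, 0.21, 0.063, 0.0189, 0.00567]
```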

Relevance: 30.00%

Abstract:

A system for the NDT testing of the integrity of composite materials and of adhesive bonds has been developed to meet industrial requirements. The vibration techniques used were found to be applicable to the development of fluid measuring transducers. The vibrational spectra of thin rectangular bars were used for the NDT work. A machined cut in a bar had a significant effect on the spectrum, but a genuine crack gave an unambiguous response at high amplitudes: the generation of fretting crack noise at frequencies far above that of the drive. A specially designed vibrational decrement meter, which in effect measures mechanical energy loss, enabled a numerical classification of material adhesion to be obtained. This was used to study bars which had been flame or plasma sprayed with a variety of materials, and it has become a useful tool in optimising coating methods. A direct industrial application was to classify piston rings of high-performance I.C. engines. Each consists of a cast iron ring with a channel into which molybdenum, a good bearing surface, is sprayed. The NDT classification agreed quite well with the destructive test normally used. The techniques and equipment used for the NDT work were applied to the development of the tuning fork transducers investigated by Hassan into commercial density and viscosity devices. Using narrowly spaced, large-area tines, a thin lamina of fluid is trapped between them. It stores a large fraction of the vibrational energy and, acting as an inertia load, reduces the frequency. Magnetostrictive and piezoelectric effects, separately or in combination, enable the fork to be operated through a flange. This allows it to be used in pipeline or 'dipstick' applications. Using a different tine geometry, the viscosity loading can be made predominant. This, as well as the signal decrement of the density transducer, makes a practical viscometer.
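
To make the decrement idea concrete, here is a hedged sketch of computing the logarithmic decrement, and hence a damping (energy-loss) figure, from a decaying vibration; the signal is synthetic and the parameter values are assumptions.

```python
import numpy as np

# Hedged sketch of what a "vibrational decrement meter" measures: the
# logarithmic decrement of a decaying vibration, a proxy for mechanical
# energy loss. The signal below is synthetic, not measured data.
fs = 10_000                              # sample rate, Hz (assumed)
t = np.arange(0, 0.5, 1 / fs)
zeta, f0 = 0.02, 440.0                   # assumed damping ratio and frequency
x = np.exp(-zeta * 2 * np.pi * f0 * t) * np.sin(2 * np.pi * f0 * t)

# Log decrement from successive positive peaks: delta = ln(x_n / x_{n+1});
# for light damping, delta ~ 2*pi*zeta.
peaks = [i for i in range(1, len(x) - 1)
         if x[i - 1] < x[i] > x[i + 1] and x[i] > 0]
deltas = np.log(x[peaks][:-1] / x[peaks][1:])
delta = deltas.mean()
print(f"log decrement ~ {delta:.4f}; damping ratio ~ {delta / (2 * np.pi):.4f}")
```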

Relevance: 30.00%

Abstract:

This work was undertaken in an attempt to understand the processes at work at the cutting edge of the twist drill. Extensive drill life testing performed by the University has reinforced a survey of previously published information. This work demonstrated that there are two specific aspects of drilling which have not previously been explained comprehensively. The first concerns the interrelating of process data between differing drilling situations. There is no method currently available which allows the cutting geometry of drilling to be defined numerically, so that such comparisons, where made, are purely subjective. Section one examines this problem by taking as an example a 4.5mm drill suitable for use with aluminium. This drill is examined using a prototype solid modelling program to explore how the required numerical information may be generated. The second aspect is the analysis of drill stiffness. What aspects of drill stiffness produce the very great difference in performance between short, medium and long flute length drills? These differences exist between drills of identical point geometry, and the practical superiority of short drills has been known to shop-floor drilling operatives since drilling was first introduced. This problem has been dismissed repeatedly as over-complicated, but section two provides a first approximation and shows that, at least for smaller drills of 4.5mm, the effects are highly significant. Once the cutting action of the twist drill is defined geometrically, there is a huge body of machinability data that becomes applicable to the drilling process. Work remains to interpret the very high inclination angles of the drill cutting process in terms of cutting forces and tool wear, but aspects of drill design may already be looked at in new ways, with the prospect of a more analytical approach rather than the present mix of experience and trial and error. Other problems are specific to the twist drill, such as the behaviour of the chips in the flute. It is now possible to predict the initial direction of chip flow leaving the drill cutting edge. In future, the parameters of further chip behaviour may also be explored within this geometric model.
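
As one small example of expressing drill cutting geometry numerically, the sketch below evaluates the local helix angle along the lip using the standard helix relation tan(angle) = 2πr/L for lead L at radius r; the lead value is an assumed placeholder, not data from the thesis.

```python
import numpy as np

# One numerical descriptor of twist drill geometry: the local helix angle
# along the cutting lip. The 4.5 mm diameter matches the example drill in
# the text; the lead is an assumed placeholder, not a value from the thesis.
diameter = 4.5                 # mm, example drill from the text
lead = 30.0                    # mm per revolution -- assumed placeholder

for r in np.linspace(0.5, diameter / 2, 5):      # radii along the lip
    helix = np.degrees(np.arctan(2 * np.pi * r / lead))
    print(f"r = {r:.2f} mm  ->  local helix angle = {helix:.1f} deg")
```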

Relevance: 30.00%

Abstract:

Ion implantation modifies the surface composition and properties of materials by bombardment with high-energy ions. The low temperature of the process ensures the avoidance of distortion and degradation of the surface or bulk mechanical properties of components. In the present work, nitrogen ion implantation at 90 keV and doses above 10¹⁷ ions/cm² has been carried out on AISI M2, D2 and 420 steels and engineering coatings such as hard chromium, electroless Ni-P and a brush-plated Co-W alloy. Evaluation of the wear and frictional properties of these materials was performed with a lubricated Falex wear test at high loads up to 900 N and a dry pin-on-disc apparatus at loads up to 40 N. It was found that nitrogen implantation reduced the wear of AISI 420 stainless steel by a factor of 2.5 under high-load lubricated conditions and by a factor of 5.5 in low-load dry testing. Lower but significant reductions in wear were achieved for AISI M2 and D2 steels. The wear resistance of coating materials was improved by up to 4 times in lubricated wear of hard Cr coatings implanted at the optimum dose, but lower improvements were obtained for the Co-W alloy coating. However, hardened electroless Ni-P coatings showed no enhancement in wear properties. The benefits obtained in wear behaviour for the above materials were generally accompanied by a significant decrease in the running-in friction. Nitrogen implantation hardened the surface of the steels and of the Cr and Co-W coatings. An ultra-microhardness technique showed that the true hardness of implanted layers was greater than the values obtained by conventional micro-hardness methods, which often result in penetration below the implanted depth. Scanning electron microscopy revealed that implantation reduced the ploughing effect during wear, and a change in wear mechanism from an abrasive-adhesive type to a mild oxidative mode was evident. Retention of nitrogen after implantation was studied by Nuclear Reaction Analysis and Auger Electron Spectroscopy. It was shown that maximum nitrogen retention occurs in hard Cr coatings and AISI 420 stainless steel, which explains the improvements obtained in wear resistance and hardness. X-ray photoelectron spectroscopy on these materials revealed that the nitrogen is almost entirely bound to Cr, forming chromium nitrides. It was concluded that nitrogen implantation at 90 keV and doses above 3×10¹⁷ ions/cm² produced the most significant improvements in mechanical properties in materials containing nitride formers, by precipitation strengthening, improving the load-bearing capacity of the surface and changing the wear mechanism from adhesive-abrasive to oxidative.

Relevance: 30.00%

Abstract:

The recall of personal experiences relevant to a claim of food allergy or food intolerance is assessed by a psychologically validated tool for evidence that the suspected food could have caused the adverse symptom suffered. The tool distinguishes recall from memory of a particular episode or episodes in which eating the food was followed by symptoms, resulting in self-diagnosis of food allergy or intolerance, from merely theoretical knowledge that such symptoms could arise after eating the food. If there is detailed recall of events that point to the food as a potential cause of the symptom, and the symptom is sufficiently serious, the tool's user is recommended to seek testing at an allergy clinic or by the appropriate specialist for a non-allergic sensitivity. If what is recalled does not support the logical possibility of a causal connection between eating that food and occurrence of the symptom, then the user of the tool is pointed to other potential sources of the problem. The user is also recommended to investigate remedies other than avoidance of the food that had been blamed.
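
The tool's recommendation logic, as summarised above, can be caricatured as a simple decision function; the function, its inputs and its wording are illustrative assumptions, not the validated instrument itself.

```python
# Hedged sketch of the decision logic described in the abstract; the
# inputs and phrasing are illustrative assumptions, not the actual tool.
def recommend(detailed_episode_recall: bool, symptom_serious: bool) -> str:
    if detailed_episode_recall and symptom_serious:
        return ("Seek testing at an allergy clinic or with the appropriate "
                "specialist for a non-allergic sensitivity.")
    if not detailed_episode_recall:
        return ("Recall does not support a causal link; consider other "
                "sources of the problem and remedies other than avoidance.")
    return "Gather more specific detail about the episode(s)."

print(recommend(detailed_episode_recall=True, symptom_serious=True))
```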

Relevance: 30.00%

Abstract:

Trauma and damage to the delicate structures of the inner ear frequently occur during insertion of an electrode array into the cochlea. This is strongly related to the excessive manual insertion force applied by the surgeon without any tool/tissue interaction feedback. This research examined the tool-tissue interaction between a large-scale (12.5:1) prototype digit, embedded with distributive tactile sensors and based upon a cochlear electrode, and a large-scale (4.5:1) cochlea phantom simulating the human cochlea, which could lead to requirements for a small-scale digit. The flexible digit classified the tactile information arising from the digit-phantom interaction, such as contact status, tip penetration, obstacles, relative shape and location, contact orientation and multiple contacts. The digit, with distributive tactile sensors embedded in a silicon substrate, is inserted into the cochlea phantom to measure the digit/phantom interaction and the position of the digit, in order to minimize trauma and tissue damage during cochlear electrode insertion. The digit is pre-curved to the shape of the cochlea so that it better conforms to the shape of the scala tympani and lightly hugs the modiolar wall. The digit provided information on the characteristics of touch and on the digit-phantom interaction during insertion. The tests demonstrated that even devices of such relatively simple, low-cost design have the potential to improve cochlear implant surgery and other lumen mapping applications by providing tactile feedback and by controlling the insertion through sensing and control of the tip of the implant. With this approach, the surgeon could minimize the tissue damage and the potential damage to the delicate structures within the cochlea caused by current manual electrode insertion in cochlear implantation. The approach can also be applied to diagnosis and path navigation procedures. The digit is at a large-scale stage and could be miniaturized in future to support more realistic surgical procedures.
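
A minimal sketch of one way distributive tactile readings might be classified into the contact categories mentioned above; the array size, threshold and logic are assumptions, not the project's actual processing.

```python
import numpy as np

# Hedged sketch: pressure readings from a distributive tactile array along
# the digit are thresholded to report contact status and location.
def classify_contact(readings: np.ndarray, threshold: float = 0.05):
    """readings: per-element pressure values along the digit (arbitrary units)."""
    active = np.flatnonzero(readings > threshold)
    if active.size == 0:
        return "no contact", None
    if active.size > 1 and np.any(np.diff(active) > 1):
        return "multiple contacts", active.tolist()   # disjoint active regions
    return "single contact", active.tolist()

# Simulated frame: contact near the tip (last elements of the array).
frame = np.array([0.0, 0.01, 0.0, 0.02, 0.01, 0.12, 0.30])
print(classify_contact(frame))   # -> ('single contact', [5, 6])
```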

Relevance: 30.00%

Abstract:

From the accusation of plagiarism in The Da Vinci Code to the infamous hoaxer in the Yorkshire Ripper case, the use of linguistic evidence in court and the number of linguists called to act as expert witnesses in trials have increased rapidly in the past fifteen years. An Introduction to Forensic Linguistics: Language in Evidence provides a timely and accessible introduction to this rapidly expanding subject. Using knowledge and experience gained in legal settings – Malcolm Coulthard in his work as an expert witness and Alison Johnson in her work as a West Midlands police officer – the two authors combine an array of perspectives into a distinctly unified textbook, focusing throughout on evidence from real and often high-profile cases, including serial killer Harold Shipman, the Bridgewater Four and the Birmingham Six. Divided into two sections, 'The Language of the Legal Process' and 'Language as Evidence', the book covers the key topics of the field. The first section looks at legal language, the structures of legal genres, and the collection and testing of evidence from the initial police interview through to examination and cross-examination in the courtroom. The second section focuses on the role of the forensic linguist, the forensic phonetician and the document examiner, as well as examining in detail the linguistic investigation of authorship and plagiarism. With research tasks, suggested reading and website references provided at the end of each chapter, An Introduction to Forensic Linguistics: Language in Evidence is the essential textbook for courses in forensic linguistics and the language of the law.

Relevance: 30.00%

Abstract:

There are several unresolved problems in forensic authorship profiling, including a lack of research focusing on the types of texts that are typically analysed in forensic linguistics (e.g. threatening letters, ransom demands) and a general disregard for the effect of register variation when testing linguistic variables for use in profiling. The aim of this dissertation is therefore to take a first step towards filling these gaps by testing whether established patterns of sociolinguistic variation appear in malicious forensic texts that are controlled for register. The dissertation begins with a literature review that highlights a series of correlations between language use and various social factors, including gender, age, level of education and social class. It then presents the primary data set used in this study, which consists of a corpus of 287 fabricated malicious texts from 3 different registers produced by 96 authors stratified across the 4 social factors listed above. Since this data set is fabricated, its validity was also tested through a comparison with another corpus consisting of 104 naturally occurring malicious texts, which showed that no important differences exist between the language of the fabricated malicious texts and that of the authentic malicious texts. The dissertation then reports the findings of the analysis of the corpus of fabricated malicious texts, which shows that the major patterns of sociolinguistic variation identified in previous research are valid for forensic malicious texts and that controlling register variation greatly improves the performance of profiling. In addition, it is shown that through regression analysis it is possible to use these patterns of linguistic variation to profile the demographic background of authors across the four social factors with an average accuracy of 70%. Overall, the present study therefore takes a first step towards developing a principled model of forensic authorship profiling.
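
A hedged sketch of regression-based profiling of one social factor from linguistic feature rates, in the spirit of the analysis described; the features, labels and effect size are simulated, not the dissertation's corpus or model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hedged sketch: predict one binary social factor (e.g. gender) from
# linguistic feature rates via regression. All data are simulated
# stand-ins, not the dissertation's corpus.
rng = np.random.default_rng(1)
n_authors, n_features = 96, 10          # 96 authors, as in the study design
X = rng.normal(size=(n_authors, n_features))   # e.g. per-author feature rates
y = rng.integers(0, 2, size=n_authors)         # binary social factor label
X[y == 1] += 0.6                               # inject a sociolinguistic signal

model = LogisticRegression(max_iter=1000)
acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
print(f"cross-validated profiling accuracy ~ {acc:.2f}")
```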

Relevance: 30.00%

Abstract:

This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable, owing to their requirements for significant time and skilled human intervention. A 'solution toolkit' is presented, consisting of a selection of circular tests and artefact probing which are able to rapidly verify the kinematic errors, and in some cases also the dynamic errors, of different types of machine tool, together with supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
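
As an illustration of the kind of data processing a circular test or artefact-probing routine involves, here is a sketch that fits a least-squares circle to probed points and reports the radial deviation; the probe coordinates are simulated, and this is not the paper's actual algorithm.

```python
import numpy as np

# Hedged sketch: fit a least-squares (Kasa) circle to probed points and
# report the spread of radial deviations, a circularity-style error value.
def fit_circle(x, y):
    """Kasa least-squares fit; returns centre (cx, cy) and radius r."""
    # Circle: x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)    # 36 probed points
x = 50 + 25 * np.cos(theta) + rng.normal(0, 0.002, 36)   # nominal R = 25 mm
y = 30 + 25 * np.sin(theta) + rng.normal(0, 0.002, 36)

cx, cy, r = fit_circle(x, y)
radial_dev = np.hypot(x - cx, y - cy) - r
print(f"radius = {r:.4f} mm, circularity error = {np.ptp(radial_dev):.4f} mm")
```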

Relevance: 30.00%

Abstract:

Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
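
A minimal sketch, under illustrative assumptions, of the hybrid idea: a Monte Carlo variability analysis of an assembly criterion whose input distributions are re-estimated from measurement data, reducing simulation uncertainty. The dimensions, the gap criterion and the measured values are all invented for illustration.

```python
import numpy as np

# Hedged sketch of hybrid verification: a tolerance stack-up whose input
# distributions are updated from measurements of built assemblies.
rng = np.random.default_rng(0)
N = 100_000

# Model-based tolerances (mean, sigma) in mm for three stacked parts.
parts_nominal = [(10.0, 0.05), (20.0, 0.08), (5.0, 0.03)]

# Suppose measurement of real parts shows part 2 runs offset but tighter.
measured_part2 = rng.normal(20.02, 0.04, size=50)        # stand-in CMM data
parts_updated = list(parts_nominal)
parts_updated[1] = (measured_part2.mean(), measured_part2.std(ddof=1))

def gap(parts):
    # Key assembly criterion (invented): clearance gap must stay positive.
    stack = sum(rng.normal(m, s, N) for m, s in parts)
    return 35.2 - stack

for label, parts in (("model-only", parts_nominal), ("hybrid", parts_updated)):
    g = gap(parts)
    print(f"{label}: P(gap < 0) = {(g < 0).mean():.4f}")
```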