945 results for digital forensic tool testing
Abstract:
A test oracle provides a means for determining whether an implementation behaves according to its specification. A passive test oracle checks that the correct behaviour has been implemented, but does not implement the behaviour itself. In previous work, we have presented a method that allows us to derive passive C++ test oracles from formal specifications written in Object-Z. We describe the "Warlock" prototype tool that supports the method. Warlock is built on top of an existing Object-Z type checker and generates oracle code for a substantial subset of the Object-Z language. We describe the architecture of Warlock and its application to a number of Object-Z specifications. We also discuss its current limitations.
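The passive-oracle idea can be sketched in a few lines: a check that an operation satisfied its postcondition, without the oracle implementing the operation itself. The `BoundedStack` class and its toy specification below are illustrative stand-ins, not Warlock output and not Object-Z.

```python
class BoundedStack:
    """Implementation under test (illustrative)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    def push(self, x):
        # In this toy spec, push is a no-op on a full stack.
        if len(self.items) < self.capacity:
            self.items.append(x)

def check_push(pre_items, post_items, capacity, x):
    """Passive oracle for push: verifies the postcondition against the
    pre- and post-states, but never performs the push itself."""
    if len(pre_items) < capacity:
        return post_items == pre_items + [x]
    return post_items == pre_items

s = BoundedStack(2)
pre = list(s.items)   # snapshot the pre-state
s.push(7)
print(check_push(pre, s.items, s.capacity, 7))  # True: behaviour matched the spec
```

The oracle only compares states against the specification, so a faulty implementation (say, one that pushed onto a full stack) would be flagged without the oracle duplicating the stack logic.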
Abstract:
The judicial interest in ‘scientific’ evidence has driven recent work to quantify results for forensic linguistic authorship analysis. Through a methodological discussion and a worked example, this paper examines the issues which complicate attempts to quantify results in such work. The solution suggested for some of the difficulties is a sampling and testing strategy which helps to identify potentially useful, valid and reliable markers of authorship. An important feature of the sampling strategy is that markers identified as being generally valid and reliable are retested for use in specific authorship analysis cases. The suggested approach for drawing quantified conclusions combines discriminant function analysis and Bayesian likelihood measures. The worked example starts with twenty comparison texts for each of three potential authors and then uses a progressively smaller comparison corpus, reducing to fifteen, ten, five and finally three texts per author. This worked example demonstrates how reducing the amount of data affects the way conclusions can be drawn. With greater numbers of reference texts, quantified and safe attributions are shown to be possible, but as the number of reference texts reduces, the analysis shows that the conclusion which should be reached is that no attribution can be made. At no point does the testing process result in a misattribution.
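The combination of discriminant analysis with Bayesian likelihood measures can be illustrated, in much-simplified form, as a likelihood ratio over a single stylometric marker. The Gaussian per-author models and all numbers below are invented for illustration; they are not the paper's data or method in full.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Normal density; stands in for a per-author model of one marker."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x, model_a, model_b):
    """LR > 1 favours author A as the source of the questioned text."""
    return gauss_pdf(x, *model_a) / gauss_pdf(x, *model_b)

# Hypothetical mean-word-length models (mu, sigma) for two candidate authors.
lr = likelihood_ratio(4.6, model_a=(4.5, 0.3), model_b=(5.2, 0.3))
print(lr > 1)  # True: the questioned value sits near author A's mean
```

With fewer reference texts the estimated `mu` and `sigma` become unreliable, which is one way to see why the paper's worked example stops supporting attribution as the comparison corpus shrinks.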
Abstract:
Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue length distribution for the M/GY/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
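One standard numerical route for such inversions is to sample the transform on the unit circle and take an inverse DFT; the error in each recovered probability is the aliased tail mass beyond the truncation point. The geometric distribution below is a sketch of the idea against a known closed-form generating function, not the thesis's queueing result.

```python
import cmath

def invert_pgf(G, N):
    """Recover p_0..p_{N-1} from a probability generating function G by
    sampling it at the N-th roots of unity and applying an inverse DFT.
    The error in p_n is the aliased tail mass p_{n+N} + p_{n+2N} + ..."""
    samples = [G(cmath.exp(2j * cmath.pi * k / N)) for k in range(N)]
    return [
        sum(samples[k] * cmath.exp(-2j * cmath.pi * k * n / N) for k in range(N)).real / N
        for n in range(N)
    ]

# Check against a geometric distribution p_n = (1 - q) * q**n,
# whose pgf (1 - q) / (1 - q z) is known in closed form.
q = 0.5
p = invert_pgf(lambda z: (1 - q) / (1 - q * z), 64)
print(round(p[0], 6), round(p[1], 6))  # 0.5 0.25
```

In practice an FFT replaces the direct double sum, reducing the cost from O(N²) to O(N log N); the direct form is kept here for clarity.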
Abstract:
A system for the NDT testing of the integrity of composite materials and of adhesive bonds has been developed to meet industrial requirements. The vibration techniques used were found to be applicable to the development of fluid measuring transducers. The vibrational spectra of thin rectangular bars were used for the NDT work. A machined cut in a bar had a significant effect on the spectrum, but a genuine crack gave an unambiguous response at high amplitudes: the generation of fretting crack noise at frequencies far above that of the drive. A specially designed vibrational decrement meter which, in effect, measures mechanical energy loss enabled a numerical classification of material adhesion to be obtained. This was used to study bars which had been flame or plasma sprayed with a variety of materials, and it has become a useful tool in optimising coating methods. A direct industrial application was to classify piston rings of high-performance I.C. engines. Each consists of a cast iron ring with a channel into which molybdenum, a good bearing surface, is sprayed. The NDT classification agreed quite well with the destructive test normally used. The techniques and equipment used for the NDT work were applied to developing the tuning fork transducers investigated by Hassan into commercial density and viscosity devices. Using narrowly spaced, large-area tines, a thin lamina of fluid is trapped between them. It stores a large fraction of the vibrational energy which, acting as an inertia load, reduces the frequency. Magnetostrictive and piezoelectric effects, separately or in combination, enable the fork to be operated through a flange. This allows it to be used in pipeline or 'dipstick' applications. Using a different tine geometry, the viscosity loading can be predominant. This, as well as the signal decrement of the density transducer, makes a practical viscometer.
Abstract:
This work is undertaken in the attempt to understand the processes at work at the cutting edge of the twist drill. Extensive drill life testing performed by the University has reinforced a survey of previously published information. This work demonstrated that there are two specific aspects of drilling which have not previously been explained comprehensively. The first concerns the interrelating of process data between differing drilling situations. There is no method currently available which allows the cutting geometry of drilling to be defined numerically, so such comparisons, where made, are purely subjective. Section one examines this problem by taking as an example a 4.5 mm drill suitable for use with aluminium. This drill is examined using a prototype solid modelling program to explore how the required numerical information may be generated. The second aspect is the analysis of drill stiffness. What aspects of drill stiffness provide the very great difference in performance between short flute length, medium flute length and long flute length drills? These differences exist between drills of identical point geometry, and the practical superiority of short drills has been known to shop floor drilling operatives since drilling was first introduced. This problem has been dismissed repeatedly as over-complicated, but section two provides a first approximation and shows that, at least for smaller drills of 4.5 mm, the effects are highly significant. Once the cutting action of the twist drill is defined geometrically, there is a huge body of machinability data that becomes applicable to the drilling process. Work remains to interpret the very high inclination angles of the drill cutting process in terms of cutting forces and tool wear, but aspects of drill design may already be looked at in new ways, with the prospect of a more analytical approach rather than the present mix of experience and trial and error.
Other problems are specific to the twist drill, such as the behaviour of the chips in the flute. It is now possible to predict the initial direction of chip flow leaving the drill cutting edge. For the future the parameters of further chip behaviour may also be explored within this geometric model.
Abstract:
Ion implantation modifies the surface composition and properties of materials by bombardment with high energy ions. The low temperature of the process ensures the avoidance of distortion and degradation of the surface or bulk mechanical properties of components. In the present work nitrogen ion implantation at 90 keV and doses above 10¹⁷ ions/cm² has been carried out on AISI M2, D2 and 420 steels and engineering coatings such as hard chromium, electroless Ni-P and a brush plated Co-W alloy. Evaluation of wear and frictional properties of these materials was performed with a lubricated Falex wear test at high loads up to 900 N and a dry pin-on-disc apparatus at loads up to 40 N. It was found that nitrogen implantation reduced the wear of AISI 420 stainless steel by a factor of 2.5 under high load lubricated conditions and by a factor of 5.5 in low load dry testing. Lower but significant reductions in wear were achieved for AISI M2 and D2 steels. Wear resistance of coating materials was improved by up to 4 times in lubricated wear of hard Cr coatings implanted at the optimum dose but lower improvements were obtained for the Co-W alloy coating. However, hardened electroless Ni-P coatings showed no enhancement in wear properties. The benefits obtained in wear behaviour for the above materials were generally accompanied by a significant decrease in the running-in friction. Nitrogen implantation hardened the surface of steels and Cr and Co-W coatings. An ultra-microhardness technique showed that the true hardness of implanted layers was greater than the values obtained by conventional micro-hardness methods, which often result in penetration below the implanted depth. Scanning electron microscopy revealed that implantation reduced the ploughing effect during wear and a change in wear mechanism from an abrasive-adhesive type to a mild oxidative mode was evident. 
Retention of nitrogen after implantation was studied by Nuclear Reaction Analysis and Auger Electron Spectroscopy. It was shown that maximum nitrogen retention occurs in hard Cr coatings and AISI 420 stainless steel, which explains the improvements obtained in wear resistance and hardness. X-ray photoelectron spectroscopy on these materials revealed that nitrogen is almost entirely bound to Cr, forming chromium nitrides. It was concluded that nitrogen implantation at 90 keV and doses above 3×10¹⁷ ions/cm² produced the most significant improvements in mechanical properties in materials containing nitride formers by precipitation strengthening, improving the load bearing capacity of the surface and changing the wear mechanism from adhesive-abrasive to oxidative.
Abstract:
The recall of personal experiences relevant to a claim of food allergy or food intolerance is assessed by a psychologically validated tool for evidence that the suspected food could have caused the adverse symptom suffered. The tool looks at recall from memory of a particular episode or episodes when food was followed by symptoms resulting in self-diagnosis of food allergy or intolerance compared to merely theoretical knowledge that such symptoms could arise after eating the food. If there is detailed recall of events that point to the food as a potential cause of the symptom and the symptom is sufficiently serious, the tool user is recommended to seek testing at an allergy clinic or by the appropriate specialist for a non-allergic sensitivity. If what is recalled does not support the logical possibility of a causal connection between eating that food and occurrence of the symptom, then the user of the tool is pointed to other potential sources of the problem. The user is also recommended to investigate remedies other than avoidance of the food that had been blamed.
Abstract:
Trauma and damage to the delicate structures of the inner ear frequently occur during insertion of an electrode array into the cochlea. This is strongly related to the surgeon's excessive manual insertion force, applied without any tool/tissue interaction feedback. This research examines the tool-tissue interaction of a large-scale (12.5:1) prototype digit, embedded with a distributive tactile sensor and based upon a cochlear electrode, within a large-scale (4.5:1) cochlea phantom simulating the human cochlea, which could inform the requirements for a small-scale digit. The flexible digit classified the tactile information from the digit-phantom interaction, such as contact status, tip penetration, obstacles, relative shape and location, contact orientation and multiple contacts. The digit, with distributive tactile sensors embedded in a silicon substrate, is inserted into the cochlea phantom to measure digit/phantom interaction and the position of the digit, in order to minimise tissue damage and trauma during cochlear electrode insertion. The digit is pre-curved in the shape of the cochlea so that it better conforms to the shape of the scala tympani and lightly hugs the modiolar wall of the scala. The digit has provided information on the characteristics of touch and of digit-phantom interaction during insertion. The tests demonstrated that even devices of such relatively simple design and low cost have the potential to improve cochlear implant surgery and other lumen mapping applications by providing tactile feedback, allowing the insertion to be controlled through sensing of the tip of the implant. With that approach, the surgeon could minimise tissue damage and potential damage to the delicate structures within the cochlea caused by current manual electrode insertion in cochlear implantation. The approach can also be applied to diagnosis and path navigation procedures. 
The digit is at a large-scale stage and could be miniaturised in future to support more realistic surgical procedures.
Abstract:
From the accusation of plagiarism in The Da Vinci Code, to the infamous hoaxer in the Yorkshire Ripper case, the use of linguistic evidence in court and the number of linguists called to act as expert witnesses in court trials has increased rapidly in the past fifteen years. An Introduction to Forensic Linguistics: Language in Evidence provides a timely and accessible introduction to this rapidly expanding subject. Using knowledge and experience gained in legal settings – Malcolm Coulthard in his work as an expert witness and Alison Johnson in her work as a West Midlands police officer – the two authors combine an array of perspectives into a distinctly unified textbook, focusing throughout on evidence from real and often high profile cases including serial killer Harold Shipman, the Bridgewater Four and the Birmingham Six. Divided into two sections, 'The Language of the Legal Process' and 'Language as Evidence', the book covers the key topics of the field. The first section looks at legal language, the structures of legal genres and the collection and testing of evidence from the initial police interview through to examination and cross-examination in the courtroom. The second section focuses on the role of the forensic linguist, the forensic phonetician and the document examiner, as well as examining in detail the linguistic investigation of authorship and plagiarism. With research tasks, suggested reading and website references provided at the end of each chapter, An Introduction to Forensic Linguistics: Language in Evidence is the essential textbook for courses in forensic linguistics and language of the law.
Abstract:
There are several unresolved problems in forensic authorship profiling, including a lack of research focusing on the types of texts that are typically analysed in forensic linguistics (e.g. threatening letters, ransom demands) and a general disregard for the effect of register variation when testing linguistic variables for use in profiling. The aim of this dissertation is therefore to make a first step towards filling these gaps by testing whether established patterns of sociolinguistic variation appear in malicious forensic texts that are controlled for register. This dissertation begins with a literature review that highlights a series of correlations between language use and various social factors, including gender, age, level of education and social class. This dissertation then presents the primary data set used in this study, which consists of a corpus of 287 fabricated malicious texts from 3 different registers produced by 96 authors stratified across the 4 social factors listed above. Since this data set is fabricated, its validity was also tested through a comparison with another corpus consisting of 104 naturally occurring malicious texts, which showed that no important differences exist between the language of the fabricated malicious texts and the authentic malicious texts. The dissertation then reports the findings of the analysis of the corpus of fabricated malicious texts, which shows that the major patterns of sociolinguistic variation identified in previous research are valid for forensic malicious texts and that controlling register variation greatly improves the performance of profiling. In addition, it is shown that through regression analysis it is possible to use these patterns of linguistic variation to profile the demographic background of authors across the four social factors with an average accuracy of 70%. Overall, the present study therefore makes a first step towards developing a principled model of forensic authorship profiling.
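As a much-simplified stand-in for the regression-based profiling described above, the sketch below assigns a questioned text to a demographic group from two hypothetical stylometric features using a nearest-centroid rule. The feature names, values and group labels are invented for illustration; the dissertation's actual variables and regression models are not reproduced here.

```python
# Hypothetical training data: (mean word length, pronoun rate) per text.
train = {
    "groupA": [(4.1, 0.08), (4.3, 0.09), (4.0, 0.07)],
    "groupB": [(5.0, 0.03), (5.2, 0.02), (4.9, 0.04)],
}

def centroid(points):
    """Component-wise mean of a list of feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(x, centroids):
    """Assign x to the group whose centroid is nearest (squared distance)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

centroids = {label: centroid(pts) for label, pts in train.items()}
print(classify((4.2, 0.08), centroids))  # groupA
```

Controlling for register, as the dissertation argues, amounts to ensuring the training vectors and the questioned text come from the same register, so that register differences are not mistaken for demographic ones.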
Abstract:
Neural Networks have been successfully employed in different biomedical settings. They have been useful for feature extraction from images and biomedical data in a variety of diagnostic applications. In this paper, they are applied as a diagnostic tool for classifying different levels of gastric electrical uncoupling in controlled acute experiments on dogs. Data was collected from 16 dogs using six bipolar electrodes inserted into the serosa of the antral wall. Each dog underwent three recordings under different conditions: (1) basal state, (2) mild surgically-induced uncoupling, and (3) severe surgically-induced uncoupling. For each condition half-hour recordings were made. The neural network was implemented according to the Learning Vector Quantization model, a supervised learning variant of the Kohonen Self-Organizing Maps. The majority of the recordings collected from the dogs were used for network training. The remaining recordings served as a testing tool to examine the validity of the training procedure. Approximately 90% of the dogs from the neural network training set were classified properly. However, only 31% of the dogs not included in the training process were accurately diagnosed. The poor neural-network based diagnosis of recordings that did not participate in the training process might have been caused by inappropriate representation of input data. Previous research has suggested characterizing signals according to certain features of the recorded data. This method, if employed, would reduce the noise and possibly improve the diagnostic abilities of the neural network.
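Learning Vector Quantization, as used above, adapts labelled prototype vectors toward inputs of the same class and away from inputs of a different class. The sketch below is a minimal LVQ1 on invented 2-D data, not the paper's gastric-signal features or network configuration.

```python
def nearest(x, prototypes):
    """Index of the prototype closest to x (squared Euclidean distance)."""
    return min(range(len(prototypes)),
               key=lambda j: sum((a - b) ** 2 for a, b in zip(prototypes[j][0], x)))

def train_lvq1(data, prototypes, lr=0.1, epochs=20):
    """LVQ1: pull the winning prototype toward x if the labels agree,
    push it away if they disagree."""
    for _ in range(epochs):
        for x, y in data:
            i = nearest(x, prototypes)
            vec, label = prototypes[i]
            sign = 1.0 if label == y else -1.0
            prototypes[i][0] = [v + sign * lr * (xi - v) for v, xi in zip(vec, x)]
    return prototypes

# Two invented clusters, one labelled prototype each: [vector, label].
data = [((0.0, 0.0), 0), ((0.5, 0.2), 0), ((5.0, 5.0), 1), ((4.8, 5.1), 1)]
prototypes = train_lvq1(data, [[[1.0, 1.0], 0], [[4.0, 4.0], 1]])
print(prototypes[nearest((0.1, 0.1), prototypes)][1])  # 0
```

The paper's 90%-train versus 31%-test gap is a classic sign that the input representation, not the classifier, is the bottleneck; LVQ only separates what the features already make separable.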
Abstract:
The article presents a new type of logs merging tool for multiple-blade telecommunication systems, based on the development of a new approach. The introduction of the new logs merging tool (the Log Merger) can help engineers to build a process behaviour timeline, with a flexible system of information structuring used to assess the changes in the analyzed system. This logs merging system, based on experts' experience and analytical skills, generates a knowledge base which could be advantageous in further decision-making expert system development. This paper proposes and discusses the design and implementation of the Log Merger, its architecture, multi-board analysis capability and application areas. The paper also presents possible ways of further tool improvement, e.g. extending its functionality to cover additional system platforms. The possibility of adding an analysis module for further expert system development is also considered.
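At the core of any such timeline-building tool is a k-way merge of per-board logs, each already in time order, into a single stream. The sketch below assumes a simple `ISO-timestamp source message` line format of my own invention; the Log Merger's actual formats and architecture are not described here.

```python
import heapq
from datetime import datetime

def parse(line):
    # Assumed line format: "2024-01-01T12:00:00 blade-A message text".
    ts, source, msg = line.split(" ", 2)
    return datetime.fromisoformat(ts), source, msg

def merge_logs(*logs):
    """Merge per-board logs (each time-ordered) into one timeline.
    heapq.merge keeps only one entry per stream in memory: O(n log k)."""
    streams = ([parse(line) for line in log] for log in logs)
    return [f"{ts.isoformat()} {src} {msg}" for ts, src, msg in heapq.merge(*streams)]

blade_a = ["2024-01-01T12:00:00 blade-A start", "2024-01-01T12:00:05 blade-A ready"]
blade_b = ["2024-01-01T12:00:02 blade-B start"]
print(merge_logs(blade_a, blade_b)[1])  # 2024-01-01T12:00:02 blade-B start
```

Because each source log is already sorted, the heap-based merge avoids concatenating and re-sorting everything, which matters when blade systems produce logs far larger than memory.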
Abstract:
This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable due to requirements for significant time and skilled human intervention. A 'solution toolkit' is presented consisting of a selection of circular tests and artefact probing, which are able to rapidly verify the kinematic errors, and in some cases also dynamic errors, for different types of machine tool, as well as supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
Abstract:
The key to prosperity in today's world is access to digital content and the skills to create new content. Investigation of folklore artifacts is the topic of this article, which presents research related to the national program "Knowledge Technologies for Creation of Digital Presentation and Significant Repositories of Folklore Heritage" (FolkKnow). FolkKnow aims to build a digital multimedia archive, "Bulgarian Folklore Heritage" (BFH), and a virtual information portal with a folk media library of digitized multimedia objects from a selected collection of the fund of the Institute of Ethnology and Folklore Studies with Ethnographic Museum (IEFSEM) of the Bulgarian Academy of Science (BAS). The realization of the FolkKnow project opens the multimedia collections to wide social application: interactive distance learning and self-learning, research activities in the field of Bulgarian traditional culture, and cultural and ethno-tourism. We study, analyze and implement techniques and methods for the digitization of multimedia objects and their annotation. The paper discusses specific approaches used to build and protect a digital archive with multimedia content. The tasks can be systematized in the following guidelines:
* Digitization of the selected samples
* Analysis of the objects in order to determine the metadata of selected artifacts from selected collections and problem areas
* Digital multimedia archive
* Socially-oriented applications and virtual exhibitions
* Frequency dictionary tool for texts with folklore themes
* A method using modern technologies for protecting intellectual property and copyrights on digital content developed for use in digital exposures.
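A frequency dictionary tool of the kind mentioned above can start from a simple token counter; note that Python's `\w` matches Cyrillic word characters, which matters for Bulgarian texts. The sample phrase is invented, not drawn from the BFH archive.

```python
import re
from collections import Counter

def frequency_dictionary(text):
    # Count lower-cased word tokens; \w+ also matches Cyrillic letters,
    # so Bulgarian folklore texts tokenize correctly.
    return Counter(re.findall(r"\w+", text.lower()))

freq = frequency_dictionary("Песен за гората, песен за полето")
print(freq["песен"], freq["за"])  # 2 2
```

A real tool would add lemmatization and stop-word handling, but even raw counts support the kind of thematic comparison across folklore collections the program envisages.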
Abstract:
The purpose of this article is to evaluate the effectiveness of learning by doing as a practical tool for managing the training of students in "Library Management" at ULSIT, Sofia, Bulgaria, using the creation of the project 'Data Base "Bulgarian Revival Towns"' (CD), financed by the Bulgarian Ministry of Education, Youth and Science (1/D002/144/13.10.2011) and headed by Prof. DSc Ivanka Yankova, which aims to create a new information resource on these towns to serve the needs of scientific research. By participating in generating the array in the database through searching, selection and digitization of documents from this period, students also get an opportunity to expand their skills in working effectively in a team, finding interdisciplinary and causal connections between the studied items, objects and subjects and, foremost, gaining practical experience in the field of digitization, information behavior, strategies for information search, etc. This method achieves good results in the accumulation of sustainable knowledge and generates motivation to work in the library and information professions.