945 results for digital forensic tool testing
Abstract:
Background: Over the last 10 years, Australia's spontaneous vaginal birth rate has decreased by approximately 1% each year and the caesarean section rate has increased by approximately 1% each year. This trend has serious implications for the health of women and babies. As midwives we are the caretakers of normal birth and therefore partly responsible both for its decline and for the solution to it. Although antenatal education is in a potentially powerful position to promote normal birth, a structured review conducted as part of this thesis found that it does not realise that potential. Currently framed in pathogenesis, antenatal education in particular, and maternity services in general, are in need of reframing. The theory of salutogenesis may offer a new lens as it focuses on health rather than illness. Sense of coherence is the cornerstone of salutogenesis and is a predictive indicator of health. What is unclear is the role pregnant women's sense of coherence plays in their birthing outcomes. This study explored associations between pregnant women's sense of coherence, their pregnancy choices, their anticipated labour choices and their labour and birthing outcomes, as well as factors associated with modification of sense of coherence from the antenatal to postnatal periods. Methods: After a comprehensive review of the literature, questionnaire development and psychometric tool testing and modification, a longitudinal survey was conducted in which eligible women completed a questionnaire before the 30th week of pregnancy (Phase One) and approximately 8 weeks after birth (Phase Two). Eligible women were less than 30 weeks pregnant with a single fetus, could read and write in English and lived in the Australian Capital Territory in Australia. Phase One provided information on women's sense of coherence scores, Edinburgh Postnatal Depression Scale (EPDS) scores, Support Behaviour Inventory (SBI) scores, pregnancy choices including care provider, planned place of birth, planned birth type and anticipated epidural use, and demographics. Phase Two provided information on women's sense of coherence scores, EPDS scores and their labour and birthing outcomes. Findings: 1074 women completed Phase One, representing a 61.3% response rate. 753 women completed Phase Two, representing a 70.1% retention rate between phases. Compared to women with low sense of coherence, women with high sense of coherence were older, reported fewer pregnancy conditions such as diabetes or hypertension, were less likely to have depressive symptoms, were more likely to feel well supported, were less likely to experience a caesarean section and more likely to experience an assisted vaginal birth. Sense of coherence was not associated with women's pregnancy choices. Higher EPDS scores, lower sense of coherence and greater satisfaction with birth were associated with an increase in women's sense of coherence from the antenatal to the postnatal period. Decreased birth satisfaction and experiencing epidural anaesthesia in labour and assisted vaginal birth were associated with a decrease in sense of coherence from the antenatal to the postnatal period. Conclusion: A strong sense of coherence in pregnant women halved the likelihood of experiencing caesarean section compared to women with a low sense of coherence. Sense of coherence is a modifiable predictor of women's childbearing health and was found to be raised by birth satisfaction and lowered by birth dissatisfaction and labour interventions.
These important findings add to the limited body of knowledge about sense of coherence and childbearing.
Abstract:
Concurrent programs are hard to test due to their inherent nondeterminism. This paper presents a method and tool support for testing concurrent Java components. Tool support is offered through ConAn (Concurrency Analyser), a tool for generating drivers for unit testing Java classes that are used in a multithreaded context. To obtain adequate controllability over the interactions between Java threads, the generated driver contains threads that are synchronized by a clock. The driver automatically executes the calls in the test sequence in the prescribed order and compares the outputs against the expected outputs specified in the test sequence. The method and tool are illustrated in detail on an asymmetric producer-consumer monitor. Their application to testing over 20 concurrent components, a number of which are sourced from industry and were found to contain faults, is presented and discussed.
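For illustration only, a minimal sketch of the clock-driven synchronisation idea described above. ConAn itself generates Java test drivers; the Python sketch below uses invented names (LogicalClock, OneSlotBuffer) and a one-slot producer/consumer monitor as a stand-in component under test, so it is not the tool's actual implementation.

```python
import threading

class LogicalClock:
    """Logical clock used by the driver to force a prescribed interleaving of calls."""
    def __init__(self):
        self._tick = 0
        self._cond = threading.Condition()

    def await_tick(self, tick):
        # Block the calling test thread until the clock has reached `tick`.
        with self._cond:
            self._cond.wait_for(lambda: self._tick >= tick)

    def advance(self):
        with self._cond:
            self._tick += 1
            self._cond.notify_all()

class OneSlotBuffer:
    """Stand-in component under test: a one-slot producer/consumer monitor."""
    def __init__(self):
        self._item = None
        self._cond = threading.Condition()

    def put(self, item):
        with self._cond:
            self._cond.wait_for(lambda: self._item is None)
            self._item = item
            self._cond.notify_all()

    def get(self):
        with self._cond:
            self._cond.wait_for(lambda: self._item is not None)
            item, self._item = self._item, None
            self._cond.notify_all()
            return item

def run_driver():
    clock, buf, outputs = LogicalClock(), OneSlotBuffer(), {}

    def producer():                 # call scheduled at tick 1
        clock.await_tick(1)
        buf.put("x")
        outputs[1] = ("put", "x")

    def consumer():                 # call scheduled at tick 2
        clock.await_tick(2)
        outputs[2] = ("get", buf.get())

    threads = [threading.Thread(target=t) for t in (producer, consumer)]
    for t in threads:
        t.start()
    clock.advance()                 # release the call scheduled at tick 1
    clock.advance()                 # release the call scheduled at tick 2
    for t in threads:
        t.join()

    expected = {1: ("put", "x"), 2: ("get", "x")}
    assert outputs == expected, f"unexpected outputs: {outputs}"
    print("test sequence passed")

if __name__ == "__main__":
    run_driver()
```

The driver releases one scheduled call per clock tick, so the interleaving of put and get is fixed and the collected outputs can be compared against the expected test sequence.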
Abstract:
Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a ‘black-box’ agent modelled by a set of propositional logic rules, in which just a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent’s system provides the expected output for a given input. If the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnosis-related problems and propose and develop models with different logics, which we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults). Such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches such as SAT (which requires substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, and also the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound) using a variety of logics to improve an existing efficient tool based on SAT and ILP.
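As a hedged illustration of the consistency-based diagnosis idea underlying this work (not the dissertation's CLP solvers or its benchmark circuits), the Python sketch below enumerates stuck-at fault hypotheses for a two-gate circuit and keeps those consistent with the observed input/output behaviour, listing the minimal diagnoses first. The circuit, gate names and fault modes are invented for the example.

```python
from itertools import product

# Tiny circuit: z = NOT(AND(a, b)). Each gate may be healthy or stuck-at-0/1.
GATES = ["and1", "not1"]
MODES = ["ok", "stuck0", "stuck1"]

def apply_mode(value, mode):
    # Apply a fault mode to the healthy output value of a gate.
    return {"ok": value, "stuck0": 0, "stuck1": 1}[mode]

def simulate(a, b, assignment):
    and_out = apply_mode(a & b, assignment["and1"])
    return apply_mode(1 - and_out, assignment["not1"])

def diagnoses(observations):
    """Return fault assignments consistent with all (a, b, z) observations,
    sorted so that minimal diagnoses (fewest faulty gates) come first."""
    consistent = []
    for modes in product(MODES, repeat=len(GATES)):
        assignment = dict(zip(GATES, modes))
        if all(simulate(a, b, assignment) == z for a, b, z in observations):
            consistent.append(assignment)
    return sorted(consistent, key=lambda d: sum(m != "ok" for m in d.values()))

if __name__ == "__main__":
    # A healthy circuit would give z = 0 for a = b = 1; here z = 1 is observed.
    for d in diagnoses([(1, 1, 1), (0, 1, 1)]):
        print(d)
```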
Abstract:
The research described in this thesis was developed as part of the Information Management for Green Design (IMAGREE) Project. The IMAGREE Project was funded by Enterprise Ireland under a Strategic Research Grant Scheme as a partnership project between Galway-Mayo Institute of Technology and CIMRU, University College Galway. The project aimed to develop a CAD-integrated software tool to support environmental information management for design, particularly for the electronics-manufacturing sector in Ireland.
Abstract:
This work describes a test tool for running performance tests of different end-to-end available bandwidth estimation algorithms and of their different implementations. The goal of such tests is to find the best-performing algorithm and implementation and to use it in a congestion control mechanism for high-performance reliable transport protocols. The main idea of this paper is to describe the options that provide an available bandwidth estimation mechanism for high-speed data transport protocols and to develop the basic functionality of such a test tool, with which it will be possible to manage the test application entities on all involved testing hosts, aided by some middleware.
Abstract:
Satellite remote sensing imagery is used for forestry, conservation and environmental applications, but insufficient spatial resolution and, in particular, the unavailability of images at the precise timing required for a given application often prevent these applications from reaching a fully operational stage. Airborne remote sensing has the advantage of custom-tuned sensors, resolution and timing, but its price prevents using it as a routine technique in these fields. Some Unmanned Aerial Vehicles might provide a “third way” solution as low-cost techniques for acquiring remotely sensed information under close control of the end-user, albeit at the expense of lower-quality instrumentation and stability. This report evaluates a light remote sensing system based on a remotely controlled mini-UAV (ATMOS-3) equipped with a colour infra-red camera (VEGCAM-1) designed and operated by CATUAV. We conducted a testing mission over a Mediterranean landscape dominated by an evergreen woodland of Aleppo pine (Pinus halepensis) and holm oak (Quercus ilex) in the Montseny National Park (Catalonia, NE Spain). We took advantage of state-of-the-art ortho-rectified digital aerial imagery (acquired by the Institut Cartogràfic de Catalunya over the area during the previous year) and used it as a quality reference. In particular, we paid attention to: 1) Operationality of flight and image acquisition according to a previously defined plan; 2) Radiometric and geometric quality of the images; and 3) Operational use of the images in the context of applications. We conclude that the system has achieved an operational stage regarding flight activities, although with meteorological limits set by wind speed and turbulence. Appropriate landing areas can also sometimes be limiting, but the system is able to land on small and relatively rough terrains such as patches of grassland or short matorral, and we have operated the UAV as far as 7 km from the control unit. Radiometric quality is sufficient for interactive analysis, but probably insufficient for automated processing. A forthcoming camera is expected to greatly improve radiometric quality and consistency. Conventional GPS positioning through time synchronization provides coarse orientation of the images, with no roll information.
Abstract:
The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of the spectroscopic data comes from the fact that they are multivariate (a few thousand variables) and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to provide evidence of structure in the data, identify clusters, or detect outliers. With the FTIR spectra, the principal components (PCs) correspond to binder types and the presence/absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, we observe six different clusters corresponding to the different pigment compositions when plotting the first two PCs, which account for 37% and 20% of the total variance respectively. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and better efficiency, giving robust results with time-saving data treatments.
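A minimal sketch of the pretreatment and exploration pipeline described above (SNV, restriction to the reported spectral windows, then PCA), assuming numpy and scikit-learn are available; the spectra and wavenumber axis below are synthetic placeholders rather than the 34 measured paint samples.

```python
import numpy as np
from sklearn.decomposition import PCA

def snv(spectra):
    """Standard Normal Variate: centre and scale each spectrum individually."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def select_windows(spectra, wavenumbers, windows):
    """Keep only the variables falling inside the given wavenumber windows."""
    mask = np.zeros(wavenumbers.shape, dtype=bool)
    for lo, hi in windows:
        mask |= (wavenumbers >= lo) & (wavenumbers <= hi)
    return spectra[:, mask]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    wavenumbers = np.arange(600, 4000, 2.0)        # synthetic FTIR axis (cm-1)
    spectra = rng.random((34, wavenumbers.size))   # placeholder for 34 paint spectra

    pretreated = snv(spectra)
    selected = select_windows(pretreated, wavenumbers,
                              [(650, 1830), (2730, 3600)])  # windows reported above

    pca = PCA(n_components=4)
    scores = pca.fit_transform(selected)
    print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
    print("scores shape:", scores.shape)
```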
Abstract:
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve the analytical and search processes for ink samples. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library.
Abstract:
In the first part of this research, three stages were defined for a programme to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are (a) develop a standard methodology for analysing ink samples by high-performance thin layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations and by different examiners; (b) compare ink samples automatically and objectively; and (c) define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly employed in biometric studies. The results show that different algorithms are best suited for different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
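The paper's own comparison algorithms are not reproduced here; purely as an illustration of the evaluation style described above, the sketch below scores pairs of (already calibrated) intensity profiles with Pearson correlation and reports the false match / false non-match rates used in biometric-style performance tests. All profiles and thresholds are synthetic.

```python
import numpy as np

def similarity(profile_a, profile_b):
    """Example comparison metric: Pearson correlation between two ink profiles."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

def error_rates(genuine_scores, impostor_scores, threshold):
    """Biometric-style performance figures at a given decision threshold."""
    fnmr = np.mean(np.asarray(genuine_scores) < threshold)   # false non-match rate
    fmr = np.mean(np.asarray(impostor_scores) >= threshold)  # false match rate
    return fmr, fnmr

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    base = rng.random((5, 200))   # 5 synthetic reference ink profiles
    # Same ink remeasured with small noise vs. comparisons between different inks.
    genuine = [similarity(p, p + rng.normal(0, 0.05, p.size)) for p in base]
    impostor = [similarity(base[i], base[j]) for i in range(5) for j in range(5) if i != j]

    for t in (0.5, 0.7, 0.9):
        fmr, fnmr = error_rates(genuine, impostor, t)
        print(f"threshold {t:.1f}: FMR={fmr:.2f}, FNMR={fnmr:.2f}")
```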
Abstract:
Almost 30 years ago, Bayesian networks (BNs) were developed in the field of artificial intelligence as a framework intended to assist researchers and practitioners in applying the theory of probability to inference problems of more substantive size and, thus, to more realistic and practical problems. Since the late 1980s, Bayesian networks have also attracted researchers in forensic science, and this tendency has considerably intensified throughout the last decade. This review article provides an overview of the scientific literature that describes research on Bayesian networks as a tool that can be used to study, develop and implement probabilistic procedures for evaluating the probative value of particular items of scientific evidence in forensic science. Primary attention is drawn here to evaluative issues that pertain to forensic DNA profiling evidence, because this is one of the main categories of evidence whose assessment has been studied through Bayesian networks. The scope of topics is large and includes almost any aspect that relates to forensic DNA profiling. Typical examples are inference of source (or 'criminal identification'), relatedness testing, database searching and special trace evidence evaluation (such as mixed DNA stains or stains with low quantities of DNA). The perspective of the review presented here is not exclusively restricted to DNA evidence, but also includes relevant references and discussion on both the concept of Bayesian networks and their general usage in legal sciences as one among several different graphical approaches to evidence evaluation.
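As a hedged illustration of the kind of reasoning such networks support (not an example taken from the review), the sketch below evaluates a minimal two-node network, with a source-level hypothesis node and a DNA-correspondence node, by exact enumeration, returning the likelihood ratio and the posterior probability. All probability values are illustrative placeholders.

```python
# Minimal two-node Bayesian network: H (suspect is / is not the source of the stain)
# and E (reported correspondence between the crime-stain and suspect profiles).
# All numbers are illustrative placeholders, not values from the review.

prior = {"Hp": 0.5, "Hd": 0.5}            # source-level hypotheses
# P(E = match | H): near-certain match if the suspect is the source,
# a random-match probability otherwise.
likelihood = {"Hp": 0.99, "Hd": 1e-6}

def posterior(evidence_observed=True):
    # Exact enumeration over the two hypothesis states.
    joint = {
        h: prior[h] * (likelihood[h] if evidence_observed else 1 - likelihood[h])
        for h in prior
    }
    z = sum(joint.values())
    return {h: p / z for h, p in joint.items()}

if __name__ == "__main__":
    post = posterior(evidence_observed=True)
    lr = likelihood["Hp"] / likelihood["Hd"]
    print(f"likelihood ratio: {lr:.3g}")
    print(f"posterior P(Hp | E): {post['Hp']:.6f}")
```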
Abstract:
Whether for investigative or intelligence purposes, crime analysts often face the need to analyse the spatio-temporal distribution of crimes or traces left by suspects. This article presents a visualisation methodology supporting recurrent practical analytical tasks such as the detection of crime series or the analysis of traces left by digital devices such as mobile phones or GPS devices. The proposed approach has led to the development of a dedicated tool that has proven its effectiveness in real inquiries and intelligence practices. It supports a more fluent visual analysis of the collected data and may provide critical clues to support police operations, as exemplified by the presented case studies.
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.