Abstract:
The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions can be studied rigorously and in great detail using a considerable variety of quantum dynamics computational tools available to the scientific community. In this contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ -> NeH+ + H. The codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that the two strategies are, to a great extent, not competing but complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches provide much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies and the computational effort required to account for the Coriolis couplings, are also analyzed in this paper.
Abstract:
Market segmentation first emerged in the 1950s and has since been one of the fundamental concepts of marketing. Most segmentation research, however, has focused on consumer markets, leaving the segmentation of business and industrial markets with less attention. The aim of this study is to create a segmentation model for industrial markets from the perspective of a provider of information technology products and services. The purpose is to determine whether the case company's current customer databases enable effective segmentation, to identify suitable segmentation criteria, and to assess whether and how the databases should be developed to enable more effective segmentation. The intention is to create a single model shared by the different business units; accordingly, the objectives of the various units must be taken into account to avoid conflicts of interest. The research methodology is a case study. Both secondary sources and primary sources, such as the case company's own databases and interviews, were used. The starting point of the study was the research problem: can database-driven segmentation be used for profitable customer relationship management in the SME sector? The goal is to create a segmentation model that exploits the information in the databases without compromising the requirements of effective and profitable segmentation. The theoretical part examines segmentation in general, with an emphasis on industrial market segmentation, aiming to give a clear picture of the different approaches to the topic and to deepen the treatment of the most important theories. The analysis of the databases revealed clear deficiencies in the customer data: basic contact information is available, but information usable for segmentation is very limited. The flow of data from resellers and wholesalers should be improved in order to obtain end-customer information. Segmentation based on the current data relies mainly on secondary information such as industry and company size, and even this information is not available for all companies in the databases.
Abstract:
The evaluation of children's statements in forensic cases of sexual abuse is critically important and must be reliable. Criteria-based content analysis (CBCA) is the main component of statement validity assessment (SVA), the most frequently used approach in this setting. This study investigated the inter-rater reliability (IRR) of CBCA in a forensic context. Three independent raters evaluated the transcripts of 95 statements of sexual abuse. IRR was calculated for each criterion, for the total score, and for the overall evaluation. IRR varied across the criteria, with several being unsatisfactory, but high IRR was found for the total CBCA scores (Kendall's W = 0.84) and for the overall evaluation (Kendall's W = 0.65). Despite some shortcomings, SVA remains a robust method for the comprehensive evaluation of children's statements of sexual abuse in the forensic setting. However, the low IRR of some CBCA criteria could justify some technical improvements.
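For readers unfamiliar with the agreement statistic reported above, the following is a minimal sketch of how Kendall's W (coefficient of concordance) is computed for a raters-by-subjects score matrix. The scores are invented toy numbers, not the study's data, and the simple formula below omits the tie correction.

```python
# Kendall's W: agreement among m raters ranking n subjects.
# W = 12*S / (m^2 * (n^3 - n)), with S the sum of squared deviations
# of the subjects' rank sums from their mean (no tie correction here).
import numpy as np
from scipy.stats import rankdata

def kendalls_w(scores: np.ndarray) -> float:
    """scores: (m_raters, n_subjects) matrix, e.g. CBCA total scores."""
    m, n = scores.shape
    ranks = np.apply_along_axis(rankdata, 1, scores)  # rank each rater's row
    rank_sums = ranks.sum(axis=0)                     # R_j per subject
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three raters scoring five statements (toy numbers):
scores = np.array([[12, 18, 7, 15, 9],
                   [11, 20, 8, 14, 10],
                   [13, 17, 6, 16, 9]])
print(f"Kendall's W = {kendalls_w(scores):.2f}")  # 1.00: perfect concordance here
```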
Abstract:
Objective: We propose and validate a computer-aided system to measure three different mandibular indexes: cortical width, panoramic mandibular index, and mandibular alveolar bone resorption index. Study Design: The repeatability and reproducibility of the measurements are analyzed and compared with manual estimation of the same indexes. Results: The proposed computerized system exhibits superior repeatability and reproducibility compared to standard manual methods. Moreover, the time required to perform the measurements with the proposed method is negligible compared to that required to perform them manually. Conclusions: We have proposed a very user-friendly computerized method to measure three different morphometric mandibular indexes. From the results we can conclude that the system provides a practical way to perform these measurements. It does not require an expert examiner and takes no more than 16 seconds per analysis. Thus, it may be suitable for diagnosing osteoporosis using dental panoramic radiographs.
Abstract:
From 6 to 8 November 1982 one of the most catastrophic flash-flood events on record struck the Eastern Pyrenees, affecting Andorra as well as France and Spain, with rainfall accumulations exceeding 400 mm in 24 h, 44 fatalities and widespread damage. This paper aims to document this heavy precipitation event exhaustively and examines mesoscale simulations performed with the French Meso-NH non-hydrostatic atmospheric model. Large-scale simulations show the slowly evolving synoptic environment favourable to the development of a deep Atlantic cyclone, which induced a strong southerly flow over the Eastern Pyrenees. From the evolution of the synoptic pattern, four distinct phases have been identified during the event. The mesoscale analysis identifies the second and third phases as the most intense in terms of rainfall accumulations and highlights the interaction of the moist and conditionally unstable flows with the mountains. The presence of a south-westerly low-level jet (30 m s-1) around 1500 m also played a crucial role in focusing the precipitation over the exposed southern slopes of the Eastern Pyrenees. Backward trajectories based on Eulerian on-line passive tracers indicate that orographic uplift was the main forcing mechanism that triggered and maintained the precipitating systems for more than 30 h over the Pyrenees. The moisture of the feeding flow came mainly from the Atlantic Ocean (7-9 g kg-1), while the role of the Mediterranean as a local moisture source was very limited (2-3 g kg-1) owing to the high initial water vapour content of the parcels and their rapid passage over the basin along the Spanish Mediterranean coast (less than 12 h).
Abstract:
Currently available molecular biology tools allow forensic scientists to characterize DNA evidence found at crime scenes for a large variety of samples, including those of limited quantity and quality, and achieve high levels of individualization. Yet, standard forensic markers provide limited or no results when applied to mixed DNA samples in which the contributors are present in very different proportions (unbalanced DNA mixtures). This becomes an issue mostly for the analysis of trace samples collected on the victim or from touched objects. To this end, we recently proposed an innovative type of genetic marker, named DIP-STR, that relies on pairing deletion/insertion polymorphisms (DIP) with standard short tandem repeats (STR). This novel compound marker allows detection of the minor DNA contributor in a DNA mixture of any gender and cellular origin with unprecedented resolution (beyond a DNA ratio of 1:1000). To provide a novel analytical tool of practical use to forensic laboratories, this article describes the first set of 10 DIP-STR markers selected according to forensic technical standards. The novel DIP-STR regions are short (between 146 and 271 bp), include only highly polymorphic tri-, tetra- and pentanucleotide tandem repeats, and are located on different chromosomes or chromosomal arms to provide statistically independent results. This novel set of DIP-STRs can target the amplification of 0.03-0.1 ng of DNA mixed with a 1000-fold excess of major DNA. DIP-STR relative allele frequencies are estimated based on a survey of 103 Swiss individuals. Finally, this study provides an estimate of the occurrence of informative alleles and a calculation of the corresponding random match probability of the detected minor DIP-STR genotype, assessed across 10,506 pairwise conceptual mixtures.
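The random match probability mentioned at the end of the abstract follows standard forensic-genetics arithmetic. Below is a minimal sketch assuming Hardy-Weinberg equilibrium and locus independence (the latter is what placing markers on different chromosomes or chromosomal arms is meant to justify); the allele frequencies are illustrative placeholders, not the Swiss survey estimates.

```python
# Random match probability (RMP) for a multi-locus genotype:
# p^2 for a homozygote, 2*p*q for a heterozygote, multiplied across
# independent loci. Frequencies below are toy values for illustration.

def locus_rmp(p: float, q: float | None = None) -> float:
    """RMP at one locus; pass q only for heterozygous genotypes."""
    return p * p if q is None else 2.0 * p * q

def profile_rmp(loci: list[tuple[float, float | None]]) -> float:
    """Product of per-locus RMPs across statistically independent loci."""
    rmp = 1.0
    for p, q in loci:
        rmp *= locus_rmp(p, q)
    return rmp

# Example: two heterozygous loci and one homozygous locus (toy frequencies).
print(profile_rmp([(0.12, 0.30), (0.08, 0.22), (0.15, None)]))
```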
Abstract:
Deliberate fires appear to be borderless and timeless events, creating a serious security problem. There have been many attempts to develop approaches to tackle this problem, but unfortunately acting effectively against deliberate fires has proven a complex challenge. This article reviews the current situation relating to deliberate fires: what do we know, how serious is the situation, how is it being dealt with, and what challenges are faced when developing a systematic and global methodology to tackle the issues? The repetitive nature of some types of deliberate fires will also be discussed. Finally, drawing on the reality of repetition within deliberate fires and encouraged by successes obtained against other repetitive crimes (such as property crimes or drug trafficking), we will argue that using the intelligence process cycle as a framework for the follow-up and systematic analysis of fire events is a relevant approach. This is the first article of a series of three. This first part introduces the context and discusses the background issues in order to provide better underpinning knowledge to managers and policy makers planning to tackle this issue. The second part will present a methodology developed to detect and identify repetitive fire events from a set of data, and the third part will discuss the analyses of these data to produce intelligence.
Abstract:
This paper is concerned with the contribution of forensic science to the legal process by helping reduce uncertainty. Although it is now widely accepted that uncertainty should be handled by probability because it is a safeguard against incoherent proceedings, there remain diverging and conflicting views on how probability ought to be interpreted. This is exemplified by the proposals in scientific literature that call for procedures of probability computation that are referred to as "objective," suggesting that scientists ought to use them in their reporting to recipients of expert information. I find such proposals objectionable. They need to be viewed cautiously, essentially because ensuing probabilistic statements can be perceived as making forensic science prescriptive. A motivating example from the context of forensic DNA analysis will be chosen to illustrate this. As a main point, it shall be argued that such constraining suggestions can be avoided by interpreting probability as a measure of personal belief, that is, subjective probability. Invoking references to foundational literature from mathematical statistics and philosophy of science, the discussion will explore the consequences of this interdisciplinary viewpoint for the practice of forensic expert reporting. It will be emphasized that, as an operational interpretation of probability, the subjectivist perspective enables forensic science to add value to the legal process, in particular by avoiding inferential impasses to which other interpretations of probability may lead. Moreover, understanding probability from a subjective perspective can encourage participants in the legal process to take on more responsibility in matters regarding the coherent handling of uncertainty. This would assure more balanced interactions at the interface between science and the law. This, in turn, provides support for ongoing developments that can be called the "probabilization" of forensic science.
Abstract:
Over the past few decades, age estimation of living persons has represented a challenging task for many forensic services worldwide. In general, the process for age estimation includes the observation of the degree of maturity reached by some physical attributes, such as dentition or several ossification centers. The estimated chronological age, or the probability that an individual belongs to a meaningful class of ages, is then obtained from the observed degree of maturity by means of various statistical methods. Among these methods, those developed in a Bayesian framework offer users the possibility of coherently dealing with the uncertainty associated with age estimation and of assessing in a transparent and logical way the probability that an examined individual is younger or older than a given age threshold. Recently, a Bayesian network for age estimation has been presented in the scientific literature; this kind of probabilistic graphical tool may facilitate the use of the probabilistic approach. Probabilities of interest in the network are assigned by means of transition analysis, a statistical parametric model which links chronological age and degree of maturity through specific regression models, such as logit or probit models. Since different regression models can be employed in transition analysis, the aim of this paper is to study the influence of the model on the classification of individuals. The analysis was performed using a dataset related to the ossification status of the medial clavicular epiphysis, and the results indicate that the classification of individuals does not depend on the choice of the regression model.
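To make the transition-analysis idea concrete, here is a minimal sketch of a logit transition model feeding a Bayesian age-threshold query. The intercept, slope, discretization and flat age prior are invented for illustration; they are not the clavicular-epiphysis estimates from the paper.

```python
# Transition analysis with a logit link: the probability that a maturity
# stage has been reached given age, combined via Bayes' rule with an age
# prior to yield P(age >= 18 | observed stage). All parameters are toy values.
import numpy as np

def p_mature(age: np.ndarray, a: float = -9.0, b: float = 0.5) -> np.ndarray:
    """Logit model: P(stage reached | age) = 1 / (1 + exp(-(a + b*age)))."""
    return 1.0 / (1.0 + np.exp(-(a + b * age)))

ages = np.arange(10, 31)                     # discretized chronological ages
prior = np.full(ages.size, 1.0 / ages.size)  # flat prior over the age range

likelihood = p_mature(ages)                  # observed: stage reached
posterior = prior * likelihood
posterior /= posterior.sum()

# Probability the individual is at least 18, the quantity of legal interest:
print(f"P(age >= 18 | stage) = {posterior[ages >= 18].sum():.3f}")
```

A probit variant would simply swap the logistic function for the normal CDF; the paper's finding is that the resulting classification is largely insensitive to this choice.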
Abstract:
Modelling the shoulder's musculature is challenging given its mechanical and geometric complexity. The use of the ideal fibre model to represent a muscle's line of action cannot always faithfully represent the mechanical effect of each muscle, leading to considerable differences between model-estimated and in vivo measured muscle activity. While the musculo-tendon force coordination problem has been extensively analysed in terms of the cost function, only a few works have investigated the existence and sensitivity of solutions with respect to fibre topology. The goal of this paper is to present an analysis of the solution set using the concepts of torque-feasible space (TFS) and wrench-feasible space (WFS) from cable-driven robotics. A shoulder model is presented and a simple musculo-tendon force coordination problem is defined. The ideal fibre model for representing muscles is reviewed, and the TFS and WFS are defined, leading to necessary and sufficient conditions for the existence of a solution. The shoulder model's TFS is analysed to explain the lack of anterior deltoid (DLTa) activity. Based on the analysis, a modification of the model's muscle fibre geometry is proposed. The performance with and without the modification is assessed by solving the musculo-tendon force coordination problem for quasi-static abduction in the scapular plane. After the proposed modification, the DLTa reaches 20% activation.
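The existence condition behind the TFS can be phrased as a simple feasibility question, sketched below under assumed toy data: a joint torque tau lies in the TFS if and only if there exist fibre tensions 0 <= f <= f_max with W f = tau, where W is a moment-arm (wrench) matrix. The matrix and bounds here are placeholders, not the shoulder model's geometry.

```python
# Torque-feasibility test as a phase-1 linear program: we only need to know
# whether a feasible point exists, so the objective is zero.
import numpy as np
from scipy.optimize import linprog

def torque_feasible(W: np.ndarray, tau: np.ndarray, f_max: np.ndarray) -> bool:
    """True iff some tension vector 0 <= f <= f_max satisfies W @ f = tau."""
    n = W.shape[1]
    res = linprog(c=np.zeros(n),                 # pure feasibility problem
                  A_eq=W, b_eq=tau,
                  bounds=[(0.0, fm) for fm in f_max],
                  method="highs")
    return res.status == 0

W = np.array([[0.04, -0.02, 0.03],               # toy: 2 torque components,
              [0.01,  0.05, -0.02]])             # 3 muscle fibres
f_max = np.array([800.0, 600.0, 700.0])          # max tensions (N)
print(torque_feasible(W, np.array([20.0, 10.0]), f_max))  # True for this toy case
```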
Abstract:
The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Our results, obtained from several simulation experiments of a clinical trial, show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications because a single statistic is associated with all the replications. We further present several verification scenarios, obtained by altering the simulation model, that show the usefulness of PVL. Additional simulation experiments suggest that a 0 to 20% range may define adequate limits for the verification problem when considered from the viewpoint of an equivalence test.
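The abstract does not spell out the formula for PVL, so the sketch below is only one plausible reading of "relative distance between likelihoods", stated as an assumption: the percentage deviation of the replications' pooled log-likelihood from the conceptual model's log-likelihood. All numbers are toy values.

```python
# Assumed form of a relative-likelihood distance in the spirit of PVL
# (NOT the paper's exact definition): 100 * |mean(L_rep) - L_model| / |L_model|.
import numpy as np

def pvl(loglik_replications: np.ndarray, loglik_model: float) -> float:
    """Percentage deviation of the replications' mean log-likelihood."""
    return 100.0 * abs(loglik_replications.mean() - loglik_model) / abs(loglik_model)

# Toy log-likelihoods from 10 simulation replications vs the conceptual model:
reps = np.array([-1012.4, -1008.9, -1015.2, -1010.0, -1011.7,
                 -1009.3, -1013.8, -1010.6, -1012.1, -1011.0])
score = pvl(reps, loglik_model=-1005.0)
print(f"PVL = {score:.1f}%")  # values within the 0-20% band would count as verified
```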
Abstract:
Partial-thickness tears of the supraspinatus tendon frequently occur at its insertion on the greater tubercle of the humerus, causing pain and reduced strength and range of motion. The goal of this work was to quantify the loss of loading capacity due to tendon tears at the insertion area. A finite element model of the supraspinatus tendon was developed using in vivo magnetic resonance imaging data. The tendon was represented by an anisotropic hyperelastic constitutive law identified from experimental measurements. A failure criterion was proposed and calibrated with experimental data. A partial-thickness tear was gradually increased, starting from the deep articular-sided fibres. For different values of tear thickness, the tendon was mechanically loaded up to failure. The numerical model predicted a loss in loading capacity of the tendon as the tear thickness progressed. Tendon failure was more likely once the tear exceeded 20%. The predictions of the model were consistent with experimental studies. Partial-thickness tears below 40% are sufficiently stable to withstand physiotherapeutic exercises; above a 60% tear, surgery should be considered to restore shoulder strength.
Abstract:
Objective: To compare the accuracy of computer-aided ultrasound (US) and magnetic resonance imaging (MRI) by means of hepatorenal gradient analysis in the evaluation of nonalcoholic fatty liver disease (NAFLD) in adolescents. Materials and Methods: This prospective, cross-sectional study evaluated 50 adolescents (aged 11-17 years), including 24 obese and 26 eutrophic individuals. All adolescents underwent computer-aided US, MRI, laboratory tests, and anthropometric evaluation. Sensitivity, specificity, positive and negative predictive values, and accuracy were evaluated for both imaging methods, with subsequent generation of the receiver operating characteristic (ROC) curve and calculation of the area under the ROC curve to determine the most appropriate cutoff point for the hepatorenal gradient in order to predict the degree of steatosis, using the MRI results as the gold standard. Results: The obese group included 29.2% girls and 70.8% boys, and the eutrophic group 69.2% girls and 30.8% boys. The prevalence of NAFLD was 19.2% in the eutrophic group and 83% in the obese group. The ROC curve generated for the hepatorenal gradient with a cutoff point of 13 presented 100% sensitivity and 100% specificity. When the same cutoff point was applied to the eutrophic group, false-positive results were observed in 9.5% of cases (90.5% specificity) and false-negative results in 0% (100% sensitivity). Conclusion: Computer-aided US with hepatorenal gradient calculation is a simple and noninvasive technique for the semiquantitative evaluation of hepatic echogenicity and could be useful in the follow-up of adolescents with NAFLD, in population screening for this disease, and in clinical studies.
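The cutoff-selection step described above follows a standard ROC workflow, sketched below with fabricated toy data (the gradient values and MRI labels are not the study's measurements): build the ROC curve for the hepatorenal gradient against the MRI gold standard, then pick the threshold maximizing sensitivity + specificity (Youden's J).

```python
# ROC curve and Youden's-J cutoff for a continuous score against a binary
# gold standard. Toy data: negatives score 5-12, positives 14-22, so the
# classes separate perfectly and the chosen threshold falls between them.
import numpy as np
from sklearn.metrics import roc_curve, auc

gradient = np.array([5, 7, 9, 11, 12, 14, 15, 17, 19, 22])  # US hepatorenal gradient
steatosis = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])        # MRI: 1 = NAFLD

fpr, tpr, thresholds = roc_curve(steatosis, gradient)
print(f"AUC = {auc(fpr, tpr):.2f}")          # 1.00 for this perfectly separated toy set

j = tpr - fpr                                # Youden's J at each candidate threshold
best = thresholds[np.argmax(j)]
print(f"optimal cutoff = {best}")            # 14.0 here; any value in (12, 14] works
```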
Abstract:
The genetic impact associated with the Neolithic spread in Europe has been widely debated over the last 20 years. Within this context, ancient DNA studies have provided a more reliable picture by directly analyzing the protagonist populations at different regions in Europe. However, the lack of available data from the original Near Eastern farmers has limited the conclusions that could be drawn, preventing the formulation of continental models of Neolithic expansion. Here we address this issue by presenting mitochondrial DNA data from the original Near Eastern Neolithic communities, with the aim of providing the adequate background for the interpretation of Neolithic genetic data from European samples. Sixty-three skeletons from the Pre-Pottery Neolithic B (PPNB) sites of Tell Halula, Tell Ramad and Dja'de El Mughara, dating between 8,700-6,600 cal. B.C., were analyzed, and 15 validated mitochondrial DNA profiles were recovered. In order to estimate the demographic contribution of the first farmers to both Central European and Western Mediterranean Neolithic cultures, haplotype and haplogroup diversities in the PPNB sample were compared, using phylogeographic and population genetic analyses, to available ancient DNA data from human remains belonging to the Linearbandkeramik-Alföldi Vonaldiszes Kerámia and Cardial/Epicardial cultures. We also searched for possible signatures of the original Neolithic expansion in the modern Near Eastern and South European genetic pools, and tried to infer possible routes of expansion by comparing the obtained results to a database of 60 modern populations from both regions. Comparisons performed among the three ancient datasets allowed us to identify K and N-derived mitochondrial DNA haplogroups as potential markers of the Neolithic expansion, whose genetic signature would have reached both the Iberian coasts and the Central European plain. Moreover, the observed genetic affinities between the PPNB samples and the modern populations of Cyprus and Crete seem to suggest that the Neolithic was first introduced into Europe through pioneer seafaring colonization.
Abstract:
This study aimed at comparing the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail for the subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then systematically evaluated by first analysing blank extracts of the materials to check for potential interferences and determine matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M and PTFE, were compared in terms of collection efficiency during shooting experiments using a set of 9 mm Luger ammunition. The tape was found to recover the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic GSR (IGSR), OGSR analysis might be implemented without modifying the IGSR sampling and analysis procedure.