945 results for digital forensic tool testing
Abstract:
Assigning probabilities to alleged relationships, given DNA profiles, requires, among other things, calculation of a likelihood ratio (LR). Such calculations usually assume independence of genes: this assumption is not appropriate when the tested individuals share recent ancestry due to population substructure. Adjusted LR formulae, incorporating the coancestry coefficient F_ST, are presented here for various two-person relationships, and the issue of mutations in parentage testing is also addressed.
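The relationship-specific formulae themselves are not reproduced in the abstract; for orientation, a standard form of the F_ST (coancestry) adjustment, the Balding-Nichols allele-sampling formula from which such adjusted LRs are typically built, is sketched below.

```latex
% Balding-Nichols sampling formula: probability of drawing allele A_i given
% that n_i copies of A_i have already been observed among n alleles sampled
% from the same subpopulation (coancestry coefficient F_ST, allele frequency
% p_i). Shown for orientation only; the paper derives relationship-specific
% LR formulae built from conditional probabilities of this kind.
\[
  \Pr(A_i \mid n_i \text{ copies seen in } n \text{ alleles})
    = \frac{n_i F_{ST} + (1 - F_{ST})\, p_i}{1 + (n - 1) F_{ST}}
\]
```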
Abstract:
The Virtual Lightbox for Museums and Archives (VLMA) is a tool for collecting and reusing, in a structured fashion, the online contents of museum and archive datasets. It is not restricted to datasets with visual components, although VLMA includes a lightbox service that enables comparison and manipulation of visual information. With VLMA, one can browse and search collections, construct personal collections, annotate them, export these collections to XML or Impress (Open Office) presentation format, and share collections with other VLMA users. VLMA was piloted as an e-Learning tool as part of JISC’s e-Learning focus in its first phase (2004-2005), and in its second phase (2005-2006) it has incorporated new partner collections while improving and expanding interfaces and services. This paper concerns its development as a research and teaching tool, especially for teachers using museum collections, and discusses its recent development.
Abstract:
The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic, as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or as a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
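As a rough illustration of the kind of relational and temporal analysis the framework describes, the sketch below buckets a toy post log by hour, flags ‘peak’ windows, and ranks the most active users within them. The data layout, field names and peak heuristic are assumptions for illustration, not the paper’s framework.

```python
from collections import Counter
from datetime import datetime

# Illustrative post log: (user, ISO timestamp). Field names and values are
# assumptions, not the paper's data model.
posts = [
    ("alice", "2013-05-01T09:05:00"),
    ("bob",   "2013-05-01T09:20:00"),
    ("alice", "2013-05-01T09:40:00"),
    ("carol", "2013-05-01T13:10:00"),
    ("bob",   "2013-05-01T09:55:00"),
]

# Temporal dimension: count posts per hour and flag 'peak' hours.
per_hour = Counter(datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
                   for _, ts in posts)
threshold = max(per_hour.values()) * 0.75          # simple illustrative cut-off
peak_hours = {h for h, n in per_hour.items() if n >= threshold}

# Relational dimension: rank users by activity within the peak windows,
# yielding a (much smaller) group of individuals of potential interest.
peak_posters = Counter(
    user for user, ts in posts
    if datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00") in peak_hours
)
print("Peak hours:", sorted(peak_hours))
print("Most active users in peaks:", peak_posters.most_common(3))
```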
Abstract:
Background: 29 autoimmune diseases, including Rheumatoid Arthritis, gout, Crohn’s Disease, and Systemic Lupus Erythematosus, affect 7.6-9.4% of the population. While effective therapy is available, many patients do not follow treatment or use medications as directed. Digital health and Web 2.0 interventions have demonstrated much promise in increasing medication and treatment adherence, but to date many Internet tools have proven disappointing. In fact, most digital interventions continue to suffer from high attrition in patient populations, are burdensome for healthcare professionals, and have relatively short life spans. Objective: Digital health tools have traditionally centered on the transformation of existing interventions (such as diaries, trackers, stage-based or cognitive behavioral therapy programs, coupons, or symptom checklists) to electronic format. Advanced digital interventions have also incorporated attributes of Web 2.0 such as social networking, text messaging, and the use of video. Despite these efforts, there has been little measurable impact on non-adherence for illnesses that require medical interventions, and research must look to other strategies or development methodologies. As a first step in investigating the feasibility of developing such a tool, the objective of the current study is to systematically rate factors of non-adherence that have been reported in past research studies. Methods: Grounded Theory, recognized as a rigorous method that facilitates the emergence of new themes through systematic analysis, data collection and coding, was used to analyze quantitative, qualitative and mixed-method studies addressing the following autoimmune diseases: Rheumatoid Arthritis, gout, Crohn’s Disease, Systemic Lupus Erythematosus, and inflammatory bowel disease. Studies were only included if they contained primary data addressing the relationship with non-adherence. Results: Out of the 27 studies, four non-modifiable and 11 modifiable risk factors were discovered. Over one third of articles identified the following risk factors as common contributors to medication non-adherence (percent of studies reporting): patients not understanding treatment (44%), side effects (41%), age (37%), dose regimen (33%), and perceived medication ineffectiveness (33%). An unanticipated finding that emerged was the need for risk stratification tools (81%) with patient-centric approaches (67%). Conclusions: This study systematically identifies and categorizes medication non-adherence risk factors in select autoimmune diseases. Findings indicate that patients’ understanding of their disease and the role of medication are paramount. An unexpected finding was that the majority of research articles called for the creation of tailored, patient-centric interventions that dispel personal misconceptions about disease, pharmacotherapy, and how the body responds to treatment. To our knowledge, these interventions do not yet exist in digital format. Rather than adopting a systems-level approach, digital health programs should focus on cohorts with heterogeneous needs, and develop tailored interventions based on individual non-adherence patterns.
Abstract:
Species distribution models (SDM) are increasingly used to understand the factors that regulate variation in biodiversity patterns and to help plan conservation strategies. However, these models are rarely validated with independently collected data, and it is unclear whether SDM performance is maintained across distinct habitats and for species with different functional traits. Highly mobile species, such as bees, can be particularly challenging to model. Here, we use independent sets of occurrence data collected systematically in several agricultural habitats to test how the predictive performance of SDMs for wild bee species depends on species traits, habitat type, and sampling technique. We used a species distribution modeling approach parametrized for the Netherlands, with presence records from 1990 to 2010 for 193 Dutch wild bees. For each species, we built a Maxent model based on 13 climate and landscape variables. We tested the predictive performance of the SDMs with independent datasets collected from orchards and arable fields across the Netherlands from 2010 to 2013, using transect surveys or pan traps. Model predictive performance depended on species traits and habitat type. Occurrence of bee species specialized in habitat and diet was better predicted than that of generalist bees. Predictions of habitat suitability were also more precise for habitats that are temporally more stable (orchards) than for habitats that undergo regular alterations (arable fields), particularly for small, solitary bees. As a conservation tool, SDMs are better suited to modeling rarer, specialist species than more generalist ones, and will work best in long-term stable habitats. The variability of complex, short-term habitats is difficult to capture in such models, and historical land-use data generally have low thematic resolution. To improve SDMs’ usefulness, models require explanatory variables and collection data that include detailed landscape characteristics, for example, variability of crops and flower availability. Additionally, testing SDMs with field surveys should involve multiple collection techniques.
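A minimal sketch of how such a validation might look in practice is given below: Maxent-style suitability scores are compared against independent presence/absence records using AUC. The scores, records and use of scikit-learn are illustrative assumptions, not the study’s actual pipeline.

```python
# Minimal sketch of validating SDM output against independently collected
# field data, in the spirit of the evaluation described above. The suitability
# scores and presence/absence records below are illustrative placeholders,
# not the study's data.
from sklearn.metrics import roc_auc_score

# Maxent habitat-suitability predictions at surveyed sites (0..1).
predicted_suitability = [0.82, 0.10, 0.55, 0.34, 0.91, 0.07, 0.60, 0.25]

# Independent transect / pan-trap observations at the same sites
# (1 = species recorded, 0 = not recorded).
observed_presence = [1, 0, 1, 0, 1, 0, 0, 0]

# AUC as a simple measure of predictive performance; in the study this kind
# of comparison would be repeated per species, habitat type and trait group.
auc = roc_auc_score(observed_presence, predicted_suitability)
print(f"AUC = {auc:.2f}")
```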
Abstract:
The primary objective of this research study is to determine which form of testing, the PEST algorithm or an operator-controlled condition, is more accurate and time-efficient for administering the gaze stabilization test.
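For context, the sketch below implements a generic adaptive staircase of the kind to which PEST belongs; the actual PEST step-size rules are more elaborate, and all parameters, names and the simulated observer are illustrative assumptions.

```python
import math
import random

# Generic adaptive staircase, shown only as a simplified stand-in for an
# adaptive procedure such as PEST; the real PEST step-size rules are more
# elaborate. All parameters and names here are illustrative assumptions.
def staircase(trial_response, start=10.0, step=4.0, min_step=0.5, trials=30):
    """Converge on the stimulus level (e.g. head velocity) at which the
    observer's responses flip between correct and incorrect."""
    level, last_correct = start, None
    for _ in range(trials):
        correct = trial_response(level)
        if last_correct is not None and correct != last_correct:
            step = max(step / 2.0, min_step)     # reversal: halve the step
        level += step if correct else -step      # harder if correct, easier if not
        last_correct = correct
    return level

# Simulated observer whose probability of a correct response drops around a
# 'true' threshold of 12 units, purely for demonstration.
def simulated(level):
    return random.random() < 1.0 / (1.0 + math.exp(level - 12.0))

print("Estimated threshold:", round(staircase(simulated), 1))
```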
Abstract:
We present a catalogue of galaxy photometric redshifts and k-corrections for the Sloan Digital Sky Survey Data Release 7 (SDSS-DR7), available on the World Wide Web. The photometric redshifts were estimated with an artificial neural network using five ugriz bands, concentration indices and Petrosian radii in the g and r bands. We have explored our redshift estimates with different training sets, concluding that the best choice for improving redshift accuracy comprises the main galaxy sample (MGS), the luminous red galaxies and the galaxies with active galactic nuclei covering the redshift range 0 < z < 0.3. For the MGS, the photometric redshift estimates agree with the spectroscopic values within rms = 0.0227. The distribution of photometric redshifts derived in the range 0 < z(phot) < 0.6 agrees well with the model predictions. k-corrections were derived by calibrating the k-correct_v4.2 code results for the MGS against the reference-frame (z = 0.1) (g - r) colours. We adopt a linear dependence of k-corrections on redshift and (g - r) colour that provides suitable distributions of luminosity and colours for galaxies up to redshift z(phot) = 0.6, comparable to the results in the literature. Thus, our k-correction estimation procedure is a powerful, computationally inexpensive algorithm capable of reproducing suitable results that can be used for testing galaxy properties at intermediate redshifts using the large SDSS database.
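The calibrated coefficients are not given in the abstract; schematically, the linear model described above can be written as follows, with per-band coefficients a and b standing in for the calibration against the k-correct_v4.2 results.

```latex
% Schematic form of the linear k-correction model described above; the
% per-band coefficients a and b, and the anchoring at the reference redshift
% z = 0.1, are placeholders rather than the paper's actual calibration.
\[
  K(z,\; g-r) \;\approx\; \bigl[a\,(g-r) + b\bigr]\,(z - 0.1),
  \qquad 0 < z_{\mathrm{phot}} \lesssim 0.6
\]
```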
Abstract:
The widespread use of service-oriented architectures (SOAs) and Web services in commercial software requires the adoption of development techniques that ensure the quality of Web services. Testing techniques and tools play a critical role in achieving quality in SOA-based systems. Existing techniques and tools for traditional systems are not appropriate for these new systems, making the development of dedicated Web service testing techniques and tools necessary. This article presents new testing techniques to automatically generate a set of test cases and data for Web services. The techniques presented here apply data perturbation to Web service messages, based on data types, integrity and consistency. To support these techniques, a tool (GenAutoWS) was developed and applied to real problems. (C) 2010 Elsevier Inc. All rights reserved.
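As a rough illustration of type-based data perturbation (not the GenAutoWS implementation itself), the sketch below mutates each field of a toy service message using simple per-type boundary and invalid values; the message contents and mutation rules are assumptions.

```python
# Minimal sketch of type-based data perturbation of a Web service message,
# in the spirit of the approach described above. The message, field names
# and mutation rules are illustrative assumptions, not GenAutoWS itself.
import copy

request = {"customerId": 42, "amount": 199.90, "currency": "SEK"}

# Simple per-type perturbation rules (boundary and invalid values).
PERTURBATIONS = {
    int:   [0, -1, 2**31 - 1, None],
    float: [0.0, -0.01, float("inf"), None],
    str:   ["", "A" * 10_000, "<invalid>", None],
}

def perturb(message):
    """Yield test cases, each with one field replaced by a perturbed value."""
    for field, value in message.items():
        for bad in PERTURBATIONS.get(type(value), [None]):
            case = copy.deepcopy(message)
            case[field] = bad
            yield field, bad, case

for field, bad, case in perturb(request):
    # Each generated case would be serialized (e.g. to SOAP/XML) and sent to
    # the service under test, checking the response against expectations.
    print(f"perturb {field!r} -> {bad!r}: {case}")
```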
Abstract:
Personalized communication means adapting the marketing message to each individual by using information from a database and utilizing it in the various media channels available today. That gives the marketer the possibility to create a campaign that cuts through today’s clutter of marketing messages and gets the recipient’s attention. PODi is a non-profit organization that was started with the aim of contributing knowledge in the field of digital printing technologies. They have created a database of case studies showing companies that have successfully implemented personalized communication in their marketing campaigns. The purpose of the project was therefore to analyze PODi case studies with the main objective of finding out how successful the PODi cases have been and what made them successful. To collect the data found in the PODi cases, the authors did a content analysis with a sample size of 140 PODi cases from 2008 to 2010. The study was carried out by analyzing the cases’ measurable indicators of success: response rate, conversion rate, visited PURLs (personalized URLs) and ROI (Return On Investment). In order to find out whether there were any relationships between the measurable results and the type of industry, campaign objective and media vehicle used in the campaign, the authors formulated different research questions to explore this. After clustering and merging the collected data, the results were found to be quite spread but show that the averages of response rates, visited PURLs and conversion rates were consistently very high. In the study the authors also collected and summarized what the companies themselves claim to be the reasons for success with their marketing campaigns. The results show that the creation of a personalized campaign is complex and dependent on many different variables. It is, for instance, of great importance to have a well thought-out plan for the campaign and to have good data and insights about the customer in order to perform creative personalization. It is also important to make it easy for the recipient to reply, to use several media vehicles for multiple touch points, and to have an attractive and clever design.
Abstract:
This thesis is about new digital moving image recording technologies and how they augment the distribution of creativity and the flexibility in moving image production systems, but also impose constraints on how images flow through the production system. The central concept developed in this thesis is ‘creative space’, which links quality and efficiency in moving image production to time for creative work, capacity of digital tools, user skills and the constitution of digital moving image material. The empirical evidence of this thesis is primarily based on semi-structured interviews conducted with Swedish film and TV production representatives. This thesis highlights the importance of pre-production technical planning and proposes a design management support tool (MI-FLOW) as a way to leverage functional workflows, which are a prerequisite for efficient and cost-effective moving image production.
Abstract:
This licentiate thesis focuses on the evaluation of the wear of coated and uncoated polycrystalline cubic boron nitride (PCBN) cutting tools used in cutting operations against hardened steel, and on examining the surface finish and integrity of the work material. Harder work materials, higher cutting speeds and cost reductions drive the development of harder and more wear-resistant cutting tools. Although PCBN cutting tools have been used for over 30 years, little work has been done on PVD-coated PCBN cutting tools. Therefore, hard turning and hard milling experiments with PVD-coated and uncoated cutting tools have been performed and evaluated. The coatings used in the present study are TiSiN and TiAlN. The wear scar and surface integrity have been examined with the help of several different characterization techniques, for example scanning electron microscopy and Auger electron spectroscopy. The results showed that the PCBN cutting tools used displayed crater wear, flank wear and edge micro chipping. The influence of the coating on crater and flank wear was very small, and the coating showed a high tendency to spalling. Scratch testing of coated PCBN showed that the TiAlN coating resulted in major adhesive fractures. This demonstrates the importance of understanding the effect of different types of lapping/grinding processes in the pre-treatment of hard and super-hard substrate materials and the amount and type of damage that they can create. For the cutting tools used in turning, patches of an adhered layer, mainly consisting of FexOy, were found on both the crater and the flank. For the cutting tools used in milling, a tribofilm consisting of SixOy covered the crater. A combination of tribochemical reactions, adhesive wear and mild abrasive wear is believed to control the flank and crater wear of the PCBN cutting tools. On a microscopic scale, the different phases of the PCBN cutting tool used in turning showed different wear characteristics. The machined surface of the work material was smooth, with a Ra value in the range of 100-200 nm for the turned surface and 100-150 nm for the milled surface. With increasing crater and flank wear in combination with edge chipping, the machined surface became rougher and showed a higher Ra value. For the cutting tools used in milling, the tendency to micro edge chipping was significantly higher when milling tool steels with a higher hard-phase content and a lower heat conductivity, resulting in higher mechanical and thermal stresses at the cutting edge.
Abstract:
Background: Tens of millions of patients worldwide suffer from avoidable disabling injuries and death every year. Measuring the safety climate in health care is an important step in improving patient safety. The most commonly used instrument to measure safety climate is the Safety Attitudes Questionnaire (SAQ). The aim of the present study was to establish the validity and reliability of the translated version of the SAQ. Methods: The SAQ was translated and adapted to the Swedish context. The survey was then carried out with 374 respondents in the operating room (OR) setting. Data were received from three hospitals, a total of 237 responses. Cronbach's alpha and confirmatory factor analysis (CFA) were used to evaluate the reliability and validity of the instrument. Results: The Cronbach's alpha values for each of the factors of the SAQ ranged between 0.59 and 0.83. The CFA and its goodness-of-fit indices (SRMR 0.055, RMSEA 0.043, CFI 0.98) showed good model fit. Intercorrelations between the factors safety climate, teamwork climate, job satisfaction, perceptions of management, and working conditions showed moderate to high correlation with each other. The factor stress recognition had no significant correlation with teamwork climate, perception of management, or job satisfaction. Conclusions: Psychometric testing showed that the Swedish translation of the SAQ (OR version) has good construct validity. However, the reliability analysis suggested that some of the items need further refinement to establish sound internal consistency. As suggested by previous research, the SAQ is potentially a useful tool for evaluating safety climate. However, further psychometric testing is required with larger samples to establish the psychometric properties of the instrument for use in Sweden.
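For reference, Cronbach's alpha as used in the reliability analysis can be computed as sketched below; the item scores are made up for illustration and are not the study's data.

```python
# Cronbach's alpha for one factor of a questionnaire: a minimal sketch with
# made-up item scores, not the study's data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Example: 6 respondents answering a 4-item factor on a 5-point scale.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 3, 4],
    [3, 2, 3, 3],
])
print(round(cronbach_alpha(scores), 2))
```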
Abstract:
A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations in Europe. The collector parameters used as input in the tool are compiled from tests according to EN12975, without any intermediate conversions. The main target group for this tool is test institutes and certification bodies that are expected to use it for converting collector model parameters (derived from performance tests) into a more user-friendly quantity: the annual energy output. The energy output presented in the tool is expressed as kWh per collector module. A simplified treatment of performance for PVT collectors has been added, based on the assumption that the thermal part of the PVT collector can be tested and modeled as a thermal collector when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations. © 2012 The Authors.
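As a rough sketch of the kind of calculation involved (not the tool's actual method), the snippet below sums hourly useful output from EN12975-style steady-state parameters eta0, a1 and a2; the parameter values, the constant mean operating temperature and the toy climate record are assumptions.

```python
# Minimal sketch of an annual energy-output calculation from EN12975
# steady-state collector parameters (eta0, a1, a2). Parameter values, the
# constant operating temperature and the toy climate record below are all
# illustrative assumptions, not the actual tool's method or data.
def annual_output_kwh(eta0, a1, a2, area_m2, hours, t_mean=50.0):
    """hours: iterable of (irradiance W/m2, ambient temperature degC)."""
    total_wh = 0.0
    for g, t_amb in hours:
        dt = t_mean - t_amb
        q = eta0 * g - a1 * dt - a2 * dt * dt     # useful power per m2 [W/m2]
        if q > 0:                                 # collector only runs with net gain
            total_wh += q * area_m2               # one-hour time step
    return total_wh / 1000.0

# Toy example: a few hours of data instead of a full representative year.
climate = [(800, 20), (600, 15), (200, 10), (0, 5)]
print(round(annual_output_kwh(eta0=0.78, a1=3.5, a2=0.015, area_m2=2.3,
                              hours=climate), 2), "kWh")
```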
Abstract:
An operational complexity model (OCM) is proposed to enable the complexity of both the cognitive and the computational components of a process to be determined. From the complexity of formation of a set of traces via a specified route, a measure of the probability of that route can be determined. By determining the complexities of alternative routes leading to the formation of the same set of traces, the odds ratio indicating the relative plausibility of the alternative routes can be found. An illustrative application to a BitTorrent piracy case is presented, and the results obtained suggest that the OCM is capable of providing a realistic estimate of the odds ratio for two competing hypotheses. It is also demonstrated that the OCM can be straightforwardly refined to encompass a variety of circumstances.
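The abstract does not give the OCM's complexity-to-probability mapping; purely as an illustration of how two route complexities could yield an odds ratio, one might take route probability to fall off exponentially with complexity, as below.

```latex
% Illustrative mapping only; the OCM's actual complexity-to-probability
% relation is not reproduced here. If the probability of a route is taken
% to fall off exponentially with its operational complexity C, then for two
% competing routes (hypotheses) H_1 and H_2 with complexities C_1 and C_2:
\[
  \Pr(\text{route}) \propto 2^{-C}
  \quad\Longrightarrow\quad
  \text{odds ratio}\;\; \frac{\Pr(H_1)}{\Pr(H_2)} = 2^{\,C_2 - C_1}
\]
```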
Abstract:
This article describes the analysis of Interlibrary Loan data to help inform collection management decisions, and offers guidance for formulating policies that distinguish borrowed titles indicative of gaps in the library's collection from those reflecting special-interest pursuits beyond the scope of the university curriculum.