945 results for digital forensic tool testing


Relevance: 30.00%

Abstract:

Nearly every industry, including hospitality, has adopted database marketing techniques. Why have they become so popular and what advantages do they offer for hospitality companies? The authors examine these issues.

Relevance: 30.00%

Abstract:

Detection canines represent the fastest and most versatile means of illicit material detection. This research endeavor, in its most simplistic form, is the improvement of detection canines through training, training aids, and calibration. This study focuses on developing a universal calibration compound with which all detection canines, regardless of detection substance, can be tested daily to ensure that they are working within acceptable parameters. Surrogate continuation aids (SCAs) were developed for peroxide-based explosives, along with the validation of the SCAs already developed within the International Forensic Research Institute (IFRI) prototype surrogate explosives kit. Storage parameters of the SCAs were evaluated to give the detection canine community recommendations on the best possible training aid storage solution that minimizes the likelihood of contamination. Two commonly used and accepted detection canine imprinting methods were also evaluated for the speed with which the canine is trained and for their reliability. As a result of this study, SCAs have been developed for explosive detection canine use covering peroxide-based explosives, TNT-based explosives, nitroglycerin-based explosives, tagged explosives, plasticized explosives, and smokeless powders. Through the use of these surrogate continuation aids, a more uniform and reliable system of training can be implemented in the field than is currently in use. By examining the storage parameters of the SCAs, an ideal storage system has been developed using three levels of containment to reduce possible contamination. The developed calibration compound will ease the growing concerns over the legality and reliability of detection canine use by documenting the daily working parameters of the canine, allowing the Daubert rules of evidence admissibility to be applied. Through canine field testing, it has been shown that the IFRI SCAs outperform other commercially available training aids on the market. Additionally, of the imprinting methods tested, no difference was found in the speed with which the canines are trained or in their reliability to detect illicit materials. Therefore, if the recommendations from this study are followed, the detection canine community will greatly benefit from the use of scientifically validated training techniques and training aids.

Relevance: 30.00%

Abstract:

The first part of the study examined the effect of industry risk changes on perceived audit risk at the financial statement level and whether these changes depended on individual differences such as experience and tolerance for ambiguity.

Forty-eight auditors from two offices of one of the “Big 5” CPA firms participated in this study. The ANOVA results supported the effect of industry risk in the assessment of audit risk at the financial statement level. Higher industry risk was associated with higher perceived audit risk. Tolerance for ambiguity was also significant in explaining the changes in the assessment of audit risk. Auditors with a high tolerance for ambiguity perceived lower audit risk than auditors with a low tolerance for ambiguity. Although the ANOVA results did not find experience to be significant, a t-test for experience showed it to be marginally significant and inversely related to audit risk.

The second part of this study examined whether differences in perceived audit risk at the financial statement level altered the extent, nature, or timing of the planned auditing procedures. The results of the MANOVA suggested an overall audit risk effect at the financial statement level. Perceived audit risk was significant in explaining the variation in the number of hours planned for the total cycle and the number of hours planned for the tests of balances and details. Perceived audit risk was not significant in determining the analytical review procedures planned, but assessed inherent risk at the cycle level was significant: the higher the inherent risk, the more analytical procedures were planned. Perceived audit risk was not significant in explaining the timing of the procedures, but individual differences were significant. The results showed that experienced auditors and those with a high tolerance for ambiguity were less likely to postpone the performance of the interim procedures or the time at which the majority of audit work would be done.
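
The abstract names the analysis but not its implementation. As a rough, hypothetical illustration of the kind of two-way design it describes (industry risk and tolerance for ambiguity as factors, perceived audit risk as the response), the following Python sketch runs an ANOVA with statsmodels on invented ratings; none of the data, factor codings, or effect sizes come from the study.

```python
# Hypothetical sketch: two-way ANOVA of perceived audit risk against industry risk
# and tolerance for ambiguity. All values below are invented for illustration.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "industry_risk": ["high"] * 4 + ["low"] * 4,
    "tolerance":     ["high", "high", "low", "low"] * 2,
    "audit_risk":    [7.1, 6.8, 8.0, 7.9, 5.2, 5.5, 6.1, 6.4],  # perceived risk ratings
})

model = ols("audit_risk ~ C(industry_risk) * C(tolerance)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```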

Relevance: 30.00%

Abstract:

The applications of micro-end-milling operations have increased recently. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural network-based data mapping and forecasting processes, and genetic algorithm-based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from the experimental cutting force data, and genetic algorithms with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles.

The performance of MOGART has been tested on data from over 800 experimental cases, and very good agreement has been observed between the theoretical and experimental results. The MOGART package has been applied to a micro-end-milling operation study at the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.
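
The abstract does not give implementation details. As a hypothetical illustration of the kind of data mapping MOGART's neural-network stage performs, the following Python sketch fits a small regressor that maps cutting-force features to a tool-wear estimate; the feature names, data shapes, and model settings are assumptions, not details from the dissertation.

```python
# Hypothetical sketch: map cutting-force features to tool wear with a small neural
# network, in the spirit of MOGART's data-mapping stage (details assumed).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Assumed features per cut: mean force, peak force, force variance, spindle speed, feed rate.
X = rng.uniform(size=(800, 5))
# Synthetic tool-wear target; a real study would use measured flank wear.
y = 0.4 * X[:, 1] + 0.3 * X[:, 2] + 0.1 * rng.normal(size=800)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out cuts:", model.score(X_test, y_test))
```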

Relevance: 30.00%

Abstract:

This study evaluated the early development and pilot-testing of Project IMPACT, a case management intervention for victims of stalking. The Design and Development framework (Rothman & Thomas, 1994) was used as a guide for program development and evaluation. Nine research questions examined the processes and outcomes associated with program implementation.

The sample included all 36 clients who participated in Project IMPACT between February of 2000 and June of 2001, as well as the victim advocates who provided them with services. Quantitative and qualitative data were drawn from client case files, participant observation field notes, and interview transcriptions. Quantitative data were entered into three databases where: (1) clients were the units of analysis (n = 36), (2) services were the units of analysis (n = 1146), and (3) goals were the units of analysis (n = 149). These data were analyzed using descriptive statistics, Pearson's chi-square, Spearman's rho, phi, Cramer's V, Wilcoxon's matched-pairs signed-rank test, and McNemar's test statistic. Qualitative data were reduced via open, axial, and selective coding methods. Grounded theory and case study frameworks were utilized to analyze these data.

Results showed that most clients noted an improved sense of well-being and safety, although residual symptoms of trauma remained for numerous individuals. Stalkers appeared to respond to criminal and civil justice-based interventions by reducing violent and threatening behaviors; however, covert behaviors continued. The study produced findings that provided preliminary support for the use of several intervention components, including support services, psycho-education, safety planning, and boundary spanning. The psycho-education and safety planning in particular seemed to help clients cognitively reframe their perceptions of the stalking experience and gain a sense of increased safety and well-being. A 65% level of satisfactory goal achievement was observed overall, although goals involving justice-based organizations were associated with lower achievement. High service usage was associated with low-income clients and those lacking social support. Numerous inconsistencies in program implementation were found to be associated with the skills and experiences of victim advocates. Thus, recommendations were made to further refine, develop, and evaluate the intervention.

Relevance: 30.00%

Abstract:

The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately.

Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment.

Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation. A second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant.

Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years.

This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
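
The study itself used Shapiro-Meeker graphical analysis and Cox's general log-linear acceleration model. As a much simpler, hypothetical illustration of the reported quantity only, the following Python sketch fits a Weibull distribution to invented failure times (taken here as already extrapolated to normal-use conditions) and reads off an "80% reliability" life, i.e., the time by which 20% of specimens have failed.

```python
# Hypothetical sketch: estimate an 80%-reliability life from time-to-failure data.
# Failure times below are invented; this is not the study's actual analysis.
import numpy as np
from scipy import stats

failure_years = np.array([18.2, 19.5, 20.1, 21.4, 22.0, 23.3, 24.8, 26.5])

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(failure_years, floc=0)

# Time by which 20% of specimens have failed = 80%-reliability life.
life_80 = stats.weibull_min.ppf(0.2, shape, loc=loc, scale=scale)
print(f"estimated 80% reliability life: {life_80:.1f} years")
```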

Relevance: 30.00%

Abstract:

The rewards and sanctions associated with high-stakes testing may induce educators to participate in practices that will ensure the elimination of the scores of low-achieving students from the testing pool. Two ways in which scores may be eliminated are retention and referral to special education.

This study examined the use of these practices at 179 elementary schools in Miami-Dade County Public Schools, the fourth largest school district in the country. Between- and within-subjects designs were analyzed using repeated measures analysis of variance to compare retention and referral to special education practices over a five-year period, two years prior to and two years after the implementation of Florida's high-stakes test, the Florida Comprehensive Assessment Test (FCAT).

Significant main effects for referral and retention over time were demonstrated. The use of retention steadily increased over the first three years, with its usage maintained during the fourth year. While the use of referral actually decreased from the first to the second year, a significant change occurred after the implementation of the FCAT.

Examination of the use of these practices according to student and school characteristics revealed significant differences. Increases in the use of referral across time were significant for Black, non-Hispanic and Hispanic students, all limited English proficiency population categories, medium and low socioeconomic status category schools, all grade levels, and for schools with accountability grades of A, C, D, and F, with the most striking absolute increase occurring for F schools. Increases in the use of retention across time were significant for all ethnic groups, limited English proficiency categories, and socioeconomic status categories, for grades kindergarten through four, and by gender. Significant increases occurred for schools with accountability performance grades of C, D, and F; however, the most dramatic increase occurred for the F schools. A direct relationship between a school's performance category grade and its use of retention was demonstrated. The results suggest that schools changed their use of referral and retention in response to the implementation of the FCAT.

Relevance: 30.00%

Abstract:

As researchers and practitioners move toward a vision of software systems that configure, optimize, protect, and heal themselves, they must also consider the implications of such self-management activities for software reliability. Autonomic computing (AC) describes a new generation of software systems that are characterized by dynamically adaptive self-management features. During dynamic adaptation, autonomic systems modify their own structure and/or behavior in response to environmental changes. Adaptation can result in new system configurations and capabilities, which need to be validated at runtime to prevent costly system failures. However, although the pioneers of AC recognize that validating autonomic systems is critical to the success of the paradigm, the architectural blueprint for AC does not provide a workflow or supporting design models for runtime testing.

This dissertation presents a novel approach for seamlessly integrating runtime testing into autonomic software. The approach introduces an implicit self-test feature into autonomic software by tailoring the existing self-management infrastructure to runtime testing. Autonomic self-testing facilitates activities such as test execution, code coverage analysis, timed test performance, and post-test evaluation. In addition, the approach is supported by automated testing tools and a detailed design methodology. A case study that incorporates self-testing into three autonomic applications is also presented. The findings of the study reveal that autonomic self-testing provides a flexible approach for building safe, reliable autonomic software while limiting the development and performance overhead through software reuse.
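
The dissertation's design models are not reproduced in the abstract. As a minimal, hypothetical Python sketch of the general idea of runtime self-testing in an autonomic loop, the following code applies an adaptation, runs a self-test against the new configuration, and rolls back on failure; all class and method names are invented for illustration and are not the author's actual framework.

```python
# Hypothetical sketch: an autonomic manager that validates each adaptation with a
# runtime self-test before accepting it. Names and structure are assumptions.

class ManagedSystem:
    """Minimal stub of a self-managing system."""
    def __init__(self):
        self.config = {"threads": 2}

    def reconfigure(self, config):
        self.config = dict(config)

    def runtime_test_cases(self):
        # Toy invariant checks; real self-tests would exercise the adapted behavior.
        return [lambda: self.config.get("threads", 0) > 0]


class AutonomicManager:
    def __init__(self, system):
        self.system = system

    def adapt(self, new_config):
        """Apply an adaptation, run the implicit self-test, and roll back on failure."""
        previous = dict(self.system.config)
        self.system.reconfigure(new_config)
        if all(test() for test in self.system.runtime_test_cases()):
            return True
        self.system.reconfigure(previous)  # reject the unvalidated configuration
        return False


manager = AutonomicManager(ManagedSystem())
print(manager.adapt({"threads": 4}))  # True: adaptation passes the self-test
print(manager.adapt({"threads": 0}))  # False: self-test fails, configuration rolled back
```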

Relevance: 30.00%

Abstract:

Low-rise buildings are often subjected to high wind loads during hurricanes that lead to severe damage and cause water intrusion. It is therefore important to estimate accurate wind pressures for design purposes to reduce losses. Wind loads on low-rise buildings can differ significantly depending upon the laboratory in which they were measured. The differences are due in large part to inadequate simulations of the low-frequency content of atmospheric velocity fluctuations in the laboratory and to the small scale of the models used for the measurements. A new partial turbulence simulation methodology was developed for simulating the effect of low-frequency flow fluctuations on low-rise buildings more effectively, from the point of view of testing accuracy and repeatability, than is currently the case. The methodology was validated by comparing aerodynamic pressure data for building models obtained in the open-jet 12-Fan Wall of Wind (WOW) facility against their counterparts in a boundary-layer wind tunnel. Field measurements of pressures on the Texas Tech University building and the Silsoe building were also used for validation purposes. Tests in partial simulation are freed of integral length scale constraints, meaning that model length scales in such testing are limited only by blockage considerations. Thus the partial simulation methodology can be used to produce aerodynamic data for low-rise buildings by using large-scale models in wind tunnels and WOW-like facilities. This is a major advantage, because large-scale models allow for accurate modeling of architectural details, testing at higher Reynolds numbers, greater spatial resolution of the pressure taps in high-pressure zones, and assessment of the performance of aerodynamic devices to reduce wind effects. The technique eliminates a major cause of discrepancies among measurements conducted in different laboratories and can help to standardize flow simulations for testing residential homes as well as significantly improve testing accuracy and repeatability. Partial turbulence simulation was used in the WOW to determine the performance of discontinuous perforated parapets in mitigating roof pressures. Comparisons of pressures with and without parapets showed significant reductions in pressure coefficients in the zones with high suctions. This demonstrated the potential of such aerodynamic add-on devices to reduce uplift forces.
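
The comparisons above are expressed in terms of pressure coefficients. As a small illustration of that standard non-dimensionalization (Cp = (p - p_static) / (0.5 ρ U²)), the following Python sketch converts an invented pressure-tap record to Cp values; the numbers are made up, and only the definition of Cp is standard.

```python
# Hypothetical sketch: converting pressure-tap measurements to pressure coefficients,
# the non-dimensional form used when comparing wind tunnel, WOW, and field data.
import numpy as np

rho = 1.225                       # air density, kg/m^3
U_ref = 20.0                      # mean reference wind speed at roof height, m/s
q_ref = 0.5 * rho * U_ref**2      # reference dynamic pressure, Pa

# Invented time series of net pressure at one roof tap, relative to static pressure (Pa).
p_tap = np.array([-350.0, -420.0, -510.0, -300.0, -610.0])

cp = p_tap / q_ref
print("mean Cp:", cp.mean())
print("peak (most negative) Cp:", cp.min())
```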

Relevance: 30.00%

Abstract:

Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and to choose the most likely model without overfitting, thus obtaining higher model confidence. Higher quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as bush or natural forest fires, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the F-test is performed at each node's split to select the best attribute. The ensemble stream model approach proved to improve when complicated features were used with a simpler tree classifier. The ensemble framework for data cleaning and the enhancements for quantifying the quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensors led to the formation of quality streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
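
The F-measure mentioned above is a standard metric. As a small, hypothetical illustration of how fire-event detections might be scored with precision, recall, and F-measure, the following Python sketch uses scikit-learn with invented labels; it does not reproduce the study's data or features.

```python
# Hypothetical sketch: scoring fire-event detections with precision, recall, and F-measure.
# The labels below are invented for illustration only.
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]   # 1 = fire event in the log, 0 = no event
y_pred = [1, 0, 1, 1, 0, 0, 0, 0, 1, 0]   # detector output for the same windows

print("precision:", precision_score(y_true, y_pred))  # fraction of alarms that were real
print("recall:   ", recall_score(y_true, y_pred))     # fraction of real events detected
print("F-measure:", f1_score(y_true, y_pred))         # harmonic mean of precision and recall
```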

Relevance: 30.00%

Abstract:

The elemental analysis of soil is useful in forensic and environmental sciences. Methods were developed and optimized for two laser-based multi-element analysis techniques: laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and laser-induced breakdown spectroscopy (LIBS). This work represents the first use of a 266 nm laser for forensic soil analysis by LIBS. Sample preparation methods were developed and optimized for a variety of sample types, including pellets for large bulk soil specimens (470 mg) and sediment-laden filters (47 mg), and tape-mounting for small transfer evidence specimens (10 mg). Analytical performance for sediment filter pellets and tape-mounted soils was similar to that achieved with bulk pellets. An inter-laboratory comparison exercise was designed to evaluate the performance of the LA-ICP-MS and LIBS methods, as well as of micro X-ray fluorescence (μXRF), across multiple laboratories. Limits of detection (LODs) were 0.01-23 ppm for LA-ICP-MS, 0.25-574 ppm for LIBS, and 16-4400 ppm for μXRF, all well below the levels normally seen in soils. Good intra-laboratory precision (≤ 6 % relative standard deviation (RSD) for LA-ICP-MS; ≤ 8 % for μXRF; ≤ 17 % for LIBS) and inter-laboratory precision (≤ 19 % for LA-ICP-MS; ≤ 25 % for μXRF) were achieved for most elements, which is encouraging for a first inter-laboratory exercise. While LIBS generally has higher LODs and RSDs than LA-ICP-MS, both were capable of generating good-quality multi-element data sufficient for discrimination purposes. Multivariate methods using principal components analysis (PCA) and linear discriminant analysis (LDA) were developed for discrimination of soils from different sources. Specimens from different sites that were indistinguishable by color alone were discriminated by elemental analysis. Correct classification rates of 94.5 % or better were achieved in a simulated forensic discrimination of three similar sites for both LIBS and LA-ICP-MS. Results for tape-mounted specimens were nearly identical to those achieved with pellets. Methods were tested on soils from the USA, Canada, and Tanzania. Within-site heterogeneity was site-specific. Elemental differences were greatest for specimens separated by large distances, even within the same lithology. Elemental profiles can be used to discriminate soils from different locations and to narrow down locations even when mineralogy is similar.
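
The abstract names PCA and LDA but not their implementation. As a hypothetical Python sketch of that chemometric pattern (PCA for dimension reduction followed by LDA for source classification), the following code uses scikit-learn on simulated multi-element profiles; the sites, element counts, and concentrations are invented, not the study's data.

```python
# Hypothetical sketch: PCA followed by LDA to discriminate soil sources from
# multi-element data. All data below are simulated for illustration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_per_site, n_elements = 30, 10   # e.g., 10 elements measured per specimen

# Three simulated sites with slightly shifted elemental profiles.
X = np.vstack([rng.normal(loc=shift, size=(n_per_site, n_elements))
               for shift in (0.0, 0.5, 1.0)])
y = np.repeat(["site A", "site B", "site C"], n_per_site)

clf = make_pipeline(StandardScaler(), PCA(n_components=5),
                    LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated classification rate:", scores.mean())
```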

Relevance: 30.00%

Abstract:

New designer drugs are constantly emerging onto the illicit drug market, and it is often difficult to validate and maintain comprehensive analytical methods for accurate detection of these compounds. Generally, toxicology laboratories utilize a screening method, such as immunoassay, for the presumptive identification of drugs of abuse. When a positive result occurs, confirmatory methods, such as gas chromatography (GC) or liquid chromatography (LC) coupled with mass spectrometry (MS), are required for more sensitive and specific analyses. In recent years, the need to study the activities of these compounds in screening assays as well as to develop confirmatory techniques to detect them in biological specimens has been recognized. Severe intoxications and fatalities have been encountered with emerging designer drugs, presenting analytical challenges for detection and identification of such novel compounds. The first major task of this research was to evaluate the performance of commercially available immunoassays to determine if designer drugs were cross-reactive. The second major task was to develop and validate a confirmatory method, using LC-MS, to identify and quantify these designer drugs in biological specimens.

Cross-reactivity towards the cathinone derivatives was found to be minimal. Several other phenethylamines demonstrated cross-reactivity at low concentrations, but results were consistent with those published by the assay manufacturer or as reported in the literature. Current immunoassay-based screening methods may not be ideal for presumptively identifying most designer drugs, including the "bath salts." For this reason, an LC-MS based confirmatory method was developed for 32 compounds, including eight cathinone derivatives, with limits of quantification in the range of 1-10 ng/mL. The method was fully validated for selectivity, matrix effects, stability, recovery, precision, and accuracy. In order to compare the screening and confirmatory techniques, several human specimens were analyzed to demonstrate the importance of using a specific analytical method, such as LC-MS, to detect designer drugs in serum, as immunoassays lack cross-reactivity with the novel compounds. Overall, minimal cross-reactivity was observed, highlighting the conclusion that these presumptive screens cannot detect many of the designer drugs and that a confirmatory technique, such as LC-MS, is required for comprehensive forensic toxicological analysis of designer drugs.
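
The abstract reports limits of quantification without showing how they were derived. As one common, hypothetical illustration (the ICH-style 3.3σ/slope and 10σ/slope convention applied to a calibration curve), the following Python sketch estimates LOD and LOQ from invented calibration data; it is not necessarily the approach used in the dissertation.

```python
# Hypothetical sketch: estimating LOD/LOQ from a calibration curve using the
# common 3.3*sigma/slope and 10*sigma/slope convention. Data are invented.
import numpy as np

conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0])   # ng/mL calibration standards
resp = np.array([0.9, 2.6, 5.3, 9.8, 24.5, 50.7])    # instrument response (e.g., peak area ratio)

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                         # standard error of the regression

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"LOD ≈ {lod:.2f} ng/mL, LOQ ≈ {loq:.2f} ng/mL")
```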

Relevance: 30.00%

Abstract:

I would like to thank Dr. Philip Stoddard for his patience and guidance throughout the past four years. He has not only taught me about behavior and electricity, but he has also taught me how to think scientifically. I thank Vielka Salazar for making herself available to answer my questions and to help me with my projects; Montserrat Alfaro for providing me with support during times of frustration; and Fabian A. Pal, who often made himself available when I needed help to finish my projects, for being supportive and for believing in me and my abilities. Most importantly, I would like to thank my parents, who have shown tremendous support and patience during the past years. I would also like to thank the Honors Committee, especially Dr. Richards, for taking the time to review my thesis and helping me modify it. Finally, I would like to thank the MARC program for providing me with financial assistance and the opportunity to perform this project.

Relevance: 30.00%

Abstract:

Capillary electrophoresis (CE) is a modern analytical technique in which electrokinetic separation, generated by high voltage, takes place inside small capillaries. In this dissertation, several advanced capillary electrophoresis methods are presented using different modes of CE, with UV and mass spectrometry utilized as the detection methods.

Capillary electrochromatography (CEC), one of the CE modes, is a recently developed technique that is a hybrid of capillary electrophoresis and high-performance liquid chromatography (HPLC). Capillary electrochromatography exhibits advantages of both techniques. In Chapter 2, a monolithic capillary column is fabricated using an in situ photoinitiated polymerization method. The column was then applied to the separation of six antidepressant compounds.

Meanwhile, a simple chiral separation method is developed and presented in Chapter 3. Beta-cyclodextrin was utilized to achieve chiral separation. Not only were twelve cathinone analytes separated, but isomers of several analytes were also enantiomerically separated. To better understand the molecular information on the analytes, a TOF-MS system was coupled with the CE. A sheath liquid and a partial filling technique (PFT) were employed to reduce contamination of the MS ionization source. Accurate molecular information was obtained.

It is necessary to propose, develop, and optimize new techniques that are suitable for trace-level analysis of samples in forensic, pharmaceutical, and environmental applications. Capillary electrophoresis (CE) was selected for this task, as it requires smaller amounts of sample, simplifies sample preparation, and has the flexibility to perform separations of neutral and charged molecules as well as enantiomers.

Overall, the study demonstrates the versatility of capillary electrophoresis methods in forensic, pharmaceutical, and environmental applications.

Relevance: 30.00%

Abstract:

A new look for the Florida International University Libraries' Web site was launched in Fall 2001. As a result, a group was formed to undertake a usability study of the site's top page. The group tested three target groups to determine the usability of the top page. The study pointed out some revisions for the top page; more importantly, however, it suggested areas for future research.