945 results for digital forensic tool testing
Abstract:
A number of patterning methods, including conventional photolithography and e-beam lithography, have been employed to pattern devices with critical dimensions at submicrometer levels. Device fabrication by lithography and multilevel processing is usually specific to the chemical and physical properties of the etchants and materials used and requires a number of processing steps. As an alternative, focused ion beam (FIB) lithography is a unique and straightforward tool for rapidly developing prototype nanomagnetic devices. This feature of FIB is critical for conducting the basic studies necessary to advance the state of the art in magnetic recording. The dissertation develops a specific design of nanodevices and demonstrates FIB-fabricated stable and reproducible magnetic nanostructures with a critical dimension of about 10 nm. The project included the fabrication of patterned single- and multilayer magnetic media with areal densities beyond 10 Terabit/in². Each block had perpendicular or longitudinal magnetic anisotropy and a single-domain structure. The purpose was to demonstrate how the ability of FIB to directly etch nanoscale patterns allows exploration (even in an academic environment) of the true physics of various types of nanostructures. Another goal of this study was the investigation of FIB-patterned magnetic media with a set of characterization tools, e.g., a Guzik V2002 spinstand, magnetic force microscopy, and scanning electron microscopy with energy dispersive and wavelength dispersive systems. In the course of this work, a unique prototype of a record-high-density patterned magnetic media device capable of 10 Terabit/in² was built. The read/write testing was performed on a Guzik spinstand. The readback signals were recorded and analyzed with a digital oscilloscope. A number of different configurations for writing and reading information from a magnetic medium were explored. The prototype transducers for this work were fabricated via FIB trimming of different magnetic recording heads.
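As a quick illustrative check (my own back-of-the-envelope calculation, not part of the dissertation), the bit-cell geometry implied by a 10 Terabit/in² areal density is consistent with the roughly 10 nm critical dimension quoted above, assuming square bit cells:

```python
# Rough bit-cell size implied by a 10 Tb/in^2 areal density.
# Assumes square bit cells; only the density comes from the abstract.
NM_PER_INCH = 25.4e6            # 1 inch = 25.4 mm = 25.4e6 nm
areal_density = 10e12           # bits per square inch

cell_area_nm2 = NM_PER_INCH**2 / areal_density   # nm^2 per bit
cell_side_nm = cell_area_nm2 ** 0.5              # side of a square bit cell

print(f"Bit cell area: {cell_area_nm2:.1f} nm^2")   # ~64.5 nm^2
print(f"Bit cell side: {cell_side_nm:.1f} nm")      # ~8 nm, consistent with ~10 nm features
```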
Abstract:
The primary purpose of this study was to examine the influence of literacy variables on high-stakes test performance, including: (a) student achievement on the Metropolitan Achievement Test, Seventh Edition (MAT-7) as correlated with high-stakes tests such as the FCAT examination, and (b) the English language proficiency attained by English Language Learner (ELL) students when participating in, or exiting from, the English for Speakers of Other Languages (ESOL) program as determined by the Limited English Proficient (LEP) committee. Two one-sample chi-square tests were conducted to investigate the relationship between passing the MAT-7 Reading and Language examinations and the FCAT-SSS Reading Comprehension and FCAT-NRT examinations. In addition, 2x2 analyses of variance (ANOVAs) were conducted to address the relationship between the time ELL students spent in the ESOL program and the level of achievement on the MAT-7 Reading and Language examinations and the FCAT-SSS Reading Comprehension and FCAT-NRT. Findings of this study indicated that more ELL students exit the program based on LEP committee decisions than by passing the MAT-7. The majority of ELL students failed the 10th-grade FCAT, the passing of which is needed for graduation. A significant number of ELL students failed even when passing the MAT-7 or being duly exited through the decision of the LEP committee. The data also indicated that ELL students who exited the ESOL program in six semesters or fewer had higher FCAT scores than those who exited the program in seven semesters or more. The MAT-7 and the decision of the LEP committee were shown to be ineffective as predictors of success on the FCAT. Further research to determine the length of time a student in the ESOL program uses English to read, write, and speak should be conducted. Additionally, the development of a new assessment instrument to better predict student success should be considered. However, it should be noted that the results of this study are limited to the context in which it was conducted and do not warrant generalizations beyond that context.
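For readers unfamiliar with the first test family named above, a minimal sketch of a one-sample chi-square goodness-of-fit test follows; the observed counts and the expected split are invented placeholders, not study data:

```python
# Illustrative one-sample chi-square goodness-of-fit test of the kind
# named in the abstract. Counts below are invented placeholders.
from scipy.stats import chisquare

observed = [62, 138]          # e.g., ELL students passing vs. not passing the FCAT
expected = [100, 100]         # null hypothesis of an even pass/fail split
chi2, p = chisquare(observed, f_exp=expected)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```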
Abstract:
This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to the user, will counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means of eliminating some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method for providing this compensation. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems, and the data collected from the experiments were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected in the human-subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
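The general idea behind such pre-compensation can be sketched as regularized inverse filtering of the on-screen image with the eye's point-spread function. The sketch below is my own simplified stand-in (a Gaussian PSF in place of a wavefront-derived one and a Wiener-style regularized inverse), not the dissertation's actual pipeline:

```python
# Minimal sketch of pre-compensation by regularized inverse filtering.
# The Gaussian PSF stands in for a measured, wavefront-derived PSF.
import numpy as np

def gaussian_psf(shape, sigma):
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2, (shape[1] - 1) / 2
    psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def precompensate(image, psf, k=1e-2):
    """Wiener-style inverse filter: boost the frequencies the eye will blur."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(image)
    inverse = np.conj(H) / (np.abs(H) ** 2 + k)   # regularized 1/H
    out = np.real(np.fft.ifft2(G * inverse))
    return np.clip(out, 0.0, 1.0)                 # keep a displayable range

image = np.random.rand(128, 128)          # placeholder for the screen content
psf = gaussian_psf(image.shape, sigma=3)  # placeholder aberration
display_image = precompensate(image, psf)
```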
Abstract:
This study identifies and describes HIV Voluntary Counseling and Testing (VCT) among middle-aged and older Latinas. The rate of new cases of HIV in people age 45 and older is rapidly increasing, with a 40.6% increase in the number of older Latinas infected with HIV between 1998 and 2002. Despite this increase, there is a paucity of research on this population. This research seeks to address that gap through a secondary data analysis of Latina women. The aim of this study is twofold: (1) to develop and empirically test a multivariate model of VCT utilization for middle-aged and older Latinas; and (2) to test how the three individual components of the Andersen Behavioral Model impact VCT for middle-aged and older Latinas. The study is organized around the three major domains of the Andersen Behavioral Model of service use: (a) predisposing factors, (b) enabling characteristics, and (c) need. Logistic regression using structural equation modeling techniques was used to test multivariate relationships of variables on VCT for a sample of 135 middle-aged and older Latinas residing in Miami-Dade County, Florida. Over 60% of participants had been tested for HIV. Provider endorsement was found to be the strongest predictor of VCT (odds ratio [OR] = 6.38), followed by having a clinic as a regular source of healthcare (OR = 3.88). Significant negative associations with VCT included self-rated health status (OR = .592), age (OR = .927), Spanish proficiency (OR = .927), number of sexual partners (OR = .613), and consumption of alcohol during sexual activity (OR = .549). As this line of inquiry provides a critical glimpse into the VCT of older Latinas, recommendations for enhanced service provision and research will be offered.
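As a hedged illustration of the modeling step (logistic regression with exponentiated coefficients reported as odds ratios), the sketch below uses invented placeholder data and predictor names of my own, not the study sample:

```python
# Illustrative logistic regression for VCT (tested yes/no) with odds ratios.
# The data and predictor names are invented placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 135
df = pd.DataFrame({
    "tested": rng.integers(0, 2, n),              # 1 = had an HIV test
    "provider_endorsed": rng.integers(0, 2, n),   # provider recommended testing
    "clinic_usual_source": rng.integers(0, 2, n), # clinic as regular care source
    "age": rng.integers(45, 80, n),
})

model = smf.logit("tested ~ provider_endorsed + clinic_usual_source + age",
                  data=df).fit(disp=False)
odds_ratios = np.exp(model.params)                # exponentiated coefficients
print(odds_ratios)
```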
Abstract:
The purpose of this investigation was to develop and implement a general-purpose VLSI (Very Large Scale Integration) test module based on an FPGA (Field Programmable Gate Array) system to verify the mechanical behavior and performance of MEM sensors, with associated corrective capabilities, and to make use of the evolving System-C, a new open-source HDL (Hardware Description Language), for the design of the FPGA functional units. System-C is becoming widely accepted as a platform for modeling, simulating, and implementing systems consisting of both hardware and software components. In this investigation, a dual-axis accelerometer (ADXL202E) and a temperature sensor (TMP03) were used for the test module verification. Results of the test module measurements were analyzed for repeatability and reliability and then compared to the sensor datasheets. Ideas for further study were identified based on the results analysis. ASIC (Application Specific Integrated Circuit) design concepts were also pursued.
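The ADXL202E reports acceleration as a PWM duty cycle, so a test module must convert timer counts into engineering units. The sketch below is a hedged illustration of that conversion using the part's nominal datasheet scale factors (50% duty cycle at 0 g, 12.5% duty cycle per g); the timer counts are placeholders, not module measurements:

```python
# Duty-cycle decoding for the ADXL202E's PWM output, using nominal
# datasheet scale factors (50% duty cycle at 0 g, 12.5% per g).

def adxl202_accel_g(t1_counts: int, t2_counts: int) -> float:
    """Convert a measured duty cycle (T1/T2) to acceleration in g."""
    duty = t1_counts / t2_counts
    return (duty - 0.5) / 0.125

# Example: T1 = 5,500 timer counts high out of a T2 = 10,000-count period
print(adxl202_accel_g(5_500, 10_000))   # 0.4 g for a 55% duty cycle
```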
Abstract:
The purpose of this research was to demonstrate the applicability of reduced-size STR (Miniplex) primer sets to challenging samples and to provide the forensic community with new information regarding the analysis of degraded and inhibited DNA. The Miniplex primer sets were validated in accordance with guidelines set forth by the Scientific Working Group on DNA Analysis Methods (SWGDAM) in order to demonstrate the scientific validity of the kits. The Miniplex sets were also used in the analysis of DNA extracted from human skeletal remains and telogen hair. In addition, a method for evaluating the mechanism of PCR inhibition was developed using qPCR. The Miniplexes were demonstrated to be a robust and sensitive tool for the analysis of DNA with as little as 100 pg of template DNA. They also proved to be better than commercial kits in the analysis of DNA from human skeletal remains, with 64% of the samples tested producing full profiles, compared to 16% for a commercial kit. The Miniplexes also produced amplification of nuclear DNA from human telogen hairs, with partial profiles obtained from as little as 60 pg of template DNA. These data suggest that smaller PCR amplicons may provide a useful alternative to mitochondrial DNA for forensic analysis of degraded DNA from human skeletal remains, telogen hairs, and other challenging samples. In the qPCR evaluation of inhibition, the effects of amplicon length and primer melting temperature were examined in order to determine the binding mechanisms of different PCR inhibitors. Several mechanisms were indicated by the inhibitors tested, including binding of the polymerase, binding to the DNA, and effects on the processivity of the polymerase during primer extension. The qPCR data illustrated a method by which the type of inhibitor could be inferred in forensic samples, and some methods of reducing inhibition for specific inhibitors were demonstrated. An understanding of the mechanisms of the inhibitors found in forensic samples will allow analysts to select the proper methods for inhibition removal or the type of analysis that can be performed, and will increase the information that can be obtained from inhibited samples.
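One common way to read qPCR inhibition data is to translate the shift in cycle threshold (Ct) between an uninhibited control and an inhibited reaction into a fold reduction in detectable template. The sketch below illustrates that arithmetic under an assumed ~100% amplification efficiency and with placeholder Ct values; it is not tied to the dissertation's specific assays:

```python
# Fold reduction in apparent template implied by a Ct shift between an
# uninhibited control and an inhibited reaction. Assumes ~100% efficiency
# (amplification factor of 2 per cycle); Ct values are placeholders.

def fold_reduction(ct_inhibited: float, ct_control: float,
                   efficiency: float = 1.0) -> float:
    delta_ct = ct_inhibited - ct_control
    return (1.0 + efficiency) ** delta_ct

print(fold_reduction(ct_inhibited=28.4, ct_control=25.1))  # ~9.8x less signal
```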
Abstract:
The integration of automation (specifically Global Positioning Systems (GPS)) and Information and Communications Technology (ICT) through the creation of a Total Jobsite Management Tool (TJMT) in construction contractor companies can revolutionize the way contractors do business. The key to this integration is the collection and processing of real-time GPS data produced on the jobsite for use in project management applications. This research study established the need for an effective planning and implementation framework to assist construction contractor companies in navigating the terrain of GPS and ICT use. An Implementation Framework was developed using the Action Research approach. The framework consists of three components: (i) an ICT Infrastructure Model, (ii) an Organizational Restructuring Model, and (iii) a Cost/Benefit Analysis. The conceptual ICT infrastructure model was developed for the purpose of showing decision makers within highway construction companies how to collect, process, and use GPS data for project management applications. The organizational restructuring model was developed to assist companies in the analysis and redesign of business processes, data flows, core job responsibilities, and their organizational structure in order to obtain the maximum benefit at the least cost when implementing GPS as a TJMT. A cost-benefit analysis, which identifies and quantifies the costs and benefits (both direct and indirect), was performed in the study to clearly demonstrate the advantages of using GPS as a TJMT. Finally, the study revealed that in order to successfully implement a program to utilize GPS data as a TJMT, it is important for construction companies to understand the various implementation and transitioning issues that arise when adopting this new technology and business strategy. In the study, Factors for Success were identified and ranked to allow a construction company to understand the factors that may contribute to or detract from the prospects for success during implementation. The Implementation Framework developed as a result of this study will serve to guide highway construction companies in the successful integration of GPS and ICT technologies for use as a TJMT.
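As a purely illustrative companion to the cost/benefit component, a benefit-cost ratio and a simple payback period might be computed as sketched below; all figures are invented placeholders, not study results:

```python
# Toy benefit-cost ratio and simple payback for a GPS/ICT rollout.
# All dollar figures are invented placeholders.
annual_benefits = 180_000.0     # e.g., reduced rework, better equipment utilization
annual_costs = 45_000.0         # e.g., data plans, support staff, training
initial_investment = 250_000.0  # hardware, software, integration

bcr = annual_benefits / annual_costs
payback_years = initial_investment / (annual_benefits - annual_costs)
print(f"Benefit-cost ratio: {bcr:.2f}, simple payback: {payback_years:.1f} years")
```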
Abstract:
We develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses support the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition methodology differs from previous studies in that both the informational variance and the noise variance are time-varying. Furthermore, the covariance of the informational component and the noise component is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
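In my own shorthand notation (not the essay's), the decomposition described above and the resulting informativeness measure can be written as:

```latex
\sigma_t^2 \;=\; \sigma_{I,t}^2 \;+\; \sigma_{N,t}^2 \;+\; 2\,\sigma_{IN,t},
\qquad
\mathrm{Informativeness}_t \;=\; \frac{\sigma_{I,t}^2}{\sigma_t^2},
```

where the informational variance and the noise variance are both time-varying and their covariance is left unrestricted rather than set to zero.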
Abstract:
This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct ANN-based studies in a powerful and very user-friendly interface. A series of unique features has been implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the activation function’s slopes. It also includes a Support Vector Machines module for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG. This analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and non-epileptic subjects. The contribution of this method consisted of developing a descriptor matrix that can be used to represent any EEG file regardless of its duration and number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies, together with its duration above a specific threshold, performs better than the other frequency bands in seizure detection, exhibiting an accuracy of 95.90%, a sensitivity of 92.59%, and a specificity of 96.84%. The second study showed that Hjorth’s activity parameter is sufficient to accurately relate EEG to epileptic and non-epileptic subjects. After testing, the accuracy, sensitivity, and specificity of the classifier were all above 0.9667. Statistical tests established the superiority of activity with over 99.99% certainty. It was demonstrated that (1) the spectral power in the gamma frequencies is highly effective in locating seizures from EEG and (2) activity can be used to link EEG recordings to epileptic and non-epileptic subjects. These two studies involved a high computational load and were made tractable by NeuralStudio. From a medical perspective, both methods proved the merits of NeuralStudio in brain research applications. For its outstanding features, NeuralStudio has recently been awarded a patent (US Patent No. 7,502,763).
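For concreteness, the two EEG features singled out above can be computed as sketched below. The gamma band is taken here as 30-70 Hz and the sampling rate is assumed; the synthetic signal is a placeholder for a real EEG channel, not data from the studies:

```python
# Sketch of the two EEG features named in the abstract: gamma-band spectral
# power (Welch periodogram) and Hjorth's "activity" (signal variance).
import numpy as np
from scipy.signal import welch

fs = 256                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
eeg = np.random.default_rng(2).normal(0, 10e-6, t.size)  # placeholder channel

# Gamma-band (30-70 Hz here) power from the Welch periodogram.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
gamma = (freqs >= 30) & (freqs <= 70)
gamma_power = psd[gamma].sum() * (freqs[1] - freqs[0])   # integrate the PSD

# Hjorth activity: the variance of the signal.
activity = np.var(eeg)

print(f"Gamma power: {gamma_power:.3e}, Hjorth activity: {activity:.3e}")
```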
Abstract:
Damage during extreme wind events highlights the weaknesses of mechanical fasteners at the roof-to-wall connections in residential timber frame buildings. The allowable capacity of the metal fasteners is based on results of unidirectional component testing that do not simulate realistic tri-axial aerodynamic loading effects. The first objective of this research was to simulate hurricane effects and study hurricane-structure interaction at full scale, facilitating a better understanding of the combined impacts of wind, rain, and debris on inter-component connections at spatial and temporal scales. The second objective was to evaluate the performance of a non-intrusive roof-to-wall connection system using fiber reinforced polymer (FRP) materials and compare its load capacity to that of an existing metal fastener under simulated aerodynamic loads. The Wall of Wind (WoW) testing, performed on a one-story gable-roof timber structure with FRP connections and instrumented with a variety of sensors, was used to create a database of aerodynamic and aero-hydrodynamic loading on roof-to-wall connections tested under several parameters: angle of attack, wind-turbulence content, internal pressure condition, and the presence or absence of rain. Based on the aerodynamic loading results obtained from the WoW tests, sets of three force components (tri-axial mean loads) were combined into a series of resultant mean forces, which were used to test the FRP and metal connections in the structures laboratory up to failure. A new component testing system and test protocol were developed for testing fasteners under simulated tri-axial loading as opposed to uni-axial loading. The tri-axial and uni-axial test results were compared for hurricane clips. A comparison was also made between the tri-axial load capacities of the FRP and metal connections. The research findings demonstrate that the FRP connection is a viable option for use in timber roof-to-wall connection systems. The findings also confirm that current testing methods for mechanical fasteners tend to overestimate the actual load capacities of a connector. Additionally, the research contributes to the development of a new testing protocol for fasteners using tri-axial simultaneous loads based on the aerodynamic database obtained from the WoW testing.
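A hedged sketch of the reduction step described above, collapsing three mean force components into the resultant used for connection testing, is shown below; the component values are placeholders, not WoW measurements:

```python
# Resultant of tri-axial mean loads on a roof-to-wall connection:
# magnitude and direction cosines from the three mean force components.
import numpy as np

F = np.array([850.0, 420.0, 1980.0])   # mean Fx, Fy, Fz in newtons (uplift in z)
resultant = np.linalg.norm(F)          # magnitude of the combined load
direction_cosines = F / resultant      # orientation of the resultant

print(f"Resultant: {resultant:.0f} N, direction cosines: {direction_cosines.round(3)}")
```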
Abstract:
Stereotype threat (Steele & Aronson, 1995) refers to the risk of confirming a negative stereotype about one’s group in a particular performance domain. The theory assumes that performance in the stereotyped domain is most negatively affected when individuals are more highly identified with the domain in question. As federal law has increased the importance of standardized testing at the elementary level, it can be reasonably hypothesized that the standardized test performance of African American children will be depressed when they are aware of negative societal stereotypes about the academic competence of African Americans. This sequential mixed-methods study investigated whether the standardized testing experiences of African American children in an urban elementary school are related to their level of stereotype awareness. The quantitative phase utilized data from 198 African American children at an urban elementary school. Both ex-post facto and experimental designs were employed. Experimental conditions were diagnostic and non-diagnostic testing experiences. The qualitative phase utilized data from a series of six focus group interviews conducted with a purposefully selected group of 4 African American children. The interview data were supplemented with data from 30 hours of classroom observations. Quantitative findings indicated that the stereotype threat condition evoked by diagnostic testing depresses the reading test performance of stereotype-aware African American children (F[1, 194] = 2.21, p < .01). This was particularly true of students who are most highly domain-identified with reading (F[1, 91] = 19.18, p < .01). Moreover, findings indicated that only stereotype-aware African American children who were highly domain-identified were more likely to experience anxiety in the diagnostic condition (F[1, 91] = 5.97, p < .025). Qualitative findings revealed 4 themes regarding how African American children perceive and experience the factors related to stereotype threat: (1) a narrow perception of education as strictly test preparation, (2) feelings of stress and anxiety related to the state test, (3) concern with what “others” think (racial salience), and (4) stereotypes. A new conceptual model for stereotype threat is presented, and future directions including implications for practice and policy are discussed.
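For readers unfamiliar with the factorial design behind the reported F-statistics, a minimal sketch of a 2x2 ANOVA (testing condition by stereotype awareness on reading scores) follows; all data below are invented placeholders, not the study's measurements:

```python
# Illustrative 2x2 factorial ANOVA of the kind reported in the abstract.
# Scores and group assignments are invented placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame({
    "score": rng.normal(300, 30, n),
    "condition": np.repeat(["diagnostic", "non-diagnostic"], n // 2),
    "aware": np.tile(["aware", "unaware"], n // 2),
})
model = ols("score ~ C(condition) * C(aware)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```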
Abstract:
In topographically flat wetlands, where a shallow water table and conductive soils may develop as a result of wet and dry seasons, the connection between surface water and groundwater is not only present but perhaps the key factor dominating the magnitude and direction of water flux. Because of their complex characteristics, modeling water flow through wetlands using more realistic process formulations (integrated surface-groundwater flow and vegetative resistance) is a real necessity. This dissertation focused on developing an integrated surface-subsurface hydrologic numerical simulation model by programming and testing the coupling of the USGS MODFLOW-2005 Groundwater Flow Process (GWF) package (USGS, 2005) with the 2D surface water routing model FLO-2D (O’Brien et al., 1993). The coupling included the procedures necessary to numerically integrate and verify both models as a single computational software system, hereafter referred to as WHIMFLO-2D (Wetlands Hydrology Integrated Model). An improved physical formulation of flow resistance through vegetation in shallow waters, based on the concept of drag force, was also implemented for the simulation of floodplains, while the use of the classical methods (e.g., Manning, Chezy, Darcy-Weisbach) to calculate flow resistance has been maintained for canals and deeper waters. A preliminary demonstration exercise of WHIMFLO-2D at an existing field site was developed for the Loxahatchee Impoundment Landscape Assessment (LILA), an 80-acre area located at the Arthur R. Marshall Loxahatchee National Wildlife Refuge in Boynton Beach, Florida. After applying a number of simplifying assumptions, the results illustrated the ability of the model to simulate the hydrology of a wetland. In this illustrative case, a comparison between measured and simulated stage levels showed an average error of 0.31% with a maximum error of 2.8%. Comparison of measured and simulated groundwater head levels showed an average error of 0.18% with a maximum of 2.9%. The coupling of the FLO-2D model with the MODFLOW-2005 model and the incorporation of the dynamic effect of flow resistance due to vegetation in the new modeling tool WHIMFLO-2D is an important contribution to the field of numerical modeling of hydrologic flow in wetlands.
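To make the two resistance treatments concrete, the sketch below contrasts the classical Manning relation retained for canals and deeper water with a generic drag-force term of the form used for flow through vegetation; the parameter values are illustrative placeholders, not LILA calibration data, and the drag formulation is a simplified stand-in for the dissertation's:

```python
# Classical Manning resistance vs. a generic drag-force term for vegetation.
RHO = 1000.0                      # water density, kg/m^3

def manning_velocity(n, R, S):
    """Manning's equation (SI units): V = (1/n) * R**(2/3) * S**(1/2)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

def vegetation_drag_per_volume(Cd, a, V):
    """Drag force per unit volume, F = 0.5 * rho * Cd * a * V**2,
    with a = frontal plant area per unit volume (1/m)."""
    return 0.5 * RHO * Cd * a * V ** 2

V = manning_velocity(n=0.035, R=0.6, S=1e-4)               # canal-type reach
drag = vegetation_drag_per_volume(Cd=1.2, a=0.8, V=0.05)   # vegetated marsh
print(f"Manning velocity: {V:.3f} m/s, vegetation drag: {drag:.3f} N/m^3")
```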
Abstract:
This dissertation introduces a novel automated book reader as an assistive technology tool for persons with blindness. The literature shows extensive work in the area of optical character recognition, but the current methodologies available for the automated reading of books or bound volumes remain inadequate and are severely constrained during document scanning or image acquisition. The goal of the book reader design is to automate and simplify the task of reading a book while providing a user-friendly environment with a realistic but affordable system design. This design responds to the main concerns of (a) providing a method of image acquisition that maintains the integrity of the source, (b) overcoming optical character recognition errors created by inherent imaging issues such as curvature effects and barrel distortion, and (c) determining a suitable method for accurate recognition of characters that yields an interface able to read from any open book with a reading accuracy nearing 98%. The initial aim of this research endeavor is the development of an assistive technology tool to help persons with blindness in the reading of books and other bound volumes. Its secondary and broader aim is to find in this design a suitable platform for the digitization of bound documentation, in line with the mission of the Open Content Alliance (OCA), a nonprofit alliance aimed at making reading materials available in digital form. The theoretical perspective of this research relates to the mathematical developments made in order to resolve both the inherent distortions due to the properties of the camera lens and the anticipated distortions of the changing page curvature as one leafs through the book. This is evidenced by the significant increase in the character recognition rate and a highly accurate read-out through text-to-speech processing. This reasonably priced interface, with its high performance and its compatibility with any computer or laptop through universal serial bus connectors, greatly extends the prospects for universal accessibility to documentation.
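The barrel-distortion step mentioned above is, in generic form, a standard lens-correction operation. The sketch below shows that generic step with OpenCV; the camera matrix and distortion coefficients are placeholders that would normally come from calibrating the reader's camera, and this is not the dissertation's own correction algorithm:

```python
# Generic barrel-distortion correction preceding OCR.
# Camera matrix and distortion coefficients are calibration placeholders.
import numpy as np
import cv2

image = np.full((480, 640), 255, dtype=np.uint8)   # placeholder page image

camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

undistorted = cv2.undistort(image, camera_matrix, dist_coeffs)
# `undistorted` would then be passed on to page flattening and OCR.
```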
Abstract:
Current artificial heart valves are classified as mechanical or bioprosthetic. An appealing pathway that promises to overcome the shortcomings of commercially available heart valves is offered by the interdisciplinary approach of cardiovascular tissue engineering. However, the mechanical properties of tissue-engineered heart valves (TEHV) are limited, and they generally fail in long-term use. To meet this performance challenge, a novel biodegradable triblock copolymer, poly(ethylene oxide)-poly(propylene oxide)-poly(ethylene oxide) (PEO-PPO-PEO, or F108), crosslinked to silk fibroin (F108-SilkC), was investigated for use as a tri-leaflet heart valve material. Synthesis of ten polymers with varying concentrations and thicknesses (55 µm, 75 µm, and 100 µm) was achieved via a covalent crosslinking scheme using bifunctional polyethylene glycol diglycidyl ether (PEGDE). Static and fatigue testing were used to assess the mechanical properties of the films, and hydrodynamic testing was performed to determine performance under a simulated left ventricular flow regime. The crosslinked copolymer (F108-SilkC) showed greater flexibility and resilience, but inferior ultimate tensile strength, with increasing PEGDE concentration. A concentration molar ratio of 80:1 (F108:Silk) and a thickness of 75 µm showed longer fatigue life in both tension-tension and bending fatigue tests. Four of the twelve valves designed satisfactorily complied with the minimum performance requirements of ISO 5840:2005. In conclusion, it was demonstrated that a degradable polymer in conjunction with silk fibroin met the performance demands of cardiovascular tissue engineering, specifically for aortic valve leaflet design. Thinner films (t < 75 µm) in conjunction with a stiffness lower than 320 MPa (80:1, F108:Silk) are essential for the correct functionality of the proposed heart valve biomaterial F108-SilkC. Fatigue tests were demonstrated to be a useful tool for characterizing biomaterials that undergo cyclic loading.
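As a hedged aside on why film thickness matters so much in these results, the sketch below computes the tensile stress in a thin strip and the plate bending stiffness, which scales with thickness cubed; the dimensions, loads, and Poisson's ratio are illustrative placeholders, not measured properties of F108-SilkC:

```python
# Tensile stress in a thin film strip (sigma = F / (w * t)) and plate
# flexural rigidity, which scales with thickness cubed. All values are
# illustrative placeholders.

def tensile_stress_mpa(force_n, width_mm, thickness_um):
    area_mm2 = width_mm * (thickness_um / 1000.0)
    return force_n / area_mm2            # N/mm^2 == MPa

def flexural_rigidity(E_mpa, thickness_um, poisson=0.4):
    """D = E * t^3 / (12 * (1 - nu^2)), in N*mm for these units."""
    t_mm = thickness_um / 1000.0
    return E_mpa * t_mm ** 3 / (12.0 * (1.0 - poisson ** 2))

print(tensile_stress_mpa(force_n=12.0, width_mm=5.0, thickness_um=75))  # ~32 MPa
print(flexural_rigidity(E_mpa=320.0, thickness_um=75))                  # bending stiffness
```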
Abstract:
There is limited scientific knowledge on the composition of human odor from different biological specimens and the effect that physiological and psychological health conditions could have on it. There is currently no direct comparison of the volatile organic compounds (VOCs) emanating from different biological specimens collected from healthy individuals as well as individuals with certain diagnosed medical conditions. Therefore, the question of matching VOCs present in human odor across various biological samples and across health statuses remains unanswered. The main purpose of this study was to use analytical instrumental methods to compare the VOCs from different biological specimens from the same individual and to compare the populations evaluated in this project. The goal of this study was to utilize headspace solid-phase microextraction gas chromatography mass spectrometry (HS-SPME-GC/MS) to evaluate its potential for profiling VOCs from specimens collected using standard forensic and medical methods across three different populations: a healthy group with no diagnosed medical or psychological condition, a group with diagnosed type 2 diabetes, and a group with diagnosed major depressive disorder. The pre-treatment methods for the collection materials developed for the study allowed for the removal of targeted VOCs from the sampling kits prior to sampling, extraction, and analysis. The optimized SPME-GC/MS conditions were demonstrated to be capable of sampling, identifying, and differentiating the VOCs present in the five biological specimens collected from different subjects and yielded excellent detection limits for the VOCs from buccal swabs, breath, blood, and urine, with an average limit of detection of 8.3 ng. Visual, Spearman rank correlation, and PCA comparisons of the most abundant and frequent VOCs from each specimen demonstrated that each specimen has characteristic VOCs that allow them to be differentiated for both healthy and diseased individuals. Preliminary comparisons of the VOC profiles of healthy individuals, patients with type 2 diabetes, and patients with major depressive disorder revealed compounds that could be used as potential biomarkers to differentiate between healthy and diseased individuals. Finally, a human biological specimen compound database was created compiling the volatile compounds present in the emanations of human hand odor, oral fluids, breath, blood, and urine.
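A hedged sketch of the two statistical comparisons named above follows: a Spearman rank correlation between two specimen profiles and a PCA of a specimen-by-compound abundance matrix. The abundance values are invented placeholders, not study data:

```python
# Spearman rank correlation between VOC profiles and PCA of a
# specimen-by-compound abundance matrix. Values are placeholders.
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# rows = specimens (e.g., breath, blood, urine, ...), columns = VOC abundances
profiles = rng.random((5, 20))

rho, p = spearmanr(profiles[0], profiles[1])   # e.g., breath vs. blood profiles
scores = PCA(n_components=2).fit_transform(profiles)

print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
print(scores)                                  # 2D coordinates for plotting
```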