11 results for digital forensic tool testing

in DRUM (Digital Repository at the University of Maryland)


Relevance:

30.00%

Publisher:

Abstract:

Traffic demand increases are pushing aging ground transportation infrastructures to their theoretical capacity. The result of this demand is traffic bottlenecks that are a major cause of delay on urban freeways. In addition, the queues associated with those bottlenecks increase the probability of a crash while adversely affecting environmental measures such as emissions and fuel consumption. With limited resources available for network expansion, traffic professionals have developed active traffic management systems (ATMS) in an attempt to mitigate the negative consequences of traffic bottlenecks. Among these ATMS strategies, variable speed limits (VSL) and ramp metering (RM) have been gaining international interest for their potential to improve safety, mobility, and environmental measures at freeway bottlenecks. Though previous studies have shown the tremendous potential of VSL and of VSL paired with ramp metering (VSLRM) control, little guidance has been developed to assist decision makers in the planning phase of a congestion mitigation project that is considering VSL or VSLRM control. To address this need, this study has developed a comprehensive decision/deployment support tool for the application of VSL and VSLRM control in recurrently congested environments. The tool will assist practitioners in deciding the most appropriate control strategy for a candidate site, which candidate sites have the most potential to benefit from the suggested control strategy, and how to most effectively design the field deployment of the suggested control strategy at each implementation site. To do so, the tool comprises three key modules: (1) a Decision Module, (2) a Benefits Module, and (3) a Deployment Guidelines Module. Each module uses commonly known traffic flow and geometric parameters as inputs to statistical models and empirically based procedures to provide guidance on the application of VSL and VSLRM at each candidate site. These models and procedures were developed from the outputs of simulated experiments calibrated with field data. To demonstrate the application of the tool, a list of real-world candidate sites was selected from the Maryland State Highway Administration Mobility Report. Field data from each candidate site were input into the tool to illustrate the step-by-step process required for efficient planning of VSL or VSLRM control. The output of the tool includes the suggested control system at each site, a ranking of the sites based on the expected benefit-to-cost ratio, and guidelines on how to deploy the VSL signs, ramp meters, and detectors at the deployment site(s). This research has the potential to assist traffic engineers in the planning of VSL and VSLRM control, thus enhancing the procedure for allocating limited resources for mobility and safety improvements on highways plagued by recurrent congestion.
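As a rough illustration of how a Decision Module of this kind might consume site parameters, the sketch below scores hypothetical candidate sites and chooses between VSL and VSLRM. The threshold values, field names, and decision rule are invented for illustration only; the dissertation's actual statistical models are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class CandidateSite:
    name: str
    aadt: int         # annual average daily traffic (veh/day)
    ramp_demand: int  # peak-hour on-ramp demand (veh/h)

def suggest_control(site: CandidateSite) -> str:
    """Toy decision rule: pair VSL with ramp metering (VSLRM) only when
    on-ramp demand is a substantial share of estimated mainline flow."""
    peak_hour_flow = site.aadt * 0.10  # crude K-factor estimate (veh/h)
    return "VSLRM" if site.ramp_demand > 0.15 * peak_hour_flow else "VSL"

def rank_sites(sites, benefit_cost):
    """Rank candidate sites by a caller-supplied benefit-to-cost estimate."""
    return sorted(sites, key=benefit_cost, reverse=True)

# Hypothetical sites, not drawn from the Maryland Mobility Report.
sites = [CandidateSite("Site A", 190_000, 1_400),
         CandidateSite("Site B", 230_000, 3_600)]
for s in sites:
    print(s.name, "->", suggest_control(s))  # Site A -> VSL, Site B -> VSLRM
```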

Relevance:

30.00%

Publisher:

Abstract:

Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested, and maintained, these shared components are repeatedly manipulated. As a result, there are often significant overlaps and synergies across and among the different test efforts of different component-based systems. However, in practice, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve the test effectiveness for each component, as well as for each component-based software system, by using information obtained when testing across multiple components. To achieve this goal, I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying the techniques. The dissertation research is organized in three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured to exist. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes to implement different collaborative testing algorithms and applied them to large, actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components. With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.
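The dissertation's infrastructure is not reproduced here, but the minimal sketch below, with invented names, shows the core data-sharing idea: a registry of per-component test outcomes, keyed by component version, that lets testers of different systems skip tests another team has already run.

```python
import json
from pathlib import Path

class SharedTestRegistry:
    """Minimal shared store of per-component test outcomes."""

    def __init__(self, path: Path):
        self.path = path
        self.results = json.loads(path.read_text()) if path.exists() else {}

    @staticmethod
    def _key(component: str, version: str, test: str) -> str:
        return f"{component}@{version}::{test}"

    def lookup(self, component: str, version: str, test: str):
        """Return a recorded outcome, or None if this test was never shared."""
        return self.results.get(self._key(component, version, test))

    def record(self, component: str, version: str, test: str, outcome: str):
        self.results[self._key(component, version, test)] = outcome
        self.path.write_text(json.dumps(self.results, indent=2))

# Usage: consult the registry before re-running a shared component's test.
registry = SharedTestRegistry(Path("shared_results.json"))
if registry.lookup("libfoo", "2.1.0", "test_parse") is None:
    registry.record("libfoo", "2.1.0", "test_parse", "pass")
```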

Relevance:

30.00%

Publisher:

Abstract:

With the continued miniaturization and increasing performance of electronic devices, new technical challenges have arisen. One such issue is delamination occurring at critical interfaces inside the device. This major reliability issue can occur during the manufacturing process or during normal use of the device. Proper evaluation of the adhesion strength of critical interfaces early in the product development cycle can help reduce reliability issues and the time-to-market of the product. However, conventional adhesion strength testing is inherently limited in the face of package miniaturization, which brings further technical challenges to quantifying design integrity and reliability. Although there are many different interfaces in today's advanced electronic packages, they can be generalized into two main categories: (1) rigid-to-rigid connections with a thin flexible polymeric layer in between, or (2) a thin film membrane on a rigid structure. Because every technique has its own advantages and disadvantages, multiple testing methods must be enhanced and developed to accommodate all the interfaces encountered in emerging electronic packaging technologies. For evaluating high-adhesion-strength interfaces in thin multilayer structures, a novel test configuration called the “single cantilever adhesion test (SCAT)” is proposed and implemented for an epoxy molding compound (EMC) and photo solder resist (PSR) interface. The test method is then shown to be capable of comparing two potential EMC/PSR material sets and selecting the stronger one. Additionally, a theoretical approach for establishing the applicable testing domain of a four-point bending test method is presented. For evaluating polymeric films on rigid substrates, the major testing challenges are reducing scatter and factoring in the potentially degrading effect of environmental conditioning on the material properties of the film. An advanced blister test with a predefined area was developed that incorporates an elasto-plastic analytical solution, and it was implemented for a conformal coating used to prevent tin whisker growth. The method was then extended with a numerical approach for evaluating adhesion strength when the polymer film's properties are unknown.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation focuses on design challenges caused by secondary impacts to printed wiring assemblies (PWAs) within hand-held electronics due to accidental drop or impact loading. The continuing increase in functionality, miniaturization, and affordability has resulted in a decrease in the size and weight of handheld electronic products. As a result, PWAs have become thinner and the clearances between surrounding structures have decreased. The resulting increase in flexibility of the PWAs, in combination with the reduced clearances, requires new design rules to minimize and survive possible internal collisions between PWAs and surrounding structures. Such collisions are termed ‘secondary impacts’ in this study. The effect of secondary impact on the board-level drop reliability of printed wiring boards (PWBs) assembled with MEMS microphone components is investigated using a combination of testing, response and stress analysis, and damage modeling. The response analysis is conducted using a combination of numerical finite element modeling and simplified analytic models for additional parametric sensitivity studies.
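The abstract pairs finite element modeling with simplified analytic models. A generic example of the latter, under assumptions of my own (a single-degree-of-freedom damped oscillator, a half-sine shock pulse, and entirely hypothetical parameter values), estimates whether the board's peak deflection exceeds the clearance to a neighboring structure, i.e., whether a secondary impact would occur:

```python
import numpy as np
from scipy.integrate import solve_ivp

m = 0.015    # effective board mass (kg) -- hypothetical
f_n = 250.0  # first natural frequency (Hz) -- hypothetical
zeta = 0.03  # damping ratio -- hypothetical
k = m * (2 * np.pi * f_n) ** 2
c = 2 * zeta * np.sqrt(k * m)

A, T = 1500 * 9.81, 0.5e-3  # 1500 g, 0.5 ms half-sine drop pulse
clearance = 0.5e-3          # gap to the nearest internal structure (m)

def base_accel(t):
    """Half-sine acceleration pulse applied to the housing."""
    return A * np.sin(np.pi * t / T) if t <= T else 0.0

def rhs(t, y):
    x, v = y  # board deflection and velocity relative to the housing
    return [v, (-c * v - k * x) / m - base_accel(t)]

sol = solve_ivp(rhs, (0.0, 0.02), [0.0, 0.0], max_step=1e-5)
peak = np.max(np.abs(sol.y[0]))
print(f"peak deflection {peak * 1e3:.2f} mm; "
      f"secondary impact predicted: {peak > clearance}")
```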

Relevance:

30.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, the same is not true of software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared among a number of test cases that fail for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach, and it is integrated with an existing integrated development environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both running time and output subsequence length.
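A minimal sketch of the common-subsequence idea, assuming traces are lists of executed statements or events: fold a pairwise longest common subsequence (LCS) over all failing traces. A pairwise fold yields a common subsequence but not necessarily the longest one across all traces (multi-sequence LCS is NP-hard in general), and the dissertation's optimized algorithm is not reproduced here.

```python
from functools import reduce

def lcs(a, b):
    """Longest common subsequence of two event sequences, via dynamic programming."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n - 1, -1, -1):
        for j in range(m - 1, -1, -1):
            dp[i][j] = (dp[i + 1][j + 1] + 1 if a[i] == b[j]
                        else max(dp[i + 1][j], dp[i][j + 1]))
    out, i, j = [], 0, 0          # reconstruct one LCS from the DP table
    while i < n and j < m:
        if a[i] == b[j]:
            out.append(a[i]); i += 1; j += 1
        elif dp[i + 1][j] >= dp[i][j + 1]:
            i += 1
        else:
            j += 1
    return out

def common_suspicious_path(failing_traces):
    """Fold LCS over all failing traces; the survivor is a candidate
    faulty execution path shared by every failing test case."""
    return reduce(lcs, failing_traces)

# Hypothetical failing-test traces for illustration.
traces = [
    ["open", "read", "parse", "close", "report"],
    ["open", "seek", "parse", "report"],
    ["open", "parse", "log", "report"],
]
print(common_suspicious_path(traces))  # ['open', 'parse', 'report']
```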

Relevance:

30.00%

Publisher:

Abstract:

Career decision-making self-efficacy and the Big Five traits of neuroticism, extraversion, and conscientiousness were examined as predictors of career indecision in a sample of 181 undergraduates. Participants completed an online survey. I predicted that the Big Five traits and career decision-making self-efficacy would (a) interrelate moderately and (b) each relate significantly and moderately to career indecision. In addition, I predicted that career decision-making self-efficacy would partially mediate the relationships between the Big Five traits and career indecision, while the Big Five traits were predicted to moderate the relationship between career decision-making self-efficacy and career indecision. Finally, I predicted that career decision-making self-efficacy would account for a greater amount of unique variance in career indecision than the Big Five traits. All predicted correlations were significant. Career decision-making self-efficacy fully mediated the relationship of extraversion to career indecision and partially mediated the relationships of neuroticism and conscientiousness to career indecision. Conscientiousness was found to moderate the relationship of career decision-making self-efficacy to career indecision, such that the negative relation between self-efficacy and career indecision was stronger in the presence of high conscientiousness. This study builds upon existing research on the prediction of career indecision by examining potential mediating and moderating relationships.
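For readers unfamiliar with the moderation analysis described above, the sketch below fits a regression with an interaction term on synthetic data shaped like the reported pattern. The variable names and simulated effect sizes are invented; this is not the study's data or exact analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 181  # matches the study's sample size; the data here are synthetic
df = pd.DataFrame({
    "cdse": rng.normal(size=n),  # career decision-making self-efficacy
    "conscientiousness": rng.normal(size=n),
})
# Simulate indecision so the negative self-efficacy effect is stronger
# at high conscientiousness -- the moderation pattern reported above.
df["indecision"] = (-0.5 * df["cdse"]
                    - 0.3 * df["cdse"] * df["conscientiousness"]
                    + rng.normal(scale=0.8, size=n))

# "a * b" expands to a + b + a:b, i.e., both main effects plus the interaction.
fit = smf.ols("indecision ~ cdse * conscientiousness", data=df).fit()
print(fit.params)  # a significant cdse:conscientiousness term indicates moderation
```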

Relevance:

30.00%

Publisher:

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved upon. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach for comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question in mind and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery. Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and of the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts. This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
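A minimal sketch of the batch-testing pattern behind an approach like HVHT: run one statistical test per metric across two cohorts, then correct the whole batch for multiple comparisons. The metrics, the synthetic data, and the Mann-Whitney/Benjamini-Hochberg choices are illustrative assumptions, not the dissertation's exact method.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)

# Synthetic per-record metrics for two cohorts (e.g., event durations,
# event frequencies, gap times) -- one column per metric.
metrics = [f"metric_{i}" for i in range(40)]
cohort_a = rng.normal(0.0, 1.0, size=(200, 40))
cohort_b = rng.normal(0.0, 1.0, size=(180, 40))
cohort_b[:, :5] += 0.6  # plant a real difference in the first five metrics

# One hypothesis test per metric ...
pvals = [mannwhitneyu(cohort_a[:, j], cohort_b[:, j]).pvalue
         for j in range(len(metrics))]

# ... then control the false discovery rate across the whole batch,
# since running many tests at once inflates spurious "findings".
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for name, r, p in zip(metrics, reject, p_adj):
    if r:
        print(f"{name}: adjusted p = {p:.4f}")
```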

Relevance:

30.00%

Publisher:

Abstract:

The call to access and preserve the state records that document crimes committed by the state during Guatemala’s civil war has become an archival imperative entangled with neoliberal human rights discourses of “truth, justice, and memory.” In Guatemala’s civil war, 200,000 people were killed or disappeared, including acts of genocide in which 85% of massacres involved sexual violence committed against Mayan women. This dissertation argues that, in an attempt to tell the official story of the civil war, American human rights organizations and academic institutions have constructed a normative identity whose humanity is attached to a scientific and evidentiary value as well as an archival status representing the materiality and institutionality of the record. Consequently, human rights discourses grounded in Western knowledges, in particular archival science and law, which prioritize the appearance of truth, erase the material and epistemological experience of indigenous women during wartime. As a result, the subjectivity that has surfaced on the record as most legible has mostly pertained to non-indigenous, middle-class, urban, leftist men who were victims of enforced disappearance, not genocide. This dissertation investigates this conflicting narrative, which remembers a non-indigenous revolutionary masculine hero and grants him justice in human rights courtrooms simply because of a document attesting to his death. A main research question addressed in this project is why the promise of “truth and justice” under the name of human rights becomes a contentious site for gendered indigenous bodies. I conduct a discursive and rhetorical analysis of documentary film and of declassified Guatemalan police and military records such as Operation Sofia, a military log known for “documenting the genocide” during rural counterinsurgencies executed by the military. I interrogate the ways in which racialized feminicides, or the hyper-sexualized racial violence that has historically dehumanized indigenous women, fall outside the discourses of vision constructed by Western positivist knowledges to reinscribe the ideal human rights subject. I argue for alternative epistemological frames that recognize genocide as sexualized and gendered structures that have simultaneously produced racialized feminicides, in order to disrupt the colonial structures of capitalism, patriarchy, and heterosexuality. Ironically, these structures of power remain untouched by the dominant human rights discourse and its academic, NGO, and state collaborators that seek “truth and justice” in post-conflict Guatemala.

Relevance:

30.00%

Publisher:

Abstract:

Human immunodeficiency virus (HIV) infection is a condition in which immune cells are destroyed such that the body may become unable to fight off infections. Engaging in risk-taking behaviors (e.g., substance use) puts people at heightened risk for HIV infection, with mid-to-late adolescents at increasing risk (Leigh & Stall, 1993). Environmental and neurological reasons have been suggested for increased risk-taking among adolescents. First, family-level precursors such as parent-adolescent conflict have been significantly associated with, and may pose risk for, engaging in substance use and risk-taking (Duncan, Duncan, Biglan, & Ary, 1998). Thus, parent-adolescent conflict may be an important proximal influence on HIV risk behaviors (Lester et al., 2010; Rowe, Wang, Greenbaum, & Liddle, 2008). Yet the temporal relation between parent-adolescent conflict and adolescent HIV risk-taking behaviors is still unknown. Second, at-risk adolescents may carry a neurobiological predisposition for engaging in trait-like expressions of disinhibited behavior and other risk-taking behaviors (Iacono, Malone, & McGue, 2008). When exposed to interpersonally stressful situations, their likelihood of engaging in HIV risk behaviors may increase. To investigate the role of parent-adolescent conflict in adolescent HIV risk-taking behaviors, 49 adolescents ages 14-17 and one of their parents were randomly assigned to complete a standardized discussion task on either a control topic or a conflict topic. Immediately after the discussion, adolescents completed a laboratory risk-taking measure. In a follow-up visit, eligible adolescents underwent electrophysiological (EEG) recording while completing a task designed to assess the presence of a neurobiological marker for behavioral disinhibition, which I hypothesized would moderate the links between conflict and risk-taking. First, findings indicated that during the discussion task, adolescents in the conflict condition evidenced a significantly greater psychophysiological stress response relative to adolescents in the control condition. Second, a neurobiological marker of behavioral disinhibition moderated the relation between discussion condition and adolescent risk-taking, such that adolescents with relatively high levels of a marker related to sensation-seeking showed greater risk-taking following the conflict condition, relative to the control condition. Lastly, I observed no significant relation among parent-adolescent conflict, the neurobiological marker of behavioral disinhibition, and adolescent engagement in real-world risk-taking behavior.

Relevance:

30.00%

Publisher:

Abstract:

The literature on the determination of flammability limits was reviewed, and experts on the ASTM E681 standard were interviewed to identify new means of improving the reproducibility of the ASTM E681 test. Venting was identified as a variable affecting flammability limits that had not yet been addressed. Limitations of the current system for sealing and venting (a rubber stopper) were identified and addressed through the development of a custom burst disc. The burst disc was evaluated for its ability to hold and maintain a vacuum, its ability to vent at pressures of interest, and its venting behavior. The burst disc was deemed a satisfactory alternative to the rubber stopper and is recommended for inclusion in the ASTM E681 standard.