4 results for Interaction Techniques

in DRUM (Digital Repository at the University of Maryland)


Relevance:

60.00%

Publisher:

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved upon. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach for comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery. Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts.
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
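The core idea behind high-volume hypothesis testing as described above -- running a battery of statistical tests across structural, attribute, and timestamp metrics for two cohorts and then controlling for multiple comparisons -- can be illustrated with a small sketch. The record format, metric set, choice of the Mann-Whitney U test, and Benjamini-Hochberg correction below are illustrative assumptions, not the dissertation's actual implementation.

```python
# Illustrative sketch of high-volume hypothesis testing (HVHT) over two
# cohorts of timestamped event sequences. The record format, metrics, and
# statistical tests are assumptions for demonstration only.
from scipy.stats import mannwhitneyu

# Each record: {"events": [(timestamp, event_type), ...], "gender": "F", ...}
def sequence_length(record):
    return len(record["events"])

def total_duration(record):
    times = [t for t, _ in record["events"]]
    return max(times) - min(times) if times else 0.0

def count_of(event_type):
    return lambda record: sum(1 for _, e in record["events"] if e == event_type)

def hvht(cohort_a, cohort_b, metrics, alpha=0.05):
    """Run one test per metric, then apply Benjamini-Hochberg correction."""
    results = []
    for name, metric in metrics.items():
        a = [metric(r) for r in cohort_a]
        b = [metric(r) for r in cohort_b]
        _, p = mannwhitneyu(a, b, alternative="two-sided")
        results.append((name, p))
    # Benjamini-Hochberg: keep the largest k with p_(k) <= (k/m) * alpha.
    m = len(results)
    ranked = sorted(results, key=lambda r: r[1])
    cutoff = 0
    for k, (_, p) in enumerate(ranked, start=1):
        if p <= k / m * alpha:
            cutoff = k
    significant = {name for name, _ in ranked[:cutoff]}
    return [(name, p, name in significant) for name, p in ranked]

metrics = {
    "sequence length": sequence_length,
    "total duration": total_duration,
    "count of 'ER visit'": count_of("ER visit"),
}
# Usage (given two lists of records, cohort_a and cohort_b):
# for name, p, sig in hvht(cohort_a, cohort_b, metrics): print(name, p, sig)
```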

Relevance:

30.00%

Publisher:

Abstract:

Numerous studies of the dual-mode scramjet isolator, a critical component in preventing inlet unstart and/or vehicle loss by containing a collection of flow disturbances called a shock train, have been performed since the dual-mode propulsion cycle was introduced in the 1960s. Low momentum corner flow and other three-dimensional effects inherent to rectangular isolators have, however, been largely ignored in experimental studies of the boundary-layer-separation-driven isolator shock train dynamics. Furthermore, the use of two-dimensional diagnostic techniques in past works, be it single-perspective line-of-sight schlieren/shadowgraphy or single-axis wall pressure measurements, has been unable to resolve the three-dimensional flow features inside the rectangular isolator. These flow characteristics need to be thoroughly understood if robust dual-mode scramjet designs are to be fielded. The work presented in this thesis is focused on experimentally analyzing shock train/boundary layer interactions from multiple perspectives in aspect ratio 1.0, 3.0, and 6.0 rectangular isolators with inflow Mach numbers ranging from 2.4 to 2.7. Secondary steady-state Computational Fluid Dynamics studies are performed to compare to the experimental results and to provide additional perspectives of the flow field. Specific issues that remain unresolved after decades of isolator shock train studies and that are addressed in this work include the three-dimensional formation of the isolator shock train front, the spatial and temporal low momentum corner flow separation scales, the transient behavior of shock train/boundary layer interaction at specific coordinates along the isolator's lateral axis, and effects of the rectangular geometry on semi-empirical relations for shock train length prediction. A novel multiplane shadowgraph technique is developed to resolve the structure of the shock train along both the minor and major duct axes simultaneously. It is shown that the shock train front is of a hybrid oblique/normal nature. Initial low momentum corner flow separation spawns the formation of oblique shock planes which interact and proceed toward the center flow region, becoming more normal in the process. The hybrid structure becomes more two-dimensional as aspect ratio is increased, but corner flow separation precedes center flow separation on the order of 1 duct height for all aspect ratios considered. Additional instantaneous oil flow surface visualization shows the symmetry of the three-dimensional shock train front around the lower wall centerline. Quantitative synthetic schlieren visualization shows that the density gradient magnitude approximately doubles between the corner oblique and center flow normal structures. Fast response pressure measurements acquired near the corner region of the duct show preliminary separation in the outer regions preceding centerline separation on the order of 2 seconds. Non-intrusive Focusing Schlieren Deflectometry Velocimeter measurements reveal that both shock train oscillation frequency and velocity component decrease as measurements are taken away from the centerline and towards the side-wall region, and confirm the more two-dimensional shock train front approximation for higher aspect ratios.
An updated modification to Waltrup & Billig's original semi-empirical shock train length relation for circular ducts, based on centerline pressure measurements, is introduced to account for rectangular isolator aspect ratio, upstream corner separation length scale, and major- and minor-axis boundary layer momentum thickness asymmetry. The latter is derived both experimentally and computationally, and it is shown that the major-axis (side-wall) boundary layer has lower momentum thickness compared to the minor-axis (nozzle bounded) boundary layer, making it more separable. Furthermore, it is shown that the updated correlation drastically improves shock train length prediction capabilities in higher aspect ratio isolators. This thesis suggests that performance analysis of rectangular confined supersonic flow fields can no longer be based on observations and measurements obtained along a single axis alone. Knowledge gained by the work performed in this study will allow for the development of more robust shock train leading edge detection techniques and isolator designs which can greatly mitigate the risk of inlet unstart and/or vehicle loss in flight.
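For orientation, the original Waltrup & Billig relation that this thesis modifies is a semi-empirical correlation for shock train length in circular ducts. The sketch below encodes the commonly cited form of that original correlation; the constants and variable names are taken from that published form, and the aspect-ratio, corner-separation, and momentum-thickness-asymmetry corrections developed in the thesis are not reproduced here.

```python
# Sketch of the classic Waltrup & Billig shock train length correlation for
# circular ducts (NOT the rectangular-isolator correction developed in this
# thesis). Commonly cited form:
#   s (M1^2 - 1) Re_theta^(1/4) / (D^(1/2) theta^(1/2))
#       = 50 (p2/p1 - 1) + 170 (p2/p1 - 1)^2
def shock_train_length(mach_in, pressure_ratio, duct_diameter,
                       momentum_thickness, re_theta):
    """Estimate shock train length s for a circular duct.

    mach_in            -- isolator inflow Mach number M1
    pressure_ratio     -- back-to-inflow static pressure ratio p2/p1
    duct_diameter      -- duct diameter D (m)
    momentum_thickness -- boundary layer momentum thickness theta (m)
    re_theta           -- Reynolds number based on momentum thickness
    """
    dp = pressure_ratio - 1.0
    rhs = 50.0 * dp + 170.0 * dp ** 2
    return (rhs * (duct_diameter * momentum_thickness) ** 0.5
            / ((mach_in ** 2 - 1.0) * re_theta ** 0.25))

# Hypothetical example: Mach 2.6 inflow, p2/p1 = 4, D = 0.05 m,
# theta = 0.5 mm, Re_theta = 1e4.
# print(shock_train_length(2.6, 4.0, 0.05, 5e-4, 1e4))
```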

Relevance:

30.00%

Publisher:

Abstract:

The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other. It is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, visible GUI behaviors, and static analysis of the GUI's program-code. GUIs are typically dynamic in nature; their user-visible state is guided by underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based software application and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: Long, useful GUI testcases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code level interactions between GUI event handlers are examined, modelled, and deployed for constructing long GUI testcases. These testcases are able to drive the GUI into states that were not possible using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
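One way to picture the approach -- extending an event-flow style model with program-code level interactions between event handlers -- is sketched below. The toy event-flow edges, the notion of handlers "interacting" through shared fields, and the path-generation heuristic are simplified illustrations, not GUITAR's actual model or API.

```python
# Illustrative sketch: generating long GUI testcases that chain events whose
# handlers interact at the program-code level (here, "interact" means the
# handlers read or write a shared field). Model and heuristic are assumptions.

# Event-flow edges: which GUI event may follow which.
follows = {
    "OpenFile": ["EditText", "Save", "Close"],
    "EditText": ["EditText", "Save", "Undo", "Close"],
    "Undo":     ["EditText", "Save", "Close"],
    "Save":     ["EditText", "Close"],
    "Close":    [],
}

# Fields each event handler reads/writes, as found by static analysis.
fields = {
    "OpenFile": {"buffer", "path"},
    "EditText": {"buffer", "dirty"},
    "Undo":     {"buffer", "dirty"},
    "Save":     {"buffer", "path", "dirty"},
    "Close":    {"dirty"},
}

def interacts(e1, e2):
    """Two handlers interact if they touch at least one common field."""
    return bool(fields[e1] & fields[e2])

def long_testcases(start, max_len):
    """Enumerate event sequences in which every consecutive pair of events
    both follows the event-flow model and interacts in the program code."""
    stack = [[start]]
    while stack:
        case = stack.pop()
        yield case
        if len(case) < max_len:
            last = case[-1]
            for nxt in follows[last]:
                if interacts(last, nxt):
                    stack.append(case + [nxt])

# Example: print test cases of up to four events starting from OpenFile.
for case in long_testcases("OpenFile", 4):
    print(" -> ".join(case))
```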

Relevance:

30.00%

Publisher:

Abstract:

Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in a way that the participants do not see each other's data; they see only the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game will be divided into rounds of local decision-making (e.g., bidding) and joint interaction (e.g., dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run the MPC programs, leaving open the potential for security holes that can compromise the privacy of parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC Domain Specific Language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured, verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) It enables programmers to formally verify the correctness and security properties of their programs. As far as we know, Wys* is the first language to provide verification capabilities for MPC programs. (b) It provides a partially verified toolchain to run MPC programs, and finally (c) It enables the MPC programs to use, with no extra effort, standard language constructs from the host language F*, thereby making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs, while providing privacy guarantees similar to those of the monolithic versions.
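To make the distinction between local ("normal") and joint ("secure") modes concrete, the following is a toy additive-secret-sharing sketch of a three-party secure sum interleaved with local steps. It illustrates only the mixed-mode programming idea; it is not Wysteria or Wys* code, and since the shares are exchanged in the clear within one process, it provides no real cryptographic guarantees.

```python
# Toy illustration of a mixed-mode computation: each party runs a local
# ("normal" mode) step, then the parties jointly compute a sum under additive
# secret sharing ("secure" mode). Not Wysteria/Wys*; no real security.
import secrets

MODULUS = 2 ** 61 - 1  # all arithmetic is done modulo a fixed prime

def share(value, n_parties):
    """Split `value` into n additive shares that sum to value mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def local_step(bid):
    """'Normal' mode: purely per-party local decision-making (e.g., bidding)."""
    return max(bid, 0)  # e.g., clamp a locally chosen bid

def secure_sum(private_inputs):
    """'Secure' mode: jointly compute the sum without any party seeing
    another party's input (each party only ever sees individual shares)."""
    n = len(private_inputs)
    # Each party i splits its input into one share per party.
    all_shares = [share(x, n) for x in private_inputs]
    # Party j locally adds the shares it received (column j).
    partial_sums = [sum(all_shares[i][j] for i in range(n)) % MODULUS
                    for j in range(n)]
    # Only the sum of the partial sums -- the final output -- is revealed.
    return sum(partial_sums) % MODULUS

bids = [local_step(b) for b in (12, 7, 30)]  # local mode, one step per party
print(secure_sum(bids))                      # secure mode: prints 49
```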