863 results for interactive visualization
Abstract:
Drawing on an empirical study of public transport, this paper studies interactive value formation at the provider-customer interface from a practice-theory perspective. In contrast to the bulk of previous research, it argues that interactive value formation is associated not only with value co-creation but also with value co-destruction. The paper identifies five interaction value practices (informing, greeting, delivering, charging, and helping) and theorizes how interactive value formation takes place, as well as how value is intersubjectively assessed by actors at the provider-customer interface. Furthermore, it distinguishes four types of interactive value formation praxis, corresponding to four subject positions that practitioners step into when engaging in interactive value formation.
Abstract:
Studies of fluid-structure interactions associated with flexible structures such as flapping wings require capturing and quantifying the large motions of bodies that may be opaque. Motion capture of a free-flying insect is considered using three synchronized high-speed cameras. A solid finite element representation is used as a reference body, and successive snapshots in time of the displacement fields are reconstructed via an optimization procedure. An objective function is formulated, and various shape-difference definitions are considered. The proposed methodology is first studied on a synthetic case of a flexible cantilever structure undergoing large deformations, and then applied to a Manduca sexta (hawkmoth) in free flight. The three-dimensional motions of this flapping system are reconstructed from image data collected with the three cameras, and the complete deformation geometry of the system is analyzed. Finally, a computational investigation is carried out to understand the flow physics and aerodynamic performance by prescribing the body and wing motions in a fluid-body code. This thesis contains one of the first sets of such motion visualization and deformation analyses carried out for a hawkmoth in free flight. The tools and procedures used in this work are widely applicable to studies of other flying animals with flexible wings, as well as synthetic systems with flexible body elements.
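The optimization procedure above minimizes a shape difference between the deformed reference body and the observed geometry. The thesis considers several shape-difference definitions that are not reproduced here; as a minimal illustrative sketch, one simple measure is the RMS distance between corresponding finite element nodes:

```python
import numpy as np

def shape_difference(nodes_a, nodes_b):
    """One possible shape-difference measure (an illustrative assumption,
    not a definition from the thesis): root-mean-square distance between
    corresponding finite element nodes of two body configurations.
    nodes_a, nodes_b: (n, 3) arrays of node coordinates."""
    per_node_sq = np.sum((nodes_a - nodes_b) ** 2, axis=1)  # squared distance per node
    return np.sqrt(np.mean(per_node_sq))
```

An optimizer would then search for the displacement field that drives this measure toward zero for each snapshot in time.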
Abstract:
This quantitative study examines the impact of teacher practices on student achievement in classrooms where the English is Fun Interactive Radio Instruction (IRI) programs were being used. A contemporary IRI design using a dual-audience approach, the English is Fun IRI programs delivered daily English language instruction to students in grades 1 and 2 in Delhi and Rajasthan through 120 30-minute programs via broadcast radio (the first audience), while modeling pedagogical techniques and behaviors for their teachers (the second audience). Few studies have examined how the dual-audience approach influences student learning. Using existing data from 32 teachers and 696 students, this study employs a multivariate multilevel model to examine the role of the primary expectations for teachers (e.g., setting up the IRI classroom, following instructions from the radio characters, and ensuring students are participating) and of the secondary expectations for teachers (e.g., modeling pedagogies and facilitating learning beyond the instructions) in promoting students' learning in English listening skills, knowledge of vocabulary, and use of sentences. The study finds that teacher practice on both sets of expectations mattered, but that practice on the secondary expectations mattered more. As expected, students made the smallest gains in the most difficult linguistic task (sentence use). The extent to which teachers satisfied the primary and secondary expectations was associated with gains in all three skills, confirming the relationship between students' English proficiency and teacher practice in a dual-audience program. For gains in students' sentence-use scores, greater teacher focus on the primary expectations had a negative effect on performance in both states. In all, teacher practice clearly mattered, but not in the same way for all three skills.
An optimal scenario for teacher practice is presented in which gains in all three skills are maximized. These findings have important implications for how the classroom teacher is cast in IRI programs that use a dual-audience approach, and for how IRI programs are contracted when the teacher's role in instruction is minimized and instructional support is limited to the IRI lessons alone.
Abstract:
Although large-scale public hypermedia structures such as the World Wide Web are popularly referred to as "cyberspace", the extent to which they constitute a space in the everyday sense of the word is questionable. This paper reviews recent work in the area of three-dimensional (3D) visualization of the Web that has attempted to depict it in the form of a recognizable space; in other words, as a navigable landscape that may be visibly populated by its users. Our review begins by introducing a range of visualizations that address different aspects of using the Web. These include visualizations of Web structure, especially of links, that act as 3D maps; browsing history; searches; the evolution of the Web; and the presence and activities of multiple users. We then summarize the different techniques employed by these visualizations. We conclude with a discussion of key challenges for the future.
Abstract:
Individuals living in highly networked societies publish a large amount of personal, and potentially sensitive, information online. Web investigators can exploit such information for a variety of purposes, such as background vetting and fraud detection. However, such investigations require many expensive person-hours of effort. This paper describes InfoScout, a search tool intended to reduce the time it takes to identify and gather subject-centric information on the Web. InfoScout collects relevance-feedback information from the investigator in order to re-rank search results, allowing the intended information to be discovered more quickly. Users may still direct their search as they see fit, issuing ad-hoc queries and filtering existing results by keywords. Design choices are informed by prior work and industry collaboration.
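The abstract does not specify InfoScout's re-ranking algorithm. As a hedged sketch of the general idea, relevance feedback can be implemented Rocchio-style: the query vector is pulled toward the centroid of results the investigator marked relevant, and documents are re-scored by cosine similarity (the vector-space model and the `alpha`/`beta` weights are illustrative assumptions, not details from the paper):

```python
import numpy as np

def rerank(doc_vectors, query_vec, relevant_idx, alpha=1.0, beta=0.75):
    """Rocchio-style relevance feedback: shift the query toward the
    centroid of documents marked relevant (by index), then score all
    documents by cosine similarity to the updated query.
    Returns (ranking of document indices, per-document scores)."""
    query = np.asarray(query_vec, dtype=float)
    if relevant_idx:  # a (possibly empty) list of marked document indices
        centroid = doc_vectors[relevant_idx].mean(axis=0)
        query = alpha * query + beta * centroid
    norms = np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query)
    scores = doc_vectors @ query / np.where(norms == 0.0, 1.0, norms)
    return np.argsort(-scores), scores
```

Marking a document relevant raises the scores of documents similar to it, so subject-centric pages surface earlier in subsequent result lists.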
Abstract:
We revisit the visibility problem, traditionally known in the computer graphics and vision fields as the process of computing the (potentially) visible set of primitives in the computational model of a scene. We propose a hybrid solution that uses a lean data structure (in the sense of data reduction), a triangulation of the type , to accelerate the task of searching for visible primitives. The solution is useful for real-time, online, interactive applications such as 3D visualization, where the main goal is to load as few primitives from the scene as possible during the rendering stage. For this purpose, our algorithm performs culling using a hybrid paradigm based on view-frustum, back-face, and occlusion models. Results show substantial improvement over these traditional approaches applied separately. This novel approach can be used on devices with no dedicated graphics processor or with low processing power, such as cell phones or embedded displays, or to visualize data over the Internet, as in virtual museum applications.
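Of the three culling stages named above, back-face culling is the simplest to illustrate. A minimal sketch (not the paper's actual implementation, which combines all three stages over its triangulation structure): a triangle with counter-clockwise winding is discarded when its outward normal points away from the eye point.

```python
import numpy as np

def backface_cull(triangles, eye):
    """Keep only triangles whose outward normal faces the eye point.
    triangles: (n, 3, 3) array of vertex positions, counter-clockwise
    winding assumed; eye: (3,) viewer position."""
    v0, v1, v2 = triangles[:, 0], triangles[:, 1], triangles[:, 2]
    normals = np.cross(v1 - v0, v2 - v0)   # outward normal for CCW winding
    to_eye = eye - v0                      # vector from each triangle to the viewer
    visible = np.einsum('ij,ij->i', normals, to_eye) > 0
    return triangles[visible]
```

In a hybrid pipeline this test runs after view-frustum rejection and before the more expensive occlusion query, so each stage sees fewer primitives than the last.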
Abstract:
In this study, the relationship between heterogeneous nucleate boiling surfaces and the deposition of suspended metallic colloidal particles, popularly known as crud or corrosion products in the process industries, on those heterogeneous sites is investigated. Various researchers have reported that hematite is a major constituent of crud, which makes it the primary material of interest; however, the models developed in this work are independent of the material choice. Qualitative hypotheses on the deposition process under boiling proposed by previous researchers have been tested and fail to explain several of the physical mechanisms observed and analyzed. In this study, a quantitative model of the deposition rate has been developed on the basis of bubble dynamics and the colloid-surface interaction potential. Boiling from a heating surface aids the aggregation of metallic particulates (e.g., nanoparticles and crud particulates) suspended in a liquid, which helps transport them to heating surfaces. Consequently, clusters of particles deposit onto the heating surfaces due to various interactive forces, resulting in the formation of porous or impervious layers. The deposit layer grows or recedes depending on variations in interparticle and surface forces, fluid shear, fluid chemistry, and other factors. This deposit layer in turn affects the rate of bubble generation, the formation of porous chimneys, the critical heat flux (CHF) of surfaces, and the activation and deactivation of nucleation sites on the heating surfaces. The effect of boiling on colloidal deposition poses several problems, ranging from research initiatives involving nanofluids as a heat transfer medium to industrial applications such as light water nuclear reactors. This study attempts to integrate colloid and surface science with vapor bubble dynamics, boiling heat transfer, and evaporation rate.
Pool boiling experiments with dilute metallic colloids have been conducted to investigate several parameters impacting the system. The experimental data available in the literature come from flow experiments, which do not help correlate the boiling mechanism with the amount or structure of the deposit. With the help of experimental evidence and analysis, the previously proposed hypothesis of particle transport to the contact line due to hydrophobicity has been challenged. The experimental observations suggest that deposition occurs around the bubble contact line and extends to the area underneath the bubble microlayer as well. During evaporation, a concentration gradient of the non-volatile species is created, which induces osmotic pressure. The osmotic pressure developed inside the microlayer draws more particles into the microlayer region and toward the contact line. The colloidal escape time is longer than the evaporation time, which leads to aggregation of particles in the evaporating microlayer. These aggregated particles deposit onto, or are removed from, the heating surface depending on their total interaction potential. The interaction potential has been computed from the surface charge and van der Waals potential for the materials in aqueous solutions. Based on the interaction-force boundary layer thickness, which is governed by the Debye length (or, equivalently, ionic concentration and pH), a simplified quantitative model for the attachment kinetics is proposed. This model gives reasonable results in predicting attachment rates against data reported by previous researchers. The attachment kinetics study has been carried out for different pH levels and particle sizes for hematite particles. Quantification of colloidal transport under boiling is done using overall average evaporation rates, because the waiting time for bubbles at a given position is generally much larger than the growth time.
In other words, from a larger, measurable-scale perspective, the frequency of bubbles dictates the rate of particle collection, rather than the evaporation rate during the microlayer evaporation of a single bubble. The combination of attachment kinetics and colloidal transport kinetics has been used to build a consolidated model for predicting the amount of deposition, validated against high-fidelity experimental data. To understand and explain the boiling characteristics, high-speed visualization of bubble dynamics from a single artificial large cavity and from multiple naturally occurring cavities was conducted. A bubble growth and departure dynamics model was developed for artificial active sites and validated with the experimental data. The variation of bubble departure diameter with wall temperature was analyzed with the experimental results and shows coherence with earlier studies. However, deposit traces after the boiling experiments show that the bubble contact diameter, which various researchers have previously ignored, is essential for predicting bubble departure dynamics. The relationship between the porosity of colloid deposits and bubbles under the influence of Jakob number, subcooling, and particle size has been developed; this can be further utilized to vary the wettability of the surface. Designed porous surfaces can have a vast range of applications, from high wettability (e.g., high-critical-heat-flux boilers) to low wettability (e.g., efficient condensers).
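The total colloid-surface interaction potential described in this abstract combines van der Waals attraction with electrostatic double-layer repulsion, screened over the Debye length. A hedged sketch using the standard sphere-plate DLVO expressions (the Hogg-Healy-Fuerstenau double-layer form and all constants here are textbook assumptions for illustration, not the study's actual model or parameter values):

```python
import numpy as np

def total_potential(h, radius, hamaker, kappa, psi_p, psi_s,
                    eps=78.5 * 8.854e-12):
    """Illustrative DLVO-style sphere-plate interaction potential (J).
    h: separation (m); radius: particle radius (m); hamaker: Hamaker
    constant (J); kappa: inverse Debye length (1/m); psi_p, psi_s:
    particle and surface potentials (V); eps: permittivity of water."""
    # Non-retarded van der Waals attraction (sphere-plate)
    vdw = -hamaker * radius / (6.0 * h)
    # Screened double-layer interaction (Hogg-Healy-Fuerstenau form)
    x = np.exp(-kappa * h)
    edl = np.pi * eps * radius * (
        2 * psi_p * psi_s * np.log((1 + x) / (1 - x))
        + (psi_p**2 + psi_s**2) * np.log(1 - x**2))
    return vdw + edl
```

For like-charged particle and surface, the sum is attractive at contact but exhibits a repulsive barrier at moderate separation; whether an aggregated particle attaches or is removed depends on the sign and height of this barrier, which shifts with pH and ionic concentration through `kappa`.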
Abstract:
This article examines how media organisations are increasingly experimenting and innovating with interactive transmedia forms to explore issues around displacement and the ongoing migration crisis. I plan to interview a number of key industry figures with a view to understanding how and why journalists and producers are expanding the scope of factual storytelling beyond traditional media platforms. The article will include a number of industry case studies.
Abstract:
We build a system to support search and visualization over heterogeneous information networks. We first build our system on a specialized heterogeneous information network, DBLP. The system aims to give people, especially computer science researchers, a better understanding of and user experience with academic information networks. We then extend our system to the Web. Our results are more intuitive and informative than the simple top-k blue links from traditional search engines, and bring more meaningful structural results with correlated entities. We also investigate the ranking algorithm, and show that personalized PageRank and the proposed Hetero-personalized PageRank outperform TF-IDF ranking and a mixture of TF-IDF and authority ranking. Our work opens several directions for future research.
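Personalized PageRank, the baseline named above, replaces ordinary PageRank's uniform restart with a restart to a seed distribution, so scores concentrate around the query entity. A minimal power-iteration sketch (this is the standard algorithm, not the authors' Hetero-personalized variant, whose details the abstract does not give):

```python
import numpy as np

def personalized_pagerank(adj, seed, alpha=0.85, iters=100):
    """Power iteration for personalized PageRank on a dense adjacency
    matrix: with probability alpha the walker follows an out-edge,
    otherwise it restarts according to the seed distribution."""
    deg = adj.sum(axis=1, keepdims=True)           # out-degrees
    # Row-stochastic transition matrix; rows with no out-edges stay zero.
    P = np.divide(adj, deg, out=np.zeros_like(adj), where=deg > 0)
    r = np.full(adj.shape[0], 1.0 / adj.shape[0])  # initial distribution
    for _ in range(iters):
        r = alpha * (P.T @ r) + (1 - alpha) * seed
    return r
```

Scoring network entities this way favors nodes close to the seed (e.g., an author or paper in DBLP), which is what makes the ranking query-dependent rather than a single global authority order.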
Abstract:
Presentation from the MARAC conference in Roanoke, VA on October 7–10, 2015. S7 - The Interactive Experience: Exploring Technologies for Creating Touchscreen Exhibits.
Abstract:
Dissertation (master's), Universidade de Brasília, Faculdade Gama, Graduate Program in Biomedical Engineering (Programa de Pós-Graduação em Engenharia Biomédica), 2015.
Abstract:
Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to the electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved. To uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., the gender of a patient), or metrics about the timestamps themselves (e.g., the duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize either a purely statistical or a purely visual approach to comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in the findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question in mind and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery.
Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and on the backend (e.g., scalability challenges when running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences; (2) a statistical framework for exploratory data analysis using a method I refer to as high-volume hypothesis testing (HVHT); (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results; and (4) a user study, five long-term case studies, and five short-term case studies that demonstrate the utility and impact of these methods in various domains: four in medicine, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. This dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying these metrics and parsing their results. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts.
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
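High-volume hypothesis testing runs many cohort-comparison metrics at once, so deciding which of the resulting p-values are significant requires multiple-comparison control. As an illustrative sketch of that final step (the dissertation's actual HVHT procedure is not reproduced here), a Benjamini-Hochberg false-discovery-rate pass over a batch of test results:

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg FDR control: given p-values from many
    cohort-comparison tests, return the (sorted) indices of tests
    judged significant at false-discovery rate q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # ascending p-value
    threshold = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= q * rank / m:   # largest rank passing the BH line
            threshold = rank
    return sorted(order[:threshold])
```

Surfacing only the tests that survive such a correction is one way an interface can guide users to distinguishing features between cohorts without flooding them with spurious differences.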