Abstract:
This paper describes an experiment developed to study the performance of animated virtual agent cues within digital interfaces. Agents are increasingly used in virtual environments as part of the branding process and to guide user interaction. However, the level of agent detail required to establish and enhance efficient allocation of attention remains unclear. Although complex agent motion is now possible, it is costly to implement and should therefore only be adopted routinely if a clear benefit can be shown. Previous methods of assessing the effect of gaze cueing as a solution to scene complexity have relied principally on two-dimensional static scenes and manual peripheral inputs. Two experiments were run to address the question of agent cues in human-computer interfaces, measuring the efficiency of agent cues through participant responses made by gaze and by touch respectively. In the first experiment, an eye-movement recorder was used to directly assess the immediate overt allocation of attention by capturing the participant's eye fixations following presentation of a cueing stimulus. We found that a fully animated agent could speed up user interaction with the interface: when user attention was directed using a fully animated agent cue, users responded 35% faster than with stepped 2-image agent cues and 42% faster than with a static 1-image cue. The second experiment recorded participant responses on a touch screen using the same agent cues. Analysis of the touch inputs confirmed the results of the gaze experiment: the fully animated agent again produced the shortest response times, although the differences between conditions were smaller. Responses to the fully animated agent were 17% and 20% faster than to the 2-image and 1-image cues respectively. These results inform techniques aimed at engaging users' attention in complex scenes, such as computer games and digital transactions in public or social interaction contexts, by demonstrating the benefits of dynamic gaze and head cueing directly on users' eye movements and touch responses.
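To make the reported comparisons concrete, the sketch below shows one way the percentage speedups could be computed from mean response times, interpreting "X% faster" as a reduction relative to the slower condition. The response-time values are hypothetical placeholders chosen only so the output reproduces the reported 35% and 42% figures; they are not the study's data.

```python
# Minimal sketch (hypothetical numbers) of how percentage speedups between
# cue conditions could be computed from mean response times. The actual
# response-time values are not given in the abstract; the figures below are
# placeholders for illustration only.

def percent_faster(rt_fast: float, rt_slow: float) -> float:
    """Percentage by which rt_fast improves on rt_slow."""
    return 100.0 * (rt_slow - rt_fast) / rt_slow

# Hypothetical mean response times (ms) per cue condition.
rt_animated = 650.0    # fully animated agent cue
rt_two_image = 1000.0  # stepped 2-image cue
rt_static = 1120.0     # static 1-image cue

print(f"vs 2-image cue: {percent_faster(rt_animated, rt_two_image):.0f}% faster")
print(f"vs 1-image cue: {percent_faster(rt_animated, rt_static):.0f}% faster")
```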
Abstract:
Reflective modulators based on the combination of an electroabsorption modulator (EAM) and a semiconductor optical amplifier (SOA) are attractive devices for applications in long-reach carrier-distributed passive optical networks (PONs) due to the gain provided by the SOA and the high-speed, low-chirp modulation of the EAM. Integrated R-EAM-SOAs have experimentally shown two unexpected and unintuitive characteristics which are not observed in a single-pass transmission SOA: the clamping of the output power of the device around a maximum value, and low patterning distortion despite the SOA being in a regime of gain saturation. In this thesis a detailed analysis is carried out, using both experimental measurements and modelling, in order to understand these phenomena. For the first time it is shown that both the internal loss between the SOA and the R-EAM and the SOA gain play an integral role in the behaviour of gain-saturated R-EAM-SOAs. Internal loss and SOA gain are also optimised for use in carrier-distributed PONs in order to access both the positive effect of output power clamping, and hence upstream dynamic range reduction, and low-patterning operation of the SOA. Reflective concepts are also gaining interest for metro transport networks and for short-reach, high-bit-rate, inter-datacentre links. Moving the optical carrier generation away from the transmitter also has potential advantages for these applications, as it avoids the need for cooled photonics placed directly on hot router line-cards. A detailed analysis is carried out in this thesis on a novel colourless reflective duobinary modulator, which would enable wavelength flexibility in a power-efficient reflective metro node.
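For readers unfamiliar with the regime referred to above, the standard textbook single-pass relation for SOA gain saturation is given below as background only; it is not the thesis's reflective-device model, and the abstract notes that the R-EAM-SOA departs from single-pass behaviour. It relates the saturated gain G to the unsaturated gain G_0, the input power P_in and the saturation power P_sat.

```latex
% Standard single-pass saturated-gain relation for an SOA (background only;
% the reflective R-EAM-SOA analysed in the thesis behaves differently).
G = G_0 \exp\!\left(-\frac{(G - 1)\,P_{\mathrm{in}}}{P_{\mathrm{sat}}}\right)
```

As P_in grows, G compresses from G_0 towards unity; per the abstract, the clamping and low-patterning behaviour of the reflective device arises from the interplay of the SOA gain and the internal loss between the SOA and the R-EAM in the double-pass geometry.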
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, because there is a real-world need to provide a solution for this domain.
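As an illustration of the production / interpretation / consumption workflow with a maintainable provenance record described above, the following is a minimal sketch. All names and structures here are hypothetical and purely illustrative; they do not reflect the platform's actual API or implementation.

```python
# Illustrative sketch of a produce / interpret / consume workflow in which
# every analysis step is logged to a provenance record, so that different
# techniques can be applied to the same raw data and audited independently.
# All class and function names are hypothetical, not the platform's API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Callable, List


@dataclass
class ProvenanceEntry:
    step: str        # workflow stage that produced this record
    technique: str   # analysis technique applied (kept explicit, not implicit)
    timestamp: str
    note: str = ""


@dataclass
class DataItem:
    payload: Any
    provenance: List[ProvenanceEntry] = field(default_factory=list)

    def record(self, step: str, technique: str, note: str = "") -> None:
        self.provenance.append(ProvenanceEntry(
            step, technique, datetime.now(timezone.utc).isoformat(), note))


def produce(raw: Any) -> DataItem:
    # Acquire raw data free of any technique-specific constraints.
    item = DataItem(payload=raw)
    item.record("production", "raw-capture")
    return item


def interpret(item: DataItem, technique: str, fn: Callable[[Any], Any]) -> DataItem:
    # Apply one analysis technique; the original raw data is untouched,
    # and the application is logged so third parties can audit the derivation.
    derived = DataItem(payload=fn(item.payload), provenance=list(item.provenance))
    derived.record("interpretation", technique)
    return derived


def consume(item: DataItem) -> None:
    print("result:", item.payload)
    for entry in item.provenance:
        print("  provenance:", entry.step, "|", entry.technique, "|", entry.timestamp)


# Usage: the same raw stream interpreted by two independent techniques.
raw_signal = [1.0, 2.5, 2.0, 3.5]
item = produce(raw_signal)
consume(interpret(item, "moving-average", lambda xs: sum(xs) / len(xs)))
consume(interpret(item, "peak-detection", lambda xs: max(xs)))
```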
Abstract:
Twitter has changed the dynamic of the academic conference. Before Twitter, delegate participation was primarily dependent on attendance, and feedback was limited to a post-event survey. With Twitter, delegates have become active participants: they pass comment, share reactions and critique presentations, all the while generating a running commentary. This study examines this phenomenon using the Academic & Special Libraries (A&SL) conference 2015 (hashtag #asl2015) as a case study. A post-conference survey was undertaken asking delegates how and why they used Twitter at #asl2015. A content and conceptual analysis of tweets was conducted using Topsy and Storify. This analysis examined how delegates interacted with presentations, which sessions generated the most activity on the timeline, and the type of content shared. Actual tweet activity and volume per presentation were compared to survey responses. Finally, recommendations on Twitter engagement for conference organisers and presenters are provided.