975 results for User Performance
Abstract:
The design of interfaces to facilitate user search has become critical for search engines, e-commerce sites, and intranets. This study investigated the use of targeted instructional hints to improve search by measuring their quantitative effects on users' performance and satisfaction. The effects of syntactic, semantic and exemplar search hints on user behavior were evaluated in an empirical investigation using naturalistic scenarios. Combining the three search hint components, each with two levels of intensity, in a factorial design generated eight search engine interfaces. Eighty participants took part in the study and each completed six realistic search tasks. Results revealed that the inclusion of search hints improved user effectiveness, efficiency and confidence when using the search interfaces, but with complex interactions that require specific guidelines for search interface designers. These design guidelines will allow search designers to create more effective interfaces for a variety of search applications.
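The eight interface conditions follow directly from crossing the three hint components at two intensity levels each (2 x 2 x 2 = 8). A minimal sketch of that enumeration, with hypothetical level labels (the study's actual level definitions may differ), could look like this:

```python
from itertools import product

# Hypothetical intensity labels; the study's actual level definitions may differ.
levels = ("low", "high")
factors = ("syntactic", "semantic", "exemplar")

# Cross the three hint components at two levels each: 2 * 2 * 2 = 8 interfaces.
interfaces = [dict(zip(factors, combo)) for combo in product(levels, repeat=3)]

for i, condition in enumerate(interfaces, start=1):
    print(f"Interface {i}: {condition}")

assert len(interfaces) == 8
```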
Abstract:
As end-user computing becomes more pervasive, an organization's success increasingly depends on the ability of end-users, usually in managerial positions, to extract appropriate data from both internal and external sources. Many of these data sources include or are derived from the organization's accounting information systems. Managerial end-users with different personal characteristics and approaches are likely to compose queries of differing levels of accuracy when searching the data contained within these accounting information systems. This research investigates how cognitive style elements of personality influence managerial end-user performance in database querying tasks. A laboratory experiment was conducted in which participants generated queries to retrieve information from an accounting information system to satisfy typical information requirements. The experiment investigated the influence of personality on the accuracy of queries of varying degrees of complexity. Based on the Myers–Briggs personality instrument, the results show that perceiving individuals (as opposed to judging individuals) who rely on intuition (as opposed to sensing) composed queries more accurately. As expected, query complexity and academic performance also explain the success of data extraction tasks.
Abstract:
The main objective of this degree project is to implement an Application Availability Monitoring (AAM) system named Softek EnView for Fujitsu Services. The aim of implementing the AAM system is to proactively identify end user performance problems, such as application and site performance, before the actual end users experience them. No matter how well applications and sites are designed and no matter how well they meet business requirements, they are useless to the end users if the performance is slow and/or unreliable. It is important for the customers to find out whether the end user problems are caused by the network or by application malfunction. Softek EnView comprises the following components: Robot, Monitor, Reporter, Collector and Repository. The implemented system, however, is designed to use only some of these EnView elements: Robot, Reporter and Repository. Robots can be placed at any key user location and are dedicated to customers, which means that as the number of customers increases, the number of Robots increases as well. To make the AAM system ideal for the company to use, it was integrated with Fujitsu Services' centralised monitoring system, BMC PATROL Enterprise Manager (PEM). That was, in fact, the reason for deciding to drop the EnView Monitor element. After the system was fully implemented, the AAM system was ready for production. Transactions were (and are) written and deployed on Robots to simulate typical end user actions. These transactions are configured to run at certain intervals, which are defined together with customers. While they are run against customers' applications automatically, the transactions continuously collect availability and response time data. In case of a failure in a transaction, the Robot immediately quits the transaction and writes detailed information to a log file about what went wrong and which element failed while going through the application. An alert is then generated by a BMC PATROL Agent based on this data and is sent to the BMC PEM. Fujitsu Services' monitoring room receives the alert and reacts to it according to the ITIL incident management process, alerting system specialists to critical incidents so that problems are resolved. Based on the data gathered by the Robots, weekly reports containing detailed statistics and trend analyses of the ongoing quality of IT services are provided to the customers.
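The proprietary EnView Robot scripts are not shown in the abstract, but the underlying idea, a scripted transaction that periodically exercises an application, records availability and response time, and logs details when a step fails, can be sketched roughly as follows. The URL, interval and log file name are illustrative placeholders, not the actual Fujitsu Services configuration.

```python
import logging
import time
import urllib.request

# Illustrative placeholders; the real robot transactions, targets and
# run intervals are defined together with each customer.
TARGET_URL = "http://example.com/app/login"
INTERVAL_SECONDS = 300
logging.basicConfig(filename="robot_transaction.log", level=logging.INFO)

def run_transaction(url: str) -> tuple[bool, float]:
    """Execute one synthetic transaction step and measure its response time."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=30) as response:
            ok = response.status == 200
    except Exception as exc:
        # On failure, abort the step and record what went wrong, analogous to
        # the Robot writing a detailed log entry that later drives an alert.
        logging.error("Transaction step failed for %s: %s", url, exc)
        return False, time.monotonic() - start
    return ok, time.monotonic() - start

# Runs indefinitely, like a monitoring robot scheduled at a fixed interval.
while True:
    available, elapsed = run_transaction(TARGET_URL)
    logging.info("available=%s response_time=%.3fs", available, elapsed)
    time.sleep(INTERVAL_SECONDS)
```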
Abstract:
The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. This research theorizes that the data structure that yields the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries from two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics. Each of the three metrics provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures. Organizations can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
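For readers unfamiliar with Halstead metrics: they are computed from counts of distinct and total operators and operands in a piece of code (here, a query). A minimal sketch, assuming the query has already been tokenized into operator and operand lists (the example tokenization below is hypothetical, not the paper's), could be:

```python
import math

def halstead_metrics(operators: list[str], operands: list[str]) -> dict[str, float]:
    """Compute standard Halstead metrics from operator/operand token lists."""
    n1, n2 = len(set(operators)), len(set(operands))   # distinct counts
    N1, N2 = len(operators), len(operands)             # total counts
    vocabulary = n1 + n2
    length = N1 + N2                                   # program length N
    volume = length * math.log2(vocabulary)            # V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)                  # D = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume                       # E = D * V
    return {"length": length, "volume": volume,
            "difficulty": difficulty, "effort": effort}

# Hypothetical tokenization of: SELECT name FROM customer WHERE balance > 1000
operators = ["SELECT", "FROM", "WHERE", ">"]
operands = ["name", "customer", "balance", "1000"]
print(halstead_metrics(operators, operands))
```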
Abstract:
In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numeric integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
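The abstract does not reproduce the recursion, but a Verhulst-type power update can be sketched as follows: treating each user's target power as the "carrying capacity" of the logistic (Verhulst) equation and discretizing with the Euler method gives an update of the form p_i[k+1] = p_i[k] + alpha * p_i[k] * (1 - gamma_i[k] / gamma_i*). The code below is an illustrative sketch under that assumption, with an arbitrary two-user gain matrix and parameters, not the paper's exact algorithm or settings.

```python
import numpy as np

# Illustrative two-user DS-CDMA setup; gains, noise and targets are arbitrary.
G = np.array([[1.0, 0.08],
              [0.12, 0.9]])          # G[i, j]: link gain from user j to receiver i
noise = 1e-3
gamma_target = np.array([5.0, 5.0])  # target SINRs
alpha = 0.5                          # Euler step / convergence factor
p = np.array([0.01, 0.01])           # initial transmit powers

def sinr(p: np.ndarray) -> np.ndarray:
    signal = np.diag(G) * p
    interference = G @ p - signal + noise
    return signal / interference

for _ in range(100):
    gamma = sinr(p)
    # Verhulst (logistic) dynamics discretized by Euler integration:
    # power grows while SINR is below target and shrinks when it is above.
    p = p + alpha * p * (1.0 - gamma / gamma_target)

print("final powers:", p)
print("final SINRs:", sinr(p))
```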
Abstract:
This paper examines the effects of information request ambiguity and construct incongruence on end users' ability to develop SQL queries with an interactive relational database query language. In this experiment, ambiguity in information requests adversely affected accuracy and efficiency. Incongruities among the information request, the query syntax, and the data representation adversely affected accuracy, efficiency, and confidence. The results for ambiguity suggest that organizations might elicit better query development if end users were sensitized to the nature of ambiguities that could arise in their business contexts. End users could translate natural language queries into pseudo-SQL that could be examined for precision before the queries were developed. The results for incongruence suggest that better query development might ensue if semantic distances could be reduced by giving users data representations and database views that maximize construct congruence for the kinds of queries in typical domains. (C) 2001 Elsevier Science B.V. All rights reserved.
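As a hypothetical illustration of the ambiguity point (not an example from the experiment), an information request such as "list customers with large orders" admits more than one pseudo-SQL reading; writing the candidates out makes the imprecision visible before the real query is developed. The schema names and threshold below are invented for illustration only.

```python
# Hypothetical request and candidate pseudo-SQL readings of it.
request = "List customers with large orders."

candidates = {
    "any single large order":
        "SELECT DISTINCT c.name FROM customer c JOIN orders o ON o.cust_id = c.id "
        "WHERE o.amount > 10000",
    "large total order volume":
        "SELECT c.name FROM customer c JOIN orders o ON o.cust_id = c.id "
        "GROUP BY c.name HAVING SUM(o.amount) > 10000",
}

for reading, pseudo_sql in candidates.items():
    print(f"{reading}:\n  {pseudo_sql}\n")
```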
Abstract:
Master's degree in Computer Engineering and Medical Instrumentation (Mestrado em Engenharia de Computação e Instrumentação Médica)
Abstract:
More and more research on Human-Computer Interaction (HCI) attempts to perform fine-grained analyses of the interaction in order to bring out what influences users' behaviour. Both in the evaluation of performance and of user experience, particular attention is now being paid to emotional and cognitive reactions during the interaction. Standard qualitative approaches are limited, because they rely on observation and on post-interaction interviews, which limits the precision of the diagnosis. Since user experience and emotional reactions are highly dynamic and contextualized in nature, evaluation approaches must be equally so in order to allow a precise diagnosis of the interaction. This thesis presents a quantitative and dynamic evaluation approach that contextualizes users' reactions in order to identify their antecedents in the interaction with a system. To this end, the work is organized around three axes: (1) automatic recognition of the user's goals and task structure, using eye-tracking measures and activity in the environment, through machine learning; (2) inference of psychological constructs (arousal, emotional valence and cognitive load) through the analysis of physiological signals; (3) diagnosis of the interaction based on the dynamic coupling of the two previous operations. The ideas and the development of our approach are illustrated by their application in two experimental contexts: e-commerce and simulation-based learning. We also present the complete software tool that was implemented to allow evaluation professionals (e.g. ergonomists, game designers, trainers) to use the proposed approach for HCI evaluation. It is designed to facilitate the triangulation of the measurement devices involved in this work and to integrate with classical interaction evaluation methods (e.g. questionnaires and coding of observations).
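A minimal sketch of the coupling step (axis 3), assuming the task recognizer outputs time-stamped segments and the physiological pipeline outputs time-stamped construct estimates, is shown below. All names and data are illustrative, not the thesis implementation.

```python
from statistics import mean

# Illustrative outputs: task segments (start, end, label) from the task
# recognizer, and time-stamped arousal estimates from the physiological pipeline.
task_segments = [(0.0, 12.5, "product search"), (12.5, 30.0, "checkout")]
arousal_samples = [(1.0, 0.2), (6.0, 0.3), (14.0, 0.7), (25.0, 0.8)]  # (time, value)

def couple(segments, samples):
    """Attribute each physiological estimate to the task segment it falls in."""
    coupled = {}
    for start, end, label in segments:
        values = [v for (t, v) in samples if start <= t < end]
        coupled[label] = mean(values) if values else None
    return coupled

print(couple(task_segments, arousal_samples))
# A higher mean arousal during "checkout" would point the analyst to that part
# of the interaction as a candidate antecedent of the user's reaction.
```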
Abstract:
The introduction of a new technology, High Speed Downlink Packet Access (HSDPA), in Release 5 of the 3GPP specifications raises the question of its performance capabilities. HSDPA is a promising technology which gives theoretical rates of up to 14.4 Mbit/s. The main objective of this thesis is to discuss the system level performance of HSDPA. The exploration mainly focuses on the Packet Scheduler because it is the central entity of the HSDPA design. Due to its function, the Packet Scheduler has a direct impact on the HSDPA system performance. Similarly, it also determines the end user performance, and more specifically the relative performance between the users in the cell. The thesis analyzes several Packet Scheduling algorithms that can optimize the trade-off between system capacity and end user performance for the traffic classes targeted in this thesis. The performance evaluation of the algorithms in the HSDPA system is carried out through computer-aided simulations under realistic conditions, so that the results give a precise picture of the algorithms' efficiency. The simulation of the HSDPA system and the algorithms is coded in C/C++.
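The abstract does not name the scheduling algorithms analyzed. As one representative example of the capacity versus per-user fairness trade-off it mentions, a Proportional Fair scheduler, widely studied for HSDPA, can be sketched as below; this is an illustrative simplification with placeholder rates, not the thesis implementation.

```python
import random

# Illustrative Proportional Fair scheduling loop over HSDPA TTIs.
NUM_USERS = 4
BETA = 0.02                      # averaging factor for the throughput filter
avg_throughput = [1e-6] * NUM_USERS

def instantaneous_rates() -> list[float]:
    # Placeholder for per-TTI achievable rates derived from CQI reports.
    return [random.uniform(0.1, 14.4) for _ in range(NUM_USERS)]  # Mbit/s

for tti in range(1000):
    rates = instantaneous_rates()
    # PF metric: instantaneous rate divided by exponentially averaged throughput.
    scheduled = max(range(NUM_USERS), key=lambda u: rates[u] / avg_throughput[u])
    for u in range(NUM_USERS):
        served = rates[u] if u == scheduled else 0.0
        avg_throughput[u] = (1 - BETA) * avg_throughput[u] + BETA * served

print("long-run average throughputs (Mbit/s):",
      [round(t, 2) for t in avg_throughput])
```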
Abstract:
IPTV is now offered by several operators in Europe, the US and Asia using broadcast video over private IP networks that are isolated from the Internet. IPTV services rely on transmission of live (real-time) video and/or stored video. Video on Demand (VoD) and Time-shifted TV are implemented by IP unicast, while Broadcast TV (BTV) and Near Video on Demand are implemented by IP multicast. IPTV services require QoS guarantees and can tolerate no more than 10⁻⁶ packet loss probability, 200 ms delay, and 50 ms jitter. Low delay is essential for satisfactory trick mode performance (pause, resume, fast forward) for VoD, and for fast channel change time for BTV. Internet Traffic Engineering (TE) is defined in RFC 3272 and involves both capacity management and traffic management. Capacity management includes capacity planning, routing control, and resource management. Traffic management includes (1) nodal traffic control functions such as traffic conditioning, queue management, and scheduling, and (2) other functions that regulate traffic flow through the network or that arbitrate access to network resources. An IPTV network architecture includes multiple networks (core network, metro network, access network and home network) that connect devices (super head-end, video hub office, video serving office, home gateway, set-top box). Each IP router in the core and metro networks implements some queueing and packet scheduling mechanism at the output link controller. Popular schedulers in IP networks include Priority Queueing (PQ), Class-Based Weighted Fair Queueing (CBWFQ), and Low Latency Queueing (LLQ), which combines PQ and CBWFQ. The thesis analyzes several packet scheduling algorithms that can optimize the trade-off between system capacity and end user performance for the traffic classes. Previously, the FIFO, PQ and GPS queueing methods had been implemented in the simulator. This thesis aims to implement the LLQ scheduler in the simulator and to evaluate the performance of these packet schedulers. The simulator was provided by Ernst Nordström; it was built in the Visual C++ 2008 environment and tested and analyzed in MATLAB 7.0 under Windows Vista.
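Since LLQ is described as PQ combined with CBWFQ, its dequeue logic can be summarized as: always serve the priority queue first, otherwise pick among the class queues according to their configured weights. The sketch below is a deliberately simplified, software-only illustration (a weighted random choice stands in for the router's actual fair-queueing machinery, and the class names and weights are placeholders).

```python
import random
from collections import deque

# Simplified Low Latency Queueing: one strict-priority queue (e.g. IPTV video)
# plus weighted class-based queues, mirroring PQ + CBWFQ.
priority_q = deque()                     # latency-sensitive traffic
class_queues = {"data": deque(), "best_effort": deque()}
weights = {"data": 3, "best_effort": 1}  # CBWFQ-style bandwidth shares

def enqueue(packet, klass=None):
    (priority_q if klass is None else class_queues[klass]).append(packet)

def dequeue():
    """Serve the priority queue first; otherwise pick a class by weight."""
    if priority_q:
        return priority_q.popleft()
    backlogged = [k for k, q in class_queues.items() if q]
    if not backlogged:
        return None
    # Weighted choice approximates the fair-queueing share between classes.
    chosen = random.choices(backlogged, weights=[weights[k] for k in backlogged])[0]
    return class_queues[chosen].popleft()

enqueue("video frame")                  # priority traffic
enqueue("file chunk", klass="data")
print(dequeue(), "|", dequeue())
```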
Abstract:
As digital systems move away from traditional desktop setups, new interaction paradigms are emerging that better integrate with users' real-world surroundings and better support users' individual needs. While promising, these modern interaction paradigms also present new challenges, such as a lack of paradigm-specific tools to systematically evaluate and fully understand their use. This dissertation tackles this issue by framing empirical studies of three novel digital systems in embodied cognition – an exciting new perspective in cognitive science where the body and its interactions with the physical world take a central role in human cognition. This is achieved, first, by focusing the design of all these systems on tangible interaction, a contemporary interaction paradigm that emphasizes physical interaction; and second, by comprehensively studying user performance in these systems through a set of novel performance metrics grounded in epistemic actions, a relatively well established and studied construct in the literature on embodied cognition. The first system presented in this dissertation is an augmented Four-in-a-row board game. Three different versions of the game were developed, based on three different interaction paradigms (tangible, touch and mouse), and a repeated measures study involving 36 participants measured the occurrence of three simple epistemic actions across these three interfaces. The results highlight the relevance of epistemic actions in such a task and suggest that the different interaction paradigms afford instantiation of these actions in different ways. Additionally, the tangible version of the system supports the most rapid execution of these actions, providing novel quantitative insights into the real benefits of tangible systems. The second system presented in this dissertation is a tangible tabletop scheduling application. Two studies with single and paired users provide several insights into the impact of epistemic actions on the user experience when these are performed outside of a system's sensing boundaries. These insights concern the form, size and location of ideal interface areas for such offline epistemic actions to occur, as well as how physical tokens can be designed to better support them. Finally, and based on the results obtained to this point, the last study presented in this dissertation directly addresses the lack of empirical tools to formally evaluate tangible interaction. It presents a video-coding framework grounded in a systematic literature review of 78 papers, and evaluates its value as a metric through a 60-participant study performed across three different research laboratories. The results highlight the usefulness and power of epistemic actions as a performance metric for tangible systems. In sum, through the use of such novel metrics in each of the three studies presented, this dissertation provides a better understanding of the real impact and benefits of designing and developing systems that feature tangible interaction.
Abstract:
Healthcare, Human-Computer Interfaces (HCI), security and biometrics are the most promising application scenarios directly involved in the evolution of Body Area Networks (BANs). Both wearable devices and sensors directly integrated in garments envision a world in which each of us is supervised by an invisible assistant monitoring our health and daily-life activities. New opportunities are enabled by improvements in sensor miniaturization and in the transmission efficiency of wireless protocols, which have made it possible to integrate high computational power aboard independent, energy-autonomous, small form factor devices. Application purposes are various: (I) data collection for off-line knowledge discovery; (II) notification of the user's activities or of dangerous situations; (III) biofeedback rehabilitation; (IV) remote alarm activation in case the subject needs assistance; (V) introduction of a more natural interaction with the surrounding computerized environment; (VI) user identification by physiological or behavioral characteristics. Telemedicine and mHealth [1] are two of the leading concepts directly related to healthcare. The capability to wear unobtrusive devices supports users' autonomy. A new sense of freedom is given to the user, supported not only by psychological help but by a real improvement in safety. Furthermore, the medical community aims at the introduction of new devices to innovate patient treatment, in particular by extending ambulatory analysis to real-life scenarios through continuous acquisition. The wide diffusion of emerging wellness portable equipment has also extended the usability of wearable devices to fitness and training, by monitoring user performance on the task at hand. The learning of correct execution techniques related to work, sport or music can be supported by an electronic trainer furnishing adequate aid. HCIs made real the concepts of Ubiquitous and Pervasive Computing and Calm Technology introduced in 1988 by Mark Weiser and John Seely Brown. They promote the creation of pervasive environments that enhance the human experience. Context-aware, adaptive and proactive environments serve and help people by becoming sensitive and reactive to their presence, since electronics is ubiquitous and deployed everywhere. In this thesis we pay attention to the integration of all the aspects involved in the development of a BAN. Starting from the choice of sensors, we design the node, configure the radio network, implement real-time data analysis and provide feedback to the user. We present algorithms to be implemented in a wearable assistant for posture and gait analysis and to provide assistance under different walking conditions, preventing falls. Our aim, expressed by the idea of contributing to the development of non-proprietary solutions, drove us to integrate commercial and standard solutions in our devices. We use sensors available on the market and avoided designing specialized sensors in ASIC technologies. We employ standard radio protocols and open source projects wherever possible. The specific contributions of the PhD research activities are presented and discussed in the following.
• We have designed and built several wireless sensor nodes providing both sensing and actuation capabilities, with a focus on flexibility, small form factor and low power consumption. The key idea was to develop a simple and general purpose architecture for rapid analysis, prototyping and deployment of BAN solutions. Two different sensing units are integrated: kinematic (3D accelerometer and 3D gyroscopes) and kinetic (foot-floor contact pressure forces). Two kinds of feedback were implemented: audio and vibrotactile.
• Since the system built is a suitable platform for testing and measuring the features and constraints of a sensor network (radio communication, network protocols, power consumption and autonomy), we compared Bluetooth and ZigBee performance in terms of throughput and energy efficiency. Field tests evaluated their usability in the fall detection scenario.
• To prove the flexibility of the architecture designed, we implemented a wearable system for human posture rehabilitation. The application was developed in conjunction with biomedical engineers, who provided the audio algorithms to furnish biofeedback to the user about his/her stability.
• We explored off-line gait analysis of collected data, developing an algorithm to detect foot inclination in the sagittal plane during walking (a simple illustrative sketch is given after this abstract).
• In collaboration with the Wearable Lab – ETH, Zurich, we developed an algorithm to monitor the user during several walking conditions in which the user carries a load.
The remainder of the thesis is organized as follows. Chapter I gives an overview of Body Area Networks (BANs), illustrating the relevant features of this technology and the key challenges still open. It concludes with a short list of real solutions and prototypes proposed by academic research and manufacturers. The domain of posture and gait analysis, and the methodologies and technologies used to provide real-time feedback on detected events, are illustrated in Chapter II. Chapters III and IV, respectively, present BANs developed to detect falls and to monitor gait, taking advantage of two inertial measurement units and baropodometric insoles. Chapter V reports an audio-biofeedback system to improve balance based on information about the user's centre of mass. A walking assistant based on a KNN classifier to detect walking alterations under load carriage is described in Chapter VI.
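As an illustration of the off-line sagittal-plane foot-inclination idea referenced above (the thesis algorithm itself is not reproduced here), the pitch angle can be estimated by fusing the gyroscope's angular rate with the accelerometer's gravity reference in a simple complementary filter; all sample data and the filter constant below are placeholders.

```python
import math

# Placeholder IMU samples: sagittal-plane gyro rate (rad/s) and accelerometer
# components (m/s^2) along the foot's forward (x) and vertical (z) axes.
samples = [
    {"gyro_y": 0.10, "acc_x": 0.9, "acc_z": 9.7},
    {"gyro_y": 0.35, "acc_x": 2.1, "acc_z": 9.4},
    {"gyro_y": 0.20, "acc_x": 1.5, "acc_z": 9.6},
]
DT = 0.01        # sample period (s), placeholder
ALPHA = 0.98     # complementary filter constant, placeholder

pitch = 0.0
for s in samples:
    gyro_pitch = pitch + s["gyro_y"] * DT                 # integrate angular rate
    acc_pitch = math.atan2(s["acc_x"], s["acc_z"])        # gravity-based estimate
    pitch = ALPHA * gyro_pitch + (1 - ALPHA) * acc_pitch  # fuse the two estimates
    print(f"estimated sagittal foot inclination: {math.degrees(pitch):.2f} deg")
```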
Abstract:
The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining quickly the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
In this paper, a heterogeneous network composed of femtocells deployed within a macrocell network is considered, and a quality-of-service (QoS)-oriented fairness metric which captures important characteristics of tiered network architectures is proposed. Using homogeneous Poisson processes, the sum capacities in such networks are expressed in closed form for co-channel, dedicated channel, and hybrid resource allocation methods. Then a resource splitting strategy that simultaneously considers capacity maximization, fairness constraints, and QoS constraints is proposed. Detailed computer simulations utilizing 3GPP simulation assumptions show that a hybrid allocation strategy with a well-designed resource split ratio enjoys the best cell-edge user performance, with minimal degradation in the sum throughput of macrocell users when compared with that of co-channel operation.
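To make the resource-splitting idea concrete, a hybrid strategy can be framed as a one-dimensional search over the spectrum split ratio that maximizes sum capacity subject to a cell-edge (QoS/fairness) constraint. The sketch below uses placeholder capacity and cell-edge-rate functions rather than the paper's closed-form Poisson-process expressions; all numbers are illustrative.

```python
# Illustrative search for a hybrid macro/femto spectrum split ratio rho.
# The capacity and edge-rate functions are placeholder models only.

def macro_capacity(rho: float) -> float:
    return 40.0 * rho                  # Mbit/s available to macrocell users

def femto_capacity(rho: float) -> float:
    return 120.0 * (1.0 - rho)         # Mbit/s available to femtocell users

def edge_rate(rho: float) -> float:
    return 2.0 * rho                   # Mbit/s for a macrocell cell-edge user

MIN_EDGE_RATE = 1.0                    # QoS constraint (Mbit/s), placeholder

best_rho, best_sum = None, -1.0
for step in range(101):
    rho = step / 100.0
    if edge_rate(rho) < MIN_EDGE_RATE:
        continue                       # violates the cell-edge QoS constraint
    total = macro_capacity(rho) + femto_capacity(rho)
    if total > best_sum:
        best_rho, best_sum = rho, total

print(f"best split ratio: {best_rho}, sum capacity: {best_sum} Mbit/s")
```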