986 results for Computer users
Abstract:
For several years, online educational tools such as Blackboard have been used by universities to foster collaborative learning in an online setting. Such tools tend to be implemented in a top-down fashion, with the institution providing the tool to the students and instructing them to use it. Recently, however, a more informal, bottom-up approach is increasingly being employed by the students themselves in the form of social networks such as Facebook. With over 9,000 registered Facebook users at the University of Reading alone at the beginning of this study, rising to over 12,000, Facebook is becoming the de facto social network of choice for higher education students in the UK, and there was increasing anecdotal evidence that students were actively learning via Facebook rather than through Blackboard. To test the validity of these anecdotes, a questionnaire was sent to students, asking them about their learning experiences via Blackboard and Facebook. The results show that students are making use of the tools available to them even when there is no formal academic content, and that increased use of a social networking tool is correlated with a reported increase in learning as a result of that use.
Abstract:
Context-aware multimodal interactive systems aim to adapt to the needs and behavioural patterns of users and offer a way forward for enhancing the efficacy and quality of experience (QoE) in human-computer interaction. The various modalities that contribute to such systems each provide a specific uni-modal response that is integratively presented as a multimodal interface, capable of interpreting multimodal user input and responding to it appropriately through dynamically adapted multimodal interactive flow management. This paper presents an initial background study in the context of the first phase of a PhD research programme in the area of optimisation of data fusion techniques to serve multimodal interactive systems, their applications and requirements.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
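Step (3) above is the only change a scientist makes to an existing workflow: the local model launcher is swapped for the G-Rex client. A minimal Python sketch of that substitution is shown below; the "mpirun" and "GRexRun" names come from the abstract, while the example script contents are invented for illustration.

```python
import re

def grexify(script_text: str) -> str:
    """Rewrite model launch calls in a workflow script so the model runs
    via G-Rex instead of locally, as in step (3) above.  Only the
    launcher command is replaced; pre- and post-processing calls in the
    script are left untouched."""
    # Replace the whole word 'mpirun' wherever it launches the model.
    return re.sub(r"\bmpirun\b", "GRexRun", script_text)

# Hypothetical fragment of a NEMO workflow script:
workflow = "mpirun -np 40 ./nemo.exe\n./postprocess.sh year1\n"
print(grexify(workflow))
```

The post-processing line survives unchanged, which is the point of the design: the G-Rex client slots into existing scripts rather than replacing them.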
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
Abstract:
The Global Ocean Data Assimilation Experiment (GODAE [http://www.godae.org]) has spanned a decade of rapid technological development. The ever-increasing volume and diversity of oceanographic data produced by in situ instruments, remote-sensing platforms, and computer simulations have driven the development of a number of innovative technologies that are essential for connecting scientists with the data that they need. This paper gives an overview of the technologies that have been developed and applied in the course of GODAE, which now provide users of oceanographic data with the capability to discover, evaluate, visualize, download, and analyze data from all over the world. The key to this capability is the ability to reduce the inherent complexity of oceanographic data by providing a consistent, harmonized view of the various data products. The challenges of data serving have been addressed over the last 10 years through the cooperative skills and energies of many individuals.
Abstract:
A computer game was used to study psychophysiological reactions to emotion-relevant events. Two dimensions proposed by Scherer (1984a, 1984b) in his appraisal theory, the intrinsic pleasantness and goal conduciveness of game events, were studied in a factorial design. The relative level at which a player performed at the moment of an event was also taken into account. A total of 33 participants played the game while cardiac activity, skin conductance, skin temperature, and muscle activity as well as emotion self-reports were assessed. The self-reports indicate that game events altered levels of pride, joy, anger, and surprise. Goal conduciveness had little effect on muscle activity but was associated with significant autonomic effects, including changes to interbeat interval, pulse transit time, skin conductance, and finger temperature. The manipulation of intrinsic pleasantness had little impact on physiological responses. The results show the utility of attempting to manipulate emotion-constituent appraisals and measure their peripheral physiological signatures.
Abstract:
Following the 1995 “pill scare” relating to the risk of venous thrombosis from taking second- or third-generation oral contraceptives, the Committee on Safety of Medicines (CSM) withdrew their earlier recommended restrictions on the use of third-generation pills and published recommended wording to be used in patient information leaflets. However, the effectiveness of this wording has not been tested. An empirical study (with 186 pill users, past users, and non-users) was conducted to assess understanding, based on this wording, of the absolute and relative risk of thrombosis in pill users and in pregnancy. The results showed that less than 12% of women in the (higher education) group fully understood the absolute levels of risk from taking the pill and from being pregnant. Relative risk was also poorly understood, with less than 40% of participants showing full understanding, and 20% showing no understanding. We recommend that the CSM revisit the wording currently provided to millions of women in the UK.
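The distinction the study tests is between absolute risk (how many women are affected per year) and relative risk (how many times more likely one situation is than another). The sketch below works through that arithmetic with purely illustrative figures; the numbers are not taken from the CSM wording or from this study.

```python
# Illustrative figures ONLY (not the CSM's or the study's): suppose
# venous thrombosis affects 15 per 100,000 pill users per year versus
# 60 per 100,000 pregnant women per year.
pill_risk = 15 / 100_000       # absolute annual risk on the pill
pregnancy_risk = 60 / 100_000  # absolute annual risk in pregnancy

# Relative risk compares the two absolute risks as a ratio.
relative_risk = pregnancy_risk / pill_risk

print(f"Absolute risk on the pill: {pill_risk * 100_000:.0f} per 100,000 per year")
print(f"Pregnancy is {relative_risk:.0f}x riskier than the pill")
```

A statement such as "pregnancy is four times riskier" (relative risk) can sound alarming or reassuring depending on the reader, while saying nothing about how small both absolute risks are, which is exactly the comprehension gap the questionnaire measured.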
Abstract:
In this paper we describe how we generated written explanations to ‘indirect users’ of a knowledge-based system in the domain of drug prescription. We call ‘indirect users’ the intended recipients of explanations, to distinguish them from the prescriber (the ‘direct’ user) who interacts with the system. The Explanation Generator was designed after several studies about indirect users' information needs and physicians' explanatory attitudes in this domain. It integrates text planning techniques with ATN-based surface generation. A double modeling component enables adapting the information content, order and style to the indirect user to whom explanation is addressed. Several examples of computer-generated texts are provided, and they are contrasted with the physicians' explanations to discuss advantages and limits of the approach adopted.
Abstract:
Three experiments examine the effect of different forms of computer-generated advice on concurrent and subsequent performance of individuals controlling a simulated intensive-care task. Experiment 1 investigates the effect of optional and compulsory advice and shows that both result in an improvement in subjects' performance while receiving the advice, and also in an improvement in subsequent unaided performance. However, although the advice compliance displayed by the optional advice group shows a strong correlation with subsequent unaided performance, compulsory advice has no extra benefit over the optional use of advice. Experiment 2 examines the effect of providing users with on-line explanations of the advice, as well as providing less specific advice. The results show that both groups perform at the same level on the task as the advice groups from Experiment 1, although subjects receiving explanations scored significantly higher on a written post-task questionnaire. Experiment 3 investigates in more detail the relationship between advice compliance and performance. The results reveal a complex relationship between natural ability on the task and the following of advice, in that people who use the advice more tend to perform either better or worse than the more moderate users. The theoretical and practical implications of these experiments are discussed.
Abstract:
The ultimate criterion of success for interactive expert systems is that they will be used, and used to effect, by individuals other than the system developers. A key ingredient of success in most systems is involving users in the specification and development of systems as they are being built. However, until recently, system designers have paid little attention to ascertaining user needs and to developing systems with corresponding functionality and appropriate interfaces to match those requirements. Although the situation is beginning to change, many developers do not know how to go about involving users, or else tackle the problem in an inadequate way. This paper discusses the need for user involvement and considers why many developers are still not involving users in an optimal way. It looks at the different ways in which users can be involved in the development process and describes how to select appropriate techniques and methods for studying users. Finally, it discusses some of the problems inherent in involving users in expert system development, and recommends an approach which incorporates both ethnographic analysis and formal user testing.
Abstract:
This paper presents recent research into the functions and value of sketch outputs during computer-supported collaborative design. Sketches made primarily using whiteboard technology are shown to support subjects engaged in remote collaborative design, particularly when constructed in 'near-synchronous' communication. The authors define near-synchronous communication and speculate that it is compatible with the reflective and iterative nature of design activity. There appear to be significant similarities between the making of sketches in near-synchronous remote collaborative design and those made on paper in more traditional face-to-face settings. Given the current increase in the use of computer-supported collaborative working (CSCW) in undergraduate and postgraduate design education, it is proposed that sketches and sketching can make important contributions to design learning in this context.
Abstract:
This paper presents the findings from a study into the current exploitation of computer-supported collaborative working (CSCW) in design for the built environment in the UK. The research is based on responses to a web-based questionnaire. Members of various professions, including civil engineers, architects, building services engineers, and quantity surveyors, were invited to complete the questionnaire. The responses reveal important trends in the breadth and size of project teams at the same time as new pressures are emerging regarding team integration and efficiency. The findings suggest that while CSCW systems may improve project management (e.g., via project documentation) and the exchange of information between team members, they have yet to significantly support those activities that characterize integrated collaborative working between disparate specialists. The authors conclude by combining the findings with a wider discussion of the application of CSCW to design activity, appealing for CSCW to go beyond multidisciplinary working to achieve interdisciplinary working.
Abstract:
This paper describes a novel methodology for observing and analysing collaborative design by using the concepts of cognitive dimensions related to concept-based misfit analysis. The study aims to gain insight into how the creative practice of graphical communication can be supported in the collaborative design processes of designers sketching within a shared whiteboard and audio-conferencing environment. Empirical data on design processes have been obtained from observation of groups of student designers solving an interior space-planning problem of a lounge-diner in a shared virtual environment. The results of the study provide recommendations for the design and development of interactive systems to support such collaborative design activities.