837 results for Computer interfaces
Abstract:
This paper discusses and compares the use of vision-based and non-vision-based technologies in developing intelligent environments. By reviewing related projects that use vision-based techniques in intelligent environment design, the achieved functions, technical issues, and drawbacks of those projects are discussed and summarized, and potential solutions for future improvement are proposed, leading to the prospective direction of my PhD research.
Abstract:
Medical universities and teaching hospitals in Iraq are facing a lack of professional staff due to the ongoing violence that forces them to flee the country. These professionals are now distributed outside the country, which reduces the chances for staff and students to be physically in one place to continue teaching, and limits the efficiency of consultations in hospitals. A survey was conducted among students and professional staff in Iraq to identify the problems in the learning and clinical systems and how Information and Communication Technology could address them. The survey showed that 86% of the participants use the Internet as a learning resource and 25% for clinical purposes, while less than 11% use it for collaboration between different institutions. A web-based collaborative tool is proposed to improve the teaching and clinical systems. The tool helps users collaborate remotely to increase the quality of the learning system, and it can also be used for remote medical consultation in hospitals.
Abstract:
Interactions using a standard computer mouse can be particularly difficult for novice and older adult users. Tasks that involve positioning the mouse over a target and double-clicking to initiate some action can be a real challenge for many users. Hence, this paper describes a study that investigates the double-click interactions of older and younger adults and presents data that can help inform the development of methods of assistance. Twelve older adults (mean age = 63.9 years) and 12 younger adults (mean age = 20.8 years) performed click and double-click target selections with a computer mouse. Initial results show that older users make approximately twice as many errors as younger users when attempting double-clicks. For both age groups, the largest proportion of errors was due to difficulties with keeping the cursor steady between button presses. Compared with younger adults, older adults experienced more difficulties with performing two button presses within a required time interval. Understanding these interactions better is a step towards improving accessibility, and may provide some suggestions for future directions of research in this area.
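To make the two error modes reported above concrete, here is a minimal sketch (not from the paper) of how a two-press sequence might be classified using the failure modes the study describes: excessive cursor movement between presses, and presses too far apart in time. The thresholds and names are illustrative assumptions, not values from the study.

```python
# Illustrative sketch: classify a double-click attempt by the two error
# modes reported in the study. Thresholds are assumed, not from the paper.
from dataclasses import dataclass

MAX_INTERVAL_MS = 500   # assumed double-click time window
MAX_MOVEMENT_PX = 4     # assumed cursor-slip tolerance between presses

@dataclass
class ClickPair:
    interval_ms: float  # time between the two button presses
    movement_px: float  # cursor displacement between the two presses

def classify(pair: ClickPair) -> str:
    """Label a two-press sequence as a valid double-click or an error type."""
    if pair.movement_px > MAX_MOVEMENT_PX:
        return "error: cursor moved between presses"
    if pair.interval_ms > MAX_INTERVAL_MS:
        return "error: presses too far apart in time"
    return "valid double-click"

# Example: a slow second press would be flagged as a timing error.
print(classify(ClickPair(interval_ms=620, movement_px=2)))
```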
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
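Because G-Rex exposes a REST interface, a run can be monitored with nothing more than a basic HTTP client, as the abstract notes. The sketch below illustrates that idea in Python using only the standard library; the server URL, endpoint paths, and JSON fields are hypothetical illustrations, not the actual G-Rex API.

```python
# Minimal sketch of a lightweight REST client for monitoring a model run.
# All endpoints and field names here are assumptions for illustration only.
import json
import time
import urllib.request

BASE_URL = "http://grex-server.example.org/grex"   # hypothetical server URL

def get_json(path: str) -> dict:
    """Fetch a JSON resource from the (hypothetical) G-Rex service."""
    with urllib.request.urlopen(BASE_URL + path) as response:
        return json.loads(response.read().decode("utf-8"))

def monitor_run(run_id: str, poll_seconds: int = 60) -> None:
    """Poll a run until it finishes, mirroring how output is streamed back
    to the user's machine while the model is still executing."""
    while True:
        status = get_json(f"/runs/{run_id}")        # hypothetical endpoint
        print(f"run {run_id}: {status['state']}, "
              f"{status['files_transferred']} output files transferred")
        if status["state"] in ("finished", "failed"):
            break
        time.sleep(poll_seconds)

monitor_run("nemo-1deg-year-42")
```

The same pattern works from a browser or curl, which is the point of the REST design: clients stay simple and new interfaces can be written in any language with an HTTP library.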
Abstract:
A computer game was used to study psychophysiological reactions to emotion-relevant events. Two dimensions proposed by Scherer (1984a, 1984b) in his appraisal theory, the intrinsic pleasantness and goal conduciveness of game events, were studied in a factorial design. The relative level at which a player performed at the moment of an event was also taken into account. A total of 33 participants played the game while cardiac activity, skin conductance, skin temperature, and muscle activity as well as emotion self-reports were assessed. The self-reports indicate that game events altered levels of pride, joy, anger, and surprise. Goal conduciveness had little effect on muscle activity but was associated with significant autonomic effects, including changes to interbeat interval, pulse transit time, skin conductance, and finger temperature. The manipulation of intrinsic pleasantness had little impact on physiological responses. The results show the utility of attempting to manipulate emotion-constituent appraisals and measure their peripheral physiological signatures.
Abstract:
The EP2025 EDS project develops a highly parallel information server that supports established high-value interfaces. We describe the motivation for the project, the architecture of the system, and the design and application of its database and language subsystems. The Elipsys logic programming language, its advanced applications, EDS Lisp, and the Metal machine translation system are examined.
Abstract:
Mainframes and corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum', namely the conversion of raw data into information and then into high-grade knowledge. Its response to this, and to the underlying search problems, was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix, and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.
Abstract:
In order to gain a better understanding of online conceptual collaborative design processes, this paper investigates how student designers make use of a shared virtual synchronous environment when engaged in conceptual design. The software enables users to talk to each other and share sketches when they are remotely located. The paper describes a novel methodology for observing and analysing collaborative design processes by adapting the concepts of grounded theory. Rather than concentrating on narrow aspects of the final artefacts, emerging “themes” are generated that provide a broader picture of the collaborative design process and its context. Findings on the themes of “grounding – mutual understanding” and “support creativity” complement findings from other research, while important themes associated with “near-synchrony” have not been emphasised in other research. From the study, a series of design recommendations are made for the development of tools to support online computer-supported collaborative work in design using a shared virtual environment.
Abstract:
This paper presents recent research into the functions and value of sketch outputs during computer-supported collaborative design. Sketches made primarily using whiteboard technology are shown to support subjects engaged in remote collaborative design, particularly when constructed in ‘near-synchronous’ communication. The authors define near-synchronous communication and speculate that it is compatible with the reflective and iterative nature of design activity. There appear to be significant similarities between the making of sketches in near-synchronous remote collaborative design and those made on paper in more traditional face-to-face settings. With the current increase in the use of computer-supported collaborative working (CSCW) in undergraduate and postgraduate design education, it is proposed that sketches and sketching can make important contributions to design learning in this context.
Abstract:
This paper presents the findings from a study into the current exploitation of computer-supported collaborative working (CSCW) in design for the built environment in the UK. The research is based on responses to a web-based questionnaire. Members of various professions, including civil engineers, architects, building services engineers, and quantity surveyors, were invited to complete the questionnaire. The responses reveal important trends in the breadth and size of project teams, at the same time as new pressures emerge regarding team integration and efficiency. The findings suggest that while CSCW systems may improve project management (e.g., via project documentation) and the exchange of information between team members, they have yet to significantly support those activities that characterize integrated collaborative working between disparate specialists. The authors conclude by combining the findings with a wider discussion of the application of CSCW to design activity, appealing for CSCW to go beyond multidisciplinary working to achieve interdisciplinary working.
Abstract:
This paper describes a novel methodology for observing and analysing collaborative design using the concepts of cognitive dimensions related to concept-based misfit analysis. The study aims to gain insight into supporting the creative practice of graphical communication in the collaborative design processes of designers sketching within a shared whiteboard and audio-conferencing environment. Empirical data on design processes were obtained from observation of groups of student designers solving an interior space-planning problem of a lounge-diner in a shared virtual environment. The results of the study provide recommendations for the design and development of interactive systems to support such collaborative design activities.