562 results for brain, computer, interface
Abstract:
Hosted on YouTube, and shown at various venues. In this video we show a 3D mock-up of a personal house-purchasing process. A path-traversal metaphor is used to give a sense of progression through the process stages. The intention is to enable console devices such as an Xbox to be used to consume business processes, so that businesses can expose their internal processes to consumers through sophisticated user interfaces. The demonstrator was developed using Microsoft XNA, with assistance from Suncorp Bank and the Smart Services CRC. More information at: www.bpmve.org
Abstract:
Stigmergy is a biological term originally used when discussing insect or swarm behaviour, and describes a model supporting environment-based communication that separates artefacts from agents. This phenomenon is demonstrated in the behaviour of ants and their food foraging supported by pheromone trails, or similarly in termites and their nest-building process. What is interesting about this mechanism is that highly organised societies are formed without an apparent central management function. We see design features in Web sites that mimic stigmergic mechanisms as part of the user interface, and we have created generalisations of these patterns. Software development and Web site development techniques have evolved significantly over the past 20 years. Recent progress in this area has produced languages for modelling Web applications that accommodate the nuances specific to such developments. These modelling languages provide a suitable framework for building reusable components that encapsulate our design patterns of stigmergy. We hypothesise that incorporating stigmergy as a feature separate from a site’s primary function will ultimately lead to enhanced user coordination.
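The core of stigmergy, agents coordinating only through marks left in a shared environment, can be shown in a few lines. The following is a minimal sketch of our own (the grid, food cell and evaporation rate are illustrative assumptions, not taken from the paper): agents on a ring deposit a pheromone-like value when they find food, and other agents bias their movement toward stronger trails.

```python
import random

# Minimal stigmergy sketch (illustrative toy, not from the paper):
# agents communicate solely via marks in a shared environment.

GRID = 10                      # 1-D ring environment of GRID cells
pheromone = [0.0] * GRID       # the shared artefact layer
FOOD = 7                       # cell an agent reinforces when it finds food

def step(position):
    """Move one agent one cell, preferring the neighbour with more pheromone."""
    left, right = (position - 1) % GRID, (position + 1) % GRID
    weights = [pheromone[left] + 0.1, pheromone[right] + 0.1]
    position = random.choices([left, right], weights=weights)[0]
    if position == FOOD:
        pheromone[position] += 1.0   # deposit: environment-based signal
    return position

random.seed(0)
agents = [0, 3, 5]
for _ in range(500):
    pheromone = [p * 0.99 for p in pheromone]   # evaporation
    agents = [step(a) for a in agents]

# With no central controller, the trail concentrates around the food cell.
print(max(range(GRID), key=lambda i: pheromone[i]))
```

The point of the sketch is the architecture: `step` never inspects other agents, only the environment, which is exactly the artefact/agent separation the abstract describes.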
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. The use of complex computer models, or codes, rather than physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues in designing computer simulations and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
Abstract:
Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph is provided relating the error of prediction to the sample size, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set with the maximum determinant of the variance-covariance matrix of predictions.
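The simplest augmentation rule described above, repeatedly adding the candidate point whose prediction variance is largest, can be sketched with a basic Gaussian-process emulator. This is a hedged illustration of ours, not the paper's code: the RBF kernel, length-scale and candidate grid are all assumptions.

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential covariance between two 1-D point sets."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length ** 2))

def prediction_variance(X, cands, noise=1e-8):
    """GP posterior variance at candidate points given design X."""
    K = rbf(X, X) + noise * np.eye(len(X))
    k = rbf(X, cands)                       # cross-covariances
    # posterior variance: k(x,x) - k(x,X) K^{-1} k(X,x), with k(x,x) = 1
    return 1.0 - np.sum(k * np.linalg.solve(K, k), axis=0)

X = np.array([0.1, 0.5, 0.9])               # initial design
candidates = np.linspace(0.0, 1.0, 101)
for _ in range(3):                          # add three runs, one at a time
    var = prediction_variance(X, candidates)
    X = np.append(X, candidates[np.argmax(var)])

print(np.sort(X))   # the added points fall in the largest gaps of the design
```

Note that the variance criterion depends only on input locations, not on observed responses, which is why the design can be augmented before any new runs are executed.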
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. The use of complex computer models, or codes, rather than physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using them. In particular, the question of how many computer experiments to run and how they should be augmented is studied, and attention is given to the case where the response is a function over time.
Abstract:
Many software applications extend their functionality by dynamically loading executable components into their allocated address space. Such components, exemplified by browser plugins and other software add-ons, not only enable reusability, but also promote programming simplicity, as they reside in the same address space as their host application, supporting easy sharing of complex data structures and pointers. However, such components are also often of unknown provenance and quality and may be riddled with accidental bugs or, in some cases, deliberately malicious code. Statistics show that such component failures account for a high percentage of software crashes and vulnerabilities. Enabling isolation of such fine-grained components is therefore necessary to increase the stability, security and resilience of computer programs. This thesis addresses this issue by showing how host applications can create isolation domains for individual components, while preserving the benefits of a single address space, via a new architecture for software isolation called LibVM. Towards this end, we define a specification which outlines the functional requirements for LibVM, identify the conditions under which these functional requirements can be met, define an abstract Application Programming Interface (API) that encompasses the general problem of isolating shared libraries, thus separating policy from mechanism, and prove its practicality with two concrete implementations based on hardware virtualization and system call interpositioning, respectively. The results demonstrate that hardware isolation minimises the difficulties encountered with software based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution’s correctness. 
This thesis concludes that, not only is it feasible to create such isolation domains for individual components, but that it should also be a fundamental operating system supported abstraction, which would lead to more stable and secure applications.
Abstract:
As buildings have become more advanced and complex, our ability to understand how they are operated and managed has diminished. Modern technologies have given us systems that look after us, but they appear to have taken away our say in how we would like our environment to be managed. The aim of this paper is to discuss our research concerning spaces that are sensitive to changing needs and allow building users a certain level of freedom to understand and control their environment. We discuss why what we call the Active Layer is needed in modern buildings; how building inhabitants are to interact with it; and the development of interface prototypes to test the consequences of having the Active Layer in our environment.
Abstract:
With a monolayer honeycomb lattice of sp2-hybridized carbon atoms, graphene has demonstrated exceptional electrical, mechanical and thermal properties. One of its promising applications is the creation of graphene-polymer nanocomposites with tailored mechanical and physical properties. In general, the mechanical properties of the graphene nanofiller as well as the graphene-polymer interface govern the overall mechanical performance of graphene-polymer nanocomposites. However, the strengthening and toughening mechanisms in these novel nanocomposites are not yet well understood. In this work, the deformation and failure of the graphene sheet and the graphene-polymer interface were investigated using molecular dynamics (MD) simulations, as was the effect of structural defects on their mechanical properties. The results showed that structural defects in graphene (e.g. Stone-Wales and multi-vacancy defects) can significantly reduce the fracture strength of graphene itself, yet a composite may still make full use of the remaining strength of defective graphene, preserving the interfacial strength and the overall mechanical performance of graphene-polymer nanocomposites.
Abstract:
Trauma education, both formal and informal, is an essential component of professional development for both nursing and medical staff working in the Emergency Department. Ideally, this education will be multidisciplinary. As a result, the day-to-day aspects of emergency care, such as teamwork and crew resource management, are maintained.
Abstract:
Molecular-level computer simulations of restricted water diffusion can be used to develop models for relating diffusion tensor imaging measurements of anisotropic tissue to microstructural tissue characteristics. The diffusion tensors resulting from these simulations can then be analyzed in terms of their relationship to the structural anisotropy of the model used. As the translational motion of water molecules is essentially random, their dynamics can be effectively simulated using computers. In addition to modeling water dynamics and water-tissue interactions, the simulation software of the present study was developed to automatically generate collagen fiber networks from user-defined parameters. This flexibility provides the opportunity for further investigations of the relationship between the diffusion tensor of water and morphologically different models representing different anisotropic tissues.
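The link between simulated random motion and the diffusion tensor can be made concrete with a toy simulation of our own (not the study's software; the step sizes, molecule count and axis of restriction are illustrative assumptions). Hindering displacements along one axis mimics water diffusing between aligned fibres, and the estimated tensor reflects that anisotropy.

```python
import numpy as np

# Toy anisotropic random walk (illustrative, not the study's software):
# estimate a diffusion tensor from simulated molecular displacements.

rng = np.random.default_rng(1)
n_molecules, n_steps, dt = 5000, 200, 1.0

# Free Gaussian steps along x and y, hindered steps along z.
steps = rng.normal(0.0, [1.0, 1.0, 0.3], size=(n_steps, n_molecules, 3))
displacement = steps.sum(axis=0)            # net displacement per molecule

t = n_steps * dt
# Einstein relation: D = <dr dr^T> / (2 t), averaged over molecules
D = (displacement.T @ displacement) / (n_molecules * 2 * t)

print(np.round(np.diag(D), 3))   # Dzz comes out clearly smaller than Dxx, Dyy
```

In a full simulation the restriction would come from explicit water-fibre interactions rather than a reduced step size, but the tensor estimation step is the same.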
Abstract:
The invention relates to a method for monitoring user activity on a mobile device comprising an input and an output unit, comprising the following steps, preferably in the following order: detecting and/or logging user activity on said input unit, identifying a foreground running application, hashing a user-interface-element management list of the foreground running application, and creating a screenshot comprising items displayed on said input unit. The invention also relates to a method for analyzing user activity at a server, comprising the following step: obtaining, from a mobile device, at least one of information about detected and/or logged user activity, information about a foreground running application, a hashed user-interface-element management list, and a screenshot. Further, a computer program product is provided, comprising one or more computer readable media having computer executable instructions for performing the steps of at least one of the aforementioned methods.
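The hashing step in the claimed method can be sketched as follows. This is a hypothetical illustration of ours (the function name, element descriptors and choice of SHA-256 are assumptions, not from the patent): reducing the foreground application's UI-element list to a digest lets a server detect screen changes without receiving the full element tree.

```python
import hashlib

def hash_ui_elements(elements):
    """Return a stable digest for an ordered list of UI element descriptors.

    Hypothetical sketch: names and format are illustrative only.
    """
    digest = hashlib.sha256()
    for element in elements:
        digest.update(element.encode("utf-8"))
        digest.update(b"\x00")        # separator avoids concatenation ambiguity
    return digest.hexdigest()

before = hash_ui_elements(["Button:Send", "EditText:Message"])
after = hash_ui_elements(["Button:Send", "EditText:Message", "Dialog:Error"])
print(before != after)   # prints True: a changed UI yields a different digest
```

The separator byte matters: without it, `["ab", "c"]` and `["a", "bc"]` would hash identically.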
Abstract:
Public libraries and coworking spaces seek means to facilitate peer collaboration, peer inspiration and the cross-pollination of skills and creativity. However, social learning, inspiration and collaboration between coworkers do not come naturally. In (semi-)public spaces in particular, the behavioural norm among unacquainted coworkers is to work in individual silos without taking advantage of social learning or collaboration opportunities. This paper presents results from a pilot study of ‘Gelatine’ – a system that facilitates shared encounters between coworkers by allowing them to digitally ‘check in’ at a work space. Gelatine displays the skills, areas of interest, and needs of currently present coworkers on a public screen. The results indicate that the system amplifies users’ sense of place and awareness of other coworkers, and serves as an interface for social learning through exploratory, opportunistic and serendipitous inspirations, as well as by helping users identify like-minded peers for follow-up face-to-face encounters. We discuss how Gelatine is perceived by users with different pre-entry motivations, as well as users’ challenges and non-use of the system.
Abstract:
It has long been a concern that the wider uptake of the YAWL environment may have been hindered by the usability issues identified in the current Process Editor. As a consequence, it was decided that the Editor be completely rewritten to address those usability limitations. The result has been the implementation of a new YAWL Process Editor architecture that creates a clear separation between the User Interface component layer and the core processing back end, facilitating the redesign of the default user interface. This new architecture also supports the development of multiple User Interface front ends for specific contexts that take advantage of the core capabilities the new Editor architecture has to offer.
Abstract:
Designing across cultures requires considerable attention to inter-relational design methods that facilitate mutual exploration, learning and trust. Many Western design practices were born of a different model, utilizing approaches for the design team to rapidly gain insight into “users” in order to deliver concepts and prototypes, with little attention paid to different cultural understandings of being, knowledge, participation and life beyond the design project. This paper describes a project that intends to create and grow a sustainable set of technology-assisted communication practices for the Warnindilyakwa people of Groote Eylandt in the form of digital noticeboards. Rather than academic practices of workshops, interviews, probes or theoretical discourses that emphasize an outside-in perspective, we emphasize building upon local designs and practices. Our team combines bilingual members from the local Land Council in collaboration with academics from a remote urban university two thousand kilometers away. We contribute an approach of digitally growing existing local practices and materials in order to explore viable, innovative and sustainable technical solutions from this perspective.
Abstract:
This paper discusses computer mediated distance learning on a Master's level course in the UK and student perceptions of this as a quality learning environment.