926 results for brain, computer, interface
Abstract:
In power hardware-in-the-loop (PHIL) simulations, a real-time simulated power system is interfaced to a piece of hardware, usually called the hardware under test (HuT). A PHIL test can be realized using several simulation tools. Among them, the Real Time Digital Simulator (RTDS) is an ideal tool for performing complex power system simulations in near real-time. Stable operation of the entire system and the accuracy of the simulation results are the main concerns in a PHIL simulation. In this paper, a power network simulated on RTDS is interfaced to the HuT through a voltage source converter (VSC). Stability and other interface problems are studied, and a new method to stabilize some unstable PHIL cases is proposed. PHIL simulation results in PSCAD and RSCAD are presented.
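The abstract does not detail the interface algorithm or the proposed stabilisation method, so the following is only a point of reference: a toy discrete-time model of the common ideal transformer method (ITM) interface, whose loop gain is set by the ratio of the simulated source impedance to the hardware impedance. The values and structure below are assumptions for illustration, not the authors' RSCAD/PSCAD setup.

```python
# Toy discrete-time model of an ideal-transformer-method (ITM) PHIL interface.
# Assumed setup (not from the paper): a simulated Thevenin source (v_src, z_sim)
# drives a purely resistive hardware load z_hw; the hardware current is fed back
# to the simulation with a one-step delay. The loop gain is -z_sim / z_hw, so the
# interface is stable only when |z_sim / z_hw| < 1.

def run_itm(v_src, z_sim, z_hw, steps=10):
    i_fb = 0.0                               # delayed current feedback from hardware
    currents = []
    for _ in range(steps):
        v_interface = v_src - z_sim * i_fb   # simulation-side interface voltage
        i_hw = v_interface / z_hw            # current drawn by the hardware load
        currents.append(i_hw)
        i_fb = i_hw                          # one time-step feedback delay
    return currents

print("stable   (z_sim/z_hw = 0.5):", run_itm(100.0, 5.0, 10.0)[-3:])
print("unstable (z_sim/z_hw = 2.0):", run_itm(100.0, 20.0, 10.0)[-3:])
```

Practical stabilisation approaches typically add filtering or impedance compensation to this feedback path; the paper's specific method is not reproduced here.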
Abstract:
Previously, expected satiety (ES) has been measured using software and two-dimensional pictures presented on a computer screen. In this context, ES is an excellent predictor of self-selected portions, when quantified using similar images and similar software. In the present study we sought to establish the veracity of ES as a predictor of behaviours associated with real foods. Participants (N = 30) used computer software to assess their ES and ideal portion of three familiar foods. A real bowl of one food (pasta and sauce) was then presented and participants self-selected an ideal portion size. They then consumed the portion ad libitum. Additional measures of appetite, expected and actual liking, novelty, and reward, were also taken. Importantly, our screen-based measures of expected satiety and ideal portion size were both significantly related to intake (p < .05). By contrast, measures of liking were relatively poor predictors (p > .05). In addition, consistent with previous studies, the majority (90%) of participants engaged in plate cleaning. Of these, 29.6% consumed more when prompted by the experimenter. Together, these findings further validate the use of screen-based measures to explore determinants of portion-size selection and energy intake in humans.
Abstract:
This paper argues that relationships between countries and transnational corporations are not zero-sum games, but entail ‘complex governance’, where all actors must be considered in order to understand changes in the international system.
Abstract:
Background: Optimal adherence to antiretroviral therapy (ART) is necessary for people living with HIV/AIDS (PLHIV). There have been relatively few systematic analyses of factors that promote or inhibit adherence to antiretroviral therapy among PLHIV in Asia. This study assessed ART adherence and examined factors associated with suboptimal adherence in northern Viet Nam. Methods: Data from 615 PLHIV on ART in two urban and three rural outpatient clinics were collected by medical record extraction and from patient interviews using audio computer-assisted self-interview (ACASI). Results: The prevalence of suboptimal adherence was estimated to be 24.9% via a visual analogue scale (VAS) of past-month dose-missing and 29.1% using a modified Adult AIDS Clinical Trial Group scale for on-time dose-taking in the past 4 days. Factors significantly associated with the more conservative VAS score were: depression (p < 0.001), side-effect experiences (p < 0.001), heavy alcohol use (p = 0.001), chance health locus of control (p = 0.003), low perceived quality of information from care providers (p = 0.04) and low social connectedness (p = 0.03). Illicit drug use alone was not significantly associated with suboptimal adherence, but interacted with heavy alcohol use to reduce adherence (p < 0.001). Conclusions: This is the largest survey of ART adherence yet reported from Asia and the first in a developing country to use the ACASI method in this context. The evidence strongly indicates that ART services in Viet Nam should include screening and treatment for depression, linkage with alcohol and/or drug dependence treatment, and counselling to address the belief that chance or luck determines health outcomes.
Abstract:
This paper treats the blast response of a pile foundation in saturated sand using explicit nonlinear finite element analysis, considering the complex material behavior of the soil and soil–pile interaction. Blast wave propagation in the soil is studied, and the horizontal deformation of the pile and the effective stresses in the pile are presented. Results indicate that the upper part of the pile is vulnerable and that the pile response decays with distance from the explosive. The findings of this research provide valuable information on the effects of underground explosions on pile foundations and will guide the future development, validation and application of computer models.
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, reducing the utility of Monte Carlo simulation within the clinical environment. This work aims to improve that utility by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and through simpler, yet equivalent, alternative representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n as n cloud-based computers perform the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment, in this case Geometry & Tracking 4 (GEANT4), is also addressed in this work. At the simulation implementation level, a new computer-aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to three orders of magnitude performance improvement through the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer-aided design, the above optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling, for example, motion augmentation for time-dependent dose calculation. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like those made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
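The reported gain from replacing triangular surface meshes with tetrahedral meshes rests largely on the fact that testing whether a point lies inside a tetrahedron is a constant-time operation, whereas locating a point relative to a closed triangular surface requires casting rays against many triangles. The sketch below is a minimal numpy illustration of the standard barycentric point-in-tetrahedron test; it is not the GEANT4 navigator developed in the thesis, and the unit tetrahedron is chosen purely for illustration.

```python
import numpy as np

def point_in_tetrahedron(p, v0, v1, v2, v3, tol=1e-12):
    """Return True if point p lies inside the tetrahedron (v0, v1, v2, v3).

    Solves for the barycentric coordinates of p with respect to the
    tetrahedron; p is inside when all four coordinates are non-negative.
    """
    T = np.column_stack((v1 - v0, v2 - v0, v3 - v0))  # edge matrix
    lam = np.linalg.solve(T, p - v0)                  # lambda_1..lambda_3
    lam0 = 1.0 - lam.sum()                            # remaining coordinate
    return bool(lam0 >= -tol and np.all(lam >= -tol))

# Unit tetrahedron with vertices at the origin and on the coordinate axes.
v = [np.zeros(3), np.eye(3)[0], np.eye(3)[1], np.eye(3)[2]]
print(point_in_tetrahedron(np.array([0.1, 0.1, 0.1]), *v))  # True  (inside)
print(point_in_tetrahedron(np.array([0.6, 0.6, 0.6]), *v))  # False (outside)
```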
Abstract:
In this paper we describe the interaction afforded by a tag-based interface in online and mobile banking, and present our preliminary usability evaluation findings. We conducted a pilot usability study with a group of banking users, comparing the present 'conventional' interface with the tag-based interface. The results show that participants perceive the tag-based interface as more usable in both the online and mobile contexts. Participants also rated the tag-based interface better despite their unfamiliarity with it, and perceived it as more user-friendly. Additionally, the results highlight that tag-based interaction is more effective in the mobile context, especially for inexperienced mobile banking users. This in turn could have a positive effect on the adoption and acceptance of mobile banking in general, and in Australia specifically. We discuss our findings in more detail in the later sections of this paper and conclude with a discussion of future work.
Abstract:
Computer games have become a commonplace but engaging activity among students. Students enjoy playing computer games because they can virtually perform larger-than-life activities such as jumping from great heights, flying planes and racing cars; actions that are otherwise not possible in real life. Computer games also offer user interactivity, which gives them a certain appeal. Considering this appeal, educators should consider integrating computer games into student learning and encouraging students to author computer games of their own. It is thought that students can be engaged in learning by authoring and using computer games and can also gain essential skills such as collaboration, teamwork, problem solving and deductive reasoning. The research in this study revolves around building student engagement through the task of authoring computer games. The study aims to demonstrate how the creation and sharing of student-authored educational games might facilitate student engagement and how ICT (information and communication technology) plays a supportive role in student learning. Results from this study may lead to the broader integration of computer games into student learning and contribute to similar studies. In this qualitative case study, based in a state school in a low socio-economic area west of Brisbane, Australia, students were selected from both junior and senior secondary classes who had authored computer games as part of their ICT learning. Senior secondary students (Year 12 ICT) were given the task of programming the games, which were to be based on Mathematics learning topics, while the junior secondary students (Year 8 ICT) were given the task of creating multimedia elements for the games. A Mathematics teacher volunteered to assist in the project and provided guidance on the inclusion of suitable Mathematics curricular content in these computer games. The student-authored computer games were then used to support another group of Year 8 Mathematics students to learn the topics of Area, Volume and Time. Data was collected through interviews, classroom observations and artefacts. The teacher researcher, acting in the role of ICT teacher, coordinated with the students and the Mathematics teacher to conduct this study. Instrumental case study was applied as the research methodology and Third Generation Activity Theory served as the theoretical framework for this study. Data was analysed using qualitative coding procedures. Findings of this study indicate that having students author and play computer games promoted student engagement and that ICT played a supportive role in learning and allowed students to gain certain essential skills. Although this study suggests integrating computer games to support classroom learning, it cannot be presumed that computer games are an immediate solution for promoting student engagement.
Abstract:
Situated on YouTube and shown in various locations. In this video we show a 3D mock-up of a personal house-purchasing process. A path-traversal metaphor is used to give a sense of progression through the process stages. The intention is to enable console devices such as an Xbox to be used to consume business processes, so that businesses can expose their internal processes to consumers through sophisticated user interfaces. The demonstrator was developed using Microsoft XNA, with assistance from Suncorp Bank and the Smart Services CRC. More information at: www.bpmve.org
Abstract:
Stigmergy is a biological term originally used in discussing insect and swarm behaviour; it describes a model of environment-mediated communication in which artefacts are separated from agents. The phenomenon is demonstrated by ants, whose food foraging is supported by pheromone trails, and similarly by termites in their nest-building process. What is interesting about this mechanism is that highly organized societies are formed without an apparent central management function. We see design features in Web sites that mimic stigmergic mechanisms as part of the user interface, and we have created generalizations of these patterns. Software development and Web site development techniques have evolved significantly over the past 20 years. Recent progress in this area has produced languages for modelling web applications that capture the nuances specific to these developments. These modelling languages provide a suitable framework for building reusable components that encapsulate our design patterns of stigmergy. We hypothesize that incorporating stigmergy as a separate feature of a site's primary function will ultimately lead to enhanced user coordination.
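As a rough illustration of the mechanism, rather than of any component from the modelling languages discussed above, the sketch below mimics a pheromone trail: each user interaction deposits a signal on an artefact and the signal evaporates over time, so heavily used artefacts surface without any central ranking function. All names and rates are illustrative assumptions.

```python
# Minimal stigmergy sketch: artefacts accumulate "pheromone" from user visits
# and the signal evaporates each time step, so popular items surface without
# any central ranking function. Pages, rates and visit counts are illustrative.

EVAPORATION = 0.8   # fraction of the signal retained each time step
DEPOSIT = 1.0       # signal added per visit

pheromone = {"faq": 0.0, "pricing": 0.0, "contact": 0.0}

def visit(page):
    pheromone[page] += DEPOSIT

def tick():
    for page in pheromone:
        pheromone[page] *= EVAPORATION

# Simulate a burst of interest in "pricing", then one evaporation step.
for _ in range(5):
    visit("pricing")
visit("faq")
tick()

ranked = sorted(pheromone, key=pheromone.get, reverse=True)
print(ranked)  # ['pricing', 'faq', 'contact'] -- the environment carries the signal
```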
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. Complex computer models or codes are then used in place of physical experiments, leading to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues in designing computer simulations and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
Abstract:
Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set with the maximum determinant of the variance-covariance matrix of predictions.
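The first augmentation rule, adding one run at a time at the candidate point with the largest prediction variance, is straightforward to sketch with a Gaussian process emulator. The following uses scikit-learn with an assumed RBF kernel and a toy one-dimensional response standing in for the fire model; it is an illustrative sketch, not the code behind the paper's results.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy one-dimensional "computer model" standing in for the fire model.
def model(x):
    return np.sin(3 * x) + 0.5 * x

# Small initial design and a dense candidate set on [0, 2].
X = np.linspace(0.0, 2.0, 4).reshape(-1, 1)
candidates = np.linspace(0.0, 2.0, 201).reshape(-1, 1)

for _ in range(5):                         # add five extra runs, one at a time
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
    gp.fit(X, model(X).ravel())            # emulator fitted to the current design
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]     # point with maximum prediction variance
    X = np.vstack([X, x_new.reshape(1, -1)])

print(np.sort(X.ravel()))                  # the augmented design
```

The determinant-based alternative instead selects a set of candidate points that maximises the determinant of their joint prediction variance-covariance matrix, rather than adding one point at a time.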
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. Complex computer models or codes are then used in place of physical experiments, leading to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, the questions of how many runs a computer experiment requires and how it should be augmented are studied, and attention is given to the case where the response is a function over time.
Abstract:
Many software applications extend their functionality by dynamically loading executable components into their allocated address space. Such components, exemplified by browser plugins and other software add-ons, not only enable reusability but also promote programming simplicity, as they reside in the same address space as their host application, supporting easy sharing of complex data structures and pointers. However, such components are often of unknown provenance and quality and may be riddled with accidental bugs or, in some cases, deliberately malicious code. Statistics show that such component failures account for a high percentage of software crashes and vulnerabilities. Enabling isolation of such fine-grained components is therefore necessary to increase the stability, security and resilience of computer programs. This thesis addresses this issue by showing how host applications can create isolation domains for individual components, while preserving the benefits of a single address space, via a new architecture for software isolation called LibVM. Towards this end, we define a specification which outlines the functional requirements for LibVM, identify the conditions under which these requirements can be met, define an abstract Application Programming Interface (API) that encompasses the general problem of isolating shared libraries, thus separating policy from mechanism, and prove its practicality with two concrete implementations based on hardware virtualization and system call interposition, respectively. The results demonstrate that hardware isolation minimises the difficulties encountered with software-based approaches while also reducing the size of the trusted computing base, thus increasing confidence in the solution's correctness. This thesis concludes not only that it is feasible to create such isolation domains for individual components, but also that this should be a fundamental, operating-system-supported abstraction, which would lead to more stable and secure applications.
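LibVM's own API is not reproduced here. As a rough contrast, the sketch below shows the conventional alternative it seeks to improve upon: running an untrusted component out-of-process, so that a failure in the component cannot corrupt the host, at the cost of losing the single-address-space convenience (shared pointers and data structures) that LibVM preserves. The component and its behaviour are illustrative assumptions.

```python
# Conventional process-based isolation of an untrusted component (shown only
# as a contrast to LibVM's single-address-space approach): the component runs
# in a child process, so its failures cannot corrupt the host's memory.
import multiprocessing as mp

def untrusted_component(task):
    """Stand-in for a plugin of unknown provenance."""
    if task == "crash":
        raise RuntimeError("component failure")
    return task.upper()

def run_isolated(task, timeout=5):
    with mp.Pool(processes=1) as pool:
        try:
            return pool.apply_async(untrusted_component, (task,)).get(timeout)
        except Exception as exc:             # the host survives the component failure
            return f"component failed: {exc}"

if __name__ == "__main__":
    print(run_isolated("hello"))   # HELLO
    print(run_isolated("crash"))   # component failed: component failure
```

The marshalling overhead and loss of shared pointers visible here are exactly the costs LibVM's single-address-space isolation domains are designed to avoid.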