937 results for Illiac computer Programming.
Abstract:
In this chapter we aim to explore how videogames can lead to improvements in wellbeing. Following Keyes (2007) and Huppert and So (2012), we view wellbeing as a multidimensional concept with both hedonic and eudaimonic aspects, and we take a broad approach to exploring the impact of videogames on the psychological, social, and physical components of wellbeing. We explore how videogames have been shown to have an impact in each of these domains. Although there is a great deal of evidence for the actual and potential positive impacts of videogames, many questions remain unanswered about the situations in which videogame play is likely to affect wellbeing, and about which aspects of wellbeing are likely to be affected. We conclude the chapter by outlining the key questions for future research. Our focus in this chapter is on the positive influences of videogames; we do not explore research on contexts in which negative impacts are possible, or on subgroups for which videogames could cause harm. However, these questions are obviously important, and we see balanced engagement with age-appropriate videogames as a key prerequisite for any of the wellbeing benefits discussed below.
Abstract:
In this paper we propose a novel approach to multi-action recognition that performs joint segmentation and classification. The approach models each action with a Gaussian mixture built on robust low-dimensional action features. Segmentation is achieved by performing classification on overlapping temporal windows, which are then merged to produce the final result. This approach is considerably less complicated than previous methods that use dynamic programming or computationally expensive hidden Markov models (HMMs). Initial experiments on a stitched version of the KTH dataset show that the proposed approach achieves an accuracy of 78.3%, outperforming a recent HMM-based approach which obtained 71.2%.
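As a rough illustration of this window-based scheme, the sketch below trains one Gaussian mixture per action, classifies overlapping temporal windows, and merges per-frame votes into a final segmentation. The synthetic features, window sizes, and merging rule are invented for illustration; scikit-learn's GaussianMixture stands in for the paper's action models.

```python
# Minimal sketch: joint segmentation and classification via per-class
# Gaussian mixtures over overlapping temporal windows (toy data).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy low-dimensional "action features": one GMM per action class.
train = {
    "walk": rng.normal(0.0, 1.0, size=(200, 4)),
    "box":  rng.normal(3.0, 1.0, size=(200, 4)),
}
models = {a: GaussianMixture(n_components=2, random_state=0).fit(X)
          for a, X in train.items()}

# A stitched sequence: 100 frames of "walk" followed by 100 of "box".
seq = np.vstack([rng.normal(0.0, 1.0, (100, 4)),
                 rng.normal(3.0, 1.0, (100, 4))])

# Classify overlapping windows, then merge the results frame by frame.
win, step = 20, 5
votes = [[] for _ in range(len(seq))]
for start in range(0, len(seq) - win + 1, step):
    window = seq[start:start + win]
    # Pick the class whose model gives the window the highest mean
    # log-likelihood (GaussianMixture.score averages over samples).
    label = max(models, key=lambda a: models[a].score(window))
    for t in range(start, start + win):
        votes[t].append(label)

# Majority vote per frame yields the final segmentation and labels.
labels = [max(set(v), key=v.count) for v in votes]
print(labels[95:105])  # transition appears around frame 100
```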
Abstract:
The introduction of dynamic pricing in today's retail electricity market considerably affects customers by increasing the cost of their energy consumption. Customers are therefore compelled to control their loads in response to price variation. This paper proposes a new Home Energy Management technique that helps customers minimize their cost of energy consumption by appropriately controlling their loads. The study focuses on Thermostatically Controllable Appliances (TCAs), such as air conditioners and water heaters, as they account for more than 50% of total household energy consumption. The control process uses stochastic dynamic programming, which incorporates uncertainties in price and demand variation and leads to an accurate selection of appliance settings. This is followed by real-time control of the selected appliances at their optimal settings. Temperature set points of the TCAs are adjusted based on a price droop that reflects the actual cost of energy consumption, and customer satisfaction is maintained within limits using constrained optimization. It is shown that considerable energy savings are achieved.
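A minimal sketch of the dynamic-programming idea follows, under toy assumptions invented for illustration: a single air conditioner, a two-scenario price model, simplified one-degree thermal dynamics, and a comfort penalty standing in for the customer-satisfaction constraint.

```python
# Toy stochastic DP for one thermostatically controllable appliance.
# State: discretized indoor temperature; action: AC on/off; uncertainty:
# two equally likely price scenarios (expectation taken over price).
import numpy as np

T = 24                        # hourly horizon
temps = np.arange(18, 29)     # discretized indoor temperatures (deg C)
power_kw = 2.0                # appliance draw when running

# Two equally likely hourly price scenarios ($/kWh), purely illustrative.
price_scenarios = np.array([
    0.10 + 0.15 * np.sin(np.linspace(0.0, np.pi, T)),
    0.12 + 0.25 * np.sin(np.linspace(0.0, np.pi, T)),
])

def step(temp, on):
    # Toy thermal dynamics: the room cools when the AC runs, drifts up otherwise.
    return int(np.clip(temp - 2 if on else temp + 1, temps[0], temps[-1]))

def comfort_penalty(temp):
    # Penalize deviation from a 22 deg C set point (customer satisfaction).
    return 0.5 * abs(temp - 22)

V = np.zeros((T + 1, len(temps)))               # value function, zero terminal cost
policy = np.zeros((T, len(temps)), dtype=bool)  # optimal on/off decision

for t in reversed(range(T)):
    exp_price = price_scenarios[:, t].mean()    # expectation over price scenarios
    for i, temp in enumerate(temps):
        costs = []
        for on in (False, True):
            nxt = step(temp, on)
            run_cost = power_kw * exp_price if on else 0.0
            costs.append(run_cost + comfort_penalty(nxt) + V[t + 1, nxt - temps[0]])
        policy[t, i] = costs[1] < costs[0]      # run only when it is cheaper overall
        V[t, i] = min(costs)

print("hour-0 policy:", dict(zip(temps.tolist(), policy[0].tolist())))
```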
Abstract:
We propose a method for learning specific object representations that can be applied (and reused) in visual detection and identification tasks. A machine learning technique called Cartesian Genetic Programming (CGP) is used to create these models from a series of images. Our research investigates how manipulation actions might allow for the development of better visual models and therefore better robot vision. This paper describes how visual object representations can be learned and improved by performing object manipulation actions, such as poke, push and pick-up, with a humanoid robot. The improvement can be measured, allowing the robot to select and perform the 'right' action, i.e. the action yielding the best possible improvement of the detector.
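For readers unfamiliar with CGP, the toy sketch below shows how a CGP genotype encodes a feed-forward graph of primitive image operations. The function set, encoding details, and genotype are generic illustrations, not the paper's actual detector model or fitness procedure.

```python
# Toy CGP evaluation: a genotype is a list of (function, input_a, input_b)
# nodes appended after the input nodes; node outputs feed later nodes.
import numpy as np

FUNCS = [np.add, np.subtract, np.maximum, np.minimum]

def evaluate(genotype, inputs, n_outputs=1):
    """Evaluate the graph; the last n_outputs node values are the outputs."""
    values = list(inputs)                   # nodes 0..k-1 are the inputs
    for f_idx, a, b in genotype:            # later nodes apply primitives
        values.append(FUNCS[f_idx](values[a], values[b]))
    return values[-n_outputs:]

# Two toy 4x4 "image channels" as program inputs.
rng = np.random.default_rng(1)
img_a, img_b = rng.random((4, 4)), rng.random((4, 4))

# Hypothetical genotype: node 2 = max(in0, in1); node 3 = node2 - in0.
genotype = [(2, 0, 1), (1, 2, 0)]
(out,) = evaluate(genotype, [img_a, img_b])
print(out.shape)  # (4, 4) response map, usable as a simple detector output
```

Evolution would then mutate such genotypes and keep those whose response maps score best on the detection task.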
Abstract:
The standard method for deciding bit-vector constraints is via eager reduction to propositional logic. This is usually done after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by doing many satisfiability checks, each of which only reasons about a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches, and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems none of the eager solvers can and that the portfolio solver outperforms other solvers both in terms of total number of problems solved and the time taken to solve them.
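The portfolio idea can be sketched as follows. The two solver functions here are hypothetical stand-ins, not LBV's actual interface: the point is simply that the first solver to finish decides the result.

```python
# Minimal portfolio sketch: run a lazy and an eager solver on the same
# problem in parallel and return whichever finishes first.
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait
import time

def eager_solve(problem):
    time.sleep(0.2)          # stand-in for rewriting + bit-blasting + SAT
    return ("eager", "sat")

def lazy_solve(problem):
    time.sleep(0.1)          # stand-in for many small satisfiability checks
    return ("lazy", "sat")

def portfolio(problem):
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = {pool.submit(s, problem) for s in (eager_solve, lazy_solve)}
        done, not_done = wait(futures, return_when=FIRST_COMPLETED)
        for f in not_done:
            f.cancel()       # best effort; a real portfolio kills processes
        return next(iter(done)).result()

print(portfolio("bv-benchmark"))   # -> ('lazy', 'sat') in this toy setup
```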
Abstract:
Programming is a subject that many beginning students find difficult. The PHP Intelligent Tutoring System (PHP ITS) has been designed to make it easier for novices to learn the PHP language in order to develop dynamic web pages. Because programming requires practice, any ITS that supports students learning to program must include practical exercises. The PHP ITS works by providing exercises for students to solve and then giving feedback based on their solutions. The major challenge is to identify the many semantically equivalent solutions to a single exercise. The PHP ITS achieves this by using theories of Artificial Intelligence (AI), including first-order predicate logic and classical and hierarchical planning, to model the subject matter taught by the system. This paper highlights the approach taken by the PHP ITS to analyse students' programs that include a number of program constructs used by beginners of web development. The PHP ITS was built using this model and evaluated in a unit at the Queensland University of Technology. The results showed that it correctly analysed over 96% of the solutions to exercises supplied by students.
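As a highly simplified illustration of what "semantically equivalent solutions" means, the sketch below treats two abstract programs as equivalent when they achieve the same set of first-order-style effect predicates. The PHP ITS itself reasons with classical and hierarchical planning over a far richer model; the step names and predicates here are invented.

```python
# Toy equivalence check: map a program to the set of predicates it achieves,
# then compare sets. Syntactically different programs can be "the same".

def effects(program_steps):
    """Map a sequence of abstract steps to the predicates they achieve."""
    achieved = set()
    for step in program_steps:
        if step[0] == "assign":
            achieved.add(("has_value", step[1]))
        elif step[0] == "echo":
            achieved.add(("printed", step[1]))
    return achieved

# Two syntactically different solutions to "print a variable's value".
solution_a = [("assign", "x"), ("echo", "x")]
solution_b = [("assign", "x"), ("assign", "x"), ("echo", "x")]  # redundant step

print(effects(solution_a) == effects(solution_b))  # True: same effects
```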
Abstract:
This thesis articulates and examines public engagement programming in an emerging, non-traditional site. As a practice-led research project, the creative work proposes a site-responsive, engagement-centric, agile model for curatorial programming that developed out of the dynamic, new media/digital curatorial practice at QUT's Creative Industries Precinct. The model and its accompanying exegetical framework, Curating in Uncharted Territories, offer a theoretically informed approach to programming, delivering and reporting for curatorial practices in non-traditional sites of public engagement. The research provides the foundation for full development of the model and the basis for further research.
Abstract:
Emergency Response Teams increasingly use interactive technology to help manage information and communications. The challenge is to maintain high situation awareness across different sizes of interactive devices. This research compared a handheld interactive device, in the form of an iPad, with a large interactive multi-touch tabletop. A search and rescue inspired simulator was designed to test operator situation awareness on the two device sizes. The results show that operators had better situation awareness on the tabletop device when the operation involved detecting moving targets, searching target locations, distinguishing target types, and comprehending displayed information.
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technology. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems today are largely computerised, we are still undertaking commissioning testing using the same philosophy, as if each signal were hard wired. This is slow and tedious and does not do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it "abstracts" the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology, rather than the traditional positional method (as in DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with this data within a tightly bound and compliant IEC 61850 Substation Automation System could completely revolutionise system testing when compared to traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we can demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. This tool has been implemented in a Java programming environment using an open source IEC 61850 library to facilitate the server-client association with the relay.
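The core of such a check can be sketched as follows: parse the data sets declared in the CID/SCL file and compare them with those read back from the device. The device-side read is stubbed out below, since the thesis tool performed it over MMS via an open source IEC 61850 library whose exact API is not assumed here; the SCL parsing uses the standard SCL namespace and FCDA attributes.

```python
# Sketch of the commissioning check: declared data sets (CID/SCL) versus
# data sets actually configured on the relay.
import xml.etree.ElementTree as ET

NS = {"scl": "http://www.iec.ch/61850/2003/SCL"}

def datasets_from_scl(path):
    """Return {dataset name: frozenset of FCDA member attribute tuples}."""
    root = ET.parse(path).getroot()
    result = {}
    for ds in root.findall(".//scl:DataSet", NS):
        members = frozenset(
            (f.get("ldInst"), f.get("lnClass"), f.get("doName"), f.get("fc"))
            for f in ds.findall("scl:FCDA", NS))
        result[ds.get("name")] = members
    return result

def datasets_from_relay(address):
    """Stub: the real tool retrieves this from the live relay over MMS
    using an IEC 61850 client library (assumed, not shown here)."""
    raise NotImplementedError

def check_relay(cid_path, relay_address):
    expected = datasets_from_scl(cid_path)
    actual = datasets_from_relay(relay_address)
    # The relay passes only if every declared data set matches exactly.
    return {name: expected.get(name) == actual.get(name)
            for name in expected.keys() | actual.keys()}
```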
Abstract:
A description of a computer program to analyse cine angiograms of the heart and pressure waveforms to calculate valve gradients.
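As a generic illustration only (not the described program): a transvalvular gradient is the pressure difference across a valve while it is open, so peak and mean gradients can be computed from two sampled pressure traces. The waveforms below are synthetic stand-ins for digitised catheter recordings.

```python
# Toy peak/mean valve gradient from two sampled pressure waveforms.
import numpy as np

t = np.linspace(0.0, 1.0, 500)                  # one cardiac cycle (s)
s = np.maximum(np.sin(2 * np.pi * t), 0.0)      # systolic pressure shape
ventricular = 100.0 * s                         # toy LV pressure (mmHg)
aortic = np.maximum(75.0 * s, 65.0)             # toy aortic pressure (mmHg)

open_valve = ventricular > aortic               # ejection period samples
gradient = ventricular[open_valve] - aortic[open_valve]
print(f"peak {gradient.max():.1f} mmHg, mean {gradient.mean():.1f} mmHg")
```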
Abstract:
Hyperthermia, raised temperature, has been used as a means of treating cancer for centuries. Hippocrates (400 BC) and Galen (AD 200) used red-hot irons to treat small tumours. Much later, after the Renaissance, there were many reports of spontaneous tumour regression in patients with fevers produced by erysipelas, malaria, smallpox, tuberculosis and influenza. These illnesses produce fevers of about 40 °C which last for several days, and temperatures of at least 40 °C were found to be necessary for tumour regression. Towards the end of the nineteenth century pyrogenic bacteria were injected into patients with cancer. In 1896, Coley used a mixture of erysipelas and B. prodigiosus, with some success...
Abstract:
The evolution of technological systems is hindered by systemic components, referred to as reverse salients, which fail to deliver the necessary level of technological performance, thereby inhibiting the performance delivery of the system as a whole. This paper develops a performance gap measure of reverse salience and applies this measurement in the study of the PC (personal computer) technological system, focusing on the evolutions of, firstly, the CPU (central processing unit) and PC game sub-systems, and secondly, the GPU (graphics processing unit) and PC game sub-systems. The measurement of the temporal behavior of reverse salience indicates that the PC game sub-system is the reverse salient, continuously trailing behind the technological performance of the CPU and GPU sub-systems from 1996 through 2006. The technological performance of the PC game sub-system as a reverse salient trails that of the CPU sub-system by up to 2300 MHz, with a gradually decreasing performance disparity in recent years. In contrast, the PC game sub-system as a reverse salient trails the GPU sub-system with an ever-increasing performance gap throughout the timeframe of analysis. In addition, we discuss the research and managerial implications of our findings.
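The gap measure itself is simple arithmetic: for each year, subtract the lagging sub-system's performance from the leading sub-system's. The figures below are invented for illustration (they only loosely mimic the peak-then-narrowing CPU/game trend reported above) and are not the paper's data.

```python
# Hypothetical yearly performance figures (MHz), not the paper's data.
cpu_mhz  = {1996: 300, 2001: 2000, 2004: 3400, 2006: 3200}  # leading sub-system
game_mhz = {1996: 133, 2001:  800, 2004: 1100, 2006: 2400}  # game requirement

# Reverse-salience gap per year: leader minus laggard.
gap = {year: cpu_mhz[year] - game_mhz[year] for year in cpu_mhz}
print(gap)  # {1996: 167, 2001: 1200, 2004: 2300, 2006: 800}
```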
Abstract:
This paper presents a numerical study of the response of axially loaded concrete filled steel tube (CFST) columns under lateral impact loading using explicit non-linear finite element techniques. The aims of this paper are to evaluate the vulnerability of existing columns to credible impact events as well as to contribute new information towards the safe design of such vulnerable columns. The model incorporates concrete confinement, strain rate effects of steel and concrete, contact between the steel tube and the concrete, and dynamic relaxation, which is a relatively recent method for applying pre-loading in an explicit solver. The finite element model was first verified by comparing results with existing experimental results and then employed to conduct a parametric sensitivity analysis. The effects of various structural and load parameters on the impact response of the CFST column were evaluated to identify the key controlling factors. Overall, the major parameters which influence the impact response of the column are the steel tube thickness to diameter ratio, the slenderness ratio and the impact velocity. The findings of this study will enhance the current state of knowledge in this area and can serve as a benchmark reference for future analysis and design of CFST columns under lateral impact.
Abstract:
Mobile devices are very popular among tertiary student populations. This study looks at student use of hand-held mobile devices within the context of a first-year programming unit. The research sought ways in which an educational app on these devices could be successfully integrated into the class's learning.