772 results for computer-based instruction
Abstract:
The aim of the study is to establish optimum building aspect ratios and south window sizes for residential buildings from a thermal performance point of view. The effects of six different building aspect ratios and eight different south window sizes for each aspect ratio are analyzed for apartments located at intermediate floors of buildings, with the aid of the computer-based thermal analysis program SUNCODE-PC, in five cities of Turkey: Erzurum, Ankara, Diyarbakir, Izmir, and Antalya. The results are evaluated in terms of annual energy consumption and the optimum values are derived. Optimum values and total energy consumption rates are compared among the analyzed cities.
Abstract:
This research explores music in space, as experienced through performing and music-making with interactive systems. It explores how musical parameters may be presented spatially and displayed visually with a view to their exploration by a musician during performance. Spatial arrangements of musical components, especially pitches and harmonies, have been widely studied in the literature, but the current capabilities of interactive systems allow the improvisational exploration of these musical spaces as part of a performance practice. This research focuses on quantised spatial organisation of musical parameters that can be categorised as grid music systems (GMSs), and interactive music systems based on them. The research explores and surveys existing and historical uses of GMSs, and develops and demonstrates the use of a novel grid music system designed for whole-body interaction. Grid music systems allow spatialised input to be plotted on a two-dimensional grid layout to construct patterned music. GMSs are navigated to construct a sequence of parametric steps, for example a series of pitches, rhythmic values, a chord sequence, or terraced dynamic steps. While they are conceptually simple when only controlling one musical dimension, grid systems may be layered to enable complex and satisfying musical results. These systems have proved a viable, effective, accessible and engaging means of music-making for the general user as well as the musician. GMSs have been widely used in electronic and digital music technologies, where they have generally been applied to small portable devices and software systems such as step sequencers and drum machines. This research shows that by scaling up a grid music system, music-making and musical improvisation are enhanced, gaining several advantages: (1) Full-body location becomes the spatial input to the grid. The system becomes a partially immersive one in four related ways: spatially, graphically, sonically and musically. 
(2) Detection of body location by tracking enables hands-free operation, thereby allowing the playing of a musical instrument in addition to “playing” the grid system. (3) Visual information regarding musical parameters may be enhanced so that the performer may fully engage with existing spatial knowledge of musical materials. The result is that existing spatial knowledge is overlaid on, and combined with, music-space. Music-space is a new concept produced by the research, and is similar to notions of other musical spaces including soundscape, acoustic space, Smalley's “circumspace” and “immersive space” (2007, 48-52), and Lotis's “ambiophony” (2003), but is rather more textural and “alive”—and therefore very conducive to interaction. Music-space is that space occupied by music, set within normal space, which may be perceived by a person located within, or moving around in that space. Music-space has a perceivable “texture” made of tensions and relaxations, and contains spatial patterns of these formed by musical elements such as notes, harmonies, and sounds, changing over time. The music may be performed by live musicians, created electronically, or be prerecorded. Large-scale GMSs have the capability not only to interactively display musical information as music representative space, but to allow music-space to co-exist with it. Moving around the grid, the performer may interact in real time with musical materials in music-space, as they form over squares or move in paths. Additionally he/she may sense the textural matrix of the music-space while being immersed in surround sound covering the grid. The HarmonyGrid is a new computer-based interactive performance system developed during this research that provides a generative music-making system intended to accompany, or play along with, an improvising musician. This large-scale GMS employs full-body motion tracking over a projected grid. 
Playing with the system creates an enhanced performance employing live interactive music, along with graphical and spatial activity. Although one other experimental system provides certain aspects of immersive music-making, currently only the HarmonyGrid provides an environment to explore and experience music-space in a GMS.
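The core GMS idea described above, quantising a performer's position to a grid cell and mapping each cell to a parametric step such as a pitch, can be illustrated with a minimal sketch. All names, the cell size, and the scale mapping below are illustrative assumptions, not details of the HarmonyGrid itself:

```python
# Minimal grid music system (GMS) sketch: a performer's continuous
# 2D position is quantised to a grid cell, and each cell selects one
# parametric step (here, a pitch). Names and mapping are assumptions.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI note numbers, one octave

def cell_for_position(x, y, cell_size=1.0):
    """Quantise a continuous (x, y) floor position to an integer grid cell."""
    return int(x // cell_size), int(y // cell_size)

def pitch_for_cell(col, row, scale=C_MAJOR):
    """One musical dimension per axis: column selects a scale degree,
    row transposes by octave (12 semitones)."""
    return scale[col % len(scale)] + 12 * row

# Walking a path across the grid yields a sequence of parametric steps:
path = [(0.5, 0.2), (1.7, 0.4), (2.3, 1.1)]
steps = [pitch_for_cell(*cell_for_position(x, y)) for x, y in path]
print(steps)  # [60, 62, 76]
```

Layering further grids of the same shape over rhythmic values or dynamics would give the multi-dimensional control the abstract describes.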
Abstract:
In 1995 Griffith University undertook a review of the provision of learning assistance for undergraduate students across its five campuses. One outcome of that review was the introduction of a multi-campus, computer-based, delivery system intended to support broader learning assistance activities. This paper describes the delivery system and the learning assistance resources and shares feedback from students and staff who engaged with the resources in semester one 1997. Highlighted are some broader issues associated with increasing the flexibility and accessibility of learning assistance to a larger, more diverse student population.
Abstract:
Enormous amounts of money and energy are being devoted to the development, use and organisation of computer-based scientific visualisations (e.g. animations and simulations) in science education. It seems plausible that visualisations that enable students to gain visual access to scientific phenomena that are too large, too small or occur too quickly or too slowly to be seen by the naked eye, or to scientific concepts and models, would yield enhanced conceptual learning. When the literature is searched, however, it quickly becomes apparent that there is a dearth of quantitative evidence for the effectiveness of scientific visualisations in enhancing students’ learning of science concepts. This paper outlines an Australian project that is using innovative research methodology to gather evidence on this question in physics and chemistry classrooms.
Abstract:
Expected satiety has been shown to play a key role in decisions around meal size. Recently it has become clear that these expectations can also influence the satiety that is experienced after a food has been consumed. As such, increasing the expected and actual satiety a food product confers without increasing its caloric content is of importance. In this study we sought to determine whether this could be achieved via product labelling. Female participants (N=75) were given a 223-kcal yoghurt smoothie for lunch. In separate conditions the smoothie was labelled as a diet brand, a highly-satiating brand, or an ‘own brand’ control. Expected satiety was assessed using rating scales and a computer-based ‘method of adjustment’, both prior to consuming the smoothie and 24 hours later. Hunger and fullness were assessed at baseline, immediately after consuming the smoothie, and for a further three hours. Despite the fact that all participants consumed the same food, the smoothie branded as highly-satiating was consistently expected to deliver more satiety than the other ‘brands’; this difference was sustained 24 hours after consumption. Furthermore, post-consumption and over three hours, participants consuming this smoothie reported significantly less hunger and significantly greater fullness. These findings demonstrate that the satiety that a product confers depends in part on information that is present around the time of consumption. We suspect that this process is mediated by changes to expected satiety. These effects may potentially be utilised in the development of successful weight-management products.
Abstract:
Previous studies have shown that exercise (Ex) interventions create a stronger coupling between energy intake (EI) and energy expenditure (EE), leading to increased homeostasis of the energy-balance (EB) regulatory system compared to a diet intervention, where an un-coupling between EI and EE occurs. The benefits of weight loss from Ex and diet interventions greatly depend on compensatory responses. The present study investigated an 8-week medium-term Ex and diet intervention program (the Ex intervention comprised 500 kcal of EE five days per week over four weeks at 65-75% maximal heart rate, whereas the diet intervention comprised a 500 kcal decrease in EI five days per week over four weeks) and its effects on compensatory responses and appetite regulation among healthy individuals, using a between- and within-subjects design. Effects of an acute dietary manipulation on appetite and compensatory behaviours, and whether a diet and/or Ex intervention pre-disposes individuals to disturbances in EB homeostasis, were tested. Energy intake at an ad libitum lunch test meal after a high-energy (HE) or low-energy (LE) breakfast pre-load (the HE pre-load contained 556 kcal and the LE pre-load 239 kcal) was measured at the Baseline (Weeks -4 to 0) and Intervention (Weeks 0 to 4) phases in 13 healthy volunteers (three males and ten females; mean age 35 years [SD 9] and mean BMI 25 kg/m2 [SD 3.8]). Participants comprised Ex=7 and diet=5 (one female in the diet group dropped out midway); thus, 12 participants completed the study. At Weeks -4, 0 and 4, visual analogue scales (VAS) were used to assess hunger and satiety, and liking and wanting (L&W) for nutrient and taste preferences were assessed using a computer-based system (E-Prime v1.1.4). Ad libitum test meal EI was consistently lower after the HE pre-load compared to the LE pre-load. However, this was not consistent during the diet intervention. 
A pre-load x group interaction on ad libitum test meal EI revealed that during the intervention phase the Ex group showed an improved sensitivity to detect the energy content between the two pre-loads and improved compensation for the ad libitum test meal whereas the diet group’s ability to differentiate between the two pre-loads decreased and showed poorer compensation (F[1,10]=2.88, p-value not significant). This study supports previous findings of the effect Ex and diet interventions have on appetite and compensatory responses; Ex increases and diet decreases energy balance sensitivity.
Abstract:
Purpose – The purpose of this paper is to provide a description and analysis of how a traditional industry is currently using e-learning, and to identify how the potential of e-learning can be realised whilst acknowledging the technological divide between younger and older workers. Design/methodology/approach – An exploratory qualitative methodology was employed to analyse three key questions: How is the Australian rail industry currently using e-learning? Are there age-related issues with the current use of e-learning in the rail industry? How could e-learning be used in future to engage different generations of learners in the rail industry? Data were collected in five case organisations from across the Australian rail industry. Findings – Of the rail organisations interviewed, none believed they were using e-learning to its full potential. The younger, more technologically literate employees are not having their expectations met, and therefore retention of younger workers has become an issue. The challenge for learning and development practitioners is balancing the preferences of an ageing workforce with those of younger, more "technology-savvy" learners, and the findings highlight some potential ways to begin addressing this balance. Practical implications – The findings identify the potential for organisations (even those in a traditional industry such as rail) to better utilise e-learning to attract and retain younger workers, but also warn against making assumptions about technological competency based on age. Originality/value – Data were gathered across an industry, and thus this paper takes an industry approach to considering the potential age-related issues with e-learning and the ways it may be used to meet the needs of different generations in the workplace.
Abstract:
This paper presents a number of characteristics of the Internet that make it attractive to online groomers. Relevant Internet characteristics include disconnected personal communication, mediating technology, universality, network externalities, distribution channel, time moderator, low‐cost standard, electronic double, electronic double manipulation, information asymmetry, infinite virtual capacity, independence in time and space, cyberspace, and dynamic social network. Potential sex offenders join virtual communities, where they meet other people who share the same interests. A virtual community provides an online meeting place where people with similar interests can communicate and find useful information. Communication between members may be via email, bulletin boards, online chat, web‐based conferencing or other computer‐based media.
Abstract:
A quantitative, quasi-experimental study of the effectiveness of computer-based scientific visualizations for concept learning on the part of Year 11 physics students (n=80) was conducted in six Queensland high school classrooms. Students' gender and academic ability were also considered as factors in relation to the effectiveness of teaching with visualizations. Learning with visualizations was found to be as effective as learning without them for all students, with no statistically significant difference in outcomes being observed for the group as a whole or on the academic ability dimension. Male students were found to learn significantly better with visualizations than without, while no such effect was observed for female students. This may give rise to some concern about the equity issues raised by introducing visualizations. Given that other research shows that students enjoy learning with visualizations and that their engagement with learning is enhanced, the finding that learning outcomes are the same as for teaching without visualizations supports teachers' use of visualizations.
Abstract:
Echocardiography is the commonest form of non-invasive cardiac imaging and is fundamental to patient management. However, due to its methodology, it is also operator dependent. There are well defined pathways in training and ongoing accreditation to achieve and maintain competency. To satisfy these requirements, significant time has to be dedicated to scanning patients, often in a time-pressured clinical environment. Alternative, computer-based training methods are being considered to augment echocardiographic training. Numerous advances in technology have resulted in the development of interactive programmes and simulators to teach trainees the skills to perform particular procedures, including transthoracic and transoesophageal (TOE) echocardiography. Eighty-two sonographers and TOE proceduralists utilised an echocardiographic simulator and assessed its utility using defined criteria. Forty trainee sonographers assessed the simulator and were taught how to obtain an apical two-chamber (A2C) view and image the superior vena cava (SVC); 100% and 88% found the simulator useful in obtaining the SVC and A2C views respectively. All users found it easy to use and the majority found it helped with image acquisition and interpretation. Forty-two attendees of a TOE training day utilising the simulator assessed it, with 100% finding it easy to use and the augmented reality graphics benefiting image acquisition; 90% felt that it was realistic. This study revealed that both trainee sonographers and TOE proceduralists found the simulation process realistic, and that it helped with image acquisition and improved assessment of spatial relationships. Echocardiographic simulators may play an important role in the future training of echocardiographic skills.
Abstract:
Intelligent Tutoring Systems (ITSs) are computer systems designed to provide individualised help to students learning in a problem-solving context. The difference between an ITS and a Computer Assisted Instruction (CAI) system is that an ITS has a Student Model, which allows it to provide a better educational environment. The Student Model contains information on what the student knows, and does not know, about the domain being learnt, as well as other personal characteristics such as preferred learning style. This research has resulted in the design and development of a new ITS: Personal Access Tutor (PAT). PAT is an ITS that helps students to learn Rapid Application Development in a database environment. More specifically, PAT focuses on helping students to learn how to create forms and reports in Microsoft Access. To provide an augmented learning environment, PAT's architecture is different to that of most other ITSs. Instead of having a simulation, PAT uses a widely used database development environment (Microsoft Access). This enables students to ask for help while developing real applications using real database software. As part of this research, I designed and created the knowledge base required for PAT. This contains four models: the domain, student, tutoring and exercises models. The Instructional Expert I created for PAT provides individualised help to the students to help them correctly finish each exercise, and also proposes the next exercise that a student should work on. PAT was evaluated by students enrolled in the Databases subject at QUT, and by staff members involved in teaching the subject. The results of the evaluation were positive and are discussed in the thesis.
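The Student Model is what separates an ITS from plain CAI in the abstract above. As a hedged sketch only (the field names, the update rule, and the exercise-selection rule are illustrative assumptions, not PAT's actual design), a minimal student model might track a mastery estimate per domain concept and use it to propose the next exercise:

```python
# Minimal Student Model sketch for an ITS. Fields, update rule, and
# exercise selection are illustrative assumptions, not PAT's design.

class StudentModel:
    def __init__(self, learning_style="visual"):
        self.learning_style = learning_style
        self.mastery = {}  # concept name -> estimated mastery in [0.0, 1.0]

    def record_attempt(self, concept, correct):
        """Nudge the mastery estimate toward 1 (correct) or 0 (incorrect)."""
        old = self.mastery.get(concept, 0.0)
        target = 1.0 if correct else 0.0
        self.mastery[concept] = old + 0.3 * (target - old)

    def next_exercise(self, exercises, threshold=0.8):
        """Propose the first exercise covering a concept not yet mastered."""
        for name, concepts in exercises:
            if any(self.mastery.get(c, 0.0) < threshold for c in concepts):
                return name
        return None

sm = StudentModel()
sm.record_attempt("forms", True)   # mastery["forms"] moves from 0.0 to 0.3
exercises = [("Create a form", ["forms"]), ("Create a report", ["reports"])]
print(sm.next_exercise(exercises))  # Create a form
```

An Instructional Expert of the kind the abstract mentions would consult such a model both to tailor help and to sequence exercises.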
Abstract:
This thesis is concerned with creating and evaluating interactive art systems that facilitate emergent participant experiences. For the purposes of this research, interactive art is the computer based arts involving physical participation from the audience, while emergence is when a new form or concept appears that was not directly implied by the context from which it arose. This emergent ‘whole’ is more than a simple sum of its parts. The research aims to develop understanding of the nature of emergent experiences that might arise during participant interaction with interactive art systems. It also aims to understand the design issues surrounding the creation of these systems. The approach used is Practice-based, integrating practice, evaluation and theoretical research. Practice used methods from Reflection-in-action and Iterative design to create two interactive art systems: Glass Pond and +-now. Creation of +-now resulted in a novel method for instantiating emergent shapes. Both art works were also evaluated in exploratory studies. In addition, a main study with 30 participants was conducted on participant interaction with +-now. These sessions were video recorded and participants were interviewed about their experience. Recordings were transcribed and analysed using Grounded theory methods. Emergent participant experiences were identified and classified using a taxonomy of emergence in interactive art. This taxonomy draws on theoretical research. The outcomes of this Practice-based research are summarised as follows. Two interactive art systems, where the second work clearly facilitates emergent interaction, were created. Their creation involved the development of a novel method for instantiating emergent shapes and it informed aesthetic and design issues surrounding interactive art systems for emergence. A taxonomy of emergence in interactive art was also created. 
Other outcomes are the evaluation findings about participant experiences, including different types of emergence experienced and the coding schemes produced during data analysis.
Abstract:
By the end of the 20th century the shift from the professional recording studio to personal computer-based recording systems was well established (Chadabe 1997), and musicians could increasingly see the benefits of adding value to the musical process by producing their own musical endeavours. At the Queensland University of Technology (QUT), where we were teaching, the need for a musicianship program that took account of these trends was becoming clear. The Sound Media Musicianship unit described in this chapter was developed to fill this need and ran from 1999 through 2010.
Abstract:
Identifying the design features that impact construction is essential to developing cost effective and constructible designs. The similarity of building components is a critical design feature that affects method selection, productivity, and ultimately construction cost and schedule performance. However, there is limited understanding of what constitutes similarity in the design of building components and limited computer-based support to identify this feature in a building product model. This paper contributes a feature-based framework for representing and reasoning about component similarity that builds on ontological modelling, model-based reasoning and cluster analysis techniques. It describes the ontology we developed to characterize component similarity in terms of the component attributes, the direction, and the degree of variation. It also describes the generic reasoning process we formalized to identify component similarity in a standard product model based on practitioners' varied preferences. The generic reasoning process evaluates the geometric, topological, and symbolic similarities between components, creates groupings of similar components, and quantifies the degree of similarity. We implemented this reasoning process in a prototype cost estimating application, which creates and maintains cost estimates based on a building product model. Validation studies of the prototype system provide evidence that the framework is general and enables a more accurate and efficient cost estimating process.
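The grouping step of the reasoning process above can be illustrated with a hedged sketch. The attribute names, the relative-difference measure, and the tolerance below are assumptions for illustration, not the paper's formalism: components are compared on geometric attributes and grouped when their degree of variation falls within a tolerance.

```python
# Sketch of grouping building components by attribute similarity.
# Attribute names, distance measure, and tolerance are assumptions.

def variation(a, b, attrs=("length", "width", "depth")):
    """Degree of variation: the largest relative difference over the
    compared attributes (0.0 means the components are identical)."""
    return max(abs(a[k] - b[k]) / max(a[k], b[k]) for k in attrs)

def group_similar(components, tolerance=0.05):
    """Greedily place each component into the first group whose
    representative (first member) it matches within the tolerance."""
    groups = []
    for comp in components:
        for group in groups:
            if variation(comp, group[0]) <= tolerance:
                group.append(comp)
                break
        else:
            groups.append([comp])
    return groups

beams = [
    {"id": "B1", "length": 6.0, "width": 0.3, "depth": 0.5},
    {"id": "B2", "length": 6.0, "width": 0.3, "depth": 0.5},
    {"id": "B3", "length": 9.0, "width": 0.3, "depth": 0.6},
]
groups = group_similar(beams)
print([[c["id"] for c in g] for g in groups])  # [['B1', 'B2'], ['B3']]
```

A cost estimating application could then price one representative per group and apply the quantified degree of variation to the rest, which is the efficiency the framework aims at.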
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. 
Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimise the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.