971 results for Development Indicator
Abstract:
Background: To describe the iterative development process and final version of ‘MobileMums’: a physical activity intervention for women with young children (<5 years) delivered primarily via mobile telephone (mHealth) short messaging service (SMS). Methods: MobileMums development followed the five steps outlined in the mHealth development and evaluation framework: 1) conceptualization (critique of literature and theory); 2) formative research (focus groups, n = 48); 3) pre-testing (qualitative pilot of intervention components, n = 12); 4) pilot testing (pilot RCT, n = 88); and 5) qualitative evaluation of the refined intervention (n = 6). Results: Key findings identified throughout the development process that shaped the MobileMums program were the need for: behaviour change techniques to be grounded in Social Cognitive Theory; tailored SMS content; two-way SMS interaction; rapport between SMS sender and recipient; an automated software platform to generate and send SMS; and flexibility in the location of a face-to-face delivered component. Conclusions: The final version of MobileMums is flexible and adaptive to individual participants’ physical activity goals, expectations and environment. MobileMums is being evaluated in a community-based randomised controlled efficacy trial (ACTRN12611000481976).
Abstract:
Recently, many international tertiary educational programs have capitalised on the value that design and business can have at their intersection (Martin, 2009; Brown, 2008; Bruce and Bessant, 2002; Manzini, 2009). This paper discusses the role that two teaching units – New Product Development and Design Led Innovation – play in forming the understanding of commercialisation needed in today’s Industrial Design education. These units are taught consecutively in the later years of the Bachelor of Industrial Design program at the Queensland University of Technology, Brisbane, Australia. In this paper, each teaching unit is discussed in detail and then as a whole, to establish the base of knowledge students need to fully capitalise on the value design has in business, and to produce a more capable Industrial Design graduate of the future.
Abstract:
A comparison was made of accelerated professional development (APD) for nurses (n=64), involving peer consultation and reflective practice, and peer consultation alone (n=30). Although APD participants had a higher completion rate, improvements in caregiver behaviors and work environment were not significantly different.
Abstract:
Since 2007, the Kite Arts Education Program (KITE), based at Queensland Performing Arts Centre (QPAC), has been engaged in delivering a series of theatre-based experiences for children in low socio-economic primary schools in Queensland. The artist in residence (AIR) project titled Yonder includes performances developed by the children with the support and leadership of teacher artists from KITE for their community and parents/carers, supported by a peak community cultural institution. In 2009, Queensland Performing Arts Centre partnered with Queensland University of Technology (QUT) Creative Industries Faculty (Drama) to conduct a three-year evaluation of the Yonder project to understand the operational dynamics, artistic outputs and the educational benefits of the project. This paper outlines the research findings for children engaged in the Yonder project in the interrelated areas of literacy development and social competencies. Findings are drawn from six iterations of the project in suburban locations on the edge of Brisbane city and in regional Queensland.
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is to model condition indicators, operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed. All of these existing covariate-based hazard models were built on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have, to some extent, been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully incorporate the three types of asset health information (failure event data (i.e. observed and/or suspended), condition data, and operating environment data) into a single model for more effective hazard and reliability prediction. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response variables (or dependent variables), whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach for addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three available types of asset health information into the modelling of hazard and reliability predictions, and also captures the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; therefore, they update and reshape the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component. Operating environment indicators in this model are failure accelerators and/or decelerators; they are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators arise from the environment in which an asset operates and are not explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators could be zero in EHM, condition indicators are always present, because they are observed and measured for as long as an asset remains operational. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industrial applications failure event data are sparse, and the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the semi-parametric EHM’s restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in these two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of other existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including a new parameter estimation method for the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
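The abstract describes the structure of EHM (a baseline hazard that depends on both time and condition indicators, with operating environment indicators entering through a covariate function) but does not give its exact equations. The Python sketch below is only one plausible reading of that structure, assuming a Weibull-type semi-parametric baseline and an exponential covariate function; the parameter names (beta, eta, alpha, gamma) and the specific functional forms are illustrative assumptions, not the thesis's actual formulation.

```python
import numpy as np

def ehm_hazard(t, z_c, z_e, beta=2.0, eta=1000.0,
               alpha=np.array([0.8]), gamma=np.array([0.3])):
    """Illustrative (assumed) EHM-style hazard.

    Baseline: Weibull in time, modified by condition indicators z_c.
    Covariate function: exponential in operating environment indicators z_e.
    """
    baseline = (beta / eta) * (t / eta) ** (beta - 1) * np.exp(alpha @ z_c)
    return baseline * np.exp(gamma @ z_e)

def reliability(times, z_c_series, z_e_series, **params):
    """R(t) = exp(-cumulative hazard), approximated with the trapezoidal rule."""
    hz = np.array([ehm_hazard(t, zc, ze, **params)
                   for t, zc, ze in zip(times, z_c_series, z_e_series)])
    cum = np.concatenate(([0.0], np.cumsum(np.diff(times) * (hz[1:] + hz[:-1]) / 2)))
    return np.exp(-cum)

# Example: a rising vibration level (condition) under a constant load (environment).
times = np.linspace(1.0, 2000.0, 200)
z_c_series = [np.array([v]) for v in np.linspace(0.1, 1.5, 200)]
z_e_series = [np.array([0.5])] * 200
R = reliability(times, z_c_series, z_e_series)
print(f"Estimated reliability at t = {times[-1]:.0f} h: {R[-1]:.3f}")
```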
Abstract:
Over the last two decades, moves toward “inclusion” have prompted change in the formation of education policies, schooling structures and pedagogical practice. Yet, exclusion through the categorisation and segregation of students with diverse abilities has grown, particularly for students with challenging behaviour. This paper considers what has happened to inclusive education by focusing on three educational jurisdictions known to be experiencing different rates of growth in the identification of special educational needs: New South Wales (Australia), Alberta (Canada) and Finland (Europe). In our analysis, we consider the effects of competing policy forces that appear to thwart the development of inclusive schools in two of our case-study regions.
Abstract:
This paper is concerned with the unsupervised learning of object representations by fusing visual and motor information. The problem is posed for a mobile robot that develops its representations as it incrementally gathers data. The scenario is problematic as the robot only has limited information at each time step with which it must generate and update its representations. Object representations are refined as multiple instances of sensory data are presented; however, it is uncertain whether two data instances are synonymous with the same object. This process can easily diverge from stability. The premise of the presented work is that a robot's motor information instigates successful generation of visual representations. An understanding of self-motion enables a prediction to be made before performing an action, resulting in a stronger belief of data association. The system is implemented as a data-driven partially observable semi-Markov decision process. Object representations are formed as the process's hidden states and are coordinated with motor commands through state transitions. Experiments show the prediction process is essential in enabling the unsupervised learning method to converge to a solution - improving precision and recall over using sensory data alone.
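The abstract frames the learner as a data-driven partially observable semi-Markov decision process; reproducing that is beyond a short snippet, but the core intuition (motor information yields a prediction that strengthens belief in data association before sensing) can be hinted at. The Python sketch below is a simplified, assumed illustration of motor-prediction-gated data association, not the paper's actual POSMDP implementation; the planar kinematics, gating threshold and function names are invented for illustration.

```python
import numpy as np

def predict_pose(prev_pose, motor_command):
    """Predict an object's robot-relative pose after an ego-motion.

    motor_command = (dx, dy, dtheta) of the robot; in the robot frame the object
    appears to move by the inverse transform. (Assumed planar kinematics.)
    """
    dx, dy, dtheta = motor_command
    c, s = np.cos(-dtheta), np.sin(-dtheta)
    shifted = prev_pose - np.array([dx, dy])
    return np.array([c * shifted[0] - s * shifted[1],
                     s * shifted[0] + c * shifted[1]])

def associate(observation, object_poses, motor_command, gate=0.5):
    """Return the index of the matching object representation, or None for a new object.

    The motor-based prediction gates the association: the observation is only
    matched to an object whose predicted pose lies within `gate` metres.
    """
    best, best_dist = None, np.inf
    for idx, pose in enumerate(object_poses):
        dist = np.linalg.norm(observation - predict_pose(pose, motor_command))
        if dist < gate and dist < best_dist:
            best, best_dist = idx, dist
    return best

# Example: one known object 1 m ahead; the robot drives forward 0.2 m and re-observes it.
objects = [np.array([1.0, 0.0])]
print(associate(np.array([0.82, 0.01]), objects, motor_command=(0.2, 0.0, 0.0)))  # -> 0
```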
Abstract:
Numerous environmental rating tools have been developed around the world over the past decade or so, in an attempt to increase awareness of the impact buildings have on the environment. Whilst many of these tools can be applied across a variety of building types, the majority focus mainly on the commercial building sector. Only recently have some of the better-known environmental rating tools become adaptable to the land development sector, where arguably the most visible environmental impacts are made. EnviroDevelopment is one such tool that enables rating of residential land development in Australia. This paper seeks to quantify the environmental benefits achieved by the environmental rating tool EnviroDevelopment, using data from its certified residential projects across Australia. This research will identify the environmental gains achieved in the residential land development sector that can be attributed to developers aspiring to gain certification under this rating tool.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The average classification accuracy of the participants on the three Neo-Piagetian stages was 85%, 71% and 78%, respectively. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions before and after the tutorial was 29% and 75%, respectively. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by demonstrating that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
The Internet is one of the most significant information and communication technologies to emerge at the end of the last century. It created new and effective means by which individuals and groups communicate. These advances led to marked institutional changes, most notably in the realm of commercial exchange: the Internet not only provided a high-speed communication infrastructure to business enterprises; it also opened them to a global consumer base where they could market their products and services. Commercial interests have gradually dominated Internet technology over the past several years and have been a factor in the growth of its user population and the enhancement of its infrastructure. Such commercial interests fitted comfortably within the structures of the Philippine government. As revealed in the study, state policies and programs make use of Internet technology as an enabler of commercial institutional reforms using traditional economic measures. Yet, despite efforts to maximize the Internet as an enabler of market-driven economic growth, the accrued benefits are yet to come about; the technology is largely present only in major urban areas and accessible to a small number of social groups. The failure of the Internet’s developmental capability can be traced back to the government’s wholesale adoption of a commercial-centered discourse. The Internet’s developmental gains (i.e. instrumental, communicative and emancipatory) and features, which have been present since its inception, have been visibly left out in favor of its commercial value. By employing synchronic and diachronic analysis, it can be shown that the Internet can be a vital technology in promoting genuine social development in the Philippines. In general, the objective is to realize a social environment geared towards a more inclusive and participatory application of Internet technology, equally aware of the caveats and risks the technology may pose. It is argued further that there is a need for continued social scientific research into the social and developmental implications of Internet technology at local-level structures, such as social sectors, specific communities and organizations. At the meta-level, the approach employed in this research can be a modest attempt at increasing the calculus of hope, especially among marginalized Filipino sectors, through the use of information and communications technologies. This emerging field of study, tentatively called Progressive Informatics, must emanate from the more enlightened social sectors, namely non-government, academic and locally-based organizations.
Abstract:
PURPOSE: This pilot project’s aim was to trial a tool and process for developing students’ ability to engage in self-assessment using reflection on their clinical experiences, including feedback from workplace learning, in order to aid them in linking theory to practice and to develop strategies to improve performance. BACKGROUND: In nursing education, students can experience a mismatch between performance and theoretical learning; this is referred to as the ‘theory practice gap’ (Scully 2011; Chan, Chan & Liu 2011). One specific contributing factor seems to be students’ inability to engage in meaningful reflection and self-correcting behaviours. A self-assessment strategy was implemented within a third-year clinical unit to ameliorate this mismatch, with encouraging results, as students developed self-direction in addressing learning needs. In this pilot project the above strategy was adapted for implementation across different clinical units, to create a whole-of-course approach to integrating workplace learning. METHOD: The methodology underpinning this project is a scaffolded, supported reflective practice process. Improved self-assessment skill is achieved by students reflecting on and engaging with feedback, then mapping this to learning outcomes to identify where performance can be improved. Evaluation of this project includes: collation of student feedback identifying successful strategies along with barriers encountered in implementation; feedback from students and teachers via the above processes and tools; and comparison of the number of learning contracts issued in clinical nursing units with similar cohorts. RESULTS: Results will be complete by May 2012 and include analysis of the data collected via the above evaluation methods. Other outcomes will include the refined process and tool, plus resources that should improve cost effectiveness without reducing student support. CONCLUSION: Implementing these tools and processes across a student’s entire learning package will assist them to demonstrate progressive development through the course. Students will have learnt to understand feedback and to integrate these skills for life-long learning.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
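MCDTK itself is written in Java and its internals are not detailed in the abstract. As a rough illustration of two of the described steps (reading per-beam monitor units from an exported DICOM RT Plan and weighting per-beam Monte Carlo dose grids by those monitor units), here is a Python sketch using pydicom and NumPy; the file names and the per-MU dose arrays are hypothetical, and this is not MCDTK's actual code.

```python
import numpy as np
import pydicom  # assumed dependency for reading the exported DICOM RT Plan

def beam_monitor_units(rtplan_path):
    """Map beam number -> planned monitor units, read from a DICOM RT Plan file."""
    plan = pydicom.dcmread(rtplan_path)
    refs = plan.FractionGroupSequence[0].ReferencedBeamSequence
    return {int(rb.ReferencedBeamNumber): float(rb.BeamMeterset) for rb in refs}

def combine_doses(dose_per_mu_by_beam, mu_by_beam):
    """Weight per-beam Monte Carlo dose grids (calculated per MU) by planned MU and sum."""
    total = None
    for beam_number, dose_per_mu in dose_per_mu_by_beam.items():
        weighted = dose_per_mu * mu_by_beam[beam_number]
        total = weighted if total is None else total + weighted
    return total

# Hypothetical usage: two beams, each with a 3D dose grid (Gy/MU) from the MC engine.
mu = beam_monitor_units("plan.dcm")
doses = {1: np.load("beam1_dose_per_mu.npy"), 2: np.load("beam2_dose_per_mu.npy")}
total_dose = combine_doses(doses, mu)
```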
Abstract:
The Clean Development Mechanism (CDM) has been praised for its ingenuity in mobilising finance to implement sustainable development practices in non-industrialised countries (known as Non-Annex 1 parties under the Kyoto Protocol). During the first commitment period of the Kyoto Protocol (2008-2012), a large number of clean development mechanism projects have been registered with the CDM board. In addition to the large number of registered CDM projects, there are significant numbers of proposed projects stalled in implementation due to the cumbersome and lengthy CDM approval process. Despite this regulatory criticism, it is recognised that the role performed by the CDM is essential for achieving a significant reduction in global greenhouse gas emissions. This is because the CDM funds sustainable development in countries that lack the capacity to do so on their own. It is anticipated that some form of CDM instrument will continue beyond the 2012 timeframe and that reform of the mechanism will be focused on making the mechanism’s approval and implementation processes faster and more efficient.
Abstract:
The aim of the present study was to determine the effect of carbohydrate (CHO; sucrose) ingestion and environmental heat on the development of fatigue and the distribution of power output during a 16.1-km cycling time trial. Ten male cyclists (VO2max = 61.7 ± 5.0 ml·kg⁻¹·min⁻¹, mean ± SD) performed four 90-min constant-pace cycling trials at 80% of second ventilatory threshold (220 ± 12 W). Trials were conducted in temperate (18.1 ± 0.4 °C) or hot (32.2 ± 0.7 °C) conditions during which subjects ingested either CHO (0.96 g·kg⁻¹·h⁻¹) or placebo (PLA) gels. All trials were followed by a 16.1-km time trial. Before and immediately after exercise, percent muscle activation was determined using superimposed electrical stimulation. Power output, integrated electromyography (iEMG) of vastus lateralis, rectal temperature, and skin temperature were recorded throughout the trial. Percent muscle activation significantly declined during the CHO and PLA trials in hot (6.0 and 6.9%, respectively) but not temperate conditions (1.9 and 2.2%, respectively). The decline in power output during the first 6 km was significantly greater during exercise in the heat. iEMG correlated significantly with power output during the CHO trials in hot and temperate conditions (r = 0.93 and 0.73; P < 0.05) but not during either PLA trial. In conclusion, cyclists tended to self-select an aggressive pacing strategy (initial high intensity) in the heat.
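As a minimal illustration of the kind of correlation analysis reported above (iEMG against power output), the following Python sketch uses scipy.stats.pearsonr on placeholder arrays; the numbers are invented and bear no relation to the study's measurements.

```python
import numpy as np
from scipy.stats import pearsonr

# Placeholder values standing in for per-interval iEMG and power output samples;
# they are invented and bear no relation to the study's data.
iemg = np.array([0.42, 0.45, 0.44, 0.48, 0.50, 0.47, 0.52, 0.55])
power = np.array([265.0, 270.0, 268.0, 280.0, 288.0, 279.0, 295.0, 301.0])

r, p = pearsonr(iemg, power)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```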
Abstract:
The field of destination image has been widely discussed in the destination literature since the early 1970s (see Mayo, 1973). However, the extent to which travel context impacts on an individual’s destination image evaluation, and therefore destination choice, has received scant attention (Hu & Ritchie, 1993). This study, utilising expectancy-value theory, sought to elicit salient destination attributes from consumers across two travel contexts: short-break holidays and longer getaways. Using the Repertory Test technique, the attributes elicited as salient for short-break holidays were found to be consistent with those elicited for longer getaways. While this study was limited to Brisbane’s near-home destinations, the results will be of interest to destination marketers and researchers interested in the challenge of positioning a destination in diverse markets.