847 results for Development Indicator
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure in future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset: condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is to model condition indicators and operating environment indicators, and their failure-generating mechanisms, using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed, all of them based on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully utilise the three types of asset health information (failure event data (i.e. observed and/or suspended), condition data, and operating environment data) in a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response variables (or dependent variables), whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related, and more imperative, question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach for addressing the aforementioned challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; therefore, they update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators arise from the environment in which an asset operates and capture effects that have not been explicitly identified by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators may be nil in EHM, condition indicators are always present, because they are observed and measured for as long as an asset is operational and has survived. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industrial applications failure event data are sparse, and the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the semi-parametric EHM's restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimates with those of the other existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, regarding the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
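The abstract does not give EHM's functional form, but the structure it describes can be sketched. In the hedged reading below (an assumption for illustration, not the authors' exact formulation), the baseline hazard depends on both time and the condition indicators, while the operating environment indicators enter through a multiplicative covariate function:

```latex
% Sketch only, assuming a multiplicative covariate function.
% h_0: baseline hazard; Z_c(t): condition indicators (response variables);
% Z_e(t): operating environment indicators (explanatory variables);
% \gamma: covariate coefficients.
h\bigl(t \mid Z_c, Z_e\bigr) = h_0\bigl(t, Z_c(t)\bigr)\,
    \exp\bigl(\gamma^{\top} Z_e(t)\bigr)
% Contrast with PHM, whose baseline hazard depends on time alone:
h_{\mathrm{PHM}}\bigl(t \mid Z\bigr) = h_0(t)\,\exp\bigl(\gamma^{\top} Z(t)\bigr)
```

Under this reading, two assets with different condition histories no longer have proportional hazards, which is consistent with the claim that the proportionality assumption does not arise in EHM; in the semi-parametric form, h_0 would be built around a Weibull hazard in t.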
Abstract:
Over the last two decades, moves toward "inclusion" have prompted change in the formation of education policies, schooling structures and pedagogical practice. Yet, exclusion through the categorisation and segregation of students with diverse abilities has grown, particularly for students with challenging behaviour. This paper considers what has happened to inclusive education by focusing on three educational jurisdictions known to be experiencing different rates of growth in the identification of special educational needs: New South Wales (Australia), Alberta (Canada) and Finland (Europe). In our analysis, we consider the effects of competing policy forces that appear to thwart the development of inclusive schools in two of our case-study regions.
Abstract:
This paper is concerned with the unsupervised learning of object representations by fusing visual and motor information. The problem is posed for a mobile robot that develops its representations as it incrementally gathers data. The scenario is problematic as the robot only has limited information at each time step with which it must generate and update its representations. Object representations are refined as multiple instances of sensory data are presented; however, it is uncertain whether two data instances are synonymous with the same object. This process can easily become unstable. The premise of the presented work is that a robot's motor information instigates the successful generation of visual representations. An understanding of self-motion enables a prediction to be made before performing an action, resulting in a stronger belief in the data association. The system is implemented as a data-driven partially observable semi-Markov decision process. Object representations are formed as the process's hidden states and are coordinated with motor commands through state transitions. Experiments show the prediction process is essential in enabling the unsupervised learning method to converge to a solution, improving precision and recall over using sensory data alone.
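The paper's data-driven partially observable semi-Markov decision process is not reproduced in this abstract; the sketch below is a plain Bayes filter with hypothetical names and toy numbers (semi-Markov dwell times ignored) that only illustrates the core idea: an action-conditioned prediction is made before the visual observation is associated.

```python
import numpy as np

def predict(belief, transition, action):
    """Motion update: propagate the belief over object hypotheses through
    the action-conditioned transition model, *before* sensing."""
    return transition[action] @ belief  # T[a][s', s] * b[s]

def correct(belief, likelihood, observation):
    """Sensory update: reweight the predicted belief by the visual
    likelihood of the observation, then renormalise."""
    posterior = likelihood[:, observation] * belief
    return posterior / posterior.sum()

# Toy example: 2 object hypotheses, 2 motor commands, 2 visual outcomes.
transition = {0: np.array([[0.9, 0.2], [0.1, 0.8]]),   # e.g. "stay"
              1: np.array([[0.5, 0.5], [0.5, 0.5]])}   # e.g. "move"
likelihood = np.array([[0.7, 0.3],                     # P(obs | hypothesis)
                       [0.2, 0.8]])

belief = np.array([0.5, 0.5])
belief = predict(belief, transition, action=0)       # self-motion prediction
belief = correct(belief, likelihood, observation=0)  # visual data association
print(belief)  # sharper than updating on the observation alone
```

The point of predicting first is that ambiguous visual data are weighed against what the motor command implies, so two data instances are less likely to be associated with the wrong object.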
Abstract:
Numerous environmental rating tools have been developed around the world over the past decade or so, in an attempt to increase awareness of the impact buildings have on the environment. Whilst many of these tools can be applied across a variety of building types, the majority focus mainly on the commercial building sector. Only recently have some of the better known environmental rating tools become adaptable to the land development sector, where arguably the most visible environmental impacts are made. EnviroDevelopment is one such tool that enables the rating of residential land development in Australia. This paper seeks to quantify the environmental benefits achieved by the environmental rating tool EnviroDevelopment, using data from its certified residential projects across Australia. This research will identify the environmental gains achieved in the residential land development sector that can be attributed to developers aspiring to gain certification under this rating tool.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The participants' average classification accuracy on the three Neo-Piagetian stages was 85%, 71% and 78%, respectively. Participants also rated their agreement with the expert classifications, indicating high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% after it. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by showing that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
The Internet is one of the most significant information and communication technologies to emerge during the end of the last century. It created new and effective means by which individuals and groups communicate. These advances led to marked institutional changes, most notably in the realm of commercial exchange: it not only provided high-speed communication infrastructure to business enterprises; it also opened them to the global consumer base where they could market their products and services. Commercial interests gradually dominated Internet technology over the past several years and have been a factor in the increase of its user population and the enhancement of its infrastructure. Such commercial interests fitted comfortably within the structures of the Philippine government. As revealed in the study, state policies and programs make use of Internet technology as an enabler of commercial institutional reforms using traditional economic measures. Yet, despite efforts to maximize the Internet as an enabler of market-driven economic growth, the accrued benefits are yet to come about; they are largely present only in major urban areas and accessible to a small number of social groups. The failure of the Internet's developmental capability can be traced back to the government's wholesale adoption of commercial-centered discourse. The Internet's developmental gains (i.e. instrumental, communicative and emancipatory) and features, which were always there since its inception, have been visibly left out in favor of its commercial value. By employing synchronic and diachronic analysis, it can be shown that the Internet can be a vital technology in promoting genuine social development in the Philippines. In general, the objective is to realize a social environment geared towards a more inclusive and participatory application of Internet technology, one that is equally aware of the caveats or risks the technology may pose. It is argued further that there is a need for continued social scientific research regarding the social and developmental implications of Internet technology at local-level structures, such as social sectors, specific communities and organizations. On the meta-level, the approach employed in this research can be a modest attempt at increasing the calculus of hope, especially among the marginalized Filipino sectors, through the use of information and communications technologies. This emerging field of study, tentatively called Progressive Informatics, must emanate from the more enlightened social sectors, namely: the non-government, academic and locally-based organizations.
Abstract:
PURPOSE: This pilot project's aim was to trial a tool and process for developing students' ability to engage in self-assessment through reflection on their clinical experiences, including feedback from workplace learning, in order to help them link theory to practice and develop strategies to improve performance. BACKGROUND: In nursing education, students can experience a mismatch between their performance and their theoretical learning; this is referred to as the 'theory practice gap' (Scully 2011; Chan, Chan & Liu 2011). One specific contributing factor seems to be students' inability to engage in meaningful reflection and self-correcting behaviours. A self-assessment strategy was implemented within a third-year clinical unit to ameliorate this mismatch, with encouraging results, as students developed self-direction in addressing learning needs. In this pilot project the above strategy was adapted for implementation across different clinical units, to create a whole-of-course approach to integrating workplace learning. METHOD: The methodology underpinning this project is a scaffolded, supported reflective practice process. Improved self-assessment skill is achieved by students reflecting on and engaging with feedback, then mapping this to learning outcomes to identify where performance can be improved. Evaluation of this project includes: collation of student feedback identifying successful strategies along with barriers encountered in implementation; feedback from students and teachers via the above processes and tools; and comparison of the number of learning contracts issued in clinical nursing units with similar cohorts. RESULTS: Results will be complete by May 2012 and include analysis of the data collected via the above evaluation methods. Other outcomes will include the refined process and tool, plus resources that should improve cost effectiveness without reducing student support. CONCLUSION: Implementing these tools and processes across a student's entire learning package will assist them to demonstrate progressive development through the course. Students will have learnt to understand feedback and to integrate these skills for life-long learning.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the "gold standard" for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
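MCDTK's internal interfaces are not given in this abstract; as a hedged illustration of the combination step it describes (scaling per-beam Monte Carlo results by the planned monitor units), the sketch below reads the exported plan with pydicom using standard DICOM RT Plan keywords, with placeholder arrays standing in for the real DOSXYZnrc dose grids.

```python
import numpy as np
import pydicom

# Read the RT Plan exported from the treatment planning system
# ("rtplan.dcm" is a placeholder filename).
plan = pydicom.dcmread("rtplan.dcm")

# Monitor units (beam metersets) per beam, keyed by beam number.
monitor_units = {
    int(rb.ReferencedBeamNumber): float(rb.BeamMeterset)
    for rb in plan.FractionGroupSequence[0].ReferencedBeamSequence
}

# Hypothetical per-beam Monte Carlo results, each normalised to dose per
# monitor unit (in practice parsed from DOSXYZnrc output).
dose_per_mu = {beam: np.zeros((64, 64, 64)) for beam in monitor_units}

# Combine the beams: total dose = sum over beams of (dose/MU) x MU.
total_dose = sum(dose_per_mu[beam] * mu
                 for beam, mu in monitor_units.items())
```

The resulting 3D grid is what would then be compared against the treatment planning system calculation or the phantom measurements.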
Abstract:
The Clean Development Mechanism (CDM) has been praised for its ingenuity in mobilising finance to implement sustainable development practices in non-industrialised countries (known as Non-Annex 1 parties under the Kyoto Protocol). During the first commitment period of the Kyoto Protocol (2008-2012), a large number of clean development mechanism projects were registered with the CDM board. In addition to the large number of registered CDM projects, there are significant numbers of proposed projects stalled in implementation due to the cumbersome and lengthy CDM approval process. Despite this regulatory criticism, it is recognised that the role performed by the CDM is essential for achieving a significant reduction in global greenhouse gas emissions. This is because the CDM funds sustainable development in countries that lack the capacity to do so on their own. It is anticipated that some form of CDM instrument will continue beyond the 2012 timeframe and that reform of the mechanism will focus on making its approval and implementation processes faster and more efficient.
Abstract:
The aim of the present study was to determine the effect of carbohydrate (CHO; sucrose) ingestion and environmental heat on the development of fatigue and the distribution of power output during a 16.1-km cycling time trial. Ten male cyclists (VO2max = 61.7 ± 5.0 ml·kg^-1·min^-1, mean ± SD) performed four 90-min constant-pace cycling trials at 80% of second ventilatory threshold (220 ± 12 W). Trials were conducted in temperate (18.1 ± 0.4 °C) or hot (32.2 ± 0.7 °C) conditions during which subjects ingested either CHO (0.96 g·kg^-1·h^-1) or placebo (PLA) gels. All trials were followed by a 16.1-km time trial. Before and immediately after exercise, percent muscle activation was determined using superimposed electrical stimulation. Power output, integrated electromyography (iEMG) of vastus lateralis, rectal temperature, and skin temperature were recorded throughout the trial. Percent muscle activation significantly declined during the CHO and PLA trials in hot (6.0 and 6.9%, respectively) but not temperate conditions (1.9 and 2.2%, respectively). The decline in power output during the first 6 km was significantly greater during exercise in the heat. iEMG correlated significantly with power output during the CHO trials in hot and temperate conditions (r = 0.93 and 0.73; P < 0.05) but not during either PLA trial. In conclusion, cyclists tended to self-select an aggressive pacing strategy (initial high intensity) in the heat.
Abstract:
The field of destination image has been widely discussed in the destination literature since the early 1970s (see Mayo, 1973). However, the extent to which travel context impacts on an individual's destination image evaluation, and therefore destination choice, has received scant attention (Hu & Ritchie, 1993). This study, utilising expectancy-value theory, sought to elicit salient destination attributes from consumers across two travel contexts: short-break holidays and longer getaways. Using the Repertory Test technique, attributes elicited as being salient for short-break holidays were consistent with those elicited for longer getaways. While this study was limited to Brisbane's near-home destinations, the results will be of interest to destination marketers and researchers interested in the challenge of positioning a destination in diverse markets.
Abstract:
The increasing use of computerized systems in our daily lives creates new adversarial opportunities, in which complex mechanisms are exploited, and drives the rapid development of new attacks. Behavioral biometrics appears to be one of the promising responses to these attacks. However, because it is a relatively new research area, specific frameworks for the evaluation and development of behavioral biometrics solutions are not yet available. In this paper we present a conception of a generic framework and runtime environment that will enable researchers to develop, evaluate and compare their behavioral biometrics solutions in repeatable experiments under the same conditions and with the same data.
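The paper's framework is only conceptual at this stage, so the sketch below is a hypothetical reading of what such a generic contract might look like (all names are illustrative, not taken from the paper): every solution implements the same plugin interface, and a shared evaluation loop replays identical trials so that experiments are repeatable on the same data.

```python
from abc import ABC, abstractmethod
from typing import Iterable, Tuple

class BehavioralBiometric(ABC):
    """Hypothetical plugin interface implemented by every solution under test."""

    @abstractmethod
    def enroll(self, user_id: str, samples: Iterable) -> None:
        """Build a template for a user from behavioral samples."""

    @abstractmethod
    def score(self, user_id: str, sample) -> float:
        """Return a match score between a sample and the user's template."""

def evaluate(system: BehavioralBiometric,
             trials: Iterable[Tuple[str, object, bool]],
             threshold: float) -> Tuple[float, float]:
    """Shared loop: compute false-acceptance and false-rejection rates from
    (claimed_user, sample, is_genuine) trials, identical for every solution
    so that results are directly comparable."""
    fa = fr = genuine = impostor = 0
    for user_id, sample, is_genuine in trials:
        accepted = system.score(user_id, sample) >= threshold
        if is_genuine:
            genuine += 1
            fr += int(not accepted)
        else:
            impostor += 1
            fa += int(accepted)
    return fa / max(impostor, 1), fr / max(genuine, 1)
```

Because the trial stream and threshold are fixed by the framework rather than by each solution, any two behavioral biometrics implementations can be compared under exactly the same conditions.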
Abstract:
Over the past several decades, policy has become increasingly global. In economics, for example, policy has followed the so-called Washington Consensus of privatization, liberalization, and deregulation. In education, global policy has included the proliferation of strategies including standardized testing, paraprofessional teachers, user fees, and privatization. There are many problems with these neoliberal policies. Foremost among them is the havoc they wreak on the lives of so many children and adults. Poverty, inequality, and myriad associated problems have reached new heights in this neoliberal era. Moreover, these policies have been adopted uncritically and alternative policies have been ignored, which leads to our focus here.
Abstract:
Teachers need professional development to keep current with teaching practices, although the costs of extensive professional development can be prohibitive across an education system. Mentoring provides one way of embedding cost-effective professional development. This mixed-method study includes surveying mentor teachers (n = 101) on a five-part Likert scale and interviews with experienced mentors (n = 10) to investigate professional development for mentors as a result of the mentoring process. Quantitative data were analysed through a pedagogical knowledge framework and qualitative data were collated into themes. Survey data showed that although mentoring of pedagogical knowledge was variable, the mentoring of pedagogical knowledge practices occurred with the majority of mentors, which requires mentors to evaluate and articulate teaching practices. Qualitative data showed that mentoring acted as professional development and led towards enhanced communication skills, the development of leadership roles (problem-solving and building capacity), and advanced pedagogical knowledge. Providing professional development to teachers on mentoring can help to build capacity in two ways: quality mentoring of preservice teachers through explicit mentoring practices, and reflecting on and deconstructing teaching practices for mentors' own pedagogical advancement.
Abstract:
The majority of current first year university students belong to Generation Y. Consequently, research suggests that, in order to more effectively engage them, their particular learning preferences should be acknowledged in the organisation of their learning environments and in the support provided. These preferences are reflected in the Torts Student Peer Mentor Program, which, as part of the undergraduate law degree at the Queensland University of Technology, utilises active learning, structured sessions and teamwork to supplement student understanding of the substantive law of Torts with the development of life-long skills. This article outlines the Program, and its relevance to the learning styles and experiences of Generation Y first year law students transitioning to university, in order to investigate student perceptions of its effectiveness – both generally and, more specifically, in terms of the Program’s capacity to assist students to develop academic and work-related skills.