785 results for engine performance


Relevance: 20.00%

Abstract:

The aftermath of the Queensland floods of January 2011 continues to be played out in the courts. The effect of the floods on such a large scale has awakened the use of some statutory provisions that have not previously been litigated. Section 64 of the Property Law Act 1974 (Qld) is such a section. A version of this provision appears as s 34 of the Sale of Land Act 1982 (Vic). Broadly speaking, these sections permit a buyer of a dwelling house which has been damaged or destroyed between contract and completion to rescind the contract and recover their deposit, provided that the rescission notice is given prior to "the date of completion or possession". The Court of Appeal decision in Dunworth v Mirvac Queensland Pty Ltd [2011] QCA 200 appears to be the first litigation on the application of the section since it came into force in 1975.

Relevance: 20.00%

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra, or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear algebra-centred algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure as well as their readability, while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
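The Kalman filtering benchmark mentioned above can be illustrated with a minimal sketch of the algorithm's structure. This is a scalar (one-dimensional) version written for illustration only; it is not the authors' R or C++ code, and all noise parameters and inputs are assumed.

```python
# Minimal scalar Kalman filter, illustrating the kind of algorithm
# benchmarked in the abstract. All parameter values are assumed for
# illustration; this is not the authors' implementation.

def kalman_filter(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Filter a sequence of noisy scalar measurements.

    q  -- assumed process noise variance
    r  -- assumed measurement noise variance
    x0 -- initial state estimate
    p0 -- initial estimate variance
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: random-walk state model, so the estimate carries
        # over and its uncertainty grows by the process noise.
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

noisy = [1.2, 0.9, 1.1, 1.05, 0.95, 1.0]
print(kalman_filter(noisy)[-1])  # converges toward the true value ~1.0
```

The matrix form benchmarked in the abstract replaces this scalar arithmetic with linear algebra (in R, or in C++ via Armadillo), but the predict/update structure is the same.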

Relevance: 20.00%

Abstract:

Since 2007, Kite Arts Education Program (KITE), based at Queensland Performing Arts Centre (QPAC), has been engaged in delivering a series of theatre-based experiences for children in low socio-economic primary schools in Queensland. The artist-in-residence (AIR) project titled Yonder includes performances developed by the children, with the support and leadership of teacher artists from KITE, for their community and parents/carers, supported by a peak community cultural institution. In 2009, Queensland Performing Arts Centre partnered with Queensland University of Technology (QUT) Creative Industries Faculty (Drama) to conduct a three-year evaluation of the Yonder project to understand the operational dynamics, artistic outputs, and educational benefits of the project. This paper outlines the research findings for children engaged in the Yonder project in the interrelated areas of literacy development and social competencies. Findings are drawn from six iterations of the project in suburban locations on the edge of Brisbane city and in regional Queensland.

Relevance: 20.00%

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to obtain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of an asset, while operating environment indicators accelerate or decelerate its lifetime. When these data are available, an alternative to traditional reliability analysis is to model condition indicators, operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model.

The literature review indicates that a number of covariate-based hazard models have been developed, all based on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully incorporate the three available types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) into a single model for more effective hazard and reliability prediction. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. Nevertheless, these non-homogeneous covariate data were modelled in the same way in the existing covariate-based hazard models. The related and yet more imperative question is how both types of indicator should be effectively modelled and integrated into a covariate-based hazard model.

This work presents a new approach to address these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators. Condition indicators provide information about the health condition of an asset; they therefore update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the hazard relative to the baseline. These indicators are caused by the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators may be nil in EHM, condition indicators are always present, because they are observed and measured for as long as an asset remains operational.

EHM has several advantages over the existing covariate-based hazard models. One is that it utilises three different sources of asset health data (population characteristics, condition indicators, and operating environment indicators) to predict hazard and reliability effectively. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators and their association with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. According to the sample size of failure/suspension times, EHM is developed in two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) for the baseline hazard. However, in many industrial applications, failure event data are sparse and their analysis often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive distributional assumption of the semi-parametric EHM, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in these two forms is another merit of the model.

A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models was appraised by comparing their estimates with those of the existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
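The proportional-hazards structure shared by the existing covariate-based models can be sketched as follows. This illustrates the PHM family that the abstract critiques, not EHM itself; the Weibull baseline parameters, covariate values, and regression coefficients are all assumed for illustration.

```python
import math

# Sketch of a covariate-based hazard model of the proportional-hazards
# (PHM) family: a Weibull baseline hazard scaled by an exponential
# covariate link. All parameter values are assumed for illustration.

def weibull_baseline(t, shape=2.0, scale=1000.0):
    """Baseline hazard h0(t) of a Weibull(shape, scale) lifetime."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def phm_hazard(t, covariates, betas):
    """h(t | z) = h0(t) * exp(beta . z)."""
    link = math.exp(sum(b * z for b, z in zip(betas, covariates)))
    return weibull_baseline(t) * link

def reliability(t, covariates, betas, steps=1000):
    """R(t) = exp(-integral of h), via a simple trapezoidal rule,
    assuming the covariates are held fixed over [0, t]."""
    dt = t / steps
    cum = 0.0
    for i in range(steps):
        a = phm_hazard(i * dt, covariates, betas)
        b = phm_hazard((i + 1) * dt, covariates, betas)
        cum += 0.5 * (a + b) * dt
    return math.exp(-cum)

# An elevated condition indicator (e.g. a vibration level, assumed
# here) raises the hazard and lowers predicted reliability at t = 500.
print(reliability(500.0, [0.0], [0.8]))   # nominal condition
print(reliability(500.0, [1.5], [0.8]))   # elevated condition
```

EHM departs from this structure by letting the baseline hazard itself depend on the condition indicators, rather than pushing all covariates through the exponential link.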

Relevance: 20.00%

Abstract:

Adolescent idiopathic scoliosis is a complex three-dimensional deformity affecting 2-3% of the general population. Resulting spine deformities include progressive coronal curvature, hypokyphosis or frank lordosis in the thoracic spine, and vertebral rotation in the axial plane with posterior elements turned into the curve concavity. The potential for curve progression is heightened during the adolescent growth spurt. Success of scoliosis deformity correction depends on solid bony fusion between adjacent vertebrae after the intervertebral discs have been surgically cleared and the disc spaces filled with graft material. Problems with bone graft harvest site morbidity as well as limited bone availability have led to the search for bone graft substitutes. Recently, a bioactive and resorbable scaffold fabricated from medical grade polycaprolactone (PCL) has been developed for bone regeneration at load bearing sites. Combined with recombinant human bone morphogenetic protein-2 (rhBMP-2), this has been shown to be successful as a bone graft substitute in a porcine lumbar interbody fusion model when compared to autologous bone graft. This in vivo sheep study intends to evaluate the suitability of a custom designed medical grade PCL scaffold in combination with rhBMP-2 as a bone graft substitute in the setting of mini-thoracotomy surgery, as a platform for ongoing research to benefit patients with adolescent idiopathic scoliosis.

Relevance: 20.00%

Abstract:

The determination of performance standards and assessment practices for student work placements is an essential and important task. Inappropriate, inadequate, or excessively complex assessment tasks can influence levels of student engagement and the quality of learning outcomes. Critical to determining appropriate standards and assessment tasks is an understanding and knowledge of key elements of the learning environment and the extent to which opportunities are provided for students to engage in critical reflection and judgement of their own performance in the context of the work environment. This paper focuses on the development of essential skills and knowledge (capabilities) that provide evidence of learning in work placements by describing an approach taken in the science and technology disciplines. Assessment matrices are presented to illustrate a method of assessment for use within the context of the learning environment centred on work placements in science and technology. This study contributes to the debate on the meaning of professional capability, performance standards, and assessment practices in work placement programs by providing evidence of an approach that can be adapted by other programs to achieve similar benefits. The approach may also be valuable in other learning contexts where capability and performance are being judged in situations outside a controlled teaching and learning environment, i.e. in other life-wide learning contexts.

Relevance: 20.00%

Abstract:

Return side streams from anaerobic digesters and dewatering facilities at wastewater treatment plants (WWTPs) contribute a significant proportion of the total nitrogen load on a mainstream process. Similarly, significant phosphate loads are recirculated in biological nutrient removal (BNR) wastewater treatment plants. Ion exchange using a new material, known as MesoLite, shows strong potential for the removal of ammonia from these side streams and an opportunity to concurrently reduce phosphate levels. A pilot plant was designed and operated for several months on an ammonia-rich centrate from a dewatering centrifuge at the Oxley Creek WWTP, Brisbane, Australia. The system operated with a detention time in the order of one hour and was run for between 12 and 24 hours prior to regeneration with a sodium-rich solution. The same pilot plant was used to demonstrate removal of phosphate from an abattoir wastewater stream at similar flow rates. Using MesoLite materials, >90% reduction of ammonia was achieved in the centrate side stream. A full-scale process would reduce the total nitrogen load at the Oxley Creek WWTP by at least 18%. This reduction in nitrogen load consequently improves the TKN/COD ratio of the influent and enhances the nitrogen removal performance of the biological nutrient removal process.
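The plant-wide figure follows from simple mass-balance arithmetic. The side-stream share of the total nitrogen load used below is an assumed value, chosen only to show how a figure like the quoted one arises; it is not reported in the abstract.

```python
# Back-of-envelope mass balance: if the centrate side stream carries
# an (assumed) 20% of the plant's total nitrogen load, and the ion
# exchange process removes 90% of its ammonia, the plant-wide load
# drops by 0.20 * 0.90 = 18%, consistent with the "at least 18%"
# reduction quoted for the Oxley Creek WWTP.

side_stream_share = 0.20   # assumed fraction of total N load in centrate
removal_efficiency = 0.90  # ">90% reduction of ammonia" from the pilot

plant_wide_reduction = side_stream_share * removal_efficiency
print(f"{plant_wide_reduction:.0%}")  # prints "18%"
```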

Relevance: 20.00%

Abstract:

Stories by children’s writer Dr. Seuss have often been utilised as non-traditional narrative reflections regarding the issues of ethics and morality (Greenwood, 2000). Such case studies are viewed as effective teaching and learning tools due to the associated analytical and decision-making frameworks that are represented within the texts, and focus upon the exploration of universally general virtues and approaches to ethics (Hankes, 2012). Whilst Dr. Seuss did not create a story directly related to the sport, exercise or performance domains, many of his narratives possess psychological implications that are applicable in any situation that requires ethical consideration of the thinking and choices people make. The following exploration of the ‘ethical places you’ll go’ draws upon references to his work as a guide to navigating this interesting and sometimes challenging landscape for sport, exercise, and performance psychologists (SEPP).

Relevance: 20.00%

Abstract:

Sport and exercise psychologists are often sought after to apply their knowledge, skills and experience from a sporting context into other performance-related industries and endeavours. Over the past two decades, this has noticeably expanded out from a natural progression into the performing arts with other ‘typical’ performers (e.g., dancers, actors, musicians, singers) through to people who work in high pressure environments that consist of clear performance outputs and requirements that are usually linked to high impact consequences for non-achievement (e.g., lawyers, surgeons, executives, military personnel, safety professionals). Whilst these areas of application continue to increase in popularity and performance psychology is more readily recognised as an important factor in people performance across industries, the use of psychology within the performing arts continues to deepen and solidify its value as an essential and critical factor for success. This article focuses on the contribution of psychology to the performing arts that I have observed over more than 20 years – obtained through a variety of roles primarily within the dance sector including as performer, educator, health professional, researcher, commentator and senior leader.

Relevance: 20.00%

Abstract:

Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students vs. the top performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model, and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom's Taxonomy as the mastery scale. We then import the itemised exam grades from 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each of these. Key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
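The kind of mapping the conceptual model captures can be sketched as follows. The topic names, Bloom's-taxonomy levels, and scores below are invented for illustration and do not come from the study's data or its web-based implementation.

```python
# Minimal sketch of the mapping described above: itemised exam results
# tagged with a syllabus topic and a Bloom's-taxonomy mastery level,
# compared against the intended learning specification per topic.
# All topic names, levels, and scores are assumed for illustration.

BLOOM = ["remember", "understand", "apply", "analyse", "evaluate", "create"]

# Intended learning specification: topic -> expected Bloom level.
expected = {"loops": "apply", "recursion": "analyse"}

# Itemised exam results: (topic, assessed Bloom level, mean score 0..1).
items = [
    ("loops", "apply", 0.82),
    ("loops", "understand", 0.91),
    ("recursion", "apply", 0.55),
]

def gaps(expected, items):
    """Topics where no exam item assessed the topic at (or above) the
    expected Bloom level -- a specification/assessment mismatch."""
    out = []
    for topic, level in expected.items():
        target = BLOOM.index(level)
        assessed = [BLOOM.index(lv) for t, lv, _ in items if t == topic]
        if not assessed or max(assessed) < target:
            out.append(topic)
    return out

print(gaps(expected, items))
```

Here the comparison flags that recursion was never assessed at the intended "analyse" level, which is the style of reflection the model is designed to support.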

Relevance: 20.00%

Abstract:

Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
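The monitor-unit combination step described above can be sketched as follows. The beam grids and MU values are invented for illustration, and this sketch is not MCDTK's implementation (which is written in Java and operates on DOSXYZnrc output).

```python
# Sketch of the plan-combination step: per-beam Monte Carlo results
# (dose per monitor unit on a voxel grid) are scaled by each beam's
# planned monitor units and summed into a single dose distribution.
# Beam data and MU values below are assumed for illustration.

def combine_beams(dose_per_mu, monitor_units):
    """Sum per-beam dose grids weighted by their monitor units.

    dose_per_mu   -- list of same-shaped nested-list voxel grids (dose/MU)
    monitor_units -- planned monitor units per beam, in the same order
    """
    def zeros_like(grid):
        if isinstance(grid, list):
            return [zeros_like(g) for g in grid]
        return 0.0

    def scale_add(total, grid, mu):
        # Recurse through the nested lists, accumulating grid * mu.
        if isinstance(grid, list):
            return [scale_add(t, g, mu) for t, g in zip(total, grid)]
        return total + grid * mu

    total = zeros_like(dose_per_mu[0])
    for grid, mu in zip(dose_per_mu, monitor_units):
        total = scale_add(total, grid, mu)
    return total

# Two tiny 2x2 "slices" standing in for full 3D dose grids.
beam_a = [[0.01, 0.02], [0.00, 0.01]]   # dose per MU
beam_b = [[0.00, 0.01], [0.02, 0.01]]
print(combine_beams([beam_a, beam_b], [100, 50]))
```

In practice the grids would be full 3D arrays read from DOSXYZnrc output, but the weighting by planned monitor units is the same element-wise operation.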