115 results for Measurement of performance


Abstract:

Adolescent idiopathic scoliosis (AIS) is the most common form of spinal deformity in paediatrics, with a prevalence of approximately 2–4% in the general population. While it is a complex three-dimensional deformity, it is clinically characterised by an abnormal lateral curvature of the spine. The treatment for severe deformity is surgical correction with the use of structural implants. Anterior single rod correction employs a solid rod connected to the anterior spine via vertebral body screws; correction is achieved by applying compression between adjacent vertebral body screws before locking each screw onto the rod. Biomechanical complication rates have been reported to be as high as 20.8%, and include rod breakage, screw pull-out and loss of correction. Currently, the corrective forces applied to the spine are unknown, yet they are important variables in understanding the biomechanics of scoliosis correction. The purpose of this study was to measure these forces intra-operatively during anterior single rod AIS correction.

Abstract:

The measurement of broadband ultrasonic attenuation (BUA) in cancellous bone at the calcaneus was first described in 1984. The assessment of osteoporosis by BUA has recently been recognized by Universities UK, within its EurekaUK book, as one of the “100 discoveries and developments in UK Universities that have changed the world” over the past 50 years, covering the whole academic spectrum from the arts and humanities to science and technology. Indeed, the BUA technique has been clinically validated and is utilized worldwide, with at least seven commercial systems providing calcaneal BUA measurement. However, a fundamental understanding of the dependence of BUA upon the material and structural properties of cancellous bone is still lacking. This review aims to provide a science- and technology-orientated perspective on the application of BUA to osteoporosis.
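The computation itself is not spelled out above, but BUA is conventionally estimated as the slope of a linear fit to the frequency-dependent attenuation of the heel relative to a reference medium (typically water), over a band where attenuation is roughly linear in frequency. A minimal Python sketch under those assumptions; the function name, band limits and spectra variables are illustrative:

```python
import numpy as np

def bua_slope(freq_hz, ref_spectrum, sample_spectrum,
              f_lo=0.2e6, f_hi=0.6e6):
    """BUA (dB/MHz): slope of attenuation vs frequency over [f_lo, f_hi].

    ref_spectrum / sample_spectrum: amplitude spectra of a pulse through
    the reference medium (e.g. water) and through the heel, respectively.
    """
    # Insertion loss in dB, relative to the reference medium.
    atten_db = 20.0 * np.log10(ref_spectrum / sample_spectrum)
    band = (freq_hz >= f_lo) & (freq_hz <= f_hi)
    # Linear fit over the band; frequency converted Hz -> MHz.
    slope_db_per_mhz, _ = np.polyfit(freq_hz[band] / 1e6, atten_db[band], 1)
    return slope_db_per_mhz
```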

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries.

This research explores the viability of a universal conception of SQ, primarily through a careful re-examination of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model, SERVQUAL, which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to succeed SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as both multi-dimensional and multi-level, a hierarchical approach to SQ measurement that better reflects how perceptions are formed. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and that addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating it.

The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision.
Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?

Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level, one encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’, so as to better encompass both physical and virtual settings (e.g., websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work is needed to define the sub-dimensions more precisely. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively; that is, customers evaluate each primary dimension (each higher level of SQ classification) on the basis of the corresponding sub-dimension. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory and a starting point for measuring SQ, and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study.

Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objectives of the study. Where the objective is an overall evaluation of SQ, the perceptions-only approach is more appropriate, being more straightforward and carrying less administrative overhead. Where the objective is to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
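Neither model is specified computationally in the abstract, but the contrast between the two measurement approaches, and the hierarchical scoring the B&C (2001) model implies, can be illustrated as follows. The sub-dimension labels follow Brady and Cronin (2001); the equal-weight averaging, function names and data layout are illustrative assumptions, not part of either model:

```python
import numpy as np

# Primary dimensions and sub-dimensions after Brady and Cronin (2001);
# equal weighting within and across dimensions is an assumption.
HIERARCHY = {
    "interaction quality": ["attitude", "behaviour", "expertise"],
    "environment quality": ["ambient conditions", "design", "social factors"],
    "outcome quality": ["waiting time", "tangibles", "valence"],
}

def perceptions_only_score(perceptions):
    """Overall SQ as the mean of primary-dimension means, where each
    primary dimension is the mean of its sub-dimension ratings."""
    primary_means = [np.mean([perceptions[s] for s in subs])
                     for subs in HIERARCHY.values()]
    return float(np.mean(primary_means))

def disconfirmation_gaps(perceptions, expectations):
    """SERVQUAL-style gap scores (perception minus expectation);
    negative gaps flag sub-dimensions falling short of expectations."""
    return {s: perceptions[s] - expectations[s]
            for subs in HIERARCHY.values() for s in subs}
```

The sketch mirrors the trade-off noted above: `perceptions_only_score` needs a single survey pass and yields one overall figure, while `disconfirmation_gaps` needs both expectation and perception ratings but localises shortfalls to individual sub-dimensions.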

Abstract:

INTRODUCTION
In their target article, Yuri Hanin and Muza Hanina outlined a novel multidisciplinary approach to performance optimisation for sport psychologists called the Identification-Control-Correction (ICC) programme. According to the authors, this empirically-verified, psycho-pedagogical strategy is designed to improve the quality of coaching and the consistency of performance in highly skilled athletes, and involves a number of steps: (i) identifying and increasing self-awareness of ‘optimal’ and ‘non-optimal’ movement patterns for individual athletes; (ii) learning to deliberately control the process of task execution; and (iii) correcting habitual and random errors and managing radical changes of movement patterns. Although no specific examples were provided, the ICC programme has apparently been successful in enhancing the performance of Olympic-level athletes. In this commentary, we address what we consider to be some important issues arising from the target article. We specifically focus attention on the contentious topic of optimisation in neurobiological movement systems, the role of constraints in shaping emergent movement patterns, and the functional role of movement variability in producing stable performance outcomes. In our view, the target article and, indeed, the proposed ICC programme would benefit from a dynamical systems theoretical backdrop rather than the cognitive scientific approach that appears to be advocated. Although Hanin and Hanina made reference to, and attempted to integrate, constructs typically associated with dynamical systems accounts of motor control and learning (e.g., Bernstein’s problem, movement variability), these ideas required more detailed elaboration, which we provide in this commentary.

Abstract:

The effects of particulate matter on the environment and public health have been widely studied in recent years. A number of studies in the medical field have tried to identify the specific effects of particulate exposure on human health, but agreement amongst these studies on the relative importance of particle size and origin with respect to health effects is still lacking. Nevertheless, air quality standards, like epidemiological attention, are moving towards a greater focus on smaller particles. Current air quality standards only regulate the mass of particulate matter less than 10 μm in aerodynamic diameter (PM10) and less than 2.5 μm (PM2.5). The most reliable method for measuring Total Suspended Particles (TSP), PM10, PM2.5 and PM1 is the gravimetric method, since it directly measures PM concentration and guarantees effective traceability to international standards. This technique, however, precludes correlating particle levels with short-term, intra-day variations in the atmospheric parameters that influence ambient particle concentration and size distribution (emission strengths of particle sources, temperature, relative humidity, wind direction and speed, and mixing height), as well as with human activity patterns that may also vary over time periods considerably shorter than 24 hours. A continuous method for measuring the number size distribution and total number concentration in the range 0.014–20 μm is the tandem system formed by a Scanning Mobility Particle Sizer (SMPS) and an Aerodynamic Particle Sizer (APS). In this paper, an uncertainty budget model for the measurement of airborne particle number, surface area and mass size distributions is proposed and applied to several typical aerosol size distributions. Estimating such an uncertainty budget presents several difficulties due to (i) the complexity of the measurement chain and (ii) the fact that the SMPS and APS can properly guarantee traceability to the International System of Units only in terms of number concentration; the surface area and mass concentrations must be estimated on the basis of a separately determined average density and particle morphology.
Keywords: SMPS-APS tandem system, gravimetric reference method, uncertainty budget, ultrafine particles.
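The paper's conversion procedure is not detailed here, but the estimation of surface-area and mass distributions from the measured number distribution, under the stated assumptions of an average density and spherical particle morphology, can be sketched as follows; the function name, units and the default density are illustrative:

```python
import numpy as np

def number_to_surface_and_mass(diam_um, dN_cm3, density_g_cm3=1.5):
    """Convert a number size distribution (particles/cm^3 per size bin)
    to surface-area (cm^2/cm^3) and mass (g/cm^3) distributions,
    assuming spherical particles of uniform average density."""
    d_cm = np.asarray(diam_um) * 1e-4                     # um -> cm
    dS = dN_cm3 * np.pi * d_cm**2                         # sphere area: pi*d^2
    dM = dN_cm3 * density_g_cm3 * np.pi * d_cm**3 / 6.0   # sphere volume: pi*d^3/6
    return dS, dM
```

Because the surface and mass terms scale with d² and d³, uncertainties in the assumed density and morphology are amplified for the larger size bins, which is one reason the traceability of these derived quantities is weaker than that of the number concentration itself.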

Abstract:

Patients with severe back deformities can benefit greatly from customized medical seating, which is made by taking measurements of each individual patient and fabricating the seat to those measurements. The measuring systems currently employed by the industry are limited to use in clinics, which are generally located only in major population centres. Patients living in remote areas are severely affected by this, as the clinics may be far away and inaccessible. Providing customized medical seating to these patients requires a new measurement system that is portable, so that it can be transported to patients in remote areas. The requirements for a new measurement system were analysed to suit the needs of Equipment Technology Services (ETS) of the Cerebral Palsy League of Queensland. A design for the new measurement system was conceptualised by reviewing systems and technologies across various scientific disciplines, and finalised by optimising each individual component. The final approach was validated by measuring difficult models and repeating the process to check for process variances. The system has now been adopted for clinical evaluation by ETS, and suggestions have been made for further improvements to this new measurement approach.

Abstract:

We aimed to investigate the naturally occurring horizontal-plane movements of a head stabilized in a standard ophthalmic headrest and to analyze their magnitude, velocity, spectral characteristics, and correlation with the cardiopulmonary system. Two custom-made, air-coupled, highly accurate (±2 μm) ultrasound transducers were used to measure the displacements of the head in different horizontal directions at a sampling frequency of 100 Hz. An electrocardiogram (ECG) was recorded synchronously with the head movements. Three healthy subjects participated in the study. Frequency analysis of the recorded head movements and their velocities was carried out, and coherence functions between the two displacements and the ECG signal were calculated. The frequencies of respiration and the heartbeat were clearly visible in all recorded head movements. The amplitude of head displacements was typically in the range of ±100 μm. The first harmonic of the heartbeat (in the range of 2–3 Hz), rather than its principal frequency, was found to be the dominant frequency of both the head movements and their velocities. Coherence analysis showed high interdependence between the considered signals for frequencies of up to 20 Hz. These findings may contribute to the design of better ophthalmic headrests and should help other studies decide whether to use a heavy headrest or a bite bar.
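The abstract does not state which estimator was used for the coherence functions, but magnitude-squared coherence is commonly computed from Welch-averaged spectra; a minimal sketch using SciPy, with the segment length and names purely illustrative:

```python
from scipy.signal import coherence

def displacement_ecg_coherence(displacement, ecg, fs=100.0, seg_s=10.0):
    """Magnitude-squared coherence between a head-displacement channel
    and the synchronously sampled ECG. Values near 1 at a frequency
    indicate strong linear interdependence between the two signals."""
    f, cxy = coherence(displacement, ecg, fs=fs, nperseg=int(seg_s * fs))
    return f, cxy  # inspect cxy for f <= 20 Hz, the band reported above
```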

Abstract:

Purpose: In 1970, Enright observed a distortion of perceived driving speed induced by monocular application of a neutral density (ND) filter. If a driver looks out of the right side of a vehicle with a filter over the right eye, the driver perceives a reduction of the vehicle’s apparent velocity, while applying an ND filter over the left eye increases the vehicle’s apparent velocity. The purpose of the current study was to provide the first empirical measurements of the Enright phenomenon. Methods: Ten experienced drivers drove an automatic sedan on a closed road circuit. Filters (0.9 ND) were placed over the left eye, the right eye or both eyes during a driving run, in addition to a control condition with no filters in place. Subjects were asked to look out of the right side of the car and adjust their driving speed to either 40 km/h or 60 km/h. Results: Without a filter, or with both eyes filtered, subjects estimated speed well when asked to travel at 60 km/h but travelled a mean of 12 to 14 km/h faster than the requested 40 km/h. With the filter over their right eye, subjects travelled faster than these baselines by a mean of 7 to 9 km/h (p < 0.001); with the filter over their left eye, they travelled 3 to 5 km/h slower (p < 0.05). Conclusions: The Enright phenomenon causes significant and measurable distortions of perceived driving speed under real-world driving conditions.

Abstract:

Over recent years, many scholars have studied the conceptual modeling of information systems based on a theory of ontological expressiveness. This theory offers four constructs that describe properties of modeling grammars in the form of ontological deficiencies, together with their implications for the development and use of conceptual modeling in IS practice. In this paper we report on the development of a valid and reliable instrument for measuring the perceptions that individuals have of the ontological deficiencies of conceptual modeling grammars. We describe a multi-stage approach to instrument development that incorporates feedback from expert and user panels. We also report on a field test of the instrument with 590 modeling practitioners. We further study how different levels of modeling experience influence user perceptions of the ontological deficiencies of modeling grammars. We provide implications for practice and future research.
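The paper's validation statistics are not reproduced here, but a conventional step in establishing that a perception instrument is reliable is an internal-consistency check such as Cronbach's alpha; a minimal sketch, with the function name and data layout illustrative rather than drawn from the paper:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_var_sum / total_var)
```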

Abstract:

It is widely held that strong relationships exist between housing, economic status, and well-being, as exemplified by widespread housing stock surpluses in many countries which threaten to destabilise individual and community welfare. However, the balance of housing demand and supply is not consistent across countries. The Australian position provides a distinct contrast, whereby seemingly inexorable housing demand remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of immigration, and further buoyed by sustained, historically low interest rates, rising income levels, and increased government assistance for first home buyers, this strong housing demand ensures that elements related to housing affordability continue to gain prominence. A significant but less visible factor impacting housing affordability, particularly new housing development, relates to holding costs. These costs are in many ways “hidden” and cannot always be easily identified. Although they are only one contributor, the nature and extent of their impact require elucidation. In its simplest form, the calculation is of the interest or opportunity cost of holding land; significantly more complexity arises for major new developments, particularly greenfield property development. Preliminary analysis conducted by the author suggests that even small shifts in the primary factors impacting holding costs can appreciably affect housing affordability, and notably to a greater extent than commonly held. Their importance and perceived high-level impact can be gauged from the unprecedented attention policy makers have given them over recent years, evidenced by the embedding of specific strategies to address burgeoning holding costs (particularly the cost savings associated with streamlining regulatory assessment) within statutory instruments such as the Queensland Housing Affordability Strategy and the South East Queensland Regional Plan.

However, several key issues require investigation. Firstly, the computation and methodology behind the calculation of holding costs vary widely; in some instances holding costs are ignored entirely. Secondly, some ambiguity exists over which elements of holding costs to include, thereby affecting the assessment of their relative contribution. This may in part be explained by their nature: such costs are not always immediately apparent, and are less visible than the more tangible cost items associated with greenfield development such as regulatory fees, government taxes, acquisition costs, selling fees and commissions. Holding costs are also more difficult to evaluate since, for the most part, they must be assessed over time in an ever-changing environment, based on their strong relationship with opportunity cost, which is in turn dependent, inter alia, upon prevailing inflation and/or interest rates. By extending research in the general area of housing affordability, this thesis provides a more detailed investigation of the elements related to holding costs and, in so doing, determines the size of their impact on the end user. This involves the development of soundly based economic and econometric models which seek to clarify the component impacts of holding costs.
Ultimately, there are significant policy implications for the framework used in Australian jurisdictions to promote, retain, or otherwise maximise the opportunities for affordable housing.
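The thesis's econometric models are not given in the abstract, but the "simplest form" it describes, the interest or opportunity cost of holding land, can be sketched as a compounded forgone return; the function name, rate and values below are purely illustrative:

```python
def land_holding_cost(land_value, annual_rate, years):
    """Opportunity cost of holding land: the compounded return forgone on
    the capital tied up, before regulatory or other carrying charges."""
    return land_value * ((1.0 + annual_rate) ** years - 1.0)

# A one-year increase in the holding period (e.g. through slower regulatory
# assessment) raises the cost appreciably, consistent with the sensitivity
# the abstract describes.
print(land_holding_cost(500_000, 0.07, 2.0))  # ~72,450
print(land_holding_cost(500_000, 0.07, 3.0))  # ~112,522
```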
