967 results for Measuring instruments.
Abstract:
At first glance the built environments of South Florida and South East Queensland appear very similar, particularly along the highly urbanized coast. However, this apparent similarity belies some fundamental differences between the two regions in terms of context and the approach to regulating development. This paper describes some of these key differences, but focuses on two research questions: 1) do these differences affect the built environment; and 2) if so, how does the built form differ? There has been considerable research on how best to measure urban form, particularly as it relates to measuring urban sprawl (Schwarz 2010; Clifton et al. 2008). Some of the key questions identified by this research include: what are the best variables to use? what scale should be used? and what time period should be used? We assimilate this research in order to develop a methodology for measuring urban form and apply it to both case study regions. There are two main potential outcomes from this research: one is that the built form of the two regions is quite different; the second is that it is similar. The first outcome is what might be expected given the differences in context and development regulation. But how might the second outcome be explained, with major differences in context and development regulation resulting in only minor differences in key measures of urban form? One explanation is that differences in the way development is regulated are less important in determining the built form than private market forces.
Abstract:
Volume measurements are useful in many branches of science and medicine. They are usually accomplished by acquiring a sequence of cross-sectional images through the object using an appropriate scanning modality, for example x-ray computed tomography (CT), magnetic resonance (MR) or ultrasound (US). In the cases of CT and MR, a dividing cubes algorithm can be used to describe the surface as a triangle mesh. However, such algorithms are not suitable for US data, especially when the image sequence is multiplanar (as it usually is). This problem may be overcome by manually tracing regions of interest (ROIs) on the registered multiplanar images and connecting the points into a triangular mesh. In this paper we describe and evaluate a new discrete form of Gauss’ theorem which enables the calculation of the volume of any enclosed surface described by a triangular mesh. The volume is calculated by summing the vector product of the centroid, area and normal of each surface triangle. The algorithm was tested on computer-generated objects, US-scanned balloons, livers and kidneys and CT-scanned clay rocks. The results, expressed as the mean percentage difference ± one standard deviation, were 1.2 ± 2.3, 5.5 ± 4.7, 3.0 ± 3.2 and −1.2 ± 3.2% for balloons, livers, kidneys and rocks respectively. The results compare favourably with other volume estimation methods such as planimetry and tetrahedral decomposition.
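The discrete Gauss (divergence) theorem described above can be sketched as follows; this is a minimal illustration on a unit cube, not the paper's implementation or data:

```python
import numpy as np

def mesh_volume(vertices, triangles):
    """Volume of a closed, consistently outward-oriented triangle mesh via a
    discrete form of Gauss' (divergence) theorem: for each triangle the edge
    cross product gives twice the area-weighted outward normal, and its dot
    product with the triangle centroid, summed over the mesh, is six times
    the enclosed volume."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for i, j, k in triangles:
        a, b, c = v[i], v[j], v[k]
        centroid = (a + b + c) / 3.0
        cross = np.cross(b - a, c - a)   # 2 * area * unit outward normal
        total += np.dot(centroid, cross)
    return total / 6.0

# Unit cube (vertex index = 4x + 2y + z), 12 outward-oriented triangles
verts = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
tris = [(0, 1, 3), (0, 3, 2), (4, 7, 5), (4, 6, 7),
        (0, 4, 5), (0, 5, 1), (2, 7, 6), (2, 3, 7),
        (0, 6, 4), (0, 2, 6), (1, 5, 7), (1, 7, 3)]
print(mesh_volume(verts, tris))   # 1.0
```

The orientation convention matters: if some triangles wind inward, their signed contributions partially cancel and the volume is underestimated.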
Abstract:
This note examines the productive efficiency of 62 starting guards during the 2011/12 National Basketball Association (NBA) season. This period coincides with the phenomenal and largely unanticipated performance of New York Knicks’ starting point guard Jeremy Lin and the attendant public and media hype known as Linsanity. We employ a data envelopment analysis (DEA) approach that includes allowance for an undesirable output, here turnovers per game, with the desirable outputs of points, rebounds, assists, steals, and blocks per game and an input of minutes per game. The results indicate that, depending upon the specification, between 29 and 42 percent of NBA guards are fully efficient, including Jeremy Lin, with mean inefficiencies of 3.7 and 19.2 percent respectively. However, while Jeremy Lin is technically efficient, he seldom serves as a benchmark for inefficient players, at least when compared with established players such as Chris Paul and Dwyane Wade. This suggests the uniqueness of Jeremy Lin’s productive solution and may explain why his unique style of play, encompassing individual brilliance, unselfish play, and team leadership, is of such broad public appeal.
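The core DEA computation is a small linear program per player. The sketch below is a standard input-oriented CCR model, not the paper's exact specification; the player numbers are hypothetical, and the undesirable output (turnovers) is folded in as an input, which is one common transformation:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(inputs, outputs, dmu):
    """Input-oriented CCR (constant returns) DEA efficiency for one unit.

    inputs:  (n_units, n_inputs) array; outputs: (n_units, n_outputs) array.
    Solves: min theta  s.t.  sum_j lam_j * x_j <= theta * x_dmu,
                             sum_j lam_j * y_j >= y_dmu,  lam >= 0.
    Returns theta in (0, 1]; theta == 1 means technically efficient.
    """
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[[dmu]].T, X.T])         # lam.X - theta * x_dmu <= 0
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # -lam.Y <= -y_dmu
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(X.shape[1]), -Y[dmu]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Hypothetical guards: inputs = minutes and turnovers per game;
# outputs = points and assists per game.  Numbers are illustrative only.
X = [[36.0, 3.6], [34.0, 2.2], [35.0, 4.0]]
Y = [[14.6, 6.2], [19.8, 9.1], [16.9, 4.6]]
scores = [dea_ccr_input(X, Y, j) for j in range(3)]
```

The optimal weights (`lam`) identify which efficient players serve as the benchmark peers for an inefficient player, which is the sense in which the note compares Lin against Paul and Wade.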
Abstract:
Background: Antibiotic overuse is a global public health issue that is influenced by several factors. The degree and prevalence of antibiotic overuse is difficult to measure directly. A more practical approach, such as the use of a psycho-social measurement instrument, might allow for the observation and assessment of patterns of antibiotic use. Study objective: The aim of this paper is to review the nature, validity, and reliability of measurement scales designed to measure factors associated with antibiotic misuse/overuse. Design: This study is descriptive and includes a systematic integration of the measurement scales used in the literature to measure factors associated with antibiotic misuse/overuse. The review included 70 international scientific publications from 1992 to 2010. Main results: Studies have presented scales to measure antibiotic misuse. However, the development of these instruments is often not described, or the scales are used with only early-phase validation, such as content or face validity. Other studies have discussed the reliability of these scales. However, the full validation process has not been discussed for any of the reviewed measurement scales. Conclusion: A reliable, fully validated measurement scale must be developed to assess the factors associated with the overuse of antibiotics. Identifying these factors will help to minimize the misuse of antibiotics.
Abstract:
Enterprise architecture management (EAM) has become an intensively discussed approach to managing enterprise transformations. While many organizations employ EAM, notable uncertainty about the value of EAM remains. In this paper, we propose a model to measure the realization of benefits from EAM. We identify EAM success factors and EAM benefits through a comprehensive literature review and eleven explorative expert interviews. Based on our findings, we integrate the EAM success factors and benefits with the established DeLone & McLean IS success model, resulting in a model that explains the realization of EAM benefits. This model serves organizations as a benchmark and framework for identifying and assessing the setup of their EAM initiatives and for determining whether and how EAM benefits materialize. We also see our model as a first step toward gaining insight into, and starting a discussion on, the theory of EAM benefit realization.
Abstract:
Work integrated learning (WIL) or professional practice units are recognised as providing learning experiences that help students make successful transitions to professional practice. These units require students to engage in learning in the workplace; to reflect on this learning; and to integrate it with learning at university. However, an analysis of a recent cohort of property economics students at a large urban university provides evidence that there is great variation in the work based learning experiences undertaken, and that this impacts on students’ capacity to respond to assessment tasks which involve critiquing these experiences in the form of reflective reports. This paper highlights the need to recognise the diversity of work based experiences, the impact this has on learning outcomes, and to find more effective and equitable ways of measuring these outcomes. The paper briefly discusses assessing learning outcomes in WIL and then describes the model of WIL in the Faculty of Built Environment and Engineering at the Queensland University of Technology (QUT). The paper elaborates on the diversity of students’ experiences and backgrounds, including variations in the length of work experience, placement opportunities and conditions of employment. For example, the analysis shows that students with limited work experience often have difficulty critiquing this work experience and producing high level reflective reports. On the other hand, students with extensive, discipline relevant work experience can be frustrated by assessment requirements that do not take their experience into account. Added to this, the Global Financial Crisis (GFC) has restricted both part time and full time placement opportunities for some students.
These factors affect students’ capacity to a) secure a relevant work experience, b) reflect critically on the work experiences and c) appreciate the impact the overall experience can have on their learning outcomes and future professional opportunities. Our investigation highlights some of the challenges faced in implementing effective and equitable approaches across diverse student cohorts. We suggest that increased flexibility in assessment requirements and increased feedback from industry may help address these challenges.
Abstract:
Aim: Whilst motorcycle rider training is commonly incorporated into licensing programs in many developed nations, little empirical support has been found in previous research to prescribe it as an effective road safety countermeasure. It has been posited that the lack of effect of motorcycle rider training on crash reduction may, in part, be due to the predominant focus on skills-based training with little attention devoted to addressing attitudes and motives that influence subsequent risky riding. However, little past research has actually endeavoured to measure attitudinal and motivational factors as a function of rider training. Accordingly, this study was undertaken to assess the effect of a commercial motorcycle rider training program on psychosocial factors that have been shown to influence risk taking by motorcyclists. Method: Four hundred and thirty-eight motorcycle riders attending a competency-based licence training course in Brisbane, Australia, voluntarily participated in the study. A self-report questionnaire adapted from the Rider Risk Assessment Measure (RRAM) was administered to participants at the commencement of training, then again at the conclusion of training. Participants were informed of the independent nature of the research and that their responses would in no way affect their chance of obtaining a licence. To minimise potential demand characteristics, participants were instructed to seal completed questionnaires in envelopes and place them in a sealed box accessible only by the research team (i.e. not able to be viewed by instructors). Results: Significant reductions in the propensity for thrill seeking and in intentions to engage in risky riding in the next 12 months were found at the end of training. In addition, a significant improvement in attitudes to safety was found. Conclusions: These findings indicate that rider training may have a positive short-term influence on riders’ propensity for risk taking.
However, such findings must be interpreted with caution in regard to the subsequent safety of riders, as these factors may be subject to further influence once riders are licensed and actively engage with peers during on-road riding. This highlights a challenge for road safety education/training programs in regard to the adoption of safety practices and the need for behavioural follow-up over time to ascertain long-term effects. This study was the initial phase of an ongoing program of research into rider training and risk taking framed around Theory of Planned Behaviour concepts. A subsequent 12-month follow-up of the study participants has been undertaken, with data analysis pending.
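A pre/post design of this kind is commonly analysed with a paired-samples t-test. The sketch below illustrates that analysis with hypothetical questionnaire scores; the study's actual data and statistical specification are not reported in this abstract:

```python
from scipy import stats

# Hypothetical thrill-seeking scores (1-5 scale) for ten riders,
# measured before and after training; NOT the study's data.
pre  = [3.8, 3.1, 4.0, 2.9, 3.5, 3.7, 3.2, 3.9, 3.4, 3.6]
post = [3.5, 2.9, 3.6, 2.8, 3.1, 3.4, 3.0, 3.5, 3.2, 3.3]

# Paired test: each rider is their own control, so within-rider
# differences are tested against zero.
t, p = stats.ttest_rel(pre, post)
print(f"t = {t:.2f}, p = {p:.4f}")
```

Pairing matters here because individual baseline differences between riders are large relative to the training effect; an unpaired test on the same numbers would have far less power.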
Measuring creative potential: Using social network analysis to monitor a learner's creative capacity
Abstract:
Traditional treatments for weight management have focussed on prescribed dietary restriction or regular exercise, or a combination of both. However, recidivism for such prescribed treatments remains high, particularly among the overweight and obese. The aim of this thesis was to investigate voluntary dietary changes in the presence of prescribed mixed-mode exercise, conducted over 16 weeks. With the implementation of a single lifestyle change (exercise), it was postulated that the onerous burden of concomitant dietary and exercise compliance would be reduced, leading to voluntary lifestyle changes in such areas as diet. In addition, the failure of exercise as a single weight loss treatment has been reported to be due to compensatory energy intakes, although much of the evidence is from acute exercise studies, necessitating investigation of compensatory intakes during a long-term exercise intervention. Following 16 weeks of moderate intensity exercise, 30 overweight and obese (BMI ≥ 25.00 kg/m²) men and women showed small but statistically significant decreases in mean dietary fat intakes, without compensatory increases in other macronutrient or total energy intakes. Indeed, total energy intakes were significantly lower for men and women following the exercise intervention, due to the decreases in dietary fat intakes. There was a risk that acceptance of the statistical validity of the small changes to dietary fat intakes may have constituted a Type I error, with false rejection of the null hypothesis. Oro-sensory perceptions of changes in fat loads were therefore investigated to determine whether the measured dietary fat changes were detectable by the human palate. The ability to detect small changes in dietary fat provides sensory feedback for self-initiated dietary changes, but lean and overweight participants were unable to distinguish changes to fat loads of similar magnitudes to those measured in the exercise intervention study.
Accuracy of the dietary measurement instrument was improved by minimising the effects of random error (day-to-day variability) through the use of a statistically validated 8-day, multiple-pass, 24-hour dietary recall instrument. However, systematic error (underreporting) may have masked the magnitude of dietary change, particularly the reduction in dietary fat intakes. A purported biomarker, plasma apolipoprotein A-IV (apoA-IV), was subsequently investigated to monitor systematic error in self-reported dietary intakes. Changes in plasma apoA-IV concentrations were directly correlated with increases and decreases in dietary fat intakes, suggesting that this objective marker may be a useful tool to improve the accuracy of dietary measurement in overweight and obese populations, who are susceptible to dietary underreporting.
Abstract:
Topographic structural complexity of a reef is highly correlated with coral growth rates, coral cover and overall levels of biodiversity, and is therefore integral in determining ecological processes. Modeling these processes commonly includes measures of rugosity obtained from a wide range of different survey techniques that often fail to capture rugosity at different spatial scales. Here we show that accurate estimates of rugosity can be obtained from video footage captured using underwater video cameras (i.e., monocular video). To demonstrate the accuracy of our method, we compared the results to in situ measurements of a 2 m × 20 m area of forereef from Glovers Reef atoll in Belize. Sequential pairs of images were used to compute fine scale bathymetric reconstructions of the reef substrate, from which precise measurements of rugosity and reef topographic structural complexity can be derived across multiple spatial scales. To achieve accurate bathymetric reconstructions from uncalibrated monocular video, the position of the camera for each image in the video sequence and the intrinsic parameters (e.g., focal length) must be computed simultaneously. We show that these parameters can often be determined when the data exhibits parallax-type motion, and that rugosity and reef complexity can be accurately computed from existing video sequences taken with any type of underwater camera in any reef habitat or location. This technique opens a wide range of possibilities for future coral reef research by providing a cost-effective and automated method of determining structural complexity and rugosity in both new and historical video surveys of coral reefs.
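Once a bathymetric profile has been reconstructed, the classic linear rugosity index (analogous to the chain-and-tape method) is straightforward to compute. This sketch uses a hypothetical depth profile, not the Glovers Reef data:

```python
import numpy as np

def rugosity_index(x, depth):
    """Linear rugosity: along-surface (chain) length divided by the
    straight-line horizontal length; 1.0 means perfectly flat, and
    larger values indicate greater structural complexity."""
    x, d = np.asarray(x, float), np.asarray(depth, float)
    chain = np.hypot(np.diff(x), np.diff(d)).sum()
    return chain / (x[-1] - x[0])

print(rugosity_index([0, 1, 2, 3], [5, 5, 5, 5]))  # flat -> 1.0
print(rugosity_index([0, 1, 2, 3], [5, 6, 5, 6]))  # sawtooth -> ~1.414
```

Because the reconstruction is a dense surface rather than a physical chain, the same index can be recomputed at different sampling resolutions to characterise complexity across spatial scales, as the abstract describes.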
Abstract:
The Moon appears to be much larger close to the horizon than when higher in the sky. This is called the ‘Moon Illusion’, since the observed size of the Moon is not actually larger when the Moon is just above the horizon. This article describes a technique for verifying that the observed size of the Moon is not larger on the horizon. The technique can be easily performed in a high school teaching environment. Moreover, the technique demonstrates the surprising fact that the observed size of the Moon is actually smaller on the horizon due to atmospheric refraction. For the purposes of this paper, several images of the Moon were taken with the Moon close to the horizon and close to the zenith. Images were processed using a free program called ImageJ. The Moon was found to be 5.73 ± 0.04% smaller in area on the horizon than at the zenith.
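The final comparison reduces to simple arithmetic on the disc areas measured in ImageJ. In this sketch the pixel counts are hypothetical, chosen only to illustrate the ~5.73% figure reported above:

```python
# Percent area difference between horizon and zenith Moon images, from
# disc areas as measured in an image analysis tool such as ImageJ.
# The pixel counts below are hypothetical.
area_zenith  = 41250.0  # disc area at zenith, px^2
area_horizon = 38886.0  # disc area at horizon, px^2

pct_smaller = (1.0 - area_horizon / area_zenith) * 100.0
print(f"Horizon Moon is {pct_smaller:.2f}% smaller in area")
```

Comparing areas rather than diameters is convenient in the classroom because thresholding the bright disc and counting pixels is a one-step operation, and atmospheric refraction compresses the disc mainly in the vertical direction, which an area measure captures directly.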