175 results for Formative assessment framework. Assessment tools. Ames
Abstract:
The international focus on embracing daylighting for energy-efficient lighting purposes, and the corporate sector’s indulgence in the perception of workplace and work practice “transparency”, has spurred an increase in highly glazed commercial buildings. This in turn has renewed issues of visual comfort and daylight-derived glare for occupants. In order to ascertain evidence, or predict risk, of these events, appraisals of these complex visual environments require detailed information on the luminances present in an occupant’s field of view. Conventional luminance meters are an expensive and time-consuming method of achieving these results. To create a luminance map of an occupant’s visual field using such a meter requires too many individual measurements to be a practical measurement technique. The application of digital cameras as luminance measurement devices has solved this problem. With high dynamic range imaging, a single digital image can be created to provide luminances on a pixel-by-pixel level within the broad field of view afforded by a fish-eye lens: virtually replicating an occupant’s visual field and providing rapid yet detailed luminance information for the entire scene. With proper calibration, relatively inexpensive digital cameras can be successfully applied to the task of luminance measurement, placing them in the realm of tools that any lighting professional should own. This paper discusses how a digital camera can become a luminance measurement device and then presents an analysis of results obtained from post-occupancy measurements from building assessments conducted by the Mobile Architecture Built Environment Laboratory (MABEL) project. This discussion leads to the important realisation that placing such tools in the hands of lighting professionals internationally will provide new opportunities for the lighting community in terms of research on critical issues in lighting such as daylight glare and visual quality and comfort.
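The per-pixel conversion the abstract describes can be sketched in outline. This is a minimal illustration rather than the paper's calibration procedure: the Rec. 709 luminous-efficacy coefficients are the standard weights used by HDR tools such as Radiance, and the constant `k` is a hypothetical placeholder that a real workflow would refine against a spot luminance meter.

```python
import numpy as np

# Sketch: derive a per-pixel luminance map (cd/m^2) from a linear HDR image.
# The channel weights follow the Rec. 709 convention; the calibration factor
# k is a hypothetical value (179 lm/W, the luminous efficacy constant used
# by Radiance), refined per camera in practice.
REC709 = np.array([0.2126, 0.7152, 0.0722])

def luminance_map(hdr_rgb: np.ndarray, k: float = 179.0) -> np.ndarray:
    """Convert a linear HDR RGB image (H x W x 3) to luminance in cd/m^2."""
    return k * (hdr_rgb @ REC709)

# Example: a uniform mid-grey patch maps to a uniform luminance field.
patch = np.full((2, 2, 3), 0.5)
lum = luminance_map(patch)
```

Applied to a fish-eye HDR capture, the same weighted sum yields a luminance value for every pixel in the occupant's visual field at once.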
Abstract:
Road and highway infrastructure provides the backbone for a nation’s economic growth. Australia’s widely dispersed population and its resource boom, coupled with improved living standards and growing societal expectations, call for the continuing development and improvement of road infrastructure under current local, state and federal government policies and strategic plans. As road infrastructure projects involve huge resources and mechanisms, achieving sustainability not only on economic scales but also through environmental and social responsibility becomes a crucial issue. While sustainability is a logical link to infrastructure development, a literature review and consultation with the industry found that there is a lack of common understanding of what constitutes sustainability in the infrastructure context. Its priorities are often interpreted differently among multiple stakeholders. For road infrastructure projects, which typically span long periods of time, achieving tangible sustainability outcomes during the lifecycle of development remains a formidable task. Sustainable development initiatives often remain ideological, as in macro-level policies and broad-based concepts. There was little elaboration, and there were few exemplar cases, on how these policies and concepts can be translated into practical decision-making during project implementation. In contrast, there seemed to be an over-commitment to research and development of sustainability assessment methods and tools. Between the two positions there is a perception-reality gap and mismatch, specifically on how to enhance sustainability deliverables during infrastructure project delivery. A review of past research in this industry sector also found that little has been done to promote sustainable road infrastructure development, despite its wide and varied potential impacts.
This research identified the common perceptions and expectations of different stakeholders towards achieving sustainability in road and highway infrastructure projects. Face-to-face interviews with selected representatives of these stakeholders were carried out in order to select and categorize, confirm and prioritize a list of sustainability performance targets identified through literature and past research. A Delphi study was conducted with the assistance of a panel of senior industry professionals and academic experts, which further considered the interrelationships and influence of the sustainability indicators, and identified critical sustainability indicators under ten critical sustainability criteria (e.g. Environmental; Health & Safety; Resource Utilization & Management; Social & Cultural; Economic; Public Governance & Community Engagement; Relations Management; Engineering; Institutional; and Project Management). This presented critical sustainability issues that needed to be addressed at the project level. Accordingly, exemplar highway development projects were used as case studies to elicit solutions for the critical issues. Through the identification and integration of the different perceptions and priority needs of the stakeholders, as well as key sustainability indicators and solutions for critical issues, a set of decision-making guidelines was developed to promote and drive consistent sustainability deliverables in road infrastructure projects.
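The Delphi prioritisation step might be sketched, in outline, as aggregating panellists' ratings and retaining indicators that clear a consensus threshold. The indicator names, ratings and threshold below are invented for illustration, not the study's actual panel data.

```python
from statistics import mean

# Hypothetical Delphi round: each panellist rates each candidate indicator
# on a 1-5 scale; indicators whose mean rating clears a consensus threshold
# are kept as 'critical' and ranked by mean rating.
ratings = {
    "water quality":        [5, 4, 5, 4],
    "community engagement": [3, 4, 3, 3],
    "lifecycle cost":       [4, 5, 4, 5],
}

THRESHOLD = 4.0
critical = sorted((ind for ind, r in ratings.items() if mean(r) >= THRESHOLD),
                  key=lambda ind: -mean(ratings[ind]))
```

In a real Delphi study this aggregation would be repeated over rounds, with ratings fed back to the panel until consensus stabilises.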
Abstract:
Economics education research studies conducted in the UK, USA and Australia to investigate the effects of learning inputs on academic performance have been dominated by the input-output model (Shanahan and Meyer, 2001). In the Student Experience of Learning framework, however, the link between learning inputs and outputs is mediated by students' learning approaches, which in turn are influenced by their perceptions of the learning contexts (Evans, Kirby, & Fabrigar, 2003). Many learning inventories, such as Biggs' Study Process Questionnaire and Entwistle and Ramsden's Approaches to Study Inventory, have been designed to measure approaches to academic learning. However, there is a limitation to using generalised learning inventories in that they tend to aggregate different learning approaches utilised in different assessments. As a result, important relationships between learning approaches and learning outcomes that exist in specific assessment context(s) will be missed (Lizzio, Wilson, & Simons, 2002). This paper documents the construction of an assessment-specific instrument to measure learning approaches in economics. The post-dictive validity of the instrument was evaluated by examining the association of learning approaches with students' perceived assessment demand in different assessment contexts.
Abstract:
The critical problem of student disengagement and underachievement in the middle years of schooling (Years 4–9) has focussed attention on the quality of educational programs in schools, in Australia and elsewhere. The loss of enthusiasm for science in the middle years is particularly problematic given the growing demand for science professionals. Reshaping middle years programs has included an emphasis on integrating Information and Communication Technologies (ICTs) and improving assessment practices to engage students in higher cognitive processes and enhance academic rigour. Understanding the nature of academic rigour and how to embed it in students' science assessment tasks that incorporate the use of ICTs could enable teachers to optimise the quality of the learning environment. However, academic rigour is not clearly described or defined in the literature and there is little empirical evidence upon which researchers and teachers could draw to enhance understandings. This study used a collective case study design to explore teachers' understandings of academic rigour within science assessment tasks. The research design is based on a conceptual framework that is underpinned by socio-cultural theory. Three methods were used to collect data from six middle years teachers and their students. These methods were a survey, focus group discussion with teachers and a group of students, and individual semi-structured interviews with teachers. Findings of the case study revealed six criteria of academic rigour, namely, higher order thinking, alignment, building on prior knowledge, scaffolding, knowledge construction and creativity. Results showed that the middle years teachers held rich understandings of academic rigour that led to effective utilisation of ICTs in science assessment tasks. Findings also indicated that teachers could further enhance their understandings of academic rigour in some aspects of each of the criteria. 
In particular, this study found that academic rigour could have been further optimised by: promoting more thoughtful discourse and interaction to foster higher order thinking; increasing alignment between curriculum, pedagogy, and assessment, and students' prior knowledge; placing greater emphasis on identifying, activating and building on prior knowledge; better differentiating the level of scaffolding provided and applying it more judiciously; fostering creativity throughout tasks; enhancing teachers' content knowledge and pedagogical content knowledge, and providing more in-depth coverage of fewer topics to support knowledge construction. Key contributions of this study are a definition and a model which clarify the nature of academic rigour.
Abstract:
This research has established, through ultrasound, near infrared spectroscopy and biomechanics experiments, parameters and parametric relationships that can form the framework for quantifying the integrity of the articular cartilage-on-bone laminate, and objectively distinguish between normal/healthy and abnormal/degenerated joint tissue, with a focus on articular cartilage. This has been achieved by: 1. using traditional experimental methods to produce new parameters for cartilage assessment; 2. using novel methodologies to develop new parameters; and 3. investigating the interrelationships between mechanical, structural and molecular properties to identify and select those parameters and methodologies that can be used in a future arthroscopic probe based on points 1 and 2. By combining the molecular, micro- and macro-structural characteristics of the tissue with its mechanical properties, we arrive at a set of critical benchmarking parameters for viable and early-stage non-viable cartilage. The interrelationships between these characteristics, examined using a multivariate analysis based on principal components analysis, multiple linear regression and general linear modeling, could then be used to determine those parameters and relationships which have the potential to be developed into a future clinical device. Specifically, this research has found that the ultrasound and near infrared techniques can subsume the mechanical parameters and combine to characterise the tissue at the molecular, structural and mechanical levels over the full depth of the cartilage matrix. This thesis argues that enabling the determination of the precise area of influence of a focal defect or disease in the joint, and demarcating the boundaries of articular cartilage with different levels of degeneration around a focal defect, will lead to better surgical decisions that advance the processes of joint management and treatment. 
Providing the basis for a surgical tool, this research will contribute to the enhancement and quantification of arthroscopic procedures, extending to post-treatment monitoring; as a research tool, it will enable a robust method for evaluating developing (particularly focalised) treatments.
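The multivariate step named above (principal components analysis on the measured parameters) can be sketched directly with numpy. The data here is synthetic; rows stand in for specimens and columns for hypothetical ultrasound, near infrared and mechanical measures, not the study's actual variables.

```python
import numpy as np

# Sketch: PCA via SVD on standardised data (rows = specimens,
# columns = hypothetical measured parameters).
def pca(X: np.ndarray, n_components: int = 2):
    """Return component scores and explained-variance ratios."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise each column
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var_ratio = (s ** 2) / np.sum(s ** 2)       # variance share per component
    scores = Xc @ Vt[:n_components].T           # project onto top components
    return scores, var_ratio[:n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))                    # 20 specimens, 5 parameters
scores, ratios = pca(X)
```

Inspecting which original parameters load heavily on the leading components is one way to shortlist candidates for a probe, in the spirit of the selection step the abstract describes.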
Abstract:
Of the numerous factors that play a role in fatal pedestrian collisions, the time of day, day of the week, and time of year can be significant determinants. More than 60% of all pedestrian collisions in 2007 occurred at night, despite the presumed decrease in both pedestrian and automobile exposure during the night. Although this trend is partially explained by factors such as fatigue and alcohol consumption, prior analysis of the Fatality Analysis Reporting System database suggests that pedestrian fatalities increase as light decreases after controlling for other factors. This study applies graphical cross-tabulation, a novel visual assessment approach, to explore the relationships among collision variables. The results reveal that twilight and the first hour of darkness typically see the greatest frequency of fatal pedestrian collisions. These hours are not necessarily the most risky on a per-mile-travelled basis, however, because pedestrian volumes are often still high. Additional analysis is needed to quantify the extent to which pedestrian exposure (walking/crossing activity) in these time periods plays a role in pedestrian crash involvement. Weekly patterns of fatal pedestrian collisions vary by time of year due to the seasonal changes in sunset time. In December, collisions are concentrated around twilight and the first hour of darkness throughout the week while, in June, collisions are most heavily concentrated around twilight and the first hours of darkness on Friday and Saturday. Friday and Saturday nights in June may be the most dangerous times for pedestrians. Knowing when pedestrian risk is highest is critically important for formulating effective mitigation strategies and for efficiently investing safety funds. This applied visual approach is a helpful tool for researchers intending to communicate with policy-makers and to identify relationships that can then be tested with more sophisticated statistical tools.
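The cross-tabulation underlying the graphical approach can be sketched with a few lines of pandas. The collision records below are invented for illustration; a real analysis would load the Fatality Analysis Reporting System data.

```python
import pandas as pd

# Sketch: count fatal pedestrian collisions by month and hour of day.
# Records are invented placeholders, not FARS data.
records = pd.DataFrame({
    "month": ["Dec", "Dec", "Jun", "Jun", "Jun"],
    "hour":  [17, 18, 21, 21, 22],   # hour of day, 24 h clock
})

# Rows = month, columns = hour, cells = collision counts.
xtab = pd.crosstab(records["month"], records["hour"])
```

Shading such a table by cell count (a heat map) gives the kind of visual cross-tabulation the paper uses to surface the twilight concentration of collisions.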
Abstract:
Measuring the comparative sustainability levels of cities, regions, institutions and projects is an essential procedure in creating sustainable urban futures. This paper introduces a new urban sustainability assessment model: “The Sustainable Infrastructure, Land-use, Environment and Transport Model (SILENT)”. The SILENT Model is an advanced geographic information system and indicator-based comparative urban sustainability indexing model. The model aims to assist planners and policy makers in their daily tasks in sustainable urban planning and development by providing an integrated sustainability assessment framework. The paper gives an overview of the conceptual framework and components of the model and discusses the theoretical constructs, methodological procedures, and future development of this promising urban sustainability assessment model.
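An indicator-based comparative index of the general kind the SILENT Model describes might be sketched as a weighted composite of normalised indicators per area. The indicator names, values and weights below are hypothetical, not the model's actual components.

```python
import numpy as np

# Sketch: min-max normalise each indicator column, then combine with
# policy weights into one comparative sustainability score per area.
def composite_index(values: np.ndarray, weights: np.ndarray) -> np.ndarray:
    lo, hi = values.min(axis=0), values.max(axis=0)
    norm = (values - lo) / (hi - lo)          # each indicator scaled to [0, 1]
    return norm @ (weights / weights.sum())   # weighted sum per area

# 3 areas x 3 hypothetical indicators (land use, environment, transport)
values = np.array([[0.2, 0.8, 0.5],
                   [0.9, 0.4, 0.7],
                   [0.5, 0.6, 0.9]])
weights = np.array([1.0, 1.0, 2.0])           # transport weighted double
index = composite_index(values, weights)
```

In a GIS-based model such scores would be computed per spatial unit and mapped, which is what makes the comparative ranking directly usable in planning.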
Abstract:
Currently in Australia, there are no decision support tools for traffic and transport engineers to assess the crash risk potential of proposed road projects at design level. A selection of equivalent tools already exists for traffic performance assessment, e.g. aaSIDRA or VISSIM. The Urban Crash Risk Assessment Tool (UCRAT) was developed for VicRoads by ARRB Group to promote methodical identification of future crash risks arising from proposed road infrastructure, where safety cannot be evaluated based on past crash history. The tool will assist practitioners with key design decisions to arrive at the safest and most cost-optimal design options. This paper details the development and application of UCRAT software. This professional tool may be used to calculate an expected mean number of casualty crashes for an intersection, a road link or a defined road network consisting of a number of such elements. The mean number of crashes provides a measure of risk associated with the proposed functional design and allows evaluation of alternative options. The tool is based on historical data for existing road infrastructure in metropolitan Melbourne and takes into account the influence of key design features, traffic volumes, road function and the speed environment. Crash prediction modelling and risk assessment approaches were combined to develop its unique algorithms. The tool has application in such projects as road access proposals associated with land use developments, public transport integration projects and new road corridor upgrade proposals.
Abstract:
This paper presents a conceptual framework, informed by Foucault’s work on governmentality, which allows for new kinds of reflection on the practice of legal education. Put simply, this framework suggests that legal education can be understood as a form of government that relies on a specific rationalisation and programming of the activities of legal educators, students, and administrators, and is implemented by harnessing specific techniques and bodies of ‘know-how’. Applying this framework to assessment at three Australian law schools, this paper highlights how assessment practices are rationalised, programmed, and implemented, and points out how this government shapes students’ legal personae. In particular, this analysis focuses on the governmental effects of pedagogical discourses that are dominant within the design and scholarship of legal education. It demonstrates that the development of pedagogically-sound regimes of assessment has contributed to a reformulation of the terrain of government, by providing the conditions under which forms of legal personae may be more effectively shaped, and extending the power relations that achieve this. This analysis provides legal educators with an original way of reflecting on the power effects of teaching the law, and new opportunities for thinking about what is possible in legal education.
Abstract:
This study used the Australian Environmental Health Risk Assessment Framework to assess the human health risk of dioxin exposure through foods for local residents in two wards of Bien Hoa City, Vietnam. These wards are known hot-spots for dioxin and a range of stakeholders from central government to local levels were involved in this process. Publications on dioxin characteristics and toxicity were reviewed and dioxin concentrations in local soil, mud, foods, milk and blood samples were used as data for this risk assessment. A food frequency survey of 400 randomly selected households in these wards was conducted to provide data for exposure assessment. Results showed that local residents who had consumed locally cultivated foods, especially fresh water fish and bottom-feeding fish, free-ranging chicken, duck, and beef were at a very high risk, with their daily dioxin intake far exceeding the tolerable daily intake recommended by the WHO. Based on the results of this assessment, a multifaceted risk management program was developed and has been recognized as the first public health program ever to have been implemented in Vietnam to reduce the risks of dioxin exposure at dioxin hot-spots.
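The exposure-assessment arithmetic behind such a finding can be sketched simply: sum daily consumption times concentration over food items, divide by body weight, and compare against the WHO tolerable daily intake (1-4 pg TEQ/kg bw/day). The food items, concentrations and consumption figures below are invented placeholders, not the study's measured values.

```python
# Sketch of a daily dioxin intake estimate versus the WHO TDI.
# All food data below is hypothetical.
WHO_TDI_UPPER = 4.0  # pg TEQ per kg body weight per day

def daily_intake(foods, body_weight_kg: float) -> float:
    """Sum intake (g/day) x concentration (pg TEQ/g) over foods, per kg bw."""
    total = sum(grams * conc for grams, conc in foods)
    return total / body_weight_kg

# (grams eaten per day, pg TEQ per gram) for each hypothetical food item
foods = [(100, 3.0),   # bottom-feeding fish
         (50, 0.8),    # free-range chicken
         (150, 0.1)]   # vegetables
intake = daily_intake(foods, body_weight_kg=55.0)
exceeds_tdi = intake > WHO_TDI_UPPER
```

Running this per surveyed household, with the food frequency data supplying the consumption terms, is the shape of the exposure assessment the abstract reports.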
Abstract:
Modern statistical models and computational methods can now incorporate uncertainty of the parameters used in Quantitative Microbial Risk Assessments (QMRA). Many QMRAs use Monte Carlo methods, but work from fixed estimates for means, variances and other parameters. We illustrate the ease of estimating all parameters contemporaneously with the risk assessment, incorporating all the parameter uncertainty arising from the experiments from which these parameters are estimated. A Bayesian approach is adopted, using Markov Chain Monte Carlo Gibbs sampling (MCMC) via the freely available software, WinBUGS. The method and its ease of implementation are illustrated by a case study that involves incorporating three disparate datasets into an MCMC framework. The probabilities of infection when the uncertainty associated with parameter estimation is incorporated into a QMRA are shown to be considerably more variable over various dose ranges than the analogous probabilities obtained when constants from the literature are simply ‘plugged’ in as is done in most QMRAs. Neglecting these sources of uncertainty may lead to erroneous decisions for public health and risk management.
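The contrast the abstract draws can be sketched with an exponential dose-response model, P(inf) = 1 - exp(-r·dose): a plug-in analysis fixes r at a literature value, while an uncertainty-propagating analysis works from a distribution of r. The lognormal draws below merely stand in for an MCMC posterior, and all numbers are illustrative.

```python
import numpy as np

# Sketch: plug-in versus parameter-uncertainty QMRA under an
# exponential dose-response model. r's point estimate and the spread
# of its posterior stand-in are illustrative values.
rng = np.random.default_rng(42)
dose = 10.0                      # organisms ingested
r_point = 0.01                   # fixed literature estimate

p_plugin = 1.0 - np.exp(-r_point * dose)

# Posterior-like draws of r: same central value, uncertainty retained.
r_draws = rng.lognormal(mean=np.log(r_point), sigma=0.5, size=10_000)
p_draws = 1.0 - np.exp(-r_draws * dose)

# Width of the 95% interval of infection probability.
spread = np.percentile(p_draws, 97.5) - np.percentile(p_draws, 2.5)
```

The non-zero `spread` is exactly the variability the plug-in approach hides, which is the abstract's point about erroneous risk-management decisions.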
Abstract:
This chapter addresses the changing climate of assessment policy and practice in Australia in response to global trends in education and the mounting accountability demands of standards-driven reform. Queensland, a State of Australia, has a tradition of respecting and trusting teacher judgment through the practice of, and policy commitment to, externally moderated school-based assessment. This chapter outlines the global trends in curriculum and assessment reform, and then analyzes the impact of international comparisons on national policy. The creation of the Australian Curriculum, Assessment and Reporting Authority (ACARA) together with the intent of establishing a standards-referenced framework raises tensions and challenges for teachers’ practice. The argument for sustaining confidence in teacher-based assessment is developed with reference to research evidence pertaining to the use of more authentic assessments and moderation practices for the purposes of improving learning, equity and accountability. Evidence is drawn from local studies of teacher judgment practice and used to demonstrate these developments and in so doing illuminate the complex issues of engaging the demands of policy while sustaining confidence in teacher assessment.
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for the lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence no “gold standard” test is currently available to assess the tear film integrity. Therefore, improving techniques for the assessment of the tear film quality is of clinical significance and the main motivation for the work described in this thesis. In this study the tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected on the anterior cornea and its reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to the light scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film. 
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing the tear film dynamics. A set of novel routines has been purposely developed to quantify the changes of the reflected pattern and to extract a time series estimate of the TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis. In this area a metric of the TFSQ is calculated. Initially, two metrics, based on Gabor filter and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assess the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, the HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while the LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. 
The LSI technique gave the best results under both natural blinking conditions and suppressed blinking conditions, closely followed by HSV. The DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique, identified during the former clinical study, was a lack of sensitivity to quantify the build-up/formation phase of the tear film cycle. For that reason an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis was transformed from Cartesian to polar coordinates, converting the concentric circles pattern into a quasi-straight-line image from which a block statistics value was extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study, based on ray-tracing techniques and topographical models of the tear film, was conducted to fully comprehend the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of the tear film dynamics; for instance, the model extracted for the build-up phase has helped to provide some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series have been reported in this thesis. Over the years, different functions have been used to model the time series and to extract the key clinical parameters (i.e., timing). Unfortunately those techniques for modeling the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria. 
Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could be a useful clinical tool to assess tear film surface quality in the future.
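The polar-transform block metric described above can be sketched with numpy: resample the Placido-ring image from Cartesian to polar coordinates about the pattern centre, so concentric rings become near-straight rows, then take a block statistic. Nearest-neighbour sampling and the within-row standard deviation used here are simplifications of the thesis routines, not their actual implementation.

```python
import numpy as np

# Sketch: Cartesian-to-polar resampling of a ring image, followed by a
# simple regularity statistic (mean within-row std in polar space).
def to_polar(img: np.ndarray, n_r: int = 32, n_theta: int = 64) -> np.ndarray:
    cy, cx = (img.shape[0] - 1) / 2.0, (img.shape[1] - 1) / 2.0
    radii = np.linspace(0, min(cy, cx), n_r)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
    return img[ys, xs]            # rows = radius, columns = angle

def ring_regularity(img: np.ndarray) -> float:
    """~0 for a radially symmetric pattern; grows as rings distort."""
    return float(to_polar(img).std(axis=1).mean())

# A synthetic concentric-ring pattern, centred in a 65 x 65 frame.
y, x = np.mgrid[0:65, 0:65]
rings = np.sin(np.hypot(y - 32, x - 32)).astype(float)
```

Tracking such a statistic frame by frame is what turns the video recording into the TFSQ time series the thesis analyses.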
Abstract:
This paper presents an explanation of why the reuse of building components after demolition or deconstruction is critical to the future of the construction industry. An examination of the historical causes of, and responses to, climate change sets the scene as to why governance is becoming increasingly focused on the built environment as a mechanism for controlling the waste generation associated with the processes of demolition, construction and operation. Through an annotated description of the evolving design and construction methodology of a range of timber dwellings (typically 'Queenslanders' of the eras 1880-1900, 1900-1920 and 1920-1940), the paper offers an evaluation of the variety of materials which can be used advantageously by those wishing to 'regenerate' a Queenslander. This analysis of 'regeneration' details the constraints to be considered when contemplating relocation and/or reuse by adaptation, including deconstruction of building components, against the legislative framework requirements of the Queensland Building Act 1975 and the Queensland Sustainable Planning Act 2009, with a specific examination of those of the Building Codes of Australia. The paper concludes with a discussion of these constraints, their impacts on 'regeneration', and the need for further research to seek greater understanding of the practicalities and drivers of relocation, adaptation and the suitability of building components for reuse after deconstruction.