205 results for test data generation


Relevance: 80.00%

Abstract:

The motivation for secondary school principals in Queensland, Australia, to investigate curriculum change coincided with the commencement in 2005 of the state government’s publication of school exit test results as a measure of accountability. Aligning the schools’ curriculum with the requirements of high-stakes testing is considered by many academics and teachers to be a negative outcome of accountability, for reasons such as ‘teaching to the test’ and narrowing of the curriculum. However, this article outlines empirical evidence that principals are instigating curriculum change to improve published high-stakes test results. The three principals in this study offered several reasons why they wished to implement changes to school curricula. One reason articulated by all three was the pressure of accountability, particularly through the publication of high-stakes test data, which has now become commonplace in the education systems of many Western nations.

Relevance: 80.00%

Abstract:

Current graduates in education are entering a very different profession to the one in which most of their “baby-boomer” colleagues started. It is a profession in which accountability and national high-stakes testing (e.g. NAPLAN) have become catch-cries, and where the interpretation and use of educational data is an additional challenge. This has led to schools focusing on performance, and teachers now have to analyse test data and apply the findings to their teaching.

Relevance: 80.00%

Abstract:

My oldest daughter recently secured a position as a Science/Geography teacher in a P-12 Catholic College in regional Queensland. This paper looks at the teaching world into which she has graduated. Specifically, the paper will outline and discuss findings from a survey of graduating early childhood student teachers concerning their knowledge of, and skills in interpreting, the current regime of high-stakes testing in Australia. The paper argues that understanding accountability and possessing skills to scrutinise test data are essential for the new teacher as s/he enters a profession in which governments world-wide are demanding a return for their investment in education. The paper will examine literature on accountability and surveillance in the form of high-stakes testing from global, school and classroom perspectives. It makes the claim that it is imperative for beginning teachers to be able to interpret high-stakes test data and considers the skills required to do this. The paper also draws on local research to comment on the readiness of graduates to meet this comparatively new professional demand.

Relevance: 80.00%

Abstract:

Aim. Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Background. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. Method. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as a secondary form of data. Findings. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative to analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Conclusion. 
Our example of using this package of situational and social worlds mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice. © 2007 Blackwell Publishing Ltd.

Relevance: 80.00%

Abstract:

Aim. This paper is a report of a study to explore rural nurses' experiences of mentoring. Background. Mentoring has recently been proposed by governments, advocates and academics as a solution to the problem of retaining rural nurses in the Australian workforce. Action in the form of mentor development workshops has changed the way that some rural nurses now construct supportive relationships as mentoring. Method. A grounded theory design was used with nine rural nurses. Eleven semi-structured interviews were conducted in various states of Australia during 2004-2005. Situational analysis mapping techniques and frame analysis were used in combination with concurrent data generation and analysis and theoretical sampling. Findings. Experienced rural nurses cultivate novices through supportive mentoring relationships. The impetus for such relationships comes from their own histories of living and working in the same community, and this was termed 'live my work'. Rural nurses use multiple perspectives of self in order to manage their interactions with others in their roles as community members, consumers of healthcare services and nurses. Personal strategies adapted to local context constitute the skills that experienced rural nurses pass on to neophyte rural nurses through mentoring, while at the same time protecting them through troubleshooting and translating local cultural norms. Conclusion. Living and working in the same community creates a set of complex challenges for novice rural nurses that are better faced with a mentor in place. Thus, mentoring has become an integral part of experienced rural nurses' practice to promote staff retention. © 2007 The Authors.

Relevance: 80.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
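The label-flipping equivalence noted in the last sentence can be made concrete. Below is a minimal sketch (not from the paper) using exact ERM over one-dimensional threshold classifiers: flipping the labels of the first half of the sample and minimizing empirical error yields the maximal discrepancy between the two halves.

```python
def erm_threshold(xs, ys):
    """Exact empirical risk minimization over 1-D threshold classifiers
    h(x) = s if x >= t else -s, for s in {+1, -1} and all thresholds t.
    Returns the minimum empirical 0-1 error."""
    n = len(xs)
    best = n
    for t in sorted(set(xs)) + [max(xs) + 1.0]:
        for s in (+1, -1):
            errors = sum(1 for x, y in zip(xs, ys)
                         if (s if x >= t else -s) != y)
            best = min(best, errors)
    return best / n

def maximal_discrepancy(xs, ys):
    """Maximal difference between the 0-1 error on the first half of the
    sample and on the second half, maximized over the threshold class.
    For +/-1 labels this equals 1 - 2 * min_f err(f; flipped sample),
    where the flipped sample negates the first half's labels."""
    n = len(xs)
    flipped = [-y for y in ys[:n // 2]] + list(ys[n // 2:])
    return 1.0 - 2.0 * erm_threshold(xs, flipped)
```

Because the class is symmetric (it contains -h whenever it contains h), the signed maximum above coincides with the maximal absolute discrepancy.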

Relevance: 80.00%

Abstract:

Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space - classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data this gives a powerful transductive algorithm: using the labeled part of the data one can learn an embedding also for the unlabeled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.

Relevance: 80.00%

Abstract:

Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive definite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space -- classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semi-definite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data this gives a powerful transductive algorithm -- using the labelled part of the data one can learn an embedding also for the unlabelled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method to learn the 2-norm soft margin parameter in support vector machines, solving another important open problem. Finally, the novel approach presented in the paper is supported by positive empirical results.

Relevance: 80.00%

Abstract:

Many ageing road bridges, particularly timber bridges, require urgent improvement due to the demand imposed by the recent version of the Australian bridge loading code, AS 5100. As traffic volume plays a key role in decisions on budget allocations for bridge refurbishment/replacement, many bridges in low volume traffic networks remain in poor condition with axle load and/or speed restrictions, thus disadvantaging many rural communities. This thesis examines an economical and environmentally sensible option of incorporating disused flat rail wagons (FRW) in the construction of bridges in low volume, high axle load road networks. The constructability, economy and structural adequacy of the FRW road bridge are reported in the thesis, with particular focus on a demonstration bridge commissioned in regional Queensland. The demonstration bridge comprises a reinforced concrete slab (RCS) pavement resting on two FRWs with custom designed connection brackets at regular intervals along the span of the bridge. The FRW-RC bridge deck assembly is supported on elastomeric rubber pads resting on the abutment. As this type of bridge replacement technology is new and its structural design is not covered in the design standards, the in-service structural performance of the FRW bridge subjected to the high axle loadings prescribed in AS 5100 is examined through performance load testing. Both the static and the moving load tests are carried out using a fully laden, commonly available three-axle tandem truck. The bridge deck is extensively strain gauged and displacement at several key locations is measured using linear variable displacement transducers (LVDTs). A high speed camera is used in the performance test and the digital image data are analysed using proprietary software to accurately capture the locations of the wheel positions on the bridge span. 
The wheel location is thus synchronised with the displacement and strain time series to infer the structural response of the FRW bridge. Field test data are used to calibrate a grillage model, developed for further analysis of the FRW bridge under the various sets of high axle loads stipulated in the bridge design standard. Bridge behaviour predicted by the grillage model shows that the live load stresses of the FRW bridge are significantly lower than the yield strength of steel and that the deflections are well below the serviceability limit state set out in AS 5100. Based on the results reported in this thesis, it is concluded that the disused FRWs are capable of resisting the high axle loading prescribed in AS 5100 and are a viable alternative bridge deck solution in the context of low volume road networks.
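The serviceability checks described above reduce to simple comparisons once stresses and deflections are extracted from the test data. The sketch below is illustrative only: the span-over-600 deflection limit is a commonly cited AS 5100-style serviceability criterion, and the span, steel grade and gauge readings are hypothetical values, not figures from the thesis.

```python
E_STEEL_MPA = 200_000.0          # assumed Young's modulus of steel (200 GPa)

def stress_from_microstrain(microstrain):
    """Linear-elastic stress (MPa) from a strain-gauge reading:
    sigma = E * epsilon, with epsilon given in microstrain."""
    return E_STEEL_MPA * microstrain * 1e-6

span_mm = 10_000.0               # hypothetical 10 m span
deflection_limit_mm = span_mm / 600.0   # assumed L/600 serviceability limit
measured_deflection_mm = 4.2     # hypothetical mid-span LVDT reading
yield_strength_mpa = 250.0       # assumed steel grade of the wagon members
live_load_stress_mpa = stress_from_microstrain(300.0)  # hypothetical peak gauge

# Both checks must pass for the deck to satisfy the assumed criteria.
serviceable = (measured_deflection_mm < deflection_limit_mm
               and live_load_stress_mpa < yield_strength_mpa)
```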

Relevance: 80.00%

Abstract:

A road bridge containing disused flatbed rail wagons as the primary deck superstructure was performance tested on a low volume, high axle load traffic road in Queensland, Australia; some key results are presented in this paper. A fully laden truck with a total weight of 28.88% of the serviceability design load prescribed in the Australian bridge code was used; its wheel positions were accurately captured using a high speed camera and synchronised with the real-time deflections and strains measured at the critical members of the flat rail wagons. The strains remained well below yield and indicated the existence of composite action between the reinforced concrete slab pavement and the wagon deck. A three dimensional grillage model was developed and calibrated using the test data, which established the structural adequacy of the rail wagons and the positive contribution of the reinforced concrete slab pavement in resisting high axle traffic loads on a single lane bridge in the low volume roads network.

Relevance: 80.00%

Abstract:

This paper presents a strategy for delayed research method selection in qualitative interpretivist research. An exemplary case details how explorative interviews were designed and conducted in accordance with a paradigm prior to deciding whether to adopt grounded theory or phenomenology for data analysis. The focus here is on determining the most appropriate research strategy, in this case the methodological framing used to conduct the research and represent the findings, both of which are detailed. Research addressing current management issues requires both a flexible framework and the capability to consider the research problem from various angles, in order to derive tangible results for academia with immediate application to business demands. Researchers, and in particular novices, often struggle to decide on a research method suitable for their research problem. This often applies to interpretative qualitative research, where it is not always immediately clear which method is the most appropriate to use, as the research objectives shift and crystallize over time. This paper uses an exemplary case to reveal how the strategy of delayed research method selection contributes to deciding whether to adopt grounded theory or phenomenology in the initial phase of a PhD research project. In this case, semi-structured interviews were used for data generation, framed in an interpretivist approach and situated in a business context. Research questions for this study were thoroughly defined and carefully framed in accordance with the research paradigm's principles, while at the same time ensuring that the requirements of both potential research methods were met. The grounded theory and phenomenology methods were compared and contrasted to determine their suitability and whether they met the research objectives, based on a pilot study. 
The strategy proposed in this paper is an alternative to the more 'traditional' approach, which initially selects the methodological formulation, followed by data generation. In conclusion, the suggested strategy of delayed research method selection is intended to help researchers identify and apply the method most appropriate to their research. This strategy is based on explorations of data generation and analysis in order to derive faithful results from the data generated.

Relevance: 80.00%

Abstract:

In this paper, three metaheuristics are proposed for solving a class of job shop, open shop, and mixed shop scheduling problems. We evaluate the performance of the proposed algorithms by means of a set of Lawrence’s benchmark instances for the job shop problem, a set of randomly generated instances for the open shop problem, and combined job shop and open shop test data for the mixed shop problem. The computational results show that the proposed algorithms perform extremely well on all three types of shop scheduling problems. The results also reveal that the mixed shop problem is relatively easier to solve than the job shop problem, due to the fact that the scheduling procedure becomes more flexible with the inclusion of more open shop jobs in the mixed shop.
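The abstract does not spell out the three metaheuristics, but any of them needs a fast decoder that turns a candidate solution into a schedule and its makespan. A minimal sketch (an assumption for illustration, not the paper's code) of a semi-active decoder for the job shop case: a candidate is a repetition-based permutation of job indices, and operations are dispatched in that order.

```python
def makespan(jobs, op_sequence):
    """Semi-active job-shop schedule decoder.

    jobs: list of jobs, each a list of (machine, duration) operations that
          must run in the given order.
    op_sequence: priority list of job indices; the k-th occurrence of job j
          refers to job j's k-th not-yet-scheduled operation.
    Returns the makespan of the decoded schedule.
    """
    next_op = [0] * len(jobs)        # next operation index per job
    job_ready = [0] * len(jobs)      # time each job's previous op finishes
    mach_ready = {}                  # time each machine becomes free
    for j in op_sequence:
        machine, dur = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        end = start + dur
        job_ready[j] = end
        mach_ready[machine] = end
        next_op[j] += 1
    return max(job_ready)
```

A metaheuristic for the job shop then searches over such permutations; open shop instances relax the within-job operation order, which is the flexibility the last sentence of the abstract refers to.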

Relevance: 80.00%

Abstract:

The interaction and relationship between global warming and the thermal performance of buildings are dynamic in nature. In order to model and understand this behavior, different approaches have been used to project and generate future weather data, including keeping weather variables unchanged, the morphing approach, and the diurnal modelling method. Among these approaches, various assumptions about the change in solar radiation, air humidity and/or wind characteristics may be adopted. In this paper, an example is presented to illustrate the generation of future weather data for different global warming scenarios in Australia. The sensitivity of building cooling loads to possible changes in the assumed values used in future weather data generation is investigated. It is shown that with a ±10% change in the proposed future values for solar radiation, air humidity or wind characteristics, the corresponding change in the cooling load of the modelled sample office building at different Australian capital cities would not exceed 6%, 4% and 1.5% respectively. It is also found that, comparing ±10% changes in the proposed weather variables under both the 2070-high future scenario and the current weather scenario, the corresponding change in the cooling loads at different locations may be weaker (up to 2% difference in Hobart for a ±10% change in global solar radiation), similar (less than 0.6% difference in Hobart for a ±10% change in wind speed), or stronger (up to 1.6% difference in Hobart for a ±10% change in relative humidity) in the 2070-high future scenario than in the current weather scenario.
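Of the approaches named above, morphing is the simplest to sketch. The fragment below is an illustration (not this paper's implementation) of the widely used morphing transform of Belcher et al. for hourly dry-bulb temperature: a monthly shift plus a stretch about the monthly mean. The parameter values in the example are hypothetical; in practice `delta_t` and `alpha` come from climate-model projections for the chosen scenario.

```python
def morph_temperature(hourly_dbt, monthly_mean, delta_t, alpha):
    """Morph present-day hourly dry-bulb temperatures for one month:

        dbt' = dbt + delta_t + alpha * (dbt - monthly_mean)

    delta_t: projected shift in the monthly mean temperature (K)
    alpha:   projected fractional stretch of the variation about the
             monthly mean (e.g. from the change in diurnal range)
    """
    return [t + delta_t + alpha * (t - monthly_mean) for t in hourly_dbt]
```

For example, with a hypothetical 2 K shift and a 10% stretch, hours at 20 °C and 30 °C around a 25 °C monthly mean morph to 21.5 °C and 32.5 °C, so warm hours warm more than cool ones.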

Relevance: 80.00%

Abstract:

This paper discusses and summarises a recent systematic study on the implications of global warming for air conditioned office buildings in Australia. Four areas are covered: analysis of historical weather data, generation of future weather data for the impact study of global warming, projection of building performance under various global warming scenarios, and evaluation of various adaptation strategies under the 2070-high global warming scenario. Overall, it is found that depending on the assumed future climate scenario and the location considered, the increase in total building energy use for the sample Australian office building may range from 0.4 to 15.1%. When the increase in annual average outdoor temperature exceeds 2 °C, the risk of overheating will increase significantly. However, the potential overheating problem could be completely eliminated if internal load density is significantly reduced.

Relevance: 80.00%

Abstract:

Several track-before-detection approaches for image-based aircraft detection have recently been examined in an important automated aircraft collision detection application. A particularly popular approach is a two-stage processing paradigm which involves: a morphological spatial filter stage (which aims to emphasize the visual characteristics of targets) followed by a temporal or track filter stage (which aims to emphasize the temporal characteristics of targets). In this paper, we propose new spot detection techniques for this two-stage processing paradigm that fuse together raw and morphological images, or fuse together various different morphological images (we call these approaches morphological reinforcement). On the basis of flight test data, the proposed morphological reinforcement operations are shown to offer superior signal-to-noise characteristics when compared to standard spatial filter options (such as the close-minus-open and adaptive contour morphological operations). However, system operating characteristic curves, which examine detection versus false alarm characteristics after both processing stages, illustrate that system performance is very data dependent.
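To make the spatial-filter stage concrete, the sketch below implements the standard close-minus-open (CMO) spot filter mentioned in the abstract, plus one hypothetical fusion rule in the spirit of the morphological reinforcement idea (the paper's exact fusion operators are not reproduced here). It assumes the `numpy` and `scipy` packages.

```python
import numpy as np
from scipy import ndimage

def close_minus_open(img, size=5):
    """Standard CMO spot filter: grey-closing minus grey-opening suppresses
    smooth background structure and responds to small bright or dark spots
    that fit inside the size x size structuring window."""
    closed = ndimage.grey_closing(img, size=(size, size))
    opened = ndimage.grey_opening(img, size=(size, size))
    return closed - opened

def reinforced(img, size=5):
    """Hypothetical 'reinforcement' fusion (an assumption for illustration):
    pointwise product of the raw image and its CMO response, so a pixel must
    be salient in both images to survive."""
    return img * close_minus_open(img, size)
```

On a flat image containing a single bright pixel, the CMO response is concentrated exactly at that pixel, which is the behaviour a spot detector for dim point targets needs before the track filter stage.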