970 results for textual complexity assessment


Relevance:

20.00%

Publisher:

Abstract:

This article focuses on mental health assessment of refugees in clinical, educational and administrative-legal settings in order to synthesise research and practice designed to enhance and promote further development of culturally appropriate clinical assessment services during the refugee resettlement process. It specifically surveys research published over the last 25 years into the development, reliability measurement and validity testing of assessment instruments that have been used with children, adolescents and adults from refugee backgrounds, prior to or following their arrival in a resettlement country, to determine whether the instruments meet established cross-cultural standards of conceptual, functional, linguistic, technical and normative equivalence. The findings suggest that, although attempts have been made to develop internally reliable, appropriately normed tests for use with refugees from diverse cultural and linguistic backgrounds, matters of conceptual and linguistic equivalence and test–retest reliability are often overlooked. Implications of these oversights for underreporting refugees' mental health needs are considered. Efforts should also be directed towards the development of culturally comparable, valid and reliable measures of refugee children's mental health and of refugee children's and adults' psychoeducational, neuropsychological and applied memory capabilities.

Relevance:

20.00%

Publisher:

Abstract:

The literature abounds with descriptions of failures in high-profile projects, and a range of initiatives has been generated to enhance project management practice (e.g., Morris, 2006). Estimating from our own research, there are scores of other project failures that go unrecorded. Many of these failures can be explained using existing project management theory: poor risk management, inaccurate estimating, cultures of optimism dominating decision making, stakeholder mismanagement, inadequate timeframes, and so on. Nevertheless, in spite of extensive discussion and analysis of failures and attention to the presumed causes of failure, projects continue to fail in unexpected ways. In the 1990s, three U.S. state departments of motor vehicles (DMVs) cancelled major projects due to time and cost overruns and an inability to meet project goals (IT-Cortex, 2010). The California DMV failed to revitalize its drivers' license and registration application process after spending $45 million. The Oregon DMV cancelled its five-year, $50 million project to automate its manual, paper-based operation after three years, when the estimate had grown to $123 million, the duration had stretched to eight years or more, and the prototype had proved a complete failure. In 1997, the Washington State DMV cancelled its License Application Mitigation Project because it would have been too big and obsolete by the time it was estimated to be finished. There are countless similar examples of projects that have been abandoned or that have not delivered the requirements.

Relevance:

20.00%

Publisher:

Abstract:

The Lockyer Valley in southeast Queensland supports important and intensive irrigation that is dependent on the quality and availability of groundwater. Prolonged drought conditions from ~1997 resulted in depletion of the alluvial aquifers and concern for the long-term sustainability of this resource. By 2008, many areas of the valley were at less than 20% of storage. Some relief occurred with rain events in early 2009; then, in December 2010 – January 2011, most of southeast Queensland experienced unprecedented flooding. These storm-based events have caused a shift in research focus from investigations of drought conditions and mitigation to flood response analysis. For the alluvial aquifer system of the valley, a preliminary assessment of groundwater observation bore data, prior to and during the flood, indicates a spatially variable aquifer response. While water levels in some bores screened in unconfined shallow aquifers recovered by more than 10 m within a short period (months), others show only a small or moderate response. Measurements of pre- and post-flood groundwater levels and high-resolution time-series records from data loggers are considered within the framework of a 3D geological model of the Lockyer Valley built using the Groundwater Visualisation System (GVS). Groundwater level fluctuations covering both drought and flood periods are used to estimate groundwater recharge using the water table fluctuation (WTF) method, supplemented by estimates derived from chloride mass balance. The presentation of hydraulic and recharge information in a 3D format has considerable advantages over the traditional 2D presentation of data. The 3D approach allows the distillation of multiple types of information (topographic, geological, hydraulic and spatial) into one representation that provides valuable insights into the major controls on groundwater flow and recharge. The influence of aquifer lithology on the spatial variability of groundwater recharge is also demonstrated.
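As a rough illustration of the water table fluctuation method referred to above (not the study's own calculation), the sketch below multiplies observed water-table rises by an assumed specific yield to obtain a recharge depth; the specific yield and the water-level series are illustrative placeholders, not Lockyer Valley data.

# Minimal sketch of the water table fluctuation (WTF) recharge estimate:
# recharge depth = specific yield x cumulative water-table rise.
# The specific yield and water-level series below are illustrative only.

specific_yield = 0.12                       # assumed dimensionless specific yield

# Hypothetical water levels (m above datum) spanning a recharge event.
water_levels_m = [96.2, 96.3, 98.1, 105.4, 106.0]

def wtf_recharge(levels, sy):
    """Sum the rises in water level and convert them to a recharge depth (m)."""
    rises = [b - a for a, b in zip(levels, levels[1:]) if b > a]
    return sy * sum(rises)

print(f"Estimated recharge: {wtf_recharge(water_levels_m, specific_yield):.2f} m")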

Relevance:

20.00%

Publisher:

Abstract:

The development and implementation of the Australian Curriculum, together with national testing of students and the publication of school results, place new demands on teachers. In this article we address the importance of teachers becoming attuned to the silent assessors in assessment generally, and in the National Assessment Program – Literacy and Numeracy (NAPLAN) more specifically. Using the concept of literacies, we develop a method for conducting a literacy audit of assessment tasks that teachers can use to help both themselves and their students. Providing assistance to students as a consequence of such an audit is imperative to improve student outcomes and to address issues of equity.

Relevance:

20.00%

Publisher:

Abstract:

Significant responsibility has been given to schools and sectors to interpret and plan for assessment within the Australian Curriculum. As schools take this opportunity to review and renew their school curriculum, it is important for teachers and school leaders to take the time to work out whether any assessment myths lurking in their conversations or assumptions need to be challenged. Outdated myths or cultural narratives of learning can limit our thinking and student learning without our being aware of it.

Relevance:

20.00%

Publisher:

Abstract:

Georgia's 'National Integrity Systems' are the institutions, laws, procedures, practices and attitudes that encourage and support integrity in the exercise of power in modern Georgian society. Integrity systems function to ensure that power is exercised in a manner that is true to the values, purposes and duties for which that power is entrusted to, or held by, institutions and individual office-holders. This report presents the results of the Georgian National Integrity Systems Assessment (GNISA) project, funded by the Open Society Institute / Open Society – Georgia Foundation and conducted in 2005–2006 by the Caucasus Institute for Peace, Democracy and Development, Transparency International Georgia and the Georgian Young Lawyers Association, in close cooperation with the Griffith University Institute for Ethics, Governance and Law (Australia) and the Tiri Group (UK). The project examined how different elements of integrity systems interact, which combinations of institutions and reforms make for a strong integrity system, and how Georgia's integrity systems should evolve to ensure coherence, not chaos, in the way public integrity is maintained. Not all participants in the research necessarily share every conclusion given in the GNISA report.

Relevance:

20.00%

Publisher:

Abstract:

Columns are one of the key load-bearing elements that are highly susceptible to vehicle impacts. The resulting severe damage to columns may lead to failure of the supporting structure, which can be catastrophic in nature. However, columns in existing structures are seldom designed for impact, owing to inadequacies in design guidelines. The impact behaviour of columns designed for gravity loads and actions other than impact is therefore of interest. A comprehensive investigation is conducted on reinforced concrete columns, with a particular focus on the vulnerability of exposed columns and on mitigation techniques under low- to medium-velocity car and truck impacts. The investigation is based on non-linear explicit computer simulations of impacted columns, followed by a comprehensive validation process. The impact is simulated using force pulses generated from full-scale vehicle impact tests. A material model capable of simulating triaxial loading conditions is used in the analyses. Circular columns with capacities adequate for five- to twenty-storey buildings, designed according to Australian standards, are considered in the investigation. The crucial parameters associated with routine column designs, and the different load combinations applied at the serviceability stage on typical columns, are considered in detail. Axially loaded columns are examined at the initial stage, and the investigation is extended to analyse impact behaviour under single-axis bending and biaxial bending. The reduction in impact capacity under varying axial loads is also investigated. The effects of the various load combinations are quantified, and the residual capacity of the impacted columns, based on the extent of damage, together with mitigation techniques, is also presented. In addition, the contribution of each individual parameter to the failure load is scrutinized, and analytical equations are developed to identify the critical impulses in terms of the geometrical and material properties of the impacted column. In particular, an innovative technique was developed and introduced to improve the accuracy of the equations where other techniques failed owing to the shape of the error distribution. Above all, the equations can be used to quantify the critical impulse for three consecutive points (load combinations) located on the interaction diagram for a particular column. Consequently, linear interpolation can be used to quantify the critical impulse for loading points that lie in between on the interaction diagram. Given a known force and impulse pair for an average impact duration, this method can be extended to assess the vulnerability of columns for a general vehicle population, based on an analytical method that quantifies the critical peak forces under different impact durations. The contribution of this research is therefore not limited to producing simplified yet rational design guidelines and equations; it also provides a comprehensive solution for quantifying impact capacity, while delivering new insight to the scientific community on dealing with impacts.
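The interpolation step described above can be illustrated with a short sketch (this is not the thesis's actual equations): given critical impulses computed at consecutive load-combination points on a column's axial load–moment interaction diagram, the critical impulse for an intermediate loading point is estimated linearly. The axial loads and impulse values are hypothetical placeholders.

# Hedged sketch of linear interpolation of the critical impulse between
# load-combination points on a column interaction diagram. Values are
# illustrative only, not results from the thesis.

known_points = [                 # (axial load kN, critical impulse kN.s)
    (500.0, 18.0),
    (1500.0, 14.5),
    (2500.0, 9.0),
]

def critical_impulse(axial_load, points):
    """Linearly interpolate the critical impulse between bracketing points."""
    for (p0, i0), (p1, i1) in zip(points, points[1:]):
        if p0 <= axial_load <= p1:
            frac = (axial_load - p0) / (p1 - p0)
            return i0 + frac * (i1 - i0)
    raise ValueError("axial load outside the tabulated range")

print(critical_impulse(1000.0, known_points))   # -> 16.25 kN.s (illustrative)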

Relevance:

20.00%

Publisher:

Abstract:

A forced landing is an unscheduled event in flight requiring an emergency landing, and is most commonly attributed to engine failure, failure of avionics or adverse weather. Since the ability to conduct a successful forced landing is a primary indicator of safety in the aviation industry, automating this capability for unmanned aerial vehicles (UAVs) will help facilitate their integration into, and subsequent routine operations over, civilian airspace. Currently, there is no commercial system available to perform this task; however, a team at the Australian Research Centre for Aerospace Automation (ARCAA) is working towards developing such an automated forced landing system. This system, codenamed Flight Guardian, will operate onboard the aircraft and use machine vision for site identification, artificial intelligence for data assessment and evaluation, and path planning, guidance and control techniques to actualize the landing. This thesis focuses on research specific to the third category, and presents the design, testing and evaluation of a Trajectory Generation and Guidance System (TGGS) that navigates the aircraft to land at a chosen site following an engine failure. First, two algorithms are developed that adapt manned-aircraft forced landing techniques to the UAV planning problem. Algorithm 1 allows the UAV to select a route (from a library) based on a fixed glide range and the ambient wind conditions, while Algorithm 2 uses a series of adjustable waypoints to cater for changing winds. A comparison of both algorithms over 200 simulated forced landings found that, using Algorithm 2, twice as many landings were within the designated area, with an average lateral miss distance of 200 m at the aimpoint. These results provide a baseline for further refinements to the planning algorithms. A significant contribution is seen in the design of the 3-D Dubins Curves planning algorithm, which extends the elementary concepts underlying 2-D Dubins paths to account for powerless flight in three dimensions. This has also resulted in the development of new methods for testing path traversability, for losing excess altitude, and for the actual path formation to ensure aircraft stability. Simulations using this algorithm have demonstrated lateral and vertical miss distances of under 20 m at the approach point, in wind speeds of up to 9 m/s. This is more than a tenfold improvement on Algorithm 2 and emulates the performance of manned, powered aircraft. The lateral guidance algorithm originally developed by Park, Deyst, and How (2007) is enhanced to include wind information in the guidance logic. A simple assumption is also made that reduces the complexity of the algorithm in following a circular path, without sacrificing performance. Finally, a specific method of supplying the correct turning direction is also used. Simulations have shown that this new algorithm, named the Enhanced Nonlinear Guidance (ENG) algorithm, performs much better in changing winds, with cross-track errors at the approach point within 2 m, compared to over 10 m using Park's algorithm. A fourth contribution is made in designing the Flight Path Following Guidance (FPFG) algorithm, which uses path angle calculations and MacCready theory to determine the optimal speed to fly in winds. This algorithm also uses proportional-integral-derivative (PID) gain schedules to finely tune the tracking accuracies, and has demonstrated in simulation vertical miss distances of under 2 m in changing winds.
A fifth contribution is made in designing the Modified Proportional Navigation (MPN) algorithm, which uses principles from proportional navigation and the ENG algorithm, as well as methods of its own, to calculate the required pitch to fly. This algorithm is robust to wind changes and is easily adaptable to any aircraft type. Tracking accuracies obtained with this algorithm are also comparable to those obtained using the FPFG algorithm. For all three preceding guidance algorithms, a novel method utilising the geometric and time relationship between aircraft and path is also employed to ensure that the aircraft can still track the desired path to completion in strong winds, while remaining stabilised. Finally, a derived contribution is made in modifying the 3-D Dubins Curves algorithm to suit helicopter flight dynamics. This modification allows a helicopter to autonomously track both stationary and moving targets in flight, and is highly advantageous for applications such as traffic surveillance, police pursuit, security or payload delivery. Each of these achievements serves to enhance the on-board autonomy and safety of a UAV, which in turn will help facilitate the integration of UAVs into civilian airspace for a wider appreciation of the good that they can provide. The automated UAV forced landing planning and guidance strategies presented in this thesis will allow this technology to progress from the design and developmental stages through to a prototype system that can demonstrate its effectiveness to the UAV research and operations community.
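For context, the sketch below shows the baseline lateral guidance law of Park, Deyst and How (2007) that the thesis reports enhancing: the commanded lateral acceleration is a_cmd = 2V²/L1 · sin(η), where η is the angle between the velocity vector and the line to a reference point a distance L1 ahead on the desired path. Only this published form is reproduced; the ENG wind compensation and turning-direction logic are not, and the numerical values are illustrative only.

import math

# Baseline nonlinear lateral guidance law (Park, Deyst and How, 2007).
# Not the thesis's ENG algorithm; wind terms are deliberately omitted.

def lateral_accel_cmd(ground_speed, l1_distance, eta):
    """Lateral acceleration command toward a reference point L1 ahead.

    ground_speed : aircraft speed over the ground (m/s)
    l1_distance  : distance to the reference point on the desired path (m)
    eta          : angle between velocity vector and line to the point (rad)
    """
    return 2.0 * ground_speed ** 2 / l1_distance * math.sin(eta)

# Illustrative numbers: 20 m/s glide, 60 m look-ahead, 10 degree angle error.
print(f"{lateral_accel_cmd(20.0, 60.0, math.radians(10.0)):.2f} m/s^2")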

Relevance:

20.00%

Publisher:

Abstract:

These are changing times for teachers and their students in Australia, with the introduction of a national curriculum and standards-driven reform. While countries in Europe such as England, and in Asia such as Singapore, are changing policy to make more use of assessment to support and improve learning, it appears that we in Australia are moving towards policy that will raise the assessment stakes for the alleged purposes of transparency, accountability and fairness. What can be learnt from countries that have had years of high-stakes testing? How can Australia avoid the mistakes of past curriculum and assessment reform efforts? And how can Australian teachers build their capacity to maximise their use of the learning power of assessment? These are the key questions addressed in this presentation, with reference to innovative research from global networks that have maintained the assessment focus on learning.

Relevance:

20.00%

Publisher:

Abstract:

Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by A and the input dimension is n. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus A³√((log n)/m) (ignoring log A and log m factors), where m is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
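Written symbolically, the bound stated in the abstract takes the following form, where \hat{E} denotes the training-set error estimate related to squared error, A the per-unit bound on weight magnitudes, n the input dimension and m the number of training patterns (log A and log m factors suppressed):

\[
  \Pr[\text{misclassification}] \;\le\; \hat{E} \;+\; A^{3}\sqrt{\frac{\log n}{m}}
\]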

Relevance:

20.00%

Publisher:

Abstract:

Standardised testing does not recognise the creativity and skills of marginalised youth. This paper presents the development of an innovative approach to assessment designed for the re-engagement of at-risk youth who have left formal schooling and are now in an alternative education institution. An electronic portfolio system (EPS) has been developed to capture, record and build on the broad range of students' cultural and social capital. The 'assessment as a field of exchange' model draws on categories from sociological fields of capital and reconceptualises an eportfolio and social-networking hybrid system as a sociocultural zone of learning and development. The EPS, and assessment for learning more generally, are conceptualised as social fields for the exchange of capital (Bourdieu 1977, 1990). The research is underpinned by a sociocultural theoretical perspective that focuses on how students and teachers at the Flexible Learning Centre (FLC) develop and learn within the zone of proximal development (Vygotsky, 1978). The EPS is seen to be highly effective in fostering engagement and social interaction between students, teachers and institutions. It is argued throughout this paper that the EPS provides a structurally identifiable space, an arena of social activity, or a field of exchange. The students, teachers and the FLC within this field produce cultural capital exchanges. The term 'efield' (exchange field) has been coined to refer to this constructed abstract space. Initial results from the trial show a general tendency towards engagement with the EPS and potential for the attainment of socially valued cultural capital in the form of school credentials.

Relevance:

20.00%

Publisher:

Abstract:

In fault detection and diagnostics, limitations arising from the sensor network architecture are among the main challenges in evaluating a system's health status. Usually the design of the sensor network architecture is not based solely on diagnostic purposes; other factors, such as control requirements, financial constraints and practical limitations, are also involved. As a result, it is quite common to have one sensor (or one set of sensors) monitoring the behaviour of two or more components. This can significantly increase the complexity of diagnostic problems. In this paper a systematic approach is presented to deal with such complexities. It is shown how the problem can be formulated as a Bayesian-network-based diagnostic mechanism with latent variables. The developed approach is also applied to the problem of fault diagnosis in HVAC systems, an application area with considerable modeling and measurement constraints.
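A minimal sketch of the kind of situation the abstract describes is given below: one sensor monitors two components, so the component health states act as latent variables. The posteriors here are computed by brute-force enumeration rather than the Bayesian-network machinery used in the paper, and all probabilities are made-up placeholders, not the paper's HVAC model.

from itertools import product

# Two hidden component health states (C1, C2) observed through one shared
# sensor. Posterior fault probabilities are obtained by enumerating the
# joint distribution. All numbers below are illustrative placeholders.

P_FAULT = {"C1": 0.05, "C2": 0.10}          # prior fault probabilities

def p_sensor_abnormal(c1_faulty, c2_faulty):
    """Likelihood of an abnormal reading given the two hidden health states."""
    if c1_faulty or c2_faulty:
        return 0.9        # sensor usually flags a fault in either component
    return 0.05           # false-alarm rate when both components are healthy

def posterior_fault(component, sensor_abnormal=True):
    """P(component faulty | sensor reading), by enumerating latent states."""
    num = den = 0.0
    for c1, c2 in product([True, False], repeat=2):
        prior = (P_FAULT["C1"] if c1 else 1 - P_FAULT["C1"]) * \
                (P_FAULT["C2"] if c2 else 1 - P_FAULT["C2"])
        lik = p_sensor_abnormal(c1, c2)
        if not sensor_abnormal:
            lik = 1 - lik
        joint = prior * lik
        den += joint
        if (component == "C1" and c1) or (component == "C2" and c2):
            num += joint
    return num / den

print(f"P(C1 faulty | abnormal reading) = {posterior_fault('C1'):.3f}")
print(f"P(C2 faulty | abnormal reading) = {posterior_fault('C2'):.3f}")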