253 results for Subset Sum Problem


Relevance: 20.00%

Publisher:

Abstract:

Since the declaration by the United Nations that awareness raising should be a key part of efforts to combat human trafficking, government and non-government organisations have produced numerous public awareness campaigns designed to capture the public’s attention and sympathy. These campaigns represent the ‘problem’ of trafficking in specific ways, creating heroes and villains by placing the blame for trafficking on some, while obscuring the responsibility of others. This paper adopts Carol Bacchi’s ‘What is the problem represented to be?’ framework for examining the politicisation of problem representation in 18 anti-trafficking awareness campaigns. It is argued that these campaigns construct a narrow understanding of the problem through the depiction of ‘ideal offenders’. In particular, a strong focus on the demand for commercial sex as causative of human trafficking serves to obscure the problematic role of consumerism in a wide range of industries, and perpetuates an understanding of trafficking that fails to draw a necessary distinction between the demand for labour, and the demand for ‘exploitable’ labour. This problem representation also obscures the role governments in destination countries may play in causing trafficking through imposing restrictive migration regimes that render migrants vulnerable to traffickers.

Relevance: 20.00%

Publisher:

Abstract:

Non-government actors such as think-tanks are playing an important role in Australian policy work. As governments increasingly outsource policy work previously done by education departments and academics to these new policy actors, more think-tanks have emerged that represent a wide range of political views and ideological positions. This paper looks at the emergence of the Grattan Institute as one significant player in Australian education policy with a particular emphasis on Grattan’s report ‘Turning around low-performing schools’. Grattan exemplifies many of the facets of Barber’s ‘deliverology’, as they produce reports designed to be easily digested, simply actioned and provide reassurance that there is an answer, often through focusing on ‘what works’ recipes. ‘Turning around low-performing schools’ is a perfect example of this deliverology. However, a close analysis of the Report suggests that it contains four major problems which seriously impact its usefulness for schools and policymakers: it ignores data that may be more important in explaining the turn-around of schools, the Report is overly reliant on NAPLAN data, there are reasons to be suspicious about the evidence assembled, and finally the Report falls into a classic trap of logic—the post hoc fallacy.

Relevance: 20.00%

Publisher:

Abstract:

- Background: Teamwork sits comfortably within the vocabularies of most physical education teachers. It is used to both describe and prescribe student behaviour in a variety of physical and sport-related activities. Yet while supporters of sport and PE have readily employed the term, remarkably few pedagogues have taken the time to consider what teamwork refers to, let alone what it means to teach it.
- Focus of study: In this paper, we examine practitioners' constructions of teamwork.
- Participants and setting: Data were generated with seven physical education teachers (four male and three female) at a state-funded secondary school near Brisbane, Australia. The teachers ranged in experience from three months to more than 30 years.
- Research design: The investigation was a case study of one physical education department at a secondary school.
- Data collection: Three interviews were conducted with each of the teachers. The first was biographical in nature and covered themes such as education and sporting experiences. During the second interviews, teachers produced examples and statements on the topic of teamwork as it occurs within their lessons. The material from the second set of interviews was explored in the final set, where the teachers were invited to elaborate on and explain comments from their previous interviews.
- Analysis: Data were considered from a discursive-constructionist perspective, and attention was given to linguistic and grammatical features of the teachers' commentary as well as the cultural relevance of the utterances. The notion of ‘interpretive repertoires’ – essentially cultural explanations bounded by particular socio-linguistic features – provided the central unit of analysis.
- Findings: The teachers in the project made use of an array of discursive resources to make sense of teamwork. These constructions often bore little resemblance to one another or to existing theories of teamwork. In some cases, the teachers offered vague descriptions or drew on alternative concepts to make sense of teamwork.
- Conclusions: Without a certain level of agreement in their everyday usage, teachers' constructions of teamwork fail to be convincing or useful. We maintain that a more substantive conceptualisation of teamwork is needed in the field of sport pedagogy and offer suggestions on how this might be accomplished.

Relevance: 20.00%

Publisher:

Abstract:

In this paper the effects of a transfer on the intertemporal terms of trade are examined in the context of a simple two-country, two-period model. When intertemporal trade occurs because the two economies have different rates of time preference, a transfer improves the terms of trade of the paying country. Alternatively, when trade occurs owing to international differences in the endowments of goods over the two periods, the effect of a transfer depends on (a) the relationship between the interest rate and the rates of time preference of the two countries and (b) the relationship between their elasticities of intertemporal consumption substitution.
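The setup described above can be written as a minimal two-country, two-period endowment model; the notation below is our own sketch, not necessarily the paper's:

```latex
% Country $i \in \{A, B\}$ maximises lifetime utility
U^i = u(c_1^i) + \beta_i \, u(c_2^i),
% subject to its intertemporal budget constraint, where $T > 0$ is the
% transfer paid by country $A$ in period 1 and $r$ is the world interest rate:
c_1^A + \frac{c_2^A}{1+r} = y_1^A + \frac{y_2^A}{1+r} - T, \qquad
c_1^B + \frac{c_2^B}{1+r} = y_1^B + \frac{y_2^B}{1+r} + T.
% Goods-market clearing in period 1,
c_1^A + c_1^B = y_1^A + y_1^B,
% determines $r$; the intertemporal terms of trade are the relative price
% of period-2 consumption, $1/(1+r)$.
```

In this sketch, the two cases discussed in the abstract correspond to trade driven by different rates of time preference ($\beta_A \neq \beta_B$) versus trade driven by different endowment profiles $(y_1^i, y_2^i)$ across the two periods.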

Relevance: 20.00%

Publisher:

Abstract:

While the two decades since the study by Kavanagh et al. (1993) have given additional insights into effective dissemination of family interventions, the accompanying papers show that progress remains limited. The effectiveness trial that triggered this series of papers offers a cautionary tale. Despite management support, 30–35 hours of workshop training, and training of local supervisors who could act as champions, use of the full intervention was limited. In part this seemed due to the demanding nature of the intervention and its incompatibility with practitioners’ roles, and in part to limitations in the training, among other factors. While the accompanying papers note these and other barriers to dissemination, they miss a more disturbing finding in the original paper: practitioners said they were using several aspects of the intervention in routine care, despite being unable to accurately describe what those aspects were. This finding highlights the risks of taking practitioners’ reports of their practice in files or supervision sessions at face value, and potentially has implications for reports of other clinical work. The fidelity of disseminated treatments can only be assured by audits of practice, accompanied by affirming but also corrective feedback.

Relevance: 20.00%

Publisher:

Abstract:

Generating discriminative input features is a key requirement for achieving highly accurate classifiers. The process of generating features from raw data is known as feature engineering, and it can take significant manual effort. In this paper we propose automated feature engineering to derive a suite of additional features from a given set of basic features, with the aim both of improving classifier accuracy through discriminative features and of assisting data scientists through automation. Our implementation is specific to HTTP computer network traffic. To measure the effectiveness of our proposal, we compare the performance of a supervised machine learning classifier built with automated feature engineering against one using human-guided features. The classifier addresses a problem in computer network security, namely the detection of HTTP tunnels. We use Bro to process network traffic into base features and then apply automated feature engineering to calculate a larger set of derived features. The derived features are calculated without favour to any base feature and include entropy, length and N-grams for all string features, and counts and averages over time for all numeric features. Feature selection is then used to find the most relevant subset of these features. Testing showed that both classifiers achieved a detection rate above 99.93% at a false positive rate below 0.01%. For our datasets, we conclude that automated feature engineering can increase classifier development speed and reduce technical difficulty by removing manual feature engineering, while maintaining classification accuracy.
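The derived string features named above (length, entropy, N-grams) can be sketched in a few lines. The function and field names below are our own illustration, not the paper's actual Bro-based pipeline:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy (bits per character) of a string feature."""
    if not s:
        return 0.0
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def char_ngrams(s, n=2):
    """Overlapping character n-grams of a string feature."""
    return [s[i:i + n] for i in range(len(s) - n + 1)]

def derive_string_features(name, value):
    """Expand one base string feature into derived features."""
    return {
        f"{name}_len": len(value),
        f"{name}_entropy": shannon_entropy(value),
        f"{name}_distinct_bigrams": len(set(char_ngrams(value, 2))),
    }

# Hypothetical base feature extracted from an HTTP record:
features = derive_string_features("uri", "/index.html")
```

Feature selection would then be run over the full derived set, as the abstract describes, to keep only the most relevant columns.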

Relevance: 20.00%

Publisher:

Abstract:

Predicting temporal responses of ecosystems to disturbances associated with industrial activities is critical for their management and conservation. However, prediction of ecosystem responses is challenging due to the complexity and potential non-linearities stemming from interactions between system components and multiple environmental drivers. Prediction is particularly difficult for marine ecosystems due to their often highly variable and complex natures and the large uncertainties surrounding their dynamic responses. Consequently, current management of such systems often relies on expert judgement and/or complex quantitative models that consider only a subset of the relevant ecological processes. Hence there exists an urgent need for whole-of-system predictive models to support decision and policy makers in managing complex marine systems in the context of industry-based disturbances. This paper presents Dynamic Bayesian Networks (DBNs) for predicting the temporal response of a marine ecosystem to anthropogenic disturbances. The DBN provides a visual representation of the problem domain in terms of factors (parts of the ecosystem) and their relationships. These relationships are quantified via Conditional Probability Tables (CPTs), which estimate the variability and uncertainty in the distribution of each factor. The combination of qualitative visual and quantitative elements in a DBN facilitates the integration of a wide array of data, published and expert knowledge, and other models. Such multiple sources are often essential, as a single source of information is rarely sufficient to cover the diverse range of factors relevant to a management task. Here, a DBN model is developed for tropical, annual Halophila and temperate, persistent Amphibolis seagrass meadows to inform dredging management and help meet environmental guidelines. Specifically, the impacts of capital (e.g. new port development) and maintenance (e.g. maintaining channel depths in established ports) dredging are evaluated with respect to the risk of permanent loss, defined as no recovery within 5 years (Environmental Protection Agency guidelines). The model is developed using expert knowledge, existing literature, statistical models of environmental light, and experimental data. It is then demonstrated in a case study through the analysis of a variety of dredging, environmental and seagrass ecosystem recovery scenarios. In spatial zones significantly affected by dredging, such as the zone of moderate impact, shoot density has a very high probability of being driven to zero by capital dredging, owing to the duration of such dredging. Here, fast-growing Halophila species can recover; however, the probability of recovery depends on the presence of seed banks. Slow-growing Amphibolis meadows, on the other hand, have a high probability of suffering permanent loss. In the maintenance dredging scenario, by contrast, the shorter duration of dredging means Amphibolis is better able to resist its impacts. For both types of seagrass meadow, the probability of loss was strongly dependent on the biological and ecological status of the meadow, as well as environmental conditions post-dredging. The ability to predict ecosystem response under cumulative, non-linear interactions across a complex ecosystem highlights the utility of DBNs for decision support and environmental management.
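The core DBN mechanism referred to above is the propagation of a state distribution from one time slice to the next through a CPT. The toy sketch below illustrates only that mechanism; the states and probabilities are invented for illustration and are not the paper's seagrass model:

```python
# Toy two-slice DBN update: P(S_{t+1} = s') = sum_s P(S_t = s) * CPT[s][s'].
# States and numbers are hypothetical, not from the seagrass study.
STATES = ["zero", "low", "high"]  # invented shoot-density categories

# CPT[s][s'] = P(density_{t+1} = s' | density_t = s) during dredging
CPT = {
    "zero": {"zero": 0.90, "low": 0.10, "high": 0.00},
    "low":  {"zero": 0.50, "low": 0.40, "high": 0.10},
    "high": {"zero": 0.20, "low": 0.50, "high": 0.30},
}

def step(belief, cpt):
    """Advance the state distribution by one time slice."""
    return {
        s2: sum(belief[s1] * cpt[s1][s2] for s1 in belief)
        for s2 in STATES
    }

belief = {"zero": 0.0, "low": 0.0, "high": 1.0}  # start at high density
for _ in range(6):  # six dredging time slices
    belief = step(belief, CPT)
```

Repeating the update accumulates the probability mass in the "zero" state, mirroring how sustained capital dredging drives shoot density down in the model described above.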

Relevance: 20.00%

Publisher:

Abstract:

Evidence-based policy is a means of ensuring that policy is informed by more than ideology or expedience. However, what constitutes robust evidence is highly contested. In this paper, we argue policy must draw on quantitative and qualitative data. We do this in relation to a long entrenched problem in Australian early childhood education and care (ECEC) workforce policy. A critical shortage of qualified staff threatens the attainment of broader child and family policy objectives linked to the provision of ECEC and has not been successfully addressed by initiatives to date. We establish some of the limitations of existing quantitative data sets and consider the potential of qualitative studies to inform ECEC workforce policy. The adoption of both quantitative and qualitative methods is needed to illuminate the complex nature of the work undertaken by early childhood educators, as well as the environmental factors that sustain job satisfaction in a demanding and poorly understood working environment.

Relevance: 20.00%

Publisher:

Abstract:

Increasing numbers of medical schools in Australia and overseas have moved away from didactic teaching methodologies and embraced problem-based learning (PBL) to improve clinical reasoning and communication skills, as well as to encourage self-directed lifelong learning. In January 2005, the first cohort of students entered the new MBBS program at the Griffith University School of Medicine, Gold Coast, embarking on a fully integrated PBL curriculum that combines electronic delivery, communication and evaluation systems incorporating the cognitive principles that underpin the PBL process. This chapter examines the educational philosophies and design of the e-learning environment underpinning the processes developed to deliver, monitor and evaluate the curriculum. The key initiatives promoted within the conceptual model for the curriculum, taken to foster student engagement and innovative and distinctive approaches to student learning at Griffith, are (a) student engagement, (b) pastoral care, (c) staff engagement, (d) monitoring and (e) curriculum/program review. © 2007 Springer-Verlag Berlin Heidelberg.

Relevance: 20.00%

Publisher:

Abstract:

Background: Irreversible epidermal growth factor receptor (EGFR) inhibitors have demonstrated efficacy in NSCLC patients with activating EGFR mutations, but it is unknown whether they are superior to the reversible inhibitors. Dacomitinib is an oral, small-molecule irreversible inhibitor of all enzymatically active HER-family tyrosine kinases. Methods: The ARCHER 1009 (NCT01360554) and A7471028 (NCT00769067) studies randomized patients with locally advanced/metastatic NSCLC, following progression on one or two prior chemotherapy regimens, to dacomitinib or erlotinib. EGFR mutation testing was performed centrally on archived tumor samples. We pooled patients with exon 19 deletion and L858R EGFR mutations from both studies to compare the efficacy of dacomitinib with that of erlotinib. Results: One hundred and twenty-one patients with any EGFR mutation were enrolled; 101 had activating mutations in exon 19 or 21. For patients with exon 19/21 mutations, the median progression-free survival was 14.6 months [95% confidence interval (CI) 9.0–18.2] with dacomitinib and 9.6 months (95% CI 7.4–12.7) with erlotinib [unstratified hazard ratio (HR) 0.717 (95% CI 0.458–1.124), two-sided log-rank, P = 0.146]. The median survival was 26.6 months (95% CI 21.6–41.5) with dacomitinib versus 23.2 months (95% CI 16.0–31.8) with erlotinib [unstratified HR 0.737 (95% CI 0.431–1.259), two-sided log-rank, P = 0.265]. Dacomitinib was associated with a higher incidence of diarrhea and mucositis than erlotinib in both studies. Conclusions: Dacomitinib is an active agent with efficacy comparable to that of erlotinib in EGFR-mutated patients. The subgroup with exon 19 deletion had favorable outcomes with dacomitinib. An ongoing phase III study will compare dacomitinib with gefitinib in first-line therapy of patients with NSCLC harboring common activating EGFR mutations (ARCHER 1050; NCT01774721). Clinical trials numbers: ARCHER 1009 (NCT01360554) and A7471028 (NCT00769067).

Relevance: 20.00%

Publisher:

Abstract:

This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which information on the extent of flooding can be extracted. However, the great challenge in interpreting these data is to achieve more reliable flood extent mapping, including both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure-pixel and mixed-pixel problem. In this paper, a recently developed extended Support Vector Machine method for spectral unmixing has been applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated on a subset of Landsat ETM+ data recorded at the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
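As a point of reference for the conventional linear spectral mixture model mentioned above, here is a minimal two-endmember least-squares unmixing sketch. The band values are synthetic illustrations, not the Landsat ETM+ data used in the paper:

```python
def unmix_two_endmembers(pixel, water, dry):
    """
    Closed-form least-squares fraction f for the linear mixture
    pixel ≈ f * water + (1 - f) * dry, clamped to [0, 1].
    """
    diff = [w - d for w, d in zip(water, dry)]
    num = sum((p - d) * dv for p, d, dv in zip(pixel, dry, diff))
    den = sum(dv * dv for dv in diff)
    f = num / den
    return max(0.0, min(1.0, f))

# Synthetic 4-band endmember spectra (illustrative reflectances only):
water = [0.05, 0.04, 0.03, 0.01]
dry   = [0.10, 0.15, 0.20, 0.35]
mixed = [0.5 * w + 0.5 * d for w, d in zip(water, dry)]  # 50/50 mixed pixel
f = unmix_two_endmembers(mixed, water, dry)
```

A fully inundated pixel yields f near 1, a dry pixel f near 0, and a partly covered 'wet' pixel an intermediate fraction, which is exactly the information hard classifiers discard.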

Relevance: 20.00%

Publisher:

Abstract:

The most difficult operation in flood inundation mapping using optical flood images is to separate fully inundated areas from the ‘wet’ areas where trees and houses are partly covered by water. This can be regarded as a typical instance of the mixed-pixel problem. A number of automatic image classification algorithms have been developed over the years for flood mapping using optical remote sensing images. Most classification algorithms assign each pixel to the class label with the greatest likelihood. However, these hard classification methods often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve the mixed-pixel problem, advanced image processing techniques are adopted; linear spectral unmixing is one of the most popular soft classification techniques used for mixed-pixel analysis. Good performance in linear spectral unmixing depends on two important issues: the method of selecting endmembers and the method of modelling the endmembers for unmixing. This paper presents an improved adaptive selection of the endmember subset for each pixel in spectral unmixing, for reliable flood mapping. Using a fixed set of endmembers to unmix every pixel in an entire image can over-estimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. By contrast, applying an estimated adaptive subset of endmembers for each pixel can decrease the residual error in unmixing results and provide reliable output. This paper also shows that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three different sets of Landsat-5 TM images of three different flood events in Australia, to examine the method under different flooding conditions, and satisfactory flood mapping outcomes were achieved.
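The general idea of a per-pixel adaptive endmember subset can be caricatured as: keep only the endmembers that could plausibly contribute to a given pixel, then unmix with that reduced set. The sketch below uses a simple spectral-distance criterion with invented spectra and threshold; it is an illustration of the concept, not the authors' algorithm:

```python
def select_endmember_subset(pixel, endmembers, max_dist=0.3):
    """
    Keep only endmembers spectrally close enough to the pixel to
    plausibly contribute to it (Euclidean distance; threshold invented).
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return {name: spec for name, spec in endmembers.items()
            if dist(pixel, spec) <= max_dist}

endmembers = {  # hypothetical 3-band endmember spectra
    "water":      [0.05, 0.04, 0.02],
    "vegetation": [0.08, 0.30, 0.25],
    "soil":       [0.20, 0.25, 0.35],
}
pixel = [0.06, 0.06, 0.04]  # spectrally close to open water
subset = select_endmember_subset(pixel, endmembers)
```

Unmixing this pixel against the reduced subset, rather than all three classes, avoids attributing spurious fractional abundance to vegetation and soil, which is the over-estimation problem described above.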

Relevance: 20.00%

Publisher:

Abstract:

The most difficult operation in flood inundation mapping using optical flood images is to map the ‘wet’ areas where trees and houses are partly covered by water. This can be regarded as a typical instance of the mixed-pixel problem. A number of automatic image classification algorithms have been developed over the years for flood mapping using optical remote sensing images, with most labelling each pixel as a single class. However, they often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve this problem, spectral unmixing methods have been developed. In this thesis, the two most important issues in spectral unmixing are investigated: the methods for selecting endmembers and the methods for modelling the primary classes for unmixing. We conduct comparative studies of three typical spectral unmixing algorithms: Partial Constrained Linear Spectral Unmixing, Multiple Endmember Selection Mixture Analysis, and spectral unmixing using the Extended Support Vector Machine method. They are analysed and assessed through error analysis in flood mapping using MODIS, Landsat and WorldView-2 images. The conventional Root Mean Square Error assessment is applied to obtain errors for the estimated fractions of each primary class. Moreover, a newly developed Fuzzy Error Matrix is used to obtain a clear picture of error distributions at the pixel level. This thesis shows that the Extended Support Vector Machine method is able to provide a more reliable estimation of fractional abundances and allows the use of a complete set of training samples to model a defined pure class. Furthermore, it can be applied to the analysis of both pure and mixed pixels to provide integrated hard-soft classification results. Our research also identifies and explores a serious drawback of endmember selection in current spectral unmixing methods, which apply a fixed set of endmember classes or pure classes to the mixture analysis of every pixel in an entire image. Since it is not accurate to assume that every pixel must contain all endmember classes, these methods usually cause an over-estimation of the fractional abundances in a particular pixel. In this thesis, a subset of adaptive endmembers for every pixel is derived using the proposed methods to form an endmember index matrix. The experimental results show that using pixel-dependent endmembers in unmixing significantly improves performance.
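The per-class Root Mean Square Error assessment mentioned above compares estimated fractional abundances against reference fractions, class by class. A straightforward sketch (the fraction values are synthetic, for illustration only):

```python
def rmse_per_class(true_fracs, est_fracs):
    """
    RMSE of estimated fractional abundances for each primary class.
    true_fracs / est_fracs: {class_name: [fraction per pixel]}.
    """
    out = {}
    for cls in true_fracs:
        t, e = true_fracs[cls], est_fracs[cls]
        out[cls] = (sum((a - b) ** 2 for a, b in zip(t, e)) / len(t)) ** 0.5
    return out

# Synthetic reference vs estimated water fractions for four pixels:
true_f = {"water": [1.0, 0.5, 0.0, 0.25]}
est_f  = {"water": [0.9, 0.5, 0.1, 0.25]}
errors = rmse_per_class(true_f, est_f)
```

A pixel-level Fuzzy Error Matrix, as used in the thesis, goes further than this aggregate figure by showing how the error is distributed across class pairs.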