362 results for problem complexity
Abstract:
Staffing rural and remote schools is an important policy issue for the public good. This paper examines the private issues it also poses for teachers with families working in these communities, as they seek to reconcile careers with educational choices for children. The paper first considers historical responses to staffing rural and remote schools in Australia, and the emergence of neoliberal policy encouraging marketisation of the education sector. We report on interviews about considerations motivating household mobility with 11 teachers across regional, rural and remote communities in Queensland. Like other middle-class parents, these teachers prioritised their children’s educational opportunities over career opportunities. The analysis demonstrates how teachers in rural and remote communities constitute a special group of educational consumers with insider knowledge and unique dilemmas around school choice. Their heightened anxieties around school choice under neoliberal policy are shown to contribute to the public issue of staffing rural and remote schools.
Abstract:
The development and maintenance of large and complex ontologies are often time-consuming and error-prone. Thus, automated ontology learning and revision have attracted intensive research interest. In data-centric applications where ontologies are designed or automatically learnt from the data, when new data instances are added that contradict the ontology, it is often desirable to revise the ontology incrementally according to the added data. This problem can be intuitively formulated as the problem of revising a TBox by an ABox. In this paper we introduce a model-theoretic approach to such an ontology revision problem, using a novel alternative semantic characterisation of DL-Lite ontologies. We show that our ontology revision satisfies several desirable properties. We have also developed an algorithm for reasoning with the ontology revision without computing the revision result. The algorithm is efficient, as its computational complexity is in coNP in the worst case and in PTIME when the size of the new data is bounded.
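Abstracted to its simplest form, the setting can be pictured with plain data structures: a TBox of terminological axioms and an ABox of assertions, where newly added assertions contradict an axiom and force a revision. The sketch below is only a naive syntactic illustration of that idea, with invented class names and a crude "drop the violated axiom" policy; it is not the paper's model-theoretic DL-Lite operator or its coNP/PTIME algorithm.

```python
# Toy illustration (not the paper's model-theoretic operator): a TBox with a
# disjointness axiom is revised when newly added ABox assertions contradict it.
# Class names and the "drop the violated axiom" policy are hypothetical.

tbox = {("Disjoint", "Student", "Staff"), ("SubClassOf", "PhDStudent", "Student")}
abox = {("Student", "alice")}

def violated_disjointness(tbox, abox):
    """Return disjointness axioms contradicted by the ABox."""
    members = {}
    for cls, ind in abox:
        members.setdefault(ind, set()).add(cls)
    return {ax for ax in tbox
            if ax[0] == "Disjoint"
            and any({ax[1], ax[2]} <= classes for classes in members.values())}

def naive_revise(tbox, abox, new_assertions):
    """Add new ABox data, then weaken the TBox by removing contradicted axioms."""
    abox = abox | new_assertions
    return tbox - violated_disjointness(tbox, abox), abox

new_data = {("Staff", "alice")}          # alice is now also asserted to be staff
revised_tbox, revised_abox = naive_revise(tbox, abox, new_data)
print(revised_tbox)                      # the Disjoint(Student, Staff) axiom is dropped
```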
Abstract:
Smart Card Automated Fare Collection (AFC) data has been extensively exploited to understand passenger behavior, passenger segmentation and trip purpose, and to improve transit planning through spatial travel pattern analysis. The literature has evolved from simple to more sophisticated methods, for example from aggregated to individual travel pattern analysis, and from stop-to-stop to flexible stop aggregation. However, high computational complexity has limited the practical application of these methods. This paper proposes a new algorithm, the Weighted Stop Density Based Scanning Algorithm with Noise (WS-DBSCAN), based on the classical Density Based Scanning Algorithm with Noise (DBSCAN), to detect and update daily changes in travel patterns. WS-DBSCAN reduces the classical DBSCAN's quadratic computational complexity to a sub-quadratic problem. A numerical experiment using real AFC data from South East Queensland, Australia shows that the algorithm requires only 0.45% of the computation time of the classical DBSCAN while producing the same clustering results.
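The weighted, density-based clustering idea can be approximated with an off-the-shelf tool: scikit-learn's standard DBSCAN accepts per-point sample weights, so heavily used stops contribute more density than rarely used ones. The sketch below is only that approximation, with invented stop coordinates, boarding counts, eps and min_samples values; it is not the WS-DBSCAN algorithm or its sub-quadratic update scheme.

```python
# A minimal sketch of weighted density-based clustering of boarding stops,
# approximating the idea behind WS-DBSCAN with scikit-learn's standard DBSCAN.
# The coordinates, weights, eps and min_samples values are illustrative only.
import numpy as np
from sklearn.cluster import DBSCAN

# Each row is a stop location (x, y in metres); weights count boardings per stop,
# so a heavily used stop contributes more "density" than a rarely used one.
stops = np.array([[0, 0], [50, 30], [80, 10], [5000, 5000], [5030, 4980]], dtype=float)
boardings = np.array([120, 3, 45, 60, 2], dtype=float)

clustering = DBSCAN(eps=200.0, min_samples=50).fit(stops, sample_weight=boardings)
print(clustering.labels_)   # -1 marks noise; other labels are travel-pattern clusters
```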
Abstract:
Biomedical systems involve a large number of entities and intricate interactions between them. Their direct analysis is therefore difficult, and it is often necessary to rely on computational models. These models require significant resources and parallel computing solutions, approaches that are particularly well suited given the inherently parallel nature of biomedical systems. Model hybridisation also permits the integration and simultaneous study of multiple aspects and scales of these systems, thus providing an efficient platform for multidisciplinary research.
A framework for understanding and generating integrated solutions for residential peak energy demand
Abstract:
Supplying peak energy demand in a cost-effective, reliable manner is a critical focus for utilities internationally. Successfully addressing peak energy concerns requires understanding of all the factors that affect electricity demand, especially at peak times. This paper builds on past attempts to propose a model designed to aid our understanding of the influences on residential peak energy demand in a systematic and comprehensive way. Our model has been developed through a group model building process as a systems framework of the problem situation, to model the complexity within and between systems and to indicate how changes in one element might flow on to others. It comprises themes (social, technical and change management options) networked together in a way that captures their influence on and association with each other, as well as their influence, association and impact on appliance usage and residential peak energy demand. The real value of the model is in creating awareness, understanding and insight into the complexity of residential peak energy demand, and in working with this complexity to identify and integrate the social, technical and change management option themes and their impact on appliance usage and residential energy demand at peak times.
Abstract:
The mining industry is highly suited to the application of robotics and automation technology, since the work is both arduous and dangerous. However, while the industry makes extensive use of mechanisation, it has shown a slow uptake of automation. A major cause of this is the complexity of the task and the limitations of existing automation technology, which is predicated on a structured and time-invariant working environment. Here we discuss the topic of mining automation from a robotics and computer vision perspective: as a problem in sensor-based robot control, an issue which the robotics community has been studying for nearly two decades. We then describe two of our current mining automation projects to demonstrate what is possible for both open-pit and underground mining operations.
Abstract:
Many researchers in the field of civil structural health monitoring (SHM) have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Fieldwork has also been conducted by many researchers and practitioners on more complex operating bridges. Most laboratory structures do not adequately replicate the complexity of truss bridges. Informed by a brief review of the literature, this paper documents the design and proposed test plan of a structurally complex laboratory bridge model that has been specifically designed for the purpose of SHM research. Preliminary results have been presented in the companion paper.
Abstract:
Many researchers in the field of civil structural health monitoring have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Fieldwork has also been conducted by many researchers and practitioners on more complex operating bridges. Most laboratory structures do not adequately replicate the complexity of truss bridges. This paper presents some preliminary results of experimental modal testing and analysis of the bridge model presented in the companion paper, using the peak picking method, and compares these results with those of a simple numerical model of the structure. Three dominant modes of vibration were experimentally identified below 15 Hz. The mode shapes and the order of the modes matched those of the numerical model; however, the frequencies did not match.
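For readers unfamiliar with peak picking, the basic idea is to estimate natural frequencies as the peaks of an averaged response spectrum. The sketch below illustrates this on a synthetic acceleration record containing three invented components below 15 Hz; the signal, frequencies and peak-detection settings are assumptions, not the bridge model's measured data.

```python
# A minimal sketch of the peak-picking idea: estimate natural frequencies as the
# peaks of an averaged response spectrum. The synthetic three-component signal and
# all frequency values are illustrative, not the bridge model's measured data.
import numpy as np
from scipy.signal import welch, find_peaks

fs = 200.0                                   # sampling rate in Hz
t = np.arange(0, 60, 1 / fs)
modes = [3.2, 7.8, 12.5]                     # hypothetical modal frequencies below 15 Hz
accel = sum(np.sin(2 * np.pi * f * t) for f in modes)
accel += 0.05 * np.random.default_rng(0).standard_normal(t.size)   # measurement noise

freqs, psd = welch(accel, fs=fs, nperseg=4096)
peaks, _ = find_peaks(psd, prominence=psd.max() * 0.01)
print(freqs[peaks][freqs[peaks] < 15])       # picked frequencies near 3.2, 7.8, 12.5 Hz
```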
Abstract:
Urbanisation significantly changes the characteristics of a catchment as natural areas are transformed to impervious surfaces such as roads, roofs and parking lots. The increased fraction of impervious surfaces leads to changes in stormwater runoff characteristics, whilst a variety of anthropogenic activities common to urban areas generate a range of pollutants such as nutrients, solids and organic matter. These pollutants accumulate on catchment surfaces and are removed and transported by stormwater runoff, thereby contributing pollutant loads to receiving waters. In summary, urbanisation influences the stormwater characteristics of a catchment, including hydrology and water quality. Due to the growing recognition that stormwater pollution is a significant environmental problem, the implementation of mitigation strategies to improve the quality of stormwater runoff is becoming increasingly common in urban areas. A scientifically robust stormwater quality treatment strategy is an essential requirement for effective urban stormwater management. The efficient design of treatment systems is closely dependent on the state of knowledge in relation to the primary factors influencing stormwater quality. In this regard, stormwater modelling outcomes provide designers with important guidance and datasets which significantly underpin the design of effective stormwater treatment systems. Therefore, the accuracy of modelling approaches and the reliability of modelling outcomes are of particular concern. This book discusses the inherent complexity and key characteristics in the areas of urban hydrology and stormwater quality, based on the influence exerted by a range of rainfall and catchment characteristics. A comprehensive field sampling and testing programme in relation to pollutant build-up, an urban catchment monitoring programme in relation to stormwater quality and the outcomes from advanced statistical analyses provided the platform for the knowledge creation. Two case studies and two real-world applications are discussed to illustrate the translation of the knowledge created into practical use in relation to the role of rainfall and catchment characteristics in urban stormwater quality. An innovative rainfall classification based on stormwater quality was developed to support the effective and scientifically robust design of stormwater treatment systems. Underpinned by the rainfall classification methodology, a reliable approach for design rainfall selection is proposed in order to optimise stormwater treatment based on both stormwater quality and quantity. This is a paradigm shift from the common approach where stormwater treatment systems are designed based solely on stormwater quantity data. Additionally, how pollutant build-up and stormwater runoff quality vary with a range of catchment characteristics was also investigated. Based on the study outcomes, it can be concluded that the use of only a limited number of catchment parameters, such as land use and impervious surface percentage, as is the case in current modelling approaches, could result in appreciable error in water quality estimation. Influential factors which should be incorporated into modelling in relation to catchment characteristics should also include urban form and impervious surface area distribution.
The knowledge created through the research investigations discussed in this monograph is expected to make a significant contribution to engineering practice such as hydrologic and stormwater quality modelling, stormwater treatment design and urban planning, as the study outcomes provide practical approaches and recommendations for urban stormwater quality enhancement. Furthermore, this monograph also demonstrates how fundamental knowledge of stormwater quality processes can be translated to provide guidance on engineering practice, the comprehensive application of multivariate data analysis techniques, and a paradigm for the integrative use of computer models and mathematical models to derive practical outcomes.
Abstract:
Guaranteeing Quality of Service (QoS) with minimum computation cost is the most important objective of cloud-based MapReduce computations. Minimizing the total computation cost of cloud-based MapReduce computations is achieved through MapReduce placement optimization. MapReduce placement optimization approaches can be classified into two categories: homogeneous MapReduce placement optimization and heterogeneous MapReduce placement optimization. It is generally believed that heterogeneous MapReduce placement optimization is more effective than homogeneous MapReduce placement optimization in reducing the total running cost of cloud-based MapReduce computations. This paper proposes a new approach to the heterogeneous MapReduce placement optimization problem. In this new approach, the heterogeneous MapReduce placement optimization problem is transformed into a constrained combinatorial optimization problem and is solved by an innovative constructive algorithm. Experimental results show that the running cost of the cloud-based MapReduce computation platform using this new approach is 24.3%-44.0% lower than that using the most popular homogeneous MapReduce placement approach, and 2.0%-36.2% lower than that using a heterogeneous MapReduce placement approach that does not consider the spare resources from existing MapReduce computations. The experimental results also demonstrate the good scalability of this new approach.
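As a rough picture of what a constructive placement heuristic can look like, the sketch below greedily reuses spare slots on already-rented virtual machines before renting the cheapest capacity for the remaining tasks. The VM types, prices and task counts are hypothetical, and the paper's actual constrained combinatorial formulation and algorithm are not reproduced here.

```python
# A minimal sketch of a constructive (greedy) heuristic for heterogeneous MapReduce
# placement: reuse spare slots first, then rent the cheapest capacity per slot.
# The VM types, prices and task demands are hypothetical, not the paper's model.
from dataclasses import dataclass

@dataclass
class VMType:
    name: str
    hourly_cost: float
    capacity: int                       # task slots per VM instance
    free_slots: int = 0                 # spare slots on already-rented instances
    rented: int = 0

def place(tasks: int, vm_types: list[VMType]) -> float:
    """Greedily place `tasks` identical tasks, reusing spare slots before renting."""
    total_cost = 0.0
    # 1. Use spare capacity on existing instances first (it costs nothing extra).
    for vm in sorted(vm_types, key=lambda v: v.free_slots, reverse=True):
        used = min(tasks, vm.free_slots)
        vm.free_slots -= used
        tasks -= used
    # 2. Rent new instances of the cheapest type per slot until all tasks fit.
    while tasks > 0:
        vm = min(vm_types, key=lambda v: v.hourly_cost / v.capacity)
        vm.rented += 1
        total_cost += vm.hourly_cost
        tasks -= vm.capacity
    return total_cost

fleet = [VMType("small", 0.10, 2, free_slots=1), VMType("large", 0.34, 8)]
print(place(20, fleet))                 # cost of renting enough capacity for 20 tasks
```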
Abstract:
Trade union membership, both in aggregate numbers and in density, has declined in the majority of advanced economies globally over recent decades (Blanchflower, 2007). In Australia, the decline in the 1990s was somewhat more precipitous than in most countries (Peetz, 1998). As discussed in Chapter 1, the reasons for the decline are multifactorial, including a more hostile environment for unionism created by employers and the state, difficulties with workplace union organisation, and structural change in the economy (Bryson and Gomez, 2005; Bryson et al., 2011; Ebbinghaus et al., 2011; Payne, 1989; Waddington and Kerr, 2002; Waddington and Whitson, 1997). Our purpose in this chapter is to look beyond aggregate Australian union density data, to examine how age relates to membership decline, and how different age groups, particularly younger workers, are located in the story of union decline. The practical implication of this research is that understanding how unions relate to workers of different age groups, and to workers of different genders amongst those age groups, may lead to improved recruitment and better union organisation.
Abstract:
Particle Swarm Optimization (PSO) is a biologically inspired computational search and optimization method based on the social behavior of bird flocking and fish schooling. Although PSO has been shown to solve many well-known numerical test problems, it suffers from premature convergence. A number of basic variations have been developed to address the premature convergence problem and improve the quality of the solutions found by PSO. This study presents a comprehensive survey of the various PSO-based algorithms. As part of this survey, the authors include a classification of the approaches and identify the main features of each proposal. In the last part of the study, some of the topics within this field that are considered promising areas of future research are listed.
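For reference, the canonical global-best PSO loop that these variants build on can be written in a few lines. The sketch below minimises the sphere test function with common textbook parameter choices for the inertia weight and acceleration coefficients; none of the surveyed variants or their specific settings are implemented.

```python
# A minimal sketch of the canonical global-best PSO loop, here minimising the
# sphere test function. The inertia weight, acceleration coefficients, swarm
# size and iteration count are common textbook choices, not taken from the survey.
import numpy as np

def pso(objective, dim=5, swarm_size=30, iters=200, w=0.72, c1=1.49, c2=1.49, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (swarm_size, dim))     # particle positions
    vel = np.zeros_like(pos)                            # particle velocities
    pbest = pos.copy()                                  # personal best positions
    pbest_val = np.apply_along_axis(objective, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()            # global best position
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.apply_along_axis(objective, 1, pos)
        improved = vals < pbest_val                     # update personal bests
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()        # update global best
    return gbest, pbest_val.min()

best_x, best_f = pso(lambda x: np.sum(x ** 2))          # sphere function
print(best_f)                                           # close to 0 after 200 iterations
```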
Abstract:
In the past two decades, complexity thinking has emerged as an important theoretical response to the limitations of orthodox ways of understanding educational phenomena. Complexity provides ways of understanding that embrace uncertainty, non-linearity and the inevitable ‘messiness’ that is inherent in educational settings, paying attention to the ways in which the whole is greater than the sum of its parts. This is the first book to focus on complexity thinking in the context of physical education, enabling fresh ways of thinking about research, teaching, curriculum and learning. Written by a team of leading international physical education scholars, the book highlights how the considerable theoretical promise of complexity can be reflected in the actual policies, pedagogies and practices of physical education (PE). It encourages teachers, educators and researchers to embrace notions of learning that are more organic and emergent, to allow the inherent complexity of pedagogical work in PE to be examined more broadly and inclusively. In doing so, Complexity Thinking in Physical Education makes a major contribution to our understanding of pedagogy, curriculum design and development, human movement and educational practice.
Abstract:
Index tracking is an investment approach whose primary objective is to keep portfolio returns as close as possible to those of a target index without purchasing all index components. The main purpose is to minimize the tracking error between the returns of the selected portfolio and the benchmark. In this paper, quadratic as well as linear models are presented for minimizing the tracking error. Uncertainty in the input data is handled using a tractable robust framework that controls the level of conservatism while maintaining linearity. The linearity of the proposed robust optimization models allows a simple implementation with an ordinary optimization software package to find the optimal robust solution. The proposed model employs the Morgan Stanley Capital International (MSCI) Index as the target index, and results are reported for six national indices: Japan, the USA, the UK, Germany, Switzerland and France. The performance of the proposed models is evaluated using several financial criteria, e.g. the information ratio, market ratio, Sharpe ratio and Treynor ratio. The preliminary results demonstrate that the proposed model lowers the tracking error while raising the values of the portfolio performance measures.
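The linear tracking-error idea can be illustrated directly as a linear program: introduce one nonnegative deviation variable per period and minimise the mean absolute difference between portfolio and benchmark returns. The sketch below uses randomly generated stand-in returns rather than MSCI data, and it omits the robust (uncertainty-set) extension described in the paper.

```python
# A minimal sketch of a linear index-tracking formulation: choose portfolio
# weights w to minimise the mean absolute tracking error against a benchmark.
# The return matrix is random stand-in data, not MSCI or national index data.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, n = 60, 8                               # 60 periods, 8 candidate assets
R = rng.normal(0.005, 0.04, (T, n))        # asset returns (hypothetical)
r_b = R @ np.full(n, 1.0 / n) + rng.normal(0, 0.002, T)   # benchmark returns

# Variables: [w_1..w_n, d_1..d_T]; minimise (1/T) * sum(d_t)
c = np.concatenate([np.zeros(n), np.full(T, 1.0 / T)])
# |R w - r_b| <= d  becomes  R w - d <= r_b  and  -R w - d <= -r_b
A_ub = np.block([[R, -np.eye(T)], [-R, -np.eye(T)]])
b_ub = np.concatenate([r_b, -r_b])
A_eq = np.concatenate([np.ones(n), np.zeros(T)]).reshape(1, -1)   # weights sum to 1
bounds = [(0, 1)] * n + [(0, None)] * T    # long-only weights, nonnegative deviations

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
print(res.x[:n], res.fun)                  # tracking weights and mean absolute error
```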
Abstract:
Lattice-based cryptographic primitives are believed to offer resilience against attacks by quantum computers. We demonstrate the practicality of post-quantum key exchange by constructing cipher suites for the Transport Layer Security (TLS) protocol that provide key exchange based on the ring learning with errors (R-LWE) problem, and we accompany these cipher suites with a rigorous proof of security. Our approach ties lattice-based key exchange together with traditional authentication using RSA or elliptic curve digital signatures: the post-quantum key exchange provides forward secrecy against future quantum attackers, while authentication can be provided using RSA keys issued by today's commercial certificate authorities, smoothing the path to adoption. Our cryptographically secure implementation, aimed at the 128-bit security level, reveals that the performance price of switching from non-quantum-safe key exchange is not too high. With our R-LWE cipher suites integrated into the OpenSSL library and using the Apache web server on a 2-core desktop computer, we could serve 506 RLWE-ECDSA-AES128-GCM-SHA256 HTTPS connections per second for a 10 KiB payload. Compared to elliptic curve Diffie-Hellman, this means an 8 KiB larger handshake and a reduction in throughput of only 21%. This demonstrates that provably secure post-quantum key exchange can already be considered practical.
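The core numerical fact behind such a key exchange is that the two parties derive ring elements that differ only by a small error term, which a reconciliation step then turns into identical key bits. The toy sketch below illustrates only that approximate-agreement property, with illustrative parameters n and q, no reconciliation, and no claim to the paper's parameter set or security level.

```python
# A toy numerical illustration of the R-LWE key-exchange idea (no reconciliation,
# no security): both parties derive ring elements that differ only by a small
# error term. Parameters n and q are illustrative, not the paper's 128-bit set.
import numpy as np

n, q = 256, 12289
rng = np.random.default_rng(2)

def poly_mul(a, b):
    """Multiply polynomials in Z_q[x]/(x^n + 1) (negacyclic convolution)."""
    c = np.zeros(2 * n, dtype=np.int64)
    c[:2 * n - 1] = np.convolve(a.astype(np.int64), b.astype(np.int64))
    return (c[:n] - c[n:]) % q

def small():                                 # small secret/error polynomial
    return rng.integers(-1, 2, n)

a = rng.integers(0, q, n)                    # public uniform ring element
s_alice, e_alice = small(), small()
s_bob, e_bob = small(), small()

b_alice = (poly_mul(a, s_alice) + e_alice) % q   # Alice's public value
b_bob = (poly_mul(a, s_bob) + e_bob) % q         # Bob's public value

k_alice = poly_mul(b_bob, s_alice)           # a*s_alice*s_bob + e_bob*s_alice
k_bob = poly_mul(b_alice, s_bob)             # a*s_alice*s_bob + e_alice*s_bob

diff = ((k_alice - k_bob + q // 2) % q) - q // 2   # centred difference
print(np.abs(diff).max(), "vs modulus", q)   # the gap is tiny relative to q, so both
                                             # sides could agree on key bits after reconciliation
```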