Abstract:
Problem, research strategy and findings: On January 10, 2011, the town of Grantham, Queensland (Australia), was inundated by a flash flood in which 12 of the town's 370 residents drowned. The overall damage bill in Queensland was AUD$2.38 billion (USD$2.4 billion) with 35 deaths, and more than three-quarters of the state was declared a flood disaster zone. In this study, we focus on the unusual, even rare, decision made in March 2011 to relocate Grantham. The Lockyer Valley Regional Council (LVRC) acquired a 377-hectare (932-acre) site to enable a voluntary swap of equivalent-sized lots. In addition, planning regulations were set aside to streamline the relocation of a portion of the town. We review the natural hazard literature as it relates to community relocation, state and local government documents related to Grantham, and reports and newspaper articles related to the flood. We also analyze data from interviews with key stakeholders. We document the process of community relocation, assess the relocation process in Grantham against best practice, and examine whether the process of community relocation can be upscaled and whether the Grantham relocation is an example of good planning or good politics. Takeaway for practice: Our study reveals two key messages for practice. Community relocation (albeit a small one) is possible, and the process can be done quickly; some Grantham residents moved into their new, relocated homes in December 2011, just 11 months after the flood. Moreover, existing planning regulations can hinder quick action; political leadership, particularly at the local level, is key to implementing the relocation.
Abstract:
Bearing faults are the most common cause of wind turbine failures. The unavailability and maintenance costs of wind turbines are becoming critically important as their share of electric networks grows rapidly. Early fault detection can reduce outage time and costs. This paper proposes Anomaly Detection (AD) machine learning algorithms for fault diagnosis of wind turbine bearings. The method was applied to a real data set, and the results are presented in this paper. For validation and comparison purposes, a set of baseline results is produced using the popular one-class SVM method to examine the ability of the proposed technique to detect incipient faults.
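The abstract does not include the authors' code; as a rough illustration of the one-class SVM baseline it mentions, the sketch below trains only on healthy-bearing feature vectors and flags departures as incipient faults. The synthetic features and the nu/gamma settings are assumptions, not the paper's configuration.

```python
# Hypothetical sketch of a one-class SVM anomaly-detection baseline:
# fit on healthy data only, then flag outliers as potential bearing faults.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Placeholder features (e.g. RMS and kurtosis of vibration signals);
# real inputs would come from condition-monitoring data, not simulation.
healthy = rng.normal(0.0, 1.0, size=(500, 2))
faulty = rng.normal(3.0, 1.0, size=(20, 2))

scaler = StandardScaler().fit(healthy)
model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
model.fit(scaler.transform(healthy))

# predict() returns +1 for inliers (healthy) and -1 for anomalies.
labels = model.predict(scaler.transform(faulty))
print(f"flagged {np.sum(labels == -1)} of {len(faulty)} faulty samples")
```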
Abstract:
Overvoltage and overloading due to high PV penetration are the main power quality concerns for future distribution power systems. This paper proposes a distributed control coordination strategy to manage multiple PVs within a network and overcome these issues. PV reactive power is used to deal with overvoltages, and PV active power curtailment is regulated to avoid overloading. The proposed control structure shares the required contribution fairly among PVs, in proportion to their ratings. The approach is examined on a practical distribution network with multiple PVs.
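As a minimal sketch of the rating-proportional fairness rule the abstract describes (the distributed coordination layer itself is omitted, and the unit names are invented for illustration):

```python
# Split a required contribution among PV units in proportion to ratings.
def share_contribution(required_total, ratings):
    """required_total: e.g. kW of curtailment or kvar of reactive power;
    ratings: dict mapping PV unit name -> rated capacity."""
    total_rating = sum(ratings.values())
    return {pv: required_total * r / total_rating for pv, r in ratings.items()}

# Example: 30 kW of curtailment shared among three hypothetical PV units.
print(share_contribution(30.0, {"PV1": 5.0, "PV2": 10.0, "PV3": 15.0}))
# -> {'PV1': 5.0, 'PV2': 10.0, 'PV3': 15.0}
```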
Abstract:
Understanding the dynamics of disease spread is of crucial importance in contexts ranging from estimating the load on medical services to risk assessment and intervention policies against large-scale epidemic outbreaks. However, most of the information becomes available after the spread itself, and preemptive assessment is far from trivial. Here, we investigate the use of agent-based simulations to model such outbreaks in a stylised urban environment. For most diseases, infection of a new individual may occur through casual contact in crowds as well as through repeated interactions with social partners such as work colleagues or family members. Our model therefore accounts for these two phenomena. This paper presents the initial framework for such a model, detailing the implementation of geographical features and the generation of social structures. Preliminary results are a promising step towards large-scale simulations and the evaluation of potential intervention policies.
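A toy agent-based sketch of the two infection channels the abstract names, repeated partner contact and casual crowd contact, is given below. It is not the authors' model; all parameters (transmission probabilities, contact counts, recovery time) are illustrative assumptions.

```python
# Minimal agent-based SIR-style outbreak with two contact channels.
import random

random.seed(1)
N = 1000
P_PARTNER, P_CASUAL = 0.05, 0.01   # per-day transmission probabilities
CASUAL_CONTACTS = 5                # random encounters per agent per day
RECOVERY_DAYS = 7

# Fixed social structure: each agent has a small set of regular partners.
partners = {i: random.sample(range(N), 4) for i in range(N)}
state = ["S"] * N                  # S(usceptible), I(nfected), R(ecovered)
days_infected = [0] * N
state[0] = "I"                     # seed the outbreak

for day in range(60):
    newly = []
    for i in range(N):
        if state[i] != "I":
            continue
        # Repeated partner contacts plus casual crowd contacts.
        contacts = [(j, P_PARTNER) for j in partners[i]] + \
                   [(random.randrange(N), P_CASUAL) for _ in range(CASUAL_CONTACTS)]
        for j, p in contacts:
            if state[j] == "S" and random.random() < p:
                newly.append(j)
        days_infected[i] += 1
        if days_infected[i] >= RECOVERY_DAYS:
            state[i] = "R"
    for j in newly:
        state[j] = "I"
    print(day, state.count("I"))
```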
Abstract:
Several algorithms and techniques widely used in Computer Science have been adapted from, or inspired by, known biological phenomena; this is a consequence of the multidisciplinary background of most early computer scientists. The field has now matured, and it permits the development of tools and collaborative frameworks that play a vital role in advancing current biomedical research. In this paper, we briefly present examples of the former and elaborate upon two of the latter, applied to immunological modelling and to a new paradigm in gene expression.
Abstract:
In this paper, we investigate the effect of mobility constraints on epidemic broadcast mechanisms in DTNs (Delay-Tolerant Networks). The major factors affecting epidemic broadcast performance are the forwarding algorithm and node mobility. The impact of forwarding algorithms and node mobility on epidemic broadcast mechanisms has been actively studied in the literature, but those studies generally use unconstrained mobility models. The objective of this paper is therefore to quantitatively investigate the effect of mobility constraints on epidemic broadcast mechanisms. We evaluate the performance of three classes of epidemic broadcast mechanisms - P-BCAST (PUSH-based BroadCast), SA-BCAST (Self-Adaptive BroadCast), and HP-BCAST (History-based P-BCAST) - under a random waypoint mobility model with mobility constraints. Among our findings, the existence of mobility constraints significantly improves the reachability and dissemination speed of epidemic broadcast mechanisms while degrading their efficiency.
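The following is a hedged sketch in the spirit of the history-based push broadcast (HP-BCAST) class named in the abstract, not the paper's exact algorithm: each carrier records the nodes it has already pushed to and skips them in later encounters.

```python
# One step of a history-based push broadcast over a set of contacts.
def hp_bcast_step(carriers, histories, encounters, received):
    """carriers   -- set of node ids holding the message
    histories  -- dict: node id -> set of ids already pushed to
    encounters -- iterable of (a, b) contact pairs this step
    received   -- set of node ids that have the message"""
    for a, b in encounters:
        for src, dst in ((a, b), (b, a)):
            if src in carriers and dst not in histories[src]:
                histories[src].add(dst)     # never push to dst again
                if dst not in received:
                    received.add(dst)
                    carriers.add(dst)

# Toy usage: one contact between carrier 0 and node 1.
carriers, received = {0}, {0}
histories = {0: set(), 1: set()}
hp_bcast_step(carriers, histories, [(0, 1)], received)
print(carriers)   # {0, 1}
```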
Abstract:
In some delay-tolerant communication systems, such as vehicular ad-hoc networks, information flow can be represented as an infectious process in which each entity that has already received the information tries to share it with its neighbours. The random walk and random waypoint models are popular analysis tools for these epidemic broadcasts, and represent two types of random mobility. In this paper, we introduce a simulation framework investigating the impact of a gradual increase of bias in path selection (i.e., a reduction of randomness) as mobility shifts from the former to the latter. Randomness in path selection can significantly alter system performance, in both regular and irregular network structures. The implications of these results for real systems are discussed in detail.
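One way to picture the gradual bias increase the abstract studies is a single parameter interpolating between the two mobility models; the sketch below is an illustrative assumption (the parameter name and grid setting are invented), not the paper's framework.

```python
import random

def next_cell(neighbours, target, beta):
    """Choose the next grid cell from `neighbours`: with probability `beta`
    head toward `target` (waypoint-like), otherwise move uniformly at
    random (random-walk-like)."""
    if random.random() < beta:
        return min(neighbours,
                   key=lambda n: abs(n[0] - target[0]) + abs(n[1] - target[1]))
    return random.choice(neighbours)

# beta = 0.0 reproduces a random walk; beta = 1.0 always approaches the
# target, as in waypoint motion; intermediate values interpolate.
print(next_cell([(0, 1), (1, 0), (0, -1), (-1, 0)], (5, 5), beta=1.0))  # -> (0, 1)
```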
Abstract:
One of the main challenges in data analytics is that discovering structures and patterns in complex datasets is a computationally intensive task. Recent advances in high-performance computing provide part of the solution, as multicore systems are now more affordable and more accessible. In this paper, we investigate how this computational power can be used to develop more advanced methods for data analytics. We focus on two specific areas: model-driven analysis and data mining using optimisation techniques.
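As a minimal sketch of the multicore pattern the abstract alludes to (not tied to the paper's specific methods), many candidate models can be scored in parallel with the standard library; the objective function here is a placeholder.

```python
# Score candidate models in parallel across all available cores.
from multiprocessing import Pool

def score(params):
    """Placeholder fitness of one candidate model; a real analysis would
    fit the model to data here."""
    a, b = params
    return (a - 3) ** 2 + (b + 1) ** 2

if __name__ == "__main__":
    candidates = [(a, b) for a in range(-5, 6) for b in range(-5, 6)]
    with Pool() as pool:                      # uses all available cores
        scores = pool.map(score, candidates)
    best_score, best_params = min(zip(scores, candidates))
    print("best candidate:", best_params, "score:", best_score)
```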
Abstract:
As computational models in fields such as medicine and engineering become more refined, their resource requirements increase. Initially, these needs were met using parallel computing and HPC clusters. However, such systems are often costly and lack flexibility, so HPC users are tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems differ, and performance may vary. The purpose of this study is to evaluate cloud services as a means of minimising both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while virtual CPU (VCPU) performance is satisfactory, network throughput may lead to difficulties.
Abstract:
Most real-life data analysis problems are difficult to solve using exact methods, due to the size of the datasets and the nature of the underlying mechanisms of the system under investigation. As datasets grow even larger, finding the balance between the quality of the approximation and the computing time of the heuristic becomes non-trivial. One solution is to use parallel methods, applying the increased computational power to perform a deeper exploration of the solution space in a similar time. It is, however, difficult to estimate a priori whether parallelisation will provide the expected improvement. In this paper we consider a well-known method, genetic algorithms, and evaluate the behaviour of classic and parallel implementations on two distinct problem types.
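A common way to parallelise a genetic algorithm is to evaluate fitness across cores while keeping selection and variation sequential; the sketch below shows this design on a toy objective. It is an assumption for illustration, since the abstract does not specify which parallel GA variant the paper uses.

```python
# One GA generation with parallelised fitness evaluation.
import random
from multiprocessing import Pool

def fitness(genome):
    return sum(genome)               # toy objective: maximise number of 1s

def one_generation(population, pool, mut_rate=0.01):
    scores = pool.map(fitness, population)          # evaluated in parallel
    ranked = [g for _, g in sorted(zip(scores, population), reverse=True)]
    parents = ranked[: len(population) // 2]        # truncation selection
    children = []
    while len(children) < len(population):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(a))
        child = a[:cut] + b[cut:]                   # one-point crossover
        child = [bit ^ int(random.random() < mut_rate) for bit in child]
        children.append(child)
    return children

if __name__ == "__main__":
    pop = [[random.randint(0, 1) for _ in range(32)] for _ in range(40)]
    with Pool() as pool:
        for _ in range(20):
            pop = one_generation(pop, pool)
        print("best fitness:", max(map(fitness, pop)))
```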
Abstract:
By providing simultaneous information on the expression profiles of thousands of genes, microarray technologies have, in recent years, been widely used to investigate mechanisms of gene expression. Clustering and classification of such data can highlight patterns and provide insight into biological processes. A common approach is to consider the genes and samples of microarray datasets as nodes in a bipartite graph, where edges are weighted, e.g., based on expression levels. In this paper, using a previously evaluated weighting scheme, we focus on search algorithms and evaluate, in the context of biclustering, several variations of Genetic Algorithms. We also introduce a new heuristic, “Propagate”, which consists in recursively evaluating neighbour solutions with one more or one fewer active condition. The results obtained on three well-known datasets show that, for a given weighting scheme, optimal or near-optimal solutions can be identified.
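A hedged sketch of the “Propagate” idea as the abstract describes it: recursively evaluate neighbour solutions obtained by activating one more or one fewer condition, keeping any improvement. The scoring function stands in for the bicluster weighting scheme, which is not specified here.

```python
# Greedy recursive exploration of single-condition toggles.
def propagate(active, all_conditions, score):
    """active         -- frozenset of currently active condition ids
    all_conditions -- sequence of all condition ids
    score          -- callable: frozenset -> float (higher is better)"""
    best, best_score = active, score(active)
    for c in all_conditions:
        neighbour = active - {c} if c in active else active | {c}
        if score(neighbour) > best_score:
            cand = propagate(neighbour, all_conditions, score)  # recurse
            if score(cand) > best_score:
                best, best_score = cand, score(cand)
    return best

# Toy usage: a placeholder score preferring ~3 active conditions out of 6.
best = propagate(frozenset(), range(6), lambda s: -abs(len(s) - 3))
print(sorted(best))   # some 3-condition subset
```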
Abstract:
The control of environmental factors in open-office environments, such as lighting and temperature, is becoming increasingly automated. This development means that office inhabitants are losing the ability to manually adjust environmental conditions according to their needs. In this paper we describe the design, use, and evaluation of MiniOrb, a system that employs ambient and tangible interaction mechanisms to allow inhabitants of office environments to maintain awareness of environmental factors, report on their own subjectively perceived office comfort levels, and see how these compare to group average preferences. The system is complemented by a mobile application, which enables users to see and set the same sensor values and preferences through a screen-based interface. We give an account of the system’s design and outline the results of an in-situ trial and user study. Our results show that devices combining ambient and tangible interaction approaches are well suited to the task of recording indoor climate preferences, and afford a rich set of possible interactions that can complement those enabled by more conventional screen-based interfaces.
Abstract:
This study presents the intra-tester reliability of the static load bearing exercises (LBEs) performed by individuals with transfemoral amputation (TFA) fitted with an osseointegrated implant to stimulate the bone remodelling process. There is a need for a better understanding of the implementation of these exercises, particularly their reliability. The intra-tester reliability is discussed with a particular emphasis on inter-prescribed-load, inter-axis, and inter-component reliabilities, as well as the effect of body weight normalisation. Eleven unilateral TFAs fitted with an OPRA implant performed five trials in four loading conditions. The forces and moments on the three axes of the implant were measured directly with an instrumented pylon including a six-channel transducer. The reliability of the loading variables was assessed using intraclass correlation coefficients (ICCs) and percentage standard error of measurement values (%SEMs). The ICCs of all variables were above 0.9, and the %SEM values ranged between 0 and 87%. This study showed a high between-participant variance, highlighting the lack of loading consistency typical of a symptomatic population, as well as high reliability between the loading sessions, suggesting that participants correctly repeated the LBEs. However, these outcomes must be understood within the framework of the proposed experimental protocol.
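For readers unfamiliar with the reliability statistics named in the abstract, the sketch below applies the standard definitions, SEM = SD · sqrt(1 − ICC) and %SEM = 100 · SEM / mean; the study's exact ICC model and the example numbers are not stated in the abstract and are assumed here.

```python
# Compute %SEM for a loading variable given its ICC (standard formula).
import numpy as np

def sem_percent(measurements, icc):
    """measurements: 1-D array of a loading variable across trials;
    icc: intraclass correlation coefficient for that variable."""
    sd = np.std(measurements, ddof=1)
    sem = sd * np.sqrt(1.0 - icc)            # standard error of measurement
    return 100.0 * sem / np.mean(measurements)

# Illustrative example with ICC = 0.9 (the paper reports ICCs above 0.9).
trials = np.array([412.0, 398.0, 405.0, 420.0, 401.0])  # e.g. peak force, N
print(f"%SEM = {sem_percent(trials, 0.9):.1f}%")
```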
Abstract:
This paper reviews the state of the art in the automation of underground mining vehicles and reports on an autonomous navigation system being developed through the CMTE with sponsorship arranged by AMIRA. Past attempts at automating LHDs and haul trucks are described, and their particular strengths and weaknesses are discussed. The auto-guidance system being developed overcomes some of the limitations of state-of-the-art prototype 'commercial' systems. It can be retrofitted to existing remote-controlled vehicles, uses minimal installed infrastructure, and is flexible enough for rapid relocation to alternate routes. The navigation techniques use data fusion across two separate sets of sensors, combining natural feature recognition, nodal maps, and inertial navigation. Collision detection is incorporated, and people and other traffic are excluded from the tramming area. This paper describes the work being done by the group with regard to auto-tramming and also outlines future goals.
Abstract:
As a result of the increasingly distributed nature of organisations and the inherently increasing complexity of their business processes, a significant effort is required for the specification and verification of those processes. The composition of activities into a business process that accomplishes a specific organisational goal has primarily been a manual task. Automated planning is a branch of artificial intelligence (AI) in which activities are selected and organised by anticipating their expected outcomes, with the aim of achieving some goal. As such, automated planning would seem a natural fit for the BPM domain as a way to automate the specification of control flow. A number of attempts have been made to apply automated planning to the business process and service composition domains at different stages of the BPM lifecycle. However, a unified adoption of these techniques throughout the BPM lifecycle is missing. We therefore propose a new intention-centric BPM paradigm, which aims to minimise the specification effort by exploiting automated planning techniques to achieve a pre-stated goal. This paper provides a vision of the future possibilities for enhancing BPM using automated planning. A research agenda is presented, which provides an overview of the opportunities and challenges for the exploitation of automated planning in BPM.
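To make the core idea concrete, here is a toy forward-search planner in the style the abstract leans on: actions are chosen by anticipating their effects until a stated goal holds. The action names and process fragment are invented for illustration; they are not from the paper.

```python
# Toy breadth-first planner over sets of facts (a simplified STRIPS style).
from collections import deque

def plan(initial, goal, actions):
    """initial -- frozenset of facts; goal -- set of facts that must hold;
    actions -- list of (name, preconditions, add_effects) triples."""
    queue, seen = deque([(initial, [])]), {initial}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:
            return steps
        for name, pre, add in actions:
            if pre <= state:                      # action is applicable
                nxt = frozenset(state | add)      # anticipate its effects
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None

# Hypothetical order-fulfilment process fragment.
actions = [
    ("approve_order", {"order_received"}, {"order_approved"}),
    ("ship_order",    {"order_approved"}, {"order_shipped"}),
]
print(plan(frozenset({"order_received"}), {"order_shipped"}, actions))
# -> ['approve_order', 'ship_order']
```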