896 results for graph entropy
Abstract:
A graph theoretic approach is developed for accurately computing haulage costs in earthwork projects. This is vital as haulage is a predominant factor in the real cost of earthworks. A variety of metrics can be used in our approach, but a fuel consumption proxy is recommended. This approach is novel as it considers the constantly changing terrain that results from cutting and filling activities and replaces the inaccurate “static” calculations that have been used previously. The approach is also capable of efficiently correcting violations of the top-down cutting and bottom-up filling conditions that can be found in existing earthwork assignments and sequences. This approach assumes that the project site is partitioned into uniform blocks. A directed graph is then utilised to describe the terrain surface. This digraph is altered after each cut and fill, in order to reflect the true state of the terrain. A shortest path algorithm is successively applied to calculate the cost of each haul, and these costs are summed to provide a total cost of haulage.
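A minimal sketch of the kind of computation this abstract describes, under assumed data: a block grid is held as a dict of heights, a digraph of haul costs is rebuilt after every cut and fill, and each haul is priced by a shortest-path search. The grade penalty, costs and cut/fill sequence are illustrative, not values from the paper.

```python
# Sketch: haulage cost accumulation on a block digraph that is rebuilt after each cut/fill.
import heapq

def dijkstra(graph, source, target):
    """Shortest haul cost between two blocks; graph is {node: {neighbour: cost}}."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

def build_haul_graph(heights, grade_penalty=1.0):
    """Derive edge weights from the current block heights (a crude fuel-consumption proxy).
    'grade_penalty' is an illustrative parameter, not one taken from the paper."""
    graph = {b: {} for b in heights}
    for (r, c) in heights:
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in heights:
                rise = heights[nb] - heights[(r, c)]
                graph[(r, c)][nb] = 1.0 + grade_penalty * max(rise, 0.0)  # uphill costs more
    return graph

# Toy 2x2 site: haul one unit from a cut block to a fill block, then update the terrain.
heights = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.0}
total = 0.0
for cut, fill in [((0, 0), (1, 1))]:           # an assumed cut/fill sequence
    total += dijkstra(build_haul_graph(heights), cut, fill)
    heights[cut] -= 1.0                         # the terrain changes after each move,
    heights[fill] += 1.0                        # so the digraph is rebuilt next iteration
print(f"total haulage cost: {total:.2f}")
```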
Abstract:
This thesis establishes performance properties for approximate filters and controllers that are designed on the basis of approximate dynamic system representations. These performance properties provide a theoretical justification for the widespread application of approximate filters and controllers in the common situation where system models are not known with complete certainty. This research also provides useful tools for approximate filter designs, which are applied to hybrid filtering of uncertain nonlinear systems. As a contribution towards applications, this thesis also investigates air traffic separation control in the presence of measurement uncertainties.
Abstract:
Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future data set drawn from the prior predictive distribution, so many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature for rapidly obtaining samples from the posterior is importance sampling, using the prior as the importance distribution. However, importance sampling tends to break down if there are more than a few experimental observations and/or the model parameter is high dimensional. In this paper, we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution, which yields a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near-optimal plasma sampling times which produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
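The following sketch shows the general idea of Laplace-based importance sampling for a single simulated data set: locate the posterior mode, use the curvature there to build a normal proposal, and reweight draws from that proposal. The one-parameter normal model and all numbers are assumptions for illustration, not the pharmacokinetic model of the paper.

```python
# Hedged sketch of Laplace-approximation importance sampling for one simulated data set.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
y = rng.normal(1.5, 0.5, size=15)             # one prior-predictive data set (assumed model)

def log_post(theta):
    """Unnormalised log posterior: N(0, 2^2) prior, N(theta, 0.5^2) likelihood."""
    return (stats.norm.logpdf(theta, 0.0, 2.0)
            + stats.norm.logpdf(y, theta, 0.5).sum())

# Laplace approximation: mode of the log posterior and its local curvature.
res = optimize.minimize(lambda t: -log_post(t[0]), x0=[0.0], method="BFGS")
mode, var = res.x[0], res.hess_inv[0, 0]      # inverse Hessian approximates the variance
proposal = stats.norm(mode, np.sqrt(var))

# Importance sampling with the Laplace approximation as the importance distribution.
draws = proposal.rvs(size=2000, random_state=rng)
log_w = np.array([log_post(t) for t in draws]) - proposal.logpdf(draws)
w = np.exp(log_w - log_w.max())
w /= w.sum()
post_mean = np.sum(w * draws)                 # design utilities are built from such summaries
print(f"posterior mean estimate: {post_mean:.3f}")
```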
Abstract:
The assessment of choroidal thickness from optical coherence tomography (OCT) images of the human choroid is an important clinical and research task, since it provides valuable information regarding the eye’s normal anatomy and physiology, and changes associated with various eye diseases and the development of refractive error. Due to the time-consuming and subjective nature of manual image analysis, there is a need for reliable, objective, automated methods of image segmentation to derive choroidal thickness measures. However, the detection of the two boundaries which delineate the choroid is a complicated and challenging task, in particular the detection of the outer choroidal boundary, due to a number of issues including: (i) the vascular ocular tissue is non-uniform and rich in non-homogeneous features, and (ii) the boundary can have low contrast. In this paper, an automatic segmentation technique based on graph-search theory is presented to segment the inner choroidal boundary (ICB) and the outer choroidal boundary (OCB) and obtain the choroid thickness profile from OCT images. Before segmentation, the B-scan is pre-processed to enhance the two boundaries of interest and to minimize the artifacts produced by surrounding features. The algorithm to detect the ICB is based on a simple edge filter and a directional weighted map penalty, while the algorithm to detect the OCB is based on OCT image enhancement and a dual brightness probability gradient. The method was tested on a large data set of images from a pediatric (1083 B-scans) and an adult (90 B-scans) population, which had previously been manually segmented by an experienced observer. The results demonstrate that the proposed method provides robust detection of the boundaries of interest and is a useful tool to extract clinical data.
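As a rough illustration of graph-search boundary detection in general (not the paper's specific ICB/OCB cost functions), a boundary can be traced as the minimum-cost left-to-right path through a cost image derived from an edge filter. The dynamic-programming formulation and the toy image below are assumptions for illustration.

```python
# Generic graph-search/DP boundary tracing on a cost image: each pixel is a node, columns
# are visited left to right, and the boundary is the minimum-cost path across the image.
import numpy as np

def trace_boundary(cost, max_jump=2):
    """Return one row index per column forming the minimum-cost left-to-right path.
    'max_jump' limits how far the boundary may move between adjacent columns (assumed)."""
    rows, cols = cost.shape
    acc = np.full((rows, cols), np.inf)
    back = np.zeros((rows, cols), dtype=int)
    acc[:, 0] = cost[:, 0]
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - max_jump), min(rows, r + max_jump + 1)
            prev = np.argmin(acc[lo:hi, c - 1]) + lo
            acc[r, c] = cost[r, c] + acc[prev, c - 1]
            back[r, c] = prev
    path = [int(np.argmin(acc[:, -1]))]
    for c in range(cols - 1, 0, -1):
        path.append(back[path[-1], c])
    return [int(p) for p in path[::-1]]

# Toy "B-scan": a bright-to-dark transition near row 5; cost = negative vertical gradient.
rng = np.random.default_rng(1)
img = np.vstack([np.ones((5, 20)), np.zeros((7, 20))]) + 0.05 * rng.standard_normal((12, 20))
cost = -np.abs(np.diff(img, axis=0, prepend=img[:1]))
print(trace_boundary(cost))
```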
Abstract:
This thesis presents a novel approach to mobile robot navigation using visual information towards the goal of long-term autonomy. A novel concept of a continuous appearance-based trajectory is proposed in order to solve the limitations of previous robot navigation systems, and two new algorithms for mobile robots, CAT-SLAM and CAT-Graph, are presented and evaluated. These algorithms yield performance exceeding state-of-the-art methods on public benchmark datasets and large-scale real-world environments, and will help enable widespread use of mobile robots in everyday applications.
Abstract:
This thesis introduces improved techniques for automatically estimating the pose of humans from video. It examines a complete workflow for estimating pose, from segmenting the raw video stream to extract silhouettes, to using the silhouettes to determine the relative orientation of parts of the human body. The proposed segmentation algorithms have improved performance and reduced complexity, while the pose estimation shows superior accuracy in difficult cases of self-occlusion.
Abstract:
This paper presents a long-term experiment in which a mobile robot uses adaptive spherical views to localize itself and navigate inside a non-stationary office environment. The office contains seven members of staff, and its appearance changes continuously over time due to their daily activities. The experiment runs as an episodic navigation task in the office over a period of eight weeks. The spherical views are stored in the nodes of a pose graph and are updated in response to changes in the environment. The updating mechanism is inspired by the concepts of long- and short-term memory. The experimental evaluation uses three performance metrics which assess the quality of both the adaptive spherical views and the navigation over time.
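A very loose sketch of the long-/short-term memory idea for updating a stored reference view at a pose-graph node is given below. The blending rule, rates, and the use of a plain feature vector as the "view" are all assumptions, not the paper's mechanism.

```python
# Hedged sketch: a node's reference view is kept in two memories updated at different rates.
import numpy as np

class NodeView:
    def __init__(self, view, stm_rate=0.5, ltm_rate=0.05):
        self.stm = np.asarray(view, dtype=float)   # short-term memory: follows recent appearance
        self.ltm = np.asarray(view, dtype=float)   # long-term memory: keeps persistent structure
        self.stm_rate, self.ltm_rate = stm_rate, ltm_rate

    def update(self, observed):
        """Blend a new observation into both memories at their respective rates."""
        observed = np.asarray(observed, dtype=float)
        self.stm += self.stm_rate * (observed - self.stm)
        self.ltm += self.ltm_rate * (observed - self.ltm)

    def localise_score(self, observed):
        """Distance of an observation to the node (best match over the two memories)."""
        observed = np.asarray(observed, dtype=float)
        return min(np.linalg.norm(observed - self.stm),
                   np.linalg.norm(observed - self.ltm))

node = NodeView([1.0, 0.0, 0.2])
node.update([0.9, 0.1, 0.6])          # the office changes; STM adapts quickly, LTM slowly
print(f"match distance: {node.localise_score([0.95, 0.05, 0.4]):.3f}")
```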
Abstract:
Organizational transformations reliant on successful ICT system developments (continue to) fail to deliver projected benefits even when contemporary governance models are applied rigorously. Modifications to traditional program, project and systems development management methods have produced little material improvement to successful transformation, as they are unable to routinely address the complexity and uncertainty of dynamic alignment of IS investments and innovation. Complexity theory provides insight into why this phenomenon occurs and is used to develop a conceptualization of complexity in IS-driven organizational transformations. This research-in-progress aims to identify complexity formulations relevant to organizational transformation. Political/power-based influences, interrelated business rules, socio-technical innovation, impacts on stakeholders and emergent behaviors are commonly considered as characterizing complexity, while the proposed conceptualization accommodates these as connectivity, irreducibility, entropy and/or information gain in hierarchical approximation and scaling, the number of states in a finite automaton and/or the dimension of an attractor, and information and/or variety.
Abstract:
The diagnostics of mechanical components operating in transient conditions is still an open issue, in both the research and industrial fields. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics regardless of operating condition and noise level, is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD), Minimum Entropy Deconvolution (MED) and the analytical approach of the Hilbert transform. The proposed tool is able to supply diagnostic information on the basis of experimental vibrations measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high-speed train traction equipment, and it is more effective at detecting faults in non-stationary conditions than signal processing tools based on spectral kurtosis or envelope analysis, which have until now been the benchmark for bearing diagnostics.
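One building block named above, the Hilbert-transform envelope analysis against which such tools are compared, is easy to illustrate: demodulate a simulated impacting bearing signal and look for the fault frequency in the envelope spectrum. The signal model, sampling rate and fault frequency below are assumptions; the EMD and MED stages of the proposed EEMED tool are not reproduced.

```python
# Hilbert-envelope spectrum of a simulated bearing signal with periodic impacts (illustrative).
import numpy as np
from scipy.signal import hilbert

fs, fault_hz, carrier_hz = 20_000, 120.0, 3_000.0          # assumed values
t = np.arange(0, 1.0, 1 / fs)
impacts = (np.sin(2 * np.pi * fault_hz * t) > 0.999).astype(float)          # periodic impulses
resonance = np.sin(2 * np.pi * carrier_hz * t[:200]) * np.exp(-t[:200] * 800)
signal = np.convolve(impacts, resonance, "same")
signal += 0.2 * np.random.default_rng(0).standard_normal(len(t))

envelope = np.abs(hilbert(signal))                          # analytic signal -> amplitude envelope
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
peak = freqs[np.argmax(spectrum[freqs < 500])]              # search below 500 Hz
print(f"dominant envelope frequency: {peak:.1f} Hz (simulated fault frequency {fault_hz} Hz)")
```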
Abstract:
Diagnostics of rolling element bearings is usually performed by means of vibration signals measured by accelerometers placed in the proximity of the bearing under investigation. The aim is to monitor the integrity of the bearing components, in order to avoid catastrophic failures or to implement condition-based maintenance strategies. In particular, the trend in this field is to combine different signal-enhancement and signal-analysis techniques in a single algorithm. Among the former, Minimum Entropy Deconvolution (MED) has been identified as a key tool able to highlight the effect of possible damage to one of the bearing components within the vibration signal. This paper presents the application of this technique to signals collected on a simple test-rig, able to test damaged industrial roller bearings in different working conditions. The effectiveness of the technique has been tested by comparing the results for one undamaged bearing with those for three bearings artificially damaged in different locations, namely on the inner race, outer race and rollers. Since MED performance depends on the filter length, the most suitable value of this parameter is defined on the basis of both the application and the measured signals. This represents an original contribution of the paper.
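To make the role of the filter-length parameter concrete, here is a textbook Wiggins-style MED iteration on a toy signal: an FIR filter is iteratively re-solved so that the filtered output has maximal kurtosis, sharpening buried impulses. This is a generic sketch under assumed data, not the authors' implementation or their tuned filter length.

```python
# Wiggins-style Minimum Entropy Deconvolution sketch (illustrative, not the paper's code).
import numpy as np
from scipy.linalg import toeplitz

def med(x, filt_len=30, n_iter=30):
    """Iteratively design an FIR filter of length 'filt_len' that maximises the kurtosis
    of the filtered signal, emphasising impulsive (fault-like) content."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = np.array([np.dot(x[l:], x[:n - l]) for l in range(filt_len)])   # input autocorrelation
    A = toeplitz(r)
    f = np.zeros(filt_len)
    f[filt_len // 2] = 1.0                                  # start from a delayed impulse
    for _ in range(n_iter):
        y = np.convolve(x, f, "full")[:n]                   # causal FIR filtering
        b = np.array([np.dot(y[l:] ** 3, x[:n - l]) for l in range(filt_len)])
        f = np.linalg.solve(A, b)                           # fixed-point update
        f /= np.sqrt(f @ A @ f)                             # keep output energy bounded
    return np.convolve(x, f, "full")[:n], f

# Toy usage: impulses smeared by an exponential resonance and buried in noise.
rng = np.random.default_rng(2)
clean = np.zeros(2000)
clean[::200] = 1.0
x = np.convolve(clean, np.exp(-np.arange(50) / 8.0), "same") + 0.3 * rng.standard_normal(2000)
y, f = med(x, filt_len=30)
kurt = lambda s: np.mean((s - s.mean()) ** 4) / np.var(s) ** 2
print(f"kurtosis before: {kurt(x):.1f}  after MED: {kurt(y):.1f}")
```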
Abstract:
The signal processing techniques developed for the diagnostics of mechanical components operating in stationary conditions are often not applicable, or suffer a loss of effectiveness, when applied to signals measured in transient conditions. In this chapter, an original signal processing tool is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition, Minimum Entropy Deconvolution and the analytical approach of the Hilbert transform. The tool has been developed to detect localized faults on bearings of traction systems of high-speed trains, and it is more effective at detecting faults in non-stationary conditions than signal processing tools based on envelope analysis or spectral kurtosis, which have until now been the benchmark for bearing diagnostics.
Abstract:
A multi-resource multi-stage scheduling methodology is developed to solve short-term open-pit mine production scheduling problems as a generic multi-resource multi-stage scheduling problem. It is modelled using essential characteristics of short-term mining production operations such as drilling, sampling, blasting and excavating under the capacity constraints of mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible short-term open-pit mine production schedules and near-optimal solutions. The proposed methodology and its solution quality are verified and validated using a real mining case study.
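The disjunctive graph model referred to above can be pictured with a tiny example: operations are nodes, conjunctive arcs fix the stage order within each block, and machine-sequencing decisions add (formerly disjunctive) arcs; once those decisions are fixed, the schedule length is the longest path in the resulting DAG. The stage names, durations and sequencing decisions below are invented for illustration, and the shifting-bottleneck procedure that chooses them in the paper is not reproduced.

```python
# Hedged sketch: makespan of a fixed sequencing decision on a small disjunctive graph.
from collections import defaultdict

durations = {                      # (block, stage) -> processing time (assumed)
    ("B1", "drill"): 3, ("B1", "blast"): 2, ("B1", "excavate"): 4,
    ("B2", "drill"): 2, ("B2", "blast"): 2, ("B2", "excavate"): 5,
}
edges = defaultdict(list)
# Conjunctive arcs: stage order inside each block.
for blk in ("B1", "B2"):
    edges[(blk, "drill")].append((blk, "blast"))
    edges[(blk, "blast")].append((blk, "excavate"))
# Arcs fixed by a sequencing decision: one drill rig and one excavator each do B1 before B2.
edges[("B1", "drill")].append(("B2", "drill"))
edges[("B1", "excavate")].append(("B2", "excavate"))

def makespan(durations, edges):
    """Length of the longest (critical) path through the DAG of operations."""
    memo = {}
    def longest_from(op):
        if op not in memo:
            memo[op] = durations[op] + max((longest_from(s) for s in edges[op]), default=0)
        return memo[op]
    return max(longest_from(op) for op in durations)

print(f"makespan for this fixed sequence: {makespan(durations, edges)}")
```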
Abstract:
This study investigated movement synchronization of players within and between teams during competitive association football performance. Cluster phase analysis was introduced as a method to assess synchronies between whole teams and between individual players and their team as a function of time, ball possession and field direction. Measures of dispersion (SD) and regularity (sample entropy – SampEn – and cross sample entropy – Cross-SampEn) were used to quantify the magnitude and structure of synchrony. Large synergistic relations within each professional team sport collective were observed, particularly in the longitudinal direction of the field (0.89 ± 0.12) compared to the lateral direction (0.73 ± 0.16, p < .01). The coupling between the group measures of the two teams also revealed that changes in the synchrony of each team were intimately related (Cross-SampEn values of 0.02 ± 0.01). Interestingly, ball possession did not influence team synchronization levels. In player–team synchronization, individuals tended to be coordinated under near in-phase modes with team behavior (mean ranges between −7 and 5° of relative phase). The magnitudes of variation were low, but more irregular in time, for the longitudinal (SD: 18 ± 3°; SampEn: 0.07 ± 0.01) compared to the lateral direction (SD: 28 ± 5°; SampEn: 0.06 ± 0.01, p < .05) of the field. Increases in regularity were also observed between the first (SampEn: 0.07 ± 0.01) and second half (SampEn: 0.06 ± 0.01, p < .05) of the observed competitive game. Findings suggest that the method of analysis introduced in the current study may offer a suitable tool for examining teams’ synchronization behaviors and the mutual influence of each team’s cohesiveness in competing social collectives.
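Sample entropy, the regularity measure named above, can be sketched in a few lines: it is the negative log of the conditional probability that sequences matching for m points (within a tolerance r) also match for m + 1 points. The implementation below is a common simplified form, and m = 2 with r = 0.2·SD are conventional defaults rather than the study's settings.

```python
# Minimal SampEn sketch applied to toy time series (not the study's synchrony signals).
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) of a 1-D series; lower values indicate more regular dynamics."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        # Chebyshev distance between all template pairs, self-matches excluded.
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return np.sum(d <= r) - len(templates)
    return -np.log(count_matches(m + 1) / count_matches(m))

rng = np.random.default_rng(3)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = regular + 0.5 * rng.standard_normal(500)
print(f"SampEn regular: {sample_entropy(regular):.3f}, noisy: {sample_entropy(noisy):.3f}")
```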
Abstract:
This study investigated changes in the complexity (magnitude and structure of variability) of the collective behaviours of association football teams during competitive performance. Raw positional data from an entire competitive match between two professional teams were obtained with the ProZone® tracking system. Five compound positional variables were used to investigate the collective patterns of performance of each team: surface area, stretch index, team length, team width, and geometrical centre. Analyses involved the coefficient of variation (%CV) and approximate entropy (ApEn), as well as the linear association between the two parameters. Collective measures successfully captured the idiosyncratic behaviours of each team and their variations across the six time periods of the match. Key events such as goals scored and game breaks (such as half time and full time) seemed to influence the collective patterns of performance. While ApEn values significantly decreased during each half, the %CV increased. Teams seem to become more regular and predictable, but with increased magnitudes of variation in their organisational shape, over the natural course of a match.
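Two of the compound positional variables named above, the geometrical centre and the stretch index, together with the %CV, can be sketched directly from player coordinates. The definitions follow common usage in the team-sports literature and the coordinates are invented; this is not the study's processing pipeline.

```python
# Hedged sketch of geometrical centre, stretch index and %CV from toy positional data.
import numpy as np

def geometrical_centre(xy):
    """Mean (x, y) position of all players at one time instant; xy has shape (players, 2)."""
    return xy.mean(axis=0)

def stretch_index(xy):
    """Mean distance of the players from their geometrical centre (team dispersion)."""
    return np.linalg.norm(xy - geometrical_centre(xy), axis=1).mean()

def percent_cv(series):
    """Coefficient of variation (%CV) of a time series of a collective variable."""
    series = np.asarray(series, dtype=float)
    return 100.0 * series.std() / series.mean()

# Toy match fragment: 11 players drifting slightly over 3 frames.
rng = np.random.default_rng(4)
frames = [rng.uniform(0, 50, size=(11, 2)) + f * np.array([0.5, 0.0]) for f in range(3)]
si = [stretch_index(f) for f in frames]
print(f"stretch index per frame: {np.round(si, 2)}, %CV: {percent_cv(si):.1f}%")
```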
Abstract:
A decision-making framework for image-guided radiotherapy (IGRT) is being developed using a Bayesian Network (BN) to graphically describe, and probabilistically quantify, the many interacting factors that are involved in this complex clinical process. Outputs of the BN will provide decision-support for radiation therapists to assist them to make correct inferences relating to the likelihood of treatment delivery accuracy for a given image-guided set-up correction. The framework is being developed as a dynamic object-oriented BN, allowing for complex modelling with specific sub-regions, as well as representation of the sequential decision-making and belief updating associated with IGRT. A prototype graphic structure for the BN was developed by analysing IGRT practices at a local radiotherapy department and incorporating results obtained from a literature review. Clinical stakeholders reviewed the BN to validate its structure. The BN consists of a sub-network for evaluating the accuracy of IGRT practices and technology. The directed acyclic graph (DAG) contains nodes and directional arcs representing the causal relationship between the many interacting factors such as tumour site and its associated critical organs, technology and technique, and inter-user variability. The BN was extended to support on-line and off-line decision-making with respect to treatment plan compliance. Following conceptualisation of the framework, the BN will be quantified. It is anticipated that the finalised decision-making framework will provide a foundation to develop better decision-support strategies and automated correction algorithms for IGRT.
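The kind of belief updating such a BN supports can be illustrated with a deliberately tiny toy network: three binary nodes in a DAG, invented conditional probabilities, and a query answered by brute-force enumeration. The node names and numbers are assumptions for illustration only, not the clinical model under development.

```python
# Toy Bayesian-network inference by enumeration (illustrative, not the IGRT model).
from itertools import product

# P(SetupError), P(ImageMatchGood | SetupError), P(DeliveryAccurate | SetupError)
p_setup_error = {True: 0.15, False: 0.85}
p_match_good = {True: 0.30, False: 0.95}        # keyed by SetupError
p_accurate = {True: 0.40, False: 0.98}          # keyed by SetupError

def joint(setup_error, match_good, accurate):
    """Joint probability of one full assignment, following the DAG's factorisation."""
    p = p_setup_error[setup_error]
    p *= p_match_good[setup_error] if match_good else 1 - p_match_good[setup_error]
    p *= p_accurate[setup_error] if accurate else 1 - p_accurate[setup_error]
    return p

# Belief updating: probability of accurate delivery given that the image match looked good.
num = sum(joint(s, True, True) for s in (True, False))
den = sum(joint(s, True, a) for s, a in product((True, False), repeat=2))
print(f"P(DeliveryAccurate | ImageMatchGood) = {num / den:.3f}")
```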