838 results for heterogeneous delays
Abstract:
There is increased interest in Uninhabited Aerial Vehicle (UAV) operations and research into advanced methods for commanding and controlling multiple heterogeneous UAVs, and research into areas of supervisory control has grown rapidly. Past research has investigated various approaches to autonomous control and operator limitation to improve mission commanders' Situation Awareness (SA) and reduce cognitive workload. The aim of this paper is to address this challenge through a visualisation framework of UAV information constructed from Information Abstraction (IA). This paper presents the concept and process of IA, the visualisation framework constructed using IA, the associated Level Of Detail (LOD) indexing method, and a visualised example of the framework. Experiments will test the hypothesis that the operator is able to achieve increased SA and reduced cognitive load with the proposed framework.
Abstract:
The reliability of urban passenger trains is a critical performance measure for passenger satisfaction and ultimately market share. A delay to one train in a peak period can have a severe effect on the schedule adherence of other trains. This paper presents an analytically based model to quantify the expected positive delay for individual passenger trains and track links in an urban rail network. The model specifically addresses direct delay to trains, knock-on delays to other trains, and delays at scheduled connections. A solution to the resultant system of equations is found using an iterative refinement algorithm. Model validation, carried out on a real-life suburban train network consisting of 157 trains, shows the model estimates to be on average within 8% of those obtained from a large-scale simulation. Also discussed is the application of the model to assess the consequences of increased scheduled slack time as well as investment strategies designed to reduce delay.
Abstract:
A Jacobian-free variable-stepsize method is developed for the numerical integration of the large, stiff systems of differential equations encountered when simulating transport in heterogeneous porous media. Our method utilises the exponential Rosenbrock-Euler method, which is explicit in nature and requires a matrix-vector product involving the exponential of the Jacobian matrix at each step of the integration process. These products can be approximated using Krylov subspace methods, which permit a large integration stepsize to be utilised without having to precondition the iterations. This means that our method is truly "Jacobian-free" - the Jacobian need never be formed or factored during the simulation. We assess the performance of the new algorithm for simulating the drying of softwood. Numerical experiments conducted for both low and high temperature drying demonstrate that the new approach outperforms (in terms of accuracy and efficiency) existing simulation codes that utilise the backward Euler method via a preconditioned Newton-Krylov strategy.
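The core mechanism described in this abstract can be sketched briefly: the product of the matrix function φ1(hJ) with f(y) is approximated in a Krylov subspace built from finite-difference Jacobian-vector products, so the Jacobian is never formed. The following is a minimal illustration of that general technique, not the authors' softwood-drying code; all function names here are hypothetical.

```python
import numpy as np
from scipy.linalg import expm

def phi1_times(A, b):
    # phi1(A) @ b, with phi1(z) = (exp(z) - 1)/z, via the augmented-matrix
    # trick: expm([[A, b], [0, 0]]) holds phi1(A) @ b in its last column.
    m = A.shape[0]
    M = np.zeros((m + 1, m + 1))
    M[:m, :m] = A
    M[:m, m] = b
    return expm(M)[:m, m]

def arnoldi(matvec, b, m):
    # Build an orthonormal Krylov basis V and the small Hessenberg matrix
    # H = V^T J V using only matrix-vector products with J.
    n = b.size
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = matvec(V[:, j])
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:  # happy breakdown: subspace is invariant
            return V[:, :j + 1], H[:j + 1, :j + 1], beta
        V[:, j + 1] = w / H[j + 1, j]
    return V[:, :m], H[:m, :m], beta

def rosenbrock_euler_step(f, y, h, m=20):
    # One exponential Rosenbrock-Euler step: y_new = y + h * phi1(h J) f(y),
    # with J v approximated by a forward difference of f (Jacobian-free).
    fy = f(y)
    eps = 1e-7
    matvec = lambda v: (f(y + eps * v) - fy) / eps  # v has unit norm in Arnoldi
    V, H, beta = arnoldi(matvec, fy, m)
    e1 = np.zeros(H.shape[0])
    e1[0] = 1.0
    return y + h * beta * (V @ phi1_times(h * H, e1))
```

For a linear problem f(y) = Ay this step reproduces the exact propagator exp(hA)y, which provides a convenient correctness check.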
Abstract:
The paper presents a detailed analysis of the collective dynamics and delayed state feedback control of a three-dimensional delayed small-world network. The trivial equilibrium of the model is first investigated, showing that the uncontrolled model exhibits complicated unbounded behavior. Three control strategies, namely position feedback control, velocity feedback control, and a hybrid control combining velocity and acceleration feedback, are then introduced to stabilize this unstable system. It is shown that among these three control schemes only the hybrid control can easily stabilize the 3-D network system, and with properly chosen delay and gain in the delayed feedback path, the hybrid-controlled model may have a stable equilibrium, periodic solutions resulting from a Hopf bifurcation, or a complex strange attractor arising from a period-doubling bifurcation. Moreover, the direction of the Hopf bifurcation and the stability of the bifurcating periodic solutions are analyzed. The results are further extended to any "d"-dimensional network: it is shown that to stabilize a "d"-dimensional delayed small-world network, at least a complete differential feedback of order "d – 1" is needed. This work provides constructive guidance for high-dimensional delayed systems.
Abstract:
Background Cancer outlier profile analysis (COPA) has proven to be an effective approach to analyzing cancer expression data, leading to the discovery of the TMPRSS2 and ETS family gene fusion events in prostate cancer. However, the original COPA algorithm did not identify down-regulated outliers, and the currently available R package implementing the method is similarly restricted to the analysis of over-expressed outliers. Here we present a modified outlier detection method, mCOPA, which contains refinements to the outlier-detection algorithm, identifies both over- and under-expressed outliers, is freely available, and can be applied to any expression dataset. Results We compare our method to other feature-selection approaches, and demonstrate that mCOPA frequently selects more-informative features than do differential expression or variance-based feature selection approaches, and is able to recover observed clinical subtypes more consistently. We demonstrate the application of mCOPA to prostate cancer expression data, and explore the use of outliers in clustering, pathway analysis, and the identification of tumour suppressors. We analyse the under-expressed outliers to identify known and novel prostate cancer tumour suppressor genes, validating these against data in Oncomine and the Cancer Gene Index. We also demonstrate how a combination of outlier analysis and pathway analysis can identify molecular mechanisms disrupted in individual tumours. Conclusions We demonstrate that mCOPA offers advantages, compared to differential expression or variance, in selecting outlier features, and that the features so selected are better able to assign samples to clinically annotated subtypes. Further, we show that the biology explored by outlier analysis differs from that uncovered in differential expression or variance analysis. 
mCOPA is an important new tool for the exploration of cancer datasets and the discovery of new cancer subtypes, and can be combined with pathway and functional analysis approaches to discover mechanisms underpinning heterogeneity in cancers.
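The outlier transformation that COPA-style methods build on is simple enough to sketch. The snippet below is a generic illustration of that idea (median-centre, MAD-scale, then score each gene by an upper and a mirrored lower percentile), not the actual mCOPA implementation, whose refinements are described in the paper; the function name and defaults are assumptions.

```python
import numpy as np

def copa_style_scores(X, pct=90):
    """Per-gene outlier scores for an expression matrix X (genes x samples).

    Each gene is median-centred and MAD-scaled; the pct-th percentile of the
    transformed values scores over-expressed outliers, and the negated
    (100 - pct)-th percentile scores under-expressed outliers.
    """
    med = np.median(X, axis=1, keepdims=True)
    mad = np.median(np.abs(X - med), axis=1, keepdims=True)
    mad = np.where(mad == 0, 1e-12, mad)         # guard against constant genes
    Z = (X - med) / (1.4826 * mad)               # 1.4826: MAD -> sigma for normal data
    up = np.percentile(Z, pct, axis=1)           # over-expression score
    down = -np.percentile(Z, 100 - pct, axis=1)  # under-expression score
    return up, down
```

Ranking genes by `up` or `down` then yields candidate over- and under-expressed outliers; the latter is where the abstract's tumour-suppressor candidates were sought.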
Abstract:
In this paper, we analyse the impact of a (small) heterogeneity of jump type on the most simple localized solutions of a 3-component FitzHugh–Nagumo-type system. We show that the heterogeneity can pin a 1-front solution, which travels with constant (non-zero) speed in the homogeneous setting, to a fixed, explicitly determined, distance from the heterogeneity. Moreover, we establish the stability of this heterogeneous pinned 1-front solution. In addition, we analyse the pinning of 1-pulse, or 2-front, solutions. The paper is concluded with simulations in which we consider the dynamics and interactions of N-front patterns in domains with M heterogeneities of jump type (N = 3, 4, M ≥ 1).
Abstract:
An increased interest in utilising groups of Unmanned Aerial Vehicles (UAVs) with heterogeneous capabilities and autonomy presents the challenge of effectively managing such groups during missions and operations. This has been the focus of research in recent years, moving from the traditional UAV management paradigm of n-to-1 (n operators for one UAV, with n being at least two operators) toward 1-to-n (one operator, multiple UAVs). This paper expands on the authors’ previous work on a UAV functional capability framework by incorporating the concept of Functional Level of Autonomy (F-LOA) with two configurations. The lower F-LOA configuration contains sufficient information for the operator to generate solutions and make decisions to address perturbation events. Alternatively, the higher F-LOA configuration presents information reflecting the F-LOA of the UAV, allowing the operator to interpret solutions and decisions generated autonomously and decide whether to veto them.
Abstract:
This research was a step forward in developing a data integration framework for Electronic Health Records. The outcome of the research is a conceptual and logical data warehousing model for integrating Cardiac Surgery electronic data records. This thesis investigated the main obstacles to healthcare data integration and proposed a data warehousing model suitable for integrating fragmented data in a Cardiac Surgery Unit.
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems that use a combination of computational hardware such as CPUs and GPUs are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function.
This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes are required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
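The inexact Newton-Krylov mechanism mentioned in this abstract, where the action of the Jacobian on a vector is replaced by a finite difference of residual evaluations, can be sketched independently of the GPU implementation. Below is a minimal CPU illustration of the general technique using SciPy's GMRES; the function names are hypothetical and this is not the thesis code.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk_solve(F, u0, tol=1e-10, max_newton=50):
    """Jacobian-free Newton-Krylov: solve F(u) = 0 without forming J.

    Each Newton correction solves J(u) du = -F(u) with GMRES, where the
    product J v is approximated by (F(u + e v) - F(u)) / e, i.e. two
    residual evaluations replace every matrix-vector product.
    """
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(max_newton):
        Fu = F(u)
        if np.linalg.norm(Fu) < tol:
            break
        eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(u))

        def matvec(v):
            nv = np.linalg.norm(v)
            if nv == 0.0:
                return np.zeros_like(v)
            e = eps / nv  # scale the difference step to the vector length
            return (F(u + e * v) - Fu) / e

        J = LinearOperator((u.size, u.size), matvec=matvec)
        du, _ = gmres(J, -Fu)  # an inexact inner solve is sufficient
        u = u + du
    return u
```

In the thesis, the residual would come from the CV-FE discretisation and the implicit time stepping scheme; the matrix-free mechanism is the same, which is what makes a GPU-resident residual evaluation sufficient for the whole implicit solve.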
Abstract:
Risk identification is one of the most challenging stages in the risk management process. Conventional risk management approaches provide little guidance, and companies often rely on the knowledge of experts for risk identification. In this paper we demonstrate how risk indicators can be used to predict process delays via a method for configuring so-called Process Risk Indicators (PRIs). The method learns suitable configurations from past process behaviour recorded in event logs. To validate the approach we have implemented it as a plug-in of the ProM process mining framework and have conducted experiments using various data sets from a major insurance company.
Abstract:
We propose a cluster ensemble method to map the corpus documents into the semantic space embedded in Wikipedia and group them using multiple types of feature space. A heterogeneous cluster ensemble is constructed with multiple types of relations, i.e. document-term, document-concept and document-category. A final clustering solution is obtained by exploiting associations between document pairs and the hubness of the documents. Empirical analysis with various real data sets reveals that the proposed method outperforms state-of-the-art text clustering approaches.
Abstract:
In particle-strengthened metallic alloys, fatigue damage incubates at inclusion particles near the surface or at changes of geometry. Micromechanical simulations of inclusions are performed so that the fatigue damage incubation mechanisms can be categorized. As the micro-plasticity gradient field differs around different inclusions, a novel concept for the nonlocal evaluation of micro-plasticity intensity is introduced. The effects of void aspect ratio and spatial distribution on fatigue incubation life are quantified in the high-cycle fatigue regime. Finally, these effects are integrated based on the statistics of the inclusions to predict the fatigue life of structural components.
Abstract:
In this work the electrochemical formation of porous Cu/Ag materials via the simple and quick method of hydrogen bubble templating is reported. The bulk and surface composition ratio between Ag and Cu was varied in a systematic manner and was readily controlled by the concentration of precursor metal salts in the electrolyte. The incorporation of Ag within the Cu scaffold only affected the formation of well-defined pores at high Ag loading, whereas the internal pore wall structure gradually transformed from dendritic to cube-like and finally needle-like structures, which was due to the concomitant formation of Cu2O within the structure. The materials were characterised by scanning electron microscopy (SEM), X-ray diffraction (XRD), and X-ray photoelectron spectroscopy (XPS). Their surface properties were further investigated by surface-enhanced Raman spectroscopy (SERS) and electrochemically probed by recording the hydrogen evolution reaction (HER), which is highly sensitive to the nature of the surface. The effect of surface composition was then investigated for its influence on two catalytic reactions, namely the reduction of ferricyanide ions with thiosulphate ions and the reduction of 4-nitrophenol with NaBH4 in aqueous solution, where it was found that the presence of Ag had a beneficial effect in both cases, but more so in the case of nitrophenol reduction. It is believed that this material may have many more potential applications in the areas of catalysis, electrocatalysis and photocatalysis.
Abstract:
Addressing possibilities for authentic combinations of diverse media within an installation setting, this research tested hybrid blends of the physical, digital and temporal to explore liminal space and image. The practice-led research reflected on the creation of artworks from three perspectives – material, immaterial and hybrid – and in doing so developed a new methodological structure that extends conventional forms of triangulation. This study explored how physical and digital elements each sought hierarchical presence yet simultaneously coexisted, thereby extending the visual and conceptual potential of the work. Outcomes demonstrated how utilising and recording transitional processes of hybrid imagery achieved a convergence of diverse, experiential forms. "Hybrid authority" – an authentic convergence of disparate elements – was articulated in the creation and public sharing of processual works and the creation of an innovative framework for hybrid art practice.
Abstract:
The interest in utilising multiple heterogeneous Unmanned Aerial Vehicles (UAVs) in close proximity is growing rapidly. As such, many challenges are presented in the effective coordination and management of these UAVs, converting the current n-to-1 paradigm (n operators operating a single UAV) to the 1-to-n paradigm (one operator managing n UAVs). This paper introduces an Information Abstraction methodology used to produce the functional capability framework initially proposed by Chen et al. and its Level Of Detail (LOD) indexing scale. This framework was validated by comparing operator workload and Situation Awareness (SA) across three experiment scenarios involving multiple autonomous heterogeneous UAVs. The first scenario was set in a high LOD configuration with highly abstracted UAV functional information; the second scenario was set in a mixed LOD configuration; and the final scenario was set in a low LOD configuration with maximal UAV functional information. Results show a statistically significant decrease in operator workload when a UAV’s functional information is displayed in its physical form (low LOD, maximal information) compared to the mixed LOD configuration.