711 results for Heterogeneous systems

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

The requirement of distributed computing of all-to-all comparison (ATAC) problems in heterogeneous systems is increasingly important in various domains. Though Hadoop-based solutions are widely used, they are inefficient for the ATAC pattern, which is fundamentally different from the MapReduce pattern for which Hadoop is designed. They exhibit poor data locality and unbalanced allocation of comparison tasks, particularly in heterogeneous systems. This results in massive data movement at runtime and ineffective utilization of computing resources, significantly affecting overall computing performance. To address these problems, this paper presents a scalable and efficient data and task distribution strategy for processing large-scale ATAC problems in heterogeneous systems. It not only saves storage space but also achieves load balancing and good data locality for all comparison tasks. Experiments with bioinformatics examples show that about 89% of the ideal performance capacity of the multiple machines has been achieved using the approach presented in this paper.
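As a rough illustration of the kind of distribution strategy the abstract describes, the sketch below greedily assigns comparison pairs to machines in proportion to their speed and then derives each machine's data set so every assigned pair is locally available. It is a hypothetical toy (the name `distribute_atac` and the greedy rule are assumptions, not the paper's actual algorithm):

```python
from itertools import combinations

def distribute_atac(n_items, speeds):
    """Greedily assign all-to-all comparison tasks (item pairs) to
    machines in proportion to their speed, then derive each machine's
    data set so every assigned pair is locally available."""
    loads = [0.0] * len(speeds)            # weighted load per machine
    tasks = {m: [] for m in range(len(speeds))}
    for pair in combinations(range(n_items), 2):
        # pick the machine with the lowest speed-normalised load
        m = min(range(len(speeds)), key=lambda k: loads[k] / speeds[k])
        tasks[m].append(pair)
        loads[m] += 1.0
    # each machine stores exactly the items its pairs need (data locality)
    data = {m: sorted({x for p in ps for x in p}) for m, ps in tasks.items()}
    return tasks, data
```

Faster machines receive proportionally more pairs, and no data movement is needed at runtime because every pair is co-located by construction.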

Relevance: 70.00%

Abstract:

It is not uncommon for enterprises today to be faced with the demand to integrate and incorporate many different and possibly heterogeneous systems, generally independently designed and developed, to allow seamless access. In effect, the integration of these systems results in one large whole that must, at the same time, maintain local autonomy and continue working as an independent entity. This problem has introduced a new distributed architecture called federated systems. The most challenging issue in federated systems is how members can cooperate efficiently while preserving their autonomy, especially their security autonomy. This thesis intends to address this issue. The thesis reviews the evolution of the concept of federated systems and discusses the organisational characteristics as well as remaining security issues with the existing approaches. The thesis examines how delegation can be used as a means to achieve better security, especially authorisation, while maintaining autonomy for the participating members of the federation. A delegation taxonomy is proposed as one of the main contributions. The major contribution of this thesis is to study and design a mechanism to support delegation within and between multiple security domains with constraint management capability. A novel delegation framework is proposed comprising two modules: a Delegation Constraint Management module and a Policy Management module. The first module is designed to effectively create, track and manage delegation constraints, especially for delegation processes which require re-delegation (indirect delegation). It employs two algorithms: one to trace the root authority of a delegation constraint chain, and one to prevent potential conflicts when creating a delegation constraint chain. The first module is designed for conflict prevention, not conflict resolution.
The second module is designed to support the first via a policy comparison capability. Its major function is to provide the delegation framework with the capability to compare policies and constraints (written in the format of a policy). The module is an extension of Lin et al.'s work on policy filtering and policy analysis. Throughout the thesis, case studies are used as examples to illustrate the discussed concepts. These two modules are designed to capture one of the most important aspects of the delegation process: the relationships between delegation transactions and the involved constraints, which are not well addressed by existing approaches. This contribution is significant because the relationships provide the information needed to track and enforce the involved delegation constraints and therefore play a vital role in maintaining and enforcing security for transactions across multiple security domains.
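The root-authority tracing step mentioned above can be pictured with a minimal sketch; the chain representation and function name are assumptions for illustration, not the thesis's actual algorithm:

```python
def root_authority(chain, constraint):
    """Walk a re-delegation chain back to the original delegator.
    `chain` maps each constraint to the constraint it was derived from
    (None marks a root authority); cycles are rejected as invalid."""
    seen = set()
    current = constraint
    while chain[current] is not None:
        if current in seen:
            raise ValueError("cycle in delegation chain")
        seen.add(current)
        current = chain[current]
    return current
```

Rejecting cycles at trace time is one simple way to realise "conflict prevention rather than conflict resolution": an inconsistent chain is never admitted in the first place.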

Relevance: 40.00%

Abstract:

Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, thus creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance is degraded even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus improving the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
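A toy version of the scheduling objective can make the formulation concrete. Below, the makespan of an assignment is the maximum per-node load normalised by node speed, and a plain hill-climb stands in for the paper's metaheuristic pre-scheduling (all names are hypothetical):

```python
import random

def makespan(assignment, speeds):
    """Objective for heterogeneous nodes: estimated completion time is
    the maximum per-node task count divided by that node's speed."""
    loads = [0] * len(speeds)
    for task, node in assignment.items():
        loads[node] += 1
    return max(l / s for l, s in zip(loads, speeds))

def improve(assignment, speeds, iters=1000, seed=0):
    """Hill-climbing sketch of metaheuristic pre-scheduling: move a
    random task to a random node, keep the move if makespan drops."""
    rng = random.Random(seed)
    best = dict(assignment)
    best_cost = makespan(best, speeds)
    tasks = list(best)
    for _ in range(iters):
        cand = dict(best)
        cand[rng.choice(tasks)] = rng.randrange(len(speeds))
        cost = makespan(cand, speeds)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost
```

A real formulation would add data-locality and storage constraints on top of this objective; the sketch only shows the load-balancing term.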

Relevance: 30.00%

Abstract:

The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. The research proposes a solution to the difficulties associated with multi-step attack scenario specification and detection for such environments. The research has focused on two distinct problems: the representation of events derived from heterogeneous sources, and multi-step attack specification and detection. The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the event abstraction model presented in this work preserves all low-level information as well as providing high-level information in the form of abstract events. The model was designed independently of any particular IDS and thus may be used by any IDS, intrusion forensic tool, or monitoring tool. The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect as they often involve the correlation of events from multiple sources which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism by using variable instantiation, where variables represent events as defined in the event abstraction model. The third part of the research addresses time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios which involve logs from multiple hosts.
Issues involving time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing time uncertainty: clock skew compensation and clock drift modelling using linear regression. An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed based on the Python programming language syntax, and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs and a synthetic dataset. A distinctive feature of the public dataset is that it contains multi-step attacks involving multiple hosts with clock skew and clock drift, which allows us to demonstrate the application and advantages of the contributions of this research. All instances of multi-step attacks in the dataset were correctly identified even though significant clock skew and drift exist in the dataset. Future work identified by this research is to develop a refined unification algorithm suitable for processing streams of events to enable on-line detection. In terms of time uncertainty, identified future work is to develop mechanisms which allow automatic clock skew and clock drift identification and correction. The immediate application of the research presented in this thesis is the framework of an off-line IDS which processes events from heterogeneous sources using abstraction and which can detect multi-step attack scenarios that may involve time uncertainty.
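The clock drift model described above is an ordinary least-squares line relating a host's local clock to a reference clock. A minimal sketch, assuming the model local = skew + drift × reference (the function names are illustrative, not the prototype's API):

```python
def fit_drift(local_times, reference_times):
    """Least-squares fit of local = skew + drift * reference, the
    linear model used to map a host's clock onto the reference
    timeline (skew = constant offset, drift = rate error)."""
    n = len(reference_times)
    mx = sum(reference_times) / n
    my = sum(local_times) / n
    sxx = sum((x - mx) ** 2 for x in reference_times)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(reference_times, local_times))
    drift = sxy / sxx
    skew = my - drift * mx
    return skew, drift

def to_reference(local_t, skew, drift):
    """Invert the model to express a local timestamp in reference time."""
    return (local_t - skew) / drift
```

Once fitted from a few paired observations, the inverse mapping normalises every log timestamp from that host before event correlation, removing both the constant skew and the accumulated drift.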

Relevance: 30.00%

Abstract:

We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invertebrate invasives whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
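One hedged way to picture the allocation problem: give each stratum a risk weight, assume each device independently detects an incursion with probability p, and greedily place devices where the risk-weighted miss probability is highest. This toy rule is an assumption for illustration, not the paper's model:

```python
def allocate_devices(risks, per_device_p, budget):
    """Greedy sketch: repeatedly give the next detection device to the
    stratum whose risk-weighted probability of missing an incursion
    is currently highest, where miss_i = risk_i * (1 - p) ** n_i."""
    counts = [0] * len(risks)
    for _ in range(budget):
        miss = [r * (1 - per_device_p) ** c for r, c in zip(risks, counts)]
        counts[miss.index(max(miss))] += 1   # place device in worst stratum
    return counts
```

Equal-risk strata end up with equal device counts, while high-risk strata absorb the budget first; a full design would also weigh labour costs and device types.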

Relevance: 30.00%

Abstract:

This paper presents the findings of an investigation into the rate-limiting mechanism for the heterogeneous burning in oxygen, under normal gravity and microgravity, of cylindrical iron rods. The original objective of the work was to determine why the observed melting rate for burning 3.2-mm diameter iron rods is significantly higher in microgravity than in normal gravity. This work, however, also provided fundamental insight into the rate-limiting mechanism for heterogeneous burning. The paper includes a summary of normal-gravity and microgravity experimental results, heat transfer analysis and post-test microanalysis of quenched samples. These results are then used to show that heat transfer across the solid/liquid interface is the rate-limiting mechanism for melting and burning, limited by the interfacial surface area between the molten drop and solid rod. In normal gravity, the work improves the understanding of trends reported during standard flammability testing for metallic materials, such as variations in melting rates between test specimens with the same cross-sectional area but different cross-sectional shapes. The work also provides insight into the effects of configuration and orientation, leading to an improved application of standard test results in the design of oxygen system components. For microgravity applications, the work enables the development of improved methods for lower cost metallic material flammability testing programs. In these ways, the work provides fundamental insight into the heterogeneous burning process and contributes to improved fire safety for oxygen systems in applications involving both normal-gravity and microgravity environments.

Relevance: 30.00%

Abstract:

This paper presents a proposed qualitative framework to discuss the heterogeneous burning of metallic materials, through parameters and factors that influence the melting rate of the solid metallic fuel (either in a standard test or in service). During burning, the melting rate is related to the burning rate and is therefore an important parameter for describing and understanding the burning process, especially since the melting rate is commonly recorded during standard flammability testing for metallic materials and is incorporated into many relative flammability ranking schemes. However, whilst the factors that influence melting rate (such as oxygen pressure or specimen diameter) have been well characterized, there is a need for an improved understanding of how these parameters interact as part of the overall melting and burning of the system. Proposed here is the ‘Melting Rate Triangle’, which aims to provide this focus through a conceptual framework for understanding how the melting rate (of solid fuel) is determined and regulated during heterogeneous burning. In the paper, the proposed conceptual model is shown to be both (a) consistent with known trends and previously observed results, and (b) capable of being expanded to incorporate new data. Also shown are examples of how the Melting Rate Triangle can improve the interpretation of flammability test results. Slusser and Miller previously published an ‘Extended Fire Triangle’ as a useful conceptual model of ignition and the factors affecting ignition, providing industry with a framework for discussion. In this paper it is shown that a ‘Melting Rate Triangle’ provides a similar qualitative framework for burning, leading to an improved understanding of the factors affecting fire propagation and extinguishment.

Relevance: 30.00%

Abstract:

The Dynamic Data eXchange (DDX) is our third-generation platform for building distributed robot controllers. DDX allows a coalition of programs to share data at run-time through an efficient shared memory mechanism managed by a store. Further, stores on multiple machines can be linked by means of a global catalog, and data is moved between the stores on an as-needed basis by multicasting. Heterogeneous computer systems are handled. We describe the architecture of DDX and the standard clients we have developed that let us rapidly build complex control systems with minimal coding.
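A toy in-process analogue of the store concept (real DDX uses shared memory and multicast between machines; the class below merely mimics linked stores with direct forwarding, and all names are illustrative):

```python
class Store:
    """Toy sketch of a DDX-style store: clients write named variables,
    readers see the latest value, and updates are forwarded to linked
    stores (standing in for the catalog/multicast mechanism)."""
    def __init__(self):
        self.data = {}
        self.links = []          # other Store instances to forward to

    def write(self, name, value):
        self.data[name] = value
        for link in self.links:  # 'multicast' the update to linked stores
            link.data[name] = value

    def read(self, name):
        return self.data.get(name)
```

A controller process would write sensor variables into its local store while clients on other machines read the forwarded copies, which is the coalition-of-programs pattern the abstract describes.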

Relevance: 30.00%

Abstract:

In this paper, we present a ΣGI_i/D/1/∞ queue with heterogeneous input/output slot times. This queueing model can be regarded as an extension of the ordinary GI/D/1/∞ model. For this ΣGI_i/D/1/∞ queue, we assume that several input streams arrive at the system according to different slot times. In other words, there are different slot times for the different input/output processes in the queueing model. The queueing model can therefore be used for an ATM multiplexer with heterogeneous input/output link capacities. Several cases of the queueing model are discussed to reflect different relationships among the input/output link capacities of an ATM multiplexer. In the queueing analysis, two approaches, the Markov model and the probability generating function technique, are adopted to derive the queue length distributions observed at different epochs. This model is particularly useful in the performance analysis of ATM multiplexers with heterogeneous input/output link capacities.
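The slotted behaviour of the model can be cross-checked with a toy discrete-time simulation, assuming purely deterministic streams (a special case of GI arrivals; the names and tick convention are illustrative, not the paper's analysis):

```python
def simulate_queue(arrival_periods, service_period, horizon):
    """Discrete-time sketch of a multiplexer fed by streams with
    heterogeneous slot times: stream i deposits one cell every
    arrival_periods[i] ticks; the server removes one cell every
    service_period ticks.  Returns the queue-length trace."""
    q, trace = 0, []
    for t in range(1, horizon + 1):
        q += sum(1 for p in arrival_periods if t % p == 0)  # arrivals
        if t % service_period == 0 and q > 0:               # departure
            q -= 1
        trace.append(q)
    return trace
```

With total arrival rate below the service rate the queue stays bounded, while an overloaded configuration grows without limit, matching the stability intuition behind the analytical model.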

Relevance: 30.00%

Abstract:

The heterogeneous photocatalytic water purification process has gained wide attention due to its effectiveness in degrading and mineralizing recalcitrant organic compounds, as well as the possibility of utilizing the solar UV and visible light spectrum. This paper aims to review and summarize recently published work in the field of photocatalytic oxidation of toxic organic compounds such as phenols and dyes, predominant in waste water effluent. In this review, the effects of various operating parameters on the photocatalytic degradation of phenols and dyes are presented. Recent findings suggest that parameters such as photocatalyst type and composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent types, oxidizing agents/electron acceptors, mode of catalyst application, and calcination temperature can play an important role in the photocatalytic degradation of organic compounds in the water environment. Extensive research has focused on the enhancement of photocatalysis by modification of TiO2 employing metal, non-metal and ion doping. Recent advances in TiO2 photocatalysis for the degradation of various phenols and dyes are also highlighted in this review.

Relevance: 30.00%

Abstract:

In recent years, the application of the heterogeneous photocatalytic water purification process has gained wide attention due to its effectiveness in degrading and mineralizing recalcitrant organic compounds, as well as the possibility of utilizing the solar UV and visible light spectrum. This paper aims to review and summarize recently published work on the titanium dioxide (TiO2) photocatalytic oxidation of pesticides and phenolic compounds, predominant in storm and waste water effluents. The effects of various operating parameters on the photocatalytic degradation of pesticides and phenols are discussed. Results reported here suggest that the photocatalytic degradation of organic compounds depends on the photocatalyst type and composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent types, oxidizing agents/electron acceptors, catalyst application mode, and calcination temperature. A substantial amount of research has focused on the enhancement of TiO2 photocatalysis by modification with metal, non-metal and ion doping. Recent developments in TiO2 photocatalysis for the degradation of various pesticides and phenols are also highlighted in this review. It is evident from the literature survey that photocatalysis has shown good potential for the removal of various organic pollutants. However, there is still a need to establish the practical utility of this technique on a commercial scale.

Relevance: 30.00%

Abstract:

In recent years, there has been an enormous amount of research and development in the area of heterogeneous photocatalytic water purification due to its effectiveness in degrading and mineralising recalcitrant organic compounds, as well as the possibility of utilising the solar UV and visible spectrum. One hundred and twenty recently published papers are reviewed and summarised here, with the focus being on the photocatalytic oxidation of phenols and their derivatives, predominant in waste water effluent. In this review, the effects of various operating parameters on the photocatalytic degradation of phenols and substituted phenols are presented. Recent findings suggest that parameters such as photocatalyst type and composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent types, oxidising agents/electron acceptors, mode of catalyst application, and calcination temperatures can play an important role in the photocatalytic degradation of phenolic compounds in wastewater. Extensive research has focused on the enhancement of photocatalysis by modification of TiO2 employing metal, non-metal and ion doping. Recent developments in TiO2 photocatalysis for the degradation of various phenols and substituted phenols are also reviewed.

Relevance: 30.00%

Abstract:

A Jacobian-free variable-stepsize method is developed for the numerical integration of the large, stiff systems of differential equations encountered when simulating transport in heterogeneous porous media. Our method utilises the exponential Rosenbrock-Euler method, which is explicit in nature and requires a matrix-vector product involving the exponential of the Jacobian matrix at each step of the integration process. These products can be approximated using Krylov subspace methods, which permit a large integration stepsize to be utilised without having to precondition the iterations. This means that our method is truly "Jacobian-free" - the Jacobian need never be formed or factored during the simulation. We assess the performance of the new algorithm for simulating the drying of softwood. Numerical experiments conducted for both low and high temperature drying demonstrate that the new approach outperforms (in terms of accuracy and efficiency) existing simulation codes that utilise the backward Euler method via a preconditioned Newton-Krylov strategy.
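The exponential Rosenbrock-Euler step has the form y_{n+1} = y_n + h·φ1(hJ)f(y_n), where φ1(z) = (e^z − 1)/z. The sketch below evaluates φ1 by a truncated Taylor series and approximates the Jacobian by finite differences of f; this dense stand-in illustrates the step itself, not the paper's Krylov implementation:

```python
import numpy as np

def phi1(A, terms=25):
    """phi1(A) = sum_{k>=0} A**k / (k+1)!, evaluated by a truncated
    Taylor series (adequate for the small step sizes in this sketch)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    fact = 1.0
    for k in range(1, terms):
        term = term @ A
        fact *= k + 1
        out = out + term / fact
    return out

def rosenbrock_euler_step(f, y, h, eps=1e-7):
    """One exponential Rosenbrock-Euler step y+ = y + h*phi1(h*J)*f(y).
    The Jacobian is approximated column by column from finite
    differences of f, standing in for Krylov matrix-vector products."""
    fy = f(y)
    n = y.size
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(y + e) - fy) / eps
    return y + h * phi1(h * J) @ fy
```

For a linear system y' = Ay the step reduces to y+ = e^{hA} y, so it is exact regardless of stiffness, which is the property that makes the method attractive for stiff transport problems.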

Relevance: 30.00%

Abstract:

The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties, that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well-suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which use a combination of computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPU and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function.
This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally-intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both numerical efficiency and accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes are required to obtain solutions with equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
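The residual-only matrix-vector product that makes the inexact Newton-Krylov iteration "Jacobian-free" can be sketched in one line, using a forward difference of the residual (function names and the tolerance are assumptions for illustration):

```python
import numpy as np

def jacobian_vector(residual, u, v, eps=1e-7):
    """Matrix-free approximation J(u) v ~= (F(u + eps*v) - F(u)) / eps.
    This residual-only product is what lets a Krylov solver act on the
    Jacobian without the Jacobian ever being formed or stored, which is
    exactly the operation that ports cleanly to data-parallel hardware."""
    return (residual(u + eps * v) - residual(u)) / eps
```

Each Krylov iteration then costs one extra residual evaluation, so the entire implicit solve reduces to repeated calls of the same data-parallel residual kernel.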

Relevance: 30.00%

Abstract:

Effective management of chronic diseases is a global health priority. A healthcare information system offers opportunities to address the challenges of chronic disease management. However, the requirements of health information systems are often not well understood, and the accuracy of requirements has a direct impact on the successful design and implementation of a health information system. Our research describes methods used to understand the requirements of health information systems for advanced prostate cancer management. A survey was conducted to identify heterogeneous sources of clinical records. It showed that the General Practitioner was the most common source of patients' clinical records (41%), followed by the Urologist (14%) and other clinicians (14%). Our research describes a method to identify diverse data sources and proposes a novel patient journey browser prototype that integrates disparate data sources.