895 results for Heterogeneous multiprocessors
Abstract:
The heterogeneous photocatalytic water purification process has gained wide attention due to its effectiveness in degrading and mineralizing recalcitrant organic compounds as well as the possibility of utilizing the solar UV and visible light spectrum. This paper aims to review and summarize recently published work in the field of photocatalytic oxidation of toxic organic compounds such as phenols and dyes, predominant in wastewater effluent. In this review, the effects of various operating parameters on the photocatalytic degradation of phenols and dyes are presented. Recent findings suggest that different parameters, such as type of photocatalyst and its composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent type, oxidizing agents/electron acceptors, mode of catalyst application, and calcination temperature, can play an important role in the photocatalytic degradation of organic compounds in the water environment. Extensive research has focused on the enhancement of photocatalysis by modification of TiO2 employing metal, non-metal and ion doping. Recent advances in TiO2 photocatalysis for the degradation of various phenols and dyes are also highlighted in this review.
Abstract:
In recent years, the application of the heterogeneous photocatalytic water purification process has gained wide attention due to its effectiveness in degrading and mineralizing recalcitrant organic compounds as well as the possibility of utilizing the solar UV and visible light spectrum. This paper aims to review and summarize recently published works on the titanium dioxide (TiO2) photocatalytic oxidation of pesticides and phenolic compounds, predominant in storm and wastewater effluents. The effects of various operating parameters on the photocatalytic degradation of pesticides and phenols are discussed. The results reported here suggest that the photocatalytic degradation of organic compounds in the water environment depends on the type of photocatalyst and its composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent type, oxidizing agents/electron acceptors, catalyst application mode, and calcination temperature. A substantial amount of research has focused on the enhancement of TiO2 photocatalysis by modification with metal, non-metal and ion doping. Recent developments in TiO2 photocatalysis for the degradation of various pesticides and phenols are also highlighted in this review. It is evident from the literature survey that photocatalysis has shown good potential for the removal of various organic pollutants; however, the practical utility of this technique at commercial scale still needs to be established.
Abstract:
In recent years, there has been an enormous amount of research and development in the area of heterogeneous photocatalytic water purification processes due to their effectiveness in degrading and mineralising recalcitrant organic compounds as well as the possibility of utilising the solar UV and visible spectrum. One hundred and twenty recently published papers are reviewed and summarised here, with the focus being on the photocatalytic oxidation of phenols and their derivatives, predominant in wastewater effluent. In this review, the effects of various operating parameters on the photocatalytic degradation of phenols and substituted phenols are presented. Recent findings suggest that different parameters, such as type of photocatalyst and its composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent types, oxidising agents/electron acceptors, mode of catalyst application, and calcination temperature, can play an important role in the photocatalytic degradation of phenolic compounds in wastewater. Extensive research has focused on the enhancement of photocatalysis by modification of TiO2 employing metal, non-metal and ion doping. Recent developments in TiO2 photocatalysis for the degradation of various phenols and substituted phenols are also reviewed.
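The dependence of degradation rate on initial substrate concentration discussed across these reviews is commonly described by Langmuir-Hinshelwood kinetics. This is a standard model in the photocatalysis literature rather than a result of any single paper summarised above; a typical form is

r_0 = -\left.\frac{dC}{dt}\right|_{t=0} = \frac{k_r K C_0}{1 + K C_0},

where k_r is the intrinsic reaction rate constant, K the adsorption equilibrium constant of the substrate on the catalyst surface, and C_0 the initial substrate concentration. At low C_0 the expression reduces to apparent first-order kinetics, r_0 ≈ k_r K C_0, which is why pseudo-first-order rate constants are so frequently reported in degradation studies of this kind.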
Abstract:
We present a mass-conservative vertex-centred finite volume method for efficiently solving the mixed form of Richards’ equation in heterogeneous porous media. The spatial discretisation is particularly well suited to heterogeneous media because it produces consistent flux approximations at quadrature points where material properties are continuous. Combined with the method of lines, the spatial discretisation gives a set of differential algebraic equations amenable to solution using higher-order implicit solvers. We investigate the solution of the mixed form using a Jacobian-free inexact Newton solver, which requires the solution of an extra variable for each node in the mesh compared to the pressure-head form. By exploiting the structure of the Jacobian for the mixed form, the size of the preconditioner is reduced to that for the pressure-head form, and there is minimal computational overhead for solving the mixed form. The proposed formulation is tested on two challenging test problems. The solutions from the new formulation conserve mass at least one order of magnitude more accurately than those from a pressure-head formulation, and the higher-order temporal integration significantly improves both the mass balance and the computational efficiency of the solution.
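The Jacobian-free inexact Newton idea mentioned above can be illustrated with a short, generic sketch. This is not the paper's implementation: the discretised mixed-form residual is represented by an arbitrary function F, the tolerances and the finite-difference increment are illustrative, and the Jacobian's action on a vector is approximated by a difference of residuals so that the Jacobian is never assembled.

```python
# Minimal sketch of a Jacobian-free inexact Newton-Krylov iteration.
# F(u) is a generic nonlinear residual (e.g. a discretised mixed-form
# Richards' equation); the Jacobian is never formed -- its action on a
# vector is approximated by a finite difference of residuals.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jacobian_free_newton(F, u0, tol=1e-8, max_newton=20, fd_eps=1e-7):
    u = u0.copy()
    for _ in range(max_newton):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        # J(u) @ v  ~=  (F(u + eps*v) - F(u)) / eps
        def jv(v, u=u, r=r):
            eps = fd_eps * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
            return (F(u + eps * v) - r) / eps
        J = LinearOperator((u.size, u.size), matvec=jv)
        # Inexact linear solve: GMRES driven only by residual evaluations.
        du, _ = gmres(J, -r)
        u = u + du
    return u

# Toy usage: F(u) = u**3 - 1 component-wise, root u = 1.
if __name__ == "__main__":
    F = lambda u: u**3 - 1.0
    print(jacobian_free_newton(F, np.full(5, 2.0)))
```

Only residual evaluations are needed to drive the Krylov iterations, which is what makes the "Jacobian-free" label accurate; in the paper's setting the remaining cost is the (reduced-size) preconditioner.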
Abstract:
An improved mesoscopic model is presented for simulating the drying of porous media. The aim of this model is to account for two scales simultaneously: the scale of the whole product and the scale of the heterogeneities of the porous medium. The innovation of this method is the use of a new mass-conservative scheme based on the Control-Volume Finite-Element (CV-FE) method that partitions the moisture content field over the individual sub-control volumes surrounding each node within the mesh. Although the new formulation has potential for application across a wide range of transport processes in heterogeneous porous media, the focus here is on applying the model to the drying of small sections of softwood consisting of several growth rings. The results show that, when compared to a previously published scheme, only the new mass-conservative formulation correctly captures the true moisture content evolution in the earlywood and latewood components of the growth rings during drying.
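As a deliberately simplified picture of what "partitioning the moisture content field over sub-control volumes" means, the 1D sketch below stores moisture per sub-control volume rather than per node; all coordinates and moisture values are made up, and this is not the paper's actual scheme.

```python
# Minimal 1D illustration of splitting each nodal control volume into
# sub-control volumes (one per adjacent element), so that a conserved
# quantity such as moisture content can differ between the earlywood and
# latewood elements that meet at a growth-ring boundary node.
import numpy as np

x = np.array([0.0, 1.0, 1.5, 3.0])            # node coordinates (illustrative)
h = np.diff(x)                                 # element widths

# Each element contributes half its width to each of its two end nodes:
# scv[e, 0] / scv[e, 1] are the sub-control volumes element e gives to
# its left / right node.
scv = 0.5 * np.column_stack([h, h])

# Moisture is stored per sub-control volume, not per node, so a sharp
# material contrast across a node is preserved exactly.
moisture_scv = np.array([[0.30, 0.30],         # element 0 (e.g. earlywood)
                         [0.30, 0.30],
                         [0.08, 0.08]])        # element 2 (e.g. latewood)

# Total mass is conserved by construction.
total_mass = np.sum(scv * moisture_scv)

# A nodal (volume-averaged) moisture field can still be recovered if needed.
node_volume = np.zeros(x.size)
node_mass = np.zeros(x.size)
for e in range(h.size):
    for local, node in enumerate((e, e + 1)):
        node_volume[node] += scv[e, local]
        node_mass[node] += scv[e, local] * moisture_scv[e, local]
node_moisture = node_mass / node_volume
```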
Abstract:
There is increased interest in Uninhabited Aerial Vehicle (UAV) operations and in research into advanced methods for commanding and controlling multiple heterogeneous UAVs. Research into supervisory control has grown rapidly. Past research has investigated various approaches to autonomous control and operator limitation to improve mission commanders' Situation Awareness (SA) and cognitive workload. The aim of this paper is to address this challenge through a visualisation framework of UAV information constructed from Information Abstraction (IA). This paper presents the concept and process of IA, the visualisation framework constructed using IA, the associated Level Of Detail (LOD) indexing method, and an example visualisation from the framework. Experiments will test the hypothesis that the operator will be able to achieve increased SA and reduced cognitive load with the proposed framework.
Abstract:
A Jacobian-free variable-stepsize method is developed for the numerical integration of the large, stiff systems of differential equations encountered when simulating transport in heterogeneous porous media. Our method utilises the exponential Rosenbrock-Euler method, which is explicit in nature and requires a matrix-vector product involving the exponential of the Jacobian matrix at each step of the integration process. These products can be approximated using Krylov subspace methods, which permit a large integration stepsize to be utilised without having to precondition the iterations. This means that our method is truly "Jacobian-free" - the Jacobian need never be formed or factored during the simulation. We assess the performance of the new algorithm for simulating the drying of softwood. Numerical experiments conducted for both low and high temperature drying demonstrate that the new approach outperforms (in terms of accuracy and efficiency) existing simulation codes that utilise the backward Euler method via a preconditioned Newton-Krylov strategy.
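A minimal sketch of the building blocks named in this abstract, under the assumption of a generic residual function F: one exponential Rosenbrock-Euler step, u_{n+1} = u_n + h·phi_1(h·J_n)·F(u_n) with phi_1(z) = (e^z - 1)/z, where the phi_1 product is approximated in a Krylov subspace built from finite-difference Jacobian-vector products, so the Jacobian is never formed. This is an illustrative implementation, not the paper's code; a production variant would add variable stepsize control and a convergence test on the Krylov approximation.

```python
# One exponential Rosenbrock-Euler step with a Krylov approximation of
# phi_1(h*J) F(u), built from Jacobian-vector products only.
import numpy as np
from scipy.linalg import expm

def phi1_action(matvec, b, m=30):
    """Arnoldi approximation of phi_1(A) b, with A available only as matvec."""
    n = b.size
    beta = np.linalg.norm(b)
    if beta == 0.0:
        return np.zeros(n)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = b / beta
    for j in range(m):
        w = matvec(V[:, j])
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown: exact subspace
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    Hm = H[:m, :m]
    # phi_1(Hm) e_1 via the augmented-matrix trick:
    # expm([[Hm, e_1], [0, 0]]) holds phi_1(Hm) e_1 in its last column.
    aug = np.zeros((m + 1, m + 1))
    aug[:m, :m] = Hm
    aug[0, m] = 1.0
    phi1_e1 = expm(aug)[:m, m]
    return beta * V[:, :m] @ phi1_e1

def rosenbrock_euler_step(F, u, h, fd_eps=1e-7):
    """One step u_{n+1} = u_n + h * phi_1(h J(u_n)) F(u_n), Jacobian-free."""
    r = F(u)
    def hJv(v):  # (h J(u)) v  ~=  h * (F(u + eps v) - F(u)) / eps
        eps = fd_eps * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
        return h * (F(u + eps * v) - r) / eps
    return u + h * phi1_action(hJv, r)
```

Because the step is explicit, no nonlinear solve or preconditioner is needed; the only per-step cost is the residual evaluations that feed the Arnoldi process.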
Abstract:
Background: Cancer outlier profile analysis (COPA) has proven to be an effective approach to analyzing cancer expression data, leading to the discovery of the TMPRSS2 and ETS family gene fusion events in prostate cancer. However, the original COPA algorithm did not identify down-regulated outliers, and the currently available R package implementing the method is similarly restricted to the analysis of over-expressed outliers. Here we present a modified outlier detection method, mCOPA, which contains refinements to the outlier-detection algorithm, identifies both over- and under-expressed outliers, is freely available, and can be applied to any expression dataset. Results: We compare our method to other feature-selection approaches and demonstrate that mCOPA frequently selects more informative features than do differential expression or variance-based feature selection approaches, and is able to recover observed clinical subtypes more consistently. We demonstrate the application of mCOPA to prostate cancer expression data, and explore the use of outliers in clustering, pathway analysis, and the identification of tumour suppressors. We analyse the under-expressed outliers to identify known and novel prostate cancer tumour suppressor genes, validating these against data in Oncomine and the Cancer Gene Index. We also demonstrate how a combination of outlier analysis and pathway analysis can identify molecular mechanisms disrupted in individual tumours. Conclusions: We demonstrate that mCOPA offers advantages, compared to differential expression or variance, in selecting outlier features, and that the features so selected are better able to assign samples to clinically annotated subtypes. Further, we show that the biology explored by outlier analysis differs from that uncovered in differential expression or variance analysis. mCOPA is an important new tool for the exploration of cancer datasets and the discovery of new cancer subtypes, and can be combined with pathway and functional analysis approaches to discover mechanisms underpinning heterogeneity in cancers.
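For readers unfamiliar with COPA-style scoring, the sketch below shows the general idea of a robust outlier transform that scores both over- and under-expressed genes. It is a generic illustration only: the percentile, scaling constant, and downstream ranking are assumptions, not the published mCOPA parameters.

```python
# Generic COPA-style outlier scoring, extended (as mCOPA does conceptually)
# to flag both over- and under-expressed outliers.  Rows of `expr` are genes,
# columns are samples; all cutoffs here are illustrative.
import numpy as np

def copa_scores(expr, percentile=90):
    med = np.median(expr, axis=1, keepdims=True)
    mad = np.median(np.abs(expr - med), axis=1, keepdims=True) * 1.4826
    mad[mad == 0] = 1e-12
    z = (expr - med) / mad                          # robust per-gene standardisation
    up_score = np.percentile(z, percentile, axis=1)          # over-expressed outliers
    down_score = np.percentile(z, 100 - percentile, axis=1)  # under-expressed outliers
    return up_score, down_score
```

Genes can then be ranked by up_score (largest first) to nominate candidate oncogenes, or by down_score (most negative first) to nominate candidate tumour suppressors, which is the direction of analysis the abstract applies to prostate cancer data.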
Abstract:
In this paper, we analyse the impact of a (small) heterogeneity of jump type on the most simple localized solutions of a 3-component FitzHugh–Nagumo-type system. We show that the heterogeneity can pin a 1-front solution, which travels with constant (non-zero) speed in the homogeneous setting, to a fixed, explicitly determined, distance from the heterogeneity. Moreover, we establish the stability of this heterogeneous pinned 1-front solution. In addition, we analyse the pinning of 1-pulse, or 2-front, solutions. The paper is concluded with simulations in which we consider the dynamics and interactions of N-front patterns in domains with M heterogeneities of jump type (N = 3, 4, M ≥ 1).
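For orientation, 3-component FitzHugh-Nagumo-type systems of the kind studied here are usually written in a form similar to the following (the precise scaling and parameter names in the paper may differ), with the jump-type heterogeneity entering through a step-function coefficient \gamma(x):

\begin{aligned}
U_t &= \varepsilon^2 U_{xx} + U - U^3 - \varepsilon\,(\alpha V + \beta W + \gamma(x)),\\
\tau V_t &= V_{xx} + U - V,\\
\theta W_t &= D^2 W_{xx} + U - W,
\end{aligned}
\qquad 0 < \varepsilon \ll 1,

where \gamma(x) is piecewise constant with a single jump. Pinning then means that a 1-front, which travels at constant speed when \gamma is constant, equilibrates at a fixed distance from that jump.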
Abstract:
An increased interest in utilising groups of Unmanned Aerial Vehicles (UAVs) with heterogeneous capabilities and autonomy presents the challenge of effectively managing such groups during missions and operations. This has been the focus of research in recent years, moving from a traditional UAV management paradigm of n-to-1 (n operators for one UAV, with n being at least two operators) toward 1-to-n (one operator, multiple UAVs). This paper expands on the authors’ previous work on a UAV functional capability framework by incorporating the concept of Functional Level of Autonomy (F-LOA) with two configurations. The lower F-LOA configuration contains sufficient information for the operator to generate solutions and make decisions to address perturbation events. Alternatively, the higher F-LOA configuration presents information reflecting the F-LOA of the UAV, allowing the operator to interpret solutions and decisions generated autonomously and decide whether to veto those decisions.
Abstract:
This research was a step forward in developing a data integration framework for Electronic Health Records. The outcome of the research is a conceptual and logical Data Warehousing model for integrating Cardiac Surgery electronic data records. This thesis investigates the main obstacles to healthcare data integration and proposes a data warehousing model suitable for integrating fragmented data in a Cardiac Surgery Unit.
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which use a combination of computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function. This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes would be required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
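The phrase "the finite volume discretisation is expressed in terms of data-parallel operations" can be made concrete with a small sketch. The function below is a generic gather/flux/scatter residual assembly, not the thesis code: the two-point flux and the array names are illustrative. Written this way, the same array operations map directly onto GPU array libraries (e.g. CuPy in place of NumPy), and the residual alone is enough to drive a Jacobian-free Newton-Krylov time step.

```python
# Minimal sketch of a data-parallel control-volume residual assembly:
# gather nodal unknowns for every face, evaluate all face fluxes at once,
# and scatter-add them into the two control volumes each face separates.
import numpy as np

def residual(u, faces, face_area, conductivity):
    """u: nodal unknowns; faces: (nf, 2) array of node indices per face."""
    left, right = faces[:, 0], faces[:, 1]
    # Data-parallel flux evaluation over all faces (illustrative two-point flux).
    flux = conductivity * face_area * (u[right] - u[left])
    # Scatter-add: each face's flux enters one control volume and leaves the other.
    r = np.zeros_like(u)
    np.add.at(r, left, flux)
    np.add.at(r, right, -flux)
    return r
```

The gather (u[left], u[right]), the elementwise flux evaluation, and the scatter-add are all data-parallel primitives, which is what makes a GPU implementation of the residual, and hence of the matrix-free Newton-Krylov solve, natural.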
Abstract:
We propose a cluster ensemble method to map corpus documents into the semantic space embedded in Wikipedia and group them using multiple types of feature space. A heterogeneous cluster ensemble is constructed with multiple types of relations, i.e. document-term, document-concept and document-category. A final clustering solution is obtained by exploiting associations between document pairs and the hubness of the documents. Empirical analysis with various real data sets reveals that the proposed method outperforms state-of-the-art text clustering approaches.
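As a rough illustration of the ensemble idea (not the hubness-based consensus the abstract describes), the sketch below clusters the same documents in several assumed feature spaces, accumulates a co-association matrix over document pairs, and extracts a final consensus clustering from it. The base clusterer, linkage, and cluster count are arbitrary choices for the sketch.

```python
# Generic heterogeneous cluster ensemble: cluster the documents in each
# feature space (e.g. document-term, document-concept, document-category
# matrices), build a pairwise co-association matrix, and derive a consensus
# clustering from it.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

def consensus_clustering(feature_spaces, n_clusters, seed=0):
    n_docs = feature_spaces[0].shape[0]
    coassoc = np.zeros((n_docs, n_docs))
    for X in feature_spaces:
        labels = KMeans(n_clusters=n_clusters, n_init=10,
                        random_state=seed).fit_predict(X)
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    coassoc /= len(feature_spaces)
    # Consensus step: hierarchical clustering on 1 - co-association as a distance.
    # (Older scikit-learn versions use affinity= instead of metric=.)
    final = AgglomerativeClustering(n_clusters=n_clusters, metric="precomputed",
                                    linkage="average")
    return final.fit_predict(1.0 - coassoc)
```

The method in the abstract additionally weights the consensus by document hubness and uses Wikipedia-derived concept and category representations; the sketch only shows the shared ensemble skeleton.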