135 results for CFD, computer modelling, DEM, sugar processing
Prediction of slurry transport in SAG mills using SPH fluid flow in a dynamic DEM based porous media
Abstract:
DEM modelling of the motion of coarse fractions of the charge inside SAG mills has now been well established for more than a decade. In these models the effect of slurry has broadly been ignored due to its complexity. Smoothed particle hydrodynamics (SPH) provides a particle-based method for modelling complex free-surface fluid flows and is well suited to modelling fluid flow in mills. Previous modelling has demonstrated the powerful ability of SPH to capture dynamic fluid flow effects such as lifters crashing into slurry pools, fluid draining from lifters, flow through grates and pulp lifter discharge. However, all these examples were limited to modelling only the slurry in the mill, without the charge. In this paper, we represent the charge as a dynamic porous medium through which the SPH fluid is then able to flow. The porous media properties (specifically the spatial distributions of porosity and velocity) are predicted by time averaging the mill charge predicted by a large-scale DEM model. This allows prediction of transient and steady-state slurry distributions in the mill, and allows their variation with operating parameters, such as slurry viscosity and slurry volume, to be explored. (C) 2006 Published by Elsevier Ltd.
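The time-averaging step described above can be sketched as follows: a hypothetical `porosity_field` helper bins DEM particle positions onto a grid and averages the solid fraction over snapshots. The centre-counting approximation and all names here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def porosity_field(snapshots, particle_radius, grid_shape, domain_size):
    """Estimate a time-averaged 2-D porosity field from DEM particle snapshots.

    snapshots: list of (N, 2) arrays of particle centre positions, one per timestep.
    Solid area in each cell is approximated by counting particle centres,
    which is crude but illustrates the time-averaging idea.
    """
    nx, ny = grid_shape
    lx, ly = domain_size
    cell_area = (lx / nx) * (ly / ny)
    particle_area = np.pi * particle_radius ** 2
    solid = np.zeros(grid_shape)
    for pos in snapshots:
        ix = np.clip((pos[:, 0] / lx * nx).astype(int), 0, nx - 1)
        iy = np.clip((pos[:, 1] / ly * ny).astype(int), 0, ny - 1)
        np.add.at(solid, (ix, iy), particle_area)  # accumulate solid area per cell
    solid /= len(snapshots)                        # time average over snapshots
    return np.clip(1.0 - solid / cell_area, 0.0, 1.0)  # porosity = 1 - solid fraction
```

The resulting field would then supply the spatially varying drag term for the SPH flow solver.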
Abstract:
A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work list scheduling to performers, the identification of time constraints, and more. The goal of this paper is to address an important issue in workflow modelling and specification: data flow, its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, focussing mainly on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion of the essential requirements of the workflow data model needed to support data validation is also given.
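One of the data flow problems of this kind (an activity reading a data item that no earlier activity produces) can be checked mechanically. The `missing_data_errors` helper and its activity representation below are illustrative assumptions, not the paper's formal model.

```python
def missing_data_errors(activities, order):
    """Flag reads of data items that no earlier activity has written.

    activities: {name: {"reads": set, "writes": set}}
    order: activity names in execution order (a sequential workflow, for simplicity).
    Returns a list of (activity, missing_item) pairs found before deployment.
    """
    written, errors = set(), []
    for name in order:
        act = activities[name]
        for item in act["reads"]:
            if item not in written:       # read with no upstream writer
                errors.append((name, item))
        written |= act["writes"]          # items available to later activities
    return errors
```

A real validator would walk all paths of the control-flow graph rather than a single sequence, but the per-path check is the same.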
Abstract:
This paper proposes some variants of Temporal Defeasible Logic (TDL) for reasoning about normative modifications. These variants make it possible to distinguish cases in which, for example, a modification at some time changes the legal rules but their conclusions persist afterwards, from cases in which the conclusions are blocked as well.
Abstract:
A Geographic Information System (GIS) was used to model datasets of Leyte Island, the Philippines, to identify land which was suitable for a forest extension program on the island. The datasets were modelled to provide maps of the distance of land from cities and towns, land which was at a suitable elevation and slope for smallholder forestry, and land of various soil types. An expert group was used to assign numeric site suitabilities to the soil types, and maps of site suitability were used to assist the selection of municipalities for the provision of extension assistance to smallholders. Modelling of the datasets was facilitated by recent developments of the ArcGIS® suite of computer programs, and derivation of elevation and slope was assisted by the availability of digital elevation models (DEM) produced by the Shuttle Radar Topography Mission (SRTM). The usefulness of GIS software as a decision support tool for small-scale forestry extension programs is discussed.
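A minimal sketch of the kind of raster overlay involved, assuming illustrative elevation and slope thresholds rather than the criteria actually used for Leyte:

```python
import numpy as np

def suitability_map(elevation, slope, soil_score,
                    max_elev=500.0, max_slope=30.0):
    """Combine raster layers into a 0-1 suitability surface.

    Cells above the elevation or slope thresholds are excluded (score 0);
    otherwise the expert-assigned soil score (0-1) is carried through.
    Thresholds and layer names are illustrative only.
    """
    feasible = (elevation <= max_elev) & (slope <= max_slope)
    return np.where(feasible, soil_score, 0.0)
```

In a GIS package the same operation is a raster calculator expression; doing it on arrays makes the overlay logic explicit.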
Abstract:
OctVCE is a Cartesian cell CFD code produced especially for numerical simulations of shock and blast wave interactions with complex geometries. Virtual Cell Embedding (VCE) was chosen as its Cartesian cell kernel as it is simple to code and sufficient for practical engineering design problems. This also makes the code much more ‘user-friendly’ than structured grid approaches, as the gridding process is performed automatically. The CFD methodology relies on a finite-volume formulation of the unsteady Euler equations, which is solved using a standard explicit Godunov (MUSCL) scheme. Both octree-based adaptive mesh refinement and shared-memory parallel processing capability have also been incorporated. For further details on the theory behind the code, see the companion report 2007/12.
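The explicit Godunov (MUSCL) update can be illustrated on the simplest case, 1-D linear advection on a periodic domain. This sketch (minmod limiter, forward-Euler step) is a generic textbook MUSCL scheme, not OctVCE's actual Euler-equation solver:

```python
import numpy as np

def minmod(a, b):
    """Slope limiter: the smaller of two one-sided slopes, zero at extrema."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(abs(a), abs(b)), 0.0)

def muscl_step(u, c, dt, dx):
    """One explicit finite-volume step for u_t + c u_x = 0 (c > 0), periodic domain.

    Piecewise-linear MUSCL reconstruction gives second-order accuracy in
    smooth regions while the limiter suppresses oscillations at discontinuities.
    """
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    u_left = u + 0.5 * slope          # upwind (left) state at each right face, c > 0
    flux = c * u_left                 # Godunov flux at the right face of each cell
    return u - dt / dx * (flux - np.roll(flux, 1))
```

Conservation holds by construction: the flux leaving one cell enters its neighbour, so the cell average sum is unchanged.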
Abstract:
The cost of spatial join processing can be very high because of the large sizes of spatial objects and the computation-intensive spatial operations. While parallel processing seems a natural solution to this problem, it is not clear how spatial data should be partitioned for this purpose. Various spatial data partitioning methods are examined in this paper. A framework combining the data-partitioning techniques used by most parallel join algorithms in relational databases and the filter-and-refine strategy for spatial operation processing is proposed for parallel spatial join processing. Object duplication caused by multi-assignment in spatial data partitioning can result in extra CPU cost as well as extra communication cost. We find that the key to overcoming this problem is to preserve spatial locality in task decomposition. We show in this paper that a near-optimal speedup can be achieved for parallel spatial join processing using our new algorithms.
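The filter step of such a partition-based join, with a reference-point trick to remove the duplicates caused by multi-assignment, can be sketched as below. The grid layout and function names are illustrative; each cell's work is independent, so the per-cell loop is what would be farmed out to parallel workers.

```python
from itertools import product

def bbox_intersects(a, b):
    """Boxes are (x0, y0, x1, y1); the cheap filter test before refinement."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def grid_cells(box, nx, ny, extent):
    """Cells of an nx*ny grid over `extent` that a box overlaps
    (multi-assignment: one box may land in several cells)."""
    x0, y0, x1, y1 = extent
    cw, ch = (x1 - x0) / nx, (y1 - y0) / ny
    cxs = range(max(0, int((box[0] - x0) // cw)), min(nx - 1, int((box[2] - x0) // cw)) + 1)
    cys = range(max(0, int((box[1] - y0) // ch)), min(ny - 1, int((box[3] - y0) // ch)) + 1)
    return product(cxs, cys)

def partitioned_spatial_join(r, s, nx=4, ny=4, extent=(0.0, 0.0, 100.0, 100.0)):
    """Filter step of a partition-based spatial join on bounding boxes.

    A candidate pair is reported only in the cell containing the lower-left
    corner of the boxes' intersection, so duplicates from multi-assignment
    are removed without any inter-task communication.
    """
    buckets = {}
    for objs, side in ((r, 0), (s, 1)):
        for i, box in enumerate(objs):
            for cell in grid_cells(box, nx, ny, extent):
                buckets.setdefault(cell, ([], []))[side].append((i, box))
    x0, y0, _, _ = extent
    cw, ch = (extent[2] - x0) / nx, (extent[3] - y0) / ny
    result = []
    for (cx, cy), (rs, ss) in buckets.items():      # one independent task per cell
        for (i, a), (j, b) in product(rs, ss):
            if bbox_intersects(a, b):
                px, py = max(a[0], b[0]), max(a[1], b[1])  # intersection lower-left
                if int((px - x0) // cw) == cx and int((py - y0) // ch) == cy:
                    result.append((i, j))
    return sorted(result)
```

The surviving candidate pairs would then go to the refine step, which tests the exact geometries.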
Abstract:
We use a spatially explicit population model to explore the population consequences of different habitat selection mechanisms on landscapes with fractal variation in habitat quality. We consider dispersal strategies ranging from random walks to perfect habitat selectors for two species of arboreal marsupial, the greater glider (Petauroides volans) and the mountain brushtail possum (Trichosurus caninus). In this model, increasing habitat selection means individuals obtain higher-quality territories but experience increased mortality during dispersal. The net effect is that population sizes are smaller when individuals actively select habitat. We find that positive relationships between habitat quality and population size can occur even when individuals do not use information about the entire landscape, provided habitat quality is spatially autocorrelated. We also find that individual behaviour can mitigate the negative effects of spatial variation on population average survival and fecundity. (C) 1998 Elsevier Science Ltd. All rights reserved.
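The trade-off described above (better territories versus higher dispersal mortality) can be captured in a toy dispersal rule; the `settle` function below is an illustrative sketch, not the paper's model:

```python
import random

def settle(landscape, n_dispersers, sites_examined, step_mortality, rng):
    """Survivors and mean territory quality under a simple dispersal rule.

    Each disperser inspects up to `sites_examined` random sites, paying a
    mortality risk per site examined, and keeps the best site seen.
    More selective dispersal yields better territories but fewer survivors.
    """
    survivors = []
    for _ in range(n_dispersers):
        best = 0.0
        alive = True
        for _ in range(sites_examined):
            if rng.random() < step_mortality:   # risk paid per site inspected
                alive = False
                break
            best = max(best, rng.choice(landscape))
        if alive:
            survivors.append(best)
    mean_quality = sum(survivors) / len(survivors) if survivors else 0.0
    return len(survivors), mean_quality
```

Running this with increasing `sites_examined` reproduces the qualitative result: mean territory quality rises while population size falls.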
Abstract:
Inhibitors of proteolytic enzymes (proteases) are emerging as prospective treatments for diseases such as AIDS and viral infections, cancers, inflammatory disorders, and Alzheimer's disease. Generic approaches to the design of protease inhibitors are limited by the unpredictability of interactions between, and structural changes to, inhibitor and protease during binding. A computer analysis of superimposed crystal structures for 266 small molecule inhibitors bound to 48 proteases (16 aspartic, 17 serine, 8 cysteine, and 7 metallo) provides the first conclusive proof that inhibitors, including substrate analogues, commonly bind in an extended beta-strand conformation at the active sites of all these proteases. Representative superimposed structures are shown for (a) multiple inhibitors bound to a protease of each class, (b) single inhibitors each bound to multiple proteases, and (c) conformationally constrained inhibitors bound to proteases. Thus inhibitor/substrate conformation, rather than sequence/composition alone, influences protease recognition, and this has profound implications for inhibitor design. This conclusion is supported by NMR, CD, and binding studies for HIV-1 protease inhibitors/substrates which, when preorganized in an extended conformation, have significantly higher protease affinity. Recognition is dependent upon conformational equilibria since helical and turn peptide conformations are not processed by proteases. Conformational selection explains the resistance of folded/structured regions of proteins to proteolytic degradation, the susceptibility of denatured proteins to processing, and the higher affinity of conformationally constrained 'extended' inhibitors/substrates for proteases. Other approaches to extended inhibitor conformations should similarly lead to high-affinity binding to a protease.
Abstract:
Numerical methods are used to solve double diffusion driven reactive flow transport problems in deformable fluid-saturated porous media. In particular, the temperature dependent reaction rate in the non-equilibrium chemical reactions is considered. A general numerical solution method, which is a combination of the finite difference method in FLAC and the finite element method in FIDAP, has been developed to solve the fully coupled problem involving material deformation, pore-fluid flow, heat transfer and species transport/chemical reactions in deformable fluid-saturated porous media. The coupled problem is divided into two subproblems which are solved interactively until the convergence requirement is met. Owing to the approximate nature of the numerical method, it is essential to justify the numerical solutions through some kind of theoretical analysis. This has been highlighted in this paper. The related numerical results, which are justified by the theoretical analysis, have demonstrated that the proposed solution method is useful for and applicable to a wide range of fully coupled problems in the field of science and engineering.
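The interactive solution of the two subproblems amounts to a fixed-point iteration. The sketch below shows the generic pattern on scalar stand-ins for the mechanical and flow/transport solvers; the function names and tolerance are assumptions, not the paper's setup:

```python
def solve_coupled(f, g, x0, y0, tol=1e-8, max_iter=100):
    """Alternate between two subproblem solvers until self-consistency.

    Mirrors the strategy of splitting a fully coupled problem into a
    mechanical subproblem (here f) and a flow/transport subproblem (here g)
    and iterating until successive iterates stop changing.
    """
    x, y = x0, y0
    for it in range(max_iter):
        x_new = f(y)                 # solve subproblem 1 with the latest y
        y_new = g(x_new)             # solve subproblem 2 with the updated x
        if abs(x_new - x) + abs(y_new - y) < tol:
            return x_new, y_new, it + 1
        x, y = x_new, y_new
    raise RuntimeError("coupled iteration did not converge")
```

Convergence of such an iteration is not automatic, which is exactly why the theoretical justification emphasised in the paper matters.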
Abstract:
Normal mixture models are increasingly being used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a set of data containing a group or groups of observations with longer-than-normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data by a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described, and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
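The reason t components resist atypical observations is the E-step weight (nu + p)/(nu + d^2), which down-weights points far from the component centre, where a normal component would weight all points equally. A one-component univariate sketch of this mechanism (illustrative only, not the paper's multivariate ECM):

```python
def t_weight(d2, nu, p=1):
    """E-step weight for a p-dimensional t component: observations with large
    squared Mahalanobis distance d2 are down-weighted (a normal gives weight 1)."""
    return (nu + p) / (nu + d2)

def robust_mean(xs, nu=4.0, n_iter=50):
    """Location estimate for a single univariate t component (known nu) via
    iterative reweighting, showing why t mixtures resist atypical points."""
    mu = sum(xs) / len(xs)
    sigma2 = sum((x - mu) ** 2 for x in xs) / len(xs)
    for _ in range(n_iter):
        w = [t_weight((x - mu) ** 2 / sigma2, nu) for x in xs]   # E-step
        mu = sum(wi * x for wi, x in zip(w, xs)) / sum(w)        # M-step: location
        sigma2 = sum(wi * (x - mu) ** 2 for wi, x in zip(w, xs)) / len(xs)  # scale
    return mu
```

With an outlier in the sample, the ordinary mean is dragged far from the bulk of the data while the t-based estimate stays near it; in the full mixture model each component applies the same weighting within its E-step responsibilities.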