813 results for Constraint based modelling
Abstract:
Mineralogical analysis is often used to assess the liberation properties of particles. A direct method of estimating liberation is to break particles and obtain liberation information by applying mineralogical analysis to each size class of the product. A useful alternative is to apply simulated random breakage to sections of the feed particles to estimate the resulting distribution of product particle sections. Because this technique operates on particle sections, the liberation properties of the particles themselves can only be estimated by applying a stereological correction. A recently developed stereological technique uses the discrepancy between the linear intercept composition distribution and the particle section composition distribution as a guide for estimating the particle composition distribution. This paper presents results validating the new technique using numerical simulation.
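The stereological discrepancy the abstract refers to can be illustrated with a toy Monte Carlo experiment (an illustration only, not the authors' correction method): for a spherical two-phase particle, uniformly sampled linear intercepts give an apparent-grade distribution smeared around the true volumetric grade, and a chord-by-chord average does not recover it, whereas a length-weighted average does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-phase particle: unit sphere whose mineral phase occupies z > h.
h = 0.2
true_grade = (1 - h) ** 2 * (2 + h) / 4   # spherical-cap volume fraction = 0.352

# Sample vertical chords uniformly over the particle's projected disc
# (points uniform on the unit disc have r^2 uniform on [0, 1]).
n = 200_000
c = np.sqrt(1.0 - rng.uniform(0.0, 1.0, n))   # chord half-length; chord spans [-c, c]
mineral_len = np.clip(c - h, 0.0, None)       # portion of each chord in the mineral
apparent = mineral_len / (2 * c)              # linear-intercept grade of each chord

print(f"true 3-D grade           : {true_grade:.3f}")
print(f"mean intercept grade     : {apparent.mean():.3f}")  # biased low here
print(f"length-weighted estimate : {mineral_len.sum() / (2 * c).sum():.3f}")  # ~true
```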
Abstract:
Modelling and optimization of the power draw of large SAG/AG mills is important because of the large power draw that modern mills require (5-10 MW). Grinding is the single biggest cost within the entire process of mineral extraction. Traditionally, mill power draw has been modelled empirically. Although such models are reliable, they cannot represent mills and operating conditions that lie outside the boundaries of the model database. Moreover, because they are static, they cannot capture the effect of changing in-mill conditions on the power draw. Despite advances in computing power, discrete element method (DEM) modelling of large mills with many thousands of particles can be a time-consuming task. The speed of computation is determined principally by two parameters: the number of particles involved and the material properties. The computational time step is set by the size of the smallest particle present in the model and by the material properties (stiffness): small particles force a short time step, while larger particles permit a larger one. Hence, from the point of view of the time required for modelling (which usually corresponds to 3-4 mill revolutions), it is advantageous that the smallest particles in the model are not unnecessarily small. The objective of this work is to compare the net power draw of a mill whose charge is characterised by different size distributions, while preserving a constant charge mass and mill speed.
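A standard illustration of why the smallest particle governs the cost: for a linear-spring contact model the stable integration step is commonly taken as a fraction of dt_crit = 2*sqrt(m/k), so halving the minimum particle radius cuts the mass eightfold and shrinks the step by roughly 2.8x. A minimal sketch using generic values (none taken from the paper):

```python
import math

def dem_critical_timestep(radius_m, density_kg_m3, stiffness_N_m):
    """Critical step for a linear-spring contact model: dt_crit = 2*sqrt(m/k).
    Production DEM codes usually run at a safety fraction (~0.1-0.2) of this."""
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3
    return 2.0 * math.sqrt(mass / stiffness_N_m)

# Generic illustrative values (not from the paper): ore density 2700 kg/m^3,
# linear contact stiffness 1e6 N/m.
for r_mm in (5, 10, 20, 40, 80):
    dt = dem_critical_timestep(r_mm / 1000.0, 2700.0, 1.0e6)
    print(f"smallest particle radius {r_mm:3d} mm -> dt_crit ~ {dt:.2e} s")
```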
Abstract:
Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related to and compared with previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
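A minimal sketch of the core idea, assuming a one-dimensional Gaussian model and a plain score-function Monte Carlo estimator of the KL gradient; the paper's actual update rule and experiments are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Objective to maximise; treated (up to a constant) as a log-density.
    return -(x - 3.0) ** 2

# Gaussian search model q(x) = N(mu, sigma^2), adapted by stochastic
# gradient descent on KL(q || p) with target density p(x) proportional to exp(f(x)).
mu, log_sigma = 0.0, 0.0
lr, n_samples = 0.05, 50

for step in range(300):
    sigma = np.exp(log_sigma)
    x = mu + sigma * rng.standard_normal(n_samples)
    # Score functions of the Gaussian w.r.t. mu and log(sigma).
    score_mu = (x - mu) / sigma**2
    score_ls = ((x - mu) ** 2 / sigma**2) - 1.0
    # Integrand of the KL gradient, (log q - log p); normalising constants
    # drop out because E_q[score] = 0, so a sample baseline suffices.
    w = (-0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)) - f(x)
    w -= w.mean()  # variance-reducing baseline
    mu -= lr * np.mean(w * score_mu)
    log_sigma -= lr * np.mean(w * score_ls)

print(f"mu ~ {mu:.2f}, sigma ~ {np.exp(log_sigma):.3f}")  # expect mu near 3
```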
Abstract:
We have determined the three-dimensional structure of the complex between latexin and carboxypeptidase A using a combination of chemical cross-linking, mass spectrometry and molecular docking. The locations of three intermolecular cross-links were identified by mass spectrometry, and these constraints were used in combination with a speed-optimised docking algorithm, allowing us to evaluate more than 3 × 10^11 possible conformations. Although cross-links represent only limited structural constraints, the combination of just three experimental cross-links with very basic molecular docking was sufficient to determine the complex structure. The recently determined crystal structure of the complex between latexin and carboxypeptidase A4 allowed us to assess the success of this structure-determination approach: our structure lies within 4 Å r.m.s. deviation over Cα atoms of the crystal structure. The study demonstrates that cross-linking in combination with mass spectrometry can enable efficient and accurate structural modelling of protein complexes.
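The way cross-link distance constraints typically enter such docking can be sketched as a filter over candidate poses. This is a hedged illustration: the array layout, the 24 Å maximum span and the function name are assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical data layout: for each candidate pose, the C-alpha coordinates of
# the residues joined by each experimental cross-link: shape (N_poses, N_links, 2, 3).
MAX_SPAN_A = 24.0  # assumed maximum C-alpha span for the cross-linker used

def satisfied(pair_coords, max_span=MAX_SPAN_A):
    """Boolean mask of poses in which every cross-link distance is satisfied."""
    d = np.linalg.norm(pair_coords[:, :, 0, :] - pair_coords[:, :, 1, :], axis=-1)
    return (d <= max_span).all(axis=1)

# Toy usage: 5 random poses, 3 cross-links each.
rng = np.random.default_rng(2)
poses = rng.uniform(0.0, 40.0, size=(5, 3, 2, 3))
print(satisfied(poses))
```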
Abstract:
This thesis describes research into business user involvement in the information systems application building process. The main interest of this research is in establishing and testing techniques to quantify the relationships between identified success factors and the outcome effectiveness of 'business user development' (BUD). A mechanism that measures the levels of the success factors and quantifiably relates them to outcome effectiveness is important because it gives an organisation the capability to predict and monitor effects on BUD outcome effectiveness. This is particularly important in an era where BUD levels have risen dramatically, the benefits of user-centred information systems development are recognised as significant, and awareness of the risks of uncontrolled BUD activity is becoming more widespread. This research targets the measurement and prediction of BUD success factors and implementation effectiveness for particular business users. A questionnaire instrument and analysis technique have been developed and tested which together constitute a tool for predicting and monitoring BUD outcome effectiveness; they are based on the BUDES (Business User Development Effectiveness and Scope) research model, which is introduced and described in this thesis. The questionnaire instrument is designed for completion by 'business users', a target community defined more explicitly as 'people who primarily have a business role within an organisation'. The instrument, named BUD ESP (Business User Development Effectiveness and Scope Predictor), can readily be used with survey participants and has been shown to give meaningful and representative results.
Abstract:
The topic of this thesis is the development of knowledge-based statistical software. The shortcomings of conventional statistical packages are discussed to illustrate the need for software that exhibits a greater degree of statistical expertise, thereby reducing the misuse of statistical methods by those not well versed in the art of statistical analysis. Some of the issues involved in the development of knowledge-based software are presented, and a review is given of some of the systems developed so far. The majority of these have moved away from conventional architectures by adopting what can be termed an expert-systems approach. The thesis then proposes an approach based on the concept of semantic modelling. By representing some of the semantic meaning of data, a system could examine a request to apply a statistical technique and check whether the use of the chosen technique is semantically sound, i.e. whether the results obtained will be meaningful. Current systems, in contrast, can perform only what may be considered syntactic checks. The prototype system implemented to explore the feasibility of this approach is presented; it has been designed as an enhanced variant of a conventional-style statistical package. This involved developing a semantic data model to represent some of the statistically relevant knowledge about data, and identifying sets of requirements that must be met for the application of a statistical technique to be valid. The areas of statistics covered in the prototype are measures of association and tests of location.
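A minimal sketch of the kind of semantic check described, assuming a hypothetical requirements table keyed on scales of measurement; all names and rules here are illustrative, not taken from the thesis:

```python
from dataclasses import dataclass

@dataclass
class Variable:
    name: str
    scale: str  # 'nominal', 'ordinal', 'interval', or 'ratio'

# Hypothetical requirements table: minimum measurement scale per technique.
REQUIREMENTS = {
    "pearson_correlation": "interval",
    "spearman_correlation": "ordinal",
    "chi_square_association": "nominal",
}
ORDER = ["nominal", "ordinal", "interval", "ratio"]

def semantically_valid(technique, *variables):
    """Check that each variable meets the technique's minimum scale of measurement."""
    needed = ORDER.index(REQUIREMENTS[technique])
    return all(ORDER.index(v.scale) >= needed for v in variables)

income = Variable("income", "ratio")
region = Variable("region", "nominal")
print(semantically_valid("pearson_correlation", income, region))    # False: region is nominal
print(semantically_valid("chi_square_association", income, region)) # True
```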
Abstract:
With new and emerging e-business technologies available to transform business processes, it is important to understand how those technologies will affect the performance of a business. Will the overall business process be cheaper, faster and more accurate, or will a sub-optimal change have been implemented? The use of simulation to model the behaviour of business processes is well established, and it has been applied to e-business processes to understand their performance in terms of measures such as lead time, cost and responsiveness. This paper introduces the concept of simulation components, which enable simulation models of e-business processes to be built quickly from generic e-business templates. The paper describes how these components were devised and presents the results of their application in case studies.
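As a hedged illustration of the component idea, the sketch below assembles a tiny e-business process from one reusable process-step template using the generic discrete-event library simpy; the paper's own templates and tooling are not specified here, so every name, rate and capacity is an assumption:

```python
import random
import simpy  # generic discrete-event simulation library, used here for illustration

def process_step(env, name, resource, mean_service_time, downstream=None):
    """A reusable 'component': queue for a resource, serve, then hand off."""
    def handle(order):
        with resource.request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / mean_service_time))
            print(f"{env.now:6.1f}  {name} finished {order}")
            if downstream:
                env.process(downstream(order))
    return handle

env = simpy.Environment()
payment = process_step(env, "payment", simpy.Resource(env, capacity=1), 2.0)
ordering = process_step(env, "order-entry", simpy.Resource(env, capacity=2), 1.0,
                        downstream=payment)

def arrivals(env):
    for i in range(10):
        yield env.timeout(random.expovariate(1.0))  # ~1 order per time unit
        env.process(ordering(f"order-{i}"))

random.seed(0)
env.process(arrivals(env))
env.run()
print(f"simulation horizon: t = {env.now:.1f}")
```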
Abstract:
Parameter optimization of a two-stage Raman fibre converter (RFC) based on phosphosilicate-core fibre is presented. The optimal operational regime was determined, and the tolerance of the converter to variations in laser parameters was analysed. The converter was pumped by an ytterbium-doped double-clad fibre laser with a maximum output power of 3.8 W at 1061 nm. A phosphosilicate-core RFC with enhanced performance was fabricated using the results of the numerical modelling.
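Numerical modelling of such converters typically rests on steady-state coupled power equations for each pump/Stokes pair, cascaded over the two conversion stages; a textbook form is given here for orientation, not reproduced from the paper:

```latex
\frac{dP_s}{dz} = \frac{g_R}{A_{\mathrm{eff}}}\,P_p P_s - \alpha_s P_s,
\qquad
\frac{dP_p}{dz} = -\frac{\nu_p}{\nu_s}\,\frac{g_R}{A_{\mathrm{eff}}}\,P_p P_s - \alpha_p P_p
```

Here g_R is the Raman gain coefficient, A_eff the effective mode area, α the fibre losses, and the ν_p/ν_s factor accounts for the quantum defect; the large Raman shift of phosphosilicate fibre (~1330 cm⁻¹) is what allows the target wavelength to be reached in fewer cascaded stages than in standard silica fibre.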