12 results for Project of product
in University of Queensland eSpace - Australia
Abstract:
Process optimisation and optimal control of batch and continuous drum granulation processes are studied in this paper. The main focus of the research is: (i) construction of optimisation- and control-relevant population balance models through the incorporation of moisture content, drum rotation rate and bed depth into the coalescence kernels; (ii) investigation of optimal operational conditions using constrained optimisation techniques; (iii) development of optimal control algorithms based on discretised population balance equations; and (iv) comprehensive simulation studies on the optimal control of both batch and continuous granulation processes. For continuous processes, the objective of steady-state optimisation is to minimise the recycle rate at minimum cost. The drum rotation rate, bed depth (material charge) and moisture content of the solids have been identified as practical decision (design) parameters for system optimisation. The objective for the optimal control of batch granulation is to maximise the mass of product-sized particles with minimum time and binder consumption. The objective for the optimal control of the continuous process is to drive the process from one steady state to another in minimum time with minimum binder consumption, also known as the state-driving problem. It has been known for some time that the binder spray rate is the most effective control (manipulated) variable. Although other possible manipulated variables, such as the feed flow-rate and additional powder flow-rate, have been investigated in the complete research project, only the single-input problem, with the binder spray rate as the manipulated variable, is addressed in this paper to demonstrate the methodology. Simulation results show that the proposed models are suitable for control and optimisation studies, and that the optimisation algorithms coupled with either steady-state or dynamic models successfully determine optimal operational conditions and dynamic trajectories, with good convergence properties. (c) 2005 Elsevier Ltd. All rights reserved.
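The discretised population-balance approach lends itself to a compact simulation. The sketch below is illustrative only: it assumes a constant coalescence kernel scaled linearly by binder spray rate, not the moisture/rotation-rate/bed-depth kernels developed in the paper, and all names and values are hypothetical.

```python
import numpy as np

# Illustrative sketch of a discretised population balance for batch
# coalescence (Smoluchowski form on an integer-mass grid). The constant
# kernel scaled by binder spray rate is a demonstration assumption, not
# the paper's kernel. Classes above N_CLASSES are truncated, so total
# mass slowly leaks out of this sketch.

N_CLASSES = 40          # size classes: class i holds mass (i + 1) monomers
BETA0 = 1e-4            # base coalescence rate constant (hypothetical units)
SPRAY_RATE = 2.0        # binder spray rate, used here as a kernel multiplier
DT, STEPS = 0.05, 400   # explicit Euler time step and step count

n = np.zeros(N_CLASSES)
n[0] = 100.0            # start from monomer-sized granules only
beta = BETA0 * SPRAY_RATE

for _ in range(STEPS):
    dn = np.zeros_like(n)
    total = n.sum()
    for i in range(N_CLASSES):
        # birth: coalescence of classes j and (i - j - 1) forms class i
        birth = 0.5 * beta * sum(n[j] * n[i - j - 1] for j in range(i))
        # death: class i coalescing with any other class
        death = beta * n[i] * total
        dn[i] = birth - death
    n += DT * dn

mass = np.arange(1, N_CLASSES + 1) * n
print("mass fraction in product-sized classes 10-20:",
      mass[9:20].sum() / mass.sum())
```

An optimal-control study in the spirit of the abstract would then treat SPRAY_RATE as a time-varying input and optimise its trajectory against the product-size objective.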
Abstract:
This paper summarises test results used to validate a model and scale-up procedure for the high pressure grinding roll (HPGR) developed at the JKMRC by Morrell et al. [Morrell, Lim, Tondo, David, 1996. Modelling the high pressure grinding rolls. In: Mining Technology Conference, pp. 169-176]. Verification of the model is based on four data sets describing the performance of three industrial-scale units fitted with both studded and smooth roll surfaces. The industrial units are currently in operation in the diamond mining industry, at De Beers, BHP Billiton and Rio Tinto operations. Ore samples from the De Beers and BHP Billiton operations were sent to the JKMRC for ore characterisation and laboratory-scale HPGR tests. Rio Tinto contributed a historical data set of tests completed during a previous research project. The results show that modelling of the HPGR process has matured to the point where the model may be used to evaluate new comminution circuits and to optimise existing ones. The model's prediction of product size distribution is good and has been found to be strongly dependent on the characteristics of the material being tested. The prediction of throughput, and of the corresponding power draw (which is calculated from throughput), is sensitive to the inconsistent gap/diameter ratios observed between laboratory-scale tests and full-scale operations. (C) 2004 Elsevier Ltd. All rights reserved.
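For orientation, HPGR throughput and its scale-up are commonly expressed in the literature through the following standard relations (stated here for context, not quoted from the paper):

$$\dot{m} = x_g \, L \, v \, \rho_f, \qquad \dot{m}^{*} = \frac{\dot{m}}{D \, L \, v} = \frac{x_g}{D}\,\rho_f,$$

where $x_g$ is the operating gap, $L$ the roll length, $v$ the peripheral roll speed, $\rho_f$ the density of the compacted flake and $D$ the roll diameter. Because the specific throughput $\dot{m}^{*}$ scales with the gap-to-diameter ratio $x_g/D$, any inconsistency in $x_g/D$ between laboratory and full-scale units feeds directly into throughput, and hence power-draw, prediction error.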
Abstract:
Mineralogical analysis is often used to assess the liberation properties of particles. A direct method of estimating liberation is to actually break particles and then obtain liberation information by applying mineralogical analysis to each size class of the product. Another technique is to artificially apply random breakage to the feed particle sections to estimate the resulting distribution of product particle sections, which provides a useful alternative estimation method. Because this technique operates on particle sections, the actual liberation properties of the particles can only be estimated by applying a stereological correction. A recently developed stereological technique allows the discrepancy between the linear-intercept composition distribution and the particle-section composition distribution to be used as a guide for estimating the particle composition distribution. The paper shows results validating this new technique using numerical simulation. (C) 2004 Elsevier Ltd. All rights reserved.
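To see why a stereological correction is needed at all, the toy Monte Carlo below compares per-chord (linear-intercept) compositions against the true volumetric composition of a sphere split into two phases by a plane. The geometry is a hypothetical illustration of the bias, not the paper's estimation method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration of stereological bias: a sphere of radius R is split by
# the plane z = z0 into two "mineral" phases. Individual linear intercepts
# (chords) report compositions that scatter widely around the true
# volumetric grade -- the bias a stereological correction must undo.

R, z0, n_chords = 1.0, 0.2, 50_000

# Isotropic uniform random chords: random direction u, random offset point
# p drawn uniformly on the disc perpendicular to u.
u = rng.normal(size=(n_chords, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
a = rng.normal(size=(n_chords, 3))
e1 = a - np.sum(a * u, axis=1, keepdims=True) * u
e1 /= np.linalg.norm(e1, axis=1, keepdims=True)
e2 = np.cross(u, e1)
r = R * np.sqrt(rng.random(n_chords))
th = 2 * np.pi * rng.random(n_chords)
p = (r * np.cos(th))[:, None] * e1 + (r * np.sin(th))[:, None] * e2

half = np.sqrt(R**2 - r**2)               # half chord length
z_lo = p[:, 2] - half * np.abs(u[:, 2])   # z range swept by the chord
z_hi = p[:, 2] + half * np.abs(u[:, 2])
# fraction of each chord lying in the upper phase (z > z0)
frac = np.clip((z_hi - z0) / np.maximum(z_hi - z_lo, 1e-12), 0.0, 1.0)

h = R - z0                                # spherical-cap height above z0
true_grade = (np.pi * h**2 * (3 * R - h) / 3) / (4 / 3 * np.pi * R**3)
lw_mean = np.sum(frac * 2 * half) / np.sum(2 * half)

print(f"true volumetric grade:        {true_grade:.3f}")
print(f"length-weighted chord grade:  {lw_mean:.3f}")
print(f"apparently liberated chords:  {np.mean((frac == 0) | (frac == 1)):.3f}")
```

The last line shows the core problem: many intercepts through a clearly composite particle appear fully liberated, so the intercept distribution cannot be read directly as the particle composition distribution.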
Abstract:
Software Configuration Management is the discipline of managing large collections of software development artefacts from which software products are built. Software configuration management tools typically deal with artefacts at fine levels of granularity - such as individual source code files - and assist with coordination of changes to such artefacts. This paper describes a lightweight tool designed to be used on top of a traditional file-based configuration management system. The add-on tool enables users to flexibly define new hierarchical views of product structure, independent of the underlying artefact-repository structure. The tool extracts configuration and change data with respect to the user-defined hierarchy, giving improved visibility of how individual subsystems have changed. The approach yields a range of new capabilities for build managers and for verification and validation teams. The paper includes a description of our experience using the tool in an organization that builds large embedded software systems.
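The core idea can be sketched in a few lines: map a user-defined product hierarchy onto repository paths and roll per-file change records up to subsystem level. The hierarchy, paths and change log below are hypothetical, not the tool's actual data model.

```python
from collections import defaultdict

# Minimal sketch of a user-defined hierarchical view over a file-based
# repository: subsystem views are declared as path prefixes, and change
# sets touching those paths are rolled up per view. All names are
# hypothetical illustrations.

hierarchy = {
    "Engine/Control": ["src/control/", "src/pid/"],
    "Engine/Diagnostics": ["src/diag/"],
    "UI": ["src/ui/"],
}

change_log = [  # (file path, change-set id)
    ("src/control/loop.c", "cs-101"),
    ("src/pid/tune.c", "cs-101"),
    ("src/diag/log.c", "cs-102"),
    ("src/ui/menu.c", "cs-103"),
    ("src/control/loop.c", "cs-103"),
]

changes_per_view = defaultdict(set)
for path, cs in change_log:
    for view, prefixes in hierarchy.items():
        if any(path.startswith(pre) for pre in prefixes):
            changes_per_view[view].add(cs)

for view, css in sorted(changes_per_view.items()):
    print(f"{view}: {len(css)} change-set(s) -> {sorted(css)}")
```

Because the views are defined independently of the repository layout, a build manager can reorganise the reporting hierarchy without touching the underlying artefact store.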
Abstract:
A comparison is made between Arrhenius and transition-state analyses of the temperature dependence of rate constants reported in four published biosensor studies. Although the Eyring transition-state theory seemingly affords a more definitive characterization of the activation energetics, the analysis is equivocal because of inherent assumptions about the reaction mechanism and the magnitude of the transmission coefficient. In view of those uncertainties, it is suggested that a preferable course of action is to revert to the empirical Arrhenius analysis, in terms of an energy of activation and a preexponential factor. The former is essentially equivalent to the enthalpy of activation, whereas the magnitude of the latter directly indicates the extent of the disparity between the frequency of product formation and the universal frequency factor (temperature multiplied by the ratio of the Boltzmann and Planck constants), and hence the likelihood of a kinetic mechanism more complicated than that encompassed by the Eyring transition-state theory. (C) 2004 Elsevier Inc. All rights reserved.
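For reference, the two rate laws being compared take the standard forms (stated here for context, not quoted from the paper):

$$k = A\,e^{-E_a/RT} \quad \text{(Arrhenius)}, \qquad k = \kappa\,\frac{k_B T}{h}\,e^{\Delta S^{\ddagger}/R}\,e^{-\Delta H^{\ddagger}/RT} \quad \text{(Eyring)},$$

with $E_a \approx \Delta H^{\ddagger} + RT$, so the activation energy is essentially the activation enthalpy. The universal frequency factor $k_B T/h$ is about $6.2 \times 10^{12}\ \mathrm{s^{-1}}$ at 298 K; a fitted preexponential factor $A$ far from this value signals a transmission coefficient $\kappa \neq 1$ or a mechanism more complicated than a single-barrier transition state.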
Abstract:
The problem of distributed compression for correlated quantum sources is considered. The classical version of this problem was solved by Slepian and Wolf, who showed that distributed compression could take full advantage of redundancy in the local sources created by the presence of correlations. Here it is shown that, in general, this is not the case for quantum sources, by proving a lower bound on the rate sum for irreducible sources of product states which is stronger than the one given by a naive application of Slepian-Wolf. Nonetheless, strategies taking advantage of correlation do exist for some special classes of quantum sources. For example, Devetak and Winter demonstrated the existence of such a strategy when one of the sources is classical. Optimal nontrivial strategies for a different extreme, sources of Bell states, are presented here. In addition, it is explained how distributed compression is connected to other problems in quantum information theory, including information-disturbance questions, entanglement distillation and quantum error correction.
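For comparison, the classical Slepian-Wolf result states that correlated sources $X_1, X_2$ can be compressed by separate encoders at any rates satisfying

$$R_1 \ge H(X_1 \mid X_2), \qquad R_2 \ge H(X_2 \mid X_1), \qquad R_1 + R_2 \ge H(X_1, X_2),$$

so the sum rate can reach the joint entropy even though the encoders act independently. A naive quantum analogue would bound the sum rate by the joint von Neumann entropy $S(\rho_{AB})$; the result described in the abstract is that irreducible product-state sources in general demand strictly more.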
Abstract:
Product warranty is an important part of new product marketing and sales. Offering a warranty implies additional costs in the form of warranty servicing costs, and product reliability has a serious impact on these costs. Effective management of product reliability must therefore take into account the link between warranty and reliability. This paper deals with this topic, develops a framework for the effective management of product reliability, reviews the relevant literature and defines topics for future research.
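One standard way to formalise the warranty-reliability link (an illustrative model, not necessarily the paper's framework): under a free-repair warranty of length $W$ with minimal repairs, failures follow the product's failure-intensity function $\lambda(t)$, and the expected servicing cost per unit sold is

$$E[C(W)] = c_r \int_0^W \lambda(t)\,dt,$$

where $c_r$ is the average cost of a repair. Improving reliability lowers $\lambda(t)$ and hence the warranty cost, which is the trade-off a reliability-management framework must balance against development and manufacturing cost.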
Abstract:
Despite the wide application of cellulose-azure as a substrate for measuring cellulase activity, there is no established quantification of hydrolysis rate or enzymatic activity using this substrate. The aim of this study was to quantify the hydrolysis rate in terms of product formation and dye release from cellulose-azure. The amount of dye released was compared with glucose production and enzyme concentration. It is shown that the observed lack of correlation can be due to (1) repression of azure-dye release when the dye accumulates, (2) the presence of degradable substrates in the cellulase powder, which inflates the glucose measurements, and (3) degradation of cellulose that is not linked to the dye in the cellulose-azure. Based on this lack of correlation, it is recommended that cellulose-azure be applied only in assays whose aim is to compare the relative activities of different enzymatic systems. (c) 2005 Elsevier B.V. All rights reserved.
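The correlation check described in the abstract amounts to regressing released dye against measured glucose. A minimal sketch follows; all numbers are hypothetical illustrations, not the study's data.

```python
import numpy as np

# Regress released azure dye (absorbance) against measured glucose for a
# hypothetical cellulase dilution series; a low R^2 would mirror the
# abstract's finding that dye release does not track glucose production.

glucose = np.array([0.10, 0.25, 0.48, 0.61, 0.90, 1.20])  # mg/mL (made up)
dye_abs = np.array([0.05, 0.18, 0.22, 0.40, 0.41, 0.66])  # absorbance (made up)

slope, intercept = np.polyfit(glucose, dye_abs, 1)
pred = slope * glucose + intercept
ss_res = np.sum((dye_abs - pred) ** 2)
ss_tot = np.sum((dye_abs - dye_abs.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"slope={slope:.3f}, intercept={intercept:.3f}, R^2={r2:.3f}")
```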
Abstract:
Significant advances have been made in the last decade in quantifying the process of wet granulation. The attributes of product granules from the granulation process are controlled by a combination of three groups of processes occurring in the granulator: (1) wetting and nucleation, (2) growth and consolidation, and (3) breakage and attrition. For the first two of these processes, the key controlling dimensionless groups are defined, and regime maps are presented and validated with data from tumbling and mixer granulators. Granulation is an example of particle design, and quantitative analysis requires both careful characterisation of the feed formulation and knowledge of the operating parameters. A key thesis of this paper is that the design, scale-up and operation of granulation processes can now be treated as quantitative engineering rather than a black art.
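Two of the dimensionless groups commonly used to build such regime maps in the wet-granulation literature are given below for orientation; the paper should be consulted for its exact definitions.

$$\Psi_a = \frac{3\dot{V}}{2\dot{A}\,d_d} \quad \text{(dimensionless spray flux; wetting and nucleation)}, \qquad St_{\mathrm{def}} = \frac{\rho_g U_c^2}{2 Y_g} \quad \text{(Stokes deformation number; growth and consolidation)},$$

where $\dot{V}$ is the volumetric binder spray rate, $\dot{A}$ the powder surface area flux through the spray zone, $d_d$ the drop diameter, $\rho_g$ the granule density, $U_c$ the characteristic collision velocity and $Y_g$ the granule dynamic yield stress.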