985 results for Process optimisation
Abstract:
Internship report presented to the Faculté des sciences infirmières for the degree of Master of Science (M.Sc.) in nursing sciences, nursing consultancy option (expertise-conseil en soins infirmiers).
Abstract:
A chitinolytic fungus, Beauveria bassiana, was isolated from marine sediment, and the significant process parameters influencing chitinase production in solid-state fermentation on wheat bran were optimised. The organism was strongly alkalophilic and produced maximum chitinase at pH 9.20. The NaCl and colloidal chitin requirements varied with the type of moistening medium used. Vegetative (mycelial) inoculum was more suitable than conidial inoculum for obtaining the maximal enzyme yield. The addition of phosphate and yeast extract enhanced the chitinase yield. After optimisation, the maximum enzyme yield was 246.6 units per gram of initial dry substrate (U gIDS⁻¹). This is the first report of the production of chitinase from a marine fungus.
Nonlinear system identification using particle swarm optimisation tuned radial basis function models
Abstract:
A novel particle swarm optimisation (PSO) tuned radial basis function (RBF) network model is proposed for the identification of non-linear systems. At each stage of the orthogonal forward regression (OFR) model construction process, PSO is adopted to tune one RBF unit's centre vector and diagonal covariance matrix by minimising the leave-one-out (LOO) mean square error (MSE). This PSO-aided OFR automatically determines how many tunable RBF nodes are sufficient for modelling. Compared with the state-of-the-art local regularisation assisted orthogonal least squares algorithm based on the LOO MSE criterion for constructing fixed-node RBF network models, the PSO-tuned RBF model construction produces more parsimonious RBF models with better generalisation performance and is often more efficient in model construction. The effectiveness of the proposed PSO-aided OFR algorithm for constructing tunable-node RBF models is demonstrated using three real data sets.
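The core move described above, tuning a single RBF node's centre and width vector with PSO while scoring each candidate by a leave-one-out error, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the PSO constants, the synthetic data and the single-node LOO formula (the standard PRESS residual for a one-regressor least-squares fit) are illustrative assumptions.

# Minimal sketch (not the authors' code): tune one RBF node's centre and width
# vector with a basic particle swarm, scoring each candidate by the exact
# leave-one-out MSE of the one-node least-squares fit (PRESS residuals).
import numpy as np

def loo_mse_single_node(params, X, y):
    """LOO MSE of a one-node RBF regressor; params = [centre..., log-widths...]."""
    d = X.shape[1]
    centre, log_width = params[:d], params[d:]
    width = np.exp(log_width)                      # keep widths positive
    phi = np.exp(-np.sum(((X - centre) / width) ** 2, axis=1))
    denom = phi @ phi + 1e-12
    w = (phi @ y) / denom                          # least-squares output weight
    resid = y - w * phi
    h = phi ** 2 / denom                           # leverage (hat-matrix diagonal)
    loo_resid = resid / (1.0 - h + 1e-12)          # PRESS residuals
    return np.mean(loo_resid ** 2)

def pso_minimise(obj, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO over the box [lo, hi]."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, size=(n_particles, lo.size))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([obj(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Toy usage with synthetic data (illustrative only).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(80, 2))
y = np.sin(3 * X[:, 0]) * np.exp(-X[:, 1] ** 2) + 0.05 * rng.standard_normal(80)
d = X.shape[1]
lo = np.concatenate([X.min(axis=0), np.full(d, np.log(0.05))])
hi = np.concatenate([X.max(axis=0), np.full(d, np.log(2.0))])
best, best_loo = pso_minimise(lambda p: loo_mse_single_node(p, X, y), lo, hi)
print("tuned node parameters:", best, "LOO MSE:", best_loo)

In the full OFR procedure this tuning step would be repeated node by node on the residual left by the previously selected nodes, with construction stopping once the LOO criterion no longer improves.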
Abstract:
We develop a particle swarm optimisation (PSO) aided orthogonal forward regression (OFR) approach for constructing radial basis function (RBF) classifiers with tunable nodes. At each stage of the OFR construction process, the centre vector and diagonal covariance matrix of one RBF node are determined efficiently by minimising the leave-one-out (LOO) misclassification rate (MR) using a PSO algorithm. Compared with the state-of-the-art regularisation assisted orthogonal least squares algorithm based on the LOO MR for selecting fixed-node RBF classifiers, the proposed PSO-aided OFR algorithm for constructing tunable-node RBF classifiers offers significant advantages in terms of better generalisation performance and smaller model size, and imposes lower computational complexity in the classifier construction process. Moreover, the proposed algorithm does not have any hyperparameter that requires costly tuning based on cross-validation.
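Both this abstract and the previous one rely on leave-one-out criteria that can be evaluated without refitting the model once per left-out sample. For models that are linear in their output weights, the standard identity that makes this cheap is the PRESS residual; the OFR-specific recursive form used in these papers may differ in detail, so the expressions below are only the generic versions.

\[
  e_k^{(-k)} = \frac{y_k - \hat{y}_k}{1 - H_{kk}},
  \qquad
  \mathrm{LOO\ MSE} = \frac{1}{N}\sum_{k=1}^{N}\bigl(e_k^{(-k)}\bigr)^{2},
\]
where \(H = \Phi(\Phi^{\top}\Phi)^{-1}\Phi^{\top}\) is the hat matrix of the current regressor matrix \(\Phi\) and \(\hat{y}_k\) is the ordinary fitted value. For classification with labels \(y_k \in \{-1,+1\}\), the LOO misclassification rate counts the samples for which \(\operatorname{sign}(y_k - e_k^{(-k)}) \neq y_k\), since \(y_k - e_k^{(-k)}\) is exactly the prediction for sample \(k\) when it is left out of the fit.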
Abstract:
A novel algorithm for solving nonlinear discrete-time optimal control problems with model-reality differences is presented. The technique uses Dynamic Integrated System Optimisation and Parameter Estimation (DISOPE), which has been designed to achieve the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimisation procedure. A method based on Broyden's ideas is used to approximate the required derivative trajectories. Ways of handling constraints on both manipulated and state variables are described. Further, a method is introduced for coping with batch-to-batch dynamic variations in the process, which are common in practice. It is shown that the iterative procedure associated with the algorithm naturally suits applications to batch processes. The algorithm is successfully applied to a benchmark problem consisting of the input profile optimisation of a fed-batch fermentation process.
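The derivative approximation mentioned above rests on Broyden's rank-one secant update, which keeps a Jacobian estimate consistent with successive input/output differences. The sketch below shows the generic update only; the exact variant embedded in DISOPE, and the map it is applied to, are not specified in the abstract, so the example function and loop are purely illustrative.

# Minimal sketch of Broyden's rank-one (secant) update for a Jacobian estimate,
# the generic form of the idea the abstract refers to; the variant used inside
# DISOPE may differ.
import numpy as np

def broyden_update(B, dx, df):
    """Correct B so that the secant condition B_new @ dx = df holds exactly."""
    dx, df = dx.reshape(-1), df.reshape(-1)
    return B + np.outer(df - B @ dx, dx) / (dx @ dx)

# Toy usage: track an approximate Jacobian of f along a sequence of points.
rng = np.random.default_rng(0)
f = lambda x: np.array([x[0] ** 2 + x[1], np.sin(x[1])])   # illustrative map
x = np.array([1.0, 0.5])
B = np.eye(2)                                   # crude initial Jacobian estimate
for _ in range(20):
    x_new = x + 0.05 * rng.standard_normal(2)
    B = broyden_update(B, x_new - x, f(x_new) - f(x))
    x = x_new
print("estimated Jacobian (approximate):\n", B)
print("true Jacobian at final point:\n", np.array([[2 * x[0], 1.0], [0.0, np.cos(x[1])]]))

Each update is a cheap rank-one correction that enforces the secant condition along the latest step, which is why such updates are a natural way to supply derivative information along measured trajectories when the process model itself is imperfect.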
Abstract:
In most commercially available predictive control packages, economic optimisation and predictive control are kept separate, even though both algorithms may be part of the same software system. In this article, that layered approach is compared with two alternative approaches in which the economic objectives are included directly in the predictive control algorithm. Simulations are carried out using the Tennessee Eastman process model.
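In generic MPC notation (not necessarily the formulations compared in the article), the contrast can be sketched as follows: in the layered scheme an upper-level economic optimiser fixes steady-state targets that the controller merely tracks, whereas in the integrated scheme the economic stage cost enters the controller's objective directly.

\[
  \text{layered:}\quad
  \min_{u_0,\dots,u_{N-1}} \sum_{k=0}^{N-1}
  \bigl( \lVert y_k - y^{*}\rVert_Q^{2} + \lVert u_k - u^{*}\rVert_R^{2} \bigr),
  \qquad (y^{*},u^{*}) \ \text{fixed by the economic layer};
\]
\[
  \text{integrated:}\quad
  \min_{u_0,\dots,u_{N-1}} \sum_{k=0}^{N-1}
  \bigl( \ell_{\mathrm{econ}}(y_k,u_k) + \lVert \Delta u_k\rVert_R^{2} \bigr),
  \qquad \Delta u_k = u_k - u_{k-1}.
\]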
Abstract:
This paper describes recent developments and improvements made to the variable-radius niching technique called Dynamic Niche Clustering (DNC). DNC is a fitness-sharing-based technique that employs a separate population of overlapping fuzzy niches with independent radii, which operate in the decoded parameter space and are maintained alongside the normal GA population. We describe a speed-up that can be applied to the initial generation and greatly reduces the complexity of the initial stages. A split operator is also introduced that is designed to counteract the excessive growth of niches, and it is shown that this improves the overall robustness of the technique. Finally, the effect of local elitism is documented and compared with the performance of the basic DNC technique on a selection of 2D test functions. The paper concludes with a view to future work on the technique.
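DNC builds on fitness sharing, in which each individual's raw fitness is divided by a niche count so that crowded peaks are penalised. The sketch below shows only the classic fixed-radius sharing function; DNC's distinguishing features (a separate population of fuzzy niches with independently adapting radii, the split operator and local elitism) are not reproduced here, and the test landscape and constants are illustrative.

# Minimal sketch of classic fitness sharing with a fixed sharing radius, the
# idea DNC extends with per-niche adaptive radii.
import numpy as np

def shared_fitness(pop, fitness, sigma_share=0.1, alpha=1.0):
    """Divide each raw fitness by its niche count so crowded peaks are penalised."""
    d = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)   # pairwise distances
    sh = np.where(d < sigma_share, 1.0 - (d / sigma_share) ** alpha, 0.0)
    niche_count = sh.sum(axis=1)                                      # includes self (d = 0)
    return fitness / niche_count

# Toy usage on a bimodal 1-D landscape (illustrative values only).
rng = np.random.default_rng(0)
pop = rng.uniform(0.0, 1.0, size=(50, 1))
raw = np.exp(-((pop[:, 0] - 0.25) ** 2) / 0.005) + 0.8 * np.exp(-((pop[:, 0] - 0.75) ** 2) / 0.005)
print(shared_fitness(pop, raw)[:5])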
Abstract:
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of the implications for researchers and practitioners.
Abstract:
The high ionic calcium concentration and the absence of caseinomacropeptides (CMP) in acid whey could influence the production of an angiotensin-I-converting enzyme (ACE)-inhibitory hydrolysate, and its bioactivity, through the application of the integrative process. Therefore, the aim of the present study was to produce a hydrolysate from acid whey by applying the integrative process. Process performance was evaluated on the basis of protein adsorption capacity and conversion in relation to ACE-inhibitory activity (ACEi%) and ionic calcium concentration. Hydrolysates with highly potent biological activity were produced (IC50 = 206-353 μg mL⁻¹). The high ionic calcium concentration in acid whey contributed to the ACE-inhibitory activity. However, low β-lactoglobulin adsorption and conversion were observed. Optimisation of the resin volume increased the adsorption of β-lactoglobulin significantly, but with lower selectivity. The changes in conversion were not significant even at higher enzyme concentrations. Several ACE inhibitors derived from β-lactoglobulin that had previously been identified in sweet whey hydrolysates, such as IIAEKT, IIAE, IVTQ, LIVTQ, LIVTQT, LDAQ and LIVT, were found. New peptides, SNICNI and ECCHGD, derived from α-lactalbumin and BSA respectively, were identified.
Abstract:
Increasing costs and competitive business strategies are pushing sawmill enterprises to optimize their process management. Organizational decisions mainly concentrate on performance and on reducing operational costs in order to maintain profit margins. Despite many efforts, effective utilization of resources, optimal planning and maximum productivity in sawmills remain challenging for the sawmill industry. Many researchers have proposed simulation models combined with optimization techniques to address problems of integrated logistics optimization. Combining simulation and optimization identifies an optimal strategy by simulating the complex behaviour of the system under consideration, including its objectives and constraints. During the past decade, a large number of studies have been conducted that simulate operational inefficiencies in order to find optimal solutions. This paper reviews recent developments and challenges associated with simulation and optimization techniques. The review is intended to provide a foundation for further work on optimizing sawmill yard operations.
Abstract:
This study investigates the use of a simplified model in the transient analysis of a thermal cooling process. In such a process the external thermal resistance between the surface and the surroundings is high compared with the internal thermal resistance of the system, so the external resistance controls the heat transfer; in this case the Biot number is lower than 0.1. Aluminium reels fitted with internal instrumentation provided experimental results for the thermal cooling process. Based on the experimental data, a simplified model was used to determine the film coefficient of the process, and the experimental and theoretical results were then compared. The effect of changing the airflow direction on the cooling process was also investigated, aiming at optimisation of the process time.
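The simplified model invoked here is the standard lumped-capacitance description valid for Bi < 0.1, under which the dimensionless temperature decays as exp(-hAt/(ρVc)). The sketch below shows how the film coefficient h could be recovered from a measured cooling curve by fitting that exponential; the material properties, geometry and cooling data are illustrative placeholders, not the aluminium-reel measurements from the paper.

# Lumped-capacitance (Bi < 0.1) estimate of the film coefficient h from a
# cooling curve: fit the slope of ln((T - T_inf)/(T0 - T_inf)) = -(h A/(rho V c)) t.
# All numbers below are illustrative placeholders.
import numpy as np

def film_coefficient(t, T, T_inf, rho, c, V, A):
    """Least-squares slope of the log temperature ratio gives h [W m-2 K-1]."""
    theta = (T - T_inf) / (T[0] - T_inf)
    slope = np.polyfit(t, np.log(theta), 1)[0]        # slope = -h A / (rho V c)
    return -slope * rho * c * V / A

# Illustrative data: synthetic cooling curve for a small aluminium body.
rho, c = 2700.0, 900.0            # aluminium density [kg/m3], specific heat [J/(kg K)]
V, A = 1e-3, 0.06                 # volume [m3] and surface area [m2] (placeholders)
T_inf, T0, h_true = 25.0, 180.0, 40.0
t = np.linspace(0.0, 3600.0, 50)
T = T_inf + (T0 - T_inf) * np.exp(-h_true * A * t / (rho * V * c))
print("recovered h:", film_coefficient(t, T, T_inf, rho, c, V, A))   # ~40 W m-2 K-1

With these placeholder values the implied Biot number h(V/A)/k is of the order of 10⁻³ for aluminium, comfortably below the 0.1 threshold quoted in the abstract.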
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Amino acids are readily metabolized by Streptomyces clavuligerus during the production of clavulanic acid (CA) using glycerol as the main carbon and energy source. However, only a few amino acids, such as arginine and ornithine, are favorable for CA biosynthesis. The aim of this work was to optimize the glycerol:ornithine molar ratio in a feed medium containing only these compounds so as to maximize CA production in continuous cultivation. A minimum number of experiments was performed by means of a simple two-level full-factorial central composite design to investigate the combined effect of glycerol and ornithine feeding on the CA concentration during the intermittent and continuous process in shake flasks. Statistical analysis of the experimental data using response surface methodology showed that a glycerol-to-ornithine molar ratio of approximately 40:1 in the feed medium resulted in the highest CA concentration when the fermentation was stopped. Under these optimized conditions, in bench-scale fermentor runs, the CA concentration reached more than double the concentration obtained in shake-flask runs.
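The response surface methodology referred to here fits a second-order polynomial to the central composite design results and reads the optimum off that fitted surface. In generic form (the fitted coefficients reported in the paper are not reproduced):

\[
  \hat{y} = \beta_0 + \beta_1 x_1 + \beta_2 x_2
          + \beta_{11} x_1^{2} + \beta_{22} x_2^{2} + \beta_{12}\, x_1 x_2 ,
\]
where \(x_1\) and \(x_2\) are the coded glycerol and ornithine feed levels and \(\hat{y}\) is the predicted CA concentration; the optimum is the stationary point \(\partial\hat{y}/\partial x_i = 0\), reported in this study to correspond to a glycerol:ornithine molar ratio of about 40:1.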
Abstract:
In this work, the transesterification of jupati (Raphia taedigera Mart.) oil using ethanol and an acid catalyst was examined. Biodiesel production was planned using a central composite design (CCD). Catalyst concentration (1 to 4.21%), temperature (70-80 °C) and alcohol-to-oil molar ratio (6:1 to 13.83:1) were varied, with ester content, viscosity and yield as the response variables. The synthesis process was optimised using response surface methodology (RSM), resulting in the following optimal conditions for the production of jupati ethyl esters: a catalyst concentration of 3.85% at 80 °C and an alcohol-to-oil molar ratio of 10:1.
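As with the previous abstract, the RSM step amounts to fitting a quadratic surface to the design runs and maximising it within the stated factor ranges. The sketch below illustrates that workflow; the design grid and "yields" are placeholders constructed only so that the surface peaks near the conditions reported in the abstract, and are not the experimental data from the study.

# Sketch of the response-surface step: fit a quadratic model to CCD-style runs
# (catalyst %, temperature, alcohol:oil molar ratio) and maximise the predicted
# yield within the ranges given in the abstract.  Design points and response
# values below are placeholders, not the study's data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

cat = np.linspace(1.0, 4.21, 4)          # catalyst concentration [%]
temp = np.linspace(70.0, 80.0, 3)        # temperature [deg C]
ratio = np.linspace(6.0, 13.83, 4)       # alcohol-to-oil molar ratio
X = np.array([[c, t, r] for c in cat for t in temp for r in ratio])
y = 95.0 - 4.0 * (X[:, 0] - 3.85) ** 2 - 0.1 * (X[:, 1] - 80.0) ** 2 \
         - 0.5 * (X[:, 2] - 10.0) ** 2   # placeholder "yield" peaking at the reported optimum

quad = PolynomialFeatures(degree=2)
model = LinearRegression().fit(quad.fit_transform(X), y)

def negative_yield(x):
    return -model.predict(quad.transform(x.reshape(1, -1)))[0]

bounds = [(1.0, 4.21), (70.0, 80.0), (6.0, 13.83)]
res = minimize(negative_yield, x0=np.array([2.6, 75.0, 10.0]),
               bounds=bounds, method="L-BFGS-B")
print("predicted optimum (catalyst %, T, molar ratio):", res.x)   # ~ (3.85, 80, 10)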
Abstract:
This thesis investigates combinatorial and robust optimisation models for solving railway problems. Railway applications represent a challenging area for operations research. Most problems in this context can be modelled as combinatorial optimisation problems, in which the number of feasible solutions is finite. Yet, despite the remarkable success of the field of combinatorial optimisation, current algorithmic research faces severe difficulties with highly complex and data-intensive applications such as those dealing with optimisation issues in large-scale transportation networks. One of the main issues is imperfect information. The idea of Robust Optimisation, as a way to represent and handle mathematically systems whose data are not precisely known, dates back to the 1970s. Unfortunately, none of these techniques has proved successfully applicable to one of the most complex and largest-scale transportation settings: railway systems. Railway optimisation deals with planning and scheduling problems over several time horizons, and disturbances are inevitable and severely affect the planning process. Here we focus on two compelling aspects of planning: robust planning and online (real-time) planning.