115 results for OPTIMIZATION PROCESS
at University of Queensland eSpace - Australia
Abstract:
A parallel computing environment to support optimization of large-scale engineering systems is designed and implemented on Windows-based personal computer networks, using the master-worker model and the Parallel Virtual Machine (PVM). The environment decomposes a large engineering system into a number of smaller subsystems, which are optimized in parallel on worker nodes, and coordinates the subsystem optimization results on the master node. It consists of six functional modules: the master control, the optimization model generator, the optimizer, the data manager, the monitor, and the post processor. An object-oriented design of these modules is presented. The environment supports all steps, from the generation of optimization models to their solution and visualization, on networks of computers. User-friendly graphical interfaces make it easy to define the problem and to monitor and steer the optimization process. The environment has been verified on an example of large space truss optimization. (C) 2004 Elsevier Ltd. All rights reserved.
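As a rough illustration of the master-worker coordination pattern described above, the sketch below distributes placeholder subsystem optimizations to worker processes and coordinates a shared coupling variable on the master. It uses Python's multiprocessing in place of PVM, and the subsystem model, coordination rule and names (optimize_subsystem, truss_A) are hypothetical, not taken from the environment described in the abstract.

```python
# Minimal sketch of a master-worker decomposition/coordination loop.
# Placeholder subsystem model and coordination rule; not the paper's
# PVM-based implementation.
from multiprocessing import Pool

def optimize_subsystem(subsystem):
    """Worker task: locally 'optimize' one subsystem (placeholder rule)."""
    name, design_vars, coupling = subsystem
    # Placeholder optimization: pull each design variable toward the
    # coupling target supplied by the master.
    result = {k: 0.5 * (v + coupling) for k, v in design_vars.items()}
    return name, result

def master(subsystems, iterations=10):
    """Master node: farm out subsystem optimizations, then coordinate."""
    coupling = 0.0
    with Pool() as pool:
        for _ in range(iterations):
            tasks = [(name, dv, coupling) for name, dv in subsystems.items()]
            results = dict(pool.map(optimize_subsystem, tasks))
            # Coordination step: update the shared coupling variable from
            # the subsystem results (simple average here).
            all_vals = [v for r in results.values() for v in r.values()]
            coupling = sum(all_vals) / len(all_vals)
            subsystems = results
    return subsystems, coupling

if __name__ == "__main__":
    subs = {"truss_A": {"x1": 4.0, "x2": 2.0}, "truss_B": {"x1": 1.0}}
    print(master(subs))
```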
Abstract:
Conventionally, protein structure prediction via threading relies on some nonoptimal method to align a protein sequence to each member of a library of known structures. We show how a score function (force field) can be modified so as to allow the direct application of a dynamic programming algorithm to the problem. This involves an approximation whose damage can be minimized by an optimization process during score function parameter determination. The method is compared to sequence to structure alignments using a more conventional pair-wise score function and the frozen approximation. The new method produces results comparable to the frozen approximation, but is faster and has fewer adjustable parameters. It is also free of memory of the template's original amino acid sequence, and does not suffer from a problem of nonconvergence, which can be shown to occur with the frozen approximation. Alignments generated by the simplified score function can then be ranked using a second score function with the approximations removed. (C) 1999 John Wiley & Sons, Inc.
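The key point above is that a singleton (pairwise-free) score function makes the sequence-to-structure alignment directly solvable by dynamic programming. The sketch below shows that idea with a standard Needleman-Wunsch recursion; the toy score table, gap penalty and residue/environment labels are illustrative placeholders, not the paper's fitted force field.

```python
# Sequence-to-structure alignment by dynamic programming with a singleton
# score function (no pairwise terms), which is what permits direct DP.
def align(sequence, template_envs, score, gap=-1.0):
    """Global alignment of sequence residues to template structural
    environments using the Needleman-Wunsch recursion."""
    n, m = len(sequence), len(template_envs)
    # D[i][j]: best score aligning the first i residues to the first j positions.
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap
    for j in range(1, m + 1):
        D[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = D[i-1][j-1] + score[(sequence[i-1], template_envs[j-1])]
            D[i][j] = max(match, D[i-1][j] + gap, D[i][j-1] + gap)
    return D[n][m]

# Toy singleton score: residue type (H hydrophobic / P polar) vs. environment.
toy_score = {("H", "buried"): 2.0, ("H", "exposed"): -1.0,
             ("P", "buried"): -1.0, ("P", "exposed"): 2.0}
print(align("HPPH", ["buried", "exposed", "exposed", "buried"], toy_score))
```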
Abstract:
A steady-state mathematical model for co-current spray drying was developed for sugar-rich foods, applying the glass transition temperature concept. A maltodextrin-sucrose solution was used as a sugar-rich food model. The model included mass, heat and momentum balances for a single drying droplet as well as the temperature and humidity profiles of the drying medium. A log-normal volume distribution of the droplets was generated at the exit of the rotary atomizer and discretised into a number of size bins, forming a system of non-linear first-order differential equations in the axial distance along the drying chamber. The model was used to calculate the changes in droplet diameter, density, temperature, moisture content and velocity in association with the change in air properties along the axial distance. The difference between the outlet air temperature and the glass transition temperature of the final products (ΔT) was considered an indicator of particle stickiness in the spray drying process. The calculated and experimental ΔT values were close, indicating successful validation of the model. (c) 2004 Elsevier Ltd. All rights reserved.
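To illustrate only the structure of such a model (not its actual balance equations, which are not given here), the sketch below integrates a per-bin system of first-order ODEs along the chamber axis; the drying and heating laws, rate constants and property values are placeholder assumptions.

```python
# Illustrative sketch: per-bin droplet states integrated along the chamber
# axis as first-order ODEs. Placeholder rate laws, not the paper's balances.
import numpy as np
from scipy.integrate import solve_ivp

N_BINS = 5                      # droplet size bins from the atomizer
K_DRY, K_HEAT = 0.8, 0.5        # placeholder rate constants (1/m)
T_AIR, X_EQ = 150.0, 0.03       # assumed air temperature (C), equilibrium moisture

def rhs(z, y):
    """y = [moisture_1..N, temperature_1..N] as functions of axial distance z."""
    X, T = y[:N_BINS], y[N_BINS:]
    dX = -K_DRY * (X - X_EQ)            # placeholder drying rate per bin
    dT = K_HEAT * (T_AIR - T)           # placeholder droplet heating per bin
    return np.concatenate([dX, dT])

X0 = np.full(N_BINS, 0.60)              # initial moisture (kg/kg)
T0 = np.full(N_BINS, 25.0)              # initial droplet temperature (C)
sol = solve_ivp(rhs, (0.0, 2.0), np.concatenate([X0, T0]), dense_output=True)
print("Outlet moisture per bin:", sol.y[:N_BINS, -1])
```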
Abstract:
The response of an aerobic upflow sludge blanket (AUSB) reactor system to changes in operating conditions was investigated by varying two principal operating variables: the oxygenation pressure and the flow recirculation rate. The oxygenation pressure was varied between 0 and 25 psig (gauge), while the flow recirculation rate was varied correspondingly between 1,300 and 600%. The AUSB reactor system was able to handle a volumetric loading as high as 3.8 kg total organic carbon (TOC)/m3 day, with a removal efficiency of 92%. The rate of TOC removal by the AUSB was highest at a pressure of 20 psig and decreased when the pressure was increased to 25 psig and the flow recirculation rate was reduced to 600%. The TOC removal rate also decreased when the operating pressure was reduced to 0 and 15 psig, with corresponding increases in the flow recirculation rate to 1,300 and 1,000%, respectively. Maintaining a high dissolved oxygen level and a high flow recirculation rate was found to improve the substrate removal capacity of the AUSB system. The AUSB system was extremely effective in retaining the produced biomass despite a high upflow velocity, and the overall sludge yield was only 0.24-0.32 g VSS/g TOC removed. However, the effluent TOC was relatively high due to the system's operation at a high organic loading.
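A back-of-envelope check of the quoted loading and yield figures, with a hypothetical reactor volume chosen purely for illustration:

```python
# Illustrative arithmetic only; the reactor volume is a hypothetical value.
reactor_volume_m3 = 1.0                      # assumed, not from the abstract
loading = 3.8                                # kg TOC / m^3 / day (quoted)
removal_efficiency = 0.92                    # quoted

toc_removed = loading * reactor_volume_m3 * removal_efficiency   # kg TOC/day
sludge = (0.24 * toc_removed, 0.32 * toc_removed)   # kg VSS/day (quoted yield range)
print(f"TOC removed: {toc_removed:.2f} kg/day; "
      f"sludge produced: {sludge[0]:.2f}-{sludge[1]:.2f} kg VSS/day")
```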
Abstract:
Urban growth and change present numerous challenges for planners and policy makers. Effective and appropriate strategies for managing growth and change must address issues of social, environmental and economic sustainability. Doing so in practical terms is difficult given the uncertainty associated with likely growth trends, not to mention the uncertainty in how social and environmental structures will respond to such change. An optimization-based approach is developed for evaluating growth and change based upon spatial restrictions and impact thresholds. The spatial optimization model is integrated with a cellular automata growth simulation process. Application results are presented and discussed with respect to possible growth scenarios in south east Queensland, Australia.
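As a rough sketch of how a spatially restricted cellular-automata growth step might look (the study's actual transition rules and optimization coupling are not given here), the following applies a simple neighbour-count rule subject to a restriction layer; the threshold, grid size and restriction layer are illustrative assumptions.

```python
# Minimal cellular-automata growth step with spatial restrictions.
# Illustrative transition rule, not the model from the study.
import numpy as np

def grow(urban, restricted, neighbour_threshold=3):
    """One CA step: a cell urbanises if enough of its 8 neighbours are urban
    and the cell is not restricted (e.g. protected or flood-prone land)."""
    padded = np.pad(urban, 1)
    neighbours = sum(padded[i:i + urban.shape[0], j:j + urban.shape[1]]
                     for i in range(3) for j in range(3)) - urban
    candidate = (neighbours >= neighbour_threshold) & (~restricted)
    return urban | candidate

rng = np.random.default_rng(0)
urban = rng.random((20, 20)) < 0.1           # initial urban cells
restricted = rng.random((20, 20)) < 0.2      # spatial restrictions
for _ in range(5):
    urban = grow(urban, restricted)
print("Urban cells after 5 steps:", int(urban.sum()))
```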
Abstract:
Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related and compared with previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
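One plausible reading of such an update, assuming the divergence is taken as KL(p_theta || q) with the objective f treated as an unnormalised target density and a Gaussian model density, is the score-function gradient sketch below; the test function, learning rate and sample size are illustrative, and this is not claimed to be the paper's exact algorithm.

```python
# Stochastic gradient descent on KL(p_theta || q) for a Gaussian model
# density, with q proportional to a toy objective f. Illustrative only.
import numpy as np

def f(x):
    """Toy positive objective treated as an unnormalised density."""
    return np.exp(-0.5 * (x - 3.0) ** 2)

rng = np.random.default_rng(1)
mu, log_sigma = 0.0, 0.0                     # Gaussian model parameters
lr, n_samples = 0.05, 200

for step in range(300):
    sigma = np.exp(log_sigma)
    x = rng.normal(mu, sigma, n_samples)     # population sampled from the model
    # log p_theta(x) - log f(x), up to additive constants; constants vanish
    # in expectation under the score-function gradient estimate below.
    w = (-0.5 * ((x - mu) / sigma) ** 2 - log_sigma) - np.log(f(x))
    # Score functions (log-derivatives) of the Gaussian model.
    score_mu = (x - mu) / sigma ** 2
    score_ls = ((x - mu) / sigma) ** 2 - 1.0
    mu -= lr * np.mean(w * score_mu)
    log_sigma -= lr * np.mean(w * score_ls)

print(f"model mean ~ {mu:.2f}, model sigma ~ {np.exp(log_sigma):.2f}")
```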
Abstract:
Coal-fired power generation will continue to provide energy to the world for the foreseeable future. However, this energy use is a significant contributor to increased atmospheric CO2 concentration and, hence, global warming. Capture and disposal of CO2 has received increased R&D attention in the last decade, as the technology promises to be the most cost effective for large-scale reductions in CO2 emissions. This paper addresses CO2 transport via pipeline from capture site to disposal site, in terms of system optimization, energy efficiency and overall economics. Technically, CO2 can be transported through pipelines as a gas, a supercritical fluid or a subcooled liquid. Operationally, most CO2 pipelines used for enhanced oil recovery transport CO2 as a supercritical fluid. In this paper, supercritical fluid and subcooled liquid transport are examined and compared, including their impacts on energy efficiency and cost. Using a commercially available process simulator, ASPEN PLUS 10.1, the results show that subcooled liquid transport maximizes the energy efficiency and minimizes the cost of CO2 transport over long distances under both isothermal and adiabatic conditions. Pipeline transport of subcooled liquid CO2 is ideally suited to areas of cold climate or to buried and insulated pipelines. In very warm climates, periodic refrigeration to cool the CO2 below its critical temperature of 31.1 degrees C may prove economical. Simulations have been used to determine the maximum safe pipeline distances to subsequent booster stations as a function of inlet pressure, environmental temperature and ground-level heat flux conditions. (c) 2005 Published by Elsevier Ltd.
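For a feel of what limits booster-station spacing, the sketch below makes a simple Darcy-Weisbach pressure-drop estimate for subcooled liquid CO2. This is only a back-of-envelope illustration, not the ASPEN PLUS 10.1 simulation used in the study, and the fluid properties, pipe diameter, flow rate and pressure limits are rough assumptions.

```python
# Back-of-envelope booster-station spacing from a Darcy-Weisbach balance.
# All property and pipeline values below are assumptions for illustration.
import math

rho = 950.0               # kg/m^3, assumed density of subcooled liquid CO2
mu = 1.2e-4               # Pa.s, assumed viscosity
D = 0.4                   # m, pipe diameter (assumed)
m_dot = 100.0             # kg/s, CO2 mass flow (assumed)
p_in, p_min = 10e6, 7e6   # Pa, inlet and minimum allowable pressure (assumed)

v = m_dot / (rho * math.pi * D**2 / 4)       # mean velocity
Re = rho * v * D / mu                        # Reynolds number
f = 0.3164 * Re ** -0.25                     # Blasius friction factor (smooth pipe)
dp_per_m = f * rho * v**2 / (2 * D)          # pressure gradient, Pa per metre

max_distance_km = (p_in - p_min) / dp_per_m / 1000.0
print(f"v = {v:.2f} m/s, Re = {Re:.2e}, "
      f"max distance before boosting ~ {max_distance_km:.0f} km")
```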
Abstract:
Integrated chemical-biological degradation, combining advanced oxidation by UV/H2O2 followed by aerobic biodegradation, was used to degrade C.I. Reactive Azo Red 195A, commonly used in the textile industry in Australia. An experimental design based on the response surface method was applied to evaluate the interactive effects of the influencing factors (UV irradiation time, initial hydrogen peroxide dosage and recirculation ratio of the system) on decolourisation efficiency and to optimize the operating conditions of the treatment process. The effects were determined by measuring dye concentration and soluble chemical oxygen demand (S-COD). The results showed that dye and S-COD removal were affected by all factors, both individually and interactively. Maximal colour degradation performance was predicted, and experimentally validated, with no recirculation, 30 min UV irradiation and 500 mg H2O2/L. The model predictions for colour removal, based on a three-factor/five-level Box-Wilson central composite design and response surface method analysis, were found to be very close to additional experimental results obtained under near-optimal conditions. This demonstrates the benefits of this approach in achieving good predictions while minimising the number of experiments required. (c) 2006 Elsevier B.V. All rights reserved.
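The design-and-fit workflow referred to above can be sketched as follows: generate the coded points of a three-factor Box-Wilson central composite design and fit a full quadratic response surface by least squares. The factor coding, centre-point count and synthetic response below are illustrative assumptions, not the study's measured data.

```python
# Three-factor, five-level central composite design plus quadratic fit.
# Synthetic response data; for illustration of the workflow only.
import itertools
import numpy as np

alpha = 1.682                                   # axial (star) point distance
factorial = list(itertools.product([-1, 1], repeat=3))
axial = [p for i in range(3) for p in
         (tuple(alpha if j == i else 0 for j in range(3)),
          tuple(-alpha if j == i else 0 for j in range(3)))]
centre = [(0, 0, 0)] * 6                        # replicated centre points
X = np.array(factorial + axial + centre, dtype=float)

def quadratic_terms(x):
    """Full quadratic model: intercept, linear, squares, two-way interactions."""
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x1, x2*x2, x3*x3, x1*x2, x1*x3, x2*x3]

A = np.array([quadratic_terms(x) for x in X])
# Synthetic decolourisation response (not measured data).
y = (80 - 5*X[:, 0] + 8*X[:, 1] - 3*X[:, 2]**2
     + np.random.default_rng(2).normal(0, 1, len(X)))
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("Fitted quadratic coefficients:", np.round(coeffs, 2))
```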
Abstract:
The results presented in this report form part of a larger global study on the major issues in BPM. Only one part of the larger study is reported here, viz. interviews with BPM experts. Interviews with BPM tool vendors, together with focus groups involving user organizations, are continuing in parallel and will set the groundwork for the identification of BPM issues on a global scale via a survey (including a Delphi study). Through this multi-method approach, we identify four distinct sets of outcomes. First, as is the focus of this report, we identify the BPM issues as perceived by BPM experts. Second, the research design allows us to gain insight into the opinions of organizations deploying BPM solutions. Third, we obtain an understanding of organizations' misconceptions of BPM technologies, as confronted by BPM tool vendors. Last, we seek to gain an understanding of BPM issues on a global scale, together with knowledge of matters of concern. This final outcome aims to produce an industry-driven research agenda that will inform practitioners and, in particular, the research community worldwide on issues and challenges that are prevalent or emerging in BPM and related areas.
Abstract:
We investigate analytically the first and the second law characteristics of fully developed forced convection inside a porous-saturated duct of rectangular cross-section. The Darcy-Brinkman flow model is employed. Three different types of thermal boundary conditions are examined. Expressions for the Nusselt number, the Bejan number, and the dimensionless entropy generation rate are presented in terms of the system parameters. The conclusions of this analytical study will make it possible to compare, evaluate, and optimize alternative rectangular duct design options in terms of heat transfer, pressure drop, and entropy generation. (c) 2006 Elsevier Ltd. All rights reserved.
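For reference, entropy-generation analyses of this kind typically start from the standard local entropy generation rate and Bejan number definitions below (general forms only, written here on the assumption that the paper's expressions specialise them to the Darcy-Brinkman model and the three boundary conditions considered):

```latex
% Standard definitions used in second-law (entropy generation) analyses of
% duct flow through a saturated porous medium; general forms only.
\[
  S'''_{\mathrm{gen}} \;=\;
  \underbrace{\frac{k}{T_0^{2}}\,(\nabla T)^{2}}_{\text{heat transfer}}
  \;+\;
  \underbrace{\frac{\mu}{T_0}\,\Phi \;+\; \frac{\mu}{K\,T_0}\,u^{2}}_{\text{fluid friction}},
  \qquad
  \mathrm{Be} \;=\;
  \frac{S'''_{\mathrm{gen,heat}}}{S'''_{\mathrm{gen,heat}} + S'''_{\mathrm{gen,friction}}},
\]
% where k is thermal conductivity, T_0 a reference temperature, mu viscosity,
% Phi the viscous dissipation function, K the permeability and u the velocity.
```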
Abstract:
Rupture of a light cellophane diaphragm in an expansion tube has been studied by an optical method. The influence of the light diaphragm on test flow generation has long been recognised; however, the diaphragm rupture mechanism is less well understood. It has previously been postulated that the diaphragm ruptures around its periphery due to the dynamic pressure loading of the shock wave, with the diaphragm material at some stage being removed from the flow to allow the shock to accelerate to the measured speeds downstream. The images obtained in this series of experiments are the first to show the mechanism of diaphragm rupture and mass removal in an expansion tube. A light diaphragm was impulsively loaded by a shock wave, and a series of images was recorded holographically throughout the rupture process, showing gradual destruction of the diaphragm. Features such as the diaphragm material, the interface between gases, and a reflected shock were clearly visualised. Both qualitative and quantitative aspects of the rupture dynamics were derived from the images and compared with existing one-dimensional theory.
Abstract:
Over the last decade, ambitious claims have been made in the management literature about the contribution of emotional intelligence to success and performance. Writers in this genre have predicted that individuals with high emotional intelligence perform better in all aspects of management. This paper outlines the development of a new emotional intelligence measure, the Workgroup Emotional Intelligence Profile, Version 3 (WEIP-3), which was designed specifically to profile the emotional intelligence of individuals in work teams. We applied the scale in a study of the link between emotional intelligence and two measures of team performance: team process effectiveness and team goal focus. The results suggest that the average level of emotional intelligence of team members, as measured by the WEIP-3, is reflected in the initial performance of teams. In our study, low emotional intelligence teams initially performed at a lower level than the high emotional intelligence teams. Over time, however, teams with low average emotional intelligence raised their performance to match that of teams with high emotional intelligence.