963 results for Design optimisation


Relevance: 30.00%

Abstract:

A simple and general design procedure is presented for the polarisation diversity of arbitrary conformal arrays; this procedure is based on the mathematical framework of geometric algebra and can be solved optimally using convex optimisation. Aside from being simpler and more direct than other derivations in the literature, this derivation is also entirely general in that it expresses the transformations in terms of rotors in geometric algebra which can easily be formulated for any arbitrary conformal array geometry. Convex optimisation has a number of advantages: solvers are widespread and freely available, the process generally requires a small number of iterations, and a wide variety of constraints can be readily incorporated. The study outlines a two-step approach for addressing polarisation diversity in arbitrary conformal arrays: first, the authors obtain the array polarisation patterns using geometric algebra and, second, they use a convex optimisation approach to find the optimal weights for the polarisation diversity problem. The versatility of this approach is illustrated via simulations of a 7×10 cylindrical conformal array. © 2012 The Institution of Engineering and Technology.
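As a rough illustration of the second step, a weight-design problem of this kind can be posed convexly and handed to an off-the-shelf solver. The sketch below (using cvxpy, with random complex vectors standing in for the co- and cross-polar patterns that geometric algebra would supply) suppresses cross-polar leakage subject to a distortionless co-polar response; it is not the authors' exact formulation.

```python
# Hedged sketch: convex weight design for polarisation diversity.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 70                                    # e.g. a 7x10 cylindrical conformal array
m = 5                                     # a few sample directions (assumed)
a_co = rng.standard_normal(n) + 1j * rng.standard_normal(n)                # co-polar steering vector (placeholder)
A_cross = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))   # cross-polar responses (placeholder)

w = cp.Variable(n, complex=True)
objective = cp.Minimize(cp.norm(A_cross @ w, 2))   # suppress cross-polar leakage
constraints = [a_co.conj() @ w == 1]               # distortionless co-polar response at the look direction
cp.Problem(objective, constraints).solve()
print("residual cross-polar leakage:", objective.value)
```

Additional constraints (sidelobe bounds, norm limits on the weights) would simply be appended to the constraint list, which is one of the advantages of the convex formulation noted above.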

Relevance: 30.00%

Abstract:

This paper presents a novel way to speed up the evaluation time of a boosting classifier. We make a shallow (flat) network deep (hierarchical) by growing a tree from the decision regions of a given boosting classifier. The tree provides many short paths for speeding up evaluation while preserving the reasonably smooth decision regions of the boosting classifier for good generalisation. For converting a boosting classifier into a decision tree, we formulate a Boolean optimisation problem, which has previously been studied for circuit design but limited to a small number of binary variables. In this work, a novel optimisation method is proposed, first for several tens of variables (i.e. the weak learners of a boosting classifier) and then for any larger number of weak learners by using a two-stage cascade. Experiments on synthetic and face image data sets show that the obtained tree achieves a significant speed-up over both a standard boosting classifier and Fast-exit (a previously described method for speeding up boosting classification) at the same accuracy. The proposed method, as a general meta-algorithm, is also useful for a boosting cascade, where it speeds up individual stage classifiers by different gains. The proposed method is further demonstrated on fast-moving object tracking and segmentation problems. © 2011 Springer Science+Business Media, LLC.
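For context, the Fast-exit baseline mentioned above exploits the additive form of a boosted score: evaluation can stop as soon as the remaining weak learners can no longer flip the decision. A minimal sketch of that idea (with toy decision stumps, not the paper's tree-growing algorithm) is given below.

```python
# Hedged sketch of early-exit boosting evaluation versus full evaluation.
import numpy as np

def boosted_score(x, weak_learners, alphas):
    """Full evaluation: sign of the weighted sum of all weak-learner outputs."""
    return np.sign(sum(a * h(x) for h, a in zip(weak_learners, alphas)))

def fast_exit_score(x, weak_learners, alphas):
    """Early-exit evaluation: stop once the outcome is already decided."""
    remaining = float(np.sum(alphas))        # assumes each h(x) is in {-1, +1}
    score = 0.0
    for h, a in zip(weak_learners, alphas):
        score += a * h(x)
        remaining -= a
        if abs(score) > remaining:           # later terms cannot change the sign
            break
    return np.sign(score)

# Toy usage: three decision stumps on a scalar feature.
weak_learners = [lambda x, t=t: 1.0 if x > t else -1.0 for t in (0.2, 0.5, 0.8)]
alphas = np.array([0.9, 0.6, 0.3])
print(fast_exit_score(0.95, weak_learners, alphas), boosted_score(0.95, weak_learners, alphas))
```

The tree-growing approach in the paper goes further than this per-sample early exit by restructuring the weak learners into hierarchical paths, but the additive-score property is what makes both speed-ups possible.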

Relevance: 30.00%

Abstract:

An investigation into the potential for reducing road damage by optimising the design of heavy vehicle suspensions is described. In the first part of the paper two simple mathematical models are used to study the optimisation of conventional passive suspensions. Simple modifications are made to the steel spring suspension of a tandem axle trailer and it is found experimentally that RMS dynamic tyre forces can be reduced by 15% and theoretical road damage by 5.2%. A mathematical model of an air-sprung articulated vehicle is validated, and its suspension is optimised according to the simple models. This vehicle generates about 9% less damage than the leaf-sprung vehicle in the unmodified state and it is predicted that, for the operating conditions examined, the road damage caused by this vehicle can be reduced by a further 5.4%. Finally, it is shown experimentally that computer-controlled semi-active dampers have the potential to reduce road damage by a further 5-6%, compared to an air suspension with optimum passive damping. © Copyright 1994 Society of Automotive Engineers, Inc.
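The abstract does not state its road damage criterion, but a commonly used choice in this literature is a fourth-power aggregate-force law. The illustrative sketch below computes the RMS dynamic tyre force and such a damage measure for a synthetic force history; the numbers are placeholders, not results from the paper.

```python
# Illustrative only: RMS dynamic tyre force and a fourth-power damage measure
# (an assumed criterion, not necessarily the one used in the paper).
import numpy as np

rng = np.random.default_rng(1)
static_load = 40e3                                   # N, assumed static tyre force
dynamic = 4e3 * rng.standard_normal(10_000)          # N, simulated dynamic component
tyre_force = static_load + dynamic

rms_dynamic = np.sqrt(np.mean(dynamic ** 2))
damage_4th_power = np.mean(tyre_force ** 4)          # aggregate fourth-power criterion
print(f"RMS dynamic force: {rms_dynamic:.0f} N, 4th-power damage: {damage_4th_power:.3e}")
```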

Relevance: 30.00%

Abstract:

The most common approach to decision making in multi-objective optimisation with metaheuristics is a posteriori preference articulation. Increased model complexity and a gradual increase of optimisation problems with three or more objectives have revived an interest in progressively interactive decision making, where a human decision maker interacts with the algorithm at regular intervals. This paper presents an interactive approach to multi-objective particle swarm optimisation (MOPSO) using a novel technique to preference articulation based on decision space interaction and visual preference articulation. The approach is tested on a 2D aerofoil design case study and comparisons are drawn to non-interactive MOPSO. © 2013 IEEE.
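One plausible mechanism for decision-space interaction, assumed here purely for illustration rather than taken from the paper, is to restrict each particle's global guide to archive members inside a user-specified region of the design space:

```python
# Hedged sketch: MOPSO velocity/position update with a preference-restricted leader.
import numpy as np

rng = np.random.default_rng(2)

def pso_step(pos, vel, pbest, archive, box_lo, box_hi, w=0.5, c1=1.5, c2=1.5):
    """One MOPSO update; the leader is drawn only from the user-preferred box."""
    inside = np.all((archive >= box_lo) & (archive <= box_hi), axis=1)
    leaders = archive[inside] if inside.any() else archive     # fall back to the full archive
    gbest = leaders[rng.integers(len(leaders))]                # random leader from the preferred region
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    return pos + vel, vel

# Toy usage: 10 particles in a 2-D design space with a preferred box [0.4, 0.6]^2.
pos = rng.random((10, 2)); vel = np.zeros_like(pos); archive = rng.random((5, 2))
pos, vel = pso_step(pos, vel, pos.copy(), archive, np.array([0.4, 0.4]), np.array([0.6, 0.6]))
```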

Relevance: 30.00%

Abstract:

We are developing a wind turbine blade optimisation package, CoBOLDT (COmputational Blade Optimisation and Load Deflation Tool), for the optimisation of large horizontal-axis wind turbines. The core consists of the Multi-Objective Tabu Search (MOTS), which controls a spline parameterisation module, fast geometry generation and a stationary Blade Element Momentum (BEM) code to optimise an initial wind turbine blade design. The objective functions we investigate are the Annual Energy Production (AEP) and the flapwise blade root bending moment (MY0) for a stationary wind speed of 50 m/s. For this task we use nine parameters which define the blade chord, the blade twist (four parameters each) and the blade radius. Throughout the optimisation a number of binary constraints are defined to limit the noise emission, to allow for transportation on land and to control the aerodynamic conditions during all phases of turbine operation. The test case shows that MOTS is capable of finding enhanced designs quickly and efficiently and provides a rich, well-explored Pareto front for the designer to choose from. The optimised blade design could improve the AEP of the initial blade by 5% with the same flapwise root bending moment, or reduce MY0 by 7.5% with the original energy yield. Due to the fast runtime of the order of 10 seconds per design, a huge number of optimisation iterations is possible without the need for a large computing cluster. This also allows for increased design flexibility through the introduction of more parameters per blade function or parameterisation of the airfoils in future. © 2012 by Nordex Energy GmbH.
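For reference, the Pareto bookkeeping that underlies a multi-objective search such as MOTS can be sketched as follows, with AEP to be maximised and MY0 to be minimised; the numerical values are toy data, not results from the paper.

```python
# Minimal sketch of Pareto dominance and archive maintenance for (AEP, MY0) designs.
def dominates(a, b):
    """a, b are (AEP, MY0) tuples; higher AEP and lower MY0 are better."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def update_pareto_front(front, candidate):
    """Insert candidate into the non-dominated archive, pruning dominated designs."""
    if any(dominates(existing, candidate) for existing in front):
        return front
    return [d for d in front if not dominates(candidate, d)] + [candidate]

front = []
for design in [(7.1, 950.0), (7.4, 980.0), (7.0, 990.0), (7.4, 940.0)]:  # toy (AEP, MY0) pairs
    front = update_pareto_front(front, design)
print(front)   # only the non-dominated designs remain
```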

Relevance: 30.00%

Abstract:

We are developing a wind turbine blade optimisation package, CoBOLDT (COmputational Blade Optimisation and Load Deflation Tool), for the optimisation of large horizontal-axis wind turbines. The core consists of the Multi-Objective Tabu Search (MOTS), which controls a spline parameterisation module, fast geometry generation and a stationary Blade Element Momentum (BEM) code to optimise an initial wind turbine blade design. The objective functions we investigate are the Annual Energy Production (AEP) and the flapwise blade root bending moment (MY0) for a stationary wind speed of 50 m/s. For this task we use nine parameters which define the blade chord, the blade twist (four parameters each) and the blade radius. Throughout the optimisation a number of binary constraints are defined to limit the noise emission, to allow for transportation on land and to control the aerodynamic conditions during all phases of turbine operation. The test case shows that MOTS is capable of finding enhanced designs quickly and efficiently and provides a rich, well-explored Pareto front for the designer to choose from. The optimised blade design could improve the AEP of the initial blade by 5% with the same flapwise root bending moment, or reduce MY0 by 7.5% with the original energy yield. Due to the fast runtime of the order of 10 seconds per design, a huge number of optimisation iterations is possible without the need for a large computing cluster. This also allows for increased design flexibility through the introduction of more parameters per blade function or parameterisation of the airfoils in future. © 2012 AIAA.

Relevance: 30.00%

Abstract:

This thesis describes work carried out on the design of new routes to a range of bisindolylmaleimide and indolo[2,3-a]carbazole analogs, and investigation of their potential as successful anti-cancer agents. Following initial investigation of classical routes to indolo[2,3-a]pyrrolo[3,4-c]carbazole aglycons, a new strategy employing base-mediated condensation of thiourea and guanidine with a bisindolyl β-ketoester intermediate afforded novel 5,6-bisindolylpyrimidin-4(3H)-ones in moderate yields. Chemical diversity within this H-bonding scaffold was then studied by substitution with a panel of biologically relevant electrophiles, and by reductive desulfurisation. Optimisation of difficult heterogeneous literature conditions for oxidative desulfurisation of thiouracils was also accomplished, enabling a mild route to a novel 5,6-bisindolyluracil pharmacophore to be developed within this work. The oxidative cyclisation of selected acyclic bisindolyl systems to form a new planar class of indolo[2,3-a]pyrimido[5,4-c]carbazoles was also investigated. Successful conditions for this transformation, as well as the limitations currently prevailing for this approach, are discussed. Synthesis of 3,4-bisindolyl-5-aminopyrazole as a potential isostere of bisindolylmaleimide agents was also achieved, along with a comprehensive derivatisation study, in order to probe the chemical space for potential protein backbone H-bonding interactions. Synthesis of a related 3,4-arylindolyl-5-aminopyrazole series was also undertaken, based on identification of potent kinase inhibition within a closely related heterocyclic template. Following synthesis of approximately 50 novel compounds with a diversity of H-bonding enzyme-interacting potential within these classes, biological studies confirmed that significant topo II inhibition was present for 9 lead compounds, in previously unseen pyrazolo[1,5-a]pyrimidine, indolo[2,3-c]carbazole and branched S,N-disubstituted thiouracil derivative series. NCI-60 cancer cell line growth inhibition data for 6 representative compounds also revealed interesting selectivity differences between each compound class, while a new pyrimido[5,4-c]carbazole agent strongly inhibited cancer cell division at 10 µM, with appreciable cytotoxic activity observed across several tumour types.

Relevance: 30.00%

Abstract:

In this PhD study, mathematical modelling and optimisation of granola production has been carried out. Granola is an aggregated food product used in breakfast cereals and cereal bars. It is a baked crispy food product typically incorporating oats, other cereals and nuts bound together with a binder, such as honey, water and oil, to form a structured unit aggregate. In this work, the design and operation of two parallel processes to produce aggregate granola products were investigated: (i) a high shear mixing granulation stage (in a designated granulator) followed by drying/toasting in an oven; (ii) a continuous fluidised bed followed by drying/toasting in an oven. In addition, the particle breakage during pneumatic conveying of granola produced by both the high shear granulator (HSG) and fluidised bed granulator (FBG) processes was examined. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig with two 45° bends and one with a 90° bend. It was observed that the least breakage occurred in the straight pipe while the most breakage occurred in the 90° bend pipe. Moreover, lower levels of breakage were observed in the two 45° bend configuration than in the 90° bend configuration. In general, increasing the impact angle increases the degree of breakage. Additionally, for the granules produced in the HSG, those produced at 300 rpm have the lowest breakage rates while the granules produced at 150 rpm have the highest breakage rates. This effect clearly shows the importance of shear history (during granule production) on breakage rates during subsequent processing. In terms of the FBG, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. A population balance model was developed to analyse the particle breakage occurring during pneumatic conveying. The population balance equations that govern this breakage process are solved using discretisation, with the Markov chain method used for the solution of the PBEs. This study found that increasing the air velocity (by increasing the air pressure to the rig) results in increased breakage among granola aggregates. Furthermore, the analysis carried out in this work shows that a greater degree of breakage of granola aggregates occurs as the bend angle increases.
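A minimal sketch of the Markov chain treatment of a discretised breakage population balance is shown below; the size classes and transition probabilities are illustrative placeholders, not fitted values from the thesis.

```python
# Hedged sketch: breakage PBE discretised into size classes and advanced as a Markov chain.
import numpy as np

P = np.array([                         # row i -> column j transition probability per conveying pass
    [0.80, 0.12, 0.05, 0.03],          # coarse granules: 20% of mass breaks into smaller classes
    [0.00, 0.85, 0.10, 0.05],
    [0.00, 0.00, 0.92, 0.08],
    [0.00, 0.00, 0.00, 1.00],          # fines cannot break further
])
m = np.array([0.6, 0.3, 0.1, 0.0])     # initial mass fraction in each size class (coarse ... fine)

for n_pass in range(1, 4):             # three passes through the conveying rig
    m = m @ P
    print(f"after pass {n_pass}: {np.round(m, 3)}")
```

Increasing breakage (e.g. from a higher air velocity or a sharper bend) would be represented by shifting more probability mass off the diagonal of the transition matrix.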

Relevance: 30.00%

Abstract:

This study considered the optimisation of granola breakfast cereal manufacturing processes by wet granulation and pneumatic conveying. Granola is an aggregated food product used as a breakfast cereal and in cereal bars. Processing of granola involves mixing the dry ingredients (typically oats, nuts, etc.) followed by the addition of a binder, which can contain honey, water and/or oil. In this work, the design and operation of two parallel wet granulation processes to produce aggregate granola products were investigated: (a) a high shear mixing granulation process followed by drying/toasting in an oven; (b) a continuous fluidised bed followed by drying/toasting in an oven. In high shear granulation, the influence of process parameters on key granule aggregate quality attributes, such as granule size distribution and textural properties of granola, was investigated. The experimental results show that the impeller rotational speed is the single most important process parameter influencing granola physical and textural properties; binder addition rate and wet massing time also have significant effects on granule properties. Increasing the impeller speed and wet massing time increases the median granule size and correlates positively with density. The combination of high impeller speed and low binder addition rate resulted in granules with the highest levels of hardness and crispness. In the fluidised bed granulation process, the effect of nozzle air pressure and binder spray rate on key aggregate quality attributes was studied. The experimental results show that a decrease in nozzle air pressure leads to a larger mean granule size. The combination of the lowest nozzle air pressure and lowest binder spray rate results in granules with the highest levels of hardness and crispness. Overall, the high shear granulation process led to larger, denser, less porous and stronger (less likely to break) aggregates than the fluidised bed process. The study also examined the particle breakage during pneumatic conveying of granola produced by both the high shear granulation and the fluidised bed granulation processes. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig with two 45° bends and one with a 90° bend. Particle breakage increases with applied pressure drop, and the 90° bend pipe results in more attrition at all conveying velocities than the other pipe geometries. Additionally, for the granules produced in the high shear granulator, those produced at the highest impeller speed, while being the largest, also have the lowest levels of proportional breakage, while smaller granules produced at the lowest impeller speed have the highest levels of breakage. This effect clearly shows the importance of shear history (during granule production) on breakage during subsequent processing. In terms of the fluidised bed granulation, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. Finally, a simple power law breakage model based on process input parameters was developed for both manufacturing processes. It was found suitable for predicting the breakage of granola breakfast cereal at various applied air velocities using a number of pipe configurations, taking into account shear histories.
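The general form such a power-law breakage correlation might take is sketched below; the exponent, coefficient and bend factors are assumed for illustration only, not the fitted values reported in the thesis.

```python
# Illustrative power-law breakage correlation: breakage grows as a power of the
# conveying air velocity, scaled by a factor per pipe configuration (assumed values).

def breakage_fraction(air_velocity, k=1.0e-4, n=2.0, bend_factor=1.0):
    """Predicted mass fraction broken: k * bend_factor * v**n, capped at 1."""
    return min(1.0, k * bend_factor * air_velocity ** n)

bend_factors = {"straight": 1.0, "two 45 deg bends": 1.6, "90 deg bend": 2.3}   # assumed ordering
for config, bf in bend_factors.items():
    print(config, [round(breakage_fraction(v, bend_factor=bf), 3) for v in (10, 15, 20)])
```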

Relevance: 30.00%

Abstract:

With the proliferation of mobile wireless communication and embedded systems, energy efficiency has become a major design constraint. The dissipated energy is often expressed as the product of power dissipation and input-output delay. Most electronic design automation techniques focus on optimising only one of these parameters, either power or delay. Industry-standard design flows integrate systematic methods for optimising either area or timing, while for power optimisation one often employs heuristics that are specific to a particular design. In this work we answer three questions in our quest to provide a systematic approach to joint power and delay optimisation. The first question is: how can a design flow be built which incorporates academic and industry-standard design flows for power optimisation? To address this question, we use a reference design flow provided by Synopsys and integrate academic tools and methodologies into this flow. The proposed design flow is used as a platform for analysing novel algorithms and methodologies for optimisation in the context of digital circuits. The second question is: is it possible to apply a systematic approach to power optimisation in the context of combinational digital circuits? The starting point is the selection of a suitable data structure which can easily incorporate information about delay, power and area, and which then allows optimisation algorithms to be applied. In particular we address the implications of systematic power optimisation methodologies and the potential degradation of other (often conflicting) parameters such as area or delay. Finally, the third question this thesis attempts to answer is: is there a systematic approach for multi-objective optimisation of delay and power? A delay-driven power optimisation and a power-driven delay optimisation are proposed in order to obtain balanced delay and power values. This implies that each power optimisation step is constrained not only by the decrease in power but also by the increase in delay; similarly, each delay optimisation step is governed not only by the decrease in delay but also by the increase in power. The goal is multi-objective optimisation of digital circuits where the two conflicting objectives are power and delay. The logic synthesis and optimisation methodology is based on AND-Inverter Graphs (AIGs), which represent the functionality of the circuit. The switching activities and arrival times of circuit nodes are annotated onto the AND-Inverter Graph under zero-delay and non-zero-delay models. We then introduce several reordering rules which are applied to the AIG nodes to minimise switching power or longest-path delay of the circuit at the pre-technology-mapping level. The academic Electronic Design Automation (EDA) tool ABC is used for the manipulation of AND-Inverter Graphs. We have implemented various combinatorial optimisation algorithms often used in Electronic Design Automation, such as Simulated Annealing and Uniform Cost Search. Simulated Annealing is a probabilistic metaheuristic for locating a good approximation to the global optimum of a given function in a large search space. We used Simulated Annealing to decide probabilistically between moving from one optimised solution to another, such that the dynamic power is optimised under given delay constraints and the delay is optimised under given power constraints; a good approximation to the globally optimal solution under the energy constraint is obtained. Uniform Cost Search (UCS) is a tree search algorithm used for traversing or searching a weighted tree or graph. We have used the Uniform Cost Search algorithm to find, within the AIG network, a specific node order for the application of the reordering rules. After the reordering rules are applied, the AIG network is mapped to an AIG netlist using specific library cells. Our approach combines network restructuring, AIG node reordering, dynamic power and longest-path delay estimation and optimisation, and finally technology mapping to an AIG netlist. A set of MCNC benchmark circuits and large combinational circuits of up to 100,000 gates have been used to validate our methodology. Comparisons for power and delay optimisation are made with the best synthesis scripts used in ABC. A reduction of 23% in power and 15% in delay with minimal overhead is achieved compared with the best known ABC results. Our approach has also been implemented on a number of processors with combinational and sequential components, and significant savings are achieved.
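A minimal sketch of the simulated-annealing loop described above (not the thesis implementation) is given below: moves that violate the delay constraint are rejected outright, power-reducing moves are accepted, and power-increasing moves are accepted with a temperature-dependent probability.

```python
# Hedged sketch: simulated annealing for power minimisation under a delay constraint.
import math, random

def anneal(initial, neighbour, power, delay, delay_limit,
           t0=1.0, cooling=0.95, steps=1000, seed=0):
    random.seed(seed)
    current, best = initial, initial
    t = t0
    for _ in range(steps):
        cand = neighbour(current)                 # e.g. one reordering rule applied to the AIG
        if delay(cand) > delay_limit:             # hard delay constraint
            continue
        dp = power(cand) - power(current)
        if dp < 0 or random.random() < math.exp(-dp / t):
            current = cand
            if power(current) < power(best):
                best = current
        t *= cooling                              # geometric cooling schedule
    return best

# Toy usage: "designs" are numbers, power is quadratic, delay is the magnitude.
result = anneal(10.0, lambda x: x + random.uniform(-1, 1),
                power=lambda x: x * x, delay=lambda x: abs(x), delay_limit=12.0)
print(round(result, 3))
```

The power-driven delay optimisation described above would use the same loop with the roles of the cost function and the constraint exchanged.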

Relevance: 30.00%

Abstract:

Cancer represents a leading cause of death in the developed world, inflicting tremendous suffering and plundering billions from health budgets. The traditional treatment approaches of surgery, radiotherapy and chemotherapy have achieved little in terms of cure for this deadly disease. Instead, life is prolonged for many, with dubious quality of life, only for disease to reappear with the inevitable fatal outcome. “Blue sky” thinking is required to tackle this disease and improve outcomes. The realisation and acceptance of the intrinsic role of the immune system in cancer pathogenesis, pathophysiology and treatment represented such a “blue sky” thought. Moreover, the embrace of immunotherapy, the concept of targeting immune cells rather than the tumour cells themselves, represents a paradigm shift in the approach to cancer therapy. The harnessing of immunotherapy demands radical and innovative therapeutic endeavours – endeavours such as gene and cell therapies and RNA interference, which two decades ago existed as mere concepts. This thesis straddles the frontiers of fundamental tumour immunobiology and novel therapeutic discovery, design and delivery. The work undertaken focused on two distinct immune cell populations known to undermine the immune response to cancer – suppressive T cells and macrophages. Novel RNAi mediators were designed, validated and incorporated into clinically relevant gene therapy vectors – involving a traditional lentiviral vector approach and a novel bacterial vector strategy. Chapter 2 deals with the design of novel RNAi mediators against FOXP3 – a crucial regulator of the immunosuppressive regulatory T cell population. Two mediators were tested and validated, and the superior mediator was taken forward in chapter 3. Chapter 3 deals with transposing the RNA sequence from chapter 2 into a DNA-based construct and its subsequent incorporation into a lentiviral-based vector system. The lentiviral vector was shown to mediate gene delivery in vitro and functional RNAi was achieved against FOXP3. Proof of gene delivery was further confirmed in vivo in tumour-bearing animals. Chapter 4 focuses on a different immune cell population – tumour-associated macrophages. Non-invasive bacteria were explored as a specific means of delivering gene therapy to this phagocytic cell type. Proof of delivery was shown in vitro and in vivo. Moreover, in vivo delivery of a gene by this method achieved the desired immune response in terms of cytokine profile. Overall, the data presented here advance exploration within the field of cancer immunotherapy, introduce novel delivery and therapeutic strategies, and demonstrate pre-clinically the potential for such novel anti-cancer therapies.

Relevance: 30.00%

Abstract:

The development of a new bioprocess requires several steps from initial concept to a practical and feasible application. Industrial applications of fungal pigments will depend on: (i) safety for consumption, (ii) stability of the pigments under the food processing conditions required by the products in which they will be incorporated and (iii) production yields high enough for production costs to be reasonable. Of these requirements the first involves the highest research costs, and the practical application of this type of process may face several hurdles before final regulatory approval as a new food ingredient. Therefore, before going through expensive research to have them accepted as new products, the process potential should be assessed early on, and this brings forward pigment stability studies and process optimisation goals. Only ingredients that are usable under economically feasible conditions should progress to regulatory approval. This thesis covers these two aspects, stability and process optimisation, for a potential new ingredient: a natural red colour produced by microbial fermentation. The main goal was to design, optimise and scale up the production process of red pigments by Penicillium purpurogenum GH2. The approach followed to reach this objective was first to establish that pigments produced by Penicillium purpurogenum GH2 are sufficiently stable under the different processing conditions (thermal and non-thermal) found in the food and textile industries. Once it was established that the pigments were stable enough, the work progressed towards process optimisation, aiming for the highest productivity using submerged fermentation as the production culture. The optimum production conditions defined at flask scale were used to scale up the pigment production process to a pilot reactor scale. Finally, the potential applications of the pigments were assessed. Based on this sequence of specific targets, the thesis was structured in six parts containing a total of nine chapters, covering the engineering design of a bioprocess for the production of natural red colourants by submerged fermentation of the thermophilic fungus Penicillium purpurogenum GH2.

Relevance: 30.00%

Abstract:

The paper describes the design of an efficient and robust genetic algorithm for the nuclear fuel loading problem (i.e. refuelling: the in-core fuel management problem), a complex combinatorial, multimodal optimisation problem. Evolutionary computation, as performed by FUELGEN, replaces heuristic search of the kind performed by the FUELCON expert system (CAI 12/4) to solve the same problem. In contrast to the traditional genetic algorithm, which makes strong requirements on the representation used and its parameter settings in order to be efficient, recent research on new, robust genetic algorithms shows that representations unsuitable for the traditional genetic algorithm can still be used to good effect with little parameter adjustment. The representation presented here is a simple symbolic one with no linkage attributes, making the genetic algorithm particularly easy to apply to fuel loading problems with differing core structures and assembly inventories. A nonlinear fitness function has been constructed to direct the search efficiently in the presence of the many local optima that result from the constraint on solutions.
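The ingredients described above can be sketched as follows: a loading pattern encoded as a simple symbolic chromosome (a permutation of assembly IDs over core positions), swap mutation with no linkage attributes, and a nonlinear, penalised fitness. The fitness function below is a stand-in for illustration, not the FUELGEN objective.

```python
# Hedged sketch: genetic algorithm with a symbolic (permutation) representation.
import random

random.seed(3)
assemblies = list(range(12))                    # assembly inventory (IDs)

def mutate(pattern):
    """Swap two core positions: representation-independent, no linkage assumed."""
    i, j = random.sample(range(len(pattern)), 2)
    child = pattern[:]
    child[i], child[j] = child[j], child[i]
    return child

def fitness(pattern, limit=40):
    """Toy nonlinear fitness: reward smooth ID ordering, heavily penalise a constraint violation."""
    roughness = sum(abs(a - b) for a, b in zip(pattern, pattern[1:]))
    penalty = 100.0 if roughness > limit else 0.0   # stand-in for a power-peaking limit
    return -(roughness ** 1.5) - penalty

population = [random.sample(assemblies, len(assemblies)) for _ in range(20)]
for _ in range(200):                             # simple (mu + 1)-style generational loop
    parent = max(population, key=fitness)
    population = sorted(population + [mutate(parent)], key=fitness, reverse=True)[:20]
print(max(population, key=fitness))
```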

Relevance: 30.00%

Abstract:

FEA and CFD analysis is becoming ever more complex, with an emerging demand for simulation software technologies that can address problems involving combinations of interactions amongst varying physical phenomena over a variety of time and length scales. Computational modelling of such problems requires software technologies that enable the representation of these complex suites of 'physical' interactions. This functionality requires the structuring of simulation modules for specific physical phenomena so that the coupling can be effectively represented. These 'multi-physics' and 'multi-scale' computations are very compute intensive, so the simulation software must operate effectively in parallel if it is to be used in this context. Of course, the objective of 'multi-physics' and 'multi-scale' simulation is the optimal design of engineered systems, so optimisation is an important feature of such classes of simulation. In this presentation, a multi-disciplinary approach to simulation-based optimisation is described with some key examples of application to challenging engineering problems.

Relevance: 30.00%

Abstract:

The aim of integrating computational mechanics (FEA and CFD) and optimisation tools is to dramatically speed up the design process in different application areas concerning reliability in electronic packaging. Design engineers in the electronics manufacturing sector may use these tools to predict key design parameters and configurations (i.e. material properties, product dimensions, design at PCB level, etc.) that will guarantee the required product performance. In this paper a modelling strategy coupling computational mechanics techniques with numerical optimisation is presented and demonstrated with two problems. The integrated modelling framework is obtained by coupling the multi-physics analysis tool PHYSICA with the numerical optimisation package VisualDOC into a fully automated design tool for applications in electronic packaging. Thermo-mechanical simulations of solder creep deformations are presented to predict flip-chip reliability and lifetime under thermal cycling. A thermal management design based on multi-physics analysis with coupled thermal-flow-stress modelling is also discussed. The Response Surface Modelling approach, in conjunction with Design of Experiments statistical tools, is demonstrated and subsequently used by the numerical optimisation techniques as part of this modelling framework. Predictions for reliable electronic assemblies are achieved in an efficient and systematic manner.
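A minimal sketch of the response-surface step is given below: a quadratic surrogate is fitted to a small designed experiment and then optimised in place of the expensive coupled simulation. The "simulation" function here is a placeholder standing in for a PHYSICA run, and the factorial design and quadratic model form are illustrative assumptions.

```python
# Hedged sketch: Design of Experiments + quadratic response surface + surrogate optimisation.
import numpy as np

def expensive_simulation(x1, x2):                     # placeholder for a coupled FEA/CFD run
    return (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.2) ** 2 + 0.05 * x1 * x2

# Design of experiments: 3-level full factorial over two normalised design factors.
levels = np.array([-1.0, 0.0, 1.0])
X = np.array([[x1, x2] for x1 in levels for x2 in levels])
y = np.array([expensive_simulation(x1, x2) for x1, x2 in X])

# Quadratic response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Optimise the cheap surrogate on a fine grid (any numerical optimiser would do here).
grid = np.linspace(-1, 1, 201)
G1, G2 = np.meshgrid(grid, grid)
surrogate = (coef[0] + coef[1] * G1 + coef[2] * G2 +
             coef[3] * G1 ** 2 + coef[4] * G2 ** 2 + coef[5] * G1 * G2)
i, j = np.unravel_index(np.argmin(surrogate), surrogate.shape)
print("surrogate optimum near x1 =", round(G1[i, j], 2), ", x2 =", round(G2[i, j], 2))
```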