23 results for D-optimal design

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
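As a concrete illustration of the Fisher-information approach described above, the sketch below computes the Fisher information matrix for the length-scale and signal-variance parameters of a squared-exponential Gaussian process on a small replicated design, and inverts it to approximate the parameter covariance. The kernel choice, parameter values and design points are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sq_exp_kernel(x, ell, sf2, noise):
    # Squared-exponential covariance plus i.i.d. noise (illustrative choice)
    d2 = (x[:, None] - x[None, :]) ** 2
    return sf2 * np.exp(-0.5 * d2 / ell**2) + noise * np.eye(len(x))

def fisher_information(x, ell, sf2, noise):
    # I_ij = 0.5 * tr(K^-1 dK/dtheta_i K^-1 dK/dtheta_j), theta = (ell, sf2)
    d2 = (x[:, None] - x[None, :]) ** 2
    K = sq_exp_kernel(x, ell, sf2, noise)
    Kinv = np.linalg.inv(K)
    dK = [sf2 * np.exp(-0.5 * d2 / ell**2) * d2 / ell**3,  # d/d ell
          np.exp(-0.5 * d2 / ell**2)]                      # d/d sf2
    I = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            I[i, j] = 0.5 * np.trace(Kinv @ dK[i] @ Kinv @ dK[j])
    return I

# A design with replicate observations: 3 sites, 2 replicates each
x_rep = np.array([0.0, 0.0, 0.5, 0.5, 1.0, 1.0])
I = fisher_information(x_rep, ell=0.3, sf2=1.0, noise=0.1)
var_approx = np.linalg.inv(I)   # approximate covariance of the estimates
print(np.diag(var_approx))
```

The diagonal of the inverse Fisher information gives the approximate parameter variances that an optimal design would seek to minimise.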

Relevance: 100.00%

Abstract:

Renewable energy forms have been widely used in the past decades, highlighting a "green" shift in energy production. A driving force behind this turn to renewable energy production is the EU directives which set the Union's targets for energy production from renewable sources, greenhouse gas emissions and increases in energy efficiency. All member countries are obligated to apply harmonized legislation and practices and to restructure their energy production networks in order to meet the EU targets. Towards the fulfillment of the 20-20-20 EU targets, Greece promotes a specific strategy based on the construction of large-scale Renewable Energy Source plants. In this paper, we present an optimal design of the Greek renewable energy production network by applying a 0-1 Weighted Goal Programming model that considers social, environmental and economic criteria. In the absence of a panel of experts, a Data Envelopment Analysis (DEA) approach is used to filter the best out of the possible network structures, seeking the maximum technical efficiency. A Super-Efficiency DEA model is also used to reduce the set of solutions and find the best among them. The results show that, in order to achieve maximum efficiency, the social and environmental criteria must be weighted more heavily than the economic ones.
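The 0-1 Weighted Goal Programming formulation can be sketched in a few lines. The toy model below enumerates candidate plant portfolios and minimises the weighted sum of deviations from capacity (economic), job-creation (social) and land-impact (environmental) goals; all plant data, targets and weights are hypothetical, chosen only to show the mechanics of weighting social and environmental criteria above economic ones.

```python
from itertools import product

# Hypothetical candidate RES plants: (name, capacity MW, jobs, land impact)
plants = [
    ("wind_A",  120, 40, 3),
    ("wind_B",   90, 30, 2),
    ("solar_A",  60, 25, 1),
    ("hydro_A", 150, 60, 5),
]
# Goal targets and weights: social/environmental weighted above economic
target_capacity, target_jobs, max_impact = 250, 80, 7
w_econ, w_soc, w_env = 1.0, 2.0, 2.0

def weighted_deviation(selection):
    cap  = sum(p[1] for p, s in zip(plants, selection) if s)
    jobs = sum(p[2] for p, s in zip(plants, selection) if s)
    imp  = sum(p[3] for p, s in zip(plants, selection) if s)
    # Penalise under-achievement of capacity/jobs and over-shoot of impact
    return (w_econ * max(0, target_capacity - cap)
            + w_soc * max(0, target_jobs - jobs)
            + w_env * max(0, imp - max_impact))

# 0-1 decisions: brute-force enumeration stands in for a MIP solver here
best = min(product([0, 1], repeat=len(plants)), key=weighted_deviation)
chosen = [p[0] for p, s in zip(plants, best) if s]
print(chosen, weighted_deviation(best))
```

In a realistic network-design study the enumeration would be replaced by an integer programming solver, but the goal structure is the same.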

Relevance: 100.00%

Abstract:

Supply chain operations directly affect service levels. Decisions on the amendment of facilities are generally based on overall cost, leaving out the efficiency of each unit. By decomposing the supply chain superstructure, efficiency analysis of the facilities (warehouses or distribution centers) that serve customers can be easily implemented. With the proposed algorithm, the selection of a facility is based on service level maximization and not just cost minimization, as the analysis filters all feasible solutions using the Data Envelopment Analysis (DEA) technique. Through multiple iterations, solutions are filtered via DEA and only the efficient ones are selected, leading to cost minimization. In this work, the problem of optimal supply chain network design is addressed with a DEA-based algorithm. A Branch and Efficiency (B&E) algorithm is deployed for the solution of this problem. In this DEA approach, each solution (a potentially installed warehouse, plant, etc.) is treated as a Decision Making Unit and is thus characterized by inputs and outputs. Through additional constraints named "efficiency cuts", the algorithm selects only efficient solutions providing better objective function values. The applicability of the proposed algorithm is demonstrated through illustrative examples.
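To make the DEA step concrete, the sketch below solves the input-oriented CCR multiplier model for each candidate facility, treating each as a Decision Making Unit; a solution scoring 1.0 is efficient and would survive an "efficiency cut". The warehouse data are hypothetical, and SciPy's linprog is used merely as a convenient LP solver.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR multiplier model for DMU o (illustrative)."""
    n, m = inputs.shape        # n DMUs, m inputs
    s = outputs.shape[1]       # s outputs
    # Variables: output weights u (s entries), then input weights v (m entries)
    c = np.concatenate([-outputs[o], np.zeros(m)])         # maximise u.y_o
    A_ub = np.hstack([outputs, -inputs])                   # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), inputs[o]])[None]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Hypothetical warehouses: inputs = (cost, staff), output = (orders served,)
X = np.array([[100.0, 5.0], [120.0, 4.0], [150.0, 6.0]])
Y = np.array([[500.0], [480.0], [510.0]])
effs = [ccr_efficiency(X, Y, o) for o in range(len(X))]
print([round(e, 3) for e in effs])   # efficient DMUs score 1.0
```

In the B&E setting, each inefficient candidate (score below 1.0) would be excluded by an efficiency cut before the branch-and-bound search continues.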

Relevance: 100.00%

Abstract:

A simple and efficient approach to the optimal design of 3-wavelength backward-pumped Raman amplifiers is proposed. Gain flatness of 1.7 dB is demonstrated in a spectral range of 1520-1595 nm using only three pumps with wavelengths within the 1420-1480 nm interval.

Relevance: 90.00%

Abstract:

Block copolymers are versatile designer macromolecules where a “bottom-up” approach can be used to create tailored materials with unique properties. These simple building blocks allow us to create actuators that convert energy from a variety of sources (such as chemical, electrical and heat) into mechanical energy. In this review we will discuss the advantages and potential pitfalls of using block copolymers to create actuators, putting emphasis on the ways in which these materials can be synthesised and processed. Particular attention will be given to the theoretical background of microphase separation and how the phase diagram can be used during the design process of actuators. Different types of actuation will be discussed throughout.

Relevance: 90.00%

Abstract:

This paper disputes the claim that product design determines 70% of costs and the implications that follow for design evaluation tools. Using the idea of decision chains, it is argued that such tools need to consider more of the downstream business activities and should take into account the current and future state of the business rather than some idealized view of it. To illustrate the argument, a series of experiments using an enterprise simulator is described that shows the benefit of applying a more holistic 'design for' technique: Design For the Existing Environment.

Relevance: 90.00%

Abstract:

Product reliability and environmental performance have become critical elements of a product's specification and design. To obtain a high level of confidence in the reliability of the design, it is customary to test the design under realistic conditions in a laboratory. The objective of the work is to examine the feasibility of designing mechanical test rigs which exhibit prescribed dynamical characteristics. The design is then attached to the rig and excitation is applied to the rig, which then transmits representative vibration levels into the product. The philosophical considerations made at the outset of the project are discussed, as they form the basis for the resulting design methodologies. An attempt is made to identify the parameters of a test rig directly from the spatial model derived during the system identification process. It is shown to be impossible to identify a feasible test rig design using this technique. A finite-dimensional optimal design methodology is therefore developed which identifies the parameters of a discrete spring/mass system that is dynamically similar to a point coordinate on a continuous structure. This design methodology is incorporated within another procedure which derives a structure comprising a continuous element and a discrete system. This methodology is used to obtain point coordinate similarity for two planes of motion, which is validated by experimental tests. A limitation of this approach is that it is impossible to achieve multi-coordinate similarity, due to an interaction of the discrete system and the continuous element at points away from the coordinate of interest. During the work the importance of the continuous element is highlighted, and a design methodology is developed for continuous structures. The design methodology is based upon distributed parameter optimal design techniques and allows an initial poor design estimate to be moved in a feasible direction towards an acceptable design solution.
Cumulative damage theory is used to provide a quantitative method of assessing the quality of dynamic similarity. It is shown that the combination of modal analysis techniques and cumulative damage theory provides a feasible design synthesis methodology for representative test rigs.
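The idea of checking whether a candidate discrete spring/mass system reproduces the dynamics at a point coordinate can be illustrated numerically. The sketch below computes the natural frequencies and driving-point receptance of a hypothetical 2-DOF rig model (all parameter values invented for illustration); these are the kinds of quantities that would be matched against the target structure.

```python
import numpy as np

def receptance(M, K, w):
    # Driving-point receptance matrix H(w) = (K - w^2 M)^-1 (undamped model)
    return np.linalg.inv(K - w**2 * M)

# Hypothetical 2-DOF rig: masses m1, m2 and springs k1, k2 (invented values)
m1, m2, k1, k2 = 2.0, 0.5, 8.0e4, 2.0e4
M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# Natural frequencies from the generalised eigenproblem K v = w^2 M v
wn = np.sort(np.sqrt(np.linalg.eigvals(np.linalg.inv(M) @ K).real))
print(wn)   # rad/s, to be compared with the target point's frequencies

# Receptance away from resonance stays finite; near wn it grows without bound
H = receptance(M, K, 0.5 * wn[0])
print(H[0, 0])
```

Dynamic similarity at the attachment coordinate amounts to matching this driving-point receptance to that of the real structure over the frequency band of interest.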

Relevance: 90.00%

Abstract:

In analysing manufacturing systems, for either design or operational reasons, failure to account for the potentially significant dynamics could produce invalid results. There are many analysis techniques that can be used; however, simulation is unique in its ability to assess detailed, dynamic behaviour. The use of simulation to analyse manufacturing systems would therefore seem appropriate, if not essential. Many simulation software products are available, but their ease of use and scope of application vary greatly. This is illustrated at one extreme by simulators, which offer rapid but limited application, and at the other by simulation languages, which are extremely flexible but tedious to code. Given that a typical manufacturing engineer does not possess in-depth programming and simulation skills, the use of simulators over simulation languages would seem the more appropriate choice. Whilst simulators offer ease of use, their limited functionality may preclude their use in many applications. The construction of current simulators makes it difficult to amend or extend the functionality of the system to meet new challenges. Some simulators could even become obsolete as users demand modelling functionality that reflects the latest manufacturing system design and operation concepts. This thesis examines the deficiencies in current simulation tools and considers whether they can be overcome by the application of object-oriented principles. Object-oriented techniques have gained in popularity in recent years and are seen as having the potential to overcome many of the problems traditionally associated with software construction. There are a number of key concepts that are exploited in the work described in this thesis: the use of object-oriented techniques to act as a framework for abstracting engineering concepts into a simulation tool, and the ability to reuse and extend object-oriented software. 
It is argued that current object-oriented simulation tools are deficient and that, in designing such tools, object-oriented techniques should be used not just for the creation of individual simulation objects but for the creation of the complete software. This results in the ability to construct an easy-to-use simulator that is not limited by its initial functionality. The thesis presents the design of an object-oriented, data-driven simulator which can be freely extended. Discussion and work are focused on discrete parts manufacture. The system developed retains the ease of use typical of data-driven simulators whilst removing any limitation on its potential range of applications. Reference is given to additions made to the simulator by other developers not involved in the original software development. Particular emphasis is put on the requirements of the manufacturing engineer and the need for the engineer to carry out dynamic evaluations.
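The argument that object-orientation lets a data-driven simulator be extended without touching its core can be sketched minimally: below, a base Station class drives a toy event loop, and a subclass adds new behaviour purely by overriding one method. This is an invented miniature (it deliberately ignores queueing and contention), not the simulator described in the thesis.

```python
import heapq

class Station:
    """Base processing station; subclasses extend behaviour by overriding."""
    def __init__(self, name, cycle_time):
        self.name, self.cycle_time, self.completed = name, cycle_time, 0
    def process_time(self, part):
        return self.cycle_time

class BatchStation(Station):
    """Extension added without touching the core loop: even parts run faster."""
    def process_time(self, part):
        return self.cycle_time * (0.5 if part % 2 == 0 else 1.0)

def simulate(stations, n_parts):
    # Toy event-driven loop: parts flow through every station in sequence.
    events = []
    for part in range(n_parts):
        t = float(part)                     # staggered arrivals (hypothetical)
        for st in stations:
            t += st.process_time(part)
            heapq.heappush(events, (t, st.name, part))
    clock = 0.0
    while events:
        clock, name, part = heapq.heappop(events)
        next(s for s in stations if s.name == name).completed += 1
    return clock

line = [Station("lathe", 2.0), BatchStation("mill", 4.0)]
makespan = simulate(line, n_parts=4)
print(makespan, [s.completed for s in line])
```

The core `simulate` loop never needs modification when a new station type is introduced, which is precisely the extensibility claim made for the object-oriented design.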

Relevance: 90.00%

Abstract:

Agent-based technology is playing an increasingly important role in today's economy. Usually a multi-agent system is needed to model an economic system such as a market system, in which heterogeneous trading agents interact with each other autonomously. Two questions often need to be answered regarding such systems: 1) how to design an interaction mechanism that facilitates efficient resource allocation among usually self-interested trading agents; and 2) how to design an effective strategy in some specific market mechanism for an agent to maximise its economic returns. For automated market systems, the auction is the most popular mechanism for solving resource allocation problems among the participants. However, auctions come in hundreds of different formats, some of which are better than others in terms of not only allocative efficiency but also other properties, e.g. whether the format generates high revenue for the auctioneer, or whether it induces stable behaviour of the bidders. In addition, different strategies result in very different performance under the same auction rules. Against this background, we investigate auction mechanism and strategy designs for agent-based economics. The international Trading Agent Competition (TAC) Ad Auction (AA) competition provides a very useful platform to develop and test agent strategies in the Generalised Second Price auction (GSP). AstonTAC, the runner-up of TAC AA 2009, is a successful advertiser agent designed for GSP-based keyword auctions. In particular, AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and selects the set of keyword queries with the highest expected profit to bid on, to maximise its expected profit under the limit of conversion capacity. Through evaluation experiments, we show that AstonTAC performs well and stably not only in the competition but also across a broad range of environments. 
The TAC CAT tournament provides an environment for investigating the optimal design of mechanisms for double auction markets. AstonCAT-Plus is the post-tournament version of the specialist developed for CAT 2010. In our experiments, AstonCAT-Plus not only outperforms most specialist agents designed by other institutions but also achieves high allocative efficiencies, transaction success rates and average trader profits. Moreover, we reveal some insights into the CAT game: 1) successful markets should maintain a stable and high market share of intra-marginal traders; 2) a specialist's performance depends on the distribution of trading strategies. However, typical double auction models assume trading agents have a fixed trading direction of either buy or sell. With this limitation they cannot directly reflect the fact that traders in financial markets (the most popular application of the double auction) decide their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Experiments are conducted under both dynamic and static settings of the continuous BDA market. We find that the allocative efficiency of a continuous BDA market comes mainly from the rational selection of trading directions. Furthermore, we introduce a high-performance Kernel trading strategy for the BDA market which uses a kernel probability density estimator built on historical transaction data to decide optimal order prices. The Kernel strategy outperforms some popular intelligent double auction trading strategies, including ZIP, GD and RE, in the continuous BDA market by making the highest profit in static games and obtaining the best wealth in dynamic games.
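For readers unfamiliar with the Generalised Second Price auction in which AstonTAC competes, the textbook allocation and pricing rule can be sketched as follows (advertiser names, bids and click-through rates are invented):

```python
def gsp_allocate(bids, slot_ctrs):
    """Generalised Second Price keyword auction (textbook form, illustrative).

    bids: {advertiser: bid per click}; slot_ctrs: click-through rates, best first.
    Each winner pays the bid of the advertiser ranked immediately below it.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for slot, ctr in enumerate(slot_ctrs):
        if slot >= len(ranked):
            break
        name, _ = ranked[slot]
        # Price per click = next-highest bid (0 if no lower bidder)
        price = ranked[slot + 1][1] if slot + 1 < len(ranked) else 0.0
        results.append((name, ctr, price))
    return results

bids = {"A": 2.0, "B": 1.5, "C": 0.8}
print(gsp_allocate(bids, slot_ctrs=[0.3, 0.1]))
# A takes the top slot and pays B's bid; B takes the second and pays C's bid
```

Real GSP variants rank by bid times quality score rather than by bid alone; the simple rank-by-bid form above is enough to show the second-price payment structure.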

Relevance: 90.00%

Abstract:

Over 50% of clinically marketed drugs target membrane proteins, in particular G protein-coupled receptors (GPCRs). GPCRs are vital to living cells, performing an active role in many processes, making them integral to drug development. In nature, GPCRs are not sufficiently abundant for research, and their structural integrity is often lost during extraction from cell membranes. The objectives of this thesis were to increase the recombinant yield of the GPCR human adenosine A2A receptor (hA2AR) by investigating bioprocess conditions in large-scale Pichia pastoris and small-scale Saccharomyces cerevisiae cultivations. Extraction of hA2AR from membranes using novel polymers was also investigated. An increased yield of hA2AR from P. pastoris was achieved by investigating the methanol feeding regime. A slow exponential feed during induction (μlow) was compared to a faster exponential feed (μhigh) in 35 L pilot-scale bioreactors. Overall hA2AR yields were higher for the μlow cultivation (536.4 pmol g-1) than for the μhigh cultivation (148.1 pmol g-1). hA2AR levels were maintained in cytotoxic methanol conditions and, unexpectedly, pre-induction levels of hA2AR were detected. Small-scale bioreactor work showed that Design of Experiments (DoE) could be applied to screen for bioprocess conditions giving optimal hA2AR yields. Optimal conditions were retrieved for S. cerevisiae using a D-optimal screen and response surface methodology: 22°C, pH 6.0 and 30% DO, without dimethyl sulphoxide. A polynomial equation was generated to predict hA2AR yields if conditions varied. Regarding the extraction, poly(maleic anhydride-styrene), or PMAS, was successful in solubilising hA2AR from P. pastoris membranes compared with dodecyl-β-D-maltoside (DDM) detergent. Variants of PMAS worked well as solubilising agents with either 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC) or cholesteryl hemisuccinate (CHS). 
Moreover, esterification of PMAS improved solubilisation, suggesting that increased hydrophobicity stabilises hA2AR during extraction. Overall, hA2AR yields were improved in both P. pastoris and S. cerevisiae, and efficient extraction using novel polymers was achieved.
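The response-surface step can be illustrated with an ordinary least-squares fit of a full quadratic model in temperature and pH. The screening data below are invented, not the thesis measurements, and serve only to show how a polynomial equation for predicting yield is obtained from a designed experiment.

```python
import numpy as np

# Hypothetical screening data: (temperature degC, pH) -> receptor yield (a.u.)
T  = np.array([18.0, 18.0, 22.0, 22.0, 26.0, 26.0, 22.0])
pH = np.array([5.5, 6.5, 5.5, 6.5, 5.5, 6.5, 6.0])
y  = np.array([40.0, 44.0, 55.0, 58.0, 47.0, 45.0, 60.0])

# Full quadratic response surface:
# y ~ b0 + b1*T + b2*pH + b3*T^2 + b4*pH^2 + b5*T*pH
X = np.column_stack([np.ones_like(T), T, pH, T**2, pH**2, T * pH])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(t, p):
    # Evaluate the fitted polynomial at a new condition
    return np.array([1.0, t, p, t * t, p * p, t * p]) @ beta

print(round(float(predict(22.0, 6.0)), 1))
```

In a D-optimal screen the design points themselves would be chosen to minimise the volume of the confidence region of `beta`; the fitting step afterwards is exactly this least-squares computation.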

Relevance: 90.00%

Abstract:

Metal-binding polymer fibres have attracted major attention for diverse applications in membranes for metal sequestration from waste waters, non-woven wound dressings, matrices for photocatalysis, and many more. This paper reports the design and synthesis of an 8-hydroxyquinoline-based zinc-binding styrenic monomer, QuiBoc. Its subsequent polymerisation by reversible addition–fragmentation chain transfer (RAFT) yielded well-defined polymers, PQuiBoc, of controllable molar masses (6 and 12 kg mol−1) with low dispersities (Đ, Mw/Mn < 1.3). Protected (PQuiBoc) and deprotected (PQuiOH) derivatives of the polymer exhibited a high zinc-binding capacity, as determined by semi-quantitative SEM/EDXA analyses, allowing the electrospinning of microfibres from a PQuiBoc/polystyrene (PS) blend without the need for removal of the protecting group. Simple “dip-coating” of the fibrous mats into ZnO suspensions showed that PQuiBoc/PS microfibres with only 20% PQuiBoc content had almost three-fold higher loadings of ZnO (29%) in comparison to neat PS microfibres (11%).

Relevance: 90.00%

Abstract:

The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widely spread ranging from tools employed during the digital design phase, to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified. © 2010 CIRP.

Relevance: 80.00%

Abstract:

This is a study of heat transfer in a lift-off furnace which is employed in the batch annealing of a stack of coils of steel strip. The objective of the project is to investigate the various factors which govern the furnace design and the heat transfer resistances, so as to reduce the time of the annealing cycle and hence minimize the operating costs. The work involved mathematical modelling of patterns of gas flow and modes of heat transfer. These models are: heat conduction in the steel coils; convective heat transfer in the plates separating the coils in the stack and in other parts of the furnace; and radiative and convective heat transfer in the furnace, using the long furnace model. An important part of the project is the development of numerical methods and computations to solve the transient models. A limited number of temperature measurements was available from experiments on a test coil in an industrial furnace. The mathematical model agreed well with these data. The model has been used to show the following characteristics of annealing furnaces, and to suggest further developments which would lead to significant savings:
- The location of the limiting temperature in a coil is nearer to the hollow core than to the outer periphery.
- Thermal expansion of the steel tends to open the coils, reduces their thermal conductivity in the radial direction, and hence prolongs the annealing cycle. Increasing the tension in the coils and/or heating from the core would overcome this heat transfer resistance.
- The shape and dimensions of the convective channels in the plates have a significant effect on heat convection in the stack. An optimal design of a channel is shown to be of a width-to-height ratio equal to 9.
- Increasing the cooling rate, by using a fluidized bed instead of the normal shell and tube exchanger, would shorten the cooling time by about 15%, but increase the temperature differential in the stack.
- For a specific charge weight, a stack of different-sized coils will have a shorter annealing cycle than one of equally-sized coils, provided that production constraints allow the stacking order to be optimal.
- Recycling hot flue gases to the firing zone of the furnace would produce a decrease in the thermal efficiency of up to 30% but would decrease the heating time by about 26%.
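The transient conduction model at the heart of such a study can be sketched as an explicit finite-difference scheme for radial conduction in a hollow cylinder. The material properties, geometry and boundary conditions below are invented stand-ins, but the run reproduces the qualitative finding that the limiting (slowest-heating) location sits nearer the hollow core than the periphery.

```python
# Illustrative explicit finite-difference model of radial conduction in one
# coil, treated as a hollow cylinder. All parameter values are hypothetical.
k, rho, cp = 30.0, 7800.0, 500.0        # W/mK, kg/m3, J/kgK (radial effective)
alpha = k / (rho * cp)
r_in, r_out, n = 0.25, 0.9, 40          # m; radial grid points
dr = (r_out - r_in) / (n - 1)
dt = 0.4 * dr**2 / alpha                # below the explicit stability limit
r = [r_in + i * dr for i in range(n)]
T = [20.0] * n                          # initial coil temperature, degC
T_furnace = 700.0

for _ in range(2000):
    Tn = T[:]
    for i in range(1, n - 1):
        # dT/dt = alpha * (T'' + T'/r) in cylindrical coordinates
        d2 = (T[i+1] - 2*T[i] + T[i-1]) / dr**2
        d1 = (T[i+1] - T[i-1]) / (2 * dr * r[i])
        Tn[i] = T[i] + alpha * dt * (d2 + d1)
    Tn[0], Tn[-1] = Tn[1], T_furnace    # insulated core, fixed outer surface
    T = Tn

# The slowest-heating ("limiting") point lies nearer the core than the edge
i_min = T.index(min(T))
print(i_min, round(T[i_min], 1))
```

A production model would add the gas-side convection and radiation resistances and the contact resistance between laps, but the marching scheme is the same.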

Relevance: 80.00%

Abstract:

This thesis is concerned with the development of hydrogels that adhere to skin and can be used for topical or transdermal release of active compounds for therapeutic or cosmetic use. The suitability of a range of monomers and initiator systems for the production of skin-adhesive hydrogels by photopolymerisation was explored, and an approximate order of monomer reactivity in aqueous solution was determined. Most notable was the increased reactivity of N-vinyl pyrrolidone within an aqueous system, as compared to its low rate of polymerisation in organic solvents. The efficacy of a series of photoinitiator systems for the preparation of sheet hydrogels was investigated. Supplementary redox and thermal initiators were also examined. The most successful initiator system was found to be Irgacure 184, which is commonly used in commercial moving-web production systems that employ photopolymerisation. The influence of ionic and non-ionic monomers, crosslinking systems, water and glycerol on the adhesive and dynamic mechanical behaviour of partially hydrated hydrogel systems was examined. The aim was to manipulate hydrogel behaviour to modify topical and transdermal delivery capability, and to investigate the possibility of using monomer combinations that would influence the release characteristics of gels by modifying their hydrophobic and ionic nature. The copolymerisation of neutral monomers (N-vinyl pyrrolidone, N,N-dimethyl acrylamide and N-acryloyl morpholine) with ionic monomers (2-acrylamido-2-methylpropane sulphonic acid, sodium salt, and the potassium salt of 3-sulphopropyl acrylate) formed the basis of the study. Release from fully and partially hydrated hydrogels was studied using model compounds and a non-steroidal anti-inflammatory drug, ibuprofen. 
Release followed a common three-stage kinetic profile that includes an initial burst phase, a secondary phase of approximately first-order release, and a final stage of infinitesimally slow release such that the compound is effectively retained within the hydrogel. Use of partition coefficients, the pKa of the active compound, and a knowledge of the charge-based and polar interactions of polymer and drug were complementary in interpreting the experimental results. In summary, drug ionisation, hydrogel composition and external release medium characteristics interact to influence release behaviour. The information generated provides the basis for the optimal design of hydrogels for specific dermal release applications and some understanding of the limitations of these systems for controlled release applications.
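The secondary first-order phase described above can be written as M(t) = burst + (plateau - burst)(1 - e^(-kt)). The sketch below generates release data from this form with invented parameters and recovers the rate constant k by log-linear regression, the standard way such release profiles are analysed.

```python
import math

# Illustrative three-stage release profile: an initial burst, a first-order
# phase approaching a plateau, then effective retention of the remainder.
burst, plateau, k = 0.15, 0.80, 0.25   # fractions released; rate in 1/h (hypothetical)

def fraction_released(t):
    return burst + (plateau - burst) * (1.0 - math.exp(-k * t))

times = [0, 1, 2, 4, 8, 12, 24]
data = [fraction_released(t) for t in times]

# Recover k from the first-order phase via the linearised form:
# ln(plateau - M(t)) = ln(plateau - burst) - k * t
x = times[1:]
y = [math.log(plateau - m) for m in data[1:]]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(a * b for a, b in zip(x, y))
k_est = -(n * sxy - sx * sy) / (n * sxx - sx * sx)
print(round(k_est, 3))
```

With real release measurements the log-linear fit would be restricted to the time window where first-order behaviour holds, excluding the burst and retention stages.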