57 results for Combinatorial optimisation


Relevance: 20.00%

Abstract:

The microencapsulation of tuna oil in gelatin-sodium hexametaphosphate (SHMP) using complex coacervation was optimised for the stabilisation of omega-3 oils, for use as a functional food ingredient. Firstly, oil stability was optimised by comparing the accelerated stability of tuna oil in the presence of various commercial antioxidants, using a Rancimat™. Zeta-potential (mV), turbidity and coacervate yield (%) were then measured and optimised for complex coacervation. The highest yield of complex coacervate was obtained at pH 4.7 and at a gelatin to SHMP ratio of 15:1. Multi-core microcapsules were formed when the mixed microencapsulation system was cooled to 5 °C at a rate of 12 °C/h. Crosslinking with transglutaminase followed by freeze drying produced a dried powder with an encapsulation efficiency of 99.82% and a payload of 52.56%. Some 98.56% of the oil was successfully microencapsulated, and accelerated stability testing with a Rancimat™ showed that the encapsulated oil was more than twice as stable as the non-encapsulated oil.

Relevance: 20.00%

Abstract:

Motion cueing algorithms (MCAs) play a significant role in driving simulators: they aim to deliver to the simulator driver a sensation as close as possible to that of a real vehicle driver, without exceeding the physical limitations of the simulator. This paper presents an optimisation-based design of an MCA for a vehicle simulator that finds the most suitable washout-algorithm parameters while respecting all motion-platform physical limitations and minimising the human perception error between the real and simulator drivers. One of the main limitations of classical washout filters is that they are tuned for the worst-case scenario. This tuning is based on trial and error and is affected by the experience of the driver and the programmer, which is the most significant obstacle to fully utilising the motion platform. It leads to an inflexible structure, produces false cues and leaves the simulator unable to suit all circumstances. In addition, the classical method does not take the minimisation of human perception error or the physical constraints into account, so the production of motion cues, and the impact of the different classical washout-filter parameters on those cues, remains inaccessible to designers. The aim of this paper is to provide an optimisation method for tuning the MCA parameters, based on nonlinear filtering and genetic algorithms. This is done by taking into account the vestibular sensation error between the real and simulated cases, as well as the main dynamic limitations, tilt coordination and the correlation coefficient. Three additional compensatory linear blocks are integrated into the MCA and tuned to modify the performance of the filters. The proposed optimised MCA is implemented in MATLAB/Simulink. The results generated using the proposed method show improved performance in terms of human sensation, reference-shape tracking and more efficient exploitation of the platform without reaching its motion limitations.
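A minimal sketch of the genetic-algorithm tuning idea described above, not the authors' implementation: the cost function below is a hypothetical stand-in that trades off a surrogate sensation error against platform travel, and the two cut-off frequencies, bounds and GA settings are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def sensation_error_cost(params):
        """Hypothetical surrogate cost; params = high-pass cut-off frequencies (rad/s)
        for the translational and rotational washout channels."""
        wn_trans, wn_rot = params
        # Lower cut-offs track the reference better (smaller error) but use more
        # platform travel; the second term stands in for the displacement penalty.
        error = 1.0 / (1.0 + wn_trans) + 1.0 / (1.0 + wn_rot)
        travel_penalty = 0.05 * (1.0 / wn_trans**2 + 1.0 / wn_rot**2)
        return error + travel_penalty

    def genetic_search(cost, bounds, pop_size=40, generations=100, mut_sigma=0.1):
        lo, hi = np.array(bounds).T
        pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
        for _ in range(generations):
            fitness = np.array([cost(p) for p in pop])
            parents = pop[np.argsort(fitness)[: pop_size // 2]]        # truncation selection
            children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
            children = children + rng.normal(0.0, mut_sigma, children.shape)  # mutation
            pop = np.clip(np.vstack([parents, children]), lo, hi)
        return pop[np.argmin([cost(p) for p in pop])]

    best_params = genetic_search(sensation_error_cost, bounds=[(0.1, 10.0), (0.1, 10.0)])
    print("tuned cut-off frequencies:", best_params)

In the paper's setting the cost would instead come from simulating the washout filters and the vestibular model, but the search loop has the same shape.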

Relevance: 20.00%

Abstract:

A fast-growing, highly pigmented (orange) strain of thraustochytrid was isolated from New Zealand marine waters. This strain showed efficient utilisation of glycerol as a carbon source and produced a significant amount of dry cell biomass (2.08 g L⁻¹), TFA (30.15% of dry cell weight), DHA (27.83% of TFA) and astaxanthin (131.56 μg g⁻¹). Astaxanthin, an important antioxidant, is the dominant constituent in the carotenoid profile of Thraustochytrium sp. S7. Different cell disruption methods were applied for efficient astaxanthin extraction. Mechanical disruption of cells by ultrasonication gave the highest astaxanthin yield, increasing it from 26.78 ± 1.25 μg g⁻¹ to 156.07 ± 4.16 μg g⁻¹. Optimisation of the ultrasonication process using response surface methodology significantly decreased the lysis time from 30 min to 10 min. This strain can be used for the concurrent production of lipids and high-value co-products such as DHA and astaxanthin.
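To make the response-surface step concrete, here is a minimal sketch of the general technique: fit a second-order model to (sonication time, amplitude) versus yield and locate its stationary point. The design points, factor names and yields below are purely illustrative, not data from the study.

    import numpy as np

    # hypothetical design points: time (min), amplitude (%), and measured yield (ug/g)
    X = np.array([[10, 40], [10, 80], [20, 60], [30, 40],
                  [30, 80], [20, 60], [15, 60], [25, 60]], float)
    y = np.array([110.0, 128.0, 150.0, 120.0, 135.0, 152.0, 140.0, 145.0])

    t, a = X[:, 0], X[:, 1]
    # full quadratic model: 1, t, a, t^2, a^2, t*a
    A = np.column_stack([np.ones_like(t), t, a, t**2, a**2, t * a])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    # stationary point of the fitted surface: solve grad = 0 (a 2x2 linear system)
    b = coef[1:3]
    H = np.array([[2 * coef[3], coef[5]], [coef[5], 2 * coef[4]]])
    optimum = np.linalg.solve(H, -b)
    print("fitted optimum (time, amplitude):", optimum)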

Relevance: 20.00%

Abstract:

The majority of existing application profiling techniques aggregate and report performance costs by method or calling context. Modern large-scale object-oriented applications consist of thousands of methods with complex calling patterns. Consequently, when profiled, their performance costs tend to be thinly distributed across many thousands of locations, with few easily identifiable optimisation opportunities. However, experienced performance engineers know that there are repeated patterns of method calls in the execution of an application, induced by the libraries, design patterns and coding idioms used in the software. Automatically identifying and aggregating costs over these patterns of method calls allows us to identify opportunities to improve performance by optimising these patterns. We have developed an analysis technique that is able to identify the entry-point methods, which we call subsuming methods, of such patterns. Our offline analysis runs over previously collected runtime performance data structured in a calling context tree, as produced by a large number of existing commercial and open-source profilers. We have evaluated our approach on the DaCapo benchmark suite, showing that our analysis significantly reduces the size and complexity of the runtime performance data set, facilitating its comprehension and interpretation. We also demonstrate, with a collection of case studies, that our analysis identifies new optimisation opportunities that can lead to significant performance improvements (from 20% to over 50% improvement in our case studies).
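As a rough illustration of working over a calling context tree, the sketch below aggregates inclusive costs and flags children that dominate their parent's cost. This is not the paper's subsuming-methods analysis; the tree, the dominance threshold and the selection rule are assumptions made for the example.

    from dataclasses import dataclass, field

    @dataclass
    class CCTNode:
        method: str
        self_cost: float                      # exclusive cost recorded by the profiler
        children: list = field(default_factory=list)

    def inclusive_cost(node):
        return node.self_cost + sum(inclusive_cost(c) for c in node.children)

    def candidate_entry_points(node, threshold=0.8):
        """Return (method, inclusive cost) pairs for children that dominate their parent."""
        total = inclusive_cost(node)
        found = []
        for child in node.children:
            share = inclusive_cost(child) / total if total else 0.0
            if share >= threshold:
                found.append((child.method, inclusive_cost(child)))
            found.extend(candidate_entry_points(child, threshold))
        return found

    root = CCTNode("main", 1.0, [
        CCTNode("Parser.parse", 2.0, [CCTNode("Tokenizer.next", 40.0)]),
        CCTNode("Report.render", 5.0),
    ])
    print(candidate_entry_points(root))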

Relevance: 20.00%

Abstract:

Bayesian optimisation is an efficient technique for optimising functions that are expensive to compute. In this paper, we propose a novel framework that transfers knowledge from a completed source optimisation task to a new target task in order to overcome the cold-start problem. We model the source data as noisy observations of the target function, with the noise level computed from the data in a Bayesian setting. This enables flexible knowledge transfer across tasks with differing relatedness, addressing a limitation of existing methods. We evaluate the framework on tuning the hyperparameters of two machine learning algorithms. Treating a fraction of the training data as the source task and the whole dataset as the target task, we show that our method finds the best hyperparameters in the least amount of time compared with both the state-of-the-art and the no-transfer method.
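A minimal sketch of the transfer idea, assuming a simple RBF-kernel Gaussian-process surrogate: source observations enter the model with an inflated noise variance, target observations with near-zero noise. The test function, kernel, and fixed noise levels are illustrative assumptions (the paper infers the source noise level from data, and a full Bayesian optimisation loop would also use the posterior variance in an acquisition function).

    import numpy as np

    def rbf(a, b, length=0.3):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length) ** 2)

    def gp_posterior_mean(x_train, y_train, noise_var, x_test):
        K = rbf(x_train, x_train) + np.diag(noise_var)
        Ks = rbf(x_test, x_train)
        return Ks @ np.linalg.solve(K, y_train)

    f = lambda x: np.sin(3 * x)                            # assumed target function
    x_src = np.linspace(0, 1, 8)                           # completed source task
    y_src = f(x_src) + 0.3 * np.random.default_rng(1).normal(size=8)   # noisy/biased
    x_tgt = np.array([0.2, 0.7])                           # few expensive target evaluations
    y_tgt = f(x_tgt)

    x_all = np.concatenate([x_src, x_tgt])
    y_all = np.concatenate([y_src, y_tgt])
    # source points carry a larger noise variance than target points
    noise = np.concatenate([np.full(8, 0.3 ** 2), np.full(2, 1e-6)])

    x_grid = np.linspace(0, 1, 101)
    mu = gp_posterior_mean(x_all, y_all, noise, x_grid)
    print("next point to evaluate (posterior maximum):", x_grid[np.argmax(mu)])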

Relevance: 20.00%

Abstract:

In recent years, there have been studies of cardinality-constrained multi-cycle problems on directed graphs, some of which consider chains co-existing with cycles on the same digraph while others do not. These studies were inspired by the optimal matching of kidneys known as the Kidney Exchange Problem (KEP). In a KEP, a vertex of the digraph represents a donor-patient pair who are related but where the donor's kidney is incompatible with the patient. When there are multiple such incompatible pairs in the kidney exchange pool, the kidney of the donor in one incompatible pair may in fact be compatible with the patient of another incompatible pair. If Donor A's kidney is suitable for Patient B, and vice versa, then there are arcs in both directions between Vertex A and Vertex B, and such an exchange forms a 2-cycle; there may also be cycles involving three or more vertices. All exchanges in a kidney exchange cycle must take place simultaneously (otherwise a donor could drop out of the program once his or her partner has received a kidney from another donor), and for logistic and human-resource reasons only a limited number of exchanges can occur simultaneously, hence the cardinality of these cycles is constrained. In recent years, kidney exchange programs around the world have also included altruistic donors in the pool; a sequence of exchanges that starts from an altruistic donor forms a chain instead of a cycle. We therefore have two underlying combinatorial optimisation problems: the Cardinality-Constrained Multi-cycle Problem (CCMcP) and the Cardinality-Constrained Cycles and Chains Problem (CCCCP). The objective of the KEP is either to maximise the number of kidney matches or to maximise a weighted function of the matches. In the CCMcP a vertex can be in at most one cycle, whereas in the CCCCP a vertex can be part of at most one cycle or one chain. The cardinality of the cycles is constrained in all studies; the cardinality of the chains, however, is considered unconstrained in some studies, and in others is constrained either to exceed or to equal that of the cycles. Although the CCMcP has some similarities to the ATSP and VRP families of problems, there is a major difference: strong subtour elimination constraints are mostly invalid for the CCMcP, since smaller subtours are allowed as long as they do not exceed the size limit. The distinctive feature of the CCCCP is that it allows chains as well as cycles on the same directed graph. Hence both the CCMcP and the CCCCP are interesting and challenging combinatorial optimisation problems in their own right. Most existing studies have focused on solution methodologies and, as far as we are aware, there have been no polyhedral studies so far. In this paper, we study the polyhedral structure of the natural arc-based integer programming models of the CCMcP and the CCCCP, both of which contain exponentially many constraints. We do so to pave the way for strong valid cuts we have found, which can be applied in a Lagrangean relaxation-based branch-and-bound framework where, at each node of the branch-and-bound tree, we may be able to obtain a relaxation that can be solved in polynomial time, with the strong valid cuts dualised into the objective function and the dual multipliers optimised by subgradient optimisation.
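To make the cardinality-constrained cycle structure concrete, here is a small sketch that enumerates directed cycles up to a length limit in an assumed compatibility digraph and greedily selects vertex-disjoint ones. The graph and the greedy rule are illustrative only; the paper studies arc-based integer programming formulations rather than explicit cycle enumeration.

    from itertools import permutations

    arcs = {(0, 1), (1, 0), (1, 2), (2, 3), (3, 1), (3, 4), (4, 3)}   # assumed digraph
    n, max_len = 5, 3                                                  # cycle cardinality limit

    def cycles_up_to(n, arcs, max_len):
        found = set()
        for length in range(2, max_len + 1):
            for perm in permutations(range(n), length):
                if min(perm) != perm[0]:
                    continue                       # canonical rotation, avoids duplicates
                closed = list(zip(perm, perm[1:] + (perm[0],)))
                if all(a in arcs for a in closed):
                    found.add(perm)
        return found

    chosen, used = [], set()
    for cyc in sorted(cycles_up_to(n, arcs, max_len), key=len, reverse=True):
        if used.isdisjoint(cyc):
            chosen.append(cyc)
            used.update(cyc)
    print("selected exchange cycles:", chosen, "- pairs matched:", len(used))

Note that the greedy pass is not optimal: on this graph it matches three pairs, whereas choosing the 2-cycles (0, 1) and (3, 4) would match four, which is exactly why exact integer programming formulations and strong valid cuts are of interest.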

Relevance: 20.00%

Abstract:

Designing a rock bolt reinforcement system for an underground excavation involves determining the bolt pattern, spacing and size. In this paper, a topology optimisation technique is presented and employed to optimise these design variables simultaneously. To improve rock bolt design, the proposed technique minimises a displacement-based function around the opening after bolt installation. The optimisation technique is independent of the material model and can easily be applied to any material model for rock and bolts. It is also extremely flexible in that it can be combined with any mechanical analysis method. To illustrate the capabilities of the method, numerical examples with nonlinear material models and discontinuities in the host rock are presented. It is shown that the complexity of systems optimised using this approach is restricted only by the limitations of the method used to analyse the mechanical system response.

Relevance: 20.00%

Abstract:

In-silico optimisation of a two-dimensional high performance liquid chromatography (2D-HPLC) separation protocol has been developed for the interrogation of methamphetamine samples, including model, real-world seizure and laboratory-synthesised samples. The protocol used Drylab® software to rapidly identify the optimum separation conditions from a library of chromatography columns. The optimum separation space was provided by a Phenomenex Kinetex PFP column (first dimension) and an Agilent Poroshell 120 EC-C18 column (second dimension). To facilitate a rapid 2D-HPLC analysis, the particle-packed C18 column was replaced with a Phenomenex Onyx monolithic C18 column without sacrificing separation performance. The Drylab® optimised and experimental separations matched very closely, highlighting the robust nature of the HPLC simulations. The chemical information gained from an intermediate methamphetamine sample was significant and complemented that generated from a pure seizure sample. The influence of the two-dimensional separation on the analytical figures of merit was also investigated. The limits of detection for key analytes in the second dimension were determined as: methamphetamine (4.59 × 10⁻⁴ M), pseudoephedrine (4.03 × 10⁻⁴ M), caffeine (5.16 × 10⁻⁴ M), aspirin (9.32 × 10⁻⁴ M), paracetamol (5.93 × 10⁻⁴ M) and procaine (2.02 × 10⁻³ M).

Relevance: 20.00%

Abstract:

Evolutionary algorithms (EAs) have recently been suggested as candidates for solving big-data optimisation problems that involve a very large number of variables and need to be analysed in a short period of time. However, EAs face scalability issues when dealing with big-data problems. Moreover, the performance of EAs critically hinges on the parameter values and operator types used, so it is impossible to design a single EA that outperforms all others on every problem instance. To address these challenges, we propose a heterogeneous framework that integrates a cooperative co-evolution method with various types of memetic algorithms. The cooperative co-evolution method splits the big problem into sub-problems in order to increase the efficiency of the solving process; the sub-problems are then solved using various heterogeneous memetic algorithms. The proposed framework adaptively assigns, for each solution, different operators, parameter values and a local search algorithm to efficiently explore and exploit the search space of the given problem instance. The performance of the proposed algorithm is assessed using the Big Data 2015 competition benchmark problems, which contain data with and without noise. Experimental results demonstrate that the proposed algorithm performs better with the cooperative co-evolution method than without it. Furthermore, it obtained very competitive, and in some cases better, results on all tested instances compared with other algorithms, while using less computational time.
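A minimal sketch of the cooperative co-evolution decomposition, under stated assumptions: the sphere objective, the fixed variable grouping and the simple mutate-and-keep-improvements update stand in for the heterogeneous memetic solvers and adaptive operator assignment described above.

    import numpy as np

    rng = np.random.default_rng(0)
    dim, group_size, iters = 100, 10, 200
    objective = lambda x: float(np.sum(x ** 2))          # assumed test objective

    context = rng.uniform(-5, 5, dim)                    # shared best-so-far solution
    groups = [list(range(i, i + group_size)) for i in range(0, dim, group_size)]

    for _ in range(iters):
        for g in groups:                                 # optimise one sub-problem at a time
            trial = context.copy()
            trial[g] += rng.normal(0.0, 0.5, len(g))     # mutate only this group's variables
            if objective(trial) < objective(context):    # keep improvements (elitism)
                context = trial

    print("final objective:", objective(context))

In the full framework each group would be handled by a different memetic algorithm with its own operators and local search, but the shared context vector and round-robin structure are the same.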

Relevance: 20.00%

Abstract:

In empirical methods for the reinforcement design of underground excavations, an even distribution of rock bolts is generally recommended. This work shows that such a design is not necessarily optimal and demonstrates how state-of-the-art reinforcement design can be improved through topology optimisation techniques. The Bidirectional Evolutionary Structural Optimisation (BESO) method is extended to consider nonlinear material behaviour. An elastic, perfectly plastic Mohr-Coulomb model is utilised for both the original rock and the reinforced rock, and the external work along the tunnel wall is taken as the objective function. Various in-situ stress conditions with different horizontal stress ratios and different geostatic stress magnitudes are investigated through several examples. The outcomes show that the proposed approach is capable of improving tunnel reinforcement design. Moreover, the significant difference between the optimal reinforcement distributions obtained from linear and nonlinear analyses demonstrates the importance of realistic nonlinear material properties for the final outcome.
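A schematic BESO-style loop, for orientation only: the element sensitivities below are a hypothetical stand-in, whereas in the method described above they would come from a nonlinear Mohr-Coulomb finite-element analysis with the external work along the tunnel wall as the objective.

    import numpy as np

    n_elem = 200                       # candidate reinforcement elements around the opening
    target_fraction = 0.2              # fraction of elements allowed to carry rock bolts
    evolution_rate = 0.02              # change in reinforced volume per iteration

    def element_sensitivities(design):
        """Hypothetical stand-in for the FE sensitivity analysis: benefit of
        reinforcing each element, mildly dependent on the current design."""
        base = np.exp(-np.linspace(0, 4, n_elem))        # e.g. decays away from the wall
        return base * (1.0 + 0.1 * design)

    design = np.ones(n_elem)                             # start fully reinforced
    current_fraction = 1.0
    while current_fraction > target_fraction + 1e-9:
        current_fraction = max(target_fraction, current_fraction - evolution_rate)
        sens = element_sensitivities(design)
        keep = int(round(current_fraction * n_elem))
        design = np.zeros(n_elem)
        design[np.argsort(sens)[-keep:]] = 1.0           # keep the most beneficial elements

    print("reinforced elements:", int(design.sum()), "of", n_elem)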