259 results for Minimisation


Relevance:

10.00%

Publisher:

Abstract:

A numerically stable sequential primal-dual LP algorithm for reactive power optimisation (RPO) is presented in this article. The algorithm minimises the voltage stability index C2 [1] of all the load buses to improve the static voltage stability of the system. Real-time requirements, such as numerical stability and identification of the most effective subset of controllers (so that the number of controllers and their movement can be curtailed), are handled effectively by the proposed algorithm. The algorithm has a natural characteristic of selecting the most effective subset of controllers for improving the objective, and hence of curtailing insignificant controllers. Comparison with a transmission loss minimisation objective indicates that the most effective subset of controllers, and their solution, identified by the static voltage stability improvement objective are not the same as those identified by the loss minimisation objective. The proposed algorithm is suitable for real-time application to the improvement of system static voltage stability.
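
As a rough sketch of the kind of linearised subproblem a sequential LP for reactive power optimisation solves at each iteration (a generic textbook formulation assumed here for illustration, not the exact model of the paper), the first-order change in the stability index is minimised subject to control and voltage limits:

\[
\min_{\Delta u} \; \sum_{j} \frac{\partial C_2}{\partial u_j}\,\Delta u_j
\quad\text{subject to}\quad
\Delta u^{\min} \le \Delta u \le \Delta u^{\max}, \qquad
V^{\min} \le V_0 + S\,\Delta u \le V^{\max},
\]

where u collects the reactive power controls (generator voltage set-points, transformer taps, switchable shunts), V_0 is the current load-bus voltage profile and S is the voltage-to-control sensitivity matrix. In such a formulation, controls that cannot improve the objective naturally receive zero movement, which is consistent with the subset-selection behaviour described above.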

Relevance:

10.00%

Publisher:

Abstract:

In systems biology, questions concerning the molecular and cellular makeup of an organism are of utmost importance, especially when trying to understand how unreliable components (genetic circuits, biochemical cascades, and ion channels, among others) enable reliable and adaptive behaviour. The repertoire and speed of biological computations are limited by thermodynamic or metabolic constraints: an example can be found in neurons, where fluctuations in biophysical states limit the information they can encode, with roughly 20-60% of the brain's total energy budget used for signalling purposes, either via action potentials or via synaptic transmission. Here, we consider the imperatives for neurons to optimise computational and metabolic efficiency, wherein benefits and costs trade off against each other in the context of self-organised and adaptive behaviour. In particular, we try to link information-theoretic (variational) and thermodynamic (Helmholtz) free-energy formulations of neuronal processing and show how they are related in a fundamental way through a complexity minimisation lemma.
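
For reference, the link between the information-theoretic and thermodynamic views rests on the standard decomposition of variational free energy into complexity and accuracy terms (generic notation, not taken from the paper):

\[
F \;=\; \underbrace{D_{\mathrm{KL}}\big[\,q(\theta)\,\|\,p(\theta)\,\big]}_{\text{complexity}}
\;-\; \underbrace{\mathbb{E}_{q(\theta)}\big[\ln p(y \mid \theta)\big]}_{\text{accuracy}},
\]

so minimising free energy rewards accurate predictions of sensory data y while penalising complex, and hence metabolically costly, posterior representations q(θ).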

Relevance:

10.00%

Publisher:

Abstract:

The use of L1 regularisation for sparse learning has generated immense research interest, with successful application in areas as diverse as signal acquisition, image coding, genomics and collaborative filtering. While existing work highlights the many advantages of L1 methods, in this paper we find that L1 regularisation often dramatically underperforms, in terms of predictive performance, when compared with other methods for inferring sparsity. We focus on unsupervised latent variable models, and develop L1-minimising factor models, Bayesian variants of "L1", and Bayesian models with a stronger L0-like sparsity induced through spike-and-slab distributions. These spike-and-slab Bayesian factor models encourage sparsity while accounting for uncertainty in a principled manner and avoiding unnecessary shrinkage of non-zero values. We demonstrate on a number of data sets that, in practice, spike-and-slab Bayesian methods outperform L1 minimisation, even on a fixed computational budget. We thus highlight the need to re-assess the wide use of L1 methods in sparsity-reliant applications, particularly when we care about generalising to previously unseen data, and provide an alternative that, over many varying conditions, provides improved generalisation performance.
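
To make the contrast concrete (generic forms only, not the authors' exact factor models), the L1 penalty corresponds to a Laplace prior on each weight, whereas the spike-and-slab prior mixes a point mass at zero with a continuous slab:

\[
p_{L_1}(w_j) \propto \exp(-\lambda\,|w_j|),
\qquad
p_{\mathrm{ss}}(w_j) = (1-\pi)\,\delta(w_j) + \pi\,\mathcal{N}(w_j;\,0,\,\sigma^2),
\]

so the spike-and-slab model can place weights exactly at zero with probability 1-π while only weakly shrinking the weights that fall in the slab, avoiding the uniform shrinkage that the L1 penalty applies to all non-zero values.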

Relevance:

10.00%

Publisher:

Abstract:

Effective use of materials is one possible component of a sustainable manufacturing strategy. Many such strategies are proposed in the literature and used in practice, with confusion over what they are, what the differences among them may be, and how practitioners can use them in design and manufacture to improve the sustainability of their products and processes. This paper reviews the literature on sustainable manufacturing strategies that deliver improved material performance. Four primary strategies were found: waste minimisation; material efficiency; resource efficiency; and eco-efficiency. The literature was analysed to determine the key characteristics of these sustainable manufacturing strategies, and 17 such characteristics were identified. The four strategies were then compared and contrasted against all of the characteristics. While the current literature often uses these strategy titles in a confusing, occasionally interchangeable manner, this study attempts to create clear separation between them. Definition, scope and practicality of measurement are shown to be key characteristics that affect the ability of manufacturing companies to make effective use of the proposed strategy. It is observed that the most actionable strategies may not include all of the dimensions of interest to a manufacturer wishing to become more sustainable, creating a dilemma between ease of implementation and breadth of impact. © 2008 Taylor & Francis.

Relevance:

10.00%

Publisher:

Abstract:

In this article, we detail the methodology developed to construct arbitrarily high-order schemes (linear and WENO) on 3D mixed-element unstructured meshes made up of general convex polyhedral elements. The approach is tailored specifically to the solution of scalar level set equations for application to incompressible two-phase flow problems. The construction of WENO schemes on 3D unstructured meshes is notoriously difficult, as it involves a much higher level of complexity than 2D approaches. This is due to the multiplicity of geometrical considerations introduced by the extra dimension, especially on mixed-element meshes. We have therefore developed a number of algorithms specifically to handle mixed-element meshes composed of convex polyhedra with convex polygonal faces. The contribution of this work concerns several areas of interest: the formulation of an improved methodology in 3D, the minimisation of computational runtime in the implementation through the maximum use of pre-processing operations, the generation of novel methods to handle complex 3D mixed-element meshes and, finally, the application of the method to the transport of a scalar level set. © 2012 Global-Science Press.
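
For orientation, the scalar level set transport targeted by the schemes is the standard advection equation:

\[
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = 0,
\]

where φ is the level set function whose zero contour marks the interface between the two phases and u is the velocity field of the incompressible flow; the high-order (linear or WENO) reconstruction supplies the spatial discretisation of this equation on the polyhedral mesh.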

Relevance:

10.00%

Publisher:

Abstract:

It is in the interests of everybody that the environment is protected. In view of the recent leaps in environmental awareness it would seem timely and sensible, therefore, for people to pool vehicle resources to minimise the damaging impact of emissions. However, this is often contrary to how complex social systems behave: local decisions made by self-interested individuals often have emergent effects that are in the interests of nobody. For software engineers, a major challenge is to help facilitate individual decision-making such that individual preferences can be met which, when accumulated, minimise adverse effects at the level of the transport system. We introduce this general problem through a concrete example based on vehicle-sharing. Firstly, we outline the kind of complex transportation problem that is directly addressed by our technology (CO2y™, pronounced “cosy”), and also show how this differs from other, more basic software solutions. The CO2y™ architecture is then briefly introduced. We outline the practical advantages of the advanced, intelligent software technology that is designed to satisfy a number of individual preference criteria and thereby find appropriate matches within a population of vehicle-share users. An example scenario of use is put forward: the minimisation of grey fleets within a medium-sized company. Here we comment on some of the underlying assumptions of the scenario, and how in a detailed real-world situation such assumptions might differ between companies and individual users. Finally, we summarise the paper and conclude by outlining how the problem of pooled transportation is likely to benefit from the further application of emergent, nature-inspired computing technologies. These technologies allow systems-level behaviour to be optimised with explicit representation of individual actors. With these techniques we hope to make real progress in facing the complexity challenges that transportation problems produce.

Relevance:

10.00%

Publisher:

Abstract:

Buildings consume 40% of Ireland's total annual energy, translating to 3.5 billion (2004). The EPBD directive (effective January 2003) places an onus on all member states to rate the energy performance of all buildings in excess of 50 m². Energy and environmental performance management systems do not exist for residential buildings and, for non-residential buildings, consist of an ad hoc integration of wired building management systems and Monitoring & Targeting systems. These systems are unsophisticated and do not easily lend themselves to cost-effective retrofit or to integration with other enterprise management systems. It is commonly agreed that a 15-40% reduction in building energy consumption is achievable by operating buildings efficiently, when compared with typical practice. Existing research has identified that the level of information available to building managers from existing Building Management Systems and Environmental Monitoring Systems (BMS/EMS) is insufficient to perform the required performance-based building assessment. The cost of installing additional sensors and meters is extremely high, primarily because of the cost of wiring and the associated labour. From this perspective, wireless sensor technology can deliver reliable sensor data at the temporal and spatial granularity required for building energy management. In this paper, a wireless sensor network mote hardware design and implementation is presented for a building energy management application. Appropriate sensors were selected and interfaced with the developed system, based on user requirements, to meet both the building monitoring and metering requirements. Besides the sensing capability, actuation and interfacing to external meters and sensors are provided, to perform the management control and data recording tasks associated with minimisation of energy consumption in the built environment and with the development of appropriate Building Information Models (BIM) to enable the design and development of energy-efficient spaces.

Relevance:

10.00%

Publisher:

Abstract:

For Part I, see ibid., vol. 1, p. 301 (1985). In the first part of this work a general definition of an inverse problem with discrete data was given and an analysis in terms of singular systems was performed. The problem of the numerical stability of the solution, which was only briefly discussed in that paper, is the main topic of this second part. When the condition number of the problem is too large, a small error in the data can produce an extremely large error in the generalised solution, which therefore has no physical meaning. The authors review most of the methods that have been developed for overcoming this difficulty, including numerical filtering, Tikhonov regularisation, iterative methods, the Backus-Gilbert method and so on. Regularisation methods for the stable approximation of generalised solutions obtained through minimisation of suitable seminorms (C-generalised solutions), such as the method of Phillips (1962), are also considered.
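
As a concrete example of the regularisation methods listed above, Tikhonov regularisation in its simplest form replaces the ill-conditioned least-squares problem by a penalised one:

\[
x_\lambda = \arg\min_x \;\|A x - y\|^2 + \lambda\,\|x\|^2
\;=\; (A^{*}A + \lambda I)^{-1} A^{*} y,
\]

where the parameter λ > 0 trades fidelity to the data y against the norm of the solution, thereby damping the error amplification associated with a large condition number.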

Relevance:

10.00%

Publisher:

Abstract:

The central product of the DRAMA (Dynamic Re-Allocation of Meshes for parallel Finite Element Applications) project is a library comprising a variety of tools for dynamic re-partitioning of unstructured finite element (FE) applications. The input to the DRAMA library is the computational mesh, with its corresponding costs, partitioned into sub-domains. The core library functions then perform a parallel computation of a mesh re-allocation that will re-balance the costs based on the DRAMA cost model. We discuss the basic features of this cost model, which allows a general approach to load identification, modelling and imbalance minimisation. Results from crash simulations are presented which show the necessity for multi-phase/multi-constraint partitioning components.
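
As a rough illustration of the quantity such a re-partitioning aims to control (a generic imbalance metric with made-up numbers, not the DRAMA cost model itself):

    # Illustrative load-imbalance metric over the sub-domains of a partitioned mesh.
    # A value of 1.0 means the per-partition costs are perfectly balanced.
    def imbalance(costs):
        average = sum(costs) / len(costs)
        return max(costs) / average

    costs = [120.0, 95.0, 180.0, 105.0]   # hypothetical per-sub-domain costs
    print(f"imbalance factor: {imbalance(costs):.2f}")   # 1.44 for these numbers

A dynamic re-allocation step would then move work from the most loaded sub-domain towards the others until this factor approaches 1.0.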

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a method for interpolation over a set of retrieved cases in the adaptation phase of the case-based reasoning cycle. The method has two advantages over traditional systems: the first is that it can predict “new” instances, not yet present in the case base; the second is that it can predict solutions not present in the retrieval set. The method is a generalisation of Shepard's interpolation method, formulated as the minimisation of an error function defined in terms of distance metrics in the solution and problem spaces. We term the algorithm the Generalised Shepard Nearest Neighbour (GSNN) method. A novel aspect of GSNN is that it provides a general method for interpolation over nominal solution domains. The method is illustrated in the paper with reference to the Irises classification problem. It is evaluated with reference to a simulated nominal-value test problem, and to a benchmark case base from the travel domain. The algorithm is shown to outperform conventional nearest neighbour methods on these problems. Finally, GSNN is shown to improve in efficiency when used in conjunction with a diverse retrieval algorithm.
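
To convey the idea of interpolation over a nominal solution domain (a minimal sketch in the spirit of GSNN; the distance functions, cases and candidate solutions below are hypothetical, and this is not the authors' implementation):

    # Shepard-style interpolation generalised to a nominal solution space:
    # predict the candidate solution that minimises a distance-weighted error
    # over the retrieved cases.
    def shepard_weights(query, problems, dist, p=2):
        raw = [1.0 / (dist(query, x) ** p + 1e-12) for x in problems]
        total = sum(raw)
        return [w / total for w in raw]

    def gsnn_predict(query, cases, dist_prob, dist_sol, candidates):
        problems, solutions = zip(*cases)
        weights = shepard_weights(query, problems, dist_prob)

        def error(s):
            return sum(w * dist_sol(s, si) ** 2
                       for w, si in zip(weights, solutions))

        return min(candidates, key=error)

    # Toy example: numeric problem space, colour-valued (nominal) solutions.
    cases = [(1.0, "red"), (2.0, "red"), (5.0, "blue")]
    prediction = gsnn_predict(
        3.0, cases,
        dist_prob=lambda a, b: abs(a - b),
        dist_sol=lambda a, b: 0.0 if a == b else 1.0,
        candidates=["red", "blue", "green"],
    )
    print(prediction)   # "red" for this toy data

Because the minimisation runs over an explicit candidate set, the prediction is not restricted to solutions that appear in the retrieved cases.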

Relevance:

10.00%

Publisher:

Abstract:

The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameter models with universal approximation capabilities has been intensively studied and widely used, owing to the availability of many linear-learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameter models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data only. The important concepts in achieving good model generalisation used in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex optimisation-based model construction algorithms, including support vector regression algorithms, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
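
For orientation, the linear-in-the-parameter model class discussed above takes the generic form (illustrative notation):

\[
y(k) \;=\; \sum_{i=1}^{M} \theta_i\, \phi_i\big(x(k)\big) + e(k),
\]

where the basis functions φ_i (for example, kernels centred on training inputs) may be non-linear in the input x, but the output is linear in the parameters θ_i; it is this linearity in the parameters that permits linear-learning algorithms with known convergence conditions and makes model selection, i.e. choosing which of the M candidate terms to retain, the central problem.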

Relevance:

10.00%

Publisher:

Abstract:

The coefficients of an echo canceller with a near-end section and a far-end section are usually updated with the same updating scheme, such as the LMS algorithm. A novel scheme is proposed for echo cancellation that is based on the minimisation of two different cost functions, i.e. one for the near-end section and a different one for the far-end section. The approach considered leads to a substantial improvement in performance over the LMS algorithm when it is applied to both sections of the echo canceller. The convergence properties of the algorithm are derived. The proposed scheme is also shown to be robust to noise variations. Simulation results confirm the superior performance of the new algorithm.
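
For context, a plain LMS update, the baseline that both sections of the canceller would share under the conventional scheme, looks as follows (a self-contained sketch with synthetic signals; the proposed two-cost-function near-end/far-end scheme is not reproduced here):

    import numpy as np

    # Conventional LMS adaptive filter used as an echo-canceller baseline.
    def lms(x, d, n_taps=16, mu=0.01):
        w = np.zeros(n_taps)                 # estimated echo-path coefficients
        e = np.zeros(len(x))                 # residual error signal
        for n in range(n_taps, len(x)):
            u = x[n - n_taps + 1 : n + 1][::-1]   # x[n], x[n-1], ..., x[n-n_taps+1]
            e[n] = d[n] - w @ u                   # desired signal minus echo replica
            w += mu * e[n] * u                    # LMS coefficient update
        return w, e

    rng = np.random.default_rng(0)
    x = rng.standard_normal(5000)                   # far-end signal (white-noise stand-in)
    h = np.r_[0.5, 0.3, -0.2, np.zeros(13)]         # hypothetical echo path
    d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
    w, e = lms(x, d)                                # w converges towards h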

Relevance:

10.00%

Publisher:

Abstract:

With reference to the Kosovo war, we examined how the (un-)justness of military intervention is cognitively constructed. Four types of reinterpretation were hypothesized to relate to positive evaluation of the intervention: minimisation of negative consequences of NATO's intervention, denial of responsibility of the Western countries for the war, blame of Yugoslavia, and justification of the intervention through positive motives. As determinants of evaluation of the war, belief in a just world, militarism-pacifism, authoritarianism, and diffuse political support were taken into account. Hypotheses were tested with 165 university students using structural equation modelling. Consistent with our assumptions, the four types of reinterpretation related strongly to positive evaluation of the intervention, showing their relevance with regard to military intervention. Further, the assessed political attitudes influenced evaluation of the war while, contrary to predictions, belief in a just world did not. The causal status of the reinterpretations and the interplay of belief in a just world and political attitudes are discussed.

Relevance:

10.00%

Publisher:

Abstract:

Background: In the Medical Research Council (MRC) COIN trial, the epidermal growth factor receptor (EGFR)-targeted antibody cetuximab was added to standard chemotherapy in first-line treatment of advanced colorectal cancer with the aim of assessing effect on overall survival.
Methods: In this randomised controlled trial, patients who were fit for but had not received previous chemotherapy for advanced colorectal cancer were randomly assigned to oxaliplatin and fluoropyrimidine chemotherapy (arm A), the same combination plus cetuximab (arm B), or intermittent chemotherapy (arm C). The choice of fluoropyrimidine therapy (capecitabine or infused fluorouracil plus leucovorin) was decided before randomisation. Randomisation was done centrally (via telephone) by the MRC Clinical Trials Unit using minimisation. Treatment allocation was not masked. The comparison of arms A and C is described in a companion paper. Here, we present the comparison of arms A and B, for which the primary outcome was overall survival in patients with KRAS wild-type tumours. Analysis was by intention to treat. Further analyses with respect to NRAS, BRAF, and EGFR status were done. The trial is registered, ISRCTN27286448.
Findings: 1630 patients were randomly assigned to treatment groups (815 to standard therapy and 815 to addition of cetuximab). Tumour samples from 1316 (81%) patients were used for somatic molecular analyses; 565 (43%) had KRAS mutations. In patients with KRAS wild-type tumours (arm A, n=367; arm B, n=362), overall survival did not differ between treatment groups (median survival 17·9 months [IQR 10·3—29·2] in the control group vs 17·0 months [9·4—30·1] in the cetuximab group; HR 1·04, 95% CI 0·87—1·23, p=0·67). Similarly, there was no effect on progression-free survival (8·6 months [IQR 5·0—12·5] in the control group vs 8·6 months [5·1—13·8] in the cetuximab group; HR 0·96, 0·82—1·12, p=0·60). Overall response rate increased from 57% (n=209) with chemotherapy alone to 64% (n=232) with addition of cetuximab (p=0·049). Grade 3 and higher skin and gastrointestinal toxic effects were increased with cetuximab (14 vs 114 and 67 vs 97 patients in the control group vs the cetuximab group with KRAS wild-type tumours, respectively). Overall survival differs by somatic mutation status irrespective of treatment received: BRAF mutant, 8·8 months (IQR 4·5—27·4); KRAS mutant, 14·4 months (8·5—24·0); all wild-type, 20·1 months (11·5—31·7).
Interpretation: This trial has not confirmed a benefit of addition of cetuximab to oxaliplatin-based chemotherapy in first-line treatment of patients with advanced colorectal cancer. Cetuximab increases response rate, with no evidence of benefit in progression-free or overall survival in KRAS wild-type patients or even in patients selected by additional mutational analysis of their tumours. The use of cetuximab in combination with oxaliplatin and capecitabine in first-line chemotherapy in patients with widespread metastases cannot be recommended.