988 results for Parameter Optimization
Abstract:
In this paper, we propose a novel online hidden Markov model (HMM) parameter estimator based on Kerridge inaccuracy rate (KIR) concepts. Under mild identifiability conditions, we prove that our online KIR-based estimator is strongly consistent. In simulation studies, we illustrate the convergence behaviour of our proposed online KIR-based estimator and provide a counter-example illustrating the local convergence properties of the well-known recursive maximum likelihood estimator (arguably the best existing solution).
Abstract:
This thesis presents a multi-criteria optimisation study of group replacement schedules for water pipelines, which is a capital-intensive and service-critical decision. A new mathematical model was developed, which minimises total replacement costs while maintaining a satisfactory level of service. The research outcomes are expected to enrich the body of knowledge of multi-criteria decision optimisation where group scheduling is required. The model has the potential to optimise replacement planning for other types of linear asset networks, resulting in bottom-line benefits for end users and communities. The results of a real case study show that the new model can effectively reduce total costs and service interruptions.
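The thesis's mathematical model is not given in the abstract above. Purely as an illustration of the kind of multi-criteria trade-off it describes, the sketch below scores a hypothetical group-replacement schedule by a weighted sum of replacement cost (with an assumed saving when pipes are grouped) and service interruption; all data, weights, and names are placeholders, not the actual model.

```python
# Illustrative sketch only: weighted-sum scoring of a group replacement
# schedule. Data structures, discounts, and weights are hypothetical.
from itertools import groupby

# Each pipe: (pipe_id, replacement_year, individual_cost, customers_affected)
pipes = [
    ("P1", 2026, 120_000, 300),
    ("P2", 2026, 90_000, 150),
    ("P3", 2028, 150_000, 500),
]

GROUP_DISCOUNT = 0.15          # assumed saving when pipes are replaced together
W_COST, W_SERVICE = 0.7, 0.3   # assumed criterion weights

def schedule_score(pipes):
    """Weighted sum of total replacement cost and service interruption."""
    total_cost = 0.0
    total_interruption = 0.0
    by_year = sorted(pipes, key=lambda p: p[1])
    for year, group in groupby(by_year, key=lambda p: p[1]):
        group = list(group)
        cost = sum(p[2] for p in group)
        if len(group) > 1:      # grouping pipes shares mobilisation cost
            cost *= (1.0 - GROUP_DISCOUNT)
        total_cost += cost
        # one interruption event per group, weighted by customers affected
        total_interruption += max(p[3] for p in group)
    return W_COST * total_cost + W_SERVICE * total_interruption

print(schedule_score(pipes))
```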
Abstract:
The wide applicability of correlation analysis inspired the development of this paper, in which a new correlated modified particle swarm optimization (COM-PSO) algorithm is developed. A Correlation Adjustment algorithm is proposed to recover the correlation between the considered variables of all particles at each iteration. It is shown that the best solution, the mean and standard deviation of the solutions over multiple runs, and the convergence speed improve when the correlation between the variables is increased; however, for some rotated benchmark functions, the opposite results are obtained. Moreover, the best solution and the mean and standard deviation of the solutions improve when the number of correlated variables of the benchmark functions is increased. The simulation results and convergence performance are compared with those of the original PSO, and the improvement in results, the convergence speed, and the ability of COM-PSO to simulate correlated phenomena are discussed in light of the experimental results.
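The abstract does not specify the Correlation Adjustment algorithm. One standard way to impose a target correlation on a set of particle positions is a Cholesky-based re-correlation, sketched below as an assumption of what such a step might look like rather than the authors' actual procedure.

```python
import numpy as np

def recorrelate(positions, target_corr):
    """Impose an approximate target correlation structure on particle
    positions (one row per particle). This Cholesky-based adjustment is an
    illustrative stand-in for the paper's Correlation Adjustment step."""
    mean = positions.mean(axis=0)
    std = positions.std(axis=0) + 1e-12
    z = (positions - mean) / std                      # standardise each dimension
    # whiten the current sample, then colour it with the target correlation
    cur_corr = np.corrcoef(z, rowvar=False)
    L_cur = np.linalg.cholesky(cur_corr + 1e-9 * np.eye(cur_corr.shape[0]))
    L_tgt = np.linalg.cholesky(target_corr)
    z_adj = z @ np.linalg.inv(L_cur).T @ L_tgt.T
    return z_adj * std + mean

rng = np.random.default_rng(0)
swarm = rng.normal(size=(30, 2))                      # 30 particles, 2 variables
target = np.array([[1.0, 0.8], [0.8, 1.0]])           # desired correlation of 0.8
adjusted = recorrelate(swarm, target)
print(np.corrcoef(adjusted, rowvar=False))            # close to the target matrix
```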
Abstract:
The K-means algorithm is one of the most popular clustering techniques. Nevertheless, its performance depends highly on the initial cluster centers, and it may converge to local minima. This paper proposes a hybrid evolutionary-programming-based clustering algorithm, called PSO-SA, that combines particle swarm optimization (PSO) and simulated annealing (SA). The basic idea is to search around the global solution with SA and to increase the information exchange among particles using a mutation operator to escape local optima. Three datasets, Iris, Wisconsin Breast Cancer, and Ripley's Glass, are considered to show the effectiveness of the proposed clustering algorithm in providing optimal clusters. The simulation results show that the PSO-SA clustering algorithm not only produces better solutions but also converges more quickly than the K-means, PSO, and SA algorithms.
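The following is a minimal sketch of the hybrid loop the abstract outlines: standard PSO velocity and position updates over candidate centroid sets, a small mutation for diversity, and a simulated-annealing search around the best solution. The objective, parameters, and toy data are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
# three well-separated 2-D blobs as a stand-in dataset
data = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in ((0, 0), (3, 3), (0, 3))])
K, N_PART, ITERS = 3, 20, 100

def sse(centroids):
    """Clustering objective: total squared distance to the nearest centroid."""
    d = ((data[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).sum()

pos = rng.uniform(data.min(0), data.max(0), size=(N_PART, K, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([sse(p) for p in pos])
g, g_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()
sa_cur, sa_cur_f, temp = g.copy(), g_f, 1.0

for _ in range(ITERS):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)  # PSO update
    pos = pos + vel
    mutate = rng.random(N_PART) < 0.1            # mutation to preserve diversity
    pos[mutate] += rng.normal(0, 0.2, size=pos[mutate].shape)
    f = np.array([sse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    if pbest_f.min() < g_f:
        g, g_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()
    # simulated-annealing search around the current SA solution
    cand = sa_cur + rng.normal(0, temp * 0.3, size=sa_cur.shape)
    cand_f = sse(cand)
    if cand_f < sa_cur_f or rng.random() < np.exp((sa_cur_f - cand_f) / max(temp, 1e-9)):
        sa_cur, sa_cur_f = cand, cand_f
    if sa_cur_f < g_f:
        g, g_f = sa_cur.copy(), sa_cur_f
    temp *= 0.97

print("best clustering SSE:", round(float(g_f), 3))
```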
Abstract:
This paper presents a new hybrid evolutionary algorithm based on Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) for daily Volt/Var control in distribution systems including Distributed Generators (DGs). Due to the small X/R ratio and radial configuration of distribution systems, DGs have a significant impact on this problem. Since DGs are owned by independent power producers or private owners, a price-based methodology is proposed as a proper signal to encourage DG owners to participate in active power generation. In general, daily Volt/Var control is a nonlinear optimization problem. Therefore, an efficient hybrid evolutionary method based on PSO and ACO, called HPSO, is proposed to determine the active power values of DGs, the reactive power values of capacitors, and the tap positions of transformers for the next day. The feasibility of the proposed algorithm is demonstrated and compared with methods based on the original PSO, ACO, and GA algorithms on the IEEE 34-bus distribution feeder.
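The decision variables named in the abstract (DG active power, capacitor reactive power, transformer taps, per hour) suggest the kind of encoding and fitness evaluation sketched below. The loss surrogate, DG price, and all ranges are placeholders; a real implementation would evaluate each candidate with a power flow, and the PSO/ACO search itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
HOURS, N_DG, N_CAP = 24, 2, 3
TAP_POSITIONS = np.arange(-8, 9)      # hypothetical LTC tap range
DG_PRICE = 0.06                       # assumed price signal paid to DG owners

def decode(x):
    """Split a flat decision vector into hourly DG active power, capacitor
    steps, and transformer taps (this encoding is illustrative only)."""
    x = x.reshape(HOURS, N_DG + N_CAP + 1)
    p_dg = np.clip(x[:, :N_DG], 0, 1)                           # p.u. output
    q_cap = np.rint(np.clip(x[:, N_DG:N_DG + N_CAP], 0, 4))     # discrete steps
    tap = TAP_POSITIONS[np.clip(x[:, -1].astype(int), 0, len(TAP_POSITIONS) - 1)]
    return p_dg, q_cap, tap

def fitness(x, base_load):
    """Placeholder cost: a made-up loss surrogate plus price-based DG payments.
    A real evaluation would run a power flow for every hour."""
    p_dg, q_cap, _tap = decode(x)
    net_load = base_load - p_dg.sum(axis=1)
    losses = 0.02 * net_load ** 2 - 0.001 * q_cap.sum(axis=1)   # surrogate only
    payment = DG_PRICE * p_dg.sum() * 1000
    return losses.clip(min=0).sum() + payment

base_load = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, HOURS))
x0 = rng.random(HOURS * (N_DG + N_CAP + 1)) * 4                 # random candidate
print(fitness(x0, base_load))
```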
Abstract:
This paper presents a new algorithm based on honey-bee mating optimization (HBMO) to estimate harmonic state variables in distribution networks including distributed generators (DGs). The proposed algorithm estimates both the amplitude and the phase of each harmonic by minimizing the error between the values measured by phasor measurement units (PMUs) and the values computed from the estimated parameters during the estimation process. Simulation results on two distribution test systems demonstrate that the proposed distribution harmonic state estimation (DHSE) algorithm is fast, accurate, and efficient in comparison with conventional algorithms such as weighted least squares (WLS), the genetic algorithm (GA), and tabu search (TS).
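As a sketch of the estimation objective described above, the snippet below computes a weighted squared error between PMU-measured harmonic phasors and phasors reconstructed from estimated amplitude/phase pairs. Array shapes, weights, and the toy data are assumptions, not the paper's formulation.

```python
import numpy as np

def dhse_objective(est, measured, weights):
    """Weighted squared error between PMU-measured harmonic phasors and the
    phasors reconstructed from the estimated (amplitude, phase) state. The
    shapes and weighting are illustrative, not the paper's formulation."""
    amp, phase = est[..., 0], est[..., 1]
    computed = amp * np.exp(1j * phase)       # complex phasor per harmonic
    err = np.abs(measured - computed) ** 2
    return float((weights * err).sum())

# toy example: 3 measurement points x 5 harmonic orders
rng = np.random.default_rng(3)
true_amp = rng.uniform(0.1, 1.0, size=(3, 5))
true_phase = rng.uniform(-np.pi, np.pi, size=(3, 5))
measured = true_amp * np.exp(1j * true_phase)
guess = np.stack([true_amp * 1.05, true_phase + 0.02], axis=-1)  # perturbed state
print(dhse_objective(guess, measured, weights=np.ones((3, 5))))
```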
Abstract:
This paper presents an efficient algorithm for multi-objective distribution feeder reconfiguration based on a Modified Honey Bee Mating Optimization (MHBMO) approach. The main objectives of distribution feeder reconfiguration (DFR) are to minimize the real power loss and the deviation of the nodes' voltages. Because these objectives are different and not commensurable, it is difficult to solve the problem with conventional approaches that optimize a single objective, so a metaheuristic algorithm is applied. The paper describes the full algorithm and the objective functions considered. Simulation results on a 32-bus distribution system are presented and show the high accuracy and optimization capability of the proposed algorithm in power loss minimization.
Abstract:
A long query provides more useful hints for retrieving relevant documents, but it is also likely to introduce noise that hurts retrieval performance. To mitigate this adverse effect, it is important to reduce noisy terms and to introduce and boost additional relevant terms. This paper presents a comprehensive framework, called the Aspect Hidden Markov Model (AHMM), which integrates query reduction and expansion for retrieval with long queries. It optimizes the probability distribution of query terms by utilizing intra-query term dependencies as well as the relationships between query terms and words observed in relevance feedback documents. Empirical evaluation on three large-scale TREC collections demonstrates that our approach, which is fully automatic, achieves salient improvements over various strong baselines and reaches performance comparable to a state-of-the-art method based on users' interactive query term reduction and expansion.
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept for developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state-space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates our knowledge of the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
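The conditioning step mentioned above relies on Kalman smoothing of a linear state-space prior. The snippet below is a minimal scalar Kalman filter plus Rauch-Tung-Striebel smoother, included only to illustrate that step; the GASP innovation terms and the rest of the emulator machinery are omitted.

```python
import numpy as np

def kalman_smooth(y, a, q, r, m0=0.0, p0=1.0):
    """Kalman filter followed by a Rauch-Tung-Striebel smoother for the scalar
    model x_t = a*x_{t-1} + w_t, y_t = x_t + v_t, with w~N(0,q), v~N(0,r).
    This sketches only the smoothing used to condition a state-space prior."""
    n = len(y)
    m_f, p_f = np.zeros(n), np.zeros(n)        # filtered means / variances
    m_p, p_p = np.zeros(n), np.zeros(n)        # one-step predictions
    m, p = m0, p0
    for t in range(n):
        m_p[t], p_p[t] = a * m, a * a * p + q  # predict
        k = p_p[t] / (p_p[t] + r)              # Kalman gain
        m = m_p[t] + k * (y[t] - m_p[t])       # update with observation
        p = (1 - k) * p_p[t]
        m_f[t], p_f[t] = m, p
    m_s, p_s = m_f.copy(), p_f.copy()          # backward smoothing pass
    for t in range(n - 2, -1, -1):
        c = p_f[t] * a / p_p[t + 1]
        m_s[t] = m_f[t] + c * (m_s[t + 1] - m_p[t + 1])
        p_s[t] = p_f[t] + c * c * (p_s[t + 1] - p_p[t + 1])
    return m_s, p_s

rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(0, 0.1, 100))         # slowly varying "true" state
y = x + rng.normal(0, 0.3, 100)                # noisy observations
m_s, p_s = kalman_smooth(y, a=1.0, q=0.01, r=0.09)
print("smoothed RMSE:", round(float(np.sqrt(((m_s - x) ** 2).mean())), 3))
```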
Abstract:
This paper presents a novel algorithm based on particle swarm optimization (PSO) to estimate the states of electric distribution networks. To improve the performance, accuracy, and convergence speed of the original PSO and to eliminate its stagnation effect, a secondary PSO loop, a mutation operator, and a stretching function are proposed. To account for load uncertainties in distribution networks, pseudo-measurements are modeled as loads with realistic errors. Simulation results on 6-bus radial and IEEE 34-bus test distribution networks show that distribution state estimation based on the proposed DLM-PSO yields lower estimation error and standard deviation than algorithms such as WLS, GA, HBMO, and the original PSO.
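The abstract mentions a stretching function for escaping stagnation. A common "function stretching" transformation from the PSO literature is sketched below as an assumption of what is meant; the paper's exact variant, parameters, and the secondary PSO loop are not reproduced.

```python
import numpy as np

def stretched(f, x_loc, gamma1=1e4, gamma2=1.0, mu=1e-10):
    """Return a 'stretched' version of objective f around a detected local
    minimum x_loc, following the common function-stretching transformation
    used with PSO (assumed here; the paper's exact variant is not given).
    Points better than f(x_loc) are left unchanged; worse regions are raised."""
    f_loc = f(x_loc)
    def h(x):
        step = np.sign(f(x) - f_loc) + 1.0
        g = f(x) + 0.5 * gamma1 * np.linalg.norm(x - x_loc) * step
        return g + gamma2 * step / (2.0 * np.tanh(mu * (g - f_loc)) + 1e-30)
    return h

# Rastrigin has many local minima; stretch it around a non-global one
rastrigin = lambda x: 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))
x_local = np.array([1.0, 1.0])          # near a local (non-global) minimum
h = stretched(rastrigin, x_local)
print(rastrigin(x_local + 0.3), h(x_local + 0.3))   # stretched value is far larger
```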
Abstract:
Studies on the quantitative fit analysis of precontoured fracture fixation plates have emerged only within the last few years; therefore, there is a wide research gap in this area. Quantitative fit assessment facilitates measuring the gap between a fracture fixation plate and the underlying bone and specifies the required plate fit criteria. For a clinically meaningful fit assessment outcome, it is necessary to establish appropriate criteria and parameters. The present paper studies this subject and recommends using multiple fit criteria, with the maximum distance between the plate and the underlying bone as the fit parameter, for a clinically relevant outcome. We also propose the development of a software tool for automatic plate positioning and fit assessment for the purpose of implant design validation and optimization, in an effort to provide better-fitting implants that can assist proper fracture healing. The fundamental specifications of the software are discussed.
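As an illustration of the recommended fit parameter, the sketch below computes the maximum distance from sampled plate points to their nearest bone-surface points and applies an arbitrary acceptance threshold; the point clouds and the 2 mm value are placeholders, not data or criteria from the paper.

```python
import numpy as np

def max_plate_bone_distance(plate_pts, bone_pts):
    """Fit parameter sketch: the maximum distance from any sampled point on
    the plate's underside to its closest point on the bone surface."""
    d = np.linalg.norm(plate_pts[:, None, :] - bone_pts[None, :, :], axis=-1)
    return float(d.min(axis=1).max())

def fits(plate_pts, bone_pts, threshold_mm=2.0):
    """Illustrative criterion: accept the fit if the maximum gap stays below a
    chosen threshold (2 mm is an arbitrary placeholder, not the paper's value)."""
    return max_plate_bone_distance(plate_pts, bone_pts) <= threshold_mm

# toy stand-ins for sampled plate/bone surface points (coordinates in mm)
rng = np.random.default_rng(5)
bone = rng.normal(size=(2000, 3)) * 10.0
plate = bone[:200] + rng.normal(0, 0.5, size=(200, 3))   # plate roughly hugging the bone
print(round(max_plate_bone_distance(plate, bone), 2), fits(plate, bone))
```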
Abstract:
In Service-Oriented Architectures, business processes can be realized by composing loosely coupled services. The problem of QoS-aware service composition is widely recognized in the literature. Existing approaches for computing an optimal solution to this problem tackle structured business processes, i.e., business processes composed of XOR-block, AND-block, and repeat-loop orchestration components. So far, OR-block and unstructured orchestration components have not been sufficiently considered in the context of QoS-aware service composition. The work at hand addresses this shortcoming: an approach for computing an optimal solution to the service composition problem is proposed that considers structured orchestration components, such as AND/XOR/OR-blocks and repeat loops, as well as unstructured orchestration components.
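For background, the snippet below shows the standard QoS aggregation rules for sequences, AND-blocks, and XOR-blocks (response time and cost only), which the structured setting above builds on. The paper's treatment of OR-blocks and unstructured components is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class QoS:
    response_time: float   # e.g. seconds
    cost: float            # e.g. monetary units

def seq(*parts: QoS) -> QoS:
    """Sequence: times and costs add up."""
    return QoS(sum(p.response_time for p in parts), sum(p.cost for p in parts))

def and_block(*branches: QoS) -> QoS:
    """AND-block: branches run in parallel, so the slowest branch dominates
    the time while every branch's cost is incurred."""
    return QoS(max(b.response_time for b in branches), sum(b.cost for b in branches))

def xor_block(branches, probs) -> QoS:
    """XOR-block: exactly one branch executes, so aggregate in expectation
    over the branch probabilities."""
    rt = sum(p * b.response_time for b, p in zip(branches, probs))
    c = sum(p * b.cost for b, p in zip(branches, probs))
    return QoS(rt, c)

# toy composition: s1 ; (s2 AND s3) ; (s4 XOR s5)
s1, s2, s3, s4, s5 = QoS(1, 2), QoS(2, 1), QoS(3, 4), QoS(1, 1), QoS(2, 3)
total = seq(s1, and_block(s2, s3), xor_block([s4, s5], [0.7, 0.3]))
print(total)
```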
Abstract:
Scaffolds play a pivotal role in tissue engineering, promoting the synthesis of neo-extracellular matrix (ECM) and providing temporary mechanical support for cells during tissue regeneration. Advances introduced by additive manufacturing techniques have significantly improved the ability to regulate scaffold architecture, enhancing control over scaffold shape and porosity. Thus, considerable research effort has been devoted to the fabrication of 3D porous scaffolds with optimized micro-architectural features. This chapter gives an overview of methods for the design of additively manufactured scaffolds and their applicability in tissue engineering (TE). Along with a survey of the state of the art, the authors also present a recently developed method, called Load-Adaptive Scaffold Architecturing (LASA), which returns scaffold architectures optimized for given applied mechanical load systems once the specific stress distribution has been evaluated through Finite Element Analysis (FEA).
Abstract:
This paper presents a method for estimating thrust model parameters of uninhabited airborne systems using specific flight tests. Particular tests are proposed to simplify the estimation. The proposed estimation method is based on three steps. The first step uses a regression model in which the thrust is assumed constant; this yields biased initial estimates of the aerodynamic coefficients of the surge model. In the second step, a robust nonlinear state estimator is implemented using the initial parameter estimates, and the model is augmented by treating the thrust as a random walk. In the third step, the thrust estimate obtained by the observer is used to fit a polynomial model in terms of the propeller advance ratio. We consider a numerical example based on Monte Carlo simulations to quantify the sampling properties of the proposed estimator under realistic flight conditions.
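The sketch below illustrates, under simplified and assumed models, the first and third steps described above: a linear least-squares regression of a surge equation with constant thrust, and a polynomial fit of thrust against advance ratio. The data, mass, and model forms are placeholders, and the nonlinear state estimator of step two is omitted.

```python
import numpy as np

rng = np.random.default_rng(6)

# --- Step 1 (sketch): regression on a simplified surge model with constant thrust.
# Assumed form: m * dV/dt = T_const - c1 * V - c2 * V**2  (coefficients hypothetical)
t = np.linspace(0, 60, 600)
V = 20 + 5 * (1 - np.exp(-t / 15)) + rng.normal(0, 0.05, t.size)   # synthetic airspeed
dVdt = np.gradient(V, t)
m = 10.0                                              # assumed known mass
X = np.column_stack([np.ones_like(V), -V, -V ** 2])   # regressors for [T, c1, c2]
theta, *_ = np.linalg.lstsq(X, m * dVdt, rcond=None)  # biased initial estimates

# --- Step 3 (sketch): fit thrust as a polynomial in the propeller advance ratio J,
# using a stand-in for the thrust trajectory an observer would provide in step 2.
J = np.linspace(0.2, 0.9, 50)
T_est = 40 - 25 * J - 10 * J ** 2 + rng.normal(0, 0.5, J.size)
poly = np.polynomial.Polynomial.fit(J, T_est, deg=2)

print("initial [T, c1, c2]:", np.round(theta, 3))
print("thrust polynomial coefficients:", np.round(poly.convert().coef, 3))
```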
Abstract:
Commodity price modeling is normally approached in terms of structural time-series models, in which the different components (states) have a financial interpretation. The parameters of these models can be estimated using maximum likelihood. This approach results in a non-linear parameter estimation problem and thus a key issue is how to obtain reliable initial estimates. In this paper, we focus on the initial parameter estimation problem for the Schwartz-Smith two-factor model commonly used in asset valuation. We propose the use of a two-step method. The first step considers a univariate model based only on the spot price and uses a transfer function model to obtain initial estimates of the fundamental parameters. The second step uses the estimates obtained in the first step to initialize a re-parameterized state-space-innovations based estimator, which includes information related to future prices. The second step refines the estimates obtained in the first step and also gives estimates of the remaining parameters in the model. This paper is part tutorial in nature and gives an introduction to aspects of commodity price modeling and the associated parameter estimation problem.
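As background to the two-factor model mentioned above, the snippet below simulates the Schwartz-Smith log-spot dynamics (a mean-reverting short-term deviation plus a drifting long-term level) and recovers a crude initial estimate of the mean-reversion rate from the spot series via an AR(1) fit on the detrended log price. This is only a rough stand-in for the transfer-function-based first step described in the paper; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Schwartz-Smith two-factor model: ln S_t = chi_t + xi_t, where chi is a
# mean-reverting short-term deviation and xi a drifting long-term level.
kappa, sigma_chi, mu_xi, sigma_xi = 1.5, 0.3, 0.03, 0.15   # illustrative "true" values
dt, n = 1 / 252, 2520                                      # daily steps, 10 years
chi, xi = np.zeros(n), np.zeros(n)
xi[0] = np.log(60.0)
for t in range(1, n):
    chi[t] = chi[t - 1] - kappa * chi[t - 1] * dt + sigma_chi * np.sqrt(dt) * rng.normal()
    xi[t] = xi[t - 1] + mu_xi * dt + sigma_xi * np.sqrt(dt) * rng.normal()
log_spot = chi + xi

# Crude univariate first-step estimate: detrend the log spot price and fit an
# AR(1) to the residual, so that kappa ~ -ln(phi)/dt. This is only a rough
# stand-in for the transfer-function first step described in the paper.
idx = np.arange(n)
trend = np.polynomial.Polynomial.fit(idx, log_spot, deg=1)(idx)
resid = log_spot - trend
phi = np.dot(resid[:-1], resid[1:]) / np.dot(resid[:-1], resid[:-1])
kappa_hat = -np.log(phi) / dt
print("initial kappa estimate:", round(float(kappa_hat), 3), "(true value:", kappa, ")")
```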