974 results for Simulated annealing (Mathematics)


Relevance:

80.00%

Publisher:

Abstract:

In this paper, we investigate the camera network placement problem for target coverage in manufacturing workplaces. The problem is formulated to find the minimum number of cameras of different types, and their best configurations, to maximise the coverage of the monitored workplace such that each point in the given set of targets of interest is k-covered with a predefined minimum spatial resolution. Since the problem is NP-complete, and even NP-hard to approximate, a novel method based on Simulated Annealing is presented to solve the optimisation problem. A new neighbourhood generation function is proposed to handle the discrete nature of the problem. The visual coverage is modelled using realistic and coherent assumptions about camera intrinsic and extrinsic parameters, making it suitable for many real-world camera-based applications. A task-specific quality-of-coverage measure is proposed to assist in selecting the best among a set of camera network placements with equal coverage. A 3D CAD model of the monitored space is used to examine physical occlusions of target points. The results show the accuracy, efficiency, and scalability of the presented solution method, which can be applied effectively in the design of practical camera networks.
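As a rough illustration of the discrete neighbourhood move the abstract describes, the following sketch anneals a camera subset on a toy instance. The coverage sets, penalty weight, and cooling schedule are invented for illustration and are not taken from the paper:

```python
import math
import random

# Hypothetical instance: each candidate camera placement covers a
# fixed set of target points (not the paper's camera model).
COVERAGE = {
    0: {0, 1}, 1: {1, 2}, 2: {2, 3},
    3: {3, 4}, 4: {0, 4}, 5: {0, 1, 2, 3, 4},
}
TARGETS = {0, 1, 2, 3, 4}
K = 1  # required coverage multiplicity per target

def cost(cams):
    """Camera count plus a heavy penalty for any k-coverage deficit."""
    deficit = sum(max(0, K - sum(t in COVERAGE[c] for c in cams))
                  for t in TARGETS)
    return len(cams) + 100 * deficit

def neighbour(cams):
    """Discrete move: add, drop, or swap one candidate placement."""
    cams = set(cams)
    unused = set(COVERAGE) - cams
    move = random.choice(["add", "drop", "swap"])
    if move == "add" and unused:
        cams.add(random.choice(sorted(unused)))
    elif move == "drop" and len(cams) > 1:
        cams.remove(random.choice(sorted(cams)))
    elif unused and cams:
        cams.remove(random.choice(sorted(cams)))
        cams.add(random.choice(sorted(unused)))
    return cams

def anneal(iters=2000, t0=5.0, alpha=0.995):
    state = set(COVERAGE)          # start feasible: use every camera
    best, t = set(state), t0
    for _ in range(iters):
        cand = neighbour(state)
        delta = cost(cand) - cost(state)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state = cand
        if cost(state) < cost(best):
            best = set(state)
        t *= alpha
    return best

best = anneal()
```

Because the initial state is feasible and the best-so-far solution is tracked, the returned subset always k-covers every target while typically using far fewer cameras.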

Relevance:

80.00%

Publisher:

Abstract:

In contrast to point forecasts, prediction-interval-based neural networks offer an effective tool to quantify the uncertainty and disturbances associated with process data. However, a single best neural network (NN) does not always guarantee better-quality forecasts across different data sets or over a whole range of data. The literature reports that an ensemble of NNs using forecast combination produces more stable and consistent forecasts than a single best NN. In this work, an NN ensemble procedure is introduced to construct better-quality PIs. A weighted-averaging forecast-combination mechanism is employed to combine the PI-based forecasts. As the key contribution of this paper, a new PI-based cost function is proposed to optimize the individual NN weights in the combination process. An optimization algorithm, simulated annealing (SA), is used to minimize the PI-based cost function. Finally, the proposed method is examined in two different case studies, and the results are compared with the individual best NNs and the available simple-averaging PI aggregation method. Simulation results demonstrate that the proposed method improves the quality of PIs over both the individual best NNs and the simple-averaging ensemble method.
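The weighted-averaging combination with SA-optimized weights can be sketched roughly as follows. The member intervals, the simple width-plus-miss cost, and all constants are hypothetical stand-ins for the paper's PI-based cost function:

```python
import math
import random

# Hypothetical member forecasts: each "NN" gives (lower, upper) bounds
# for every sample; the targets Y are synthetic, not the paper's data.
Y = [1.0, 2.0, 3.0, 4.0, 5.0]
MEMBERS = [
    [(0.5, 1.8), (1.4, 2.9), (2.2, 3.9), (3.1, 5.0), (4.0, 6.1)],
    [(0.2, 1.2), (1.8, 2.1), (2.9, 3.2), (3.9, 4.2), (4.8, 5.3)],
    [(0.0, 3.0), (1.0, 4.0), (2.0, 5.0), (3.0, 6.0), (4.0, 7.0)],
]

def combine(w):
    """Weighted average of the member lower/upper bounds."""
    return [(sum(wi * m[i][0] for wi, m in zip(w, MEMBERS)),
             sum(wi * m[i][1] for wi, m in zip(w, MEMBERS)))
            for i in range(len(Y))]

def pi_cost(w, mu=50.0):
    """Mean interval width plus a penalty per target outside its PI."""
    pis = combine(w)
    width = sum(u - l for l, u in pis) / len(pis)
    miss = sum(not (l <= y <= u) for (l, u), y in zip(pis, Y))
    return width + mu * miss

def neighbour(w, step=0.1):
    """Perturb one weight, clip to >= 0, renormalise onto the simplex."""
    w = list(w)
    i = random.randrange(len(w))
    w[i] = max(0.0, w[i] + random.uniform(-step, step))
    s = sum(w) or 1.0
    return [wi / s for wi in w]

def anneal(iters=3000, t0=1.0, alpha=0.998):
    w = [1 / 3] * 3                 # start from simple averaging
    best, t = w, t0
    for _ in range(iters):
        cand = neighbour(w)
        delta = pi_cost(cand) - pi_cost(w)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            w = cand
        if pi_cost(w) < pi_cost(best):
            best = w
        t *= alpha
    return best

best = anneal()
```

Starting from equal weights means the baseline is exactly the simple-averaging aggregation the paper compares against, so the annealed weights can only match or improve on it.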

Relevance:

80.00%

Publisher:

Abstract:

We investigate the resource-allocation problem in multicell networks targeting the max-min throughput of all cells. A joint optimization over power control, channel allocation, and user association is considered, and the problem is then formulated as a nonconvex mixed-integer nonlinear problem (MINLP). To solve this problem, we propose an alternating-optimization-based algorithm, which applies branch-and-bound and simulated annealing in solving subproblems at each optimization step. We also demonstrate the convergence and efficiency of the proposed algorithm by thorough numerical experiments. The experimental results show that joint optimization over all resources significantly outperforms the restricted optimization over individual resources.
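A toy version of the alternating-optimization idea is sketched below, on a hypothetical two-cell, two-channel model; the paper's branch-and-bound power step is replaced by grid enumeration for brevity:

```python
import itertools
import math

# Hypothetical model: two cells, two channels. A cell suffers
# interference from the other cell only when they share a channel.
P_LEVELS = [0.2, 0.5, 1.0]          # discrete power grid per cell
CHANNELS = [0, 1]

def rate(p_self, p_other, same_channel):
    interference = p_other if same_channel else 0.0
    return math.log2(1.0 + p_self / (0.1 + interference))

def min_throughput(p, ch):
    """Max-min objective: the throughput of the worst cell."""
    r0 = rate(p[0], p[1], ch[0] == ch[1])
    r1 = rate(p[1], p[0], ch[0] == ch[1])
    return min(r0, r1)

def alternating_opt(rounds=5):
    p, ch = [0.2, 0.2], [0, 0]
    for _ in range(rounds):
        # Step 1: fix channels, pick the best power pair on the grid.
        p = max(itertools.product(P_LEVELS, repeat=2),
                key=lambda q: min_throughput(q, ch))
        # Step 2: fix powers, pick the best channel assignment.
        ch = max(itertools.product(CHANNELS, repeat=2),
                 key=lambda c: min_throughput(p, c))
    return list(p), list(ch)

p, ch = alternating_opt()
```

On this instance the alternation settles quickly: the cells end up on different channels at full power, which removes the interference coupling and maximizes the worst-cell rate.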

Relevance:

80.00%

Publisher:

Abstract:

Neural networks (NNs) are an effective tool to model nonlinear systems. However, their forecasting performance significantly drops in the presence of process uncertainties and disturbances. NN-based prediction intervals (PIs) offer an alternative solution to appropriately quantify uncertainties and disturbances associated with point forecasts. In this paper, an NN ensemble procedure is proposed to construct quality PIs. A recently developed lower-upper bound estimation method is applied to develop NN-based PIs. Then, constructed PIs from the NN ensemble members are combined using a weighted averaging mechanism. Simulated annealing and a genetic algorithm are used to optimally adjust the weights for the aggregation mechanism. The proposed method is examined for three different case studies. Simulation results reveal that the proposed method improves the average PI quality of individual NNs by 22%, 18%, and 78% for the first, second, and third case studies, respectively. The simulation study also demonstrates that a 3%-4% improvement in the quality of PIs can be achieved using the proposed method compared to the simple averaging aggregation method.
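In the prediction-interval literature around the lower-upper bound estimation method, PI quality is commonly scored by a coverage probability (PICP) and a normalized average width (PINAW). A minimal sketch of these standard measures follows; the sample intervals and targets are invented, and the exact definitions in the paper may differ:

```python
def picp(pis, y):
    """PI coverage probability: fraction of targets inside their PI."""
    return sum(l <= t <= u for (l, u), t in zip(pis, y)) / len(y)

def pinaw(pis, y):
    """PI normalised average width: mean width over the target range."""
    r = max(y) - min(y)
    return sum(u - l for l, u in pis) / (len(pis) * r)

# Invented example: three intervals cover their targets, one misses.
pis = [(0.5, 1.5), (1.8, 2.6), (2.5, 3.5), (4.2, 5.0)]
y = [1.0, 2.0, 3.0, 4.0]
```

A good ensemble drives PICP up toward the nominal confidence level while keeping PINAW small; cost functions for PI optimization typically trade these two off.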

Relevance:

80.00%

Publisher:

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they cannot be solved exactly in polynomial time. Initially, these solutions were focused on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called the "Operon" for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses that are adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is used as the application for a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is carried out through Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is carried out through Survival Analysis, based on the probability of the execution time observed until an optimal solution is achieved. The third is carried out by means of a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), defined as the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than the other three algorithms on fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which the performance of the algorithms was evaluated through the PES, the average PES obtained with ProtoG was below 1% in almost half of these instances, reaching its largest value, 3.52%, on an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
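For reference, a baseline Simulated Annealing for the TSP, of the kind ProtoG is compared against, can be sketched with the classic 2-opt neighbourhood. The coordinates, schedule, and parameters below are toy values, not the dissertation's experimental setup:

```python
import math
import random

# Small invented Euclidean instance (not a TSPLIB instance).
CITIES = [(0, 0), (0, 2), (1, 5), (3, 4), (4, 1), (2, 0), (5, 3), (3, 2)]

def length(tour):
    """Total length of the closed tour."""
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour):
    """Reverse a random segment: the classic TSP neighbourhood move."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def anneal(iters=5000, t0=10.0, alpha=0.999):
    tour = list(range(len(CITIES)))
    best, t = tour, t0
    for _ in range(iters):
        cand = two_opt(tour)
        delta = length(cand) - length(tour)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            tour = cand
        if length(tour) < length(best):
            best = tour
        t *= alpha
    return best

best = anneal()
```

Tracking the best-so-far tour guarantees the returned permutation is never worse than the starting one, even though individual Metropolis steps may accept uphill moves.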

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a methodology for automatic extraction of building roof contours from a Digital Elevation Model (DEM), which is generated through the regularization of an available laser point cloud. The methodology is based on two steps. First, in order to detect high objects (buildings, trees, etc.), the DEM is segmented through a recursive splitting technique and a Bayesian merging technique. The recursive splitting technique uses the quadtree structure for subdividing the DEM into homogeneous regions. In order to minimize the fragmentation that is commonly observed in the results of recursive splitting segmentation, a region merging technique based on the Bayesian framework is applied to the previously segmented data. The high-object polygons are extracted by using vectorization and polygonization techniques. Second, the building roof contours are identified among all high objects extracted previously. Taking into account some roof properties and some feature measurements (e.g., area, rectangularity, and angles between the principal axes of the roofs), an energy function was developed based on the Markov Random Field (MRF) model. The solution of this function is a polygon set corresponding to building roof contours and is found by using a minimization technique, such as the Simulated Annealing (SA) algorithm. Experiments carried out with laser scanning DEMs showed that the methodology works properly, as it delivered roof contours with approximately 90% shape accuracy and no false positives were found.
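The MRF-energy minimization step can be caricatured as follows: each candidate polygon gets a binary roof/non-roof label, with unary energies from feature measurements and a pairwise smoothness term between neighbours, minimized by SA. The features, adjacency list, and weights below are invented, not the paper's energy function:

```python
import math
import random

# Hypothetical candidate polygons: (area in m^2, rectangularity in [0, 1]).
FEATURES = [(120.0, 0.95), (80.0, 0.90), (15.0, 0.30), (200.0, 0.20)]
NEIGHBOURS = [(0, 1)]   # adjacent candidates tend to share a label

def unary(i, label):
    """Low energy for labelling large, rectangular candidates as roofs."""
    area, rect = FEATURES[i]
    roof_score = 2.0 * rect + min(area, 100.0) / 100.0   # in [0, 3]
    return (3.0 - roof_score) if label else roof_score

def energy(labels, beta=0.5):
    """MRF-style energy: unary feature terms + pairwise smoothness."""
    e = sum(unary(i, l) for i, l in enumerate(labels))
    e += beta * sum(labels[a] != labels[b] for a, b in NEIGHBOURS)
    return e

def anneal(iters=1000, t0=2.0, alpha=0.995):
    labels = [0] * len(FEATURES)
    best, t = list(labels), t0
    for _ in range(iters):
        cand = list(labels)
        i = random.randrange(len(cand))
        cand[i] ^= 1                      # flip one polygon's label
        delta = energy(cand) - energy(labels)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            labels = cand
        if energy(labels) < energy(best):
            best = list(labels)
        t *= alpha
    return best

best = anneal()
```

The polygons whose final label is 1 play the role of the extracted roof-contour set.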

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

The paper presents an extended genetic algorithm for solving the optimal transmission network expansion planning problem. Two main improvements have been introduced in the genetic algorithm: (a) the initial population is obtained by conventional optimisation-based methods; (b) the mutation approach is inspired by the simulated annealing technique. The proposed method is general in the sense that it does not assume any particular property of the problem being solved, such as linearity or convexity. Excellent performance is reported in the test results section of the paper for a difficult large-scale real-life problem: a substantial reduction in investment costs has been obtained with respect to previous solutions obtained via conventional optimisation methods and simulated annealing algorithms, and statistical comparison procedures have been employed in benchmarking different versions of the genetic algorithm and simulated annealing methods.
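Improvement (b) can be sketched as a mutation accepted with a Metropolis-style rule at a temperature that decays over the generations. The toy objective (OneMax) stands in for the expansion-planning cost model, and all parameters are invented:

```python
import math
import random

TARGET_LEN = 20

def fitness(bits):
    """Toy objective: number of ones (stands in for the cost model)."""
    return sum(bits)

def crossover(a, b):
    """One-point crossover."""
    cut = random.randrange(1, TARGET_LEN)
    return a[:cut] + b[cut:]

def sa_mutation(bits, t):
    """SA-inspired mutation: a worsening bit flip is kept only with
    Metropolis probability exp(delta / t) (maximising)."""
    i = random.randrange(TARGET_LEN)
    cand = list(bits)
    cand[i] ^= 1
    delta = fitness(cand) - fitness(bits)
    if delta >= 0 or random.random() < math.exp(delta / t):
        return cand
    return bits

def evolve(pop_size=20, gens=60, t0=2.0, alpha=0.95):
    pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
           for _ in range(pop_size)]
    t = t0
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = [sa_mutation(crossover(random.choice(parents),
                                          random.choice(parents)), t)
                    for _ in range(pop_size)]
        # Elitist survivor selection keeps the best-so-far individual.
        pop = sorted(pop + children, key=fitness, reverse=True)[:pop_size]
        t = max(t * alpha, 1e-3)
    return max(pop, key=fitness)

best = evolve()
```

Early on, the high temperature lets mutation explore freely; as it cools, the acceptance rule becomes nearly greedy, mimicking the annealing schedule inside the genetic loop.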

Relevance:

80.00%

Publisher:

Abstract:

Aspartic proteases (EC 3.4.23) make up a widely distributed class of enzymes in animals, plants, microbes, and viruses. In animals these enzymes perform diverse functions, which range from the digestion of food proteins to very specific regulatory roles. In contrast to the well-characterized aspartic proteases, very little is known about the corresponding enzyme in urine. A new aspartic protease isolated from human urine has been crystallized and X-ray diffraction data collected to 2.45 Angstrom resolution using a synchrotron radiation source. Crystals belong to the space group P2(1)2(1)2(1), with cell parameters a=50.99, b=75.56 and c=89.90 Angstrom. Preliminary analysis revealed the presence of one molecule in the asymmetric unit. The structure was determined using the molecular replacement technique and is currently being refined using simulated annealing and conjugate gradient protocols.

Relevance:

80.00%

Publisher:

Abstract:

A novel common Tabu algorithm for global optimization of engineering problems is presented. The robustness and efficiency of the presented method are evaluated using standard mathematical functions and by solving a practical engineering problem. The numerical results show that the proposed method is (i) superior to the conventional Tabu search algorithm in robustness, and (ii) superior to the simulated annealing algorithm in efficiency. (C) 2001 Elsevier B.V. All rights reserved.
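A generic Tabu search skeleton of the kind being extended is sketched below, on a toy pseudo-Boolean objective; the tabu tenure, aspiration rule, and objective are illustrative assumptions, not the paper's algorithm:

```python
from collections import deque

# Toy objective (hypothetical stand-in for the engineering problem):
# maximise a quadratic pseudo-Boolean function of 8 binary variables.
N = 8
LIN = [3, -2, 5, 1, -4, 2, 6, -1]

def score(x):
    return sum(c * xi for c, xi in zip(LIN, x)) - 2 * x[0] * x[2]

def tabu_search(iters=50, tenure=5):
    x = [0] * N
    best, best_val = list(x), score(x)
    tabu = deque(maxlen=tenure)          # recently flipped positions
    for _ in range(iters):
        # Evaluate every single-bit flip; skip tabu moves unless they
        # beat the best solution found so far (aspiration criterion).
        moves = []
        for i in range(N):
            y = list(x)
            y[i] ^= 1
            v = score(y)
            if i not in tabu or v > best_val:
                moves.append((v, i, y))
        v, i, x = max(moves)             # best admissible move, even if worse
        tabu.append(i)
        if v > best_val:
            best, best_val = list(x), v
    return best, best_val

x, v = tabu_search()
```

Unlike SA, the move selection is deterministic: the search always takes the best non-tabu flip, and the tabu list is what forces it out of local optima. On this instance the optimum (value 15) is reached within a handful of iterations.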

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

In the present work we study a long superconducting wire with a columnar defect in the presence of an applied magnetic field. The cross section of the cylinder is assumed to be circular. The field is taken uniform and parallel to the cylinder axis. We use the London theory to investigate the vortex lattice inside the wire. Although this theory is valid in the limit of low vortex density, that is, when the nearest neighbor vortex distance is much larger than the coherence length, we can obtain a reasonable qualitative description of lattice properties. We calculate: (1) the vortex lattice structure using the simulated annealing technique; (2) the magnetization curve as a function of the applied field.
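Step (1) can be caricatured by annealing point vortices inside a disk. The -ln r pair repulsion below is a crude short-distance stand-in for the London interaction, the confinement term is ad hoc, and the columnar defect is omitted for brevity; none of this reproduces the paper's model quantitatively:

```python
import math
import random

N = 6          # number of vortices
R = 1.0        # radius of the wire's circular cross section

def energy(pts):
    """Pairwise -ln r repulsion (stand-in for the London interaction)
    plus a quadratic term pulling vortices away from the surface."""
    e = 0.0
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            r = math.dist(pts[i], pts[j])
            e -= math.log(max(r, 1e-9))
        e += 2.0 * (pts[i][0] ** 2 + pts[i][1] ** 2)
    return e

def neighbour(pts, step=0.05):
    """Jiggle one vortex; reject moves that leave the wire."""
    pts = [list(p) for p in pts]
    i = random.randrange(len(pts))
    x = pts[i][0] + random.uniform(-step, step)
    y = pts[i][1] + random.uniform(-step, step)
    if x * x + y * y < R * R:
        pts[i] = [x, y]
    return pts

def anneal(iters=4000, t0=1.0, alpha=0.999):
    # Start from a ring of radius 0.5 inside the wire.
    pts = [[0.5 * math.cos(2 * math.pi * k / N),
            0.5 * math.sin(2 * math.pi * k / N)] for k in range(N)]
    best, t = [list(p) for p in pts], t0
    for _ in range(iters):
        cand = neighbour(pts)
        delta = energy(cand) - energy(pts)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            pts = cand
        if energy(pts) < energy(best):
            best = [list(p) for p in pts]
        t *= alpha
    return best

best = anneal()
```

The annealed configuration balances mutual repulsion against confinement, which is the qualitative mechanism behind the lattice structures the paper computes.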