15 results for Pareto optimality

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

60.00%

Publisher:

Abstract:

This work presents an algorithmic study of the optimization of conformal radiotherapy treatment plans. We begin with an overview of cancer, radiotherapy and the physics of the interaction of ionizing radiation with matter. A proposal for the optimization of a radiotherapy treatment plan is then developed systematically: we present the multicriteria problem paradigm and the concepts of Pareto optimum and Pareto dominance, and propose a generic optimization model for radiotherapy treatment. We construct the input of the model, estimate the dose delivered by the radiation using the dose matrix, and present the objective function of the model. Optimization models in radiotherapy treatment are typically NP-hard, which justifies the use of heuristic methods. We propose three distinct metaheuristics: MOGA, MOSA and MOTS. For each procedure we give a brief motivation, the algorithm itself and the method for tuning its parameters. The three methods are applied to a concrete case and their performances are compared. Finally, for each method we analyze the quality of the Pareto sets, some solutions and the respective Pareto curves.
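The Pareto dominance relation underlying this comparison can be sketched in a few lines (a generic minimisation version, not the thesis code):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]
```

For example, among the bi-objective points (1, 5), (2, 4), (3, 3), (2, 6) and (4, 4), the last two are dominated and the first three form the Pareto front.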

Relevance:

20.00%

Publisher:

Abstract:

This work briefly discusses methods to estimate the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: Moments (moments), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-Fit (MGF) and Maximum Entropy (POME), the last being the focus of this manuscript. As an illustration, the Generalized Pareto distribution was fitted to a sequence of intraplate earthquakes that occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). MLE and POME proved to be the most efficient methods, yielding essentially the same mean squared errors. Based on a threshold of 1.5 degrees, the seismic risk for the city was estimated, as well as the return levels for earthquakes of intensity 1.5°, 2.0°, 2.5°, 3.0° and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2°.
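As an illustration of one of the techniques surveyed, a maximum-likelihood GPD fit can be sketched with SciPy on simulated threshold exceedances (the data here are synthetic, not the João Câmara catalogue):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
# Simulated exceedances over a threshold (illustrative only)
true_shape, true_scale = 0.2, 1.0
exceedances = genpareto.rvs(true_shape, scale=true_scale, size=2000,
                            random_state=rng)

# Maximum-likelihood fit with the location (threshold) fixed at zero
shape_hat, loc_hat, scale_hat = genpareto.fit(exceedances, floc=0)
```

With a couple of thousand observations the MLE recovers the shape and scale parameters closely; the other estimators in the list trade this efficiency for robustness or bias properties.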

Relevance:

10.00%

Publisher:

Abstract:

This paper tests the hypothesis that several behaviors related to the obedience of social work norms can be explained by the structure of the actors' social network and by rational choice processes related to the social norm inside that network, principally the analysis of the payoffs received by the closest actors, or neighbors, in a social situation. Taking the sociological paradigm of rational action theory as a basis, the focus is a debate about the logic of social norms, from Émile Durkheim's method to Jon Elster's theory, also including social network analysis variables according to Robert Hanneman and Vilfredo Pareto's constants related to human sociability. The aim is to detect elements that can help scholars develop an agent-based model that explains the sociological problem of deviance better than the common-sense view of morality and ethics in a social work environment.
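A toy version of the neighbour-payoff mechanism discussed above might look like this (every name and rule here is an illustrative assumption, not the model proposed in the paper): each actor copies the behaviour, comply or deviate, of whichever neighbour (or itself) earned the highest payoff.

```python
def update(actions, payoffs, neighbors):
    """One imitation step on a social network (hypothetical toy rule).
    actions: dict node -> 'comply' or 'deviate'
    payoffs: dict node -> payoff earned in the last round
    neighbors: dict node -> list of neighbouring nodes
    Returns the next action profile."""
    new_actions = {}
    for node, nbrs in neighbors.items():
        candidates = list(nbrs) + [node]          # look at neighbours and self
        best = max(candidates, key=lambda n: payoffs[n])
        new_actions[node] = actions[best]         # imitate the best performer
    return new_actions
```

On a three-node line where the complying actor earns the highest payoff, compliance spreads to its neighbour in one step while the far node keeps deviating.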

Relevance:

10.00%

Publisher:

Abstract:

Copper is one of the metals most used in the plating processes of galvanic industries. The presence of copper, a heavy metal, in galvanic effluents is harmful to the environment. The main objective of this research was the removal of copper from galvanic effluents using anionic surfactants. The removal process is based on the interaction between the polar head group of the anionic surfactant and the divalent copper in solution. The surfactants used in this study were derived from soybean oil (OSS), coconut oil (OCS) and sunflower oil (OGS). A synthetic copper solution (280 ppm Cu²⁺) was used to simulate the rinse water from the acid copper bath of a galvanic industry. 2³ and 3² factorial designs were developed to evaluate the parameters that influence the removal process. For each surfactant (OSS, OCS and OGS), the independent variables evaluated were: surfactant concentration (1.25 to 3.75 g/L), pH (5 to 9) and the presence of an anionic polymer (0 to 0.0125 g/L). From the results obtained in the 2³ factorial design and from the estimate of the stoichiometric relationship between surfactants and copper in solution, new experimental tests were developed, varying the surfactant concentration in the range of 1.25 to 6.8 g/L (3² factorial design). The results of the experimental designs were subjected to statistical evaluation to obtain Pareto charts and mathematical models for the copper removal efficiency (%). In the statistical evaluation of the 2³ and 3² factorial designs, saponified coconut oil (OCS) yielded the mathematical model that best described the copper removal process. It can be concluded that OCS was the most efficient anionic surfactant, removing 100% of the copper present in the synthetic galvanic solution.
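The effect estimates behind a Pareto chart of a 2³ design can be sketched as follows (the generic textbook calculation, with made-up responses, not the OSS/OCS/OGS measurements): each main effect is the mean response at the factor's high level minus the mean at its low level, and the Pareto chart simply ranks the effects by absolute value.

```python
def main_effects(responses):
    """Estimate main effects from a full 2^k factorial design.
    responses: dict mapping a tuple of coded factor levels (-1 or +1 per
    factor) to the measured response (e.g. copper removal efficiency in %)."""
    k = len(next(iter(responses)))
    effects = []
    for factor in range(k):
        high = [y for levels, y in responses.items() if levels[factor] == +1]
        low = [y for levels, y in responses.items() if levels[factor] == -1]
        effects.append(sum(high) / len(high) - sum(low) / len(low))
    return effects
```

For a 2² design with responses 10, 20, 30 and 40 at the four corner points, the two main effects come out as 10 and 20, so the second factor would sit on top of the Pareto chart.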

Relevance:

10.00%

Publisher:

Abstract:

Present-day weather forecast models usually cannot provide realistic descriptions of local and particularly extreme weather conditions. However, for lead times of a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, where a stable and significant statistical model, based on prior physical reasoning, establishes a posterior statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperature. The downscaling is applied to the NCEP/NCAR reanalysis in order to derive estimates of daily temperature at weather stations in the Brazilian northeastern region.

Relevance:

10.00%

Publisher:

Abstract:

In survival analysis, long-duration models allow the estimation of the cure fraction, which represents the portion of the population immune to the event of interest. Here we address classical and Bayesian estimation based on mixture models and promotion time models, using different distributions (exponential, Weibull and Pareto) to model the failure time. The database used to illustrate the implementations is described in Kersey et al. (1987) and consists of a group of leukemia patients who underwent a certain type of transplant. The specific implementations used were numerical optimization by BFGS as implemented in R (base::optim), the Laplace approximation (own implementation) and Gibbs sampling as implemented in WinBUGS. We describe the main features of the models used, the estimation methods and the computational aspects, and also discuss how different prior information can affect the Bayesian estimates.
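A minimal sketch of the classical route for the simplest case, an exponential mixture cure model fitted by BFGS (here via SciPy rather than R's base::optim, and on simulated data rather than the Kersey et al. cohort): the population survival is S(t) = π + (1 − π)·exp(−λt), where π is the cure fraction, and censored observations contribute S(t) to the likelihood while observed failures contribute the density.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, event):
    """Negative log-likelihood of an exponential mixture cure model."""
    pi = 1.0 / (1.0 + np.exp(-params[0]))   # logit-transformed cure fraction
    lam = np.exp(params[1])                 # log-transformed failure rate
    dens = (1 - pi) * lam * np.exp(-lam * t)      # observed failures
    surv = pi + (1 - pi) * np.exp(-lam * t)       # censored observations
    return -np.sum(np.where(event == 1, np.log(dens), np.log(surv)))

# Simulated data: 30% cured, failure rate 0.5, independent censoring
rng = np.random.default_rng(0)
n, true_pi, true_lam = 500, 0.3, 0.5
cured = rng.random(n) < true_pi
t_fail = rng.exponential(1 / true_lam, n)
censor = rng.exponential(5.0, n)
t = np.where(cured, censor, np.minimum(t_fail, censor))
event = ((~cured) & (t_fail <= censor)).astype(int)

res = minimize(neg_loglik, x0=[0.0, 0.0], args=(t, event), method="BFGS")
pi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
lam_hat = np.exp(res.x[1])
```

The transforms keep π in (0, 1) and λ positive so that the unconstrained BFGS search stays in the valid parameter space; the Weibull and Pareto latency models extend this by one parameter.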

Relevance:

10.00%

Publisher:

Abstract:

The Quadratic Minimum Spanning Tree Problem (QMST) is a version of the Minimum Spanning Tree Problem in which, besides the traditional linear costs, there is a quadratic cost structure. This quadratic structure models interaction effects between pairs of edges. Linear and quadratic costs are added up to constitute the total cost of the spanning tree, which must be minimized. When these interactions are restricted to adjacent edges, the problem is named Adjacent Only Quadratic Minimum Spanning Tree (AQMST). AQMST and QMST are NP-hard problems that model several problems of transport and distribution network design. In general, AQMST arises as a more suitable model for real problems. Although linear and quadratic costs are added up in the literature, in real applications they may be conflicting, in which case it may be interesting to consider these costs separately. In this sense, multiobjective optimization provides a more realistic model for QMST and AQMST. A review of the state of the art found no papers regarding these problems from a biobjective point of view. Thus, the objective of this thesis is the development of exact and heuristic algorithms for the Biobjective Adjacent Only Quadratic Spanning Tree Problem (bi-AQST). As theoretical foundation, other NP-hard problems directly related to bi-AQST are discussed: the QMST and AQMST problems. Backtracking and branch-and-bound exact algorithms are proposed for the target problem of this investigation. The heuristic algorithms developed are: Pareto Local Search, Tabu Search with ejection chains, a Transgenetic Algorithm, NSGA-II and a hybridization of the last two proposals called NSTA. The proposed algorithms are compared to each other through performance analysis on computational experiments with instances adapted from the QMST literature. With regard to the exact algorithms, the analysis considers, in particular, the execution time. 
In the case of the heuristic algorithms, besides execution time, the quality of the generated approximation sets is evaluated with quality indicators, and appropriate statistical tools are used to measure the performance of the exact and heuristic algorithms. Considering the set of instances adopted as well as the criteria of execution time and quality of the generated approximation sets, the experiments showed that the Tabu Search with ejection chains obtained the best results, with the Transgenetic Algorithm ranked second. The PLS algorithm obtained good-quality solutions, but at a very high computational time compared to the other (meta)heuristics, taking third place. The NSTA and NSGA-II algorithms took the last positions.
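The two objectives of a bi-AQST solution can be evaluated as sketched below (a generic illustration with hypothetical edge and interaction costs, not the thesis implementation): the linear objective sums the edge costs of the tree, while the quadratic objective sums the interaction costs over pairs of tree edges that share an endpoint.

```python
def aqmst_costs(tree_edges, linear, quad):
    """Evaluate the two objectives of a bi-AQST solution.
    tree_edges: list of edges (u, v); linear: dict edge -> linear cost;
    quad: dict mapping a frozenset of two adjacent edges -> interaction cost.
    Returns (linear_cost, quadratic_cost), kept as separate objectives."""
    lin = sum(linear[e] for e in tree_edges)
    qd = 0.0
    for i in range(len(tree_edges)):
        for j in range(i + 1, len(tree_edges)):
            e, f = tree_edges[i], tree_edges[j]
            if set(e) & set(f):  # edges share an endpoint: adjacent
                qd += quad.get(frozenset((e, f)), 0.0)
    return lin, qd
```

In the mono-objective QMST/AQMST the two values would simply be added; keeping them separate is what turns the problem into the biobjective version studied here.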

Relevance:

10.00%

Publisher:

Abstract:

Developing efficient methodologies for oil recovery is extremely important. Within the range of enhanced oil recovery (EOR) methods, the injection of polymer solutions is effective in controlling the mobility of the displacing fluid. This method consists of adding polymers to the injection water to increase its viscosity, so that more water diffuses into the porous medium, increasing the sweep efficiency in the reservoir. This work studies, by numerical simulation, the application of polymer solution injection in a homogeneous, semisynthetic reservoir with characteristics similar to the reservoirs of the Brazilian Northeast; the numerical simulations were performed using the thermal simulator STARS from CMG (Computer Modelling Group). The study aimed to analyze the influence of some parameters on the behavior of reservoir oil production, with cumulative production as the response. Simulations were performed to analyze the influence of water injection, polymer solution injection and the alternating injection of banks of water and polymer solution, comparing the results for each simulated condition. The parameters analyzed were: oil viscosity, percentage of injected polymer, polymer viscosity and water injection flow rate. The evaluation of the influence of the variables consisted of a complete experimental design followed by a Pareto analysis, for the purpose of pointing out which variables are most influential on the response represented by the cumulative oil production. It was found that all variables significantly influenced the recovery of oil, and that the continuous injection of polymer solution is more efficient for cumulative production than oil recovery by continuous water injection. 
Primary recovery showed low levels of oil production; water injection significantly improves the production of oil in the reservoir, but the injection of polymer solution emerges as a new methodology to increase oil production, increasing the life of the well and possibly reducing the water produced.

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, four different methods were used to diagnose precipitation extremes over Northeastern Brazil (NEB): Generalized Linear Models via logistic and Poisson regression, extreme value theory via the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, and Vectorial Generalized Linear Models via GEV (MVLG GEV). The logistic regression and Poisson models were used to identify the interactions between precipitation extremes and other variables based on odds ratios and relative risks. Outgoing longwave radiation was found to be the indicator variable for the occurrence of extreme precipitation over eastern, northern and semiarid NEB, while relative humidity played this role over southern NEB. The GEV and GPD distributions (based on the 95th percentile) showed that the location and scale parameters reached their maxima on the eastern and northern coasts of NEB; the GEV identified a maximum core over western Pernambuco, influenced by weather systems and topography. For the GEV and GPD shape parameter, the data of most regions were fitted by the negative Weibull and Beta distributions (ξ < 0), respectively. The return levels and periods of the GEV (GPD) indicate that northern Maranhão (central Bahia) may experience at least one extreme precipitation event exceeding 160.9 mm/day (192.3 mm/day) in the next 30 years. The MVLG GEV model found that the zonal and meridional wind components, evaporation and the Atlantic and Pacific sea surface temperatures boost the precipitation extremes. Its GEV parameters showed the following results: (a) location (μ): the highest value was 88.26 ± 6.42 mm, over northern Maranhão; (b) scale (σ): most regions showed positive values, except southern Maranhão; and (c) shape (ξ): most of the selected regions were adjusted by the negative Weibull distribution (ξ < 0). Southern Maranhão and southern Bahia have greater accuracy. 
As for the return level, it was estimated that central Bahia may experience at least one extreme precipitation event equal to or exceeding 571.2 mm/day in the next 30 years.
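The GEV return-level calculation behind such estimates can be sketched as follows (the standard formula, with illustrative parameter values, not the fitted values from the thesis): for annual maxima, the T-year return level is z_T = μ − (σ/ξ)·(1 − y^(−ξ)) with y = −ln(1 − 1/T).

```python
import math
from scipy.stats import genextreme

def return_level(mu, sigma, xi, T):
    """T-year return level of a GEV fitted to annual maxima, in the
    standard (mu, sigma, xi) parameterisation. Note that SciPy's
    genextreme uses the opposite sign convention, c = -xi."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:                       # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))
```

The result agrees with SciPy's inverse survival function, e.g. `genextreme.isf(1/30, -xi, loc=mu, scale=sigma)` for a 30-year return level, which makes the sign convention easy to check.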

Relevance:

10.00%

Publisher:

Abstract:

Intense precipitation events (IPE) have been causing great social and economic losses in the affected regions. In the Amazon, these events can have serious impacts, primarily for populations living on the margins of its countless rivers, because when water levels are elevated, floods and/or inundations are generally observed. Thus, the main objective of this research is to study IPE through Extreme Value Theory (EVT), to estimate the return periods of these events and to identify the regions of the Brazilian Amazon where IPE have the largest values. The study was performed using daily rainfall data from the hydrometeorological network managed by the National Water Agency (Agência Nacional de Água) and from the Meteorological Data Bank for Education and Research (Banco de Dados Meteorológicos para Ensino e Pesquisa) of the National Institute of Meteorology (Instituto Nacional de Meteorologia), covering the period 1983-2012. First, homogeneous rainfall regions were determined through cluster analysis, using the hierarchical agglomerative Ward method. Then, synthetic series representing the homogeneous regions were created. Next, EVT was applied to these series through the Generalized Extreme Value (GEV) distribution and the Generalized Pareto Distribution (GPD). The goodness of fit of these distributions was evaluated by the Kolmogorov-Smirnov test, which compares the empirical cumulative distributions with the theoretical ones. Finally, the composition technique was used to characterize the prevailing atmospheric patterns for the occurrence of IPE. The results suggest that the Brazilian Amazon has six homogeneous rainfall regions. More severe IPE are expected to occur in the south and on the Amazon coast. 
More intense rainfall events are expected during the rainy or transition seasons of each sub-region, with total daily precipitation of 146.1, 143.1 and 109.4 mm (GEV) and 201.6, 209.5 and 152.4 mm (GPD), at least once a year, in the south, on the coast and in the northwest of the Brazilian Amazon, respectively. For the southern Amazon, the composition analysis revealed that IPE are associated with the configuration and formation of the South Atlantic Convergence Zone. Along the coast, intense precipitation events are associated with mesoscale systems such as squall lines. In the northwestern Amazon, IPE are apparently associated with the Intertropical Convergence Zone and/or local convection.
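The regionalisation step, agglomerative Ward clustering, can be sketched with SciPy (the points below are synthetic stand-ins for station rainfall features, not the ANA/INMET records):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Two well-separated synthetic groups of "stations" in a 2-D feature space
stations = np.vstack([rng.normal(0.0, 0.5, (10, 2)),
                      rng.normal(5.0, 0.5, (10, 2))])

Z = linkage(stations, method="ward")             # agglomerative Ward linkage
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 regions
```

Ward's method merges, at each step, the pair of clusters whose fusion least increases the total within-cluster variance, which is why it tends to produce the compact, homogeneous regions wanted here.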

Relevance:

10.00%

Publisher:

Abstract:

An important problem faced by the oil industry is the distribution of multiple oil products through pipelines. Distribution is done in a network composed of refineries (source nodes), storage parks (intermediate nodes) and terminals (demand nodes), interconnected by a set of pipelines transporting oil and derivatives between adjacent areas. Constraints related to storage limits, delivery times, source availability, and sending and receiving limits, among others, must be satisfied. Some researchers deal with this problem from a discrete viewpoint in which the flow in the network is seen as the sending of batches. Usually there is no separation device between batches of different products, and the losses due to interfaces may be significant. Minimizing delivery time is a typical objective adopted by engineers when scheduling product sendings in pipeline networks. However, the costs incurred due to losses in interfaces cannot be disregarded. The cost also depends on pumping expenses, which are mostly due to the cost of electricity. Since the industrial electricity tariff varies over the day, pumping at different time periods has different costs. This work presents an experimental investigation of computational methods designed to deal with the problem of distributing oil derivatives in networks considering three minimization objectives simultaneously: delivery time, losses due to interfaces and electricity cost. The problem is NP-hard and is addressed with hybrid evolutionary algorithms. The hybridizations are mainly focused on Transgenetic Algorithms and classical multi-objective evolutionary algorithm architectures such as MOEA/D, NSGA2 and SPEA2. Three architectures, named MOTA/D, NSTA and SPETA, are applied to the problem. An experimental study compares the algorithms on thirty test cases. To analyse the results obtained with the algorithms, Pareto-compliant quality indicators are used, and the significance of the results is evaluated with non-parametric statistical tests.
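One Pareto-compliant quality indicator of the kind mentioned, the additive epsilon indicator, can be sketched as follows (the generic definition, not necessarily the exact indicator set used in this study): it is the smallest amount by which every point of an approximation set must be improved, in all objectives, so that the set weakly dominates a reference set.

```python
def additive_epsilon(approx, ref):
    """Additive epsilon indicator (minimisation): the smallest eps such
    that shifting every point of `approx` by eps in each objective makes
    some shifted point weakly dominate each point of `ref`."""
    return max(
        min(max(a_i - r_i for a_i, r_i in zip(a, r)) for a in approx)
        for r in ref
    )
```

A value of 0 (or less) means the approximation already weakly dominates the reference set, which is what makes the indicator suitable for comparing the three-objective fronts produced by MOTA/D, NSTA and SPETA.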

Relevance:

10.00%

Publisher:

Abstract:

Multi-objective problems may have many optimal solutions, which together form the Pareto optimal set. A class of heuristic algorithms for those problems, in this work called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be bounded or unbounded. The benefit of using an unbounded archive is the guarantee that all the nondominated solutions generated in the process will be saved. However, due to the large number of solutions that can be generated, keeping such an archive and frequently comparing new solutions to the stored ones may demand a high computational cost. The alternative is to use a bounded archive. The problem that emerges from this situation is the need to discard nondominated solutions when the archive is full. Some techniques were proposed to handle this problem, but investigations show that none of them can reliably prevent the deterioration of the archives. This work investigates a technique to be used together with the ideas previously proposed in the literature to deal with bounded archives. The technique consists of keeping the discarded solutions in a secondary archive and periodically recycling them, bringing them back to the optimization. Three methods of recycling are presented. In order to verify whether these ideas are capable of improving the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D and NSGA-III algorithms, applied to many classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated through statistical tests.
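The recycling idea can be sketched as a bounded archive with a secondary bin (an illustrative reading of the technique, not the exact proposal or its three recycling methods): discarded solutions go to the bin instead of being lost, and can later be re-injected into the search.

```python
import random

def dominates(a, b):
    """Pareto dominance for minimisation."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class RecyclingArchive:
    """Bounded nondominated archive with a secondary 'recycling' archive."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.main, self.bin = [], []

    def add(self, point):
        if any(dominates(q, point) for q in self.main):
            self.bin.append(point)            # dominated: straight to the bin
            return
        removed = [q for q in self.main if dominates(point, q)]
        self.main = [q for q in self.main if not dominates(point, q)]
        self.bin.extend(removed)              # displaced solutions are kept
        self.main.append(point)
        if len(self.main) > self.capacity:    # full: discard one (random here,
            victim = self.main.pop(random.randrange(len(self.main)))
            self.bin.append(victim)           # stands in for smarter policies)
    def recycle(self, k):
        """Return up to k stored solutions to re-insert into the optimizer."""
        out, self.bin = self.bin[:k], self.bin[k:]
        return out
```

The random discard is only a placeholder for the archiving policies from the literature; the point of the sketch is that everything removed from the main archive remains available for periodic recycling.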


Relevance:

10.00%

Publisher:

Abstract:

The Quadratic Minimum Spanning Tree (QMST) problem is a generalization of the Minimum Spanning Tree problem in which, beyond the linear costs associated with each edge, quadratic costs associated with each pair of edges must be considered. The quadratic costs are due to interaction costs between the edges. When interactions occur between adjacent edges only, the problem is named Adjacent Only Quadratic Minimum Spanning Tree (AQMST). Both QMST and AQMST are NP-hard and model a number of real-world applications involving infrastructure network design. Linear and quadratic costs are summed in the mono-objective versions of the problems. However, real-world applications often deal with conflicting objectives. In those cases, considering linear and quadratic costs separately is more appropriate, and multi-objective optimization provides a more realistic modelling. Exact and heuristic algorithms are investigated in this work for the Bi-objective Adjacent Only Quadratic Spanning Tree Problem. The following techniques are proposed: backtracking, branch-and-bound, Pareto Local Search, Greedy Randomized Adaptive Search Procedure, Simulated Annealing, NSGA-II, a Transgenetic Algorithm, Particle Swarm Optimization and a hybridization of the Transgenetic Algorithm with the MOEA-D technique. Pareto-compliant quality indicators are used to compare the algorithms on a set of benchmark instances proposed in the literature.
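Among the techniques listed, the Pareto Local Search framework can be sketched generically (a textbook skeleton, not the thesis implementation): keep an archive of mutually nondominated solutions, repeatedly explore the neighbourhood of an unexplored archive member, and insert every nondominated neighbour found.

```python
def pareto_local_search(start, neighbors, evaluate):
    """Generic Pareto Local Search skeleton (minimisation).
    neighbors(s) yields the neighbouring solutions of s (hashable);
    evaluate(s) returns the objective vector of s."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    archive = {start: evaluate(start)}        # solution -> objective vector
    unexplored = [start]
    while unexplored:
        s = unexplored.pop()
        for n in neighbors(s):
            fn = evaluate(n)
            if n in archive or any(dominates(fv, fn) for fv in archive.values()):
                continue                      # already stored or dominated
            for t in [t for t, fv in archive.items() if dominates(fn, fv)]:
                archive.pop(t)                # drop newly dominated entries
                if t in unexplored:
                    unexplored.remove(t)
            archive[n] = fn
            unexplored.append(n)
    return archive
```

The search stops by itself once every archive member has had its neighbourhood explored, which is also why PLS, as reported for bi-AQST, tends to produce good fronts at a comparatively high computational cost.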
