938 results for Objective function values
Abstract:
This work is an algorithmic study of the optimization of conformal radiotherapy treatment plans. We begin with an overview of cancer, radiotherapy, and the physics of the interaction of ionizing radiation with matter. A proposal for optimizing a radiotherapy treatment plan is then developed systematically. We present the multicriteria-problem paradigm and the concepts of Pareto optimality and Pareto dominance, and propose a generic optimization model for radiotherapy treatment. We construct the model's input, estimate the dose delivered by the radiation using the dose matrix, and state the model's objective function. Optimization models in radiotherapy treatment are typically NP-hard, which justifies the use of heuristic methods. We propose three distinct methods: MOGA, MOSA, and MOTS. The design of these three metaheuristic procedures is presented; for each we give a brief motivation, the algorithm itself, and the method for tuning its parameters. The three methods are applied to a concrete case and their performances are compared. Finally, for each method we analyze the quality of the Pareto sets, selected solutions, and the respective Pareto curves.
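The Pareto-dominance relation on which all three metaheuristics rely can be made concrete in a few lines. A minimal sketch (illustrative names, objectives assumed to be minimized, not the thesis code):

```python
def dominates(u, v):
    """True if objective vector u Pareto-dominates v (all <=, at least one <)."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def pareto_set(solutions):
    """Filter a list of objective vectors down to the nondominated ones."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Example with hypothetical (tumor underdose, healthy-tissue overdose) pairs:
print(pareto_set([(0.1, 0.9), (0.2, 0.4), (0.3, 0.5)]))  # (0.3, 0.5) is dominated
```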
Abstract:
Nonogram is a logic puzzle whose associated decision problem is NP-complete. It has applications in pattern-recognition problems and data compression, among others. The puzzle consists of assigning colors to pixels arranged in an N × M matrix so as to satisfy row and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solving Nonograms. Depth-first search was chosen as one exact approach because it is a typical brute-force algorithm that is easy to implement. Another exact approach was based on the Las Vegas algorithm, with which we investigate whether the randomness introduced by a Las Vegas-based algorithm offers an advantage over depth-first search. The Nonogram is also transformed into a constraint satisfaction problem. Three heuristic approaches are proposed: a tabu search and two memetic algorithms. A new way of computing the objective function is also proposed. The approaches are applied to 234 instances, ranging in size from 5 × 5 to 100 × 100 and including both logical and random Nonograms.
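The row and column constraints that define the puzzle reduce to a run-length check, which any of the approaches above, exact or heuristic, needs as a feasibility test. A minimal sketch for black-and-white puzzles, with illustrative names:

```python
from itertools import groupby

def runs(line):
    """Run lengths of filled cells (1s) in a row or column."""
    return [sum(g) for k, g in groupby(line) if k == 1]

def satisfies(line, clue):
    """A line satisfies its clue when its filled runs match the clue in order."""
    return runs(line) == list(clue)

print(satisfies([1, 1, 0, 1, 0], [2, 1]))  # True: runs are 2 then 1
```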
Abstract:
The history-matching procedure for an oil reservoir is of paramount importance for characterizing the reservoir parameters (static and dynamic), which leads to more accurate production forecasts. Throughout this process one seeks reservoir-model parameters that reproduce the behaviour of the real reservoir; the resulting model may then be used to forecast production and to support oil-field management. During history matching the reservoir-model parameters are modified, and for every new set of parameters a fluid-flow simulation is performed to evaluate whether the new set reproduces the observations from the actual reservoir. The reservoir is said to be matched when the discrepancies between the model predictions and the observations of the real reservoir fall below a certain tolerance. Determining the model parameters via history matching requires minimizing an objective function (the difference between observed and simulated production according to a chosen norm) in a parameter space populated by many local minima; in other words, more than one set of reservoir-model parameters fits the observations. Owing to this non-uniqueness of the solution, the inverse problem associated with history matching is ill-posed. To reduce the ambiguity, it is necessary to incorporate a priori information and constraints on the reservoir-model parameters to be determined. In this dissertation, the inverse problem associated with history matching was regularized by introducing a smoothness constraint on two parameters: permeability and porosity. This constraint carries the geological assumption that these two properties vary smoothly in space. It is therefore necessary to find the relative weight of this constraint in the objective function that stabilizes the inversion while introducing minimum bias. A sequential search method called COMPLEX, which does not require derivatives of the objective function, was used to find the reservoir-model parameters that best reproduce the observations of a semi-synthetic model. We show that the judicious introduction of the smoothness constraint into the objective function reduces the associated ambiguity and introduces minimum bias in the estimates of permeability and porosity of the semi-synthetic reservoir model.
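The regularized objective described here has a simple generic shape: a least-squares data misfit plus a weighted smoothness penalty. A minimal sketch of that structure, with the flow simulator stubbed out and the weight mu as a placeholder (this is the generic form, not the thesis implementation):

```python
import numpy as np

def objective(params, d_obs, simulate, mu):
    d_sim = simulate(params)                  # fluid-flow simulation (stub)
    misfit = np.sum((d_obs - d_sim) ** 2)     # chosen norm: least squares
    smooth = np.sum(np.diff(params) ** 2)     # penalize abrupt spatial changes
    return misfit + mu * smooth               # mu trades bias against stability

# Toy usage: the "simulator" is just a moving average of a 1-D property profile.
k = np.ones(3) / 3
print(objective(np.array([1., 2., 4., 4.]), np.array([2., 3.]),
                lambda p: np.convolve(p, k, mode="valid"), mu=0.5))
```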
Abstract:
Gravity inversion is a mathematical process that can be used to estimate the basement relief of a sedimentary basin. However, the inverse problem in potential-field methods has neither a unique nor a stable solution, so additional information (beyond the gravity measurements) must be supplied by the interpreter to make the problem well-posed. This dissertation applies a gravity inversion method to estimate the basement relief of the onshore Potiguar Basin. The density contrast between sediments and basement is assumed known and constant. The proposed methodology discretizes the sedimentary layer into a grid of juxtaposed rectangular prisms whose thicknesses correspond to the depth to basement, which is the parameter to be estimated. To stabilize the inversion I introduce constraints in accordance with the known geologic information. The method minimizes an objective function that requires the model not only to be smooth and close to a seismic-derived reference model but also to honor well-log constraints; the latter are introduced through logarithmic barrier terms in the objective function. The inversion was applied so as to simulate different phases of the exploration development of a basin, in distinct scenarios: the first used only gravity data and a flat reference model; the second was divided into two cases, incorporating either borehole-log information or the seismic model into the process. Finally, I incorporated the basement depth generated by seismic interpretation into the inversion as a reference model and imposed depth constraints from boreholes using the primal logarithmic barrier method. In every scenario the estimated basement relief satisfactorily reproduced the basin framework, and incorporating the constraints improved the basement depth definition. The joint use of surface gravity data, seismic imaging, and borehole-logging information makes the process more robust and improves the estimate, providing a result closer to the actual basement relief. It is also worth remarking that the result obtained in the first scenario already provided a very coherent basement relief when compared with the known basin framework. This is significant given the differences in cost and environmental impact between gravimetric surveys on the one hand and seismic surveys and well drilling on the other.
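The logarithmic barrier terms used to honor the well-log depth bounds can be sketched generically: bound constraints enter the objective as -log terms that grow without bound near the limits, and the barrier weight is driven toward zero over outer iterations. All names below are schematic assumptions, not the dissertation's implementation:

```python
import numpy as np

def barrier_objective(p, misfit, p_ref, alpha, beta, wells):
    """p: prism depths; wells: {prism index: (lower, upper) depth bound}."""
    phi = misfit(p) + alpha * np.sum((p - p_ref) ** 2)    # data + reference model
    for i, (lo, hi) in wells.items():                     # well-log depth bounds
        phi -= beta * (np.log(p[i] - lo) + np.log(hi - p[i]))
    return phi  # beta shrinks toward 0 across the outer barrier iterations
```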
Abstract:
OBJECTIVE: An increased body mass index (BMI) has been associated with a higher prevalence of asthma in adults. The present study aims to evaluate the association between the prevalence of obesity and asthma severity. METHODS: The medical records of two hundred asthmatic patients over 20 years of age were evaluated retrospectively. Asthma severity was classified from the recorded clinical history and diagnosis, the spirometry results, and the prescribed medication. BMI was calculated, and patients with BMI > 30 kg/m² were considered obese. RESULTS: 23% of the patients had intermittent asthma; 25.5%, mild persistent asthma; 24%, moderate persistent asthma; and 27.5%, severe persistent asthma. A BMI < 29.9 kg/m² was observed in 68% of the patients, and in 32% the BMI was > 30 kg/m². The odds ratio for the relationship between obesity and asthma severity was 1.17 (95% CI: 0.90-1.53; p > 0.05). CONCLUSIONS: In the sample studied, no correlation was found between obesity and asthma severity in either males or females.
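For reference, the reported odds ratio and its Wald 95% confidence interval follow the standard 2 × 2 computation, OR = (a·d)/(b·c) with CI = exp(ln OR ± 1.96·sqrt(1/a + 1/b + 1/c + 1/d)). The sketch below uses hypothetical counts, since the abstract does not give the cell values:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """2x2 table: a,b = exposed with/without outcome; c,d = unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

print(odds_ratio_ci(20, 44, 12, 124))  # hypothetical obese/severe counts
```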
Abstract:
The study of robust design methodologies and techniques has become a topical area of design optimization in nearly all engineering and applied-science disciplines over the last 10 years, owing to the unavoidable imprecision and uncertainty present in real-world design problems. To develop a fast optimizer for robust design, a methodology based on polynomial chaos and a tabu search algorithm is proposed. Polynomial chaos is employed as a stochastic response-surface model of the objective function to efficiently evaluate the robust performance parameter, while a mechanism that assigns expected fitness only to promising solutions is introduced into the tabu search algorithm to minimize the need to determine robust metrics for intermediate solutions. The proposed methodology is applied to the robust design of a practical inverse problem with satisfactory results.
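A polynomial-chaos response surface fits orthogonal polynomials of the uncertain input to samples of the objective, after which the mean and variance (typical robust metrics) fall out of the coefficients without further sampling. A minimal one-dimensional sketch with a stand-in objective, assuming a standard-normal input (probabilists' Hermite basis, so Var = Σ c_k² k!):

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

rng = np.random.default_rng(0)
xi = rng.standard_normal(200)              # samples of the uncertain input
y = (xi - 0.5) ** 2                        # stand-in for the true objective

c = He.hermefit(xi, y, deg=3)              # least-squares PCE coefficients
mean = c[0]                                # E[y] is the zeroth coefficient
var = sum(c[k] ** 2 * factorial(k) for k in range(1, len(c)))
print(mean, var)                           # robust metrics, no re-sampling
```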
Abstract:
This article proposes a semi-automatic method for road extraction that combines a stereo pair of low-resolution aerial images with a polyhedron generated from a digital terrain model (DTM). The problem is formulated in object space through an objective function that models the 'road' object as a smooth curve lying on a polyhedral surface. The proposed objective function also depends on radiometric information, accessed in image space via the collinearity relationship between road points in object space and their counterparts in the image spaces of the stereo pair. The polygonal line that best models the selected road is obtained by optimizing the objective function in object space using a dynamic programming algorithm. The optimization process is iterative and depends on an operator supplying an initial approximation of the selected road. The results showed that the method is robust against anomalies along the roads, such as obstructions caused by shadows and trees.
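The dynamic-programming step admits a compact generic form: one candidate point per cross section along the road, with a cost combining a per-node (e.g. radiometric) term and a smoothness term between neighbors. A minimal sketch with placeholder cost functions, not the paper's exact formulation:

```python
def dp_polyline(candidates, node_cost, pair_cost):
    """candidates: list of lists of points, one list per section along the road."""
    best = [(node_cost(p), [p]) for p in candidates[0]]
    for section in candidates[1:]:
        # For each candidate q, keep the cheapest extension of any partial path.
        best = [min(((c + pair_cost(path[-1], q) + node_cost(q), path + [q])
                     for c, path in best), key=lambda t: t[0])
                for q in section]
    return min(best, key=lambda t: t[0])[1]

# Toy usage with three sections of 1-D "points" and a smoothness-only cost:
path = dp_polyline([[0.0], [0.4, 1.5], [1.0]],
                   node_cost=lambda p: 0.0,
                   pair_cost=lambda p, q: (q - p) ** 2)
print(path)  # [0.0, 0.4, 1.0]: the smoother middle candidate wins
```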
Abstract:
A non-linear model is presented that optimizes the layout, design, and management of trickle irrigation systems to achieve maximum net benefit. The model consists of an objective function that maximizes profit at the farm level, subject to appropriate geometric and hydraulic constraints. It can be applied to rectangular fields with uniform or zero slope. The software used is the GAMS/MINOS package. The basic inputs are the crop-water production function, the cost function and cost of system components, and the design variables. The main outputs are the annual net benefit and the pipe diameters and lengths. To illustrate the capability of the model, a sensitivity analysis of the annual net benefit for a citrus field is performed with respect to irrigated area, ground slope, micro-sprinkler discharge, and field shape. The analysis suggests that the greatest benefit is obtained with the smallest micro-sprinkler discharge, the largest area, a square field, and zero ground slope. Investment and energy costs were the components of the objective function with the greatest effect in the 120 situations evaluated.
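The model's structure, a profit objective under geometric and hydraulic constraints, can be illustrated at toy scale with an off-the-shelf NLP solver standing in for GAMS/MINOS; all coefficients and the constraint below are made up for illustration:

```python
from scipy.optimize import minimize

def net_benefit(x):
    water, pipe_d = x
    revenue = 120 * water - 9 * water ** 2      # concave crop-water response
    cost = 15 * water + 40 * pipe_d             # water/energy plus pipe investment
    return -(revenue - cost)                    # minimize the negative benefit

res = minimize(net_benefit, x0=[1.0, 0.1],
               bounds=[(0, 10), (0.05, 0.3)],   # agronomic and pipe limits
               constraints=[{"type": "ineq",    # crude hydraulic proxy:
                             "fun": lambda x: x[1] - 0.02 * x[0]}])
print(res.x, -res.fun)                          # optimal design and net benefit
```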
Abstract:
We evaluated the influence of dietary inclusion of corn gluten meal, apocarotenoic acid ethyl ester (APO-EE), canthaxanthin, and Rhodocyclus gelatinosus R-1 biomass on broiler carcass color. These oxycarotenoid sources were used as pigment supplements to a basal ration containing yellow corn as the sole source of xanthophylls. Objective color values of L (lightness), C (chroma), and h (hue) were measured on the skin and meat surfaces of broiler carcasses. On both surfaces, R. gelatinosus R-1 biomass oxycarotenoids enhanced the chroma values (color saturation) compared with yellow-corn xanthophylls and tended to impart yellowness to the carcasses, whereas APO-EE and canthaxanthin tended to impart redness. At the concentrations studied, R. gelatinosus R-1 biomass oxycarotenoids were less effective than APO-EE and canthaxanthin in enhancing color saturation. Lightness, chroma, and hue values did not differ significantly between males and females. However, skin showed significantly higher color saturation than meat in the breast and thigh portions of the carcass.
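The L, C, h values used here are the cylindrical form of CIELAB coordinates: C* = sqrt(a*² + b*²) and h = atan2(b*, a*). A sketch of the conversion (the sample a*, b* values are made up):

```python
import math

def lab_to_lch(L, a, b):
    C = math.hypot(a, b)                       # chroma: color saturation
    h = math.degrees(math.atan2(b, a)) % 360   # hue angle in degrees, 0-360
    return L, C, h

print(lab_to_lch(65.0, 8.0, 30.0))  # high b* gives h near 75: yellowish
```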
Abstract:
Linear Matrix Inequalities (LMIs) are a powerful tool that has been used in many areas, ranging from control engineering to system identification and structural design. Several factors make LMIs appealing. One is that many design specifications and constraints can be formulated as LMIs [1]. Once a problem is formulated in terms of LMIs, it can be solved efficiently by convex optimization algorithms. The basic idea of the LMI method is to formulate a given problem as an optimization problem with a linear objective function and linear matrix inequality constraints. An intelligent structure involves distributed sensors and actuators and a control law applying localized actions in order to minimize or reduce the response at selected conditions. The objective of this work is to implement LMI-based control techniques applied to smart structures.
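The formulation pattern (a linear objective under LMI constraints) can be illustrated with the classic Lyapunov stability test, posed as a semidefinite program: find P ≻ 0 with AᵀP + PA ≺ 0. The state matrix below is a made-up stable example, not a smart-structure model:

```python
import cvxpy as cp
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])     # illustrative state matrix
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),          # P positive definite
               A.T @ P + P @ A << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)  # linear objective
prob.solve()
print(P.value)                                # a Lyapunov certificate for A
```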
Abstract:
Analog networks are proposed for solving convex nonlinear unconstrained programming problems without using gradient information of the objective function. The one-dimensional net can be used as a building block in multi-dimensional networks for optimizing objective functions of several variables.
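A software analogue of the one-dimensional gradient-free setting is golden-section search over a unimodal function on an interval; the sketch below is purely illustrative of derivative-free minimization, not of the circuit itself:

```python
def golden_section(f, lo, hi, tol=1e-6):
    """Minimize a unimodal f on [lo, hi] using only function evaluations."""
    phi = (5 ** 0.5 - 1) / 2                     # inverse golden ratio
    a, b = hi - phi * (hi - lo), lo + phi * (hi - lo)
    while hi - lo > tol:
        if f(a) < f(b):
            hi, b = b, a                         # minimum lies in [lo, b]
            a = hi - phi * (hi - lo)
        else:
            lo, a = a, b                         # minimum lies in [a, hi]
            b = lo + phi * (hi - lo)
    return (lo + hi) / 2

print(golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0))  # ~2.0
```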
Abstract:
Piecewise-Linear Programming (PLP) is an important area of Mathematical Programming that concerns the minimisation of a convex separable piecewise-linear objective function subject to linear constraints. This paper explores a subarea of PLP called Network Piecewise-Linear Programming (NPLP). Four specialised algorithms for NPLP are presented: (Strongly Feasible) Primal Simplex, the Dual Method, Out-of-Kilter, and (Strongly Polynomial) Cost-Scaling, and their relative efficiency is studied. A statistically designed experiment is used to perform a computational comparison of the algorithms. The response variable observed in the experiment is the CPU time to solve randomly generated network piecewise-linear problems classified according to problem class (Transportation, Transshipment, and Circulation), problem size, extent of capacitation, and number of breakpoints per arc. Results and conclusions on the performance of the algorithms are reported.
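A standard way to view an NPLP instance (independent of any of the four algorithms above) is to split each convex piecewise-linear arc cost into parallel arcs, one per linear piece, with segment widths as capacities and slopes as weights; convexity makes the slopes increase, so a min-cost flow fills the pieces in order. A minimal sketch of that transformation, with illustrative names:

```python
def split_arc(u, v, breakpoints, slopes):
    """breakpoints: [b1, ..., bk] flow levels; slopes: k+1 increasing slopes."""
    arcs, prev = [], 0.0
    for b, s in zip(breakpoints + [float("inf")], slopes):
        arcs.append((u, v, {"capacity": b - prev, "weight": s}))
        prev = b
    return arcs

# One arc whose cost has slopes 1, 3, 7 with breakpoints at flow 4 and 9:
for arc in split_arc("s", "t", [4.0, 9.0], [1.0, 3.0, 7.0]):
    print(arc)
```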