34 results for Unconstrained and convex optimization
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We study under which conditions the core of a game involved in a convex decomposition of another game turns out to be a stable set of the decomposed game. Some applications and numerical examples, including the remarkable Lucas' five-player game with a unique stable set different from the core, are presented and analyzed.
Abstract:
Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic non-linear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon-constraint method that reduces the computational burden of generating a set of Pareto-optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes ethanol production in the fermentation of Saccharomyces cerevisiae.
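The epsilon-constraint strategy described above can be illustrated with a minimal sketch (not the GMA model or the criteria from the paper): one objective is maximized while the second is turned into a constraint whose bound is swept, producing one Pareto-optimal alternative per bound value. The objectives and the coupling constraint below are hypothetical.

```python
# Minimal epsilon-constraint sketch for a hypothetical bi-objective problem:
#   maximize f1(x) = x0   (stand-in for, e.g., ethanol production rate)
#   maximize f2(x) = x1   (stand-in for a second biological criterion)
# subject to the coupling constraint x0 + x1 <= 1 and 0 <= x0, x1 <= 1.
import numpy as np
from scipy.optimize import minimize

def f1(x):
    return x[0]

def f2(x):
    return x[1]

pareto = []
for eps in np.linspace(0.0, 1.0, 11):          # sweep the bound on f2
    res = minimize(
        lambda x: -f1(x),                      # maximize f1 <=> minimize -f1
        x0=[0.5, 0.5],
        bounds=[(0.0, 1.0), (0.0, 1.0)],
        constraints=[
            {"type": "ineq", "fun": lambda x, e=eps: f2(x) - e},   # f2 >= eps
            {"type": "ineq", "fun": lambda x: 1.0 - x[0] - x[1]},  # coupling
        ],
    )
    if res.success:
        pareto.append((round(f1(res.x), 3), round(f2(res.x), 3)))

print(pareto)   # each pair is one Pareto-optimal trade-off between f1 and f2
```

Each value of the bound yields one point of the Pareto front; a post-optimal filter, as proposed in the abstract, would then be applied to this list to narrow down the alternatives.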
Abstract:
Current technology trends in the medical device industry call for the fabrication of massive arrays of microfeatures, such as microchannels, on non-silicon material substrates with high accuracy, superior precision, and high throughput. Microchannels are typical features used in medical devices for dosing medication into the human body and for analyzing DNA arrays or cell cultures. In this study, the capabilities of machining systems for micro-end milling have been evaluated by conducting experiments, regression modeling, and response surface methodology. In the machining experiments using micromilling, arrays of microchannels are fabricated on aluminium and titanium plates, and the feature size and accuracy (width and depth) and surface roughness are measured. Multicriteria decision making for the selection of material and process parameters for the desired accuracy is investigated using the particle swarm optimization (PSO) method, an evolutionary computation method related to genetic algorithms (GA). Appropriate regression models are utilized within the PSO, and optimum selection of micromilling parameters for microchannel feature accuracy and surface roughness is performed. An analysis of optimal micromachining parameters in the decision variable space is also conducted. This study demonstrates the advantages of evolutionary computing algorithms in micromilling decision making and process optimization investigations, and can be expanded to other applications.
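As a rough illustration of how PSO can search a regression-based response surface, the sketch below minimizes a hypothetical surface-roughness model over two normalized parameters (e.g. feed and speed); the model, bounds and PSO settings are illustrative, not those of the study.

```python
# Minimal particle swarm optimization (PSO) over a hypothetical
# regression-style response surface (stand-in for the fitted roughness model).
import numpy as np

rng = np.random.default_rng(0)

def roughness(x):
    """Hypothetical response surface to minimize (normalized feed, speed)."""
    feed, speed = x
    return (feed - 0.3) ** 2 + 0.5 * (speed - 0.7) ** 2 + 0.1 * feed * speed

n_particles, n_iter, dim = 20, 100, 2
lo, hi = np.zeros(dim), np.ones(dim)             # normalized parameter bounds
pos = rng.uniform(lo, hi, (n_particles, dim))    # particle positions
vel = np.zeros_like(pos)                         # particle velocities
pbest = pos.copy()                               # per-particle best positions
pbest_val = np.array([roughness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()         # swarm-wide best position

w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and attraction weights
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    val = np.array([roughness(p) for p in pos])
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best parameters:", gbest, "predicted roughness:", roughness(gbest))
```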
Abstract:
Hybrid satellite-terrestrial networks provide connectivity to remote and isolated areas and make it possible to solve numerous communication problems. However, they present several challenges, since communication takes place over a terrestrial mobile channel and a contiguous satellite channel. One of these challenges is to find mechanisms that perform routing and flow control jointly and efficiently. The goal of this project is to simulate and study existing algorithms that solve these problems, as well as to propose new ones, using various convex optimization techniques. Based on the simulations carried out in this study, the different routing and flow-control problems have been analyzed in depth, and the results obtained and the performance of the algorithms employed have been evaluated. In particular, algorithms based on the dual decomposition method, the subgradient method, Newton's method and the logarithmic barrier method, among others, have been successfully implemented to solve the routing and flow-control problems posed.
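A minimal sketch of the dual decomposition / subgradient approach mentioned above is given below for a toy flow-control problem (two flows sharing one link, logarithmic utilities); the topology, utilities and step size are illustrative only.

```python
# Dual decomposition with a subgradient price update for a toy network
# utility maximization problem: maximize log(x1) + log(x2) s.t. x1 + x2 <= c.
c = 1.0          # link capacity (hypothetical)
lam = 1.0        # dual variable: the link "price"
step = 0.1       # subgradient step size

for _ in range(500):
    # Each source solves its own subproblem max log(x_i) - lam * x_i,
    # which gives x_i = 1 / lam independently of the other sources.
    x = [1.0 / lam, 1.0 / lam]
    # The link updates its price along the constraint violation (subgradient)
    # and projects it back onto the positive orthant.
    lam = max(1e-6, lam + step * (sum(x) - c))

print("rates:", x, "price:", lam)   # rates approach c/2 = 0.5, price approaches 2
```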
Abstract:
We present a polyhedral framework for establishing general structural properties on optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
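For reference, the classical Smith's rule mentioned above (the linear-cost special case that the paper's dynamic indices extend) is straightforward to state and compute; the job classes, processing times and cost rates in this sketch are made up.

```python
# Smith's rule for linear holding costs: serve classes in decreasing order of
# the index c_i / E[S_i] (holding cost rate per unit of expected processing time).
jobs = [
    # (class, expected processing time E[S_i], linear holding cost rate c_i)
    ("A", 2.0, 3.0),
    ("B", 1.0, 1.0),
    ("C", 4.0, 8.0),
]

def smith_index(job):
    _, proc_time, cost_rate = job
    return cost_rate / proc_time      # marginal cost reduction per unit of time

schedule = sorted(jobs, key=smith_index, reverse=True)
print([name for name, _, _ in schedule])   # ['C', 'A', 'B'] for these numbers
```

The extended indices of the abstract play the same role, but are defined per (class, number of jobs in class) pair rather than per class.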
Abstract:
We characterize the Walrasian allocations correspondence, in classes of exchange economies with smooth and convex preferences, by means of consistency requirements and other axioms. We present three characterization results, all of which require consistency, converse consistency and standard axioms. Two characterizations hold also on domains with a finite number of potential agents; one of them requires envy freeness (with respect to trades) and the other requires core selection. A third characterization, which requires core selection, applies only to a variable number of agents domain, but is valid also when the domain includes only a small variety of preferences.
A new approach to segmentation based on fusing circumscribed contours, region growing and clustering
Abstract:
One of the major problems in machine vision is the segmentation of images of natural scenes. This paper presents a new proposal for the image segmentation problem based on the integration of edge and region information. The main contours of the scene are detected and used to guide the subsequent region growing process. The algorithm places a number of seeds at both sides of a contour, allowing a set of concurrent growing processes to be started. A preliminary analysis of the seeds makes it possible to adjust the homogeneity criterion to the regions' characteristics. A new homogeneity criterion based on clustering analysis and convex hull construction is proposed.
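A small sketch of the convex-hull idea behind such a homogeneity criterion is given below: the feature vectors of a seed region define a convex hull, and a candidate pixel is accepted if its feature vector falls inside that hull. The two-dimensional features and values are illustrative, not the exact criterion of the paper.

```python
# Convex-hull based homogeneity test in a hypothetical 2-D feature space
# (e.g. mean intensity vs. local variance of a pixel neighbourhood).
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)

# Feature vectors sampled from the seed pixels of one growing region.
seed_features = rng.normal(loc=[120.0, 5.0], scale=[4.0, 1.0], size=(50, 2))
hull = Delaunay(seed_features)        # triangulation of the seed feature cloud

def is_homogeneous(candidate_feature):
    """Accept a candidate pixel if its features lie inside the seed hull."""
    return hull.find_simplex(candidate_feature) >= 0

print(is_homogeneous([121.0, 5.2]))   # close to the seed cloud, likely accepted
print(is_homogeneous([200.0, 30.0]))  # far outside the hull, rejected
```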
Abstract:
Transport costs in address models of differentiation are usually modeled as separable from the consumption commodity and with a parametric price. However, there are many sectors in an economy where such modeling is not satisfactory, either because transportation is supplied under oligopolistic conditions or because there is a difference (loss) between the amount delivered at the point of production and the amount received at the point of consumption. This paper is a first attempt to tackle these issues, proposing to study competition in spatial models using an iceberg-like transport cost technology that allows for concave and convex melting functions.
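To make the melting-function distinction concrete, the sketch below contrasts a concave and a convex loss profile under an iceberg-style technology; the functional forms and parameters are purely illustrative, not the ones analyzed in the paper.

```python
import math

def melted_concave(d, a=0.3):
    # Concave melting profile in distance d: losses build up quickly early on.
    return min(1.0, a * math.sqrt(d))

def melted_convex(d, a=0.05):
    # Convex melting profile in distance d: losses accelerate with distance.
    return min(1.0, a * d ** 2)

def delivered(q, d, melt):
    # Iceberg technology: the quantity received is the quantity shipped
    # minus the fraction that melts in transit.
    return q * (1.0 - melt(d))

for d in (0.0, 1.0, 4.0):
    print(d, delivered(1.0, d, melted_concave), delivered(1.0, d, melted_convex))
```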
Abstract:
In this paper, a method for enhancing current QoS routing methods by means of QoS protection is presented. In an MPLS network, the segments (links) to be protected are predefined, and an LSP request involves, apart from establishing a working path, creating a specific type of backup path (local, reverse or global). Different QoS parameters, such as network load balancing, resource optimization and minimization of LSP request rejection, should be considered. QoS protection is defined as a function of QoS parameters such as packet loss, restoration time, and resource optimization. A framework to add QoS protection to many of the current QoS routing algorithms is introduced. A backup decision module to select the most suitable protection method is formulated, and different case studies are analyzed.
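One way to picture the backup decision module is as a weighted scoring of the candidate protection methods over the QoS protection parameters; the three method names come from the abstract, but all numeric values and weights below are hypothetical.

```python
# Illustrative backup decision module: score each protection method as a
# weighted combination of normalized QoS protection metrics (lower is better).
candidates = {
    # method: (expected packet loss, restoration time in ms, extra resource use)
    "local":   (0.01, 50.0, 0.30),
    "reverse": (0.02, 80.0, 0.20),
    "global":  (0.05, 200.0, 0.10),
}
weights = (0.5, 0.3, 0.2)   # relative importance of loss, restoration, resources

def score(metrics):
    loss, restoration, resources = metrics
    return (weights[0] * loss / 0.05
            + weights[1] * restoration / 200.0
            + weights[2] * resources / 0.30)

best = min(candidates, key=lambda m: score(candidates[m]))
print("selected protection method:", best)
```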
Abstract:
We consider a dynamic multifactor model of investment with financing imperfections, adjustment costs and fixed and variable capital. We use the model to derive a test of financing constraints based on a reduced-form variable capital equation. Simulation results show that this test correctly identifies financially constrained firms even when the estimation of firms' investment opportunities is very noisy. In addition, the test is well specified in the presence of both concave and convex adjustment costs of fixed capital. We confirm empirically the validity of this test on a sample of small Italian manufacturing companies.
Abstract:
Hand gesture recognition (HGR) is currently an important research field due to the variety of situations in which it is necessary to communicate through signs, such as communication between people who use sign language and those who do not. This project presents a real-time hand gesture recognition method using the Kinect sensor for Microsoft Xbox, implemented in a Linux environment (Ubuntu) with the Python programming language and using the OpenCV computer vision library to process the data on a conventional laptop. Thanks to the Kinect sensor's ability to capture depth data of a scene, the positions and trajectories of objects can be determined in three dimensions, which makes it possible to perform a complete real-time analysis of an image or a sequence of images. The proposed recognition procedure is based on segmenting the image so as to work only with the hand, detecting its contours, and then obtaining the convex hull and the convexity defects, which are finally used to determine the number of fingers and to interpret the gesture; the final result is the transcription of its meaning in a window that serves as an interface with the interlocutor. The application can recognize the numbers from 0 to 5, since only one hand is analyzed, some popular gestures and some letters of the fingerspelling alphabet of Catalan Sign Language. The project is therefore a gateway to the field of gesture recognition and the basis for a future sign language recognition system capable of transcribing both dynamic signs and the fingerspelling alphabet.
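The convex hull / convexity defect step described above maps naturally onto OpenCV primitives. The sketch below (OpenCV 4 API) assumes the hand has already been segmented into a binary mask; the depth-based segmentation, the gesture dictionary and the defect-depth threshold are not reproduced here and are illustrative.

```python
# Counting extended fingers from a binary hand mask via convex hull and
# convexity defects (illustrative threshold; assumes OpenCV 4 return values).
import cv2
import numpy as np

def count_fingers(hand_mask):
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    contour = max(contours, key=cv2.contourArea)       # largest blob = hand
    hull_idx = cv2.convexHull(contour, returnPoints=False)
    defects = cv2.convexityDefects(contour, hull_idx)
    if defects is None:
        return 0
    valleys = 0
    for start, end, far, depth in defects[:, 0]:
        # Deep defects correspond to the valleys between extended fingers.
        if depth / 256.0 > 20.0:                       # depth is in 1/256 pixel
            valleys += 1
    return valleys + 1 if valleys else 0

# Quick check on a synthetic mask: a plain filled rectangle has no deep defects.
mask = np.zeros((200, 200), dtype=np.uint8)
cv2.rectangle(mask, (50, 50), (150, 150), 255, -1)
print(count_fingers(mask))   # 0
```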
Abstract:
Nowadays, taking care of the environment and optimizing natural resources has become a necessity. In the field of energy saving, great progress has been made and a wide range of devices is available to help us optimize energy consumption. The reality, however, is that progress in saving water has been much smaller and is largely limited to giving advice and handing out water-saving dosing devices. Who has not seen streets, gardens or houses flooded with thousands of litres of water? This reality led me to design and develop a prototype that allows better control of water consumption. The prototype mainly consists of a sensor, a solenoid valve and an Arduino Atmega board. The sensor measures the litres consumed during a certain period of time. After this sampling time, the litres consumed are compared with the usual consumption for that period. If the programmed volume is exceeded, the solenoid valve closes automatically and an SMS is sent to the phone. The alarm can be set to trigger when the two values, programmed litres and consumed litres, become equal. The percentage by which the consumed litres must exceed the programmed value to trigger the alarm can also be configured, as can the sampling time. Being able to program all these values allows an ideal adjustment for the installation to be monitored. In addition, the prototype can be used to send the meter reading to the water company automatically. In this way the water company also optimizes resources by saving the cost of sending staff to the installation to take the reading. The prototype is based on an Arduino Atmega, which processes the programmed data and the data captured by the sensor. A 2.8" TFT touch screen has also been incorporated, which allows the values to be displayed and programmed in a much more intuitive way. To send the SMS messages, an Arduino Cellular Shield - SM5100B board is used, to which only a SIM card needs to be added. At first sight, the prototype has a high cost when a single unit is manufactured and may seem of little use. However, it can save us some surprises in the water bill if there is a leak that goes unnoticed until the company's bill arrives. If it were manufactured in large quantities, the price could be lowered, making it even more attractive.