998 results for Decomposition techniques
Abstract:
A decentralized solution method for the AC power flow problem in power systems with interconnected areas is presented. The proposed methodology finds the operating point of a particular area without explicit knowledge of the network data of adjacent areas; only border information related to the interconnection lines between areas needs to be exchanged. The methodology is based on decomposing the first-order optimality conditions of the AC power flow, which is formulated as a nonlinear programming problem. A 9-bus didactic system, the IEEE Three-Area RTS-96, and the IEEE 118-bus test systems are used to demonstrate the operation and effectiveness of the distributed AC power flow.
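A minimal sketch of the coordination pattern described above, on a toy linearised two-area network (not the paper's AC/KKT formulation): each area repeatedly solves its own equations with the neighbour's border angle held fixed, exchanging only boundary values until the tie-line state agrees. All data below are illustrative.

```python
import numpy as np

# Toy two-area linearised network coupled by one tie line of susceptance b_tie.
B1 = np.array([[2.0, -1.0], [-1.0, 2.0]])   # area 1 "susceptance" matrix
B2 = np.array([[2.0, -1.0], [-1.0, 2.0]])   # area 2 "susceptance" matrix
b_tie = 0.5
P1 = np.array([0.3, -0.1])                  # local injections, area 1
P2 = np.array([0.2, -0.4])                  # local injections, area 2

th1, th2 = np.zeros(2), np.zeros(2)
for _ in range(200):
    # Area 1 solves with area 2's border angle th2[0] held fixed
    A1 = B1.copy(); A1[-1, -1] += b_tie
    rhs1 = P1.copy(); rhs1[-1] += b_tie * th2[0]
    new1 = np.linalg.solve(A1, rhs1)
    # Area 2 solves with area 1's (freshly exchanged) border angle fixed
    A2 = B2.copy(); A2[0, 0] += b_tie
    rhs2 = P2.copy(); rhs2[0] += b_tie * new1[-1]
    new2 = np.linalg.solve(A2, rhs2)
    delta = max(np.max(np.abs(new1 - th1)), np.max(np.abs(new2 - th2)))
    th1, th2 = new1, new2
    if delta < 1e-10:       # border values agree: coordinated operating point
        break

print("tie-line flow:", b_tie * (th1[-1] - th2[0]))
```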
Abstract:
We propose a new methodology to evaluate the balance between segregation and integration in functional brain networks by using singular value decomposition techniques. By means of magnetoencephalography, we obtain the brain activity of a control group of 19 individuals during a memory task. Next, we project the node-to-node correlations into a complex network that is analyzed from the perspective of its modular structure, encoded in the contribution matrix. In this way, we are able to study the role that nodes play inside and outside their community and to identify connector and local hubs. At the mesoscale level, the analysis of the contribution matrix allows us to measure the degree of overlap between communities and to quantify how far the functional networks are from the configuration that best balances integrated and segregated activity.
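The following is a hedged sketch of this style of analysis, not the authors' pipeline: surrogate signals stand in for the MEG recordings, a crude spectral bisection stands in for the community detection, and the node-to-community contribution matrix is then inspected through its singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 20, 500
X = rng.standard_normal((n, T))             # surrogate signals (stand-in for MEG)
R = np.corrcoef(X)                          # node-to-node correlations
A = (np.abs(R) > 0.1).astype(float)         # thresholded functional network
np.fill_diagonal(A, 0)

# Crude two-community partition via the leading eigenvector of the modularity matrix
k = A.sum(1); m2 = k.sum()
B = A - np.outer(k, k) / m2                 # symmetric modularity matrix
labels = (np.linalg.eigh(B)[1][:, -1] > 0).astype(int)

# Contribution matrix: links of each node towards each community
C = np.stack([A[:, labels == c].sum(1) for c in (0, 1)], axis=1)
U, s, Vt = np.linalg.svd(C, full_matrices=False)
print("singular values (overlap/balance indicators):", np.round(s, 2))

# Connector-like hubs: high degree, contributions spread over both communities
spread = C.min(1) / np.maximum(C.max(1), 1)
print("most connector-like node:", int(np.argmax(spread * k)))
```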
Abstract:
This Thesis quantifies the chromatic variation caused by varnishes in the colour of different types of construction timber, obtaining a mathematical model for predicting the timber's colour. The performance of sixteen supposedly colourless varnishes is analysed, applied to twenty types of timber, angiosperms and gymnosperms, of different densities and latitudes. Both materials are frequently used in the construction field and are easily found in the stores and warehouses of both sectors. Chromatic decomposition techniques, based on a reflection optical microscope, are used to obtain a graphic range of histograms with numerical values of luminosity and chromatic composition, and thereby to confirm that the varnishes sold as colourless are not completely colourless but tend to shift towards one of the basic colours. In the experimental procedure of the Thesis, the 16 varnishes are applied to the 20 types of timber, and colour histograms are obtained from photographic campaigns carried out five years apart, yielding not only the colour variation caused by the varnishes on the original timber but also the influence of five years of ageing. The Thesis identifies the most suitable type of varnish for each kind of timber, namely the one producing the least chromatic variation. It also derives a mathematical model that predicts the final colour of the treated timber from the initial colour of the unvarnished timber, and proposes a recommendation of the products to use on each type of timber on the basis of its original colour.
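As a rough sketch of the kind of colour-prediction model described above (the thesis's actual model form is not specified here), one can fit an affine map from unvarnished to varnished colour by least squares over the sampled timbers; all numbers below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
initial = rng.uniform(40, 220, size=(20, 3))          # 20 timbers, RGB before varnishing
true_M = np.array([[0.90, 0.02, 0.00],
                   [0.00, 0.85, 0.05],
                   [0.00, 0.00, 0.80]])
# Synthetic "varnished" colours: affine transform of the initial colour plus noise
final = initial @ true_M.T + np.array([5.0, 8.0, -4.0]) + rng.normal(0, 2, (20, 3))

A = np.hstack([initial, np.ones((20, 1))])            # affine design matrix
coef, *_ = np.linalg.lstsq(A, final, rcond=None)      # fitted prediction model
predicted = A @ coef
print("mean absolute error per channel:", np.round(np.abs(predicted - final).mean(0), 2))
```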
Abstract:
This Thesis is developed in the framework of distributed execution of mobile services and contributes to the definition and development of the concept of the prosumer user. The prosumer user is characterized by using a mobile phone to create, provide and execute services. This new user model contributes to the advancement of the information society, as the prosumer is transformed from a producer of content into a producer of services (the latter consisting of content plus the logic to access, process and represent it). The overall goal of this Thesis is to provide a model for the creation, distribution and execution of services in the mobile environment that enables non-programmers (prosumer users) who are experts in a given domain to create and execute their own applications and services. For this purpose, methodologies, processes, algorithms and mechanisms adaptable to specific domains are defined, developed and implemented to build environments for the distributed execution of mobile services for prosumer users. The provision of creation tools adapted to non-expert users is a current trend pursued in several research works; however, no service development methodology has been proposed that involves the prosumer user in the design, development, implementation and validation of services. This Thesis studies the most innovative methodologies and technologies related to co-creation and relies on this analysis to define and validate a methodology that enables the user to be responsible for creating final services. Since mobile prosumer environments are a specific case of environments for the distributed execution of mobile services, this Thesis investigates techniques for service adaptation, distribution and coordination and for resource access, identifying as requirements the challenges of such environments and the characteristics of the participating users. It contributes to service adaptation by defining a variability model that supports the interdependence of user personalization decisions, incorporating guidance and error-detection mechanisms. Service distribution is implemented using SPQR-tree decomposition techniques, quantifying the impact of splitting any service across different domains. At the communication level, coordinating the execution of distributed services raises several problems, such as link losses, connections, disconnections and participant discovery, which are solved using dissemination techniques based on publish-subscribe models and gossip algorithms. To achieve flexible execution of distributed services in mobile environments, adaptation to changes in resource availability is supported, and a communication infrastructure provides uniform and efficient access to resources. Experimental validations have been conducted to assess the feasibility of the proposed solutions, defining relevant application scenarios (the new intelligent universe, service prosumerization in hospital environments, and emergencies in the Web of Things).
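One ingredient named above, gossip-based dissemination, can be sketched in a few lines; the toy below (synthetic overlay, push gossip with a fixed fanout) only illustrates the mechanics and is far simpler than the thesis's publish-subscribe protocols or its SPQR-tree service decomposition.

```python
import random

random.seed(0)
n, fanout, max_rounds = 30, 3, 20
# Random overlay: each node knows 5 peers (links made symmetric)
neigh = {i: set() for i in range(n)}
for i in range(n):
    for j in random.sample([j for j in range(n) if j != i], 5):
        neigh[i].add(j); neigh[j].add(i)

informed = {0}                                  # node 0 publishes an update
rounds = 0
while len(informed) < n and rounds < max_rounds:
    rounds += 1
    for node in list(informed):
        # each informed node pushes the update to a few random peers
        peers = random.sample(sorted(neigh[node]), min(fanout, len(neigh[node])))
        informed.update(peers)

print(f"{len(informed)}/{n} nodes informed after {rounds} gossip rounds")
```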
Using interior point algorithms for the solution of linear programs with special structural features
Abstract:
Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages, the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least-squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions are included for improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented, taking sparsity into account. The performance of these algorithms on randomly generated separable and non-separable problems is also reported.
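For illustration, a minimal Jacobi-preconditioned conjugate gradient solver of the kind highlighted in this abstract is sketched below; real interior point implementations exploit sparsity and far more elaborate preconditioners.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, maxit=500):
    """Solve A x = b for SPD A, with diagonal preconditioner M_inv = 1/diag(A)."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r                       # apply preconditioner
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p       # update search direction
        rz = rz_new
    return x

rng = np.random.default_rng(0)
G = rng.standard_normal((50, 30))
A = G.T @ G + 0.1 * np.eye(30)          # SPD matrix, normal-equations-like
b = rng.standard_normal(30)
x = pcg(A, b, M_inv=1.0 / np.diag(A))
print("residual norm:", np.linalg.norm(A @ x - b))
```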
Abstract:
The increasing emphasis on mass customization, shortened product lifecycles and synchronized supply chains, coupled with advances in information systems, is driving most firms towards make-to-order (MTO) operations. Increasing global competition, lower profit margins and higher customer expectations force MTO firms to plan their capacity by managing the effective demand. The goal of this research was to maximize the operational profit of a make-to-order operation by selectively accepting incoming customer orders and simultaneously allocating capacity for them at the sales stage. To integrate the two decisions, a Mixed-Integer Linear Program (MILP) was formulated that can aid an operations manager in an MTO environment in selecting a set of potential customer orders such that all selected orders are fulfilled by their deadlines. The proposed model combines the order acceptance/rejection decision with detailed scheduling. Experiments with the formulation indicate that for larger problem sizes the computational time required to determine an optimal solution is prohibitive. The formulation has a block-diagonal structure and can be decomposed into one or more subproblems (one subproblem per customer order) and a master problem by applying Dantzig-Wolfe decomposition principles. To solve the original MILP efficiently, an exact Branch-and-Price algorithm was developed. Various approximation algorithms were developed to further improve the runtime. Experiments conducted show unequivocally the efficiency of these algorithms compared to a commercial optimization solver. The existing literature addresses the static order acceptance problem for a single-machine environment with regular capacity, with the objective of maximizing profit under a tardiness penalty. This dissertation solves the order acceptance and capacity planning problem for a job shop environment with multiple resources, considering both regular and overtime capacity. The Branch-and-Price algorithms developed in this dissertation are faster and can be incorporated in a decision support system to be used on a daily basis to help make intelligent decisions in an MTO operation.
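A drastically simplified sketch of the order-selection decision is given below, using the PuLP modelling library (an assumed tool, not the dissertation's implementation): the full MILP with deadlines, job shop routing and overtime is reduced here to a single aggregate capacity constraint.

```python
import pulp

# Toy data: profit and total processing time (hours) of five incoming orders,
# and one aggregate capacity of 40 regular hours (all values hypothetical).
profit = [120, 80, 95, 60, 150]
hours = [15, 10, 12, 8, 22]
capacity = 40

prob = pulp.LpProblem("order_acceptance", pulp.LpMaximize)
accept = [pulp.LpVariable(f"accept_{i}", cat="Binary") for i in range(5)]
prob += pulp.lpSum(profit[i] * accept[i] for i in range(5))            # maximize profit
prob += pulp.lpSum(hours[i] * accept[i] for i in range(5)) <= capacity  # capacity limit
prob.solve(pulp.PULP_CBC_CMD(msg=False))

print("accepted orders:", [i for i in range(5) if accept[i].value() == 1])
print("profit:", pulp.value(prob.objective))
```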
Abstract:
Unstructured mesh codes for modelling continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Parallelisation of such codes using Single Program Multiple Data (SPMD) domain decomposition techniques implemented with message passing has been demonstrated to provide high parallel efficiency, scalability to large numbers of processors P, and portability across a wide range of parallel platforms. High efficiency, especially for large P, requires that load balance is achieved in each parallel loop. For a code in which loops span a variety of mesh entity types, for example elements, faces and vertices, some compromise is required between the load balance for each entity type and the quantity of inter-processor communication required to satisfy data dependences between processors.
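The compromise can be made concrete with a toy partitioning experiment; the hedged sketch below greedily grows P parts on a small synthetic element graph and reports both the per-processor loads and the number of cut edges that would generate inter-processor communication. Production codes use dedicated graph partitioners instead.

```python
import collections

# 4x4 grid of elements, edges between side-neighbours (a stand-in mesh)
n, P = 16, 4
adj = collections.defaultdict(list)
for i in range(16):
    r, c = divmod(i, 4)
    for dr, dc in ((0, 1), (1, 0)):
        rr, cc = r + dr, c + dc
        if rr < 4 and cc < 4:
            j = rr * 4 + cc
            adj[i].append(j); adj[j].append(i)

# Greedy BFS growth of P equal-sized parts from corner seeds
part = {}
target = n // P
for p, seed in enumerate([0, 3, 12, 15]):
    frontier = [seed]
    while frontier and sum(1 for v in part.values() if v == p) < target:
        v = frontier.pop(0)
        if v in part:
            continue
        part[v] = p
        frontier += [u for u in adj[v] if u not in part]
for v in range(n):  # sweep up any unassigned elements onto the least loaded part
    part.setdefault(v, min(range(P), key=lambda p: list(part.values()).count(p)))

loads = [list(part.values()).count(p) for p in range(P)]
cut = sum(1 for v in adj for u in adj[v] if v < u and part[v] != part[u])
print("per-processor loads:", loads, "| cut (communication) edges:", cut)
```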
Abstract:
Evolutionary algorithms alone cannot solve optimization problems very efficiently, since these algorithms involve many random (not very rational) decisions. Combining evolutionary algorithms with other techniques has proven to be an efficient optimization methodology. In this talk, I will explain the basic ideas of our three algorithms along this line: (1) the orthogonal genetic algorithm, which treats crossover/mutation as an experimental design problem; (2) the multiobjective evolutionary algorithm based on decomposition (MOEA/D), which uses decomposition techniques from traditional mathematical programming in a multiobjective evolutionary algorithm; and (3) the regularity-model-based multiobjective estimation of distribution algorithm (RM-MEDA), which uses the regularity property and machine learning methods to improve multiobjective evolutionary algorithms.
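The decomposition idea behind MOEA/D, item (2), can be sketched compactly: a family of weight vectors turns a biobjective problem into scalar Tchebycheff subproblems, each improved with solutions drawn from neighbouring subproblems. The sketch below strips away most of the real algorithm's machinery and uses an illustrative test problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def F(x):                                    # simple biobjective test problem
    return np.array([x[0], 1 - np.sqrt(x[0]) + np.sum((x[1:] - 0.5) ** 2)])

N, T, dim, gens = 30, 5, 5, 100
W = np.stack([np.linspace(0.01, 0.99, N), 1 - np.linspace(0.01, 0.99, N)], 1)
nbrs = np.argsort(((W[:, None] - W[None]) ** 2).sum(-1), 1)[:, :T]   # neighbourhoods
X = rng.random((N, dim))                     # one solution per subproblem
FX = np.array([F(x) for x in X])
z = FX.min(0)                                # ideal point estimate

def tcheby(f, w):                            # Tchebycheff scalarization
    return np.max(w * np.abs(f - z))

for _ in range(gens):
    for i in range(N):
        a, b = X[rng.choice(nbrs[i], 2, replace=False)]        # mate within neighbourhood
        child = np.clip((a + b) / 2 + rng.normal(0, 0.1, dim), 0, 1)
        fc = F(child)
        z = np.minimum(z, fc)                                  # update ideal point
        for j in nbrs[i]:                                      # replace worse neighbours
            if tcheby(fc, W[j]) < tcheby(FX[j], W[j]):
                X[j], FX[j] = child, fc

print("approximate front (f1, f2), first 5 points:")
print(np.round(FX[np.argsort(FX[:, 0])][:5], 3))
```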
Abstract:
Several decision and control tasks in cyber-physical networks can be formulated as large-scale optimization problems with coupling constraints. In these "constraint-coupled" problems, each agent is associated with a local decision variable subject to individual constraints. This thesis explores the use of primal decomposition techniques to develop tailored distributed algorithms for this challenging set-up over graphs. We first develop a distributed scheme for convex problems over random time-varying graphs with non-uniform edge probabilities. The approach is then extended to unknown cost functions estimated online. Subsequently, we consider Mixed-Integer Linear Programs (MILPs), which are of great interest in smart grid control and cooperative robotics. We propose a distributed methodological framework to compute a feasible solution to the original MILP, with guaranteed suboptimality bounds, and extend it to general nonconvex problems. Monte Carlo simulations highlight that the approach represents a substantial advance over the state of the art, making it a valuable solution for new toolboxes addressing large-scale MILPs. We then propose a distributed Benders decomposition algorithm for asynchronous unreliable networks. This framework is then used as a starting point to develop distributed methodologies for a microgrid optimal control scenario. We develop an ad-hoc distributed strategy for a stochastic set-up with renewable energy sources and show a case study with samples generated using Generative Adversarial Networks (GANs). We then introduce a software toolbox named ChoiRbot, based on the novel Robot Operating System 2, and show how it facilitates simulations and experiments in distributed multi-robot scenarios. Finally, we consider a pickup-and-delivery vehicle routing problem for which we design a distributed method inspired by the approach for general MILPs, and show its efficacy through simulations and experiments in ChoiRbot with ground and aerial robots.
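A toy instance of the primal decomposition idea underlying the thesis is sketched below, hedged as a centralised caricature of the distributed schemes: a master allocates budgets summing to the coupling limit, each agent solves its local subproblem and reports a multiplier, and the allocation is updated by a projected (sub)gradient step.

```python
import numpy as np

# Constraint-coupled toy: minimise sum_i (x_i - c_i)^2  s.t.  sum_i x_i <= b
c = np.array([3.0, 1.0, 2.0])      # agents' unconstrained optima (toy data)
b = 4.0                            # coupling budget
y = np.full(3, b / 3)              # master's initial allocation, sum(y) = b

for _ in range(300):
    # Agent i's local subproblem min (x-c_i)^2 s.t. x <= y_i has solution
    # x_i = min(c_i, y_i) and budget-constraint multiplier max(0, 2(c_i - y_i)).
    lam = np.maximum(0.0, 2.0 * (c - y))
    y = y + 0.2 * lam                  # loosen allocations with high multipliers...
    y += (b - y.sum()) / 3.0           # ...then project back onto sum(y) = b

print("allocation y:", np.round(y, 3))
print("solution   x:", np.round(np.minimum(c, y), 3))   # multipliers equalized
```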
Abstract:
Considerable research effort has been devoted to the topic of optimal planning of distribution systems. The nonlinear nature of the system, the need to consider a large number of scenarios and the increasing necessity to deal with uncertainties make optimal planning in distribution systems a difficult task. Heuristic approaches have been proposed to deal with these issues, overcoming some of the inherent difficulties of classic methodologies. This paper considers several methodologies used to address planning problems of electrical power distribution networks, namely mixed-integer linear programming (MILP), ant colony algorithms (AC), genetic algorithms (GA), tabu search (TS), branch exchange (BE), simulated annealing (SA) and the Benders decomposition deterministic nonlinear optimization technique (BD). The adequacy of these techniques to deal with uncertainties is discussed. The behaviour of each optimization technique is compared in terms of both the quality of the obtained solution and the performance of the methodology. The paper presents results of the application of these optimization techniques to a real case, a 10-kV electrical distribution system with 201 nodes that feeds an urban area.
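As a hedged illustration of one listed heuristic, the sketch below applies simulated annealing to a toy binary planning vector (which candidate feeder sections to build, with hypothetical costs and coverages); the planning models compared in the paper are far richer.

```python
import math, random

random.seed(0)
n = 12
build_cost = [random.uniform(1, 4) for _ in range(n)]    # hypothetical section costs
coverage = [random.uniform(0.5, 2.0) for _ in range(n)]  # hypothetical load served
demand = 8.0

def cost(x):
    supplied = sum(c for c, xi in zip(coverage, x) if xi)
    # build cost plus a penalty for any unserved demand
    return sum(bc for bc, xi in zip(build_cost, x) if xi) + 10 * max(0.0, demand - supplied)

x = [0] * n
cur_cost = cost(x)
best, best_cost = x[:], cur_cost
T = 5.0
for _ in range(2000):
    y = x[:]
    y[random.randrange(n)] ^= 1                          # flip one candidate section
    dc = cost(y) - cur_cost
    if dc < 0 or random.random() < math.exp(-dc / T):    # SA accept/reject rule
        x, cur_cost = y, cost(y)
        if cur_cost < best_cost:
            best, best_cost = x[:], cur_cost
    T *= 0.998                                           # geometric cooling

print("best cost:", round(best_cost, 2), "| built sections:", best)
```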
Abstract:
The Spanish savings banks attracted considerable interest within the scientific arena, especially following the removal of regulatory constraints in the 1980s. Nonetheless, a lack of research is identified with respect to the mainstream paths given by strategic groups and the analysis of total factor productivity. Therefore, on the basis of the resource-based view of the firm and cluster analysis, we use changes in structure and performance ratios to identify the strategic groups extant in the sector. We obtain a three-way division, which we link with different input-output specifications defining strategic paths. Consequently, on the basis of these three dissimilar approaches, we compute and decompose a Hicks-Moorsteen total factor productivity index. The results obtained admit an interesting interpretation under a multi-strategic approach, together with the setbacks of employing cluster analysis within a complex strategic environment. Moreover, we also propose an ex-post method of analysing the outcomes of the decomposed total factor productivity index that could be merged with non-traditional techniques of forming strategic groups, such as cognitive approaches.
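The cluster-analysis step can be illustrated with a minimal k-means run on synthetic structure/performance ratios (the ratio choices below are hypothetical and scikit-learn is an assumed tool; the study's own grouping procedure may differ).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 45 synthetic savings banks, 4 ratios (e.g. loans/assets, deposits/assets,
# staff costs/assets, ROA -- hypothetical choices) around three group archetypes
centres = np.array([[0.6, 0.7, 0.020, 0.010],
                    [0.4, 0.8, 0.015, 0.008],
                    [0.7, 0.5, 0.025, 0.012]])
X = np.vstack([c + rng.normal(0, 0.02, (15, 4)) for c in centres])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for g in range(3):
    print(f"group {g}: {np.sum(labels == g)} banks, "
          f"mean ratios {np.round(X[labels == g].mean(0), 3)}")
```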
Abstract:
Electroencephalography (EEG) is one of the most widely used techniques for studying the brain. In this technique, the electrical signals produced in the human cortex are recorded through electrodes placed on the head. The technique, however, presents some limitations when making the recordings; the main one is known as artifacts, which are unwanted signals that mix with the EEG signals. The objective of this master's thesis is to present three new artifact-removal methods that can be applied to EEG. They are based on the application of Multivariate Empirical Mode Decomposition, a new technique used for signal processing. The proposed cleaning methods are applied to simulated EEG data containing artifacts (eye blinks), and once the cleaning procedures have been applied, the results are compared with blink-free EEG data to assess the improvement they provide. Subsequently, two of the three proposed cleaning methods are applied to real EEG data. The conclusion drawn from this work is that two of the proposed new cleaning procedures can be used in the preprocessing of real data to remove eye blinks.
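A hedged sketch of the underlying idea follows: decompose a blink-contaminated channel into intrinsic mode functions and rebuild it without the slowest modes, where blink-like drifts tend to concentrate. The thesis uses the multivariate EMD across channels; this per-channel version, using the PyEMD package (assumed installed), is only an approximation.

```python
import numpy as np
from PyEMD import EMD

t = np.linspace(0, 4, 1024)
eeg = (0.5 * np.sin(2 * np.pi * 10 * t)
       + 0.2 * np.random.default_rng(0).standard_normal(t.size))   # clean alpha-like signal
blink = 3.0 * np.exp(-((t - 2.0) ** 2) / 0.5)                       # slow blink-like drift
contaminated = eeg + blink

imfs = EMD().emd(contaminated)           # empirical mode decomposition into IMFs
cleaned = imfs[:-2].sum(axis=0)          # rebuild without the two slowest modes
                                         # (mode selection is heuristic, not the thesis's rule)
print("artifact energy before/after:",
      round(float(((contaminated - eeg) ** 2).sum()), 2),
      round(float(((cleaned - eeg) ** 2).sum()), 2))
```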
Abstract:
The standard data fusion methods may not be satisfactory for merging a high-resolution panchromatic image and a low-resolution multispectral image because they can distort the spectral characteristics of the multispectral data. The authors developed a technique, based on multiresolution wavelet decomposition, for the merging and fusion of such images. The method presented consists of adding the wavelet coefficients of the high-resolution image to the multispectral (low-resolution) data. They studied several possibilities, concluding that the method that produces the best results consists in adding the high-order coefficients of the wavelet transform of the panchromatic image to the intensity component (defined as L=(R+G+B)/3) of the multispectral image. The method is thus an improvement on standard intensity-hue-saturation (IHS or LHS) mergers. They used the 'à trous' algorithm, which allows the use of a dyadic wavelet to merge non-dyadic data in a simple and efficient scheme. They applied the method to merge SPOT and Landsat TM images. The technique presented is clearly better than the IHS and LHS mergers at preserving both spectral and spatial information.
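A hedged sketch of the 'à trous' fusion scheme follows, on synthetic images: the panchromatic detail planes obtained with the dilated B3-spline kernel are added to the intensity component L=(R+G+B)/3 and redistributed to the bands. Real SPOT/Landsat TM fusion requires registration and resampling steps that this toy omits.

```python
import numpy as np
from scipy.ndimage import convolve1d

def atrous_planes(img, levels=2):
    """Return the wavelet (detail) planes w_1..w_levels and the final smooth residual."""
    base = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0   # B3-spline kernel
    c, planes = img.astype(float), []
    for j in range(levels):
        k = np.zeros(4 * 2 ** j + 1)
        k[:: 2 ** j] = base                 # dilate kernel by inserting 'holes' (zeros)
        smooth = convolve1d(convolve1d(c, k, axis=0, mode="reflect"),
                            k, axis=1, mode="reflect")
        planes.append(c - smooth)           # detail plane at this scale
        c = smooth
    return planes, c

rng = np.random.default_rng(0)
pan = rng.random((64, 64))                                  # stand-in pan image
ms = np.stack([rng.random((64, 64)) for _ in range(3)])     # stand-in R, G, B bands

planes, _ = atrous_planes(pan)
L = ms.mean(axis=0)                         # intensity component (R+G+B)/3
L_fused = L + sum(planes)                   # inject pan detail into the intensity
fused = ms + (L_fused - L)                  # redistribute the detail to each band
print("fused shape:", fused.shape)
```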