941 results for test case optimization
Abstract:
Several decision and control tasks in cyber-physical networks can be formulated as large-scale optimization problems with coupling constraints. In these "constraint-coupled" problems, each agent is associated with a local decision variable subject to individual constraints. This thesis explores the use of primal decomposition techniques to develop tailored distributed algorithms for this challenging set-up over graphs. We first develop a distributed scheme for convex problems over random time-varying graphs with non-uniform edge probabilities. The approach is then extended to unknown cost functions estimated online. Subsequently, we consider Mixed-Integer Linear Programs (MILPs), which are of great interest in smart grid control and cooperative robotics. We propose a distributed methodological framework to compute a feasible solution to the original MILP, with guaranteed suboptimality bounds, and extend it to general nonconvex problems. Monte Carlo simulations highlight that the approach substantially advances the state of the art, making it a valuable building block for new toolboxes addressing large-scale MILPs. We then propose a distributed Benders decomposition algorithm for asynchronous unreliable networks. This framework is then used as a starting point to develop distributed methodologies for a microgrid optimal control scenario. We develop an ad-hoc distributed strategy for a stochastic set-up with renewable energy sources, and show a case study with samples generated using Generative Adversarial Networks (GANs). We then introduce a software toolbox named ChoiRbot, based on the novel Robot Operating System 2, and show how it facilitates simulations and experiments in distributed multi-robot scenarios. Finally, we consider a Pickup-and-Delivery Vehicle Routing Problem for which we design a distributed method inspired by the approach for general MILPs, and show its efficacy through simulations and experiments in ChoiRbot with ground and aerial robots.
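As a point of reference, the constraint-coupled structure and the primal decomposition idea behind it can be sketched in generic notation; this is a textbook-style summary, not the thesis' exact statement, and the symbols f_i, g_i, X_i, b and the allocations y_i are assumed for the sketch:

```latex
% Constraint-coupled problem over N agents
\min_{x_1,\dots,x_N} \; \sum_{i=1}^{N} f_i(x_i)
\quad \text{s.t.} \quad x_i \in X_i \;\; \forall i, \qquad \sum_{i=1}^{N} g_i(x_i) \le b .

% Primal decomposition: split the shared resource, b = \sum_i y_i, and let each
% agent solve a purely local subproblem
p_i(y_i) \;=\; \min_{x_i \in X_i} \; f_i(x_i) \quad \text{s.t.} \quad g_i(x_i) \le y_i ;

% the allocations y_i are then updated along a subgradient of \sum_i p_i(y_i),
% obtained from the multipliers of the local constraints g_i(x_i) \le y_i .
```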
Abstract:
Since the study of large dam reservoirs is of worldwide interest, in this PhD project we investigated the Ridracoli reservoir, one of the main water supplies in Emilia-Romagna (north-eastern Italy). This work aims to characterize waters and sediments to better understand their composition, interactions and the processes that occur, for a better geochemical and environmental knowledge of the area. Physical and chemical analyses of the water column have shown an alternation of stratification and mixing in the reservoir's water body due to seasonal variations in temperature and density. In particular, the establishment of anoxic conditions at the bottom in late summer was observed, which in turn affects the concentration and mobility of some elements of concern (e.g. Fe and Mn) for water quality. Sediments within the reservoir and from surrounding areas were analysed for organic matter, total inorganic composition and grain size, assessing inter-element relationships and the influence of grain size, geological background and damming on their chemistry through descriptive statistics, Principal Component Analysis and Cluster Analysis. The reservoir area was also investigated in terms of pseudo-total composition (aqua regia digestion), degree of element extractability, and enrichment factors, which were then analysed and compared with regulatory limits and the literature. Sediment cores, interstitial waters, and benthic chamber data from the bottom were of great interest due to organic matter degradation, early diagenesis, mineral formation at the water-sediment interface and the observed flows. Finally, leaching tests and extraction procedures of environmental interest showed a distinctive partitioning, in both spatial and in-depth distribution, and the absence of pollution. Collectively, our results are useful for understanding the processes that occur in the water and sediments of the Ridracoli reservoir, providing important knowledge of the site that could be relevant for the management of the resource and the planning of future interventions.
Abstract:
The topic of the PhD project is the modelling of soil-water dynamics inside an instrumented embankment section along the Secchia River (Cavezzo, MO) in the period from 2017 to 2018, together with the quantification of the performance of the direct and inverse simulations. The commercial code Hydrus 2D by PC-Progress was chosen to run the direct simulations. Different soil-hydraulic models have been adopted and compared. The parameters of the different hydraulic models are calibrated using a local optimization method based on the Levenberg-Marquardt algorithm implemented in the Hydrus package. The calibration programme is carried out using different types of observation-point datasets, different weighting distributions, different combinations of optimized parameters and different initial sets of parameters. The final goal is an in-depth study of the potential and limits of inverse analysis when applied to a complex geotechnical problem such as this case study. The second part of the research focuses on the effects of plant roots and soil-vegetation-atmosphere interaction on the spatial and temporal distribution of pore water pressure in soil. The investigated soil belongs to the West Charlestown Bypass embankment, Newcastle, Australia, which in past years showed shallow instabilities; long stem planting is intended to stabilize the slope. The chosen plant species is Melaleuca styphelioides, native to eastern Australia. The research activity included the design and realization of a dedicated large-scale apparatus for laboratory experiments. Local suction measurements at given depth intervals and radial distances from the root bulb are recorded within the vegetated soil mass under controlled boundary conditions. The experiments are then reproduced numerically using the commercial code Hydrus 2D. Laboratory data are used to calibrate the root water uptake (RWU) parameters and the parameters of the hydraulic model.
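For illustration, a Levenberg-Marquardt calibration of soil-hydraulic parameters can be sketched as follows. This is a minimal stand-alone example, not the Hydrus inverse module; the van Genuchten retention model, the observations and the initial guess are invented for the sketch:

```python
import numpy as np
from scipy.optimize import least_squares

def van_genuchten(h, theta_r, theta_s, alpha, n):
    # Soil-water retention curve (van Genuchten model).
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h)) ** n) ** m

# Hypothetical observations: suction heads [cm] and measured water contents [-].
h_obs = np.array([10.0, 30.0, 100.0, 300.0, 1000.0, 3000.0])
theta_obs = np.array([0.40, 0.37, 0.30, 0.22, 0.15, 0.11])

def residuals(p):
    theta_r, theta_s, alpha, n = p
    return van_genuchten(h_obs, theta_r, theta_s, alpha, n) - theta_obs

p0 = [0.05, 0.41, 0.02, 1.5]                     # initial parameter set (assumed)
fit = least_squares(residuals, p0, method="lm")  # Levenberg-Marquardt
print(dict(zip(["theta_r", "theta_s", "alpha", "n"], fit.x)))
```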
Abstract:
The design optimization of industrial products has always been an essential activity to improve product quality while reducing time-to-market and production costs. Although cost management is very complex and spans all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Hence, tolerance-cost optimization becomes the main practice for an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by linking product tolerances to the associated manufacturing costs. However, despite the growing interest in this topic, a profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key element to improving design optimization and enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization that better exploits the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided integrated framework for tolerance-cost optimization is proposed to allow the integration of DfT and DtC approaches and their direct application to the design of automotive components. Several case studies have been considered, with the final application of the integrated framework to a high-performance V12 engine assembly, achieving both functional targets and cost reduction. From a scientific point of view, the proposed methodology advances the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs. The case studies proved the suitability of the methodology for application in the industrial field and identified further areas for improvement and refinement.
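To make the tolerance-cost trade-off concrete, a minimal sketch is given below, assuming a reciprocal cost model and an RSS stack-up requirement; all coefficients and limits are invented, and the sketch does not reproduce the thesis' Computer-Aided framework:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical reciprocal cost model per feature: cost_i(t_i) = a_i + b_i / t_i.
a = np.array([1.0, 1.0, 1.5])
b = np.array([0.02, 0.05, 0.03])
T_ASM = 0.30  # assembly tolerance requirement [mm] (assumed)

def total_cost(t):
    return np.sum(a + b / t)

# RSS stack-up constraint: sqrt(sum t_i^2) <= T_ASM.
cons = {"type": "ineq", "fun": lambda t: T_ASM - np.sqrt(np.sum(t ** 2))}
bounds = [(0.01, 0.25)] * 3  # manufacturable tolerance ranges [mm] (assumed)

res = minimize(total_cost, x0=[0.1, 0.1, 0.1], bounds=bounds, constraints=cons)
print("optimal tolerances [mm]:", res.x, "total cost:", res.fun)
```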
Abstract:
Several decision and control tasks involve networks of cyber-physical systems that need to be coordinated and controlled according to a fully-distributed paradigm involving only local communications without any central unit. This thesis focuses on distributed optimization and games over networks from a system-theoretical perspective. In the addressed frameworks, we consider agents communicating only with neighbors and running distributed algorithms with optimization-oriented goals. The distinctive feature of this thesis is to interpret these algorithms as dynamical systems and, thus, to resort to powerful system-theoretical tools for both their analysis and design. We first address the so-called consensus optimization setup. In this context, we provide an original system-theoretical analysis of the well-known Gradient Tracking algorithm in the general case of nonconvex objective functions. Then, inspired by this method, we provide and study a series of extensions to improve performance and to deal with more challenging settings, such as the derivative-free or the online framework. Subsequently, we tackle the recently emerged framework named distributed aggregative optimization. For this setup, we develop and analyze novel schemes to handle (i) online instances of the problem, (ii) "personalized" optimization frameworks, and (iii) feedback optimization settings. Finally, we adopt a system-theoretical approach to address aggregative games over networks, both in the presence and in the absence of linear coupling constraints among the decision variables of the players. In this context, we design and inspect novel fully-distributed algorithms, based on tracking mechanisms, that outperform state-of-the-art methods in finding the Nash equilibrium of the game.
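For reference, the Gradient Tracking update can be sketched on a toy consensus optimization instance as follows; this is a minimal sketch with an assumed ring graph and quadratic local costs, not the thesis' code:

```python
import numpy as np

# N agents minimize sum_i 0.5*(x - a_i)^2 over a fixed ring graph.
N, alpha, iters = 5, 0.1, 200
a = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # local targets (assumed data)

# Doubly stochastic weights for a ring (self + two neighbors).
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 1 / 3
    W[i, (i - 1) % N] = 1 / 3
    W[i, (i + 1) % N] = 1 / 3

grad = lambda x: x - a                        # stacked local gradients
x = np.zeros(N)                               # local estimates
s = grad(x)                                   # trackers, initialized at local gradients

for _ in range(iters):
    x_new = W @ x - alpha * s                 # consensus step + tracked gradient descent
    s = W @ s + grad(x_new) - grad(x)         # track the average gradient
    x = x_new

print(x, "vs optimum", a.mean())              # all estimates approach mean(a) = 3
```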
Abstract:
Over the last century, mathematical optimization has become a prominent tool for decision making. Its systematic application in practical fields such as economics, logistics or defense has led to the development of algorithmic methods of ever increasing efficiency. Indeed, for a variety of real-world problems, finding an optimal decision among a set of (implicitly or explicitly) predefined alternatives has become achievable in reasonable time. In recent decades, however, the research community has devoted more and more attention to the role of uncertainty in the optimization process. In particular, one may question the notion of optimality, and even of feasibility, when studying decision problems with unknown or imprecise input parameters. This concern is even more critical in a world becoming more and more complex, by which we mean interconnected, where each variation inside a system inevitably causes other variations in the system itself. In this dissertation, we study a class of optimization problems that suffer from imprecise input data and feature a two-stage decision process, i.e., decisions are made in a sequential order, called stages, and unknown parameters are revealed throughout the stages. Applications of such problems abound in practical fields: for example, facility location with uncertain demands, transportation with uncertain costs, or scheduling under uncertain processing times. The uncertainty is handled from a robust optimization (RO) viewpoint (also known as the "worst-case perspective"), and we present original contributions to the RO literature on both the theoretical and the practical side.
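In generic form, a two-stage robust problem of this kind can be written as follows (standard notation, not the dissertation's specific models; x denotes here-and-now decisions, y the recourse decisions revealed after the uncertainty, and \Xi the uncertainty set):

```latex
\min_{x \in X} \; c^\top x \;+\; \max_{\xi \in \Xi} \; \min_{y \in Y(x,\xi)} \; d^\top y
```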
Abstract:
This thesis deals with the analysis and management of emergency healthcare processes through the use of advanced analytics and optimization approaches. Emergency processes are among the most complex within healthcare, due to their non-elective nature and their high variability. The thesis is divided into two parts. The first one concerns the core of emergency healthcare processes, the emergency department (ED). In the second chapter, we describe the ED that serves as case study: a real case with data derived from a large ED located in northern Italy. In the next two chapters, we introduce two tools for supporting ED activities. The first one is a new type of analytics model, whose aim is to overcome the traditional methods of analyzing the activities provided in the ED by means of an algorithm that analyses the ED pathway (organized as an event log) as a whole. The second tool is a decision-support system that integrates a deep neural network for the prediction of patient pathways and an online simulator to evaluate the evolution of the ED over time; its purpose is to provide a set of solutions to prevent and resolve ED overcrowding. The second part of the thesis focuses on the COVID-19 pandemic emergency. In the fifth chapter, we describe a tool that was used by the Bologna local health authority in the first phase of the pandemic: it analyzes the clinical pathway of a patient and automatically assigns them a state, which physicians used to route patients to the correct clinical pathways. The last chapter is dedicated to the description of a mixed-integer programming (MIP) model that was used to organize the COVID-19 vaccination campaign in the city of Bologna, Italy.
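The flavour of such a planning MIP can be conveyed by the toy sketch below, which schedules doses for priority groups under a daily capacity; all group names, priorities and numbers are invented for illustration, and this is not the model actually used for the Bologna campaign:

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpInteger

# Hypothetical toy model in the spirit of a vaccination-campaign MIP.
groups = {"over80": (500, 3.0), "fragile": (400, 2.0), "staff": (300, 1.0)}  # demand, priority
days = range(1, 8)
capacity = 250  # doses per day (assumed)

x = {(g, d): LpVariable(f"x_{g}_{d}", lowBound=0, cat=LpInteger)
     for g in groups for d in days}

model = LpProblem("vaccination_schedule", LpMinimize)
# Objective: vaccinate high-priority groups as early as possible.
model += lpSum(groups[g][1] * d * x[g, d] for g in groups for d in days)
for g, (demand, _) in groups.items():
    model += lpSum(x[g, d] for d in days) == demand        # cover each group's demand
for d in days:
    model += lpSum(x[g, d] for g in groups) <= capacity    # daily dose capacity

model.solve()
print({(g, d): int(x[g, d].value()) for g in groups for d in days if x[g, d].value()})
```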
Abstract:
The weight-transfer effect, consisting of the change in dynamic load distribution between the front and rear tractor axles, is one of the most impairing phenomena for the performance, comfort, and safety of agricultural operations. Excessive weight transfer from the front to the rear tractor axle can occur during operation or maneuvering of implements connected to the tractor through the three-point hitch (TPH). In this respect, an optimal design of the TPH can ensure a better dynamic load distribution and ultimately improve operational performance, comfort, and safety. In this study, a computational design tool (The Optimizer) for the determination of a TPH geometry that minimizes the weight-transfer effect is developed. The Optimizer is based on a constrained minimization algorithm. The objective function to be minimized is related to the tractor front-to-rear axle load transfer during a simulated reference maneuver performed with a reference implement on a reference soil. Simulations are based on a 3-degrees-of-freedom (DOF) dynamic model of the tractor-TPH-implement aggregate. The inertial, elastic, and viscous parameters of the dynamic model were successfully determined through a parameter identification algorithm. The geometry determined by the Optimizer complies with the ISO-730 Standard functional requirements and other design requirements. The interaction between the soil and the implement during the simulated reference maneuver was successfully validated against experimental data. Simulation results show that the adopted reference maneuver is effective in triggering the weight-transfer effect, with the front axle load exhibiting a peak-to-peak value of 27.1 kN during the maneuver. A benchmark test was conducted starting from four geometries of a commercially available TPH. As a result, all configurations were improved by more than 10%. After 36 iterations, the Optimizer found an optimized TPH geometry that reduces the weight-transfer effect by 14.9%.
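In spirit, the Optimizer's search can be sketched as a constrained minimization over the hitch geometry. In the sketch below a smooth surrogate stands in for the 3-DOF simulation, and the design variables, bounds and numbers are invented, so it only illustrates the pattern, not the actual tool:

```python
import numpy as np
from scipy.optimize import minimize

def peak_to_peak_front_load(geometry):
    # Placeholder surrogate for the simulated front-axle load oscillation [kN];
    # the real objective would run the 3-DOF tractor-TPH-implement model.
    target = np.array([0.30, 0.60, 0.90])          # hypothetical hitch-point coordinates [m]
    return 27.1 - 5.0 * np.exp(-np.sum((geometry - target) ** 2))

# Assumed admissible ranges for the design variables (ISO-730-like box constraints).
bounds = [(0.20, 0.50), (0.40, 0.80), (0.70, 1.10)]

res = minimize(peak_to_peak_front_load, x0=[0.35, 0.55, 0.85], bounds=bounds)
print("optimized geometry [m]:", res.x, "objective [kN]:", res.fun)
```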
Abstract:
The industrial context is changing rapidly due to advancements in technology fueled by the Internet and Information Technology. The fourth industrial revolution counts integration, flexibility, and optimization among its fundamental pillars, and, in this context, Human-Robot Collaboration has become a crucial factor for manufacturing sustainability in Europe. Collaborative robots are appealing to many companies due to their low installation and running costs and high degree of flexibility, making them ideal for reshoring production facilities with a short return on investment. The ROSSINI European project aims to implement true Human-Robot Collaboration by designing, developing, and demonstrating a modular and scalable platform for integrating human-centred robotic technologies in industrial production environments. The project focuses on the safety concerns related to introducing a cobot into a shared working area and aims to lay the groundwork for a new working paradigm at the industrial level. The main trigger of this Thesis was the need for a software architecture suited to the robotic platform employed in one of the three use cases selected to deploy and test the new technology. The chosen application consists of the automatic loading and unloading of raw-material reels to an automatic packaging machine by an Autonomous Mobile Robot composed of an Autonomous Guided Vehicle, two collaborative manipulators, and an eye-on-hand vision system for performing tasks in a partially unstructured environment. The results obtained during the ROSSINI use case development were later used in the SENECA project, which addresses the need for robot-driven automatic cleaning of pharmaceutical bins in a very specific industrial context. The inherent versatility of mobile collaborative robots is evident from their deployment in the two projects with few hardware and software adjustments. The positive impact of Human-Robot Collaboration on diverse production lines motivates future industrial investment in research on this increasingly popular field.
Abstract:
This thesis work concerns the analysis and optimization of the book flows generated between the branches of the public library Trondheim folkebibliotek, located in Trondheim, a city in northern Norway. The research is part of a multi-year project, SmartLIB, that the library is carrying out with NTNU - Norwegian University of Science and Technology. The goal of this thesis is to analyze possible solutions to optimize the flow of books generated by citizens' orders. A first phase of data collection and analysis provided the information needed to proceed with the research. The possibility of reducing the flows was then analyzed by assigning to each department the number of copies needed to cover 90% of the demand, following a Poisson distribution. Subsequently, three solutions were analyzed to optimize the flows generated by the books, the filling level of the boxes, and the route of the truck that visits all the library branches daily; the Vehicle Routing Problem (VRP) supported this second study. A simulation model was built in AnyLogic and used to validate the proposed solutions. The results led to solutions that optimize the overall flows, reducing the book delivery delay time by 50%, reducing the box flow by 53%, and consequently increasing the filling rate of each box by 44%. Possible future implementations of the solutions found include the installation of a new Sorting Machine at the library's main branch and the introduction, also at the main branch, of a new daily schedule.
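The coverage rule can be illustrated with a short sketch; the demand rate below is invented, and the only element taken from the abstract is the assumption that per-department demand is modelled as Poisson and that stock must cover it with 90% probability:

```python
from scipy.stats import poisson

# Hypothetical example: number of copies a department should hold so that
# local demand is covered 90% of the time, assuming Poisson-distributed demand.
mean_weekly_requests = 3.4          # assumed demand rate for one title/department
copies_needed = int(poisson.ppf(0.90, mean_weekly_requests))
print(copies_needed)                # smallest n with P(demand <= n) >= 0.90
```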
Abstract:
Nowadays, product development in all its phases plays a fundamental role in the industrial chain. The need for a company to compete at a high level and to respond quickly to market demands, and therefore to engineer the product rapidly and with a high level of quality, has led to the adoption of new, more advanced methods and processes. In recent years, design and production have been moving away from the 2D-based approach and towards Model-Based Definition. With this approach, increasingly complex systems become easier to manage and, above all, cheaper to obtain. Thanks to Model-Based Definition it is possible to share data in a lean and simple way with the entire engineering and production chain of the product; the great advantage of this approach is precisely the uniqueness of the information. In this thesis work, this approach has been exploited in the context of tolerances with the aid of CAD/CAT software. Tolerance analysis, or dimensional variation analysis, is a way to understand how sources of variation in part dimensions and assembly constraints propagate between parts and assemblies, and how that variation affects the ability of a design to meet its requirements. It is critically important to note that tolerances directly affect the cost and performance of products. Worst Case Analysis (WCA) and statistical analysis (RSS) are the two principal methods in dimensional variation analysis. The thesis aims to show the advantages of using statistical dimensional analysis by creating and examining various case studies, using PTC CREO software for CAD modeling and CETOL 6σ for tolerance analysis. Moreover, a comparison between manual (1D) and 3D analysis is provided, focusing on the information lost in the 1D case. The results obtained highlight the need to use this approach from the early stages of the product design cycle.
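For a stack-up with contributing dimensions x_i, tolerances t_i and assembly response f(x_1, ..., x_n), the two methods estimate the assembly tolerance as follows (standard textbook formulas, not extracted from the thesis):

```latex
T_{\mathrm{WC}} \;=\; \sum_{i=1}^{n} \left|\frac{\partial f}{\partial x_i}\right| t_i ,
\qquad
T_{\mathrm{RSS}} \;=\; \sqrt{\,\sum_{i=1}^{n} \left(\frac{\partial f}{\partial x_i}\, t_i\right)^{2}} .
```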
Abstract:
In the metal industry, and more specifically in forging, scrap material is a crucial issue, and reducing it is an important goal. Not only would this help companies become more environmentally friendly and more sustainable, it would also reduce energy use and lower costs. At the same time, Industry 4.0 techniques and the advancements in Artificial Intelligence (AI), especially in the field of Deep Reinforcement Learning (DRL), may play an important role in helping to achieve this objective. This document presents the thesis work, a contribution to the SmartForge project, carried out during a semester abroad at Karlstad University (Sweden). The project aims at solving the aforementioned problem with a business case of the company Bharat Forge Kilsta, located in Karlskoga (Sweden). The thesis work includes the design and subsequent development of an event-driven architecture with microservices to support the processing of data coming from sensors set up in the company's industrial plant, and finally the implementation of an algorithm based on DRL techniques to control the electrical power used in the plant.
Abstract:
In this Thesis, a life cycle analysis (LCA) of a biofuel cell designed by a team from the University of Bologna was carried out. The purpose of this study is to investigate the possible environmental impacts of the production and use of the cell and a possible optimization for an industrial scale-up. To this end, the first part of the work reviews the existing literature on biomass and fuel cell treatments, and then on LCA studies of them. The experimental part presents the work done to create the Life Cycle Inventory and the Life Cycle Impact Assessment. Several alternative scenarios, with different reagents and energy supplies, were created to study process optimization. To examine whether this technology can be competitive, biofuel cell use scenarios were compared with traditional biomass treatment technologies. The result of this study is that the technology is promising from an environmental point of view, provided that nutrients can be recovered from the output without excessive energy consumption and that the energy used to prepare the solution is minimized.
Abstract:
Background: Recent scientific literature has shown that proper postural control facilitates upper limb movements. There is evidence that applying trunk restraints to the patient improves upper limb function. Objectives: The main objective of the thesis was to verify how supporting the trunk through a stable axial structure, by means of an external support termed "trunk constraint", increases postural control and thereby facilitates fractionated movements of the upper limbs in people with outcomes of neurological diseases. Materials and methods: The clinical case concerns a 60-year-old man with left hemiparesis resulting from a right ischemic stroke. A protocol of ten one-hour treatment sessions was carried out, in which facilitation through trunk constraint was applied in different rehabilitation settings. Data were collected using the Trunk Control Test, the Trunk Impairment Scale and the Fugl-Meyer Assessment. In addition, an observational video analysis of a functional upper limb gesture was performed. Results: The collected data show positive effects with respect to the initial hypotheses. Improvements were found in the items of the administered scales and in the qualitative assessment of the upper limb. In particular, an improvement emerged in trunk control, in the Trunk Control Test and the Trunk Impairment Scale, and in upper limb function, in the Fugl-Meyer Assessment. The observational video analysis shows an improvement in activation timing during the reaching phase. Conclusions: The results obtained support the idea that increasing the antigravity activity of the trunk, also through external supports such as the trunk constraint, can facilitate a functional improvement of the upper limb.
Abstract:
The need for sustainable economic growth and environmental stewardship emerged around the start of the twentieth century, when society became aware that the traditional development model would lead to the collapse of the terrestrial ecosystem in the long run. Over the years, the international community's environmental efforts have demonstrated unequivocally that the planet's limits are real, and the new development approach has laid the groundwork for the future. According to this model, design also plays a key role in ensuring a better future. Design has undergone an ecological and sustainable evolution as a result of the global environmental crisis and the degradation of our ecosystem and biodiversity. Prosperity Thinking fits into this context: a still-evolving methodology developed by the Future Food Institute since 2019. The main concepts on which it is based are described, as well as the method that defines it, which is divided into the following stages: 1) Problem Framing, 2) Ideation and Prototyping, 3) Test & Analyze. The development of the Prosperity Thinking toolkit is described, beginning with the search for tools from the literature on sustainable design and ending with its validation with the help of design experts. The testing of some of the tools during a workshop organized by FFI, with 15 participants ranging in age from 14 to 40, is then recounted, and the final version of the toolkit, obtained by adding the tools proposed by the experts, is presented. Finally, a reflection is offered on the future of Prosperity Thinking, a method in constant evolution that must continue to follow societal and environmental changes in order to respond to the ever more complex challenge of sustainability.