Abstract:
Combinatorial optimization problems occur in a wide variety of contexts and are generally NP-hard. At a corporate level, solving these problems is of great importance, since they contribute to the optimization of operational costs. In this thesis we propose to solve the Public Transport Bus Assignment problem considering a heterogeneous fleet and line exchanges, a variant of the Multi-Depot Vehicle Scheduling Problem in which additional constraints are enforced to model a real-life scenario. The number of constraints involved and the large number of variables make it impracticable to solve the problem to optimality using complete search techniques. Therefore, we explore metaheuristics, which sacrifice optimality to produce solutions in feasible time. More concretely, we focus on the development of algorithms based on a sophisticated metaheuristic, Ant Colony Optimization (ACO), which relies on a stochastic learning mechanism. For complex problems with a considerable number of constraints, sophisticated metaheuristics may fail to produce quality solutions in a reasonable amount of time. Thus, we developed parallel shared-memory (SM) synchronous ACO algorithms; however, synchronism gives rise to the straggler problem. We therefore propose three SM asynchronous algorithms that break the original algorithm's semantics and differ in the degree of concurrency allowed while manipulating the learned information. Our results show that our sequential ACO algorithms produced better solutions than a Restarts metaheuristic, that the ACO algorithms were able to learn, and that better solutions were achieved by increasing the amount of cooperation (number of search agents). Regarding the parallel algorithms, our asynchronous ACO algorithms outperformed the synchronous ones in terms of both speedup and solution quality, achieving speedups of up to 17.6x. The cooperation scheme imposed by asynchronism also achieved a better learning rate than the original one.
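The pheromone-based construction loop at the heart of ACO can be sketched on a toy TSP instance; this is a generic illustrative stand-in, not the thesis's bus-assignment algorithm, and all parameter names here are hypothetical:

```python
import random

def aco_tsp(dist, n_ants=20, n_iter=100, alpha=1.0, beta=2.0,
            rho=0.5, q=1.0, seed=0):
    """Minimal Ant Colony Optimization for a symmetric TSP instance.

    dist: square matrix of pairwise distances. Returns (best_tour, best_len).
    """
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails
    best_tour, best_len = None, float("inf")

    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in visited]
                # transition probability ~ pheromone^alpha * (1/distance)^beta
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                     for j in cand]
                j = rng.choices(cand, weights=w)[0]
                tour.append(j)
                visited.add(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporate, then deposit pheromone proportional to tour quality
        tau = [[t * (1 - rho) for t in row] for row in tau]
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += q / length
                tau[b][a] += q / length
    return best_tour, best_len
```

Each iteration, ants build solutions biased by pheromone and a greedy heuristic; evaporation plus quality-proportional deposit implements the stochastic learning mechanism the abstract refers to. The synchronous parallel variants discussed above place a barrier between the construction and deposit phases, which is exactly where the straggler problem arises.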
Abstract:
Phosphorus (P) is becoming a scarce element due to the decreasing availability of primary sources. Therefore, recovering P from secondary sources, e.g. waste streams, has become extremely important. Sewage sludge ash (SSA) is a reliable secondary source of P. The use of SSAs as a direct fertilizer is heavily restricted by legislation due to the presence of inorganic contaminants. Furthermore, the P present in SSAs is not in a plant-available form. The electrodialytic (ED) process is one of the methods under development to recover P and simultaneously remove heavy metals. The present work aimed to optimize P recovery using a two-compartment electrodialytic cell. The research was divided into three independent phases. In the first phase, ED experiments were carried out on two SSAs from different seasons, varying the duration of the ED process (2, 4, 6 and 9 days). During the ED treatment the SSA was suspended in distilled water in the anolyte, which was separated from the catholyte by a cation exchange membrane. From both ashes, 90% of the P was successfully extracted after 6 days of treatment. Regarding heavy metal removal, one of the SSAs performed better than the other. It was therefore possible to conclude that SSAs from different seasons can be submitted to the ED process under the same parameters. In the second phase, the two SSAs were exposed to humidity and air prior to ED in order to carbonate them. Although this procedure was not successful, ED experiments were carried out varying the duration of the treatment (2 and 6 days) and the period of air exposure to which the SSAs were submitted (7, 14 and 30 days). After 6 days of treatment and 30 days of air exposure, 90% of the phosphorus was successfully extracted from both ashes. No differences were identified between carbonated and non-carbonated SSAs. Thus, SSAs that were exposed to air and humidity, e.g. SSAs stored for 30 days in an open deposit, can be treated under the same parameters as SSAs collected directly from the incineration process. In the third phase, ED experiments were carried out for 6 days varying the stirring time (0, 1, 2 and 4 h/day) in order to investigate whether energy can be saved in the stirring process. After 6 days of treatment with 4 h/day of stirring, 80% and 90% of the P was successfully extracted from SSA-A and SSA-B, respectively. These values are very similar to those obtained for 6 days of treatment with 24 h/day of stirring.
Abstract:
Cancer remains one of the top killing diseases in first-world countries. It is not a single disease but a set of diseases, for which different treatment approaches have been taken over the years. Cancer immunotherapy comes as a “new” breath in cancer treatment, making use of the patient's immune system to induce anti-cancer responses. Dendritic Cell (DC) vaccines use the extraordinary antigen-presentation capacity of DCs so that specific T cell responses may be generated against cancer. In this work, we report the ex vivo generation of DCs from precursors isolated from clinical-grade cryopreserved umbilical cord blood (UCB) samples. After the thawing protocol for cryopreserved samples was optimized, the generation of DCs from CD14+ monocytes, i.e., moDCs, or CD34+ hematopoietic stem cells (HSCs), i.e., CD34-derived DCs, was followed, and their phenotype and function were evaluated. Functional testing included the ability to respond to maturation stimuli (including enzymatic removal of surface sialic acids), Ovalbumin-FITC endocytic capacity, cytokine secretion and T cell priming ability. In order to evaluate the feasibility of using DCs derived from UCB precursors to induce immune responses, they were compared with peripheral blood (PB) moDCs. We observed an increased endocytosis capacity after moDCs were differentiated from monocyte precursors, but almost 10-fold lower than that of PB moDCs. Maturation markers were absent, low levels of inflammatory cytokines were seen, and T cell stimulatory capacity was reduced. Sialidase enzymatic treatment was able to mature these cells, diminishing endocytosis and promoting higher T cell stimulation. CD34-derived DCs showed higher maturation and endocytic capacity than moDCs. Although much more information was acquired from moDCs than from CD34-derived DCs, we conclude that the latter are probably the best suited for generating an immune response against cancer, though much more research remains to be performed.
Abstract:
Contains abstract
Abstract:
In order to address and resolve the wastewater contamination problem of the Sines refinery, with the main objectives of optimizing the quality of this stream and reducing the costs charged to the refinery, a dynamic mass balance was developed and implemented for ammonia and polar oil and grease (O&G) contamination in the wastewater circuit. The inadequate routing of the sour gas from the sour water stripping unit and of the kerosene caustic washing unit effluent were identified, respectively, as the major sources of the ammonia and of the polar substances present in the industrial wastewater effluent. For the O&G content, a predictive model was developed for the kerosene caustic washing unit, following the Projection to Latent Structures (PLS) approach. The comparison between analytical data for ammonia and polar O&G concentrations in the refinery wastewater originating from the Dissolved Air Flotation (DAF) effluent and the predictions of the dynamic mass balance calculations shows very good agreement and highlights the dominant impact of the identified streams on the wastewater contamination levels. The ammonia contamination problem was solved by rerouting the sour gas through an existing line that had become clogged with ammonia salts due to a non-insulated section, while for the O&G the dynamic mass balance was implemented as an online tool, which allows possible contamination situations to be foreseen and the required preventive actions to be taken, and can also serve as a basis for establishing relationships between the O&G contamination in the refinery wastewater and the properties of the refined crude oils and the process operating conditions. The PLS model developed could be a great asset both in optimizing existing refinery wastewater treatment units or reuse schemes and in designing new ones.
In order to find a possible treatment solution for the spent caustic problem, on-site pilot plant experiments for NaOH recovery from the refinery kerosene caustic washing unit effluent, using an alkaline-resistant nanofiltration (NF) polymeric membrane, were performed to evaluate its applicability for treating these highly alkaline and contaminated streams. At constant operating pressure and temperature and under adequate operating conditions, 99.9% rejection of oil and grease and 97.7% rejection of chemical oxygen demand (COD) were observed. No noticeable membrane fouling or flux decrease was registered up to a volume concentration factor of 3. These results allow the NF permeate to be reused instead of fresh caustic and the wastewater contamination to be significantly reduced, which can result in savings of 1.5 M€ per year at current prices for the largest Portuguese oil refinery. The capital investment needed to implement the required NF membrane system is less than 10% of that associated with the traditional wet air oxidation solution to the spent caustic problem. The operating costs are very similar, but can be less than half if the NF concentrate is reused in refinery pH control applications. The payback period was estimated at 1.1 years. Overall, the pilot plant experimental results obtained and the process economic evaluation data indicate that the proposed NF treatment process is a very competitive solution and represents a highly promising alternative to conventional and existing spent caustic treatment units.
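The Projection to Latent Structures approach mentioned above can be illustrated with a minimal one-component PLS1 regression in pure Python; this is a generic sketch of the technique under simplifying assumptions (one latent component, mean-centered data), not the refinery model:

```python
def pls1_fit(X, y):
    """One-component PLS1 regression, pure Python (illustrative sketch).

    X: list of feature rows, y: list of responses. Returns a predict(x)
    function. Assumes y actually correlates with some direction of X.
    """
    n, p = len(X), len(X[0])
    # mean-center predictors and response
    mx = [sum(row[j] for row in X) / n for j in range(p)]
    my = sum(y) / n
    Xc = [[row[j] - mx[j] for j in range(p)] for row in X]
    yc = [v - my for v in y]
    # weight vector: direction of maximal covariance between X and y
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # latent scores, then regress y on the score
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    q = sum(t[i] * yc[i] for i in range(n)) / sum(v * v for v in t)

    def predict(x):
        score = sum((x[j] - mx[j]) * w[j] for j in range(p))
        return my + q * score

    return predict
```

In practice a multi-component PLS (e.g. NIPALS with deflation, as implemented in standard chemometrics packages) would be used to relate process measurements to the O&G contamination, but the latent-variable projection shown here is the core idea.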
Abstract:
Polysaccharides are gaining increasing attention as potentially environmentally friendly and sustainable building blocks in many fields of the (bio)chemical industry. The microbial production of polysaccharides is envisioned as a promising path, since higher biomass growth rates are possible and therefore higher productivities may be achieved compared to vegetable or animal polysaccharide sources. This Ph.D. thesis focuses on the modeling and optimization of the production of a particular microbial polysaccharide, namely the extracellular polysaccharides (EPS) produced by the bacterial strain Enterobacter A47. Enterobacter A47 was found to be a metabolically versatile organism in terms of its adaptability to complex media, notably capable of achieving high growth rates in media containing glycerol byproduct from the biodiesel industry. However, the industrial implementation of this production process is still hampered by a largely unoptimized process. The kinetic rates of the bioreactor operation depend heavily on operational parameters such as temperature, pH, stirring and aeration rate. The increase of culture broth viscosity is a common feature of this culture and has a major impact on the overall performance. This fact complicates the mathematical modeling of the process, limiting the possibility to understand, control and optimize productivity. To tackle this difficulty, data-driven mathematical methodologies such as Artificial Neural Networks can be employed to incorporate additional process data and complement the known mathematical description of the fermentation kinetics. In this Ph.D. thesis, we adopted such a hybrid modeling framework, which enabled the incorporation of temperature, pH and viscosity effects on the fermentation kinetics in order to improve the dynamic modeling and optimization of the process.
A model-based optimization method was implemented that enabled the design of optimal bioreactor control strategies in the sense of EPS productivity maximization. It is also critical to understand EPS synthesis at the level of the bacterial metabolism, since the production of EPS is a tightly regulated process. Methods of pathway analysis provide a means to unravel the fundamental pathways and their controls in bioprocesses. In the present Ph.D. thesis, a novel methodology called Principal Elementary Mode Analysis (PEMA) was developed and implemented that enabled the identification of the cellular fluxes activated under different conditions of temperature and pH. It is shown that differences in these two parameters affect the chemical composition of the EPS; hence they are critical for the regulation of product synthesis. In future studies, the knowledge provided by PEMA could foster the development of metabolically meaningful control strategies that target the EPS sugar content and other product quality parameters.
Abstract:
Nowadays, a significant number of banks in Portugal are facing a bank-branch restructuring problem, and Millennium BCP is no exception. The closure of branches is a major component of profit maximization through the reduction of operational and personnel costs, but it is also an opportunity to approach the idea of the “banking of the future” and to start thinking about the benefits of the digital era. This dissertation centers on a current high-impact organizational problem addressed by the company and consists of a proposed optimization of the model that Millennium BCP uses. Even though measures of performance are usually considered the most important elements in evaluating the viability of branches, there is evidence suggesting that other general factors can be important in assessing branch potential, such as the customer influx at branches and the business dimensions of a branch and its location, which will be addressed in this project.
Abstract:
Sonae MC is constantly innovating and keeping up with new market trends, and is increasingly focused on E-commerce due to its growing importance. In that area, a telephone line is available to support customers with their problems. However, those problems were rarely solved on first contact. Therefore, the goal of this work was to reengineer these processes to improve service performance and, consequently, customer satisfaction. Following an evolutionary approach, improvement opportunities were suggested; if correctly implemented, case resolution time could decrease by one day and Sonae MC would save €7,750 per month.
Abstract:
Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, the geographical coverage of ship tracking platforms has increased significantly in recent years, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for the estimation of the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a Genetic Algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a Maritime Safety expert.
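The evolutionary loop of a Genetic Algorithm for route estimation can be sketched on a toy problem: ordering synthetic intermediate waypoints so that the start-to-end path is as short as possible. This is an illustrative stand-in under simplified assumptions (Euclidean distances, synthetic points), not the EMSA prototype:

```python
import random

def ga_route(start, end, waypoints, pop_size=40, n_gen=150,
             mut_rate=0.2, seed=0):
    """Toy Genetic Algorithm: evolves an ordering of intermediate
    waypoints minimizing the start -> waypoints -> end path length."""
    rng = random.Random(seed)

    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def path_len(order):
        pts = [start] + [waypoints[i] for i in order] + [end]
        return sum(dist(pts[k], pts[k + 1]) for k in range(len(pts) - 1))

    n = len(waypoints)
    # each individual is a permutation of waypoint indices
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=path_len)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)               # one-point order crossover
            child = p1[:cut] + [g for g in p2 if g not in p1[:cut]]
            if rng.random() < mut_rate:             # swap mutation
                i, j = rng.randrange(n), rng.randrange(n)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=path_len)
    return best, path_len(best)
```

In the actual prototype the fitness would presumably score candidate routes against the density of historical ship positions rather than raw geometric length, but the selection/crossover/mutation cycle shown here is the generic mechanism.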
Abstract:
Author proof
Abstract:
The present paper reports on the precipitation of Al3Sc structures in an aluminum-scandium alloy, simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based spatial clustering of applications with noise (DBSCAN) method was employed. The spkMC and DBSCAN algorithms were implemented in the C language using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with spkMC. DBSCAN proved to be a valuable aid in identifying the precipitates by performing a cluster analysis of the simulation results. The simulation results achieved are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation of kMC provided a 4x speedup over the sequential version.
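The clustering step can be illustrated with a minimal pure-Python DBSCAN; the paper's implementation is in C with MPI, so this is only a sketch of the algorithm itself, with illustrative parameters:

```python
from collections import deque

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: labels each point with a cluster id (0, 1, ...)
    or -1 for noise. points is a list of coordinate tuples."""
    def region(i):
        # indices of all points within eps of point i (inclusive)
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps * eps]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = region(i)
        if len(seeds) < min_pts:
            labels[i] = -1                  # provisionally noise
            continue
        labels[i] = cluster                 # i is a core point: new cluster
        queue = deque(seeds)
        while queue:
            j = queue.popleft()
            if labels[j] == -1:
                labels[j] = cluster         # noise reachable from core -> border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = region(j)
            if len(nbrs) >= min_pts:        # j is itself core: expand cluster
                queue.extend(nbrs)
        cluster += 1
    return labels
```

Applied to simulated atomic positions, dense groups of Sc atoms come out as clusters (the precipitates) while isolated atoms are labeled as noise.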
Abstract:
Earthworks tasks aim at levelling the ground surface at a target construction area and precede any kind of structural construction (e.g., road and railway construction). They comprise sequential tasks, such as excavation, transportation, spreading and compaction, and rely strongly on heavy mechanical equipment and repetitive processes. In this context, it is essential to optimize the usage of all available resources under two key criteria: the cost and duration of earthwork projects. In this paper, we present an integrated system that uses two artificial intelligence techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation considering the two main earthwork objectives (duration and cost). Experiments conducted using real-world data from a construction site have shown that the proposed system is competitive when compared with current manual earthwork design.
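The duration/cost trade-off at the core of multi-objective optimization can be illustrated by a Pareto-dominance filter: among candidate allocation plans, only the non-dominated ones are worth presenting to the planner. This is a generic sketch of the concept, not the paper's evolutionary algorithm; the sample plans are hypothetical:

```python
def pareto_front(solutions):
    """Return the non-dominated solutions for two minimization objectives.

    solutions: list of (duration, cost) tuples. A solution is dominated if
    another one is no worse in both objectives and strictly better in one.
    """
    front = []
    for s in solutions:
        dominated = any(
            o[0] <= s[0] and o[1] <= s[1] and (o[0] < s[0] or o[1] < s[1])
            for o in solutions)
        if not dominated:
            front.append(s)
    return front
```

An evolutionary multi-objective optimizer repeatedly applies this kind of dominance test during selection, so the population converges toward the whole front of duration/cost compromises rather than a single weighted optimum.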
Abstract:
In highway construction, earthworks refer to the tasks of excavation, transportation, spreading and compaction of geomaterial (e.g. soil, rockfill and soil-rockfill mixtures). Since these tasks rely heavily on machinery and repetitive processes, they are highly susceptible to optimization. In this context, Artificial Intelligence techniques, such as Data Mining (DM) and modern optimization, can be applied to earthworks. A survey of these applications shows that they focus on the optimization of specific objectives and/or construction phases, making it possible to identify the capabilities and limitations of the analyzed techniques. According to the pinpointed drawbacks of these techniques, this paper describes a novel intelligent earthwork optimization system capable of integrating DM, modern optimization and GIS technologies in order to optimize the earthwork processes throughout all phases of design and construction. This integrated system allows significant savings in time, cost and gas emissions, contributing to more sustainable construction.
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. In order to deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, which attain routing configurations robust to changes in the traffic demands and keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
Abstract:
PhD thesis in Bioengineering