998 results for Regulatory optimization
Abstract:
Cancer remains one of the leading causes of death in first-world countries. It is not a single disease, but a set of diseases for which different treatment approaches have been taken over the years. Cancer immunotherapy brings a "new" breath to cancer treatment, making use of the patients' immune system to induce anti-cancer responses. Dendritic Cell (DC) vaccines exploit the extraordinary antigen-presentation capacity of DCs so that specific T cell responses may be generated against cancer. In this work, we report the ex vivo generation of DCs from precursors isolated from clinical-grade cryopreserved umbilical cord blood (UCB) samples. After the thawing protocol for cryopreserved samples was optimized, DCs were generated from CD14+ monocytes (moDCs) or from CD34+ hematopoietic stem cells (HSCs) (CD34-derived DCs), and their phenotype and function were evaluated. Functional testing included the ability to respond to maturation stimuli (including enzymatic removal of surface sialic acids), Ovalbumin-FITC endocytic capacity, cytokine secretion and T cell priming ability. To evaluate the feasibility of using DCs derived from UCB precursors to induce immune responses, they were compared with peripheral blood (PB) moDCs. We observed an increased endocytosis capacity after moDCs were differentiated from monocyte precursors, although almost 10-fold lower than that of PB moDCs. Maturation markers were absent, low levels of inflammatory cytokines were detected and T cell stimulatory capacity was reduced. Sialidase enzymatic treatment was able to mature these cells, diminishing endocytosis and promoting higher T cell stimulation. CD34-derived DCs showed higher maturation and endocytic capacity than moDCs. Although much more information was obtained from moDCs than from CD34-derived DCs, we conclude that the latter are probably better suited for generating an immune response against cancer, although much more research remains to be performed.
Abstract:
To address and resolve the wastewater contamination problem of the Sines refinery, with the main objective of optimizing the quality of this stream and reducing the costs charged to the refinery, a dynamic mass balance was developed and implemented for ammonia and polar oil and grease (O&G) contamination in the wastewater circuit. The inadequate routing of sour gas from the sour water stripping unit and the kerosene caustic washing unit were identified as the major sources of, respectively, the ammonia and the polar substances present in the industrial wastewater effluent. For the O&G content, a predictive model was developed for the kerosene caustic washing unit, following the Projection to Latent Structures (PLS) approach. The comparison between the analytical data for ammonia and polar O&G concentrations in the refinery wastewater originating from the Dissolved Air Flotation (DAF) effluent and the predictions of the dynamic mass balance calculations shows very good agreement and highlights the dominant impact of the identified streams on the wastewater contamination levels. The ammonia contamination problem was solved by rerouting the sour gas through an existing line that had been clogged with ammonia salts due to a non-insulated line section, while for the O&G the dynamic mass balance was implemented as an online tool, which allows possible contamination situations to be foreseen and the required preventive actions to be taken, and can also serve as a basis for establishing relationships between the O&G contamination in the refinery wastewater and the properties of the refined crude oils and the process operating conditions. The PLS model developed could be a great asset both in optimizing existing refinery wastewater treatment units or reuse schemes and in designing new ones. To find a possible treatment solution for the spent caustic problem, on-site pilot plant experiments for NaOH recovery from the refinery kerosene caustic washing unit effluent, using an alkaline-resistant nanofiltration (NF) polymeric membrane, were performed to evaluate its applicability for treating these highly alkaline and contaminated streams. At constant operating pressure and temperature and under adequate operating conditions, 99.9% oil and grease rejection and 97.7% chemical oxygen demand (COD) rejection were observed. No noticeable membrane fouling or flux decrease was registered up to a volume concentration factor of 3. These results allow NF permeate to be reused instead of fresh caustic and the wastewater contamination to be significantly reduced, which can result in savings of 1.5 M€ per year at current prices for the largest Portuguese oil refinery. The capital investment needed to implement the required NF membrane system is less than 10% of that associated with the traditional wet air oxidation solution to the spent caustic problem. The operating costs are very similar, but can be less than half if the NF concentrate is reused in refinery pH control applications. The payback period was estimated to be 1.1 years. Overall, the pilot plant experimental results obtained and the process economic evaluation data indicate a very competitive solution through the proposed NF treatment process, which represents a highly promising alternative to conventional and existing spent caustic treatment units.
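As a rough illustration of the PLS approach mentioned above, the hypothetical sketch below fits a Projection to Latent Structures (partial least squares) regression relating process variables of a caustic washing unit to an O&G concentration; the variable names and data are placeholders, not the refinery's actual model.

```python
# Hypothetical sketch of a PLS (Projection to Latent Structures) predictive model;
# the variables and data below are illustrative placeholders, not the refinery model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# X: process variables of the caustic washing unit (e.g. flow rates, temperatures,
# caustic strength); y: polar O&G concentration in the effluent (synthetic data).
X = rng.normal(size=(200, 6))
y = X @ np.array([0.8, -0.3, 0.5, 0.0, 0.1, -0.6]) + rng.normal(scale=0.2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=3)   # number of latent structures to retain
pls.fit(X_train, y_train)
print("R^2 on held-out data:", pls.score(X_test, y_test))
```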
Abstract:
Polysaccharides are gaining increasing attention as potentially environmentally friendly and sustainable building blocks in many fields of the (bio)chemical industry. The microbial production of polysaccharides is envisioned as a promising path, since higher biomass growth rates are possible and therefore higher productivities may be achieved compared to vegetable or animal polysaccharide sources. This Ph.D. thesis focuses on the modeling and optimization of a particular microbial polysaccharide production process, namely the production of extracellular polysaccharides (EPS) by the bacterial strain Enterobacter A47. Enterobacter A47 was found to be a metabolically versatile organism in terms of its adaptability to complex media, notably capable of achieving high growth rates in media containing glycerol byproduct from the biodiesel industry. However, the industrial implementation of this production process is still hampered by a largely unoptimized process. Kinetic rates of the bioreactor operation are heavily dependent on operational parameters such as temperature, pH, stirring and aeration rate. The increase of culture broth viscosity is a common feature of this culture and has a major impact on the overall performance. This fact complicates the mathematical modeling of the process, limiting the possibility to understand, control and optimize productivity. To tackle this difficulty, data-driven mathematical methodologies such as Artificial Neural Networks can be employed to incorporate additional process data and complement the known mathematical description of the fermentation kinetics. In this Ph.D. thesis, we have adopted such a hybrid modeling framework, which enabled the incorporation of temperature, pH and viscosity effects on the fermentation kinetics in order to improve the dynamical modeling and optimization of the process. A model-based optimization method was implemented that enabled the design of bioreactor optimal control strategies aimed at maximizing EPS productivity. It is also critical to understand EPS synthesis at the level of the bacterial metabolism, since the production of EPS is a tightly regulated process. Methods of pathway analysis provide a means to unravel the fundamental pathways and their controls in bioprocesses. In the present Ph.D. thesis, a novel methodology called Principal Elementary Mode Analysis (PEMA) was developed and implemented, enabling the identification of the cellular fluxes that are activated under different conditions of temperature and pH. It is shown that differences in these two parameters affect the chemical composition of EPS, hence they are critical for the regulation of product synthesis. In future studies, the knowledge provided by PEMA could foster the development of metabolically meaningful control strategies that target the EPS sugar content and other product quality parameters.
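A minimal sketch of the kind of hybrid (mechanistic plus data-driven) model described above is given below: the mass balances for biomass and EPS stay mechanistic, while the specific rates come from a small neural network fed with temperature, pH and viscosity. All function names, weights and values are hypothetical illustrations, not the thesis model.

```python
# Hypothetical sketch of a hybrid model: mechanistic mass balances with
# data-driven specific rates (a tiny neural network); all values are illustrative.
import numpy as np

def rates_nn(inputs, W1, b1, W2, b2):
    """Toy feed-forward net mapping (T, pH, viscosity) to (mu, q_eps)."""
    h = np.tanh(W1 @ inputs + b1)
    return W2 @ h + b2

def hybrid_ode(state, inputs, params):
    """Mass balances for a batch phase: dX/dt = mu*X, dP/dt = q_eps*X."""
    X, P = state
    mu, q_eps = rates_nn(inputs, *params)
    return np.array([mu * X, q_eps * X])

# Forward Euler simulation of one (arbitrary) operating condition.
rng = np.random.default_rng(1)
params = (0.1 * rng.normal(size=(4, 3)), np.zeros(4),
          0.1 * rng.normal(size=(2, 4)), np.zeros(2))
state = np.array([0.1, 0.0])             # initial biomass and EPS (g/L)
inputs = np.array([30.0, 7.0, 0.05])     # temperature (C), pH, viscosity (Pa.s)
dt, t_end = 0.1, 48.0                    # hours
for _ in range(int(t_end / dt)):
    state = state + dt * hybrid_ode(state, inputs, params)
print("final biomass, EPS:", state)
```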
Abstract:
Despite the extensive literature on finding new models to replace the Markowitz model or on increasing the accuracy of its input estimations, there are fewer studies on how the choice of optimization algorithm affects the results. This paper aims to add to this field by comparing the performance of two optimization algorithms in drawing the Markowitz Efficient Frontier and in real-world investment strategies. Second-order cone programming is faster and appears to be more efficient, but it is impossible to assert which algorithm is better. Quadratic programming often shows superior performance in real investment strategies.
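For reference, a standard way to state the mean-variance problem behind the Efficient Frontier, and the equivalent second-order cone form handled by an SOCP solver, is sketched below in generic notation ($w$ portfolio weights, $\mu$ expected returns, $\Sigma$ the covariance matrix, $r$ a target return); this is the textbook formulation, not necessarily the exact setup used in the paper.

```latex
% Quadratic-programming form of the Markowitz problem (generic notation).
\begin{aligned}
\min_{w}\quad & w^{\top}\Sigma w \\
\text{s.t.}\quad & \mu^{\top}w \ge r, \qquad \mathbf{1}^{\top}w = 1, \qquad w \ge 0.
\end{aligned}

% Equivalent second-order cone program: with \Sigma = L L^{\top} (Cholesky factor),
% minimize an auxiliary bound t on the portfolio risk.
\begin{aligned}
\min_{w,\,t}\quad & t \\
\text{s.t.}\quad & \lVert L^{\top} w \rVert_{2} \le t, \qquad
\mu^{\top}w \ge r, \qquad \mathbf{1}^{\top}w = 1, \qquad w \ge 0.
\end{aligned}
```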
Abstract:
Nowadays, a significant number of banks in Portugal are facing a bank-branch restructuring problem, and Millennium BCP is no exception. The closure of branches is a major component of profit maximization, through the reduction of operational and personnel costs, but it is also an opportunity to approach the idea of the "banking of the future" and to start thinking about the benefits of the digital era. This dissertation centers on a current high-impact organizational problem addressed by the company and consists of a proposed optimization of the model that Millennium BCP uses. Even though measures of performance are usually considered the most important elements in evaluating the viability of branches, there is evidence suggesting that other general factors can be important to assess branch potential, such as the influx at branches, the business dimensions of a branch and its location, which will be addressed in this project.
Abstract:
Sonae MC is constantly innovating and keeping up with new market trends, and is increasingly focused on e-commerce due to its growing importance. In that area, a telephone line is available to support customers with their problems. However, those problems were rarely solved on the first contact. Therefore, the goal of this work was to reengineer these processes to improve service performance and, consequently, customer satisfaction. Following an evolutionary approach, improvement opportunities were suggested; if correctly implemented, case resolution time could decrease by one day and Sonae MC would save €7.750 per month.
Abstract:
Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks and compactors). Often, earthworks are the most costly and time-consuming component of infrastructure constructions (e.g., road, railway and airports) and current pressure for higher productivity and safety highlights the need to optimize earthworks, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of a multi-objective and global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (costs and duration) and focus the evolutionary search (non-dominated sorting genetic algorithm-II) on compaction allocation, using linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were held using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
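To make the multi-objective element concrete, the sketch below shows the core notion the evolutionary search relies on: Pareto dominance over the two criteria (cost, duration) and the extraction of a non-dominated front from a set of candidate equipment allocations. It is a generic illustration, not the system's actual NSGA-II implementation, and the candidate values are invented.

```python
# Generic illustration of Pareto dominance over (cost, duration) pairs;
# not the actual NSGA-II implementation described in the abstract.
def dominates(a, b):
    """a dominates b if it is no worse on both criteria and better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(solutions):
    """Keep the candidate allocations that no other candidate dominates."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other is not s)]

# Each tuple is (cost in k-euro, duration in days) for one candidate allocation.
candidates = [(120, 30), (100, 45), (150, 25), (140, 35), (125, 50)]
print(non_dominated_front(candidates))   # -> [(120, 30), (100, 45), (150, 25)]
```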
Abstract:
Earthworks tasks aim at levelling the ground surface at a target construction area and precede any kind of structural construction (e.g., road and railway construction). They comprise sequential tasks, such as excavation, transportation, spreading and compaction, and are strongly based on heavy mechanical equipment and repetitive processes. In this context, it is essential to optimize the usage of all available resources under two key criteria: the cost and duration of earthwork projects. In this paper, we present an integrated system that uses two artificial intelligence based techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation considering the two main earthwork objectives (duration and cost). Experiments held using real-world data from a construction site have shown that the proposed system is competitive when compared with current manual earthwork design.
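As a hedged illustration of the data-mining step (data-driven productivity estimates), the sketch below fits a regression model relating equipment and site features to productivity; the feature names, data and learning algorithm are invented placeholders and may differ from those used in the integrated system.

```python
# Illustrative sketch of a data-driven productivity model (placeholder data);
# the real system's features and learning algorithm may differ.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# Hypothetical features: [equipment power, haul distance, material factor, operator experience]
X = rng.uniform(size=(300, 4))
# Synthetic target: productivity in m3/h with some noise.
y = 50 * X[:, 0] - 20 * X[:, 1] + 10 * X[:, 2] * X[:, 3] + rng.normal(scale=2, size=300)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(3))
```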
Abstract:
Earthworks tasks are often regarded in transportation projects as some of the most demanding processes. In fact, sequential tasks such as excavation, transportation, spreading and compaction are strongly based on heavy mechanical equipment and repetitive processes, thus becoming as economically demanding as they are time-consuming. Moreover, current construction requirements impose higher demands for productivity and safety in earthwork constructions. Given the share of infrastructure construction costs and duration that earthworks represent, the optimal usage of every resource in these tasks is paramount. Considering its characteristics, an earthwork construction can be regarded as a production line based on resources (mechanical equipment) and on dependency relations between sequential tasks, and is hence susceptible to optimization. Up to the present, the steady development of Information Technology areas, such as databases, artificial intelligence and operations research, has resulted in the emergence of several technologies with potential application for that purpose. Among these, modern optimization methods (also known as metaheuristics), such as evolutionary computation, have the potential to find high-quality solutions with a reasonable use of computational resources. In this context, this work describes an optimization algorithm for earthworks equipment allocation based on a modern optimization approach, which takes advantage of the concept that an earthwork construction can be regarded as a production line.
Abstract:
In highway construction, earthworks refer to the tasks of excavation, transportation, spreading and compaction of geomaterial (e.g., soil, rockfill and soil-rockfill mixtures). Since these tasks rely heavily on machinery and repetitive processes, they are highly susceptible to optimization. In this context, Artificial Intelligence techniques, such as Data Mining (DM) and modern optimization, can be applied to earthworks. A survey of these applications shows that they focus on the optimization of specific objectives and/or construction phases, making it possible to identify the capabilities and limitations of the analyzed techniques. Thus, to address the pinpointed drawbacks of these techniques, this paper describes a novel intelligent earthwork optimization system, capable of integrating DM, modern optimization and GIS technologies in order to optimize the earthwork processes throughout all phases of design and construction work. This integrated system allows significant savings in time, cost and gas emissions, contributing to more sustainable construction.
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing routing configurations to be attained that are robust to changes in the traffic demands and that keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
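To give a flavour of what such a framework evaluates internally, the sketch below scores one candidate link-weight configuration by routing a demand matrix over shortest paths and reporting the maximum link utilization, a common Traffic Engineering objective. The topology, weights and demands are invented, and the real framework's objectives and network model may differ.

```python
# Toy evaluation of an OSPF-style weight setting: route demands on shortest paths
# and report the worst link utilization. Topology, weights and demands are invented.
import networkx as nx

G = nx.DiGraph()
links = {("A", "B"): (1, 10), ("B", "C"): (1, 10), ("A", "C"): (3, 5),
         ("B", "A"): (1, 10), ("C", "B"): (1, 10), ("C", "A"): (3, 5)}
for (u, v), (weight, capacity) in links.items():
    G.add_edge(u, v, weight=weight, capacity=capacity, load=0.0)

demands = {("A", "C"): 6.0, ("C", "A"): 2.0}   # traffic matrix (arbitrary units)
for (src, dst), volume in demands.items():
    path = nx.shortest_path(G, src, dst, weight="weight")
    for u, v in zip(path, path[1:]):
        G[u][v]["load"] += volume

max_util = max(d["load"] / d["capacity"] for _, _, d in G.edges(data=True))
print("maximum link utilization:", round(max_util, 2))
```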
Abstract:
In this work, the optimization of an extrusion die designed for the production of a wood–plastic composite (WPC) decking profile is investigated. The optimization was performed with the help of numerical tools, more precisely by solving the continuity and momentum conservation equations that govern such a flow, with the aim of properly balancing the flow distribution at the outlet of the extrusion die flow channel. To capture the rheological behavior of the material, we used a Bird-Carreau model with parameters obtained from a fit to experimental shear viscosity versus shear rate data collected from rheological tests. To yield a balanced output flow, several numerical runs were performed by adjusting the flow restriction at different regions of the cross-section of the flow-channel parallel zone. The simulations were compared with the experimental results and an excellent qualitative agreement was obtained, thus allowing a good balancing of the output flow to be attained and emphasizing the advantages of using numerical tools to aid the design of profile extrusion dies.
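For reference, the Bird-Carreau shear-viscosity law fitted to the rheological data has the usual form below, where $\eta_0$ and $\eta_\infty$ are the zero- and infinite-shear-rate viscosities, $\lambda$ a characteristic time and $n$ the power-law index (generic notation; the fitted parameter values are not reproduced here).

```latex
% Bird-Carreau viscosity model (generic form).
\eta(\dot{\gamma}) \;=\; \eta_{\infty} \;+\;
\left(\eta_{0}-\eta_{\infty}\right)
\left[\,1+\left(\lambda\dot{\gamma}\right)^{2}\right]^{\frac{n-1}{2}}
```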
Abstract:
Shifting from chemical to biotechnological processes is one of the cornerstones of 21st century industry. The production of a great range of chemicals via biotechnological means is a key challenge on the way toward a bio-based economy. However, this shift is occurring at a pace slower than initially expected. The development of efficient cell factories that allow for competitive production yields is of paramount importance for this leap to happen. Constraint-based models of metabolism, together with in silico strain design algorithms, promise to reveal insights into the best genetic design strategies, a step further toward achieving that goal. In this work, a thorough analysis of the main in silico constraint-based strain design strategies and algorithms is presented, their application in real-world case studies is analyzed, and a path for the future is discussed.
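As a minimal, hypothetical illustration of the constraint-based formalism these strain design algorithms build on, the sketch below solves a toy flux balance analysis problem (maximize a product flux subject to steady-state stoichiometry and flux bounds); the 3-metabolite, 4-reaction network is invented, not taken from the paper.

```python
# Toy flux balance analysis: maximize product flux v4 subject to S v = 0 and bounds.
# The stoichiometric matrix below is an invented 3-metabolite, 4-reaction example.
import numpy as np
from scipy.optimize import linprog

# Reactions: v1 uptake -> A;  v2: A -> B;  v3: A -> C;  v4: B + C -> product (excreted)
S = np.array([
    [ 1, -1, -1,  0],   # metabolite A
    [ 0,  1,  0, -1],   # metabolite B
    [ 0,  0,  1, -1],   # metabolite C
])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]  # uptake limited to 10 units

# linprog minimizes, so negate the objective to maximize v4.
res = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
print("optimal product flux:", -res.fun, "flux distribution:", res.x)
```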
Abstract:
Transcriptional Regulatory Networks (TRNs) are a powerful tool for representing several of the interactions that occur within a cell. Recent studies have provided information to help researchers in the tasks of building and understanding these networks. One of the major sources of information for building TRNs is the biomedical literature. However, due to the rapidly increasing number of scientific papers, it is quite difficult to analyse the large amount of papers that have been published on this subject. This fact has heightened the importance of Biomedical Text Mining approaches in this task. Also, owing to the lack of adequate standards, as the number of databases increases, several inconsistencies concerning gene and protein names and identifiers are common. In this work, we developed an integrated approach for the reconstruction of TRNs that retrieves the relevant information from important biological databases and inserts it into a unique repository, named KREN. We also applied text mining techniques over this integrated repository to build TRNs. For this, it was necessary to create a dictionary of names and synonyms associated with these entities, and also to develop an approach that retrieves all the abstracts of the related scientific papers stored in PubMed, in order to create a corpus of data about genes. Furthermore, these tasks were integrated into @Note, a software system that provides methods from the Biomedical Text Mining field, including algorithms for Named Entity Recognition (NER), the extraction of all relevant terms from publication abstracts, and the extraction of relationships between biological entities (genes, proteins and transcription factors). Finally, this tool was extended to allow the reconstruction of Transcriptional Regulatory Networks from the scientific literature.
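As a rough sketch of the dictionary-based entity recognition step described above, the snippet below matches gene names and synonyms from a small lookup table against an abstract and normalizes them to a canonical identifier; the dictionary entries and the abstract text are made up and do not reflect the actual @Note or KREN contents.

```python
# Illustrative dictionary-based NER over an abstract: match gene synonyms and map
# them to canonical identifiers. The dictionary and text are made-up examples.
import re

synonyms = {            # synonym -> canonical gene identifier (hypothetical entries)
    "lexA": "b4043",
    "recA": "b2699",
    "cI repressor": "lambda_cI",
}

def recognize(text, synonyms):
    """Return (surface form, canonical id, position) for each dictionary hit."""
    hits = []
    for name, canonical in synonyms.items():
        for match in re.finditer(re.escape(name), text, flags=re.IGNORECASE):
            hits.append((match.group(0), canonical, match.start()))
    return sorted(hits, key=lambda h: h[2])

abstract = "Expression of recA is repressed by LexA, which is cleaved upon DNA damage."
for surface, canonical, pos in recognize(abstract, synonyms):
    print(f"{surface!r} -> {canonical} at offset {pos}")
```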
Abstract:
The construction sector is one of those chiefly responsible for energy consumption and carbon emissions, and the renovation of existing buildings plays an important role in actions to mitigate climate change. The present work is based on the methodology developed in IEA Annex 56, which allows cost-optimal and cost-effective renovation scenarios that improve energy performance to be identified. The analysed case study is a residential neighbourhood of the municipality of Gaia, in Portugal. The analysis compares a reference renovation scenario (without improving the energy performance of the building) with a series of alternative renovation scenarios, including the one that is being implemented.