975 results for Linear multiobjective optimization
Abstract:
Kinetic models have great potential for metabolic engineering applications. They can be used to test which genetic and regulatory modifications can increase the production of metabolites of interest, while simultaneously monitoring other key functions of the host organism. This work presents a methodology for increasing productivity in biotechnological processes by exploiting dynamic models. It uses multi-objective dynamic optimization to identify the combination of targets (enzymatic modifications) and the degree of up- or down-regulation that must be applied in order to optimize a set of pre-defined performance metrics subject to process constraints. The capabilities of the approach are demonstrated on a realistic and computationally challenging application: a large-scale metabolic model of Chinese Hamster Ovary (CHO) cells, which are used for antibody production in a fed-batch process. The proposed methodology provides sustained and robust growth in CHO cells, increasing productivity, biomass production, and product titer while keeping the concentrations of lactate and ammonia low. The approach presented here can be used to optimize metabolic models by finding the best combination of targets and their optimal level of up- or down-regulation. Furthermore, it can accommodate additional trade-offs and constraints with great flexibility.
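The multi-objective selection described in this abstract can be illustrated with a deliberately tiny sketch. Everything below is hypothetical: a single enzyme fold-change x and two toy objective functions (`titer`, `lactate`) stand in for the CHO kinetic model, and a brute-force Pareto filter stands in for the dynamic optimizer.

```python
import math

# Toy surrogate (hypothetical, NOT the CHO model): one enzyme fold-change x
# (x > 1 means up-regulation, x < 1 down-regulation) drives two competing
# objectives.
def titer(x):            # product titer: peaks at an intermediate level
    return x * math.exp(-0.2 * x)

def lactate(x):          # inhibitory by-product: grows with expression
    return 0.1 * x

def dominates(a, b):
    """a dominates b when it is no worse in both objectives (maximize
    titer, minimize lactate) and differs in at least one of them."""
    return a != b and a[0] >= b[0] and a[1] <= b[1]

# Enumerate candidate fold-changes and keep the non-dominated (Pareto) set.
points = [(titer(i / 10), lactate(i / 10), i / 10) for i in range(1, 101)]
pareto = [p for p in points
          if not any(dominates((q[0], q[1]), (p[0], p[1])) for q in points)]
```

On this surrogate every fold-change up to the titer peak (x = 5) is Pareto-optimal, reflecting the titer-versus-lactate trade-off the abstract describes; larger fold-changes are dominated because they lower titer and raise lactate at the same time.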
Abstract:
Inspired by the relational algebra of data processing, this paper addresses the foundations of data analytical processing from a linear algebra perspective. In particular, it investigates how aggregation operations essential to quantitative data analysis, such as cross tabulations and data cubes, can be expressed solely in terms of matrix multiplication, transposition, and the Khatri–Rao variant of the Kronecker product. The approach offers a basis for deriving an algebraic theory of data consolidation, handling the quantitative as well as qualitative sides of data science in a natural, elegant, and typed way. It also shows potential for parallel analytical processing, as the parallelization theory of such matrix operations is well established.
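The central claim, that cross tabulations reduce to matrix multiplication plus a Khatri–Rao-style product, can be sketched in a few lines of NumPy. The mini-dataset is hypothetical, and `indicator` plays the role of the paper's typed projection matrices:

```python
import numpy as np

# Hypothetical mini-dataset: two categorical attributes over six records.
color = ["red", "blue", "red", "red", "blue", "blue"]
size = ["S", "S", "L", "S", "L", "L"]

def indicator(values):
    """0/1 'projection' matrix: one row per record, one column per value."""
    levels = sorted(set(values))
    return levels, np.array([[int(v == lv) for lv in levels] for v in values])

c_levels, C = indicator(color)   # 6 x 2
s_levels, S = indicator(size)    # 6 x 2

# A cross tabulation is just C^T @ S: entry (i, j) counts the records with
# color c_levels[i] and size s_levels[j].
ctab = C.T @ S

# A row-wise Khatri-Rao product (one Kronecker product per record) flattens
# the same counts into a single vector over (color, size) combinations.
KR = np.array([np.kron(C[i], S[i]) for i in range(len(color))])
flat = KR.sum(axis=0)
```

Summing the Khatri–Rao rows yields exactly the flattened cross tabulation, which is the kind of algebraic identity the paper builds on.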
Abstract:
Doctoral thesis in Civil Engineering.
Abstract:
Integrated master's dissertation in Civil Engineering.
Abstract:
[Excerpt] Bioethanol from lignocellulosic materials (LCM), also called second-generation bioethanol, is considered a promising alternative to first-generation bioethanol. An efficient production process for lignocellulosic bioethanol involves an effective pretreatment of LCM to improve the accessibility of cellulose and thus enhance enzymatic saccharification. One interesting approach is to use the whole slurry from pretreatment, since it offers economic and industrial benefits: washing steps are avoided, water consumption is lower, and the sugars from the liquid phase can be used, increasing the ethanol concentration [1]. However, during the pretreatment step some compounds (such as furans, phenolic compounds, and weak acids) are produced. These compounds have an inhibitory effect on the microorganisms used for hydrolysate fermentation [2]. To overcome this, the use of a robust industrial strain together with agro-industrial by-products as nutritional supplementation was proposed to increase ethanol productivity and yield. (...)
Abstract:
Fluorescence in situ hybridization (FISH) is a molecular technique widely used for the detection and characterization of microbial populations. FISH is affected by a wide variety of abiotic and biotic variables and by the way they interact with each other, which translates into the wide variability of FISH procedures found in the literature. The aim of this work is to systematically study the effects of pH, dextran sulfate, and probe concentration in the FISH protocol, using a general peptide nucleic acid (PNA) probe for the Eubacteria domain. To this end, response surface methodology was used to optimize these three PNA-FISH parameters for Gram-negative (Escherichia coli and Pseudomonas fluorescens) and Gram-positive species (Listeria innocua, Staphylococcus epidermidis and Bacillus cereus). The results show that a probe concentration higher than 300 nM is favorable for both groups. Interestingly, a clear distinction between the two groups was found regarding the optimal pH and dextran sulfate concentration: a high pH (approx. 10) combined with a lower dextran sulfate concentration (approx. 2% [w/v]) for Gram-negative species, and a near-neutral pH (approx. 8) together with higher dextran sulfate concentrations (approx. 10% [w/v]) for Gram-positive species. This behavior seems to result from an interplay between pH and dextran sulfate and their ability to influence probe concentration and diffusion towards the rRNA target. This study shows that, for an optimum hybridization protocol, dextran sulfate and pH should be adjusted according to the target bacteria.
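Response surface methodology, as used in this abstract, fits a second-order model to designed experiments and locates its stationary point. A minimal sketch on hypothetical noise-free data whose optimum is deliberately placed at the pH 10 / 2% (w/v) values reported for Gram-negative species (a real study would fit replicated experimental measurements):

```python
import numpy as np

# Hypothetical quadratic response with its optimum at pH 10, 2 %(w/v).
def true_response(ph, ds):
    return 100.0 - 2.0 * (ph - 10.0) ** 2 - 1.5 * (ds - 2.0) ** 2

# Noise-free design points standing in for a central composite design.
design = [(ph, ds) for ph in (8.0, 9.0, 10.0, 11.0)
                   for ds in (1.0, 2.0, 4.0, 8.0)]
X = np.array([[1.0, ph, ds, ph * ph, ds * ds, ph * ds] for ph, ds in design])
y = np.array([true_response(ph, ds) for ph, ds in design])

# Least-squares fit of the second-order model
# y = b0 + b1*ph + b2*ds + b11*ph^2 + b22*ds^2 + b12*ph*ds
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = beta

# Stationary point of the fitted surface: set both partial derivatives to 0.
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt_ph, opt_ds = np.linalg.solve(H, -np.array([b1, b2]))
```

Because the synthetic data are exactly quadratic, the fit recovers the planted optimum; with real measurements the same algebra yields the estimated optimal protocol settings.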
Abstract:
It has been reported that growth hormone may benefit selected patients with congestive heart failure. A 63-year-old man with refractory congestive heart failure awaiting heart transplantation, dependent on intravenous drugs (dobutamine) and presenting progressive worsening of his clinical status and cachexia despite standard treatment, received growth hormone replacement (8 units per day) for optimization of congestive heart failure management. An increase in both serum growth hormone levels (from 0.3 to 0.8 mg/l) and serum IGF-1 levels (from 130 to 300 ng/ml) was noted, in association with improvement in clinical status, better optimization of heart failure treatment, and discontinuation of the dobutamine infusion. Left ventricular ejection fraction (by MUGA) increased from 13% to 18%, and later to 28%, in association with a reduction in pulmonary pressures and an increase in exercise capacity (rise in peak VO2 to 13.4 and later to 16.2 ml/kg/min). The patient was "de-listed" for heart transplantation. Growth hormone may benefit selected patients with refractory heart failure.
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols such as OSPF or IS-IS. In order to deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, which attain routing configurations that are robust to changes in the traffic demands and keep the network stable even in the presence of link failure events. The illustrative results presented clearly corroborate the usefulness of the proposed automated framework and of the devised optimization methods.
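A heavily simplified, hypothetical stand-in for the kind of search this framework performs: a single-objective (1+1) evolutionary loop over link-state weights on a toy five-link topology, with Dijkstra playing the role of OSPF/IS-IS shortest-path routing. A real MOEA such as NSGA-II would evolve a whole population against several objectives (normal load, failure scenarios) at once.

```python
import heapq
import random

# Hypothetical topology and traffic matrix (NOT from the paper).
links = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("B", "D")]
demands = {("A", "C"): 5, ("B", "D"): 5}

def shortest_path_edges(weights, src, dst):
    """Plain Dijkstra over the undirected topology; returns the edges used."""
    adj = {}
    for (u, v), w in weights.items():
        adj.setdefault(u, []).append((v, w))
        adj.setdefault(v, []).append((u, w))
    dist, prev, pq = {src: 0}, {}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    edges, node = [], dst
    while node != src:
        edges.append((prev[node], node))
        node = prev[node]
    return edges

def max_link_load(weights):
    """Objective: the most loaded link under shortest-path routing."""
    load = dict.fromkeys(weights, 0)
    for (s, t), vol in demands.items():
        for u, v in shortest_path_edges(weights, s, t):
            load[(u, v) if (u, v) in load else (v, u)] += vol
    return max(load.values())

random.seed(1)
# Start from a poor configuration: the costly B-D chord forces shared links.
best = {("A", "B"): 1, ("B", "C"): 1, ("C", "D"): 1, ("D", "A"): 1, ("B", "D"): 10}
best_fit = max_link_load(best)
for _ in range(300):   # (1+1)-EA: mutate one weight, keep the child if no worse
    child = dict(best)
    child[random.choice(links)] = random.randint(1, 10)
    fit = max_link_load(child)
    if fit <= best_fit:
        best, best_fit = child, fit
```

Lowering the chord weight lets the B-to-D demand take the direct link, separating the two flows; the evolutionary loop can discover this without any knowledge of the topology.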
Abstract:
Sandwich geometries, mainly in the form of panels and beams, are commonly applied in various transportation industries, such as the aerospace, aeronautic, and automotive industries. Sandwich geometries offer important advantages in structural applications, namely high specific stiffness, low weight, and the possibility of design optimization prior to manufacturing. The aim of this paper is to uncover the influence of the number of reinforcements (ribs) and of their thickness on the mechanical behavior of all-metal sandwich panels subjected to uncoupled bending and torsion loadings. In this study, four geometries are compared; the orientation of the reinforcements and the effect of transversal ribs are also considered. It is shown that all the relations are non-linear, despite the elastic nature of the analysis performed in the finite element software ANSYS MECHANICAL APDL.
Abstract:
Decision support models in intensive care units are developed to support the medical staff in their decision-making process. However, optimizing these models is particularly difficult due to their dynamic, complex, and multidisciplinary nature. Thus, there is constant research and development of new algorithms capable of extracting knowledge from large volumes of data, in order to obtain better predictive results than the current algorithms. To test the optimization techniques, a case study with real data provided by the INTCare project was explored. The data concern extubation cases. On this dataset, several models, such as Evolutionary Fuzzy Rule Learning, Lazy Learning, and Decision Trees, among others, were analysed in order to detect early extubation. The hybrid Decision Trees Genetic Algorithm, Supervised Classifier System, and KNNAdaptive models obtained the highest accuracy rates (93.2%, 93.1%, and 92.97%, respectively), thus showing their feasibility to work in a real environment.
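As a minimal, hypothetical stand-in for the decision-tree component of the hybrid models above: a one-split decision stump trained exhaustively on synthetic extubation-style data (a single invented feature; not the INTCare dataset).

```python
# Synthetic, invented data: feature = a spontaneous-breathing score,
# label = successful extubation (1) or not (0).
data = [(2, 0), (3, 0), (4, 0), (5, 1), (6, 1), (7, 1), (3, 0), (8, 1)]

def best_stump(samples):
    """Pick the threshold with the highest training accuracy.

    A full decision-tree learner applies this kind of split search
    recursively over many features; a genetic-algorithm hybrid would
    instead evolve the split parameters."""
    thresholds = sorted({x for x, _ in samples})
    best_t, best_acc = None, -1.0
    for t in thresholds:
        # Predict 1 when the feature reaches the threshold.
        acc = sum((x >= t) == bool(y) for x, y in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

t, acc = best_stump(data)
```

On this toy sample the stump finds the separating threshold exactly; real ICU data would require deeper trees, cross-validation, and the hybrid search strategies named in the abstract.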
Abstract:
OBJECTIVE: To report the hemodynamic and functional responses obtained with clinical optimization guided by hemodynamic parameters in patients with severe and refractory heart failure. METHODS: Invasive hemodynamic monitoring using right heart catheterization aimed to reach low filling pressures and low peripheral resistance. Frequent adjustments of intravenous diuretics and vasodilators were performed according to the hemodynamic measurements. RESULTS: We assessed 19 patients (age = 48±12 years; ejection fraction = 21±5%) with severe heart failure. The intravenous use of diuretics and vasodilators reduced pulmonary artery occlusion pressure by 12 mm Hg (a relative reduction of 43%; P<0.001), with a concomitant increment of 6 mL per beat in stroke volume (a relative increment of 24%; P<0.001). We observed significant associations between pulmonary artery occlusion pressure and both mean pulmonary artery pressure (r=0.76; P<0.001) and central venous pressure (r=0.63; P<0.001). After clinical optimization, functional class improved (P<0.001), with a tendency towards improvement in ejection fraction and no impairment of renal function. CONCLUSION: Optimization guided by hemodynamic parameters in patients with refractory heart failure provides a significant improvement in the hemodynamic profile, with concomitant improvement in functional class. This study emphasizes that adjustments in blood volume result in immediate benefits for patients with severe heart failure.
Abstract:
Doctoral thesis in Industrial and Systems Engineering.
Abstract:
In this project, numerical algorithms will be developed for nonlinear hyperbolic-parabolic systems of partial differential equations. Such systems have applications in wave propagation in aerospace and astrophysical settings. General objectives: 1) Development and improvement of numerical algorithms in order to increase the quality of simulations of the propagation and interaction of nonlinear gasdynamic and magnetogasdynamic waves. 2) Development of computational codes to simulate high-enthalpy gasdynamic flows, including chemical changes and dispersive and diffusive effects. 3) Development of computational codes to simulate ideal and real magnetogasdynamic flows. 4) Application of the new algorithms and computational codes to the solution of the aerothermodynamic flow around bodies entering the Earth's atmosphere. 5) Application of the new algorithms and computational codes to the simulation of the nonlinear dynamic behavior of magnetic arches in the solar corona. 6) Development of new models to describe the nonlinear behavior of magnetic arches in the solar corona. The main objective of this project is to introduce improvements in numerical algorithms for simulating the propagation and interaction of nonlinear waves in two gaseous media: those without free electric charge (gasdynamic flows) and those with free electric charge (magnetogasdynamic flows).
At the same time, computational codes implementing the improved numerical techniques will be developed. The numerical algorithms will be applied in order to advance knowledge on topics of interest in aerospace engineering, such as the computation of the heat flux and aerothermodynamic forces sustained by objects entering the Earth's atmosphere, and on astrophysical topics such as the propagation and interaction of waves, both for energy transfer and for the generation of instabilities in magnetic arches of the solar corona. These two topics share the numerical techniques and algorithms with which they will be treated. The ideal gasdynamic and magnetogasdynamic equations form hyperbolic systems of differential equations and can be solved using Riemann solvers together with the finite volume method (Toro 1999; Udrea 1999; LeVeque 1992 and 2005). The inclusion of diffusive effects makes the systems of equations hyperbolic-parabolic. The parabolic contribution can be treated as source terms, handled either explicitly or implicitly (Udrea 1999; LeVeque 2005). To analyze the flow around bodies entering the atmosphere, the chemically reacting Navier-Stokes equations will be used, as long as the temperature does not exceed 6000 K; for higher temperatures, ionization effects must be considered (Anderson, 1989). Both diffusive effects and chemical changes will be treated as source terms in the Euler equations. To treat wave propagation, energy transfer, and instabilities in magnetic arches of the solar corona, the ideal and real magnetogasdynamic equations will be used. In this case it will also be convenient to implement source terms for transport phenomena such as heat flux and radiation.
The codes will use the finite volume technique, together with Total Variation Diminishing (TVD) schemes on structured and unstructured meshes.
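A minimal sketch of the finite-volume/TVD machinery mentioned in the last paragraph, for 1-D linear advection: first-order upwind is the simplest scheme satisfying the Total Variation Diminishing property. The codes in the project would add higher-order TVD limiters and the full gasdynamic/magnetogasdynamic equations on top of updates like this one.

```python
# 1-D finite volume for linear advection u_t + a u_x = 0, upwind fluxes.
a, dx, dt, n = 1.0, 0.1, 0.05, 40        # CFL number a*dt/dx = 0.5 <= 1
u = [1.0 if 3 <= i <= 5 else 0.0 for i in range(n)]   # square pulse

def total_variation(v):
    # Periodic total variation: sum of |jumps| including the wrap-around.
    return sum(abs(v[i] - v[i - 1]) for i in range(len(v)))

def upwind_step(v):
    """u_i^{n+1} = u_i - (a*dt/dx) * (u_i - u_{i-1}), periodic boundary."""
    c = a * dt / dx
    return [v[i] - c * (v[i] - v[i - 1]) for i in range(len(v))]

tv0, mass0 = total_variation(u), sum(u)
for _ in range(20):
    u = upwind_step(u)
```

Two properties worth checking numerically: the total variation never grows (the TVD property) and the cell average sum is conserved (the finite-volume property), both of which this update satisfies.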
Abstract:
In our previous project we approximated the computation of a definite integral with integrands exhibiting large functional variations. Our approach parallelizes the computation algorithm of an adaptive quadrature method based on Newton-Cotes rules. The first results obtained were communicated at various national and international conferences; they allowed us to begin a typification of the existing quadrature rules and a classification of some of the functions used as test functions. These classification and typification tasks have not been completed, so we intend to continue them in order to be able to report on whether or not it is advisable to use our technique. To carry out this task, a base of test functions will be assembled and the spectrum of quadrature rules to be used will be broadened. In addition, we propose to restructure the computation of some routines involved in computing the minimum energy of a molecule. This program already exists in a sequential version and is modeled using the LCAO approximation. It obtains successful results in terms of precision, compared with similar international publications, but requires significantly long computation times. Our proposal is to parallelize the aforementioned algorithm, addressing it on at least two levels: 1) deciding whether it is better to distribute the computation of a single integral among several processors or to distribute different integrals among different processors; we must keep in mind that in parallel architectures based on networks (typically local area networks, LANs) the time spent sending messages between processors is very significant when measured in the number of arithmetic operations a processor can complete in that time; 2) if necessary, parallelizing the computation of double and/or triple integrals.
To develop our proposal, heuristics will be devised to verify and build models for the cases mentioned, aimed at improving the known computation routines, and the algorithms will be tested on benchmark cases. The methodology is the usual one in numerical computing. Each proposal requires: a) implementing a computation algorithm, trying to improve on the existing versions; b) carrying out comparisons with the existing routines to confirm or discard better numerical performance; c) carrying out theoretical error studies related to the method and to the implementation. An interdisciplinary team was formed, integrating researchers from both Computer Science and Mathematics. Goals: we expect to obtain a characterization of quadrature rules according to their effectiveness on functions with oscillatory behavior and with exponential decay, and to develop adequate, optimized computational implementations based on parallel architectures.
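A sketch of the kind of adaptive Newton-Cotes quadrature being parallelized, here the classic recursive adaptive Simpson rule. This version is sequential; a parallel version would distribute disjoint subintervals, or whole integrals, among processors, which is exactly the trade-off discussed above.

```python
import math

def simpson(f, a, b):
    """Basic three-point Newton-Cotes (Simpson) rule on [a, b]."""
    m = (a + b) / 2
    return (b - a) / 6 * (f(a) + 4 * f(m) + f(b))

def adaptive_simpson(f, a, b, tol=1e-9):
    """Refine only where the local error estimate exceeds the tolerance."""
    m = (a + b) / 2
    whole = simpson(f, a, b)
    left, right = simpson(f, a, m), simpson(f, m, b)
    if abs(left + right - whole) < 15 * tol:   # standard Richardson estimate
        return left + right + (left + right - whole) / 15
    return (adaptive_simpson(f, a, m, tol / 2) +
            adaptive_simpson(f, m, b, tol / 2))
```

For example, the oscillatory integrand sin(x) over [0, π] integrates to 2; the recursion concentrates function evaluations where the integrand varies most, which is what makes the subinterval-distribution question above non-trivial.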
Abstract:
Bit-serial processing, digital signal processing, transmission, time division, linear programming, linear optimization