994 results for VLSI, floorplanning, optimization, greedy algorithm, ordered tree


Relevance:

20.00%

Publisher:

Abstract:

White sand forests, although nutrient-poor, are characterized not only by several endemic plant species but also by several monodominant species. In general, plants in these forests have noticeably thin stems. The aim of this work was to elaborate a parallel dichotomous key for the identification of the angiosperm tree species occurring in the white sand forests of the Allpahuayo Mishana National Reserve, Loreto, Peru. We compiled a list of species from several publications in order to obtain the most comprehensive possible list of species occurring in white sand forest. We found 219 angiosperm species; the most abundant were Pachira brevipes (26.27%), Caraipa utilis (17.90%), Dicymbe uaiparuensis (13.27%), Dendropanax umbellatus (3.28%), Sloanea spathulata (2.52%), Ternstroemia klugiana (2.30%), Haploclathra cordata (2.28%), Parkia igneiflora (1.20%), Emmotum floribundum (1.06%) and Ravenia biramosa (1.04%), among others. Most white sand forest species can be distinguished using characteristics of their stems, branches and leaves. The key is very useful for the development of floristic inventories and related projects in the white sand forests of the Allpahuayo Mishana National Reserve.
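A dichotomous key is, computationally, a small decision tree over observable characters. As a minimal sketch of that structure (the couplets below are hypothetical, though the species names come from the list above):

```python
# Minimal sketch of a dichotomous key as a binary decision tree.
# Each internal node asks a yes/no question about a character;
# leaves name a species. The couplets are hypothetical.

key = {
    "question": "Leaves compound?",
    "yes": {"species": "Parkia igneiflora"},
    "no": {
        "question": "Latex present when stem is cut?",
        "yes": {"species": "Pachira brevipes"},
        "no": {"species": "Caraipa utilis"},
    },
}

def identify(node, answers):
    """Walk the key, consuming one yes/no answer per couplet."""
    while "species" not in node:
        node = node[answers.pop(0)]
    return node["species"]

print(identify(key, ["no", "yes"]))  # -> Pachira brevipes
```

A real key with 219 species simply has more couplets; the traversal logic is unchanged.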

Relevance:

20.00%

Publisher:

Abstract:

Species distribution modeling has relevant implications for biodiversity studies, conservation decision making and knowledge of the ecological requirements of species. The aim of this study was to evaluate whether forest inventories can improve the estimation of occurrence probability and identify the limits of the potential distribution and habitat preference of a group of timber tree species. The environmental predictor variables were elevation, slope, aspect, normalized difference vegetation index (NDVI) and height above the nearest drainage (HAND). To estimate the distribution of species we used the maximum entropy method (Maxent). Compared with a random distribution, using topographic variables and the vegetation index as features, the Maxent method predicted the geographical distribution of the studied species with an average accuracy of 86%. Elevation and NDVI were the most important variables. There were limitations to interpolating the models to non-sampled locations outside the elevation gradient associated with the occurrence data, affecting approximately 7% of the basin area. Ceiba pentandra (samaúma), Castilla ulei (caucho) and Hura crepitans (assacu) are more likely to occur in areas near watercourses. Clarisia racemosa (guariúba), Amburana acreana (cerejeira), Aspidosperma macrocarpon (pereiro), Apuleia leiocarpa (cumaru cetim), Aspidosperma parvifolium (amarelão) and Astronium lecointei (aroeira) can also occur in upland forest on well-drained soils. This modeling approach has potential for application to other, still poorly studied tropical species, especially those under pressure from logging.
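As an illustration of the "environmental predictors → occurrence probability" idea, the sketch below fits a simple logistic model to synthetic elevation/NDVI data by stochastic gradient descent. This is a toy stand-in, not Maxent, and the data, ranges and coefficients are all invented:

```python
import math, random

# Toy stand-in for species distribution modelling: occurrence probability
# as a logistic function of elevation and NDVI, fitted on synthetic data.
random.seed(0)

def simulate(n=500):
    data = []
    for _ in range(n):
        elev = random.uniform(100, 300)   # metres (hypothetical range)
        ndvi = random.uniform(0.2, 0.9)
        # hypothetical truth: the species prefers low elevation, high NDVI
        p = 1 / (1 + math.exp(-(-0.02 * (elev - 200) + 4 * (ndvi - 0.5))))
        data.append((elev, ndvi, 1 if random.random() < p else 0))
    return data

def fit(data, lr=0.01, epochs=500):
    w0 = w1 = w2 = 0.0
    for _ in range(epochs):
        for elev, ndvi, y in data:
            x1, x2 = (elev - 200) / 100, ndvi - 0.5  # crude standardization
            p = 1 / (1 + math.exp(-(w0 + w1 * x1 + w2 * x2)))
            g = p - y                                # logistic-loss gradient
            w0 -= lr * g; w1 -= lr * g * x1; w2 -= lr * g * x2
    return w0, w1, w2

w0, w1, w2 = fit(simulate())
print(w1 < 0, w2 > 0)  # fitted signs should recover the simulated preference
```

Maxent additionally handles presence-only data and feature transformations, which this sketch deliberately omits.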

Relevance:

20.00%

Publisher:

Abstract:

A high-resolution mtDNA phylogenetic tree allowed us to look backward in time to investigate purifying selection. Purifying selection was very strong in the last 2,500 years, continuously eliminating pathogenic mutations back until the end of the Younger Dryas (∼11,000 years ago), when a large population expansion likely relaxed selection pressure. This was preceded by a phase of stable selection until another relaxation occurred in the out-of-Africa migration. Demography and selection are closely related: expansions led to relaxation of selection, and mutations of higher pathogenicity significantly decreased the growth of descendants. The only detectable positive selection was the recurrence of highly pathogenic nonsynonymous mutations (m.3394T>C-m.3397A>G-m.3398T>C) at interior branches of the tree, preventing the formation of a dinucleotide STR (TATATA) in the MT-ND1 gene. At the most recent time scale, in 124 mother-child transmissions, purifying selection was detectable through the loss of mtDNA variants with high predicted pathogenicity. A few haplogroup-defining sites were also heteroplasmic, agreeing with a significant propensity of 349 positions in the phylogenetic tree to revert to the ancestral variant. This nonrandom mutation property explains the observation of heteroplasmic mutations at some haplogroup-defining sites in sequencing datasets, which need not indicate poor quality, as has been claimed.

Relevance:

20.00%

Publisher:

Abstract:

Master's dissertation (integrated master's programme) in Mechanical Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Doctoral thesis in Materials Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Kinetic models have great potential for metabolic engineering applications. They can be used to test which genetic and regulatory modifications can increase the production of metabolites of interest, while simultaneously monitoring other key functions of the host organism. This work presents a methodology for increasing productivity in biotechnological processes by exploiting dynamic models. It uses multi-objective dynamic optimization to identify the combination of targets (enzymatic modifications) and the degree of up- or down-regulation that must be applied in order to optimize a set of pre-defined performance metrics subject to process constraints. The capabilities of the approach are demonstrated on a realistic and computationally challenging application: a large-scale metabolic model of Chinese hamster ovary (CHO) cells, which are used for antibody production in a fed-batch process. The proposed methodology provides sustained and robust growth of the CHO cells, increasing productivity, biomass production and product titer simultaneously while keeping lactate and ammonia concentrations low. The approach presented here can be used to optimize metabolic models by finding the best combination of targets and their optimal level of up-/down-regulation. Furthermore, it can accommodate additional trade-offs and constraints with great flexibility.
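The scalarize-and-constrain structure of such an optimization can be sketched on a toy surrogate. Everything below — the surrogate response, the fold-change bounds, the objective weights and the lactate cap — is hypothetical and merely stands in for the actual CHO kinetic model and dynamic optimizer:

```python
import random

# Decision variables: fold-changes of three hypothetical enzymes.
# Objectives (titer, biomass) are combined by a weighted sum, with the
# lactate constraint handled as a penalty. Random search stands in for
# the paper's multi-objective dynamic optimization.
random.seed(1)

def surrogate(folds):
    """Invented steady-state response of the process to fold-changes."""
    e1, e2, e3 = folds
    titer = 2.0 * e1 - 0.5 * e1 * e1 + 1.0 * e2
    biomass = 1.5 * e2 - 0.3 * e2 * e2 + 0.5 * e3
    lactate = 0.8 * e1 + 0.2 * e3
    return titer, biomass, lactate

def scalarized(folds, w_titer=0.6, w_biomass=0.4, lactate_cap=2.5):
    titer, biomass, lactate = surrogate(folds)
    penalty = 100.0 * max(0.0, lactate - lactate_cap)  # constraint violation
    return w_titer * titer + w_biomass * biomass - penalty

# random search over an allowed up-/down-regulation range of [0.25, 4]
best, best_score = None, float("-inf")
for _ in range(20000):
    cand = [random.uniform(0.25, 4.0) for _ in range(3)]
    s = scalarized(cand)
    if s > best_score:
        best, best_score = cand, s

print(best_score > scalarized([1.0, 1.0, 1.0]))  # beats the unmodified strain
```

Sweeping the weights traces out trade-off (Pareto) solutions, which is the role the MOEA plays in the full methodology.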

Relevance:

20.00%

Publisher:

Abstract:

Doctoral thesis in Civil Engineering.

Relevance:

20.00%

Publisher:

Abstract:

[Excerpt] Bioethanol from lignocellulosic materials (LCM), also called second-generation bioethanol, is considered a promising alternative to first-generation bioethanol. An efficient production process for lignocellulosic bioethanol involves an effective pretreatment of the LCM to improve the accessibility of cellulose and thus enhance enzymatic saccharification. One interesting approach is to use the whole slurry from the treatment, since it offers economic and industrial benefits: washing steps are avoided, water consumption is lower, and the sugars from the liquid phase can be used, increasing the ethanol concentration [1]. However, during the pretreatment step some compounds (such as furans, phenolic compounds and weak acids) are produced. These compounds have an inhibitory effect on the microorganisms used to ferment the hydrolysate [2]. To overcome this, the use of a robust industrial strain together with agro-industrial by-products as nutritional supplementation was proposed to increase ethanol productivity and yield. (...)

Relevance:

20.00%

Publisher:

Abstract:

Fluorescence in situ hybridization (FISH) is a molecular technique widely used for the detection and characterization of microbial populations. FISH is affected by a wide variety of abiotic and biotic variables and by the way they interact with each other, which translates into the wide variability of FISH procedures found in the literature. The aim of this work was to systematically study the effects of pH, dextran sulfate concentration and probe concentration on the FISH protocol, using a general peptide nucleic acid (PNA) probe for the Eubacteria domain. For this, response surface methodology was used to optimize these three PNA-FISH parameters for Gram-negative (Escherichia coli and Pseudomonas fluorescens) and Gram-positive (Listeria innocua, Staphylococcus epidermidis and Bacillus cereus) species. The results show that a probe concentration above 300 nM is favorable for both groups. Interestingly, a clear distinction between the two groups was found regarding the optimal pH and dextran sulfate concentration: a high pH (approx. 10) combined with a lower dextran sulfate concentration (approx. 2% [w/v]) for Gram-negative species, and a near-neutral pH (approx. 8) together with higher dextran sulfate concentrations (approx. 10% [w/v]) for Gram-positive species. This behavior seems to result from an interplay between pH and dextran sulfate and their ability to influence probe concentration and diffusion towards the rRNA target. This study shows that, for an optimal hybridization protocol, dextran sulfate and pH should be adjusted according to the target bacteria.
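The core of response surface methodology is fitting a low-order polynomial to designed experiments and locating its stationary point. A one-factor sketch with invented pH/efficiency readings (the study itself fits a multi-factor surface over pH, dextran sulfate and probe concentration):

```python
# Fit y = a + b*x + c*x^2 by least squares and read off the optimum pH.
# The fluorescence readings below are invented for illustration.

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    m = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, 3):
            f = m[r][i] / m[i][i]
            for c in range(i, 4):
                m[r][c] -= f * m[i][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (m[i][3] - sum(m[i][c] * x[c] for c in range(i + 1, 3))) / m[i][i]
    return x

def fit_quadratic(xs, ys):
    """Least-squares quadratic via the normal equations."""
    n = len(xs)
    s = lambda k: sum(x ** k for x in xs)
    A = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    rhs = [sum(ys), sum(x * y for x, y in zip(xs, ys)),
           sum(x * x * y for x, y in zip(xs, ys))]
    return solve3(A, rhs)

pH = [7, 8, 9, 10, 11]
eff = [55, 78, 90, 86, 60]      # hypothetical fluorescence readings
a, b, c = fit_quadratic(pH, eff)
optimum = -b / (2 * c)          # stationary point of the fitted curve
print(round(optimum, 1))        # -> 9.1 for these invented readings
```

With several factors the same normal-equations idea applies, with interaction and squared terms in the design matrix.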

Relevance:

20.00%

Publisher:

Abstract:

It has been reported that growth hormone may benefit selected patients with congestive heart failure. A 63-year-old man with refractory congestive heart failure awaiting heart transplantation, dependent on intravenous drugs (dobutamine) and presenting progressive worsening of clinical status and cachexia despite standard treatment, received growth hormone replacement (8 units per day) for optimization of congestive heart failure management. Increases in both serum growth hormone levels (from 0.3 to 0.8 mg/l) and serum IGF-1 levels (from 130 to 300 ng/ml) were noted, in association with improvement in clinical status, better optimization of heart failure treatment and discontinuation of the dobutamine infusion. Left ventricular ejection fraction (by MUGA) increased from 13% to 18%, and later to 28%, in association with a reduction in pulmonary pressures and an increase in exercise capacity (rise in peak VO2 to 13.4 and later to 16.2 ml/kg/min). The patient was "de-listed" for heart transplantation. Growth hormone may benefit selected patients with refractory heart failure.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as multi-objective evolutionary algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative traffic engineering methods are described, which attain routing configurations that are robust to changes in the traffic demands and that keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework and of the devised optimization methods.
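The weight-setting idea behind such frameworks can be sketched in miniature: mutate link weights and keep a mutant whenever it does not worsen the maximum link utilization under shortest-path routing. The topology, capacities, demands and the single-objective (1+1) loop below are illustrative simplifications of the multi-objective framework (and real OSPF would also split traffic over equal-cost paths):

```python
import heapq, random

random.seed(4)
links = [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"), ("B", "D")]
capacity = 10.0                                   # per link, hypothetical
demands = [("A", "C", 8.0), ("B", "C", 8.0)]      # hypothetical traffic matrix

def shortest_path(weights, src, dst):
    """Plain Dijkstra over the undirected toy topology (no ECMP)."""
    adj = {}
    for (u, v), w in weights.items():
        adj.setdefault(u, []).append((v, w))
        adj.setdefault(v, []).append((u, w))
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:
        path.append((prev[node], node))
        node = prev[node]
    return path

def max_utilization(weights):
    load = {frozenset(l): 0.0 for l in links}
    for src, dst, vol in demands:
        for u, v in shortest_path(weights, src, dst):
            load[frozenset((u, v))] += vol
    return max(load.values()) / capacity

weights = {("A", "B"): 1, ("B", "C"): 1, ("A", "D"): 2, ("D", "C"): 2, ("B", "D"): 1}
best = max_utilization(weights)   # 1.6: both demands pile onto link B-C
for _ in range(1000):             # (1+1) evolutionary loop
    mutant = dict(weights)
    mutant[random.choice(links)] = random.randint(1, 5)
    if max_utilization(mutant) <= best:
        weights, best = mutant, max_utilization(mutant)

print(best)  # converges to 0.8: all traffic into C must cross the BC/DC cut
```

An MOEA generalizes this loop with a population and several objectives (e.g. congestion under normal and failure scenarios) evaluated per candidate weight vector.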

Relevance:

20.00%

Publisher:

Abstract:

Decision support models in intensive care units are developed to support the medical staff in their decision-making process. However, the optimization of these models is particularly difficult due to the dynamic, complex and multidisciplinary nature of the environment. Thus, there is constant research into, and development of, new algorithms capable of extracting knowledge from large volumes of data in order to obtain better predictive results than the current algorithms. To test the optimization techniques, a case study with real data provided by the INTCare project was explored. The data concern extubation cases. On this dataset, several models, such as evolutionary fuzzy rule learning, lazy learning, decision trees and many others, were analysed in order to detect early extubation. The hybrid Decision Trees/Genetic Algorithm, Supervised Classifier System and KNN-Adaptive models obtained the most accurate rates, 93.2%, 93.1% and 92.97% respectively, thus showing their feasibility for working in a real environment.
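As a sketch of the decision-tree ingredient of these hybrid models, the snippet below trains a one-level tree (a decision stump) by exhaustive threshold search. The patient records and feature names are invented and far simpler than the real INTCare extubation data:

```python
# (feature vector, label) with hypothetical features
# [respiratory_rate, oxygen_saturation]; label 1 = ready for extubation
data = [
    ([14, 97], 1), ([16, 96], 1), ([18, 95], 1), ([20, 94], 1),
    ([28, 91], 0), ([30, 89], 0), ([26, 92], 0), ([32, 88], 0),
]

def train_stump(data):
    """Pick the (feature, threshold, direction) split with best accuracy."""
    best = (0.0, None, None, True)  # (accuracy, feature, threshold, low_is_pos)
    for f in range(len(data[0][0])):
        for x, _ in data:
            t = x[f]
            for low_is_pos in (True, False):
                preds = [int((xi[f] <= t) == low_is_pos) for xi, _ in data]
                acc = sum(p == y for p, (_, y) in zip(preds, data)) / len(data)
                if acc > best[0]:
                    best = (acc, f, t, low_is_pos)
    return best

acc, feature, threshold, low_is_pos = train_stump(data)
print(acc)  # 1.0: the toy records are separable on either feature
```

A full decision tree applies the same split search recursively to each partition; the genetic-algorithm hybrid in the study additionally evolves the tree configuration.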

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To report the hemodynamic and functional responses obtained with clinical optimization guided by hemodynamic parameters in patients with severe, refractory heart failure. METHODS: Invasive hemodynamic monitoring using right heart catheterization aimed to reach low filling pressures and low peripheral resistance. Frequent adjustments of intravenous diuretics and vasodilators were performed according to the hemodynamic measurements. RESULTS: We assessed 19 patients (age 48±12 years, ejection fraction 21±5%) with severe heart failure. The intravenous use of diuretics and vasodilators reduced pulmonary artery occlusion pressure by 12 mm Hg (a relative reduction of 43%; P<0.001), with a concomitant increment of 6 mL per beat in stroke volume (a relative increment of 24%; P<0.001). We observed significant associations between pulmonary artery occlusion pressure and both mean pulmonary artery pressure (r=0.76; P<0.001) and central venous pressure (r=0.63; P<0.001). After clinical optimization, functional class improved (P<0.001), with a tendency towards improvement in ejection fraction and no impairment of renal function. CONCLUSION: Optimization guided by hemodynamic parameters in patients with refractory heart failure provides a significant improvement in the hemodynamic profile with concomitant improvement in functional class. This study emphasizes that adjustments in blood volume result in immediate benefits for patients with severe heart failure.

Relevance:

20.00%

Publisher:

Abstract:

In our previous project we approximated the computation of a definite integral whose integrand exhibits large functional variations. Our approach parallelizes the computation algorithm of an adaptive quadrature method based on Newton-Cotes rules. The first results obtained were presented at several national and international conferences; they allowed us to begin a typification of the existing quadrature rules and a classification of some functions used as test functions. These classification and typification tasks have not been finished, so we intend to continue them in order to report on whether or not it is advisable to use our technique. To carry out this task, a base of test functions will be assembled and the spectrum of quadrature rules to be used will be widened. In addition, we propose to restructure the computation of some routines involved in computing the minimum energy of a molecule. This program already exists in a sequential version and is modeled using the LCAO approximation. It obtains successful results in terms of precision compared with similar international publications, but it requires a significantly long computation time. Our proposal is to parallelize the aforementioned algorithm at at least two levels: 1) deciding whether it is better to distribute the computation of a single integral among several processors or to distribute different integrals among different processors — we must bear in mind that in parallel architectures based on networks (typically local area networks, LANs), the time spent sending messages between processors is very significant when measured in the number of arithmetic operations a processor can complete; 2) if necessary, parallelizing the computation of double and/or triple integrals.
For the development of our proposal, heuristics will be devised to verify and build models for the cases mentioned, aiming to improve the already known computation routines, and the algorithms will be tested on test cases. The methodology is the usual one in numerical computing. Each proposal requires: a) implementing a computation algorithm, trying to achieve versions that improve on the existing ones; b) carrying out comparison exercises against the existing routines to confirm or discard better numerical performance; c) carrying out theoretical error studies related to the method and to the implementation. An interdisciplinary team was formed, comprising researchers from both Computer Science and Mathematics. Goals: we expect to obtain a characterization of the quadrature rules according to their effectiveness for functions with oscillatory behavior and with exponential decay, and to develop suitable, optimized computational implementations based on parallel architectures.
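The serial building block described above can be sketched as recursive adaptive quadrature with Simpson's rule (a closed Newton-Cotes rule). The two recursive halves are independent of each other, which is exactly what makes the interval-level distribution discussed in point 1 natural:

```python
import math

# Adaptive Simpson quadrature: refine only where the integrand varies
# rapidly, using the standard Richardson-style error estimate.

def simpson(f, a, b):
    return (b - a) / 6 * (f(a) + 4 * f((a + b) / 2) + f(b))

def adaptive(f, a, b, tol=1e-9, whole=None):
    if whole is None:
        whole = simpson(f, a, b)
    m = (a + b) / 2
    left, right = simpson(f, a, m), simpson(f, m, b)
    if abs(left + right - whole) < 15 * tol:       # error estimate small enough
        return left + right + (left + right - whole) / 15
    return (adaptive(f, a, m, tol / 2, left) +     # the two halves are
            adaptive(f, m, b, tol / 2, right))     # independent subproblems

# integral of sin(x) on [0, pi] is exactly 2
print(round(adaptive(math.sin, 0, math.pi), 8))  # -> 2.0
```

In a LAN setting the trade-off in point 1 is then whether to farm out these recursive subintervals of one integral, or to assign whole integrals to different processors and avoid the extra message traffic.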