854 results for Heuristic optimization
Abstract:
The objective of this work was to develop a genetic transformation system for tropical maize genotypes via particle bombardment of immature zygotic embryos. Particle bombardment was carried out using a genetic construct with the bar and uidA genes under the control of the CaMV35S promoter. The best conditions to transform the tropical maize inbred lines L3 and L1345 were obtained when immature embryos were cultivated, prior to bombardment, at higher osmolarity for 4 hours and bombarded at a helium gas acceleration pressure of 1,100 psi, with two shots per plate and a microcarrier flying distance of 6.6 cm. Transformation frequencies obtained under these conditions ranged from 0.9 to 2.31%. Integration of the foreign genes into the genome of the maize plants was confirmed by Southern blot analysis, as well as by bar and uidA gene expression. The maize genetic transformation protocol developed in this work may improve the efficiency of producing new transgenic tropical maize lines expressing desirable agronomic characteristics.
Abstract:
The pharmaceutical industry has faced several challenges in recent years, and optimization of the drug discovery pipeline is widely seen as the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches that rationalize the enormous amount of information they produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available for this purpose, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential for reliably computing the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. This thesis presents a new docking software aimed at this goal, EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic: correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when the five best-ranked clusters are considered, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
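The hybrid evolutionary strategy described above (a cheap fitness function to pre-screen offspring, an expensive one to rank survivors, and a diversity rule to keep the population spread out) can be sketched generically. The toy "pose" space, both scoring functions, and all parameters below are invented for illustration; EADock's actual fitness functions are CHARMM energy terms, and its diversity management is more sophisticated than this distance threshold.

```python
import random

# Toy stand-in for docking: find a 3-D "pose" close to a hidden target.
TARGET = (1.0, -2.0, 0.5)

def cheap_fitness(pose):          # fast pre-screen (e.g. a grid/clash score)
    return -sum(abs(a - b) for a, b in zip(pose, TARGET))

def fine_fitness(pose):           # expensive score (e.g. a force-field energy)
    return -sum((a - b) ** 2 for a, b in zip(pose, TARGET))

def mutate(pose, step=0.3):
    return tuple(a + random.gauss(0, step) for a in pose)

def diverse(survivors, cand, min_dist=0.2):
    """Diversity rule: reject candidates too close to an already kept one."""
    return all(sum((a - b) ** 2 for a, b in zip(cand, s)) ** 0.5 >= min_dist
               for s in survivors)

random.seed(1)
pop = [tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(40)]
for gen in range(60):
    children = [mutate(random.choice(pop)) for _ in range(80)]
    # Stage 1: the cheap fitness pre-screens the offspring.
    children = sorted(children, key=cheap_fitness, reverse=True)[:40]
    # Stage 2: the expensive fitness ranks parents plus screened offspring,
    # and the diversity rule decides who enters the next generation.
    pop, ranked = [], sorted(pop + children, key=fine_fitness, reverse=True)
    for cand in ranked:
        if diverse(pop, cand):
            pop.append(cand)
        if len(pop) == 40:
            break

best = max(pop, key=fine_fitness)
print(best)
```

Because parents compete with offspring, the best pose is never lost, while the diversity threshold prevents the population from collapsing onto a single binding mode.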
Abstract:
Mixture materials, mix design, and pavement construction are not isolated steps in the concrete paving process. Each affects the other in ways that determine overall pavement quality and long-term performance. However, equipment and procedures commonly used to test concrete materials and concrete pavements have not changed in decades, leaving gaps in our ability to understand and control the factors that determine concrete durability. The concrete paving community needs tests that will adequately characterize the materials, predict interactions, and monitor the properties of the concrete. The overall objectives of this study are (1) to evaluate conventional and new methods for testing concrete and concrete materials to prevent material and construction problems that could lead to premature concrete pavement distress and (2) to examine and refine a suite of tests that can accurately evaluate concrete pavement properties. The project included three phases. In Phase I, the research team contacted each of 16 participating states to gather information about concrete and concrete material tests. A preliminary suite of tests to ensure long-term pavement performance was developed. The tests were selected to provide useful and easy-to-interpret results that can be performed reasonably and routinely in terms of time, expertise, training, and cost. The tests examine concrete pavement properties in five focal areas critical to the long life and durability of concrete pavements: (1) workability, (2) strength development, (3) air system, (4) permeability, and (5) shrinkage. The tests were relevant at three stages in the concrete paving process: mix design, preconstruction verification, and construction quality control. In Phase II, the research team conducted field testing in each participating state to evaluate the preliminary suite of tests and demonstrate the testing technologies and procedures using local materials. 
A Mobile Concrete Research Lab was designed and equipped to facilitate the demonstrations. This report documents the results of the 16 state projects. Phase III refined and finalized lab and field tests based on state project test data. The results of the overall project are detailed herein. The final suite of tests is detailed in the accompanying testing guide.
Abstract:
MOTIVATION: The detection of positive selection is widely used to study gene and genome evolution, but its application remains limited by the high computational cost of existing implementations. We present a series of computational optimizations for more efficient estimation of the likelihood function on large-scale phylogenetic problems. We illustrate our approach using the branch-site model of codon evolution. RESULTS: We introduce novel optimization techniques that substantially outperform both CodeML from the PAML package and our previously optimized sequential version SlimCodeML. These techniques can also be applied to other likelihood-based phylogeny software. Our implementation scales well for large numbers of codons and/or species. It can therefore analyse substantially larger datasets than CodeML. We evaluated FastCodeML on different platforms and measured average sequential speedups of FastCodeML (single-threaded) versus CodeML of up to 5.8, average speedups of FastCodeML (multi-threaded) versus CodeML on a single node (shared memory) of up to 36.9 for 12 CPU cores, and average speedups of the distributed FastCodeML versus CodeML of up to 170.9 on eight nodes (96 CPU cores in total). AVAILABILITY AND IMPLEMENTATION: ftp://ftp.vital-it.ch/tools/FastCodeML/. CONTACT: selectome@unil.ch or nicolas.salamin@unil.ch.
Abstract:
Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday problems are of this kind. However, the number of options grows exponentially with the size of the problem, so that an exhaustive search for the best solution is practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach to obtaining an approximate solution is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance therefore depends on structural aspects of the search space, which in turn depend on the move operator used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a neighborhood relation defined by the move operator. The landscape metaphor explains the search dynamics as a sort of potential function; the concept is indeed similar to that of potential energy surfaces in physical chemistry. Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, the energy minima are the local optima of the combinatorial problem, and we explore several definitions for the network edges. At first, we perform an exhaustive sampling of the basins of attraction of local optima, and define weighted transitions between basins by accounting for all the possible ways of crossing the basin frontier via one random move.
Then, we reduce the computational burden by only counting the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectory of simple search heuristics, mining the frequency and inter-arrival time with which the heuristic visits local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape, and that we can characterize using the tools of complex network science. We argue that this network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured by the performance of trajectory-based local search heuristics.
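The kick-move edge definition above can be illustrated on a toy bitstring landscape: hill-climb from random starts to locate local optima, then estimate weighted escape edges between basins with random kick moves started at each optimum. The fitness function and all parameters below are invented for illustration and are not the NK, QAP, or flow-shop instances studied in the work.

```python
import random
from collections import defaultdict

# Toy rugged fitness on n-bit strings: weighted bits plus an interaction bonus
# for adjacent differing bits (a stand-in for an NK-style landscape).
def fitness(s, weights):
    return (sum(w for bit, w in zip(s, weights) if bit)
            + 0.5 * sum(1 for a, b in zip(s, s[1:]) if a != b))

def neighbors(s):
    for i in range(len(s)):
        t = list(s)
        t[i] ^= 1
        yield tuple(t)

def hill_climb(s, weights):
    """Best-improvement local search; returns the local optimum of s's basin."""
    while True:
        best = max(neighbors(s), key=lambda t: fitness(t, weights))
        if fitness(best, weights) <= fitness(s, weights):
            return s
        s = best

def kick(s, k=2):
    """Random k-bit perturbation used to probe escapes from a basin."""
    t = list(s)
    for i in random.sample(range(len(t)), k):
        t[i] ^= 1
    return tuple(t)

random.seed(0)
n = 10
weights = [random.random() for _ in range(n)]

# Sample local optima, then count where kick moves land: the counts are the
# weighted directed edges of the local optima network.
optima, edges = set(), defaultdict(int)
for _ in range(200):
    o = hill_climb(tuple(random.randint(0, 1) for _ in range(n)), weights)
    optima.add(o)
    for _ in range(20):
        edges[(o, hill_climb(kick(o), weights))] += 1
```

The resulting `edges` dictionary is exactly a weighted directed graph over local optima, ready for complex-network metrics (out-degree, self-loop weight, and so on).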
Abstract:
This article presents an optimization methodology for batch production processes with shared resources, which relies on a mapping of state events into time events, thereby allowing the straightforward use of well-consolidated scheduling policies developed for manufacturing systems. A technique to generate a timed Petri net representation from a continuous dynamic representation (systems of differential-algebraic equations, DAEs) of the production system is presented, together with the main characteristics of a Petri-net-based tool implemented for optimization purposes. The paper also describes how the implemented tool generates the coverability tree and how the tree can be pruned by a general-purpose heuristic. An example of a distillation process with two shared batch resources illustrates the proposed optimization methodology.
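The tree-generation-with-pruning idea can be sketched as a breadth-first reachability exploration of a Petri net that a heuristic trims at each level, beam-search style. The tiny three-place net, the scoring function, and the beam width below are hypothetical illustrations, not the paper's tool or its pruning heuristic.

```python
from heapq import nlargest

# Hypothetical net: tokens flow p1 -> p2 -> p3 through transitions t1, t2.
# Each transition is a (preconditions, postconditions) pair of token counts.
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def explore(initial, score, beam=4, depth=5):
    """Level-by-level reachability tree, pruned to the `beam` best markings
    per level according to the heuristic `score`."""
    level, seen = [initial], [initial]
    for _ in range(depth):
        nxt = [fire(m, pre, post)
               for m in level
               for pre, post in transitions.values()
               if enabled(m, pre)]
        level = nlargest(beam, nxt, key=score)   # heuristic pruning step
        seen.extend(level)
    return seen

# Heuristic: prefer markings that have delivered more tokens to p3.
states = explore({"p1": 2}, score=lambda m: m.get("p3", 0))
```

With a small beam the tree stays tractable while the heuristic steers exploration toward the markings of interest, here those completing both batches.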
Abstract:
The General Assembly Line Balancing Problem with Setups (GALBPS) was recently defined in the literature. It adds sequence-dependent setup time considerations to the classical Simple Assembly Line Balancing Problem (SALBP) as follows: whenever a task is assigned next to another at the same workstation, a setup time must be added to compute the global workstation time, so that the task sequence inside each workstation matters. This paper proposes over 50 priority-rule-based heuristic procedures to solve GALBPS, many of which improve upon the heuristic procedures published to date.
Abstract:
The objective of this work was to evaluate 41 microsatellite markers for heterologous amplification in piracanjuba (Brycon orbignyanus). Some markers were tested for the first time. Loci were optimized for PCR conditions and applied to a sample of 49 individuals. Thirty-one loci resulted in PCR product formation, and ten of these yielded intelligible polymorphic patterns in the evaluated sample and can be used for amplification in this species. Of the evaluated markers, four loci (BoM1, BoM13, Bh6, and Bh16) are suitable for application in studies of piracanjuba.
Abstract:
The purpose of this Master's thesis was to improve the brightness development of the recycled pulp produced in the deinking process at Stora Enso Sachsen and to study the factors affecting it. The literature part covered the repulping of recovered paper and the flotation deinking process, as well as the properties and use of recovered paper as a raw material for the paper industry. The experimental part focused on optimizing the dosage of modified sodium silicate and on its effects under laboratory and process conditions, as well as on studying the influence of the summer effect in repulping and in the different stages of flotation. In the laboratory study of sodium silicate, the highest brightness with the relatively smallest laboratory flotation loss was achieved with the highest tested sodium silicate dosage, 1.1%. A high sodium silicate dosage combined with a high hydrogen peroxide dosage of 0.5% and a high total alkalinity of 0.33% led to the highest pulp brightness and the smallest losses. Based on the laboratory study, trial runs with the modified sodium silicate were carried out in the process. At a sodium silicate dosage of about 1%, a better pH buffering capacity, a smaller amount of calcium carbonate in the primary stages of flotation, and a slightly better pulp brightness were observed compared with the standard sodium silicate previously used in the process. In the summer-effect study, the summer effect was found to have the greatest influence on the primary stage of pre-flotation, since the proportion of fibres in the primary stage is considerably higher than in the secondary stages. The difference between summer and winter in the maximum brightness achieved by laboratory flotation of the primary-stage pre-flotation pulps was about 1.5% ISO. The summer effect was not found to have a large influence on the secondary stages of flotation.
Abstract:
Pumping is estimated to offer considerable potential, both technically and economically, for saving energy. Globally, pumping consumes nearly 22% of the energy demand of electric motors. In certain industries, even more than 50% of the electrical energy used by motors may go to pumping. In wastewater pumping, pump operation is typically based on on-off control, so that when the pump is on it runs at full power. In many cases the pumps are also oversized. Together, these factors lead to increased energy consumption. The theoretical part of the thesis presents the basics of wastewater management and wastewater treatment, as well as the main components of a pumping system: the pump, the piping, the motor, and the frequency converter. The empirical part presents a calculation tool developed during the work for estimating the energy-saving potential of wastewater pumping systems. The tool can calculate the energy-saving potential achieved when pump output is controlled by adjusting the rotational speed with a frequency converter instead of using on-off control. The tool reports the optimal pump rotational speed and the specific energy consumption. Based on the tool, three municipal wastewater pumping stations were studied. Laboratory tests were also carried out to simulate the tool and to assess the energy-saving potential. The studies show that there is considerable potential for saving energy in wastewater pumping by reducing the pump rotational speed. When the geodetic head is small, energy savings of up to 50% are possible, and in the long term the savings can be significant. The results also confirm the need to optimize the operation of wastewater pumping systems.
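The kind of estimate such a calculation tool produces can be sketched with the pump affinity laws and a static-head system curve: at lower speed, the operating head drops toward the geodetic head, so the specific energy (kWh per pumped cubic metre) falls. All curve coefficients and the fixed efficiency below are made-up example values, not the thesis's data or its actual calculator.

```python
# Illustrative sketch: specific energy of a pump at reduced rotational speed.
H_STATIC = 2.0        # geodetic (static) head, m
K_PIPE = 0.05         # friction coefficient of the system curve, m/(l/s)^2
A, B = 20.0, 0.08     # pump curve at nominal speed: H(Q) = A - B*Q^2

def operating_point(n):
    """Flow (l/s) where the affinity-scaled pump curve meets the system curve:
    A*n^2 - B*q^2 = H_STATIC + K_PIPE*q^2 (valid while A*n^2 > H_STATIC)."""
    return ((A * n**2 - H_STATIC) / (B + K_PIPE)) ** 0.5

def specific_energy(n, efficiency=0.6):
    """Energy per pumped volume, kWh/m^3: rho*g*H / efficiency, with a
    simplifying assumption of constant overall efficiency."""
    q = operating_point(n)
    head = H_STATIC + K_PIPE * q**2
    return 9.81 * head / (3600 * efficiency)

for n in (1.0, 0.8, 0.6):
    print(f"relative speed {n:.0%}: Es = {specific_energy(n):.4f} kWh/m3")
```

Because the specific energy is proportional to the operating head, slowing the pump pays off most when the geodetic head is a small fraction of the total head, which matches the thesis's finding.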
Abstract:
In this thesis, the cleaning of ceramic filter media was studied. Mechanisms of fouling and of dissolution of iron compounds, as well as methods for cleaning ceramic membranes fouled by iron deposits, were reviewed in the literature part. Cleaning agents and different methods were examined more closely in the experimental part of the thesis. Pyrite is found in geologic strata. It is oxidized to form ferrous ions, Fe(II), and ferric ions, Fe(III). Fe(III) further undergoes hydrolysis to form ferric hydroxide. Hematite and goethite, for instance, are naturally occurring iron oxides and hydroxides. In contact with filter media, they can cause severe fouling that common cleaning techniques are not able to remove. Mechanisms for the dissolution of iron oxides include the ligand-promoted pathway and the proton-promoted pathway. The dissolution can also be reductive or non-reductive. The most efficient mechanism is the ligand-promoted reductive mechanism, which comprises two stages: the induction period and the autocatalytic dissolution. Reducing agents (such as hydroquinone and hydroxylamine hydrochloride), chelating agents (such as EDTA), and organic acids are used for the removal of iron compounds. Oxalic acid is the most effective known cleaning agent for iron deposits. Since formulations are often more effective than organic acids, reducing agents, or chelating agents alone, the citrate-bicarbonate-dithionite system, among others, is well studied in the literature. The cleaning is also enhanced with ultrasound and backpulsing. In the experimental part, oxalic acid and nitric acid were studied alone and in combination. Citric acid and ascorbic acid, among other chemicals, were also tested. Soaking experiments, experiments with ultrasound, and experiments with alternative methods of applying the cleaning solution to the filter samples were carried out. Permeability and ISO brightness measurements were performed to examine the influence of the cleaning methods on the samples.
Inductively coupled plasma optical emission spectroscopy (ICP-OES) analysis of the solutions was carried out to determine the dissolved metals.
Abstract:
The present work aimed at maximizing the number of plantlets obtained by the micropropagation of pineapple (Ananas comosus (L.) Merrill) cv. Pérola. Changes in benzylaminopurine (BAP) concentration, type of medium (liquid or solidified), and type of explant in the proliferation phase were evaluated. Slips were used as the explant source; the explants consisted of axillary buds obtained after careful excision of the leaves. Sterilization was done in the hood with ethanol (70%) for three minutes, followed by calcium hypochlorite (2%) for fifteen minutes, and three washes in sterile water. The explants were introduced in MS medium supplemented with 2 mg L-1 BAP and maintained in a growth room at a 16-h photoperiod (40 µmol m-2 s-1) and 27 ± 2ºC. After eight weeks, cultures were subcultured for multiplication in MS medium. The following treatments were tested: liquid vs. solidified medium with different BAP concentrations (0.0, 1.5, or 3.0 mg L-1), with or without a longitudinal cut of the shoot bud used as explant. The results showed that liquid medium supplemented with BAP at 1.5 mg L-1, associated with the longitudinal sectioning of the shoot bud used as explant, gave the best results, maximizing shoot proliferation. On average, the best treatment would allow an estimated production of 161,080 plantlets by the micropropagation of the axillary buds of one plant with eight slips and ten buds per slip, within a period of eight months.
Abstract:
The study focuses on international diversification from the perspective of a Finnish investor. Its second objective is to examine whether new covariance matrix estimators make the optimization of the minimum-variance portfolio more efficient. In addition to the ordinary sample covariance matrix, two shrinkage estimators and a flexible multivariate GARCH(1,1) model are used in the optimization. The data consist of Dow Jones industry indices and the OMX-H portfolio index. The international diversification strategy is implemented using an industry approach, and the portfolio is optimized using twelve components. The data cover the years 1996-2005, i.e. 120 monthly observations. The performance of the constructed portfolios is measured with the Sharpe ratio. According to the results, there is no statistically significant difference between the risk-adjusted returns of the internationally diversified investments and the domestic portfolio. Nor does the use of the new covariance matrix estimators add statistically significant value compared with portfolio optimization based on the sample covariance matrix.
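A minimum-variance optimization with a shrunk covariance matrix, of the kind compared in the study, can be sketched as follows. The synthetic return data and the fixed shrinkage intensity are illustrative assumptions: Ledoit-Wolf-style estimators choose the intensity from the data, and the study's flexible multivariate GARCH(1,1) estimator is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.05, size=(120, 12))   # 120 months, 12 components (synthetic)

def shrunk_cov(x, delta=0.3):
    """Sample covariance shrunk toward a scaled-identity target.
    `delta` is a fixed shrinkage intensity chosen for illustration."""
    s = np.cov(x, rowvar=False)
    target = np.eye(s.shape[0]) * np.trace(s) / s.shape[0]
    return (1 - delta) * s + delta * target

def min_variance_weights(cov):
    """Closed-form minimum-variance portfolio: w = S^-1 1 / (1' S^-1 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

cov = shrunk_cov(returns)
w = min_variance_weights(cov)
print(w.round(3))
```

Shrinkage pulls the extreme entries of the noisy sample covariance toward a structured target, which typically stabilizes the optimized weights; the study's empirical question is whether this stabilization translates into significantly better risk-adjusted returns.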