941 results for Biofertilizer and optimization


Relevance:

80.00%

Publisher:

Abstract:

During the research activity, three projects related to the development and optimization of composite materials were carried out. In the first year, ultra-refractory ceramic materials toughened with silicon carbide fibers were produced; the production cycle was improved and an optimized material was obtained. The second year focused on the development of epoxy resins reinforced with fluorinated elastomer particles, a new material not yet on the market and useful for mechanical and naval applications. The final year of research was carried out at the materials laboratory of Ansaldo Energia, where the behavior of materials for gas turbines was studied.

Relevance:

80.00%

Publisher:

Abstract:

This study, carried out in collaboration with Hera, is an analysis of waste management in Bologna. The research was conducted on several levels: a strategic level aimed at identifying new waste collection methods based on the characteristics of the city's territory; an analytical level concerning the improvement of the supporting software applications; and an environmental level concerning the calculation of atmospheric emissions from the vehicles used for waste collection and transport. First, it was necessary to study Bologna and the current state of its waste collection services. It is by combining these components that changes in the waste management sector have been made over the last three years. The following chapters concern the software applications supporting these activities: Siget and Optit. Siget is the service management program currently used for all activities connected to waste collection. It consists of several modules, but handles data management only. The experimentation with Optit added to this data management the ability to display the data on maps and to apply a routing algorithm. The data stored in Siget served as the starting point, the input, with the goal of reaching all collection points. The last chapter concerns the study of the environmental impact of these waste collection routes. This analysis, based on empirical evaluation and on an Excel implementation of the Corinair formulas, provides a snapshot of the service in 2010. Here Optit provided its added value by also implementing the emission calculation formulas within its algorithm.
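The abstract mentions an Excel implementation of the Corinair emission formulas for the collection vehicles. At its simplest (Tier-1) level, such an estimate multiplies fuel consumed by a per-pollutant emission factor. A minimal sketch of that idea; the truck consumption figure and the emission factors below are illustrative placeholders, not values from the study.

```python
# Minimal sketch of a Tier-1 style emission estimate in the spirit of the
# EMEP/CORINAIR methodology: emissions = fuel consumed x emission factor.
# The vehicle data and emission factors below are illustrative, not real values.

def route_emissions(distance_km, fuel_l_per_100km, emission_factors_g_per_l):
    """Estimate per-pollutant emissions (grams) for one collection route."""
    fuel_l = distance_km * fuel_l_per_100km / 100.0
    return {pollutant: fuel_l * ef
            for pollutant, ef in emission_factors_g_per_l.items()}

# Hypothetical refuse-truck figures for illustration only.
factors = {"CO2": 2640.0, "NOx": 12.0, "PM": 0.8}   # g per litre of diesel
print(route_emissions(35.0, 45.0, factors))
```

In the study, Optit embedded formulas of this kind directly in its routing algorithm, so the emission estimate is computed per route rather than per fleet.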

Relevance:

80.00%

Publisher:

Abstract:

This study led to the development of a new synthesis process to obtain "nano delivery" systems such as aquasomes, suitable for enhancing the affinity between dyes and human hair in cosmetic formulations. These systems are based on silver nanoparticles stabilized by different kinds of polymers, such as PVP or celluloses. The research was conducted in two steps: the first involved the study and optimization of the synthesis conditions of the nano delivery system, such as concentrations, pH, and temperature. The second concerned the preparation of a stable, low-hazard formulation with antibacterial and antifungal properties, containing the aquasome and a colorant already used in cosmetics (i.e. Basic Red 51) for hair dyeing applications.

Relevance:

80.00%

Publisher:

Abstract:

Enhancing the sensitivity of nuclear magnetic resonance measurements via hyperpolarization techniques such as parahydrogen induced polarization (PHIP) is of high interest for spectroscopic investigations. Parahydrogen induced polarization is a chemical method that makes use of the correlation between nuclear spins in parahydrogen to create hyperpolarized molecules. The key feature of this technique is the pairwise and simultaneous transfer of the two hydrogen atoms of parahydrogen to a double or triple bond, resulting in a population of the Zeeman energy levels that differs from the Boltzmann distribution. The obtained hyperpolarization results in antiphase peaks of high intensity in the NMR spectrum. Due to these strong NMR signals, this method finds a lot of applications in chemistry, e.g. the characterization of short-lived reaction intermediates. In medicine it also opens up the possibility to boost the sensitivity of medical diagnostics via magnetic labeling of active contrast agents. Thus, further examination and optimization of the PHIP technique is of significant importance in order to achieve the highest possible sensitivity gain.

In this work, different aspects concerning PHIP were studied with respect to its chemical and spectroscopic background. The first part of this work mainly focused on optimizing the PHIP technique by investigating different catalyst systems and developing new setups for the parahydrogenation. Further examinations facilitated the transfer of the generated polarization from the protons to heteronuclei like 13C. The second part of this thesis examined the possibility of transferring these results to different biologically active compounds to enable their later application in medical diagnostics. One group of interesting substances is represented by metabolites or neurotransmitters in mammalian cells. Other interesting substances are clinically relevant drugs such as a barbituric acid derivative or antidepressants like citalopram, which were investigated with regard to their applicability to the PHIP technique and the possibility of achieving polarization transfer to 13C nuclei. The last investigated substrate is a polymerizable monomer whose polymer was used as a blood plasma expander for trauma victims after the first half of the 20th century. In this case, the utility of the monomer for the PHIP technique was examined as a basis for later investigations of a polymerization reaction using hyperpolarized monomers.

Hence, this thesis covers the optimization of the PHIP technology, combining different fields of research such as chemical and spectroscopic aspects, and transfers the results to applications with real biologically active compounds.
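The abstract contrasts hyperpolarized level populations with the thermal Boltzmann distribution. The standard textbook formula P = tanh(γħB₀ / (2kᴮT)) shows how small thermal proton polarization is, and hence why PHIP's sensitivity gain matters. A short sketch using standard physical constants (field and temperature values chosen only for illustration):

```python
import math

# Thermal (Boltzmann) polarization of protons: P = tanh(gamma*hbar*B0 / (2*kB*T)).
# Illustrates why hyperpolarization methods such as PHIP are attractive: at 7 T
# and room temperature only about 1 in 40,000 spins contributes to the net signal.

GAMMA_H = 2.675e8      # proton gyromagnetic ratio, rad s^-1 T^-1
HBAR = 1.0546e-34      # reduced Planck constant, J s
KB = 1.381e-23         # Boltzmann constant, J K^-1

def thermal_polarization(b0_tesla, temp_kelvin):
    return math.tanh(GAMMA_H * HBAR * b0_tesla / (2 * KB * temp_kelvin))

p = thermal_polarization(7.0, 298.0)
print(f"Thermal proton polarization at 7 T, 298 K: {p:.2e}")  # ~2.4e-05
```

PHIP prepares spin order far above this thermal baseline, which is the origin of the "strong NMR signals" the abstract describes.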

Relevance:

80.00%

Publisher:

Abstract:

In this work, a detailed investigation and characterization of the two-photon-induced fluorescence enhancement of organic dyes on plasmonic nanostructures is presented. This fluorescence enhancement is of great importance in particular for high-resolution fluorescence microscopy and single-molecule spectroscopy. Two-photon excitation confines the absorption process to the focal volume. In combination with the electric near field of the nanostructures as the excitation source, the excitation volume is reduced even further, to a size below the diffraction limit. This allows the selective measurement of chosen dyes. Fabricating the nanoparticles by colloidal lithography yields a defined, reproducible geometry. Polymer multilayers serve as spacers to position the dyes at an exact distance from the metal. Covalent attachment of the dye to the topmost layer gives a uniform distribution of the dye at low concentration.

A 30-fold fluorescence enhancement is detected for dyes on gold ellipses, compared with dyes outside the near field. Crescent-shaped nanostructures produce an enhancement of 120. This demonstrates that the extent of the fluorescence enhancement depends decisively on the strength of the electric near field of the nanostructure. The material of the nanostructure also matters: silver ellipses produce a 1.5-fold higher fluorescence enhancement than identical gold ellipses. Distance-dependent fluorescence measurements show that the two-photon-excited fluorescence enhancement is maximized at structure-specific distances from the metal. Elliptical structures show a maximum at a distance of 8 nm from the metal, whereas for crescent-shaped nanostructures the highest fluorescence intensity is measured at 12 nm. At smaller distances the dye is subject to a strong quenching process, which competes with the enhancement process and results in a low net enhancement. If the investigated structure has dimensions larger than the resolving power of the microscope, direct visualization of the electric near field of the nanostructure is possible.

A further focus of this work was the fabrication of novel nanostructures by colloidal lithography methods. Stacked dimers of crescent-shaped nanostructures with exact vertical alignment and a separation distance of about 10 nm were fabricated. The spatial proximity of the two structures leads to a coupling process that gives rise to new optical resonances. These can be described as superpositions of the plasmon modes of the individual crescents. A hybridization model is applied to explain the spectral differences. Computer simulations confirm the underlying theory and extend the model by resonances not resolved experimentally.

Furthermore, a new fabrication process for crescent-shaped nanostructures is presented that enables precise shape tuning. This allows the position of the plasmon resonance to be adjusted exactly. Correlations of the geometric data with the resonance wavelengths contribute to the fundamental understanding of the plasmon resonances. The presented results were verified by computer simulations. The fabrication process also allows the production of dimers of crescent-shaped nanostructures in one plane. Due to the spatial proximity, the electric near fields overlap, leading to coupling-induced shifts of the plasmon resonances. The difference from theoretically calculated uncoupled nanocrescents can likewise be explained for the opposing crescent-shaped nanostructures with the help of the plasmon hybridization model.

Relevance:

80.00%

Publisher:

Abstract:

The mixing of nanoparticles with polymers to form composite materials has been practiced for decades. Such composites combine the advantages of polymers (e.g., elasticity, transparency, or dielectric properties) and of inorganic nanoparticles (e.g., specific absorption of light, magnetoresistance effects, chemical activity, and catalysis). Nanocomposites exhibit several new characteristics that single-phase materials do not have. Filling the polymeric matrix with an inorganic material requires its homogeneous distribution in order to achieve the highest possible synergetic effect. To fulfill this requirement, the incompatibility between the filler and the matrix, originating from their opposite polarity, has to be resolved. A very important parameter here is the strength and irreversibility of the adsorption of the surface-active compound on the inorganic material. In this work, isothermal titration calorimetry (ITC) was applied as a method to quantify and investigate the adsorption process and binding efficiencies in organic-inorganic hybrid systems by determining the thermodynamic parameters (ΔH, ΔS, ΔG, KB, as well as the stoichiometry n). These values provide quantification and a detailed understanding of the adsorption process of surface-active molecules onto inorganic particles. In this way, a direct correlation between the adsorption strength and the structure of the surface-active compounds can be achieved. Above all, knowledge of the adsorption mechanism in combination with the structure should bring a more rational design to the largely empirically based production and optimization of nanocomposites.
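ITC measures the binding constant KB and the enthalpy ΔH directly; the remaining parameters listed above follow from the standard relations ΔG = −RT ln K and ΔG = ΔH − TΔS. A small sketch of that arithmetic; the K and ΔH values are illustrative, not results from this work.

```python
import math

# ITC yields the binding constant K and enthalpy dH directly; dG and dS then
# follow from dG = -R*T*ln(K) and dG = dH - T*dS. Input values are illustrative.

R = 8.314  # gas constant, J mol^-1 K^-1

def binding_thermodynamics(k_binding, dh_joule_per_mol, temp_kelvin=298.15):
    dg = -R * temp_kelvin * math.log(k_binding)     # Gibbs free energy of binding
    ds = (dh_joule_per_mol - dg) / temp_kelvin      # entropy change of binding
    return dg, ds

dg, ds = binding_thermodynamics(1.0e5, -40_000.0)
print(f"dG = {dg/1000:.1f} kJ/mol, dS = {ds:.1f} J/(mol K)")
```

A negative ΔS, as in this hypothetical example, signals an enthalpy-driven adsorption, which is the kind of mechanistic distinction the ITC analysis in this work aims at.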

Relevance:

80.00%

Publisher:

Abstract:

Recent research has shown that a single, arbitrarily efficient algorithm can be significantly outperformed by a portfolio of (possibly on-average slower) algorithms. Within the Constraint Programming (CP) context, a portfolio solver can be seen as a particular constraint solver that exploits the synergy between the constituent solvers of its portfolio to predict which is (or which are) the best solver(s) to run on a new, unseen instance. In this thesis we examine the benefits of portfolio solvers in CP. Although portfolio approaches have been extensively studied for Boolean Satisfiability (SAT) problems, in the more general CP field these techniques have been only marginally studied and used. We conducted this work through the investigation, analysis, and construction of several portfolio approaches for solving both satisfaction and optimization problems. We focused in particular on sequential approaches, i.e., single-threaded portfolio solvers always running on the same core. We started from a first empirical evaluation of portfolio approaches for solving Constraint Satisfaction Problems (CSPs), and then improved on it by introducing new data, solvers, features, algorithms, and tools. Afterwards, we addressed the more general Constraint Optimization Problems (COPs) by implementing and testing a number of models for dealing with COP portfolio solvers. Finally, we came full circle by developing sunny-cp: a sequential CP portfolio solver that turned out to be competitive also in the MiniZinc Challenge, the reference competition for CP solvers.
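The core prediction step of a portfolio solver can be illustrated with a k-nearest-neighbours scheme: describe each instance by a feature vector, find the training instances most similar to the new one, and run the solver that did best on them. This is a deliberately simplified sketch in the spirit of approaches such as sunny-cp, not its actual algorithm; the solver names and feature data are hypothetical.

```python
# A minimal k-NN style algorithm-selection sketch in the spirit of CP portfolio
# solvers: pick the solver that performed best on the k training instances most
# similar to the new one. Solver names and feature vectors are hypothetical.

import math
from collections import Counter

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_solver(new_features, training, k=3):
    """training: list of (feature_vector, best_solver) pairs."""
    nearest = sorted(training, key=lambda t: euclidean(t[0], new_features))[:k]
    votes = Counter(best for _, best in nearest)
    return votes.most_common(1)[0][0]

training = [([0.1, 0.9], "gecode"), ([0.2, 0.8], "gecode"),
            ([0.9, 0.1], "chuffed"), ([0.8, 0.2], "chuffed")]
print(select_solver([0.15, 0.85], training))  # -> gecode
```

Real portfolio solvers add much more on top of this (solver schedules, time allocation, backup solvers), but the similarity-based selection idea is the common starting point.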

Relevance:

80.00%

Publisher:

Abstract:

The main subject of this thesis is the numerical-experimental characterization of sand casting processes for spheroidal (ductile) cast iron. An in-depth literature survey was first carried out to fully understand the influence of the melting process parameters (chemical composition, melt treatment, cooling rate) on the microstructural and mechanical properties of the resulting castings, and to assess the state of the art of numerical tools for simulating solidification dynamics and predicting microstructures. Experimental casting equipment was designed, built, and used for alloy characterization, aimed at measuring and differentiating the process conditions, in particular the cooling rates, and at validating numerical simulation tools and predictive models. In addition, several systems were designed and used for acquiring and analyzing temperatures inside castings, including large ones. Metallographic analysis of material samples obtained under differentiated conditions confirmed the effect of the considered process parameters on microstructural properties such as graphite nodule size and ferrite and pearlite content. In large castings, a strong influence of macrosegregation and melt convection phenomena on the microstructures and defects of the castings was also found. The activities focused mainly on FEM numerical simulation of the casting processes studied and on the use of empirical-analytical models for microstructure prediction. The measured process temperatures and microstructures were used to validate and optimize the predictive numerical tools over a wide range of process conditions. The use of reliable casting process simulation tools, through the implementation of experimental microstructure-mechanical property correlations, allows the evaluation of casting properties and defects, providing valuable help in optimizing the finished product and the related production process.

Relevance:

80.00%

Publisher:

Abstract:

When designing metaheuristic optimization methods, there is a trade-off between application range and effectiveness. For large real-world instances of combinatorial optimization problems, out-of-the-box metaheuristics often fail, and optimization methods need to be adapted to the problem at hand. Knowledge about the structure of high-quality solutions can be exploited by introducing a so-called bias into one of the components of the metaheuristic used. These problem-specific adaptations increase search performance. This thesis analyzes the characteristics of high-quality solutions for three constrained spanning tree problems: the optimal communication spanning tree problem, the quadratic minimum spanning tree problem, and the bounded diameter minimum spanning tree problem. Several relevant tree properties that should be explored when analyzing a constrained spanning tree problem are identified. Based on the insights gained into the structure of high-quality solutions, efficient and robust solution approaches are designed for each of the three problems. Experimental studies analyze the performance of the developed approaches compared to the current state of the art.
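One of the tree properties central to the third problem above, the bounded diameter MST, is the diameter itself: the longest path (in number of edges) between any two nodes of the tree. It can be computed with the classic double-BFS trick. A minimal sketch; the example tree is hypothetical.

```python
# Computing the (unweighted) diameter of a tree with two BFS passes: the node
# farthest from any start node is an endpoint of a longest path, and the node
# farthest from it gives the diameter. The example tree is hypothetical.

from collections import deque

def farthest(adj, start):
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    node = max(dist, key=dist.get)
    return node, dist[node]

def tree_diameter(adj):
    u, _ = farthest(adj, next(iter(adj)))   # farthest node from arbitrary start
    _, d = farthest(adj, u)                 # farthest node from u: diameter length
    return d

# Path 0-1-2-3 with a leaf 4 hanging off node 1.
adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
print(tree_diameter(adj))  # -> 3
```

Checking a property like this cheaply during search is what makes structure-biased metaheuristics for constrained spanning tree problems practical.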

Relevance:

80.00%

Publisher:

Abstract:

Cancer is one of the principal causes of death in the world; almost 8.2 million deaths were counted in 2012. Emerging evidence indicates that most tumors have an increased glycolytic rate and reduced oxidative phosphorylation to support abnormal cell proliferation; this phenomenon is known as aerobic glycolysis or the Warburg effect. This switch toward glycolysis implies that cancer tissues metabolize approximately tenfold more glucose to lactate in a given time, and the amount of lactate released from cancer tissues is much greater than from normal ones. In view of these fundamental discoveries, alterations of cellular metabolism should be considered a crucial hallmark of cancer. Therefore, the investigation of the metabolic differences between normal and transformed cells is important in cancer research and may find clinical applications. The aim of the project was to investigate cellular metabolic alterations at the single-cell level, by monitoring glucose and lactate, in order to provide better insight for cancer research. For this purpose, electrochemical techniques were applied. Enzyme-based electrode biosensors for lactate and glucose were optimized ad hoc within the project and used as probes for Scanning Electrochemical Microscopy (SECM). The UME biosensor manufacturing and optimization represented a substantial part of the work, and a full description of the sensor preparation protocols and of the characterization methods employed is reported. This set-up (SECM used with microbiosensor probes) enabled the non-invasive study of cellular metabolism at the single-cell level. Knowledge of cancer cell metabolism is required to design more efficient treatment strategies.

Relevance:

80.00%

Publisher:

Abstract:

This work describes the extension and optimization of a diode laser system for high-resolution resonance ionization mass spectrometry. A dual-interferometric frequency control system, which provides absolute stabilization to about 1 MHz as well as frequency detuning by several GHz within seconds for up to three lasers in parallel, was optimized. This laser system serves two main applications. One aspect was extensive spectroscopic investigations of uranium isotopes, with the goal of precisely and unambiguously determining energy levels, total angular momenta, hyperfine constants, and isotope shifts, as well as developing an efficient excitation scheme that can be operated with commercial diode lasers. With these findings, the performance of the laser mass spectrometer was optimized and characterized for ultra-trace analysis of the isotope 236U, which is used as a neutron dosimeter and as a tracer for anthropogenic radioactive contamination in the environment. Using synthetic samples, an isotope selectivity of 236U/238U = 4.5(1.5)·10⁻⁹ was demonstrated.

Relevance:

80.00%

Publisher:

Abstract:

A variety of conformationally constrained aspartate and glutamate analogues inhibit the glutamate transporter 1 (GLT-1, also known as EAAT2). To expand the search for such analogues, a virtual library of aliphatic aspartate and glutamate analogues was generated starting from the chemical universe database GDB-11, which contains 26.4 million possible molecules of up to 11 atoms of C, N, O, F, resulting in 101,026 aspartate analogues and 151,285 glutamate analogues. Virtual screening was realized by high-throughput docking to the glutamate binding site of the glutamate transporter homologue from Pyrococcus horikoshii (PDB code: 1XFH) using AutoDock. Norbornane-type aspartate analogues were selected from the top-scoring virtual hits and synthesized. Testing and optimization led to the identification of (1R*,2R*,3S*,4R*,6R*)-2-amino-6-phenethyl-bicyclo[2.2.1]heptane-2,3-dicarboxylic acid as a new inhibitor of GLT-1 with IC50 = 1.4 µM against GLT-1 and no inhibition of the related transporter EAAC1. The systematic diversification of known ligands by enumeration with the help of GDB, followed by virtual screening, synthesis, and testing as exemplified here, provides a general strategy for drug discovery.

Relevance:

80.00%

Publisher:

Abstract:

A central design challenge facing network planners is how to select a cost-effective network configuration that can provide uninterrupted service despite edge failures. In this paper, we study the Survivable Network Design (SND) problem, a core model underlying the design of such resilient networks that incorporates complex cost and connectivity trade-offs. Given an undirected graph with specified edge costs and (integer) connectivity requirements between pairs of nodes, the SND problem seeks the minimum cost set of edges that interconnects each node pair with at least as many edge-disjoint paths as the connectivity requirement of the nodes. We develop a hierarchical approach for solving the problem that integrates ideas from decomposition, tabu search, randomization, and optimization. The approach decomposes the SND problem into two subproblems, Backbone design and Access design, and uses an iterative multi-stage method for solving the SND problem in a hierarchical fashion. Since both subproblems are NP-hard, we develop effective optimization-based tabu search strategies that balance intensification and diversification to identify near-optimal solutions. To initiate this method, we develop two heuristic procedures that can yield good starting points. We test the combined approach on large-scale SND instances, and empirically assess the quality of the solutions vis-à-vis optimal values or lower bounds. On average, our hierarchical solution approach generates solutions within 2.7% of optimality even for very large problems (that cannot be solved using exact methods), and our results demonstrate that the performance of the method is robust for a variety of problems with different size and connectivity characteristics.
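The connectivity requirement at the heart of the SND problem, "at least r edge-disjoint paths between a node pair", can be verified via Menger's theorem: the number of edge-disjoint u-v paths equals the maximum flow between u and v when every edge has unit capacity. A self-contained sketch of that check using BFS augmenting paths (not the hierarchical method of the paper); the example graph is illustrative.

```python
# By Menger's theorem, the number of edge-disjoint u-v paths equals the max
# flow between u and v with unit edge capacities. A sketch for verifying an
# SND connectivity requirement; the example graph is illustrative.

from collections import deque

def edge_disjoint_paths(n, edges, s, t):
    cap = [[0] * n for _ in range(n)]
    for u, v in edges:               # undirected edge: unit capacity each way
        cap[u][v] += 1
        cap[v][u] += 1
    flow = 0
    while True:
        parent = [-1] * n            # BFS for an augmenting path in the residual graph
        parent[s] = s
        queue = deque([s])
        while queue and parent[t] == -1:
            u = queue.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    queue.append(v)
        if parent[t] == -1:
            return flow              # no more augmenting paths: flow is maximum
        v = t                        # push one unit of flow along the path found
        while v != s:
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1

# A 4-node ring plus one chord: nodes 0 and 2 are joined by 3 disjoint paths.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(edge_disjoint_paths(4, edges, 0, 2))  # -> 3
```

A feasibility check like this is what any SND heuristic, including the tabu search strategies above, must run to confirm that a candidate edge set satisfies the connectivity requirements.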

Relevance:

80.00%

Publisher:

Abstract:

Diagnostic imaging is an essential component of patient selection and treatment planning in oral rehabilitation by means of osseointegrated implants. In 2002, the EAO produced and published guidelines on the use of diagnostic imaging in implant dentistry. Since that time, there have been significant developments both in the application of cone beam computed tomography and in the range of surgical and prosthetic applications that can potentially benefit from its use. However, medical exposure to ionizing radiation must always be justified and result in a net benefit to the patient. The "as low as reasonably achievable" (ALARA) principle must also be applied, taking into account any alternative techniques that might achieve the same objectives. This paper reports on current EAO recommendations arising from a consensus meeting held at the Medical University of Warsaw (2011) to update these guidelines. Radiological considerations are detailed, including justification and optimization, with special emphasis on the obligations that arise for those who prescribe or undertake such investigations. The paper pays special attention to clinical indications and radiographic diagnostic considerations as well as to future developments and trends.

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: To generate anatomical data on the human middle ear and adjacent structures to serve as a basis for the development and optimization of new implantable hearing aid transducers. Implantable middle ear hearing aid transducers, i.e. the equivalent of the loudspeaker in conventional hearing aids, should ideally fit into the majority of adult middle ears and should utilize the limited space optimally to achieve sufficiently high maximal output levels. For several designs, more anatomical data are needed. METHODS: Twenty temporal bones of 10 formalin-fixed adult human heads were scanned by a computed tomography (CT) system using a slice thickness of 0.63 mm. Twelve landmarks were defined and 24 different distances were calculated for each temporal bone. RESULTS: A statistical description of 24 distances in the adult human middle ear which may limit or influence the design of middle ear transducers is presented. Significant inter-individual differences, but no significant differences for gender, side, age, or degree of pneumatization of the mastoid, were found. Distances that had been analyzed in earlier studies were found to be in good agreement with those results. CONCLUSION: A data set quantitatively describing the adult human middle ear anatomy from the point of view of designers of new implantable hearing aid transducers has been generated. In principle, the method employed in this study using standard CT scans could also be used preoperatively to rule out exclusion criteria.
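The "statistical description" reported for each of the 24 distances amounts to per-distance summary statistics across the 20 temporal bones. A minimal sketch of that computation; the measurement values below are hypothetical, not data from the study.

```python
# A sketch of the kind of per-distance summary statistics reported for the
# middle-ear landmark distances: mean, standard deviation, and range across
# the scanned temporal bones. The measurement values below are hypothetical.

from statistics import mean, stdev

def summarize(distances_mm):
    return {"mean": mean(distances_mm), "sd": stdev(distances_mm),
            "min": min(distances_mm), "max": max(distances_mm)}

# Hypothetical distance (mm) between two landmarks across 6 temporal bones.
sample = [4.1, 4.5, 3.9, 4.8, 4.2, 4.4]
print(summarize(sample))
```

For a transducer designer, the minimum across bones is usually the binding figure, since the device must fit the smallest ears as well as the average one.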