961 results for General-purpose computing on graphics processing units (GPGPU)
Abstract:
The surprising discovery of the X(3872) resonance by the Belle experiment in 2003, and its subsequent confirmation by BaBar, CDF and D0, opened a new chapter of QCD studies and puzzles. Since then, detailed experimental and theoretical studies have been performed in an attempt to determine and explain the properties of this state. At the end of 2009 the world's largest and highest-energy particle accelerator, the Large Hadron Collider (LHC), started operations at the CERN laboratory in Geneva. One of the main experiments at the LHC is CMS (Compact Muon Solenoid), a general-purpose detector designed to address a wide range of physical phenomena, in particular the search for the Higgs boson, the only still-unconfirmed element of the Standard Model (SM) of particle interactions, and new physics beyond the SM itself. Even though CMS was designed to study high-energy events, its high-resolution central tracker and superior muon spectrometer make it an optimal tool for studying the X(3872) state. This thesis presents the results of a series of studies of the X(3872) state performed with the CMS experiment. Already with the first year's worth of data, a clear peak for the X(3872) was identified, and the cross-section ratio with respect to the Psi(2S) was measured. With the increased statistics collected during 2011 it was possible to study the cross-section ratio between the X(3872) and the Psi(2S) in bins of transverse momentum, and to separate their prompt and non-prompt components.
Abstract:
Next-generation electronic devices have to guarantee high performance while being less power-consuming and highly reliable, for application domains ranging from entertainment to business. In this context, multicore platforms have proven to be the most efficient design choice, but new challenges have to be faced. The ever-increasing miniaturization of components produces unexpected variations in technological parameters and wear-out, characterized by soft and hard errors. Even though hardware techniques, which lend themselves to being applied at design time, have been studied with the objective of mitigating these effects, they are not sufficient; thus adaptive software techniques are necessary. In this thesis we focus on multicore task-allocation strategies that minimize energy consumption while meeting performance constraints. We first devise a technique based on an Integer Linear Programming formulation which provides the optimal solution but cannot be applied online, since the algorithm it requires is too time-consuming; we then propose a sub-optimal two-step technique that can be applied online. We demonstrate the effectiveness of the latter solution through an exhaustive comparison against the optimal solution, state-of-the-art policies, and variability-agnostic task allocations, by running multimedia applications on the virtual prototype of a next-generation industrial multicore platform. We also address the problem of performance and lifetime degradation. We first focus on embedded multicore platforms and propose an idleness-distribution policy that increases the cores' expected lifetimes by duty-cycling their activity; then, we investigate the use of micro thermoelectric coolers in general-purpose multicore processors to control the temperature of the cores at runtime, with the objective of meeting lifetime constraints without performance loss.
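The two-stage idea above (an exact but slow optimization off-line, a fast heuristic on-line) can be sketched on a toy allocation problem. The tasks, per-core energy costs, deadline, and the greedy rule below are all invented for illustration, not the thesis's actual ILP model or policy:

```python
# Toy task allocation: exhaustive search (stand-in for the off-line ILP) vs.
# a greedy heuristic (stand-in for the on-line two-step technique).
from itertools import product

TASKS = [4, 3, 2]            # workload of each task (abstract units)
CORES = [1.0, 1.4, 2.0]      # energy cost per workload unit of each core
DEADLINE = 6                 # maximum total workload allowed per core

def energy(assign):
    return sum(TASKS[t] * CORES[c] for t, c in enumerate(assign))

def feasible(assign):
    return all(sum(TASKS[t] for t, c in enumerate(assign) if c == k) <= DEADLINE
               for k in range(len(CORES)))

def optimal():
    # exhaustive search: optimal but exponential, hence off-line only
    best = min((a for a in product(range(len(CORES)), repeat=len(TASKS))
                if feasible(a)), key=energy)
    return best, energy(best)

def greedy():
    # heuristic: place each task (largest first) on the cheapest core that fits
    loads = [0] * len(CORES)
    assign = [None] * len(TASKS)
    for t in sorted(range(len(TASKS)), key=lambda t: -TASKS[t]):
        c = min((k for k in range(len(CORES)) if loads[k] + TASKS[t] <= DEADLINE),
                key=lambda k: CORES[k])
        assign[t] = c
        loads[c] += TASKS[t]
    return assign, energy(assign)
```

On this tiny instance the heuristic happens to match the optimum; in general it only approximates it, which is why the thesis validates its on-line technique against the exact solution.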
Analysis of the influence of epitope flanking regions on MHC class I restricted antigen presentation
Abstract:
Peptides presented by MHC class I molecules for CTL recognition are derived mainly from cytosolic proteins. For antigen presentation on the cell surface, epitopes require correct processing by cytosolic and ER proteases, efficient TAP transport, and MHC class I binding affinity. The efficiency of epitope generation depends not only on the epitope itself, but also on its flanking regions. In this project, the influence of the C-terminal region of the model epitope SIINFEKL (S8L) from chicken ovalbumin (aa 257-264) on antigen processing was investigated. S8L is a well-characterized epitope presented on the murine MHC class I molecule H-2Kb. The Flp-In 293Kb cell line was transfected with different constructs, each enabling the expression of the S8L sequence with defined C-terminal flanking regions. The constructs differed at the first two C-terminal positions after the S8L epitope, the so-called P1' and P2'. At these sites, all 20 amino acids were exchanged consecutively and tested for their influence on H-2Kb/S8L presentation on the cell surface of the Flp-In 293Kb cells. Detection of this complex was performed by immunostaining and flow cytometry. The prevailing assumption is that proteasomal cleavages are exclusively responsible for the generation of the final C-termini of CTL epitopes. Nevertheless, recent publications showed that TPPII (tripeptidyl peptidase II) is required for the generation of the correct C-terminus of the HLA-A3-restricted HIV epitope Nef(73-82). Against this background, the dependence of S8L generation from the designed constructs on proteasomal cleavage was characterized using proteasomal inhibitors. The results obtained indicate that the amino acid flanking the C-terminus of an epitope is crucial for proteasomal cleavage. Furthermore, partially proteasome-independent S8L generation from specific S8L-precursor peptides was observed.
Hence, the possibility that other endo- or carboxypeptidases in the cytosol could be involved in the correct trimming of the C-terminus of antigenic peptides for MHC class I presentation was investigated, by performing specific knockdowns and using inhibitors against the target peptidases. In parallel, a purification strategy to identify the novel peptidase was established. The purified peaks showing endopeptidase activity were further analyzed by mass spectrometry, and some candidate peptidases (e.g. Lon) were identified, which remain to be further characterized.
Abstract:
By pulling and releasing the tension on protein homomers with the Atomic Force Microscope (AFM) at different pulling speeds, dwell times and dwell distances, the observed force response of the protein can be fitted with suitable theoretical models. In this respect we developed mathematical procedures and open-source computer codes for driving such experiments and for fitting Bell's model to experimental protein unfolding forces and protein folding frequencies. We applied the above techniques to the study of the proteins GB1 (the B1 IgG-binding domain of protein G from Streptococcus) and I27 (a module of human cardiac titin) in aqueous solutions of protecting osmolytes such as dimethyl sulfoxide (DMSO), glycerol and trimethylamine N-oxide (TMAO). In order to gain a molecular understanding of the experimental results we developed an Ising-like model for proteins that incorporates the osmophobic nature of their backbone. The model benefits from analytical thermodynamics and kinetics amenable to Monte Carlo simulation. The prevailing view used to be that small protecting osmolytes bridge the separating beta-strands of proteins with mechanical resistance, presumably shifting the transition state to significantly larger distances that correlate with the size of the osmolyte molecules. Our experiments showed instead that protecting osmolytes slow down protein unfolding and speed up protein folding at physiological pH without shifting the protein transition state along the mechanical reaction coordinate. Together with the theoretical results of the Ising model, our results lend support to the osmophobic theory, according to which osmolyte stabilisation results from the preferential exclusion of osmolyte molecules from the protein backbone.
The results obtained during this thesis work have markedly improved our understanding of the strategy selected by Nature to strengthen protein stability in hostile environments, shifting the focus from hypothetical protein-osmolyte interactions to the more general mechanism based on the osmophobicity of the protein backbone.
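For reference, Bell's model mentioned above has a simple closed form, k(F) = k0 * exp(F * dx / kBT). The sketch below, with illustrative parameter values rather than the fitted ones from this work, shows how an osmolyte that only lowers k0 slows unfolding at every force without moving the transition-state distance dx:

```python
# Bell's model for force-assisted unfolding. The abstract's finding corresponds
# to osmolytes lowering the zero-force rate k0 while leaving dx unchanged.
# All parameter values are illustrative, not fitted data.
import math

KBT = 4.11  # thermal energy at ~298 K, in pN*nm

def bell_rate(force_pN, k0_per_s, dx_nm):
    return k0_per_s * math.exp(force_pN * dx_nm / KBT)

water = dict(k0_per_s=1e-4, dx_nm=0.25)
osmolyte = dict(k0_per_s=3e-5, dx_nm=0.25)   # slower unfolding, same dx

# the ratio of the two rates is force-independent when only k0 changes
r_low = bell_rate(50, **osmolyte) / bell_rate(50, **water)
r_high = bell_rate(200, **osmolyte) / bell_rate(200, **water)
```

The force-independent rate ratio is the signature of an unchanged dx, consistent with the preferential-exclusion picture supported above.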
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed on every desk, in every home and in every pocket. As a consequence, in recent years software has been undergoing an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and by growing network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. Then, we shift the perspective from the development of intelligent software systems toward general-purpose software development.
Using the expertise gained during the background-construction phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and, at the same time, provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
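As a point of comparison for the actor paradigm the thesis takes as its starting point, a minimal actor can be built in a few lines of plain Python. This is a hypothetical toy, not simpAL or thesis code: the defining property shown is that state is touched only by the actor's own message loop, so no locks are needed.

```python
# A toy actor: one thread, one mailbox, private state.
import threading, queue

class Counter:
    def __init__(self):
        self.inbox = queue.Queue()
        self.count = 0
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        # the only code that ever touches self.count
        while True:
            msg, reply = self.inbox.get()
            if msg == "inc":
                self.count += 1
            elif msg == "get":
                reply.put(self.count)
            elif msg == "stop":
                break

    def send(self, msg):
        reply = queue.Queue()
        self.inbox.put((msg, reply))
        return reply

c = Counter()
for _ in range(3):
    c.send("inc")
value = c.send("get").get()   # synchronous ask on top of async messages
c.send("stop")
```

Agent-oriented languages such as simpAL add higher-level, human-inspired abstractions (goals, plans, environments) on top of this bare message-passing discipline.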
Abstract:
The success of schizophrenia treatment depends largely on the patient's response to their antipsychotic medication. Which drug and which dose are effective in an individual patient can currently be judged only after several weeks of treatment. One reason for variable therapeutic response is variable plasma concentrations of the antipsychotics. The aim of this work was to investigate to what extent therapeutic outcome can be predicted early in treatment by objective symptom assessment, and which factors influence the high variability of antipsychotic blood levels. An 18-month naturalistic clinical study in schizophrenic patients was conducted to answer the following questions: can therapeutic response be predicted, and which instruments are suitable for doing so? Psychopathology was assessed weekly using two rating scales (Brief Psychiatric Rating Scale, BPRS, and Clinical Global Impressions, CGI) to evaluate the improvement of disease symptoms over the course of 8 weeks. Serum concentrations of the antipsychotics were measured alongside therapy. Objective symptom assessment by BPRS or CGI was a suitable instrument for predicting therapeutic response. Relative to the start of treatment, early symptom reduction was highly predictive of later treatment failure or response. A reduction of more than 36.5% on the BPRS scale at week 2 was identified as a significant threshold for non-response. Patients whose symptom improvement fell below this threshold were 11.2 times more likely not to respond to their medication at the end of the study than patients who improved by at least 36.5%. Other factors, such as age, sex, duration of illness, or number of inpatient stays, had no influence on the prediction of therapeutic response.
Therapeutic antipsychotic levels had a positive influence on the response rate. In patients with therapeutic levels, response was faster and the response rate higher than in those whose levels lay outside the usual therapeutic ranges. An important prerequisite for the use of TDM (therapeutic drug monitoring) is the availability of a precise, reproducible, time- and cost-saving analytical method for the quantitative determination of the substances under investigation. Such a method was developed and validated for the detection of haloperidol; an HPLC method with column switching proved suitable for TDM. Based on the results of our own clinical study on response prediction, we investigated which factors influence the variability of antipsychotic pharmacokinetics, since pharmacokinetic variability is one reason for absent or insufficient response. We examined, on the one hand, the influence of the galenic formulation on drug release and, on the other, the influence of inflammatory processes on the metabolism of an antipsychotic; for this, patient data were evaluated retrospectively. The analysis of 247 serum levels from patients treated with paliperidone in the OROS® formulation, a newly introduced extended-release form, showed that the intraindividual variability (Vk) of paliperidone trough levels was 35%, comparable to that of non-extended-release risperidone (32%, p = n.s.). The extended-release formulation therefore had no variance-reducing effect on the trough levels of the antipsychotic. The drug concentration range was 21-55 ng/ml, likewise almost matching the therapeutic range of risperidone (20-60 ng/ml). Inflammatory processes can alter the metabolism of drugs; this had previously been demonstrated for drugs metabolized via CYP1A2.
Our own analysis of 84 patient serum levels showed that the metabolism of quetiapine was impaired during an inflammatory process, probably through inhibition of CYP3A4. This suggests that the pharmacokinetics of other drugs metabolized via CYP3A4 can also be affected during an inflammatory process in the body. For this reason, side effects should be monitored particularly closely during an infection under quetiapine therapy, and serum levels should be monitored during this time to protect the patient from possible side effects or even intoxication. The findings of this work show that, in the treatment of schizophrenic patients with antipsychotics, measuring psychopathology is suitable for predicting therapeutic response, and measuring blood levels is suitable for identifying factors underlying pharmacokinetic variability. Objective symptom assessment and therapeutic drug monitoring are therefore instruments that should be used to guide antipsychotic pharmacotherapy.
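The week-2 decision rule reported in this study reduces to simple percentage arithmetic. The scores below are invented examples, not study data:

```python
# Classify a patient as a predicted responder if the BPRS score dropped by
# more than 36.5% from baseline at week 2 (the study's reported threshold).
def bprs_reduction(baseline, week2):
    return 100.0 * (baseline - week2) / baseline

def predicted_responder(baseline, week2, threshold=36.5):
    return bprs_reduction(baseline, week2) > threshold

print(predicted_responder(60, 35))  # 41.7% reduction: predicted responder
print(predicted_responder(60, 45))  # 25.0% reduction: predicted non-responder
```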
Abstract:
Reliable electronic systems, that is, sets of reliable electronic devices connected to each other and working together correctly to provide the same functionality, represent an essential ingredient for the large-scale commercial implementation of any technological advancement. Microelectronics technologies and new powerful integrated circuits provide noticeable improvements in performance and cost-effectiveness, and allow electronic systems to be introduced in increasingly diversified contexts. On the other hand, the opening of new fields of application leads to new, unexplored reliability issues. The development of semiconductor device and electrical models (such as the well-known SPICE models) able to describe the electrical behavior of devices and circuits is a useful means to simulate and analyze the functionality of new electronic architectures and new technologies. Moreover, it represents an effective way to point out the reliability issues arising from the employment of advanced electronic systems in new application contexts. In this thesis, the modeling and design of both advanced reliable circuits for general-purpose applications and devices for energy efficiency are considered. In more detail, the following activities have been carried out. First, reliability issues in terms of security of standard communication protocols in wireless sensor networks are discussed, and a new communication protocol that increases network security is introduced. Second, a novel scheme for the on-die measurement of either clock jitter or process parameter variations is proposed; the developed scheme can be used for a low-cost evaluation of both jitter and process parameter variations. Then, reliability issues in the field of "energy scavenging systems" are analyzed, with an accurate analysis and modeling of the effects of faults affecting circuits for energy harvesting from mechanical vibrations.
Finally, the problem of modeling the electrical and thermal behavior of photovoltaic (PV) cells under hot-spot conditions is addressed with the development of a combined electrical and thermal model.
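The quantity that any jitter-measurement scheme ultimately estimates can be illustrated in software. The sketch below, with synthetic timestamps, is not the proposed on-die circuit, only the statistic it approximates: period jitter as the spread of successive edge-to-edge intervals of a clock.

```python
# Estimate rms period jitter from (synthetic) clock-edge timestamps.
import math, random

random.seed(1)
NOMINAL = 1.0e-9                              # 1 GHz clock: 1 ns nominal period
edges, t = [], 0.0
for _ in range(1000):
    t += NOMINAL + random.gauss(0.0, 5e-12)   # inject 5 ps rms period jitter
    edges.append(t)

# successive differences give the measured periods
periods = [b - a for a, b in zip(edges, edges[1:])]
mean = sum(periods) / len(periods)
rms_jitter = math.sqrt(sum((p - mean) ** 2 for p in periods) / len(periods))
```

An on-die monitor cannot store timestamps like this; it has to accumulate an equivalent statistic with counters and delay lines, which is what makes low-cost schemes such as the one proposed here interesting.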
Abstract:
Theoretical models are developed for the continuous-wave and pulsed laser incision and cut of thin single and multi-layer films. A one-dimensional steady-state model establishes the theoretical foundations of the problem by combining a power-balance integral with heat flow in the direction of laser motion. In this approach, classical modelling methods for laser processing are extended by introducing multi-layer optical absorption and thermal properties. The calculation domain is consequently divided in correspondence with the progressive removal of individual layers. A second, time-domain numerical model for the short-pulse laser ablation of metals accounts for changes in optical and thermal properties during a single laser pulse. With sufficient fluence, the target surface is heated towards its critical temperature and homogeneous boiling or "phase explosion" takes place. Improvements are seen over previous works with the more accurate calculation of optical absorption and shielding of the incident beam by the ablation products. A third, general time-domain numerical laser processing model combines ablation depth and energy absorption data from the short-pulse model with two-dimensional heat flow in an arbitrary multi-layer structure. Layer removal is the result of both progressive short-pulse ablation and classical vaporisation due to long-term heating of the sample. At low velocity, pulsed laser exposure of multi-layer films comprising aluminium-plastic and aluminium-paper are found to be characterised by short-pulse ablation of the metallic layer and vaporisation or degradation of the others due to thermal conduction from the former. At high velocity, all layers of the two films are ultimately removed by vaporisation or degradation as the average beam power is increased to achieve a complete cut. The transition velocity between the two characteristic removal types is shown to be a function of the pulse repetition rate. 
An experimental investigation validates the simulation results and provides new laser processing data for some typical packaging materials.
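The steady-state power balance underlying the first model can be illustrated with a back-of-envelope calculation: absorbed power must supply the enthalpy of the material volume swept out per unit time, P = v * w * d * rho * (cp * dT + L). The property values below are rough textbook numbers for an aluminium layer and purely illustrative, not the thesis's multi-layer model:

```python
# Minimum absorbed beam power needed to remove a moving strip of material.
def required_power(v, width, depth, rho, cp, dT, latent):
    volume_rate = v * width * depth          # m^3/s of material removed
    return volume_rate * rho * (cp * dT + latent)

P = required_power(
    v=1.0,            # m/s cutting speed
    width=30e-6,      # m kerf width
    depth=9e-6,       # m aluminium layer thickness
    rho=2700.0,       # kg/m^3 density
    cp=900.0,         # J/(kg K) specific heat
    dT=2400.0,        # K heating from ambient toward vaporization
    latent=1.09e7,    # J/kg latent heat of vaporization
)
# P comes out on the order of 10 W, before accounting for optical absorption,
# conduction losses and shielding by ablation products.
```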
Abstract:
Over the last 60 years, computers and software have favoured incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, which identifies any deviation from the desired behaviour as soon as possible and possibly applies some corrections. The declarative framework that implements our approach, entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to correct any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology brings some advancements to the problem of conformance checking, helping to fill the gap between humans and increasingly complex technology.
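The Event Calculus at the core of the monitoring module can be illustrated with a deliberately naive interpreter (a toy sketch, not the efficient Drools-based implementation): a fluent holds at time t if some earlier event initiated it and no intervening event terminated it.

```python
# Naive Event Calculus: replay the trace up to (but excluding) time t.
def holds_at(fluent, t, trace, initiates, terminates):
    holds = False
    for time, event in sorted(trace):
        if time >= t:
            break
        if (event, fluent) in initiates:
            holds = True
        if (event, fluent) in terminates:
            holds = False
    return holds

# hypothetical domain: a session fluent toggled by login/logout events
initiates = {("login", "active")}
terminates = {("logout", "active")}
trace = [(1, "login"), (5, "logout"), (9, "login")]

print(holds_at("active", 3, trace, initiates, terminates))   # True
print(holds_at("active", 7, trace, initiates, terminates))   # False
print(holds_at("active", 10, trace, initiates, terminates))  # True
```

Replaying the whole trace on every query is what an efficient, incremental EC implementation such as the thesis's avoids: a JIT monitor updates the set of holding fluents as each event arrives.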
Abstract:
The scope of this project is to study the effectiveness of building information modelling (BIM) in performing life cycle assessment for a building. For the purposes of the study, Revit, a BIM software package, and Tally, an LCA tool integrated into Revit, are used. The project is divided into six chapters. The first chapter consists of a theoretical introduction to building information modelling and its connection to life cycle assessment. The second chapter describes the characteristics of building information modelling (BIM); in addition, a comparison is made with the traditional architectural, engineering and construction business model, and the benefits of shifting to BIM are discussed. The third chapter reviews the most well-known BIM software available on the market. Chapter four describes life cycle assessment (LCA) in general, and then specifically for the purposes of the case study used in the following chapter; moreover, the tools available to perform an LCA are reviewed. Chapter five presents the case study, which consists of a model in a BIM software package (Revit) and the LCA performed with Tally. The last chapter discusses the results obtained, the limitations, and possible future improvements in performing life cycle assessment (LCA) on a BIM model.
Abstract:
The energy harvesting research field has grown considerably in the last decade due to increasing interest in energy-autonomous sensing systems, which require smart and efficient interfaces for extracting power from energy sources, as well as power management (PM) circuits. This thesis investigates the design trade-offs for minimizing the intrinsic power of PM circuits, in order to allow operation with very weak energy sources. For validation purposes, three different integrated power converter and PM circuits for energy harvesting applications are presented. They have been designed for nano-power operation, and the single-source converters can operate with input power lower than 1 μW. The first IC is a buck-boost converter for piezoelectric transducers (PZ) implementing Synchronous Electrical Charge Extraction (SECE), a non-linear energy extraction technique. Moreover, the Residual Charge Inversion technique is exploited for extracting energy from PZ with weak and irregular excitations (i.e. lower voltage), and the implemented PM policy, named Two-Way Energy Storage, considerably reduces the start-up time of the converter, improving the overall conversion efficiency. The second proposed IC is a general-purpose buck-boost converter for low-voltage DC energy sources up to 2.5 V. An ultra-low-power MPPT circuit has been designed in order to track variations of source power. Furthermore, a capacitive boost circuit has been included, allowing the converter to start up from a source voltage VDC0 = 223 mV. A nano-power programmable linear regulator is also included in order to provide a stable voltage to the load. The third IC implements a heterogeneous multi-source buck-boost converter. It provides up to 9 independent input channels, of which 5 are specific to PZ (with SECE) and 4 to DC energy sources with MPPT.
The inductor is shared among channels and an arbiter, designed with asynchronous logic to reduce the energy consumption, avoids simultaneous access to the buck-boost core, with a dynamic schedule based on source priority.
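The MPPT function mentioned for the second IC can be illustrated with a software sketch of the classic perturb-and-observe loop; the toy P-V curve and step size below are invented, and the actual ultra-low-power circuit implements equivalent logic in hardware:

```python
# Perturb-and-observe maximum power point tracking on a toy source model.
def source_power(v):
    # invented P-V curve with a maximum power point at v = 0.9 V
    return max(0.0, v * (1.8 - v))

def mppt(v=0.3, step=0.05, iters=60):
    p_prev = source_power(v)
    direction = 1
    for _ in range(iters):
        v += direction * step          # perturb the operating voltage
        p = source_power(v)
        if p < p_prev:                 # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_op = mppt()   # settles into a small oscillation around the maximum
```

The residual oscillation around the maximum power point is inherent to perturb-and-observe; hardware implementations trade step size against tracking speed and ripple.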
Abstract:
According to recent studies, antioxidant supplementation in gamete processing and/or storage solutions improves gamete quality parameters after cooling or storage at sub-zero temperatures. The aim of the present study was to investigate the effects of antioxidant supplementation on pig and horse gamete storage. The first study aimed to determine the effects of resveratrol (RESV) on the apoptotic status of porcine oocytes vitrified by the Cryotop method, evaluating phosphatidylserine (PS) externalization and caspase activation. RESV (2 µM) was added during: IVM (A); 2 h post-warming incubation (B); vitrification/warming and 2 h post-warming incubation (C); all previous phases (D). The data obtained demonstrate that RESV supplementation in the various steps of the IVM and vitrification/warming procedure can modulate the apoptotic process, improving the resistance of porcine oocytes to cryopreservation-induced damage. In the second study, different concentrations of RESV (10, 20, 40, and 80 µM) were added during liquid storage of stallion sperm for 24 hours at either 10°C or 4°C, under anaerobic conditions. Our findings demonstrate that RESV supplementation does not enhance the quality of stallion semen after 24 hours of storage; moreover, the highest RESV concentrations tested (40 and 80 µM) could damage sperm functional status, probably acting as pro-oxidants. Finally, in the third study, two other antioxidants, ascorbic acid (AA, 100 µM) and glutathione (GSH, 5 mM), were added to boar freezing and/or thawing solutions. Different sperm parameters were evaluated before freezing and at 30 and 240 minutes after thawing. Our results showed that GSH and AA significantly improved boar sperm cryotolerance, especially when supplemented together in both freezing and thawing media. This improvement was observed in sperm viability and acrosome integrity, sperm motility, and nucleoprotein structure.
Although ROS levels were not greatly increased by the freeze-thawing procedures, the addition of GSH and AA to both freezing and thawing extenders significantly decreased intracellular peroxide levels.
Abstract:
This thesis work, carried out in the laboratories of the X-ray Imaging Group of the Department of Physics and Astronomy of the University of Bologna and within the COSA (Computing on SoC Architectures) project of the V National Scientific Committee of INFN, aims at porting and analyzing a tomographic reconstruction code on GPU architectures installed on low-power System-on-Chip boards, in order to develop a portable, inexpensive and relatively fast method. Starting from the computational analysis, three different versions of the CUDA C port were developed: in the first, only the most computationally expensive part of the calculation was moved to the graphics card; the second exploits the coprocessor's native speed at matrix computation, mapping each pixel to a single parallel computing unit; the third is a further-optimized improvement of the previous version. The third version was chosen as the final one because it performs best both in single-slice reconstruction time and in energy savings. The developed port was compared with two other parallelizations, in OpenMP and in MPI. The efficiency of each paradigm, as a function of computing speed and energy consumed, was then studied both on an HPC cluster and on a low-power SoC cluster (using in particular the quad-core Tegra K1 board). Our proposed solution combines the OpenMP port with the CUDA C one: three CPU cores are reserved for executing the OpenMP code, while the fourth manages the GPU through the CUDA C port. This double parallelization achieves the highest efficiency in terms of power and energy, while the HPC cluster achieves the highest efficiency in computing speed. The proposed method would therefore make it possible to exploit almost fully the potential of the CPU and GPU at a very low cost.
A possible future optimization could involve reconstructing two slices simultaneously on the GPU, roughly doubling the total speed and making better use of the hardware. This study gave very satisfactory results: with only three TK1 boards it is possible to equal, and perhaps later exceed, the computing power of a traditional server, with the added advantage of a portable, low-power, low-cost system. This research positions itself as one of the first effective studies on low-power SoC architectures and their use in scientific computing, with very promising results.
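The pixel-per-thread mapping of the second CUDA version rests on the fact that, in backprojection, every output pixel is an independent sum over projection angles. The pure-Python toy below (two angles, a 3x3 image, deliberately simplified geometry; not the thesis code) makes that independence explicit:

```python
# Unfiltered backprojection from two orthogonal projections. Each (x, y) pixel
# is computed from the sinogram alone, with no dependence on other pixels, so
# the double loop could run in any order, or as one GPU thread per pixel.
def backproject(sinogram, n):
    # sinogram[a][r]: detector reading r at angle a (angles: 0 and 90 degrees)
    image = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            image[y][x] = (sinogram[0][x]          # vertical ray, column x
                           + sinogram[1][y]) / 2.0  # horizontal ray, row y
    return image

sino = [[0.0, 3.0, 0.0],   # angle 0: projection onto the x-axis
        [0.0, 3.0, 0.0]]   # angle 90: projection onto the y-axis
img = backproject(sino, 3)  # the central pixel accumulates both rays
```

A real reconstruction uses many angles, interpolation and filtering, but the per-pixel independence shown here is exactly what the one-pixel-per-computing-unit CUDA mapping exploits.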
Abstract:
This thesis presents a bioreactor capable of maintaining, over time, biological conditions that maximize the molecular-evolution cycles of phage cloning vectors, either lytic (T7) or lysogenic (M13). Concepts related to Quasispecies Theory and to the relationship between self-replication errors and natural or artificial selective pressures on virus populations are then introduced: the natural model of the evolutionary system. However, maintaining virus populations means providing them with a substrate on which to replicate. To this end, other research groups have already developed complex and expensive prototype machines for the continuous growth of bacterial populations: the compartments of evolutionary systems. The bioreactor described in this work is part of the European project Evoprog: general purpose programmable machine evolution on a chip (Jaramillo's Lab, University of Warwick) which, using existing phage technologies and synthetic regulation, will be able to deliver biocomputing functionality two orders of magnitude faster than conventional techniques, while reducing overall costs. The first prototype consists of one or more fermenters, in which the bacterial culture is grown under optimized continuous-cultivation conditions, and of a cellstat, a separate volume in which only virus replication takes place. Both volumes are of a few milliliters and are appropriately interconnected to allow a sort of continuous screening of the biomolecules produced at the output. The final part presents the results of preliminary experiments, demonstrating the reliability of the prototype built and of the protocols followed for sterilizing and assembling the bioreactor. The experiments performed demonstrate the success of two continuous viral cultivations and of one in vivo recombination of engineered lytic or lysogenic bacteriophages.
The thesis concludes by evaluating the future developments and the limits of the system, considering in particular some applications aimed at studies of bacteriophage therapy.
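The fermenter side of such a prototype follows the standard continuous-culture balance, dN/dt = (mu - D) * N: growth minus washout at dilution rate D. The toy simulation below, with invented rates and a logistic growth law, is only meant to show the steady bacterial density such a device maintains as a substrate for the phages:

```python
# Euler integration of a logistic chemostat: growth rate falls as density
# approaches capacity K, washout removes cells at dilution rate D.
def simulate(mu_max=1.0, K=1e9, D=0.5, N0=1e6, dt=0.01, steps=4000):
    N = N0
    for _ in range(steps):
        mu = mu_max * (1.0 - N / K)    # logistic slowdown near capacity K
        N += (mu - D) * N * dt         # net change: growth minus dilution
    return N

N_final = simulate()
# steady state where mu == D, i.e. N = K * (1 - D / mu_max) = K / 2 here
```

Holding the culture at such a steady state is what lets the connected cellstat run virus replication continuously instead of in batches.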
Abstract:
In recent years, environmental concerns and the expected shortage of fossil reserves have further spurred the development of biomaterials. Among them, poly(lactide) (PLA) possesses attractive properties such as good processability and excellent tensile strength and stiffness, equivalent to some commercial petroleum-based polymers (PP, PS, PET, etc.). This bio-based polymer is also biodegradable and biocompatible. However, one great disadvantage of commercial PLA is its slow crystallization rate, which restricts its use in many fields. The use of nanofillers is viewed as an efficient strategy to overcome this problem. In this thesis, the effect of bionanofillers on neat PLA and on blends of poly(L-lactide) (PLA)/poly(ε-caprolactone) (PCL) has been investigated. The nanofillers used are: poly(L-lactide-co-ε-caprolactone) and poly(L-lactide-b-ε-caprolactone) grafted on cellulose nanowhiskers, and neat cellulose nanowhiskers (CNW). The grafting of poly(L-lactide-co-caprolactone) and poly(L-lactide-b-caprolactone) on the nanocellulose was performed by the grafting-from technique, in which the polymerization reaction is initiated directly on the substrate surface. The reaction conditions were chosen after a temperature and solvent screening. The effect of the bionanofillers on PLA and on 80/20 PLA/PCL was evaluated by non-isothermal and isothermal DSC analyses. Non-isothermal DSC scans show a nucleating effect of the bionanofillers on PLA; this effect is detectable during PLA crystallization from the glassy state. The cold-crystallization temperature is reduced upon addition of the poly(L-lactide-b-caprolactone) grafted on cellulose nanowhiskers, which is the best-performing bionanofiller as a nucleating agent. On the other hand, isothermal DSC analysis of the overall crystallization rate indicates that cellulose nanowhiskers are the best nucleating agents during isothermal crystallization from the melt state.
In conclusion, the nanofillers behave differently depending on the processing conditions. However, the efficiency of our nanofillers as nucleating agents was clearly demonstrated under both isothermal and non-isothermal conditions.
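Isothermal crystallization kinetics of the kind measured here by DSC are commonly described by the Avrami equation, X(t) = 1 - exp(-k * t^n); a nucleating filler shows up as a larger rate constant k, i.e. a shorter crystallization half-time, at unchanged Avrami exponent n. The parameters below are illustrative, not the fitted values from this work:

```python
# Avrami kinetics: relative crystallinity X(t) and the half-time t_1/2,
# the time at which half of the final crystallinity has developed.
import math

def avrami_fraction(t, k, n):
    return 1.0 - math.exp(-k * t ** n)

def half_time(k, n):
    return (math.log(2.0) / k) ** (1.0 / n)

neat_pla = dict(k=1e-3, n=3.0)   # slow crystallization (illustrative)
with_cnw = dict(k=8e-3, n=3.0)   # same mechanism (same n), faster kinetics

t_neat = half_time(**neat_pla)
t_cnw = half_time(**with_cnw)
# with n = 3, an 8x larger k halves the crystallization half-time
```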