19 results for non-Gaussian process
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
This thesis explores the attitude of organizations toward the business processes that sustain them: from the near-absence of structure, through functional organization, up to the advent of Business Process Reengineering and of Business Process Management, which arose to overcome the limits and problems of the earlier model. Within the BPM life cycle sits the process mining methodology, which enables process analysis starting from event data logs, i.e., the records of events generated by all the activities supported by an enterprise information system. Process mining can be seen as a natural bridge connecting process-based (but not data-driven) management disciplines with recent developments in business intelligence, which can handle and manipulate the enormous amount of data available to companies (but are not process-driven). The thesis describes the requirements and technologies that enable the discipline, as well as the three techniques it supports: process discovery, conformance checking, and process enhancement. Process mining was used as the main tool in a consulting project carried out by HSPI S.p.A. on behalf of a major Italian client, a provider of IT platforms and solutions. The project I took part in, described in the thesis, aims to support the organization in its plan to improve internal performance, and made it possible to verify the applicability and the limits of process mining techniques. Finally, the appendix contains a paper I wrote that collects the applications of the discipline in real business contexts, drawing data and information from working papers, business cases, and direct channels. For its validity and completeness, this document was published on the website of the IEEE Task Force on Process Mining.
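As a purely illustrative, hedged sketch of the three techniques named above (the abstract does not say which tooling the project used), the open-source pm4py library exposes discovery and conformance checking in a few lines; the event log file name is a placeholder:

```python
# Hedged sketch: process discovery and conformance checking with the
# open-source pm4py library (an illustrative choice, not the thesis tool).
import pm4py

log = pm4py.read_xes("event_log.xes")  # placeholder event data log

# Process discovery: mine a Petri net from the observed behaviour.
net, im, fm = pm4py.discover_petri_net_inductive(log)

# Conformance checking: replay the log on the model and measure fitness.
fitness = pm4py.fitness_token_based_replay(log, net, im, fm)
print(fitness)  # dict of fitness metrics; exact keys vary across versions
```

Process enhancement would then use the same log to extend the discovered model, for example with performance or frequency information.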
Abstract:
Introduction

1.1 Occurrence of polycyclic aromatic hydrocarbons (PAH) in the environment

Worldwide industrial and agricultural developments have released a large number of natural and synthetic hazardous compounds into the environment due to careless waste disposal, illegal waste dumping and accidental spills. As a result, there are numerous sites in the world that require cleanup of soils and groundwater. Polycyclic aromatic hydrocarbons (PAHs) are one of the major groups of these contaminants (Da Silva et al., 2003). PAHs constitute a diverse class of organic compounds consisting of two or more aromatic rings with various structural configurations (Prabhu and Phale, 2003). Being derivatives of benzene, PAHs are thermodynamically stable. In addition, these chemicals tend to adhere to particle surfaces, such as soils, because of their low water solubility and strong hydrophobicity, which results in greater persistence under natural conditions. This persistence, coupled with their potential carcinogenicity, makes PAHs problematic environmental contaminants (Cerniglia, 1992; Sutherland, 1992). PAHs are widely found in high concentrations at many industrial sites, particularly those associated with the petroleum, gas production and wood preserving industries (Wilson and Jones, 1993).

1.2 Remediation technologies

Conventional techniques used for the remediation of soil polluted with organic contaminants include excavation of the contaminated soil and disposal to a landfill, or capping (containment) of the contaminated areas of a site. These methods have drawbacks. The first simply moves the contamination elsewhere and may create significant risks in the excavation, handling and transport of hazardous material; additionally, it is very difficult and increasingly expensive to find new landfill sites for the final disposal of the material. The cap-and-containment method is only an interim solution, since the contamination remains on site, requiring monitoring and maintenance of the isolation barriers long into the future, with all the associated costs and potential liability. A better approach than these traditional methods is to completely destroy the pollutants, if possible, or to transform them into harmless substances. Some technologies that have been used are high-temperature incineration and various types of chemical decomposition (for example, base-catalyzed dechlorination and UV oxidation). However, these methods have significant disadvantages, principally their technological complexity, high cost, and lack of public acceptance. Bioremediation, by contrast, is a promising option for the complete removal and destruction of contaminants.

1.3 Bioremediation of PAH contaminated soil & groundwater

Bioremediation is the use of living organisms, primarily microorganisms, to degrade or detoxify hazardous wastes into harmless substances such as carbon dioxide, water and cell biomass. Most PAHs are biodegradable under natural conditions (Da Silva et al., 2003; Meysami and Baheri, 2003), and bioremediation for cleanup of PAH wastes has been extensively studied at both laboratory and commercial levels. It has been implemented at a number of contaminated sites, including the cleanup of the Exxon Valdez oil spill in Prince William Sound, Alaska in 1989, the Mega Borg spill off the Texas coast in 1990 and the Burgan Oil Field, Kuwait in 1994 (Purwaningsih, 2002). Different strategies for PAH bioremediation, such as in situ, ex situ or on-site bioremediation, have been developed in recent years. In situ bioremediation is applied to soil and groundwater at the site, without removing the contaminated soil or groundwater, and is based on providing optimum conditions for microbiological contaminant breakdown. Ex situ bioremediation of PAHs, on the other hand, is applied to soil and groundwater that have been removed from the site via excavation (soil) or pumping (water); hazardous contaminants are then efficiently converted into harmless compounds in controlled bioreactors.

1.4 Bioavailability of PAH in the subsurface

Frequently, PAH contamination in the environment occurs as contaminants sorbed onto soil particles rather than as a separate phase (NAPL, non-aqueous phase liquids). It is known that the biodegradation rate of most PAHs sorbed onto soil is far lower than the rates measured in solution cultures of microorganisms with pure solid pollutants (Alexander and Scow, 1989; Hamaker, 1972). It is generally believed that only the fraction of PAHs dissolved in the solution can be metabolized by microorganisms in soil. The amount of contaminant that can be readily taken up and degraded by microorganisms is defined as bioavailability (Bosma et al., 1997; Maier, 2000). Two phenomena have been suggested to cause the low bioavailability of PAHs in soil (Danielsson, 2000). The first is strong adsorption of the contaminants to the soil constituents, which leads to very slow release rates of contaminants to the aqueous phase; sorption is often well correlated with soil organic matter content (Means, 1980) and significantly reduces biodegradation (Manilal and Alexander, 1991). The second is slow mass transfer of pollutants, such as pore diffusion within soil aggregates or diffusion in the soil organic matter. The complex set of these physical, chemical and biological processes is schematically illustrated in Figure 1: biodegradation takes place in the soil solution, while diffusion occurs in the narrow pores in and between soil aggregates (Danielsson, 2000).

Figure 1: Schematic diagram showing possible rate-limiting processes during bioremediation of hydrophobic organic contaminants in a contaminated soil-water system (not to scale) (Danielsson, 2000).

Seemingly contradictory studies in the literature indicate that the rate and final extent of metabolism may be either lower or higher for PAHs sorbed by soil than for pure PAHs (Van Loosdrecht et al., 1990). These contrasting results demonstrate that the bioavailability of organic contaminants sorbed onto soil is far from well understood. Besides bioavailability, several other factors influence the rate and extent of biodegradation of PAHs in soil, including microbial population characteristics, the physical and chemical properties of PAHs, and environmental factors (temperature, moisture, pH, degree of contamination).

1.5 Increasing the bioavailability of PAH in soil

Attempts to improve the biodegradation of PAHs in soil by increasing their bioavailability include the use of surfactants, solvents or solubility enhancers. However, the introduction of a synthetic surfactant may simply add one more pollutant (Wang and Brusseau, 1993). A study conducted by Mulder et al. showed that the introduction of hydroxypropyl-β-cyclodextrin (HPCD), a well-known PAH solubility enhancer, significantly increased the solubilization of PAHs although it did not improve their biodegradation rate (Mulder et al., 1998), indicating that further research is required to develop a feasible and efficient remediation method. Enhancing the extent of PAH mass transfer from the soil phase to the liquid might prove an efficient and environmentally low-risk way of addressing the problem of slow PAH biodegradation in soil.
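The rate-limiting picture of section 1.4 can be made concrete with a minimal, hedged sketch: a two-compartment model in which PAH desorbs from soil at a slow first-order rate and only the dissolved fraction is biodegraded. The rate constants below are illustrative assumptions, not values from the text:

```python
# Hedged sketch: two-compartment model of sorbed vs. dissolved PAH, with
# first-order desorption (mass transfer) and first-order biodegradation
# of the dissolved fraction only. Rate constants are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

k_des = 0.05   # 1/day, slow desorption from soil (assumed rate-limiting)
k_bio = 1.0    # 1/day, biodegradation of the dissolved fraction

def rhs(t, y):
    sorbed, dissolved = y
    desorption = k_des * sorbed
    return [-desorption, desorption - k_bio * dissolved]

sol = solve_ivp(rhs, (0, 100), [1.0, 0.0], dense_output=True)
t = np.linspace(0, 100, 5)
# When k_des << k_bio, the remaining sorbed fraction decays at k_des:
# overall removal is controlled by mass transfer, not by microbial kinetics.
print(sol.sol(t)[0])
```

The point of the sketch is the limiting behaviour: making k_bio larger barely changes the removal time, whereas increasing k_des (i.e., the mass transfer enhancement proposed above) does.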
Abstract:
Mapping physical quantities is extremely important, as it provides adequate support for localization and for monitoring sensitive environmental parameters. Indoors, in the absence of a reference localization system analogous to GPS outdoors, the mapping of physical quantities such as the Wi-Fi signal and the Earth's magnetic field has progressively gained ground by fully exploiting the potential of the sensors on board smartphones. In this case mapping, requiring no infrastructure and aided by widely used everyday portable devices, represents a relatively recent solution that can be framed as Mobile Crowd Sensing. MCS is a new service paradigm that exploits the interconnectivity of portable devices to measure environmental characteristics automatically, aggregating them in a cloud system available to a large community. However, the considerable data flow generated, the temporal variability of the quantities of interest and the noise inherent in the measurements are fundamental problems for the use and management of the collected measurements. For these reasons, the thesis work had the following objectives: (i) to provide an overview of the main localization techniques and technologies, motivating the importance of mapping environmental physical quantities; (ii) to identify physical quantities suitable for building reliable maps in the most diverse application contexts, exploiting resources already present in the environment; (iii) to develop a statistical algorithm able to provide an accurate estimate of the spatial behaviour of the quantity of interest from a limited number of measurements, while remaining compatible with MCS processes and keeping computational complexity low. The algorithm was validated through simulations and through measurements carried out in real environments. In particular, experimental tests were performed in the Vicon arena of the DEI laboratories of the Università di Bologna, Cesena campus, conceived by the Casy research group.
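Objective (iii) is the kind of task commonly handled by spatial regression. As a hedged stand-in (the abstract does not name the algorithm actually developed), a minimal Gaussian process regression sketch over synthetic 2-D readings shows the map estimate plus its uncertainty:

```python
# Hedged sketch: estimating the spatial field of a quantity (e.g. magnetic
# field magnitude) from a few scattered, noisy readings. Gaussian process
# regression is an illustrative assumption, not the thesis algorithm.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(30, 2))                   # sparse (x, y) points
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)    # noisy synthetic readings

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)

# Predict over a regular grid: mean = map estimate, std = local uncertainty,
# which could steer where new MCS measurements are most valuable.
grid = np.stack(np.meshgrid(np.linspace(0, 10, 50),
                            np.linspace(0, 10, 50)), axis=-1).reshape(-1, 2)
mean, std = gp.predict(grid, return_std=True)
```

Exact GP inference scales cubically with the number of measurements, which is precisely why a low-complexity, MCS-compatible variant is a non-trivial design goal.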
Abstract:
Microbial oils are receiving growing attention as a possible alternative to vegetable oils in the process of replacing fossil fuels. However, several aspects need to be optimized in order to obtain oils that are economically competitive and have the desired physico-chemical characteristics. In this research, two different approaches were used to pursue this objective. The first was based on genetic engineering of the yeast C. oleaginous, in order to increase lipid productivity and modify the composition of the synthesized triglycerides (TAG). A protocol based on Agrobacterium-mediated genetic transformation was used to overexpress diacylglycerol acyltransferase (DGA1), the enzyme responsible for the last step of TAG synthesis, and Δ9-desaturase, the enzyme that catalyzes the conversion of stearic acid (C18:0) into oleic acid (C18:1). The selection of positive colonies and the analysis of the resulting mutants confirmed the success of the transformation. The second approach aimed to study the influence of different volatile fatty acids (VFAs), a feedstock obtainable from the treatment of industrial waste, on the growth of C. oleaginous and on the profile of the lipids it accumulates. To this end, 1-L scale fed-batch fermentations based on glucose and on synthetic mixtures of acetic acid and VFAs as carbon sources were used. The simultaneous use of acetic acid and secondary acids showed that it is possible to steer the microbial metabolism so as to increase oil accumulation and obtain a desired lipid chemical composition.
Abstract:
Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) w.r.t. a goal, and to provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this we need to analyse different logics and choose the one that best meets our needs. In logic, usually, we try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human reasoning. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from not-definitely-true premises belonging to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal domain, especially if they offer an implementation of an argumentation framework that provides a formal modelling of the game. Roughly speaking, if the theory is the set of laws, a keyclaim is the conclusion that one party wants to prove (and the other wants to defeat), and we add dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not depending on the strategies performed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a Meta-level containing different Meta-evaluators: the first has been explained above, the second is needed to perform the game model, and the last is used to change the game execution and tree derivation strategies.
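To make the core idea concrete, here is a minimal, hedged sketch of defeasible reasoning in the style described above: defeasible rules may fire, but are blocked by applicable rules for the opposite conclusion unless they are superior to them. The rule names and the bird/penguin example are illustrative, not taken from the thesis; the sketch assumes acyclic rules:

```python
# Hedged sketch of defeasible reasoning (Nute-style): a defeasible rule for
# a goal succeeds only if every applicable rule for the opposite conclusion
# is beaten via the superiority relation. Assumes acyclic rule sets.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    body: tuple   # antecedent literals, all must be provable
    head: str     # concluded literal; "~p" is the negation of "p"

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def provable(goal, facts, rules, superior):
    """superior[(r1, r2)] == True means rule r1 beats rule r2."""
    if goal in facts:
        return True
    for r in rules:
        if r.head != goal:
            continue
        if not all(provable(b, facts, rules, superior) for b in r.body):
            continue
        # r fires; every applicable attacking rule must be beaten.
        attackers = [a for a in rules if a.head == neg(goal)
                     and all(provable(b, facts, rules, superior)
                             for b in a.body)]
        if all(superior.get((r.name, a.name), False) for a in attackers):
            return True
    return False

# Classic example: penguins are birds; birds fly, but penguins do not.
rules = [Rule("r1", ("bird",), "flies"),
         Rule("r2", ("penguin",), "~flies"),
         Rule("r3", ("penguin",), "bird")]
superior = {("r2", "r1"): True}   # the more specific rule wins
print(provable("flies", {"penguin"}, rules, superior))  # False
print(provable("flies", {"bird"}, rules, superior))     # True
```

The argumentation game layered on top of this amounts to the two parties dynamically asserting facts and rules and re-evaluating whether the keyclaim remains provable.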
Abstract:
The subject of this work is the diffusion of turbulence into a non-turbulent flow. This phenomenon can be found in almost every practical case of turbulent flow: all types of shear flows (wakes, jets, boundary layers) present some boundary between the turbulence and the non-turbulent surroundings; all transients from laminar flow to turbulence must account for turbulent diffusion; and the mixing of flows often involves the injection of a turbulent solution into a non-turbulent fluid. The mechanism of what Phillips defined as "the erosion by turbulence of the underlying non-turbulent flow" is called entrainment. It is usually considered to operate on two scales with different mechanics: small-scale nibbling, which is the entrainment of fluid by viscous diffusion of turbulence, and large-scale engulfment, which entraps large volumes of flow to be "digested" subsequently by viscous diffusion. The exact role of each in the overall entrainment rate is still not well understood, as is the interplay between these two mechanisms of diffusion. It is nevertheless accepted that the entrainment rate scales with large-scale properties of the flow, while it is not understood how the large-scale inertial behaviour can affect an intrinsically viscous phenomenon such as the diffusion of vorticity. In the present work we address the problem of turbulent diffusion through pseudo-spectral DNS simulations of the interface between a volume of decaying turbulence and a quiescent flow. Such simulations give us first-hand measurements of the velocity, vorticity and strain fields at the interface; moreover, the framework of unforced decaying turbulence permits the study of both the spatial and the temporal evolution of these fields. The analysis shows that for this kind of flow the overall production of enstrophy, i.e. the square of the vorticity, ω², is dominated near the interface by the local inertial transport of "fresh vorticity" coming from the turbulent flow. Viscous diffusion instead plays a major role in enstrophy production on the outer side of the interface, where the nibbling process is dominant. The data from our simulations seem to confirm the theory of an inertially stirred viscous phenomenon proposed by other authors and provide new data about the inertial diffusion of turbulence across the interface.
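For reference, the decomposition described above corresponds to the standard enstrophy budget (notation assumed here, not taken from the thesis): the advective part of the material derivative D/Dt carries the inertial transport, while the right-hand side separates production by vortex stretching, viscous diffusion and viscous dissipation:

```latex
% Standard budget for the enstrophy (omega^2 / 2); s_ij is the strain-rate
% tensor and nu the kinematic viscosity. Notation assumed, not from the thesis.
\[
\frac{D}{Dt}\left(\frac{\omega^2}{2}\right)
  = \omega_i s_{ij}\,\omega_j
  + \nu\,\nabla^2\!\left(\frac{\omega^2}{2}\right)
  - \nu\,\frac{\partial \omega_i}{\partial x_j}\frac{\partial \omega_i}{\partial x_j}
\]
```

In this language, "nibbling" lives in the viscous diffusion term, while the interface-region dominance reported above sits in the advective part of D/Dt.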
Abstract:
Nowadays the medical field is struggling to reduce bacterial biofilm formation, which leads to infection. The sterilization of biomedical devices has not changed over a long period of time, and this results in high costs for hospital healthcare management. The objective of this project is to investigate electric field effects and surface energy manipulation as solutions for preventing bacterial biofilm on future devices. Based on electrokinetic environments, two different methods were tested: the feasibility of an electric field gradient through media (DEP), reinforced by numerical simulations; and EWOD, through the fabrication of gold interdigitated electrodes on silicon glass substrates with a standard ~480 nm Teflon (PTFE) layer and a polymeric gasket to contain the bacterial medium. In the first experiment, a quantitative analysis was carried out to determine the forces required to reject bacteria, without considering dielectric environment limitations such as the frequency dependence of the bacteria and the medium. In the second experiment, the applied voltages were characterized by droplet contact angle measurements and put to live-bacteria tests. The project yielded promising results for the DEP application, thanks to the wide frequency range that can be used to achieve "general" bacteria rejection; in terms of practicality, however, EWOD probably has higher potential for success, but more experiments are needed to verify whether it can prevent biofilm adhesion beyond Teflon's non-adhesive properties (including limitations such as Teflon breakthrough and layer sensitivity) at incubation times longer than 24 hours.
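The quantitative DEP analysis mentioned above is commonly based on the textbook expression for the time-averaged dielectrophoretic force on a spherical particle (notation assumed here, not taken from the thesis); a negative real part of the Clausius-Mossotti factor K(ω) gives the repulsive regime needed to reject bacteria from high-field regions near the electrodes:

```latex
% Textbook time-averaged DEP force on a sphere of radius r in a medium of
% permittivity eps_m. Re[K(omega)] < 0 => negative DEP (repulsion).
\[
\mathbf{F}_{\mathrm{DEP}}
  = 2\pi\,\varepsilon_m r^3\,\mathrm{Re}\!\left[K(\omega)\right]
    \nabla\lvert\mathbf{E}_{\mathrm{rms}}\rvert^2,
\qquad
K(\omega) = \frac{\varepsilon_p^* - \varepsilon_m^*}{\varepsilon_p^* + 2\varepsilon_m^*},
\qquad
\varepsilon^* = \varepsilon - j\,\frac{\sigma}{\omega}
\]
```

The frequency dependence of ε*_p and ε*_m is exactly the "dielectric environment limitation" the abstract sets aside in its first experiment.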
Abstract:
Large-scale structures can be considered an interesting and useful "laboratory" for investigating the Universe; in particular, the filaments connecting clusters and superclusters of galaxies can be a powerful tool for this purpose, since they are not yet virialised systems. The large structures in the Universe have been studied in different bands; the present work considers the emission in the radio band. In recent years both compact and diffuse radio emission have been detected, found to be associated with single objects and with clusters of galaxies respectively. The detection of these sources is important because the radiation process is synchrotron emission, which in turn is linked to the presence of a magnetic field: studying these radio sources can therefore help in investigating the magnetic field which permeates different portions of space. Furthermore, radio emission in optical filaments has been detected recently, opening new chances to further improve our understanding of structure formation. Filaments can be seen as the net which links clusters and superclusters. This work was carried out with the aim of investigating non-thermal properties in low-density regions, looking for possible filaments associated with the diffuse emission. The analysed sources are 0917+75, located at a redshift z = 0.125, and the double cluster system A399-A401, at z = 0.071806 and z = 0.073664 respectively. Data were taken from VLA/JVLA observations, and reduced and calibrated with the AIPS package, following the standard procedure. Isocontour and polarisation maps were produced, allowing the main physical properties to be derived. Unfortunately, because of the low quality of the data for A399-A401, it was not possible to see any radio halo or bridge.
Abstract:
The first chapter of this work aims to provide a brief overview of the history of our Universe, in the context of string theory and considering inflation as its possible application to cosmological problems. We then discuss type IIB string compactifications, introducing the study of the inflaton, a scalar field that is a candidate for driving inflation. The Large Volume Scenario (LVS) is studied in the second chapter, paying particular attention to the stabilisation of the Kähler moduli, which are four-dimensional gravitationally coupled scalar fields that parameterise the size of the extra dimensions. Moduli stabilisation is the process through which these particles acquire a mass and can become promising inflaton candidates. The third chapter is devoted to the study of Fibre Inflation, an interesting inflationary model derived within the context of LVS compactifications. The fourth chapter tries to extend the slow-roll zone of the scalar potential by taking larger values of the field φ. Everything is done with the purpose of studying in detail the deviations of the cosmological observables, which can better reproduce current experimental data. Finally, we present a slight modification of Fibre Inflation based on a different compactification manifold. This new model produces larger tensor modes with a spectral index in good agreement with the data released in February 2015 by the Planck satellite.
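For context, extending the slow-roll zone means keeping the standard slow-roll parameters small at larger φ; in the usual conventions (assumed here, with M_P the reduced Planck mass and V(φ) the scalar potential), the observables mentioned above, the spectral index n_s and the tensor-to-scalar ratio r, follow from them at leading order:

```latex
% Standard slow-roll parameters and leading-order observables; conventions
% assumed, not taken from the thesis. Slow roll requires eps << 1, |eta| << 1.
\[
\epsilon = \frac{M_P^2}{2}\left(\frac{V'}{V}\right)^{2}, \qquad
\eta = M_P^2\,\frac{V''}{V}, \qquad
n_s \simeq 1 - 6\epsilon + 2\eta, \qquad
r \simeq 16\,\epsilon
\]
```

Larger tensor modes thus correspond to a larger ε along the inflaton trajectory, which is why flattening the potential at large φ directly shapes the predicted (n_s, r) pair.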
Abstract:
The goal of this master's thesis is to explain in detail the application of Non-Destructive Inspection (NDI) in the automotive and marine sectors. Nowadays these two industries face many challenges, including increased global competition, the need for higher performance, cost reduction, and tighter environmental and safety requirements. The materials used for these applications play key roles in overcoming these challenges, so an NDI procedure also needs to be planned in order to avoid problems during the manufacturing process and the after-sales life of the structures. The entire thesis work was done in collaboration with Vetorix Engineering.
Abstract:
The study arose from the need to verify the application of certain points of ISO/TS 16949:2009 in the company's pressing department. It was important to assess the employees' level of awareness of quality issues in an automotive company. Therefore, starting from the standard, an Awareness Survey was conducted among all employees whose work affects the quality of the production process and of the finished product. It was also necessary to check whether the logistics of the pressing department were adequate to allow operators to best carry out self-inspection, i.e., in-process checking of geometric and dimensional measurements during series production. The adequacy of the load units used to move products was verified, both at the logistics level and in terms of the workload of those who must move them. It emerged that the load units currently in use, i.e., manually moved trolleys, cannot be replaced, but they can be made more manageable by changing their wheels. The workload of the pressing department was studied to assess whether measures could be introduced to let department operators carry out in-process checks more accurately, hypothesizing the introduction of a specific role, the Self-Inspection Operator, who would deal only with this task, relieving the other operators of it. Finally, the problem of reducing the use of forklifts inside the department was addressed. Their use proved necessary to guarantee the great flexibility of this company's production, but it was proposed to split the department into two separate zones when the machine fleet is renewed, so as to eliminate forklift use in one of the two zones and concentrate it in the other, making forklifts indispensable only for a specific type of production carried out in a well-defined area of the department.
Abstract:
The increasing attention to environmental issues in recent times encourages us to find new methods for producing energy from renewable sources, and to improve existing ones by increasing their energy yield. Most waste and agricultural residues, with a high content of lignin and non-hydrolysable polymers, cannot be effectively transformed into biofuels with existing technology. The purpose of the study was to develop a new thermochemical/biological process (named Py-AD) for the valorization of scarcely biodegradable substances. A complete continuous prototype was designed, built and run for one year. It consists of a slow pyrolysis system coupled with two sequential digesters, and was shown to produce a clean pyrobiogas (a biogas with a significant amount of C2-C3 hydrocarbons and residual CO/H2), biochar and bio-oil. Py-AD yielded 31.7% w/w biochar, 32.5% w/w oil and 24.8% w/w pyrobiogas. The oil condensate obtained was fractionated into its aqueous and organic fractions (87% and 13% respectively). Subsequently, the anaerobic digestion of the aqueous fraction was tested in a UASB reactor, for 180 days, at increasing organic loading rate (OLR). The maximum convertible concentration without instability phenomena, and with complete degradation of the pyrogenic chemicals, was 1.25 g COD per litre of digester per day. The final yield of biomethane was equal to 40% of the theoretical yield, with a noticeable additional production of volatile fatty acids equal to 20%. The final results confirm that anaerobic digestion can be used as a useful tool for cleaning slow pyrolysis products (both the gas and the condensable fraction) and for obtaining a relatively clean pyrobiogas that could be used directly in an internal combustion engine.
Abstract:
Nowadays leukodepletion is one of the most important processes performed on blood in order to reduce the risk of transfusion diseases. It can be performed with different techniques, but the most popular is filtration, owing to its simplicity and efficiency. This work aims at improving a current commercial product by developing a new filter based on a Fenton-type reaction to cross-link a hydrogel onto the base material. Filters for leukodepletion are preferably made with the melt-flow technique, resulting in a non-woven fabric; the functionalization should increase the stability of the filter, restricting the extraction of substances to a minimum when in contact with blood. Through this modification the filters can acquire new properties, including wettability, surface charge and good resistance to extraction. The most important property for leukodepletion is the surface charge, due to the nature of the filtration process. The results for all modified samples were compared to the commercial product. Three different polymers (A, B and C) were studied for the filter modifications, and every modified filter was tested in order to determine its properties.
Abstract:
Extracellular vesicles are membrane-bound, lipid-based nanoparticles in the size range of 30 to 1000 nm, released by a plethora of cells. Their prime function is cellular communication, but in recent studies the potential of these vesicles to sustain physiological and pathological processes, together with their nano-sized constituents, has opened the door to applications in the therapeutics and diagnostics of a variety of diseases such as cancer. Their main constituents include lipids, proteins and RNAs, and they are categorized into subtypes such as exosomes, microvesicles and apoptotic bodies. In recent studies, extracellular vesicles derived from plants have gained high regard due to advantages such as safety, non-toxicity and high availability, which favour large-scale production. EVs are isolated from mammalian and plant cells using a multitude of techniques, such as ultracentrifugation (UC), size exclusion chromatography (SEC), precipitation and so on. Owing to the variety of sources, as well as the shortcomings of each isolation method, a scalable and inexpensive EV isolation method is yet to be designed. This study focuses on the isolation of EVs from citrus lemon juice through diafiltration. Lemon is a promising source due to its biological properties as an antioxidant, anticancer and anti-inflammatory agent, and lemon-derived vesicles have been shown to carry several proteins analogous to those of mammalian vesicles. Diafiltration can be carried out for the successful removal of impurities, and it is a scalable, continuous technique with potentially lower process times. The concentrations of the purified product and of the impurities are analysed using size exclusion chromatography in analytical mode. It is also considered imperative to compare the results from diafiltration with the gold-standard UC. BCA is proposed to evaluate the total protein content, and DLS for size measurements. Finally, the ideal mode of storage of EVs, protecting their contents and structure, is analysed with storage tests.
Abstract:
The rate at which petroleum-based plastics are produced, used and thrown away increases every year with the growth of the global population. Polyhydroxyalkanoates (PHAs) can represent a valid alternative to petroleum-based plastics: they are biodegradable polymers that can be produced by some microorganisms as intracellular reserves. The current problem is the production cost of these bioplastics, which is still not competitive compared to that of petroleum-based plastics. Mixed microbial cultures can be fed with substrates obtained from the acidogenic fermentation of carbon-rich wastes, such as cheese whey, municipal effluents and various kinds of food waste, which have a low or sometimes even nonexistent cost; in this way wastes can be valorized instead of being discharged. The process consists of three phases: acidogenic fermentation, in which the substrate is obtained; culture selection, in which a PHA-storing culture is selected and enriched by eliminating organisms that do not show this property; and accumulation, in which the culture is fed until it reaches its maximum storage capacity. In this work the possibility of making the process cheaper was explored by trying to couple the selection and accumulation steps. A halotolerant culture collected from seawater was used, fed with an artificially salted synthetic substrate consisting of an aqueous solution containing a mixture of volatile fatty acids, in order to also explore whether its performance allows it to treat substrates derived from saline effluents, since these streams cannot be treated properly by the bacteria found in activated sludge plants due to inhibition caused by high salt concentrations. By generating and selling the PHAs produced by these bacteria, it could be possible to lower, offset or even exceed the costs associated with a new section of a treatment plant dedicated to saline effluents.