870 results for geopolymers, cultural heritage, kaolin, metakaolin, chemical-physical characterization
Abstract:
Analysis of modal logics with partially ordered quantifiers.
Abstract:
MFA and LCA methodologies were applied to analyse the anthropogenic aluminium cycle in Italy, focusing on the historical evolution of stocks and flows of the metal, embodied GHG emissions, and recycling potentials, in order to provide Italy with key elements for prioritizing industrial policy toward low-carbon technologies and materials. Historical time series were collected from 1947 to 2009 and balanced with data from the production, manufacturing and waste management of aluminium-containing products, using a ‘top-down’ approach to quantify the contemporary in-use stock of the metal and to identify ‘applications where aluminium is not yet being recycled to its full potential and to identify present and future recycling flows’. The MFA results were used as the basis for an LCA aimed at evaluating the evolution of the carbon footprint embodied in Italian aluminium, from primary and electrical energy, the smelting process and transportation. The study also discusses how the main factors of the Kaya Identity influenced the Italian GHG emissions pattern over time, and which levers are available to mitigate it. The contemporary anthropogenic reservoir of aluminium was estimated at about 320 kg per capita, mainly embedded in the transportation and building and construction sectors. The cumulative in-use stock represents approximately 11 years of supply at current usage rates (about 20 Mt versus 1.7 Mt/year), implying a potential saving of about 160 Mt of CO2eq emissions. A discussion of the criticalities of aluminium waste recovery from the transportation and the containers and packaging sectors is also included, providing an example of how MFA and LCA may support decision-making at the sectoral or regional level. The research constitutes the first attempt at an integrated MFA-LCA approach applied to the aluminium cycle in Italy.
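The Kaya Identity invoked above is a standard decomposition of national CO2 emissions; a minimal statement of it, with symbols as commonly defined (not taken from the thesis itself), is:

```latex
F \;=\; P \cdot \frac{G}{P} \cdot \frac{E}{G} \cdot \frac{F}{E}
```

where F is CO2 emissions, P population, G GDP and E primary energy consumption, so that the three ratios are per-capita GDP, energy intensity of the economy and carbon intensity of energy, the levers through which an emissions pattern can evolve over time.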
Abstract:
In this work, new tools for atmospheric pollutant sampling and analysis were applied to deepen source apportionment studies. The project focused mainly on the study of atmospheric emission sources in a suburban area influenced by a municipal solid waste incinerator (MSWI), a medium-sized coastal tourist town and a motorway. Two main research lines were followed. In the first, the potential of PM samplers coupled with a wind-select sensor was assessed. Results showed that they can be a valid support in source apportionment studies, although meteorological and territorial conditions may strongly affect the results. Moreover, new markers were investigated, with a particular focus on biomass burning processes. OC proved to be a good indicator of biomass combustion, as did all the organic compounds determined. Among metals, lead and aluminium are well correlated with biomass combustion. Surprisingly, PM was not enriched in potassium during the bonfire event. The second research line consisted of the application of Positive Matrix Factorization (PMF), a new statistical tool for data analysis, to datasets with different time resolutions. PMF applied to atmospheric deposition fluxes identified six main sources affecting the area; the incinerator’s relative contribution appeared negligible. PMF analysis was then applied to PM2.5 collected with samplers coupled with a wind-select sensor. The larger number of environmental indicators determined made it possible to obtain more detailed results on the sources affecting the area. Vehicular traffic emerged as the source of greatest concern for the study area; again, the incinerator’s relative contribution appeared negligible. Finally, the application of PMF analysis to hourly aerosol data demonstrated that the higher the temporal resolution of the data, the closer the source profiles were to the real ones.
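Positive Matrix Factorization, as introduced by Paatero and Tapper, decomposes the measured data matrix into source contributions and source profiles; a generic statement of the model (notation mine, not necessarily that of the thesis) is:

```latex
x_{ij} \;=\; \sum_{k=1}^{p} g_{ik}\, f_{kj} \;+\; e_{ij},
\qquad
\min_{G,\,F \,\ge\, 0}\; Q \;=\; \sum_{i}\sum_{j} \left( \frac{e_{ij}}{\sigma_{ij}} \right)^{2}
```

where x_ij is the concentration of species j in sample i, g_ik the contribution of source k to sample i, f_kj the profile of source k, and sigma_ij the measurement uncertainty. The nonnegativity constraint on G and F, together with the uncertainty weighting, is what distinguishes PMF from ordinary factor analysis and makes the resolved profiles physically interpretable as sources.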
Abstract:
It is well known that many realistic mathematical models of biological systems, such as cell growth, cellular development and differentiation, gene expression, gene regulatory networks, enzyme cascades, synaptic plasticity, aging and population growth, need to include stochasticity. These systems are not isolated but subject to intrinsic and extrinsic fluctuations, which lead to a quasi-equilibrium state (homeostasis). The natural framework is provided by Markov processes, and the Master equation (ME) describes the temporal evolution of the probability of each state, specified by the number of units of each species. The ME is a relevant tool for modeling realistic biological systems and also allows the behavior of open systems to be explored. These systems may exhibit not only classical thermodynamic equilibrium states but also nonequilibrium steady states (NESS). This thesis deals with biological problems that can be treated with the Master equation and with its thermodynamic consequences. It is organized into six chapters containing four new scientific works, grouped in two parts. (1) Biological applications of the Master equation: the stochastic properties of a toggle switch, involving a protein compound and a miRNA cluster, known to control the eukaryotic cell cycle and possibly involved in oncogenesis; and the proposal of a one-parameter family of master equations for the evolution of a population having the logistic equation as its mean-field limit. (2) Nonequilibrium thermodynamics in terms of the Master equation: we study the dynamical role of the chemical fluxes that characterize the NESS of a chemical network, and we propose a one-parameter parametrization of BCM learning, originally introduced to describe plasticity processes, to study the differences between systems in detailed balance (DB) and in NESS.
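For reference, the Master equation for a Markov jump process, in a standard form consistent with the description above (notation mine), reads:

```latex
\frac{\mathrm{d}P_n(t)}{\mathrm{d}t} \;=\; \sum_{m \neq n} \Big[ W_{nm}\, P_m(t) \;-\; W_{mn}\, P_n(t) \Big]
```

where P_n(t) is the probability of finding the system in state n (the vector of species counts) and W_{nm} is the transition rate from state m to state n. A NESS is a stationary solution in which the bracketed terms do not vanish pairwise: detailed balance is broken and net probability fluxes persist around the state space, which is the situation studied in the second part of the thesis.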
Abstract:
The thesis concerns X-ray spectrometry for both medical and space applications and is divided into two sections. The first section addresses an X-ray spectrometric system designed to study radiological beams and is devoted to the optimization of diagnostic procedures in medicine. A parametric semi-empirical model capable of efficiently reconstructing diagnostic X-ray spectra on mid-range computers was developed and tested. In addition, different silicon diode detectors were tested as real-time detectors in order to provide a real-time evaluation of the spectrum during diagnostic procedures. This project contributes to the field by presenting an improved simulation of a realistic X-ray beam emerging from a common X-ray tube, with a complete and detailed spectrum that lends itself to further studies of added filtration, thus providing an optimized beam for different diagnostic applications in medicine. The second section describes the preliminary tests carried out on the first version of an Application Specific Integrated Circuit (ASIC), integrated with a large-area position-sensitive Silicon Drift Detector (SDD), to be used on board future space missions. This technology has been developed for the ESA project LOFT (Large Observatory for X-ray Timing), a new medium-class space mission that the European Space Agency has been assessing since February 2011. The LOFT project was proposed as part of the Cosmic Vision Program (2015-2025).
Abstract:
Air quality represents a key issue in so-called pollution “hot spots”: environments in which anthropogenic sources are concentrated and the dispersion of pollutants is limited. One of these environments, the Po Valley, regularly experiences exceedances of the PM10 and PM2.5 concentration limits, especially in winter, when the ventilation of the lower layers of the atmosphere is reduced. This thesis highlights the chemical properties of particulate matter and fog droplets in the Po Valley during the cold season, when fog occurrence is very frequent. Fog-particle interactions were investigated with the aim of determining their impact on regional air quality. Size-segregated aerosol samples were collected in Bologna (urban site) and San Pietro Capofiume (SPC, rural site) during two campaigns (November 2011; February 2013) within the Supersito project. The comparison of particle size distribution and chemical composition at the two sites showed the relevant contribution of the regional background and of secondary processes in determining aerosol concentrations in the Po Valley. The occurrence of fog during the November 2011 campaign at SPC made it possible to investigate the role of fog formation and fog chemistry in the formation, processing and deposition of PM10. Nucleation scavenging was investigated in relation to the size and chemical composition of the particles. We found that PM1 concentration is reduced by up to 60% because of fog scavenging. Furthermore, aqueous-phase secondary aerosol formation mechanisms were investigated through time-resolved measurements. At SPC, fog samples have been systematically collected and analysed since the nineties, and a database spanning 20 years has been assembled. This thesis reports for the first time the results of this long time series of measurements, showing a decrease in sulphate and nitrate concentrations and an increase in pH, which reached values close to neutrality. A detailed discussion of the changes in fog water composition over the two decades is presented.
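The scavenging effect quantified above (up to 60% for PM1) is commonly expressed through a scavenging efficiency; a standard definition (not necessarily the exact metric used in the thesis) is:

```latex
\varepsilon_i \;=\; 1 \;-\; \frac{C_i^{\mathrm{fog}}}{C_i^{\mathrm{pre\text{-}fog}}}
```

where C_i^fog and C_i^pre-fog are the interstitial (in-fog) and pre-fog concentrations of species or size fraction i, so that epsilon of roughly 0.6 corresponds to the reported 60% PM1 reduction.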
Abstract:
The aim of this work is to implement an operational methodology for designing air quality monitoring networks and measurement campaigns using mobile laboratories, optimizing the positions of the sampling devices with respect to different objectives and selection criteria. A review and analysis of the approaches and indications provided by the reference legislation and by the authors of various scientific works led to a methodological approach consisting of two main operational phases, which was applied to a case study represented by the territory of the province of Ravenna. The implemented methodology integrates numerous tools supporting the assessment of air quality and of the effects that atmospheric pollutants can generate on specific sensitive receptors (resident population, vegetation, material assets). In particular, the methodology integrates approaches for the disaggregation of emission inventories through the use of proxy variables, modeling tools for simulating the dispersion of pollutants in the atmosphere, and algorithms for the allocation of monitoring instruments through the maximization (or minimization) of specific objective functions. The allocation procedure was automated through the development of a software tool which, by means of a graphical query interface, makes it possible to identify optimal areas for carrying out the various monitoring campaigns.
Abstract:
In this thesis, a framework was developed for the combined use of two impact assessment methodologies, LCA and RA, for emerging technologies. The originality of the study lies in having proposed and also applied the framework to a case study, in particular an innovative refrigeration technology based on nanofluids (NF), developed by partners of the European project Nanohex, who collaborated on the studies, especially as regards the inventory of the necessary data. The complexity of the study lies both in the difficult integration of two methodologies created for different purposes and structured to fulfil those purposes, and in the application sector which, although expanding rapidly, suffers from severe gaps in information about production processes and the behaviour of the substances involved. The framework was applied to the production of alumina nanofluid by two production routes (single-stage and two-stage) in order to assess and compare the impacts on human health and the environment. It should be noted that the LCA was quantitative but did not consider the impacts of nanomaterials (NM) in the toxicity categories. As regards the RA, a qualitative study was developed because of the above-mentioned lack of toxicological and exposure parameters; its focus was the category of workers, and the assumption was therefore made that releases into the environment during the production phase are negligible. For the qualitative RA a specific software tool was used, Stoffenmanager Nano, which makes it possible to prioritize the risks associated with inhalation in the workplace. The framework consists of a procedure organized in four phases: DEFINITION OF THE TECHNOLOGICAL SYSTEM, DATA COLLECTION, RISK ASSESSMENT AND IMPACT QUANTIFICATION, INTERPRETATION.
Abstract:
The ever greater importance given to place, above all by legislation through the Code of Cultural Heritage and Landscape, obliges designers to pay more attention to the context in which they operate. Places can no longer be conceived as neutral spaces capable of accommodating any form of project; they must be studied and understood in their deepest essence. The concept of Genius loci comes to our aid: since Roman times it has presided over places, obliging man to come to terms with it before any design practice. Over time this concept has been transformed and has changed meaning, coming to coincide with the identity of a given place. By place we mean a complex sum of several inseparable elements that interact in constructing its specific identity. Understanding and respecting the identity of a place means understanding and respecting the Genius loci. The guiding thread of this discussion is Christian Norberg-Schulz's essay “Genius loci. Paesaggio ambiente architettura”, in which the themes of place and identity are treated mainly from an architectural point of view. Starting from this, I have tried to highlight these concepts in three projects developed at different scales, pointing out their application and the related issues in three different settings. The projects examined are: in a rural setting, the ecovillage developed at San Biagio; in an urban setting, the requalification of an industrial area in Forlimpopoli; in a metropolitan setting, the collective housing project in Bogotá.
Abstract:
Theories and numerical modeling are fundamental tools for understanding, optimizing and designing present and future laser-plasma accelerators (LPAs). Laser evolution and plasma wave excitation in an LPA driven by a weakly relativistic, short-pulse laser propagating in a preformed parabolic plasma channel are studied analytically in 3D, including the effects of pulse steepening and energy depletion. At higher laser intensities, the process of electron self-injection in the nonlinear bubble wake regime is studied by means of fully self-consistent Particle-in-Cell (PIC) simulations. Considering a non-evolving laser driver propagating with a prescribed velocity, the geometrical properties of the non-evolving bubble wake are studied. For a range of parameters of interest for laser-plasma acceleration, the dependence of the self-injection threshold in the non-evolving wake on laser intensity and wake velocity is characterized. Due to the nonlinear and complex nature of the physics involved, computationally challenging numerical simulations are required to model laser-plasma accelerators operating at relativistic laser intensities. The numerical and computational optimizations that, combined in the codes INF&RNO and INF&RNO/quasi-static, make it possible to accurately model multi-GeV laser wakefield acceleration stages on present supercomputing architectures are discussed. The PIC code jasmine, capable of efficiently running laser-plasma simulations on clusters of Graphics Processing Units (GPUs), is presented. GPUs deliver exceptional performance to PIC codes, but the core algorithms had to be redesigned to satisfy the constraints imposed by the intrinsic parallelism of the architecture. The simulation campaigns run with the code jasmine to model recent LPA experiments with the INFN-FLAME and CNR-ILIL laser systems are also presented.
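To fix the scales involved, the size of the accelerating structure in an LPA is set by the plasma wavelength, which follows from the plasma frequency; in SI units these are standard textbook relations, not results specific to this thesis:

```latex
\omega_p = \sqrt{\frac{n_e e^2}{\varepsilon_0 m_e}},
\qquad
\lambda_p = \frac{2\pi c}{\omega_p} \;\approx\; \frac{3.3\times 10^{10}}{\sqrt{n_e\,[\mathrm{cm}^{-3}]}}\;\mu\mathrm{m}
```

For a typical density of n_e = 10^18 cm^-3 this gives a plasma wavelength of about 33 micrometres, the scale on which the bubble wake and the self-injection dynamics develop.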
Abstract:
This thesis provides a thorough theoretical background in network theory and shows novel applications to real problems and data. In the first chapter a general introduction to network ensembles is given, and the relations with “standard” equilibrium statistical mechanics are described. Moreover, an entropy measure is used to analyze the statistical properties of integrated PPI-signalling-mRNA expression networks in different cases. In the second chapter multilayer networks are introduced to evaluate and quantify the correlations between real interdependent networks, and multiplex networks describing citation-collaboration interactions and patterns in colorectal cancer are presented. The last chapter is dedicated to control theory and its relation with network theory. We characterize how the structural controllability of a network is affected by the fraction of low in-degree and low out-degree nodes. Finally, we present a novel approach to the controllability of multiplex networks.
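Structural controllability of a directed network is usually assessed via the minimum-inputs theorem of Liu, Slotine and Barabási: the number of driver nodes equals N minus the size of a maximum matching of the directed edges, computed on the bipartite out-copy/in-copy representation of the graph. The following is a minimal sketch under that framework (illustrative code, not the thesis's implementation):

```python
def maximum_matching(nodes, edges):
    """Size of a maximum matching in the bipartite out-copy/in-copy
    representation of a directed graph, via augmenting paths."""
    succ = {u: [] for u in nodes}
    for u, v in edges:
        succ[u].append(v)
    match = {}  # in-copy node -> out-copy node currently matched to it

    def augment(u, seen):
        for v in succ[u]:
            if v not in seen:
                seen.add(v)
                # v is free, or its current partner can be re-matched elsewhere
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return sum(1 for u in nodes if augment(u, set()))

def n_driver_nodes(nodes, edges):
    """Minimum number of driver nodes for structural controllability:
    max(N - |maximum matching|, 1)."""
    return max(len(nodes) - maximum_matching(nodes, edges), 1)
```

A directed chain 1 -> 2 -> 3 is controllable from a single driver (its head), while a star 1 -> {2, 3, 4} needs three drivers, because every unmatched node must receive an independent input; this is exactly the mechanism through which the fraction of low in-degree and low out-degree nodes constrains controllability.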
Abstract:
This study analyses the public administration charged with the protection and valorization of cultural heritage, in a comparative perspective between the two countries. The research is divided into two parts. The first analyses the legal solutions adopted by the various regimes over the centuries, examining the legal response given historically to the social problem of conserving cultural heritage and seeking to understand the policies that the administration eventually adopted. This historical journey ends with the legislation currently in force, and with an analysis of the immediately preceding legislation, which laid the foundations of the current structure of the public administration that exercises the competences relating to the protection of cultural heritage. The study continues with a second part in which, over two chapters, the legislation regulating the organization of the public administration of both Italy and Spain is examined. For each country, all territorial levels are analysed, as well as the autonomous bodies or institutions created within the public bodies of each State. The third chapter offers a comparison and a critique of the two organizations. The two structures have at times arisen from a common starting point and, despite having evolved differently according to the specificities of each country, still present similarities. Moreover, both countries have recently been adversely affected by policies of public spending cuts, which have led to a downsizing of the public administration, although not as satisfactorily as we had hoped. The study ends with the conclusions drawn from a thorough analysis of both administrations.
Abstract:
In recent decades, research in cultural heritage management has adopted web technologies as privileged tools for establishing new approaches and directions in the valorization of knowledge. This thesis is situated in the interdisciplinary area between the humanities and computer science, and is founded on an awareness of the mutual enrichment that can derive from their continuous dialogue: the former gaining more expressive and popular means to disseminate their heritage, the latter benefiting from authoritative “raw material” (that is, structured data of high quality and high trustworthiness) for experimentation. The study of the points of contact between the disciplines starts from two specific areas, namely computer applications in the archival field and the developments of the semantic web in the digital humanities.
Abstract:
In this work, the well-known MC code FLUKA was used to simulate the GE PETtrace cyclotron (16.5 MeV) installed at the “S. Orsola-Malpighi” University Hospital (Bologna, IT) and routinely used in the production of positron-emitting radionuclides. The simulations yielded estimates of various quantities of interest, including: the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and of the vault walls; the activation of the ambient air, in particular the production of 41Ar; and the saturation yields of radionuclides used in nuclear medicine. The simulations were validated against experimental measurements in terms of the physical and transport parameters to be used in the energy range of interest in the medical field. The validated model was then used extensively in several practical applications, including: the direct cyclotron production of non-standard radionuclides such as 99mTc; the production of medical radionuclides at the TRIUMF (Vancouver, CA) TR13 cyclotron (13 MeV); the complete design of the new PET facility of the “Sacro Cuore – Don Calabria” Hospital (Negrar, IT), including the ACSI TR19 (19 MeV) cyclotron; the dose field around the energy selection system (degrader) of a proton therapy cyclotron; the design of plug-doors for a new cyclotron facility in which a 70 MeV cyclotron will be installed; and the partial decommissioning of a PET facility, including the replacement of a Scanditronix MC17 cyclotron with a new TR19 cyclotron.
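The saturation yields mentioned above feed into the standard activation equation for cyclotron production, A(t) = A_sat (1 - e^(-lambda t)); a small sketch of that relation (function names, units and the example numbers are illustrative, not taken from the thesis):

```python
import math

def saturation_factor(t_irr, half_life):
    """Fraction of the saturation activity reached after irradiating
    for t_irr (expressed in the same time unit as half_life)."""
    lam = math.log(2.0) / half_life          # decay constant
    return 1.0 - math.exp(-lam * t_irr)

def eob_activity(sat_yield, beam_current, t_irr, half_life):
    """End-of-bombardment activity, with sat_yield in MBq/uA and
    beam_current in uA (hypothetical units for illustration)."""
    return sat_yield * beam_current * saturation_factor(t_irr, half_life)
```

Irradiating for one half-life reaches 50% of saturation and for two half-lives 75%, which is why short-lived PET radionuclides are typically produced in runs lasting only a few half-lives.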
Abstract:
During the PhD program in chemistry (environmental chemistry curriculum) at the University of Bologna, the sustainability of industry was investigated through the application of the LCA methodology. The effort focused on the chemical sector, investigating reactions that follow the Green Chemistry and Green Engineering principles and evaluating their sustainability against traditional pathways from a life cycle perspective. The environmental benefits associated with a reduction in synthesis steps and the use of renewable feedstocks were assessed through a holistic approach, selecting two case studies of high industrial relevance: the synthesis of acrylonitrile and the production of acrolein. The approach is intended both as a standardized application of the LCA methodology to the chemical sector, which could be extended to several case studies, and as an improvement of the current databases, since the lack of data to fill the inventories of chemical productions is a major limitation, difficult to overcome, that can negatively affect the results of such studies. The results of the analyses confirm that sustainability in the chemical sector should be evaluated with a cradle-to-gate approach, considering all the stages and flows involved in each pathway in order to avoid shifting environmental burdens from one step to another. Moreover, where possible, LCA should be supported by other tools able to investigate the other two dimensions of sustainability, namely social and economic issues.
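The cradle-to-gate accounting argued for above amounts to summing each impact category over every stage of the pathway, so that a burden cannot silently migrate between stages when two routes are compared; a toy sketch with invented numbers (not the thesis's inventory data):

```python
def cradle_to_gate(stage_impacts):
    """Total impact per functional unit as the sum over all life-cycle
    stages up to the factory gate, e.g. kg CO2-eq per kg of product."""
    return sum(stage_impacts.values())

# Hypothetical stage breakdown for a generic synthesis route.
route = {
    "feedstock extraction": 1.2,
    "transport": 0.1,
    "synthesis and purification": 0.9,
}
```

Comparing two routes on the synthesis stage alone could invert their ranking once feedstock extraction is included, which is precisely the burden-shifting the cradle-to-gate boundary is meant to expose.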