Abstract:
The project proposes a study to verify the feasibility of a territorial plan (conceived for the Po river basin but in fact extendable to all river basins) for the creation of a supply chain of bioenergy crops (biomass) which, transported by inland navigation (one of the transport modes with the lowest CO2 emissions), feed one or more new-technology power plants that combine the production of heat (district heating and cooling) and energy with flue-gas separation. The CO2 captured by the growing biomass and recovered from combustion can then be sequestered in the subsoil of subsiding coastal areas, counteracting land subsidence. By yielding benefits at every stage of the territorial plan (launch of bioenergy agriculture, revival of free-flow river navigation, start-up of an economy linked to the logistics of biomass transport and storage, generation of clean energy, fight against subsidence), the project in fact makes it possible to capture large quantities of CO2 from the atmosphere and sequester them underground, reducing the greenhouse effect. During the doctorate, a methodology was developed for assessing the economic and environmental sustainability of applying the project to a river basin, consisting of a set of forms for collecting baseline data and a computerized analysis procedure.
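The kind of chain-level accounting such a sustainability assessment rests on can be sketched as a simple net-CO2 balance. All figures below are hypothetical placeholders, not results from the thesis:

```python
# Illustrative net-CO2 balance for a biomass-to-sequestration chain.
# Every numeric value here is an invented example, not thesis data.

def net_co2_sequestered(biomass_t, fixed_t_co2_per_t,
                        transport_t_co2_per_t, capture_efficiency):
    """Tonnes of CO2 removed per year: CO2 fixed by the crops, scaled by
    the share recovered from the flue gas, minus transport emissions."""
    fixed = biomass_t * fixed_t_co2_per_t          # CO2 taken up by the crops
    recovered = fixed * capture_efficiency         # share captured at the plant
    emitted = biomass_t * transport_t_co2_per_t    # river-transport emissions
    return recovered - emitted

# 100,000 t/y of biomass, 1.8 t CO2 fixed per t, river transport at
# 0.03 t CO2 per t moved, 85% of flue-gas CO2 captured (all assumed).
balance = net_co2_sequestered(100_000, 1.8, 0.03, 0.85)
print(f"Net CO2 sequestered: {balance:,.0f} t/y")
```

A real assessment would add farming inputs, plant construction and injection energy to the emission side; the point is only that every stage enters the balance with its own sign.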
Abstract:
The present work is a historical survey of Gestalt trends in psychological research between the late 19th century and the first half of the 20th century, with particular reference to sound and musical perception, carried out through a reconsideration of the experimental and theoretical literature. Ernst Mach and Christian von Ehrenfels gave rise to the debate about Gestaltqualität, which grew notably thanks to the ‘Graz School’ (Alexius Meinong, Stephan Witasek, Anton Faist, Vittorio Benussi), where the object theory and the production theory of perception were worked out. Carl Stumpf’s research on Tonpsychologie and Franz Brentano’s tradition of ‘act psychology’ were directly involved in this debate, in opposition to Wilhelm Wundt’s conception of the discipline; this came clearly to light in Stumpf’s controversy with Carl Lorenz and Wundt on Tondistanzen. Stumpf’s concept of Verschmelzung and his views on consonance and concordance led him into disputes with Theodor Lipps and Felix Krueger that lasted more than two decades. During his teaching at the Berlin University, Stumpf educated a new generation of scholars: his pupils Wolfgang Köhler, Kurt Koffka and Max Wertheimer established the so-called ‘Berlin School’ and promoted official Gestalt theory from the 1910s onwards. From 1922 until 1938 they founded and led, together with other distinguished scientists, the «Psychologische Forschung», a scientific journal in which the ‘Gestalt laws’ and many other acoustical studies on various themes (such as sound localization, successive comparison, and phonetic phenomena) were published. During the 1920s Erich Moritz von Hornbostel made important contributions towards the definition of an organic Tonsystem in which sound phenomena could find an adequate arrangement. The last section of the work describes Albert Wellek’s studies, Kurt Huber’s vowel researches, and aspects of melody perception, apparent movement and the phi-phenomenon in the acoustical field.
The work also contains some considerations on the relationships among tone psychology, musical psychology, Gestalt psychology, musical aesthetics and musical theory. Finally, the way Gestalt psychology changed earlier interpretations is exemplified by the decisive renewal of perception theory, the abandonment of the Konstanzannahme, and some repercussions on the theory of meaning as organization and on feelings in musical experience.
Abstract:
Numerous studies recently completed in Europe and overseas have focused attention on the problems caused by freight transport in urban areas and have helped identify possible solutions (city logistics). Urban areas should ideally be places to live and to carry out economic, social and recreational activities. Their suitability for these purposes can be compromised, among other factors, by growing freight traffic, which is carried mainly by road because of the short distances involved and infrastructural shortcomings. Commercial vehicles, except for those of the latest generation, negatively affect the quality of the urban environment, generating air and noise pollution. The “just in time” policy, which dispenses with warehouses for goods storage, increases commercial movements. This thesis addresses some logistical aspects of parking and access regulation for freight vehicles that can make goods distribution more efficient, mitigating the problems caused by traffic and thus safeguarding the quality of life in city centres.
Abstract:
Environmental management includes many components, among which are Environmental Management Systems (EMS), environmental reporting and analysis, environmental information systems and environmental communication. In this work two applications are presented: the development and implementation of an Environmental Management System in local administrations, according to the European scheme "EMAS", and the analysis of a territorial energy system through scenario building and environmental sustainability assessment. Both applications share the same objective, the quest for more scientifically sound elements; in fact, both EMS and energy planning are often characterized by localism and poor comparability. Emergy synthesis, proposed by the ecologist H.T. Odum and described in his book "Environmental Accounting: Emergy and Environmental Decision Making" (1996), has been chosen and applied as an environmental evaluation tool, in order to complete the analysis with an assessment of the "global value" of goods and processes. In particular, emergy synthesis has been applied to improve the evaluation of the significance of environmental aspects in an EMS, and to evaluate the environmental performance of three scenarios of future evolution of the energy system. Regarding EMS, this work discusses the application of an EMS together with the CLEAR methodology for environmental accounting, in order to improve the identification of environmental aspects; data on the environmental aspects, and the significant ones, for 4 local authorities are also presented, together with a preliminary proposal for integrating the assessment of the significance of environmental aspects with emergy synthesis.
Regarding the analysis of an energy system, this work presents the characterization of the current situation together with the overall energy balance and the evaluation of greenhouse gas emissions; moreover, three scenarios of future evolution are described and discussed. The scenarios were built with the support of the LEAP software ("Long-range Energy Alternatives Planning system" by SEI, the Stockholm Environment Institute). Finally, the emergy synthesis of the current situation and of the three scenarios is shown.
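The scenario accounting described above can be sketched, in a much-reduced form, as final energy demand by carrier multiplied by emission factors. Carriers, demand figures and factors below are invented for illustration and are not the thesis data or the actual LEAP model:

```python
# Toy scenario accounting in the spirit of an energy-balance model:
# final demand per energy carrier x emission factor -> CO2-eq emissions.
# All numbers are invented example values.

EMISSION_FACTORS = {  # t CO2-eq per MWh (assumed values)
    "electricity": 0.45,
    "natural_gas": 0.20,
    "diesel": 0.27,
}

def scenario_emissions(demand_mwh):
    """Total t CO2-eq for a scenario's final demand {carrier: MWh}."""
    return sum(mwh * EMISSION_FACTORS[c] for c, mwh in demand_mwh.items())

business_as_usual = {"electricity": 900_000, "natural_gas": 1_200_000, "diesel": 400_000}
efficiency_case   = {"electricity": 800_000, "natural_gas":   900_000, "diesel": 300_000}

for name, demand in [("BAU", business_as_usual), ("Efficiency", efficiency_case)]:
    print(f"{name}: {scenario_emissions(demand):,.0f} t CO2-eq")
```

Comparing scenarios then reduces to comparing totals computed from the same factor table, which is what makes the results comparable across cases.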
Abstract:
The main objective of this thesis was the chemical characterization of synthetic secondary organic aerosol (SOA) produced from atmospherically relevant anthropogenic and biogenic VOCs during reaction chamber experiments. In parallel, the chemical features of this laboratory SOA were used to interpret the composition of ambient samples of atmospheric fine particulate matter collected at several sites in Europe, in order to determine the fraction of ambient organic aerosol mass accounted for by biogenic and anthropogenic SOA.
Abstract:
The upgrade of the CERN accelerator complex has been planned in order to further increase the LHC performance in exploring new physics frontiers. One of the main limitations to the upgrade is represented by collective instabilities. These are intensity-dependent phenomena triggered by the electromagnetic fields excited by the interaction of the beam with its surroundings. These fields are represented via wake fields in the time domain or impedances in the frequency domain. Impedances are usually studied assuming ultrarelativistic bunches, while we mainly explored the low and medium energy regimes of the LHC injector chain. In a non-ultrarelativistic framework we carried out a complete study of the impedance structure of the PSB, which accelerates proton bunches up to 1.4 GeV. We measured the imaginary part of the impedance, which creates a betatron tune shift. We introduced a parabolic bunch model which, together with dedicated measurements, allowed us to point to the resistive wall impedance as the source of one of the main PSB instabilities. These results are particularly useful for the design of efficient transverse instability dampers. We developed a macroparticle code to study the effect of space charge on intensity-dependent instabilities. By analysing the bunch modes we showed that the damping effects caused by space charge, which was modelled with a semi-analytical method and using symplectic high-order schemes, can increase the bunch intensity threshold. Numerical libraries have also been developed in order to study, via numerical simulations of the bunches, the impedance of the whole CERN accelerator complex. On a different note, the CNGS experiment at CERN requires high-intensity beams. We calculated the interpolating Hamiltonian of the beam for highly non-linear lattices. These calculations provide the ground for theoretical and numerical studies aimed at improving the CNGS beam extraction from the PS to the SPS.
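The link between a reactive (imaginary) impedance and a betatron tune shift can be illustrated with a minimal one-turn-map model: the impedance acts like an extra thin-lens kick each turn, and the perturbed tune follows from the trace of the resulting map. The tune and kick strength below are toy values, not PSB parameters:

```python
import numpy as np

# Minimal sketch: an imaginary impedance modelled as a thin-lens kick
# added to the linear betatron one-turn map; the coherent tune shift is
# read off the trace of the perturbed map. Toy parameters throughout.

tune0, k = 0.31, -2e-3                    # bare tune, assumed kick strength
phi = 2 * np.pi * tune0
one_turn = np.array([[np.cos(phi),  np.sin(phi)],
                     [-np.sin(phi), np.cos(phi)]])   # betatron rotation
kick = np.array([[1.0, 0.0],
                 [k,   1.0]])                        # impedance-like thin lens
m = kick @ one_turn

# Stable linear motion has |trace| < 2; the tune is arccos(trace/2)/2pi.
tune = np.arccos(np.trace(m) / 2) / (2 * np.pi)
print(f"bare tune {tune0:.4f} -> perturbed tune {tune:.4f}, "
      f"shift {tune - tune0:+.2e}")
```

Measuring the coherent tune as a function of bunch intensity, and fitting the slope, is the standard way such an effective imaginary impedance is extracted from machine data.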
Abstract:
The rapid progress of technology, the development of highly sophisticated products, strong global competition and rising customer expectations have put new pressure on manufacturers to bring goods of ever-increasing quality to market. For years customers themselves have expected to find on the market products distinguished by extreme levels of reliability and safety. We are all aware of the need for a product to be as safe and reliable as possible; yet, after some 30 years of studies and research, when we try to quantify in engineering terms these characteristics, generically grouped under the term quality, or when we try to calculate the concrete benefits that attention to factors such as reliability and safety produces for a business, strong disagreements remain. The disagreements are equally evident when it comes to defining the "most suitable tools" for improving the reliability and safety of a product or process. Although the international state of the art offers a significant number of quality-improvement methodologies, all under continuous refinement, many of these "Total Quality" tools are not concretely applicable in most of the industrial contexts we have encountered. The inapplicability of these techniques is not only a matter of the smaller size of Italian companies compared with the American and Japanese ones where these tools were conceived and developed, or of the limited scope for massive R&D investment, but is also linked to the difficulty an Italian company would have in properly exploiting the results in its own territories and markets.
This work sets out to develop a simple, organic methodology for estimating the levels of reliability and safety achieved by production systems and industrial products. It also aims to go beyond the mere development of a theoretical methodology, however rigorous and complete, by applying some of its tools in an integrated form to concrete cases of high industrial relevance. This methodology, like all the reliability-improvement tools presented here, is potentially relevant to a wide range of production fields, but is particularly effective in sectors that combine high production volumes with very strong product-quality requirements. The automotive sector, which has always been particularly sensitive to reliability and safety improvement, was therefore chosen for validation and application. This choice led to conclusions whose validity goes beyond purely technical merit, touching on significant aspects of the marketability of the results, and involved companies of the very first rank in the Italian industrial landscape.
Abstract:
To date, the hospital radiological workflow is completing a transition from analog to digital technology. Since digital X-ray detection technologies have become mature, hospitals are exploiting the natural equipment turnover to replace conventional screen-film devices with digital ones. The transition process is complex and involves not just the equipment replacement but also new arrangements for image transmission, display (and reporting) and storage. This work is focused on the characterization of 2D digital detectors with regard to specific clinical applications; the system features linked to image quality are analyzed to assess the clinical performance, the conversion efficiency, and the minimum dose necessary to obtain an acceptable image. The first section overviews digital detector technologies, focusing on recent and promising technological developments. The second section describes the characterization methods considered in this thesis, categorized as physical, psychophysical and clinical; theory, models and procedures are described as well. The third section contains a set of characterizations performed on new equipment representing some of the most advanced technologies available to date. The fourth section deals with procedures and schemes employed for quality assurance programs.
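A central physical figure of merit in this kind of detector characterization is the detective quantum efficiency, conventionally obtained as DQE(f) = MTF(f)² / (q · NNPS(f)) from the measured modulation transfer function, the normalized noise power spectrum and the incident photon fluence q. The MTF model, fluence and NNPS values below are invented for illustration:

```python
import numpy as np

# Sketch of the DQE computation from measured quantities:
#   DQE(f) = MTF(f)^2 / (q * NNPS(f))
# The sinc-shaped MTF, the fluence q and the flat NNPS are assumed
# example values, not data from any real detector.

f = np.linspace(0.0, 3.0, 7)        # spatial frequency, cycles/mm
mtf = np.sinc(f / 3.0)              # toy MTF model (1 at f = 0)
q = 2.5e5                           # photons per mm^2 (assumed fluence)
nnps = np.full_like(f, 6.0e-6)      # flat normalized NPS, mm^2 (assumed)

dqe = mtf**2 / (q * nnps)
for fi, d in zip(f, dqe):
    print(f"f = {fi:.1f} cy/mm  DQE = {d:.3f}")
```

With these assumed numbers the zero-frequency value is 1/(q · NNPS(0)) ≈ 0.67, and the DQE falls with frequency as the MTF rolls off, which is the qualitative shape one compares across detector technologies.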
Abstract:
Drying oils, and in particular linseed oil, were the most common binding media employed in painting between the 16th and 19th centuries. Artists usually applied some pre-treatments to the oils to obtain binders with modified properties, such as different handling qualities or colour. Oil processing plays a key role in the subsequent ageing and degradation of linseed oil paints. In this thesis a multi-analytical approach was adopted to investigate the drying, polymerization and oxidative degradation of linseed oil paints. In particular, thermogravimetric analysis (TGA), yielding information on the macromolecular scale, was compared with gas chromatography-mass spectrometry (GC/MS) and direct exposure mass spectrometry (DE-MS), which provide information on the molecular scale. The study was performed on linseed oils and paint reconstructions prepared according to an accurate historical description of the painting techniques of the 19th century. TGA revealed that during ageing the molecular weight of the oils changes and that higher molecular weight fractions form. TGA proved to be an excellent tool for comparing the oils and paint reconstructions: it is able to highlight the different physical behaviour of oils processed using different methods, and of paint layers differing in the processed oil and/or the pigment used. GC/MS and DE-MS were used to characterise the soluble, non-polymeric fraction of the oils and paint reconstructions. GC/MS allowed us to calculate the ratios of palmitic to stearic acid (P/S) and azelaic to palmitic acid (A/P), and to evaluate the effects produced by oil pre-treatments and by the presence of different pigments. This helps in understanding the role of the pre-treatments and of the pigments in the oxidative degradation undergone by siccative oils during ageing.
DE-MS enabled the various molecular weight fractions of the samples to be studied simultaneously, and thus helped to highlight the oxidation and hydrolysis reactions, and the formation of carboxylates, that occur during ageing and vary with the oil pre-treatments and the pigments. The combination of thermal analysis with molecular techniques such as GC/MS, DE-MS and FTIR enabled a model to be developed for unravelling some crucial issues: 1) how oil pre-treatments produce binders with different physical-chemical qualities, and how this can influence the ageing of an oil paint film; 2) what role the interaction between oil and pigments plays in the ageing and degradation process.
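The diagnostic ratios mentioned above reduce, computationally, to quotients of integrated GC/MS peak areas: P/S (palmitic/stearic, indicative of the oil type) and A/P (azelaic/palmitic, indicative of the extent of oxidative degradation). The peak areas below are invented example values:

```python
# Sketch of the fatty-acid ratio computation from GC/MS peak areas.
# The sample areas are hypothetical, not measurements from the thesis.

def acid_ratios(areas):
    """areas: {acid_name: integrated peak area} -> (P/S, A/P) ratios."""
    ps = areas["palmitic"] / areas["stearic"]    # oil-type marker
    ap = areas["azelaic"] / areas["palmitic"]    # oxidation marker
    return ps, ap

# Hypothetical aged linseed-oil paint sample.
sample = {"palmitic": 1.50e6, "stearic": 1.00e6, "azelaic": 1.80e6}
ps, ap = acid_ratios(sample)
print(f"P/S = {ps:.2f}, A/P = {ap:.2f}")
```

Because both ratios are internal to a single chromatogram, they are largely insensitive to the absolute amount of sample injected, which is what makes them robust markers for comparing differently processed oils and pigmented films.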