155 results for Massimo, Vittorio.
Abstract:
The aim of this work is to implement an operational methodology for designing air quality monitoring networks and mobile-laboratory measurement campaigns, optimizing the positions of the sampling devices with respect to different objectives and selection criteria. A review and analysis of the approaches and guidance provided by the reference legislation and by the authors of scientific works led to the proposal of a methodological approach consisting of two main operational phases, which was applied to a case study: the territory of the province of Ravenna. The implemented methodology integrates numerous tools supporting the assessment of air quality and of the effects that atmospheric pollutants can generate on specific sensitive receptors (resident population, vegetation, material assets). In particular, it combines approaches for disaggregating emission inventories through proxy variables, modeling tools for simulating pollutant dispersion in the atmosphere, and algorithms that allocate monitoring instruments by maximizing (or minimizing) specific objective functions. The allocation procedure was automated in a software tool that, through a graphical query interface, identifies optimal areas in which to carry out the various monitoring campaigns.
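The allocation algorithms are described only at the level of their objective functions; as a purely illustrative sketch (not the thesis's software), one common formulation is greedy maximal-coverage placement over sensitive receptors, here with a hypothetical benefit matrix:

```python
import numpy as np

def greedy_allocation(benefit, n_sites):
    """Pick monitoring sites one at a time, each time choosing the
    candidate location with the largest marginal gain over the
    receptors (population, vegetation, ...) not yet covered."""
    chosen = []
    covered = np.zeros(benefit.shape[1], dtype=bool)
    for _ in range(n_sites):
        gains = benefit[:, ~covered].sum(axis=1)  # marginal gain per candidate
        best = int(np.argmax(gains))
        chosen.append(best)
        covered |= benefit[best] > 0  # receptors seen from this site are now covered
    return chosen

# Hypothetical example: 5 candidate locations, 8 sensitive receptors
rng = np.random.default_rng(0)
benefit = rng.random((5, 8)) * (rng.random((5, 8)) > 0.5)
print(greedy_allocation(benefit, n_sites=2))
```

Coverage-style objectives are submodular, so this greedy rule carries the classical (1 - 1/e) approximation guarantee, which is one reason it is a common baseline for monitoring-network design.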
Abstract:
With the increasing importance that nanotechnologies have in everyday life, it is not difficult to realize that even a single molecule, if properly designed, can be a device able to perform useful functions: such a chemical species is called a chemosensor, that is, a molecule of abiotic origin that signals the presence of matter or energy. Signal transduction is the mechanism by which the interaction of a sensor with an analyte yields a measurable form of energy. When designing a chemosensor, we need to take into account a "communication requirement" between its three components: the receptor unit, responsible for selective analyte binding; the spacer, which controls the geometry of the system and modulates the electronic interaction between the receptor and the signalling unit; and the signalling unit, whose physico-chemical properties change upon complexation. A luminescent chemosensor communicates a variation of the physico-chemical properties of the receptor unit with a luminescence output signal. This thesis work consists in the characterization of new molecular and nanoparticle-based systems which can be used as sensitive materials for the construction of new optical transduction devices able to provide information about the concentration of analytes in solution. In particular, two directions were taken. The first is to continue the development of new chemosensors, which is the first step in the construction of reliable and efficient devices; here the work focuses on chemosensors for metal ions for biomedical and environmental applications. The second is to study more efficient and complex organized systems, such as derivatized silica nanoparticles. These systems can potentially have higher sensitivity than molecular systems and present many advantages, such as the possibility of ratiometric measurement, larger Stokes shifts and higher signal-to-noise ratios.
Abstract:
Automatically recognizing faces captured under uncontrolled environments has been a challenging topic for decades. In this work, we investigate cohort score normalization, which has been widely used in biometric verification, as a means to improve the robustness of face recognition under challenging environments. In particular, we introduce cohort score normalization into the undersampled face recognition problem. Further, we develop an effective cohort normalization method specifically for the unconstrained face pair matching problem. Extensive experiments conducted on several well-known face databases demonstrate the effectiveness of cohort normalization in these challenging scenarios. In addition, to give a proper understanding of cohort behavior, we study the impact of the number and quality of cohort samples on normalization performance. The experimental results show that a larger cohort set gives more stable and often better results up to a point, after which performance saturates, and that cohort samples of different quality indeed produce different normalization performance. Recognizing faces that have undergone alterations is another challenging problem for current face recognition algorithms. Face image alterations can be roughly classified into two categories: unintentional (e.g., geometric transformations introduced by the acquisition device) and intentional alterations (e.g., plastic surgery). We study the impact of these alterations on face recognition accuracy. Our results show that state-of-the-art algorithms are able to overcome limited digital alterations but are sensitive to more substantial modifications. Further, we develop two useful descriptors for detecting those alterations which can significantly affect recognition performance. In the end, we propose to use the Structural Similarity (SSIM) quality map to detect and model variations due to plastic surgery. Extensive experiments conducted on a plastic surgery face database demonstrate the potential of the SSIM map for matching face images after surgery.
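The abstract does not spell out the normalization formula; for orientation, here is a minimal sketch of T-norm, the classical cohort score normalization on which such methods typically build (the scores below are invented for illustration):

```python
import numpy as np

def t_norm(raw_score, cohort_scores):
    """Standardize a raw match score by the statistics of the probe's
    scores against a cohort of known non-matching templates, so that a
    single decision threshold transfers better across capture conditions."""
    mu = np.mean(cohort_scores)
    sigma = np.std(cohort_scores) + 1e-9  # guard against a degenerate cohort
    return (raw_score - mu) / sigma

# Illustrative values: one comparison score, ten cohort scores
raw = 0.71
cohort = [0.42, 0.35, 0.50, 0.38, 0.41, 0.44, 0.39, 0.47, 0.36, 0.43]
print(f"T-normalized score: {t_norm(raw, cohort):.2f}")
```

Because the normalization depends on the cohort's mean and spread, the size and quality of the cohort set directly shape the normalized score, which is exactly the dependence the abstract reports studying.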
Abstract:
The present PhD thesis builds on the design skills I have been developing since my master's thesis research. A brief description of the chapters' content follows. Chapter 1: the simulation of a complete front-end is a very complex problem and, in particular, is the basis upon which prediction of the overall performance of the system becomes possible. By means of a commercial EM simulation tool and a rigorous nonlinear/EM circuit co-simulation based on the Reciprocity Theorem, this prediction can be achieved and exploited for the characterization of wireless links. This will represent the theoretical basis of the entire thesis and will be supported by two RF applications. Chapter 2: an extensive dissertation on Magneto-Dielectric (MD) materials will be presented, together with their peculiar characteristics as substrates for antenna miniaturization. A designed and tested device for RF on-body applications will be described in detail. Finally, future research will be discussed. Chapter 3: this chapter will deal with the exploitation of renewable energy sources for low-energy-consumption devices. Hence the problem of so-called energy harvesting will be tackled, and a first attempt to exploit THz solar energy in an innovative way will be presented and discussed. Future research will be proposed as well. Chapter 4: graphene is a very promising material for devices in the RF and THz frequency ranges for a wide range of engineering applications, including those marked as the main research goal of the present thesis. This chapter will present the results obtained during my research period at the National Institute for Research and Development in Microtechnologies (IMT) in Bucharest, Romania. It will concern the design and manufacturing of antennas and diodes in graphene-based technology for detection/rectification purposes.
Abstract:
Schroeder's backward integration method is the most widely used method for extracting the decay curve of an acoustic impulse response and calculating the reverberation time from this curve. The limits and possible improvements of this method are widely discussed in the literature. In this work a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox, and its performance has been tested against the most accredited method in the literature. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared, both in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, i.e. the structural impulse responses of some building elements. The comparison shows that both methods return comparable values of T30. As the evaluation range decreases, increasing differences emerge; in particular, the main differences are in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a "locally" defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new method for energy decay curve extraction is its independence from the background noise estimation. Finally, a statistical analysis is performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres. The aim of this evaluation is to establish whether a subset of measurements can be considered representative for a complete characterization of these opera houses.
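For reference (sketched in Python rather than the thesis's Matlab toolbox), the Schroeder curve accumulated from the tail, and the line-fit extraction of T30 and EDT it is compared against, take only a few lines; the synthetic impulse response below is illustrative:

```python
import numpy as np

def schroeder_decay_db(ir):
    """Backward-integrated energy decay curve: EDC(t) = sum over tau >= t
    of h(tau)^2, expressed in dB relative to the total energy."""
    energy = np.cumsum(ir[::-1] ** 2)[::-1]
    return 10 * np.log10(energy / energy[0])

def decay_time(edc_db, fs, lo_db=-5.0, hi_db=-35.0):
    """Fit a line to the EDC between two levels and extrapolate to a
    60 dB decay: (-5, -35) gives T30, (0, -10) gives EDT."""
    idx = np.where((edc_db <= lo_db) & (edc_db >= hi_db))[0]
    slope, _ = np.polyfit(idx / fs, edc_db[idx], 1)  # dB per second
    return -60.0 / slope

# Synthetic room-like impulse response with a 1.2 s reverberation time
fs = 48000
t = np.arange(fs * 2) / fs
ir = np.exp(-6.9 * t / 1.2) * np.random.default_rng(1).standard_normal(t.size)
edc = schroeder_decay_db(ir)
print(f"T30 ~ {decay_time(edc, fs):.2f} s, EDT ~ {decay_time(edc, fs, 0.0, -10.0):.2f} s")
```

The `np.cumsum` over the reversed squared response is exactly the tail-to-beginning accumulation the abstract contrasts with the new, locally defined curve.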
Abstract:
Essential, primary, or idiopathic hypertension is defined as high BP in which secondary causes such as renovascular disease, renal failure, pheochromocytoma, hyperaldosteronism, or other causes of secondary hypertension are not present. Essential hypertension accounts for 80-90% of all cases of hypertension; it is a heterogeneous disorder, with different patients having different causal factors that may lead to high BP. Lifestyle, diet, race, physical activity, smoking, cultural level, environmental factors, age, sex and genetic characteristics play a key role in the increased risk. In contrast to essential hypertension, secondary hypertension is often associated with the presence of other pathological conditions such as dyslipidaemia, hypercholesterolemia, diabetes mellitus, obesity and primary aldosteronism. Among these, primary aldosteronism represents one of the most common causes of secondary hypertension, with a prevalence of 5-15% depending on the severity of blood pressure. Besides high blood pressure values, a principal feature of primary aldosteronism is the hypersecretion of the mineralocorticoid hormone aldosterone in a manner that is fairly autonomous of the renin-angiotensin system. Primary aldosteronism is a heterogeneous pathology that may be divided essentially into two groups, the idiopathic and the familial form. Despite all this knowledge, many hypertensive cases remain unexplained. These individuals apparently seem healthy, but they are at high risk of developing CVD. The lack of known risk factors makes their classification on a risk scale difficult. Over the last three decades considerable help has come from pharmacogenetics/pharmacogenomics, a new area of traditional pharmacology that tries to explain and find correlations between genetic variation (rare variants, SNPs, mutations) and the risk of developing a particular disease.
Abstract:
Architecture and music. Space and time. Sound. Experience. These are the keywords from which my research began. Everything started from the intuition of a link between two disciplines to which I have dedicated much time and study, completing two parallel academic paths, the Faculty of Architecture and the Conservatory. After identifying and analyzing the countless starting points for reflection the topic offered, I focused my attention on one of the most emblematic twentieth-century examples of collaboration between an architect and a musician: Prometeo, tragedia dell'ascolto (1984), composed by Luigi Nono with the collaboration of Massimo Cacciari and Renzo Piano. Through the study of Prometeo I was able to address many of the possible declinations of the interdisciplinary relationship between music and architecture. The research was carried out mainly on the materials held at the Archivio Luigi Nono and the archive of the Fondazione Renzo Piano. The thesis is organized in three parts: a first part addressing the role of space in Nono's works preceding Prometeo, bringing out the importance of the Venetian cultural and sound environment; a second part examining the compositional process that led to the performances of Prometeo in Venice, Milan and Paris; and a third part considering what happened after Prometeo and reflecting on the contributions this experience can bring to the design of spaces for music, analyzing various stagings of the work without the ark and considering the designs for the auditorium of the International Art Village in Akiyoshidai and for the hall of the new Philharmonie in Paris. The study of the Prometeo experience aims to stimulate curiosity toward the research and experimentation of those infinite possibilities of architectural and musical composition of which Nono speaks.
Abstract:
E2F-1 is a transcription factor that plays a key role in cell-cycle control at the G1/S checkpoint by regulating the timely expression of many target genes whose products are required for S phase entry and progression. In mammalian cells, E2F-1 is negatively regulated by hypo-phosphorylated Retinoblastoma protein (pRb), whereas it is protected against degradation by its binding to Mouse Double Minute 2 protein (MDM2). In this study we tested a drug combination designed to obtain a strong down-regulation of E2F-1 by acting on the two mechanisms of E2F-1 regulation mentioned above. This was achieved by combining drugs inhibiting the phosphorylation of pRb with drugs inactivating the MDM2 binding capability. The mechanism of action of these drugs in down-regulating E2F-1 level and activity is p53-independent. As expected, when combined, these drugs strongly inhibit E2F-1 and hinder cell proliferation in p53-/- and p53-mutated cells by blocking them in the G1 phase of the cell cycle, suggesting that E2F-1 down-regulation may represent a valid chemotherapeutic approach to inhibiting proliferation in tumors independently of p53 status.
Abstract:
In this thesis the evolution of methods for the analysis of techno-social systems will be reported, through the various research experiences directly undertaken. The first case presented is a study based on data mining of a word-association dataset named Human Brain Cloud: its validation will be addressed and, also through non-trivial modeling, a better understanding of language properties will be presented. Then a real complex-system experiment will be introduced: the WideNoise experiment in the context of the EveryAware European project. The project and the course of the experiment will be illustrated and the data analysis will be displayed. Next, the Experimental Tribe platform for social computation will be introduced. It was conceived to help researchers implement web experiments, and it also aims to catalyze the cumulative growth of experimental methodologies and the standardization of the tools cited above. In the last part, three other research experiences that already took place on the Experimental Tribe platform will be discussed in detail, from the design of the experiment to the analysis of the results and, eventually, to the modeling of the systems involved. The experiments are: CityRace, on the measurement of human traffic-facing strategies; laPENSOcosì, aiming to unveil the structure of political opinion; and AirProbe, implemented again within the EveryAware project framework, which consisted in monitoring the shift in air-quality opinion of a community informed about local air pollution. In the end, the evolution of the methods for investigating techno-social systems shall emerge, together with the opportunities and the threats offered by this new scientific path.
Abstract:
The thesis investigates the reception of Carducci in Italian and European culture during the first decades of the twentieth century through the study of the commemorations, memoirs, articles and essays dedicated to the poet of the Maremma, in order to shed light on the complex role played by the writer and the instrumentalizations of which he was a victim.
Abstract:
The thesis analyzes the relationship between risk and economic liberalism in contracts, and the legal qualification of contracts in derivative financial products.
Abstract:
Combinatorial Optimization is becoming ever more crucial these days. From the natural sciences to economics, passing through urban administration and personnel management, methodologies and algorithms with a strong theoretical background and consolidated real-world effectiveness are increasingly in demand, in order to quickly find good solutions to complex strategic problems. Resource optimization is, nowadays, a fundamental ingredient of successful projects. From the theoretical point of view, Combinatorial Optimization rests on stable and strong foundations that allow researchers to face ever more challenging problems. From the application point of view, however, the rate of theoretical development seems unable to keep pace with that of modern hardware technologies, especially in the processor industry. In this work we propose new parallel algorithms designed to exploit the new parallel architectures available on the market. We found that, by exposing the inherent parallelism of some resolution techniques (such as Dynamic Programming), remarkable computational benefits can be obtained, lowering execution times by more than an order of magnitude and allowing instances of previously intractable size to be addressed. We approached four notable Combinatorial Optimization problems: the Packing Problem, the Vehicle Routing Problem, the Single Source Shortest Path Problem and a Network Design problem. For each of these problems we propose a collection of effective parallel solution algorithms, either for solving the full problem (Guillotine Cuts and SSSPP) or for enhancing a fundamental part of the solution method (VRP and ND). We support these claims with computational results for all problems, on standard benchmarks from the literature or, when possible, on data from real-world applications, where speed-ups of one order of magnitude are usually attained, not uncommonly scaling up to 40x.
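The parallelism claim can be made concrete with a toy example (not one of the thesis's algorithms): in the 0/1 knapsack dynamic program, every cell of a new DP row depends only on the previous row, so the whole row can be updated in lock-step; the numpy vectorization below stands in for the SIMD/GPU kernel that would perform the same update in parallel.

```python
import numpy as np

def knapsack_dp(weights, values, capacity):
    """0/1 knapsack by dynamic programming. dp[c] is the best value
    achievable with capacity c using the items processed so far; all
    cells of the new row are computed from the old row at once.
    Assumes each weight satisfies 0 < w <= capacity."""
    dp = np.zeros(capacity + 1, dtype=np.int64)
    for w, v in zip(weights, values):
        new = dp.copy()
        # Whole-row update: take the item (shifted old row + value) or skip it
        new[w:] = np.maximum(dp[w:], dp[:capacity + 1 - w] + v)
        dp = new
    return int(dp[-1])

print(knapsack_dp([3, 4, 5], [4, 5, 6], capacity=9))  # -> 11
```

The sequential dependency survives only across rows (one per item), while the O(capacity) work inside each row is embarrassingly parallel, which is the structure that yields the order-of-magnitude speed-ups reported above.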
Abstract:
This dissertation focuses on the characterization of an atmospheric pressure plasma jet source with an application-oriented diagnostic approach, and on the description of processes supported by this plasma source. The source investigated is a single-electrode plasma jet. Schlieren images, optical emission spectra, temperature and heat flux profiles are analyzed to investigate in depth the fluid dynamics, the chemical composition and the thermal output of the plasma generated with a nanosecond-pulsed high voltage generator. The maximum temperature measured is about 45 °C, and values close to room temperature are reached 10 mm downstream of the source outlet, ensuring that the plasma jet can be used for the treatment of thermosensitive materials such as biological substrates or polymers. Electrospinning of polymeric solutions allows the production of nanofibrous non-woven mats, and plasma pre-treatment of the solutions leads to defect-free nanofibers. The plasma jet makes a non-spinnable poly(L-lactic acid) (PLLA) solution electrospinnable, suitable for the production of biological scaffolds for wound dressing.
Abstract:
Poetic translation is addressed on the empirical level of textual analysis. A brief introduction presents the most important reflections on the translation of poetic texts, from Benjamin and Steiner up to the more recent theories of Meschonnic, Apel, Berman and Mattioli. In light of these theories, the works of two pairs of poets and poet-translators are analyzed. In the first example we find the Swiss (French-speaking) poet Philippe Jaccottet grappling with the entire oeuvre of Ungaretti; in the second, the troubled relationship of Vittorio Sereni with the poetry of René Char. Besides investigating the problematic nature of poetic translation as practice and as experience, this Comparative Literature thesis aims to present translation as a hermeneutic tool and as a re-enunciative mechanism: its role in the dialectic of influences and of literary evolution is in fact to be considered essential. The originally ethical vocation of translation forms the constant backdrop of the discussion.
Abstract:
This work concerns the legal issues connected with the regulation of the road haulage sector (carriage of goods on behalf of third parties) in Italy, with particular attention to the rules on tariffs and to the dynamics consolidated in practice in relation to social dumping, outsourcing and delocalization. In the first part, after a premise describing the structural characteristics of road haulage service providers at the national and EU level, as well as the main peculiarities of the relevant market, the regulatory and case-law evolution concerning haulage tariffs is described and analyzed, examining in particular the characteristics and critical aspects of the rules on "bracket tariffs" ("tariffe a forcella") under Law no. 298/1974 and on "minimum safety costs" under art. 83-bis of Decree-Law no. 112/2008, up to the analysis of the scenarios following the recent reform of the sector introduced by the 2015 Stability Law (Law no. 190 of 23 December 2014). The second part examines some problematic issues affecting the sector, at both the national and EU level, that are closely connected with the aforementioned tariff aspects. In particular, reference is made to road cabotage, the transnational posting of workers and the abuse of the freedom of establishment within the EU, taking concrete form in so-called "esterovestizione" (fictitious foreign residence). These issues were analyzed first by reconstructing the national and EU regulatory framework; second, by examining the critical aspects that emerged in light of the market dynamics prevailing in the sector; and finally, in relation to the future scenario that can be inferred from ongoing legislative and administrative initiatives, as well as from the interpretive orientations established in case law.