857 results for serrated aperture


Relevance: 10.00%

Publisher:

Abstract:

The durability of stone building materials is an issue of utmost importance in the field of monument conservation. In order to preserve our built cultural heritage, thorough knowledge of its constituent materials and an understanding of the processes that affect them are indispensable. The main objective of this research was to evaluate the durability of a special stone type, the crystalline stones, in correlation with their intrinsic characteristics, the petrophysical properties. Crystalline stones are differentiated from cemented stones on the basis of textural features. Their most important specific property is their usually low, fissure-like porosity. Stone types of significant monumental importance, such as marble or granite, belong to this group. The materials selected for this investigation are accordingly a marble (Macael marble, Spain) and a granite (Silvestre Vilachán granite, Spain). In addition, an andesite (Szob andesite, Hungary), also of significant monumental importance, was selected. In this way a wide range of crystalline rocks is covered in terms of petrogenesis: stones of metamorphic, magmatic and volcanic origin, which can differ in their mineralogical, petrological and physical characteristics. After a detailed characterization of the petrophysical properties of the selected stones, their durability was assessed by means of artificial ageing. The ageing tests applied were: salt crystallization, frost resistance in pure water and in the presence of soluble salts, salt mist, and the action of SO2 in the presence of humidity. The research aimed at understanding the mechanisms of each weathering process and at identifying the petrophysical properties most decisive in the degradation of these materials.
Among the several weathering mechanisms, the most important ones were found to be the physical stress due to the crystallization pressure of both salt and ice, thermal fatigue due to cyclic temperature changes, and chemical reactions (mostly acidic attack) between the mineral phases and external fluids. The properties that fundamentally control the degradation processes, and thus the durability of stones, were found to be: the mineralogical and chemical composition; the hydraulic properties, especially water uptake, permeability and drying; the void space structure, especially the void size and aperture size distribution and the connectivity of the porous space; and the thermal and mechanical properties. Because of the complexity of the processes and the high number of determining properties, no mechanisms or characteristics could be identified as typical for crystalline stones. The durability or alterability of each stone type must be assessed according to its own properties, not according to the textural or petrophysical class it belongs to. Finally, a critical review of standardized methods is presented, based on which an attempt is made to recommend the most adequate methodology for the characterization and durability assessment of crystalline stones.
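The salt crystallization stress named above is commonly modelled with Correns' equation, P = (RT/V_m) ln(C/C_s). A minimal sketch, with illustrative numbers that are assumptions rather than values from the thesis:

```python
import math

# Crystallization pressure via Correns' equation, a standard model for the
# salt-weathering stress mentioned above. All numeric inputs below are
# illustrative assumptions, not measurements from the thesis.
R = 8.314  # gas constant, J/(mol K)

def correns_pressure(T, V_m, supersaturation):
    """Pressure (Pa) exerted by a salt crystal growing from solution:
    T in K, V_m molar volume of the salt in m^3/mol,
    supersaturation = C / C_s (dimensionless, > 1)."""
    return R * T / V_m * math.log(supersaturation)

# Halite-like salt (V_m ~ 2.7e-5 m^3/mol), twofold supersaturation at 20 degC:
P = correns_pressure(293.15, 2.7e-5, 2.0)
print(f"{P/1e6:.1f} MPa")  # tens of MPa, comparable to or exceeding the
                           # tensile strength of many building stones
```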

Relevance: 10.00%

Publisher:

Abstract:

On a ground-truth database covering three growing seasons, the information content of multitemporal ERS-1/-2 Synthetic Aperture Radar (SAR) data is evaluated for mapping the crop inventories and the condition of agriculturally used soils and vegetation in agrarian regions of Bavaria. For this purpose, a multitemporal classification procedure adapted to radar data and based on agricultural parcels is developed, relying on image-statistical parameters of the ERS time series. As supervised classifiers, the maximum-likelihood classifier and a neural backpropagation network are compared. The overall accuracies based on the radar image channels vary between 75 and 85%. It is further shown that the interferometric coherence and the combination with image channels from optical sensors (Landsat-TM, SPOT-PAN and IRS-1C-PAN) contribute to improving the classification. Likewise, the classification results can be improved by a preceding coarse segmentation of the study area into physiographically homogeneous spatial units. Beyond land-use classification, further bio- and soil-physical parameters are derived from the SAR data by means of regression models. The focus is on the near-surface soil moisture of bare or sparsely vegetated areas and on the biomass of agricultural crops. The results show that soil moisture can be measured with ERS-1/-2 SAR data if information on soil roughness is available. Regarding the biophysical parameters, significant relationships between the fresh and dry mass of the vegetation of various cereals and the radar signal can be demonstrated. The biomass information can be used to correct growth models and can help to increase the accuracy of yield estimates.
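The per-parcel maximum-likelihood classification described above can be sketched minimally: each parcel is a feature vector of backscatter statistics over the time series, and each crop class is modelled as an independent Gaussian per date. The training data below are invented for illustration:

```python
import math
from statistics import mean, pstdev

def train(samples_by_class):
    """samples_by_class: {class: [feature_vector, ...]} -> per-class
    (mean, stddev) Gaussian parameters for each feature dimension."""
    params = {}
    for cls, vecs in samples_by_class.items():
        dims = list(zip(*vecs))
        params[cls] = [(mean(d), pstdev(d) or 1e-6) for d in dims]
    return params

def classify(params, x):
    """Assign x to the class maximising the Gaussian log-likelihood,
    assuming independent features (naive maximum likelihood)."""
    def loglik(stats):
        return sum(-math.log(s) - (v - m) ** 2 / (2 * s * s)
                   for v, (m, s) in zip(x, stats))
    return max(params, key=lambda c: loglik(params[c]))

# Invented multitemporal backscatter means (dB) for two hypothetical classes:
training = {
    "cereal": [[-8.0, -7.5, -9.0], [-8.2, -7.8, -9.1]],
    "maize":  [[-12.0, -10.0, -6.0], [-11.8, -10.3, -6.2]],
}
model = train(training)
print(classify(model, [-8.1, -7.6, -9.2]))  # -> cereal
```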

Relevance: 10.00%

Publisher:

Abstract:

Satellite SAR (Synthetic Aperture Radar) interferometry represents a valid technique for digital elevation model (DEM) generation, providing metric accuracy even without ancillary data of good quality. Depending on the situation, the interferometric phase can be interpreted both as topography and as a displacement that may have occurred between the two acquisitions. Once these two components have been separated, it is possible to produce a DEM from the first or a displacement map from the second. InSAR DEM (Digital Elevation Model) generation in the cryosphere is not a straightforward operation, because almost every interferometric pair also contains a displacement component which, even if small, can introduce huge errors in the final product when interpreted as topography during the phase-to-height conversion step. Considering a glacier, and assuming the linearity of its velocity flux, it is therefore necessary to differentiate at least two pairs in order to isolate the topographic residue only. In the case of an ice shelf, the displacement component in the interferometric phase is determined not only by the flux of the glacier but also by the different tide heights. As a matter of fact, even if the two scenes of the interferometric pair are acquired at the same time of day, only the main terms of the tide disappear in the interferogram, while the other, smaller ones do not cancel completely and thus correspond to displacement fringes. Given the availability of tide gauges (or, alternatively, of an accurate tidal model), it is possible to calculate a tidal correction to be applied to the differential interferogram. It is important to be aware that the tidal correction is applicable only if the position of the grounding line is known, which is often a controversial matter. This thesis describes the methodology applied for the generation of the DEM of the Drygalski ice tongue in Northern Victoria Land, Antarctica.
The displacement has been determined both interferometrically and from the coregistration offsets of the two scenes. Particular attention has been devoted to investigating the role of parameters such as timing annotations and orbit reliability. Results have been validated in a GIS environment by comparison with GPS displacement vectors (displacement map and InSAR DEM) and ICESat GLAS points (InSAR DEM).
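Two of the relationships used above can be sketched under simplified assumptions: in repeat-pass geometry an unwrapped interferometric phase maps to line-of-sight displacement as d = λφ/(4π), and if the glacier flow is linear in time, differencing two equal-baseline interferometric phases cancels the common motion term and leaves the topographic residue. The ERS wavelength and the numbers are illustrative assumptions:

```python
import math

WAVELENGTH = 0.0566  # ERS C-band wavelength in metres (assumed sensor)

def phase_to_los_displacement(phi):
    """Line-of-sight displacement (m) from unwrapped phase (rad),
    repeat-pass convention d = lambda * phi / (4 * pi)."""
    return WAVELENGTH * phi / (4 * math.pi)

def topographic_residue(phi_pair1, phi_pair2):
    """Difference of two interferometric phases with equal temporal
    baselines: under linear flow, the common motion component cancels
    and the topography-related part remains."""
    return phi_pair1 - phi_pair2

# One full phase cycle (2*pi) corresponds to lambda/2 of LOS motion:
d = phase_to_los_displacement(2 * math.pi)
print(f"{d*1000:.1f} mm per fringe")  # ~28.3 mm for ERS C-band
```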

Relevance: 10.00%

Publisher:

Abstract:

The present work uses light-assisted scanning tunneling microscopy (STM) to investigate electron transport in dye-covered, nanoporous TiO2 layers employed in photoelectrochemical solar cells. Transport-relevant properties such as the electronic density of states, as well as light-induced processes such as the build-up of a light-induced surface charge and local photocurrents, are measured with spatial resolution. With a view to their possible use in light-assisted tunneling microscopy, gold nanoparticles on an amino-hexanethiol monolayer are furthermore examined for Coulomb blockade effects. The second focus is methodological work on measuring optical near fields in STM experiments. First, the advantages of aperture and apertureless scanning near-field optical microscopy are to be combined using fully metallized fiber tips illuminated through the fiber; the theoretically predicted high optical resolutions could not be confirmed. Second, transparent tips made of Sb-doped tin oxide are successfully tested as tunneling tips. These tips enable STM electroluminescence experiments for characterizing optical near fields without perturbing them with a metallic tip. In an STM study, the self-assembly behavior of octanethiol and octanedithiol on Au(111) from ethanol is investigated. At a low relative concentration of the dithiols (1:2000), a phase of lying-down dithiols forms, whose ordering is catalyzed by the presence of the octanethiols. Finally, a mode termed 'dynamic tunneling microscopy' for tunneling microscopy in electrically conductive environments is successfully tested; to suppress the electrochemical leakage-current contribution, it uses the derivative of the current with respect to distance as the STM distance signal.

Relevance: 10.00%

Publisher:

Abstract:

Traffic is predicted to increase in the future, while space and financial resources for building additional roads are scarce. The existing capacities must therefore be used more sensibly through better traffic control, e.g. by traffic guidance systems. This requires spatially resolved data, i.e. data reflecting the areal distribution of traffic, which are currently lacking. So far, traffic data could only be collected where fixed measuring installations exist, and these cannot supply the missing data. Remote sensing systems offer the possibility of acquiring such data area-wide, with a view from above. After decades of experience with remote sensing methods for detecting and studying the most diverse phenomena on the Earth's surface, this methodology is now being applied to traffic within a pilot project. Since the end of the 1990s, traffic has been observed with airborne optical and infrared imaging systems. Under poor weather conditions, however, and particularly under cloud cover, no usable images can be acquired. With an imaging radar technique, data are acquired independently of weather, daylight and cloud conditions. This work investigates to what extent traffic data can be acquired, processed and usefully applied with airborne synthetic aperture radar (SAR). Not only are the new technique of along-track interferometry (ATI) and the processing of the acquired traffic data presented in detail; a data set produced with this methodology is also compared with a traffic simulation and evaluated. Finally, an outlook on future developments of radar remote sensing for traffic data acquisition is given.
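The ATI principle mentioned above can be sketched under a common simplified model: two antennas displaced along the flight path image the same scene with a short time lag τ = B_eff/v_p, and a mover's line-of-sight velocity maps to interferometric phase as v = λφ/(4πτ). The antenna separation, platform speed and wavelength below are illustrative assumptions, not parameters from the thesis:

```python
import math

def ati_los_velocity(phi, lam, b_eff, v_p):
    """Line-of-sight velocity (m/s) of a moving target from the ATI
    phase phi (rad), wavelength lam (m), effective along-track antenna
    separation b_eff (m) and platform speed v_p (m/s)."""
    tau = b_eff / v_p  # time lag between the two acquisitions
    return lam * phi / (4 * math.pi * tau)

# Illustrative airborne X-band case: lam = 3.1 cm, B_eff = 0.5 m, v_p = 100 m/s
v = ati_los_velocity(math.pi / 2, 0.031, 0.5, 100.0)
print(f"{v:.3f} m/s")  # -> 0.775 m/s
```

Note that the phase only encodes the line-of-sight component; recovering full vehicle speed requires knowledge of the viewing geometry.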

Relevance: 10.00%

Publisher:

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Due to both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where possible correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curve production and analysis.
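The aperture-photometry step named above can be sketched minimally: sum the pixel values inside a circular aperture around the star and subtract the sky level estimated as the median in a surrounding annulus. The tiny synthetic "image" below is invented for illustration; a real pipeline works on full FITS frames:

```python
from statistics import median

def aperture_photometry(img, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture of radius r_ap
    centred on (cx, cy), with the sky level estimated as the median pixel
    value in the annulus r_in..r_out (pixel-centre inclusion test)."""
    ap_sum, ap_n, sky = 0.0, 0, []
    for y, row in enumerate(img):
        for x, val in enumerate(row):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if d2 <= r_ap ** 2:
                ap_sum += val
                ap_n += 1
            elif r_in ** 2 <= d2 <= r_out ** 2:
                sky.append(val)
    return ap_sum - ap_n * median(sky)

# 7x7 frame: flat sky of 10 counts with a 100-count "star" at the centre
img = [[10.0] * 7 for _ in range(7)]
img[3][3] += 100.0
print(aperture_photometry(img, 3, 3, r_ap=1.5, r_in=2.5, r_out=3.5))  # -> 100.0
```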

Relevance: 10.00%

Publisher:

Abstract:

The hard X-ray band (10 - 100 keV) has so far been observed only by collimated and coded aperture mask instruments, with a sensitivity and angular resolution about two orders of magnitude worse than those of the current X-ray focusing telescopes operating below 10 - 15 keV. The technological advance in X-ray mirrors and detection systems now makes it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new deep view of some of the most energetic phenomena of the Universe. In order to reach a sensitivity of 1 μCrab in the 10 - 40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed to be applied to any high-energy mission for which the shielding and instrument performances are required. It allows one to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their paths and filtering the simulated background counts as in a real observation in space. Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft proton scattering by the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the results obtained can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to identify the major sources of background in LEO.
All the results are original contributions to the assessment studies of the cited missions, carried out as part of the background groups' activities.
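The Monte Carlo principle behind a Geant4-based simulator like BoGEMMS can be illustrated with a toy model: particles are tracked one by one, and the fraction passing a shield of thickness t follows exp(-t/l) for mean free path l. The shield parameters are invented; a real simulator tracks the full physics processes:

```python
import math
import random

random.seed(1)

def transmitted_fraction(thickness, mean_free_path, n=100_000):
    """Fraction of particles whose sampled interaction depth (drawn from
    an exponential distribution) exceeds the shield thickness, i.e. that
    pass through without interacting."""
    passed = sum(random.expovariate(1.0 / mean_free_path) > thickness
                 for _ in range(n))
    return passed / n

# Shield two mean free paths thick: expect ~exp(-2) = 0.135 transmission
f = transmitted_fraction(2.0, 1.0)
print(f"simulated {f:.3f} vs analytic {math.exp(-2.0):.3f}")
```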

Relevance: 10.00%

Publisher:

Abstract:

The research addresses, in a unified way and from a European perspective, the multiform phenomena of economic and juridical double taxation, taking the taxation of cross-border dividends as its initial paradigm. After defining the legal status of double taxation, its incompatibility with the European legal order is argued, and the Community instruments for achieving the European objective of its elimination are investigated. In the absence of positive harmonization, the substantive result is achieved through negative integration. It is shown that the restraint of the Court of Justice towards tax-policy choices is only a façade, and the openings in its case law that allow this restraint to be overcome are highlighted. These, in brief, are the fundamental steps. The starting point is the evolution of the fundamental freedoms into rights of constitutional rank, which transforms their economic content and legal scope, attributing constitutional status to the values of neutrality and non-restriction. The passage from the prohibition of discrimination to the prohibition of restrictions is then highlighted, noting the failure of the attempt to configure the prohibition of double taxation as an autonomous principle of the European legal order. At the same time, however, it becomes appropriate to re-examine the distinction between economic and juridical double taxation and to construct a single theoretical framework of double taxation as a paradigmatic case of restriction of the freedoms. Consequently, the case-law framework of justifications is rationalized. This readily makes it possible to legitimize Community choices for the allocation of taxing powers among Member States and the attribution of responsibility for eliminating the effects of double taxation. In conclusion, a European formulation of the balanced allocation of taxing powers in favour of the source State emerges.
And, alongside it, a Community conception of the ability-to-pay principle, with disruptive implications still to be verified. Methodologically, the analysis focuses critically on the work of the Court of Justice, revealing the strengths and weaknesses of its action, which has laid the foundations for the European answer to the problem of double taxation.

Relevance: 10.00%

Publisher:

Abstract:

The carbonate outcrops of the Monte Conero anticline (Italy) were studied in order to characterize the geometry of the fractures and to establish their influence on the petrophysical properties (hydraulic conductivity) and on the vulnerability to pollution. The outcrops form an analogue for a fractured aquifer and belong to the Maiolica Fm. and the Scaglia Rossa Fm. The geometrical properties of the fractures, such as orientation, length, spacing and aperture, were collected and statistically analyzed. Five types of mechanical fractures were observed: veins, joints, stylolites, breccias and faults. The fracture types are arranged in different sets and geometric assemblages which form fracture networks. In addition, the fractures were analyzed at the microscale using thin sections. The fracture age relationships proved similar to those observed at the outcrop scale, indicating that at least three geological episodes have occurred at Monte Conero. A conceptual model for fault development was based on the observations of veins and stylolites. The fracture sets were modelled with the code FracSim3D to generate fracture network models. The permeability of a breccia zone was estimated at the microscale by point-counting and binary-image methods, and at the outcrop scale with Oda's method. Microstructure analysis revealed that only faults and breccias are potential pathways for fluid flow, since all the veins observed are filled with calcite. Accordingly, three scenarios were designed to assess the vulnerability to pollution of the analogue aquifer: the first scenario considers Monte Conero without fractures, the second with all observed systematic fractures, and the third with open veins, joints and faults/breccias. The fractures influence the carbonate aquifer by increasing its porosity and hydraulic conductivity. The vulnerability to pollution also depends on the presence of karst zones, detrital zones and the material of the vadose zone.
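Oda's crack-tensor upscaling, the outcrop-scale method named above, can be sketched as follows: for each fracture with area A, hydraulic aperture a and unit normal n, accumulate F_ij = (1/V) Σ a³ A n_i n_j, and take the permeability tensor as k_ij = (λ/12)(tr(F) δ_ij − F_ij), with λ a connectivity factor (1 for a fully connected network). This is a textbook form of the method, not the thesis's implementation, and the fracture list below is invented:

```python
def oda_permeability(fractures, volume, lam=1.0):
    """fractures: list of (area_m2, aperture_m, (nx, ny, nz)) with unit
    normals; returns the 3x3 equivalent permeability tensor in m^2."""
    F = [[0.0] * 3 for _ in range(3)]
    for area, aperture, n in fractures:
        w = aperture ** 3 * area / volume  # cubic-law weighting
        for i in range(3):
            for j in range(3):
                F[i][j] += w * n[i] * n[j]
    tr = F[0][0] + F[1][1] + F[2][2]
    return [[lam / 12.0 * (tr * (i == j) - F[i][j]) for j in range(3)]
            for i in range(3)]

# One vertical set (normals along x) in a 1 m^3 block: flow is confined to
# the fracture planes, so k_xx = 0 while k_yy = k_zz > 0.
fracs = [(1.0, 1e-4, (1.0, 0.0, 0.0))] * 10
k = oda_permeability(fracs, volume=1.0)
print(k[0][0], k[1][1])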

Relevance: 10.00%

Publisher:

Abstract:

The research project of this thesis focused on the synthesis of three classes of molecules: β-lactams, profens and α-aminonitriles, using modern organic synthesis techniques, eco-sustainable methodologies and biocatalytic strategies. Profens are a widespread class of anti-inflammatory drugs; in particular, we developed and optimized a two-step procedure to obtain (S)-profens from racemic 2-arylpropanals. The first step is a bioreduction of the aldehydes to the corresponding (S)-2-aryl propanols through a DKR process mediated by the enzyme horse liver alcohol dehydrogenase (HLADH). The second, the oxidation to (S)-profens, is promoted by NaClO2 with TEMPO as catalyst. In order to improve the process, in collaboration with Francesca Paradisi's research group at University College Dublin, we immobilized the HLADH enzyme, obtaining good yields and improved enantioselectivity. We also proposed an interesting enzymatic approach for the oxidation of (S)-2-aryl propanols using a laccase from Trametes versicolor. The β-lactam ring is a very important heterocycle, known to be an interesting pharmacophore. We synthesized new N-methylthio β-lactams, which showed very interesting antibacterial activity against resistant strains of Staphylococcus aureus taken from patients suffering from cystic fibrosis. We then conjugated polyphenolic groups to these new β-lactams, obtaining molecules with dual antioxidant and antibacterial activity. We also synthesized a new retinoid-β-lactam hybrid which induced differentiation in neuroblastoma cells. We further exploited the ring-opening of the monobactam ring by hydrolytic enzymes, with the aim of obtaining desymmetrized chiral β-amino acids such as the monoester of β-aminoglutamic acid. As for the α-aminonitriles, a Strecker protocol was developed.
The reactions were very efficient using acetone cyanohydrin as cyanide source in water, with various aldehydes and ketones and with primary and secondary amines. To develop an asymmetric version of the protocol, we used chiral amines with the aim of obtaining new chiral α-aminonitriles.

Relevance: 10.00%

Publisher:

Abstract:

Nitrogen is an essential nutrient. For humans, animals and plants it is a constituent element of proteins and nucleic acids. Although the majority of the Earth's atmosphere consists of elemental nitrogen (N2, 78 %), only a few microorganisms can use it directly. To be useful to higher plants and animals, elemental nitrogen must be converted into a reactive oxidized form. This conversion happens within the nitrogen cycle through free-living microorganisms, symbiotically living Rhizobium bacteria, or lightning. Since the beginning of the 20th century humans have been able to synthesize reactive nitrogen through the Haber-Bosch process, which has noticeably improved the food security of the world population. On the other hand, the increased nitrogen input results in acidification and eutrophication of ecosystems and in loss of biodiversity, and negative health effects for humans arose, such as fine particulate matter and summer smog. Furthermore, reactive nitrogen plays a decisive role in atmospheric chemistry and in the global cycles of pollutants and nutrients.

Nitrogen monoxide (NO) and nitrogen dioxide (NO2) belong to the reactive trace gases and are grouped under the generic term NOx. They are important components of atmospheric oxidative processes and influence the lifetime of various less reactive greenhouse gases. NO and NO2 are generated, among other sources, in combustion processes by oxidation of atmospheric nitrogen, as well as by biological processes within the soil. In the atmosphere NO is converted very quickly into NO2. NO2 is then oxidized to nitrate (NO3-) and to nitric acid (HNO3), which binds to aerosol particles. The bound nitrate is finally washed out of the atmosphere by dry and wet deposition. Catalytic reactions of NOx are an important part of atmospheric chemistry, forming or decomposing tropospheric ozone (O3). In the atmosphere NO, NO2 and O3 are in photostationary equilibrium, which is therefore referred to as the NO-NO2-O3 triad. In regions with elevated NO concentrations, reactions with air pollutants can form NO2, altering the equilibrium of ozone formation.

The essential nutrient nitrogen is taken up by plants mainly as dissolved NO3- entering the roots. Atmospheric nitrogen is oxidized to NO3- within the soil by bacteria, via nitrogen fixation or via ammonium formation and nitrification. Additionally, atmospheric NO2 is taken up directly through the stomata. Inside the apoplast NO2 is disproportionated into nitrate and nitrite (NO2-), which can enter the plant's metabolic processes. The enzymes nitrate and nitrite reductase convert nitrate and nitrite to ammonium (NH4+). NO2 gas exchange is controlled by pressure gradients inside the leaves, the stomatal aperture and leaf resistances. Plant stomatal regulation is affected by climate factors such as light intensity, temperature and water vapor pressure deficit.

This thesis aims to contribute to the understanding of the role of vegetation in the atmospheric NO2 cycle and to discuss the NO2 compensation point concentration (mcomp,NO2). To this end, NO2 exchange between the atmosphere and spruce (Picea abies) was measured at the leaf level with a dynamic plant chamber system under laboratory and field conditions. Measurements took place during the EGER project (June-July 2008). Additionally, NO2 data collected on oak (Quercus robur) during the ECHO project (July 2003) were analyzed. The measuring system used allowed the simultaneous determination of NO, NO2, O3, CO2 and H2O exchange rates. Calculations of NO, NO2 and O3 fluxes are based on the generally small differences (∆mi) measured between the inlet and the outlet of the chamber; consequently, high accuracy and specificity of the analyzer are necessary. To meet these requirements, a highly specific NO/NO2 analyzer was used and the whole measurement system was optimized for enduring measurement precision.

Data analysis yielded a significant mcomp,NO2 only if statistical significance of ∆mi was detected; consequently, the significance of ∆mi was used as a data quality criterion. Photochemical reactions of the NO-NO2-O3 triad in the volume of the dynamic plant chamber must be considered in the determination of the NO, NO2 and O3 exchange rates; otherwise the deposition velocity (vdep,NO2) and mcomp,NO2 will be overestimated. No significant mcomp,NO2 for spruce could be determined under laboratory conditions, but under field conditions mcomp,NO2 was identified between 0.17 and 0.65 ppb and vdep,NO2 between 0.07 and 0.42 mm s-1. In the field data for oak, no NO2 compensation point concentration could be determined; vdep,NO2 ranged between 0.6 and 2.71 mm s-1. There is increasing indication that forests are mainly a sink for NO2 and that potential NO2 emissions are low. Only if high NO soil emissions are assumed can more NO2 be formed by reaction with O3 than the plants are able to take up; under these circumstances forests can be a source of NO2.
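A compensation point and deposition velocity of the kind discussed above can be extracted from chamber data by modelling the bi-directional exchange as F = -v_dep (m - m_comp): a linear fit of flux against ambient concentration gives v_dep from the slope and m_comp from the x-intercept. The data points below are invented for illustration, not EGER measurements:

```python
def fit_exchange(conc, flux):
    """Ordinary least-squares fit F = a + b*m; for the exchange model
    F = -v_dep*(m - m_comp) this gives v_dep = -b and m_comp = -a/b."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(flux) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(conc, flux))
         / sum((x - mx) ** 2 for x in conc))
    a = my - b * mx
    return -b, -a / b

# Synthetic NO2 data (ppb vs arbitrary flux units): emission (positive flux)
# below 0.4 ppb, deposition (negative flux) above it.
conc = [0.1, 0.3, 0.5, 0.8, 1.2]
flux = [-0.2 * (m - 0.4) for m in conc]  # exactly linear, m_comp = 0.4
v_dep, m_comp = fit_exchange(conc, flux)
print(round(v_dep, 3), round(m_comp, 3))  # -> 0.2 0.4
```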

Relevance: 10.00%

Publisher:

Abstract:

With the "Imagini degli dei degli antichi", published in Venice in 1556 and then in several enriched and illustrated editions, the committed Este gentleman Vincenzo Cartari produced the first, highly successful Italian mythographic manual in the vernacular, circulated and translated throughout early modern Europe. Cartari reworks, in a popularizing yet faithful register, traditional Latin sources: Giovanni Boccaccio's rich "Genealogie deorum gentilium", Lilio Gregorio Giraldi's slightly earlier "De deis gentium varia et multiplex historia", and Ovid's curious "Fasti", which he himself annotated and translated. Above all, however, he subjects the millennial heritage of classical fables and exegesis, with its Egyptian, Middle Eastern and Saxon openings, to a novel, agile and vital interpretive key: ekphrasis. The divinities and their retinues of minor creatures, legendary anecdotes and identifying attributes follow one another in an iconic, selective arrangement. In triumphs steeped in refined Neoplatonic Petrarchism and in the emblematic picta poesis of the Renaissance, only the representable and distinctive aspects of the mythical figures parade by: so that all things pertaining to the ancient figures might be "fully recounted", "with the images of nearly all the gods, and the reasons why they were so depicted". Thus the "Imagini" won the favour of learned readers and elegant courtiers, of painters and ceramists, of poets and artisans. They provide a sort of "user's manual" ready for the poet's ink or the artist's brush, an evocative collection of "figurative booklets" drawn upon as much by the manner of Paolo Veronese or Giorgio Vasari as by the classicism of the Carracci and of Nicolas Poussin. Finally, they prove to be an erudite summa capable of attracting annotations and revisions: the Paduan antiquarian Lorenzo Pignoria, in 1615 and again in 1626, added archaeological and comparative appendices, as interested in the remote kingdom of the pharaohs as in the exotic idols of the East and of the New Worlds.

Relevance: 10.00%

Publisher:

Abstract:

The Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) are two observational techniques of great importance that use signals in the microwave band. This thesis aims to contribute to developing a basis for comparison between the results derived from these two observational techniques. One part of the work concerns a study of ground deformation, in particular the estimation of vertical movements and of movements in the east component of the station positions. A second line of research focuses on determining the delay introduced into the propagation of GPS and SAR signals by their passage through the atmosphere. In particular, the effect of the wet component of the troposphere was studied.

Relevance: 10.00%

Publisher:

Abstract:

The idea of matching the resources spent on the acquisition and encoding of natural signals to their intrinsic information content has driven nearly a decade of research under the name of compressed sensing. In this doctoral dissertation we develop some extensions and improvements upon this technique's foundations, by modifying the random sensing matrices on which the signals of interest are projected to achieve different objectives. Firstly, we propose two methods for the adaptation of sensing matrix ensembles to the second-order moments of natural signals. These techniques leverage the maximisation of different proxies for the quantity of information acquired by compressed sensing, and are efficiently applied in the encoding of electrocardiographic tracks with minimum-complexity digital hardware. Secondly, we focus on the possibility of using compressed sensing as a method to provide a partial, yet cryptanalysis-resistant form of encryption; in this context, we show how a random matrix generation strategy with a controlled amount of perturbations can be used to distinguish between multiple user classes with different quality of access to the encrypted information content. Finally, we explore the application of compressed sensing to the design of a multispectral imager, by implementing an optical scheme that entails a coded aperture array and Fabry-Pérot spectral filters. The signal recoveries obtained by processing real-world measurements show promising results that leave room for improvement in the sensing-matrix calibration of the devised imager.
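The acquisition model underlying the abstract can be sketched minimally: a k-sparse signal x of length n is observed through m < n random projections y = Φx, with Φ drawn from a random ensemble. For a 1-sparse signal the support can be recovered by matched filtering (the first step of orthogonal matching pursuit). The dimensions and seed are illustrative choices:

```python
import random

random.seed(0)

n, m = 64, 16
# Random Bernoulli +/-1 sensing matrix, a classic ensemble that satisfies
# the restricted isometry property with high probability:
Phi = [[random.choice((-1.0, 1.0)) for _ in range(n)] for _ in range(m)]

x = [0.0] * n
x[13] = 3.0  # 1-sparse signal: a single nonzero coefficient

# Compressed acquisition: m = 16 measurements instead of n = 64 samples
y = [sum(Phi[i][j] * x[j] for j in range(n)) for i in range(m)]

# Matched filter: correlate y with each column of Phi and pick the maximum;
# the true support index maximises the correlation.
scores = [abs(sum(Phi[i][j] * y[i] for i in range(m))) for j in range(n)]
support = max(range(n), key=scores.__getitem__)
print(support)  # with this seed, recovers the nonzero index 13
```

Recovering general k-sparse signals requires a full solver (OMP, basis pursuit); this sketch only shows why random projections preserve enough information for recovery.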

Relevance: 10.00%

Publisher:

Abstract:

This thesis work concerns the architectural design of a mixed-use skyscraper in the heart of Dubai. Dubai was chosen as the site precisely because it is a flourishing city in great and continuous expansion. In such a heterogeneous skyline, characterized by imposing skyscrapers, it was possible to design a building of substantial volume and particular shape. Starting from a reference model in the biological field, the Saguaro cactus, inspiration was drawn to create an environment which, despite its imposing scale, could convey from within a sense of fluid, continuous space to its users. To this end, a fluid, continuous, grooved surface was conceived to wrap the whole structure, creating recesses, projections and openings, treating