11 results for APERTURE

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevância:

10.00%

Publicador:

Resumo:

Several MCAO systems are under study to improve the angular resolution of the current and future generation of large ground-based telescopes (diameters in the 8-40 m range). The subject of this PhD Thesis is embedded in this context. Two MCAO systems, in different realization phases, are addressed in this Thesis: NIRVANA, the 'double' MCAO system designed for one of the interferometric instruments of LBT, is in the integration and testing phase; MAORY, the future E-ELT MCAO module, is under preliminary study. These two systems tackle the sky coverage problem in two different ways. The layer-oriented approach of NIRVANA, coupled with multi-pyramid wavefront sensors, takes advantage of the optical co-addition of the signal coming from up to 12 NGS in an annular 2' to 6' technical FoV and up to 8 in the central 2' FoV. Summing the light coming from many natural sources increases the limiting magnitude of the single NGS and considerably improves the sky coverage. One of the two wavefront sensors for the mid-/high-altitude atmosphere analysis has been integrated and tested as a stand-alone unit in the laboratory at INAF-Osservatorio Astronomico di Bologna and afterwards delivered to the MPIA laboratories in Heidelberg, where it was integrated and aligned to the post-focal optical relay of one LINC-NIRVANA arm. A number of tests were performed in order to characterize and optimize the system functionalities and performance. A report about this work is presented in Chapter 2. In the MAORY case, to ensure correction uniformity and sky coverage, the LGS-based approach is the current baseline. However, since the Sodium layer is approximately 10 km thick, the artificial reference source looks elongated, especially when observed from the edge of a large aperture.
On a 30-40 m class telescope, for instance, the maximum elongation varies between a few arcsec and 10 arcsec, depending on the actual telescope diameter, on the Sodium layer properties and on the laser launcher position. The centroiding error in a Shack-Hartmann WFS increases proportionally to the elongation (in a photon-noise-dominated regime), strongly limiting the performance. To compensate for this effect a straightforward solution is to increase the laser power, i.e. to increase the number of detected photons per subaperture. The scope of Chapter 3 is twofold: an analysis of the performance of three different algorithms (Weighted Center of Gravity, Correlation and Quad-cell) for the instantaneous LGS image position measurement in the presence of elongated spots, and the determination of the number of photons required to achieve a certain average wavefront error over the telescope aperture. An alternative optical solution to the spot elongation problem is proposed in Section 3.4. Starting from the considerations presented in Chapter 3, a first-order analysis of the LGS WFS for MAORY (number of subapertures, number of detected photons per subaperture, RON, focal plane sampling, subaperture FoV) is the subject of Chapter 4. An LGS WFS laboratory prototype was designed to reproduce the relevant aspects of an LGS SH WFS for the E-ELT and to evaluate the performance of different centroid algorithms in the presence of elongated spots, as investigated numerically and analytically in Chapter 3. This prototype makes it possible to simulate realistic Sodium profiles. A full testing plan for the prototype is set out in Chapter 4.
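The Weighted Center of Gravity estimator mentioned above can be sketched in a few lines: the subaperture image is multiplied by a weighting map (typically a Gaussian) before the centroid is computed, which suppresses the noisy faint tails of an elongated spot. The spot and weight parameters below are entirely illustrative, not MAORY design values:

```python
import numpy as np

def weighted_cog(img, weights):
    """Weighted Centre of Gravity: centroid of img after multiplication
    by a weighting map, as used on elongated LGS spots in a
    Shack-Hartmann subaperture to reduce the centroiding noise."""
    w = img * weights
    total = w.sum()
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (w * xx).sum() / total, (w * yy).sum() / total

# Illustrative elongated spot: Gaussian stretched along y (made-up numbers)
ys, xs = np.mgrid[0:32, 0:32]
spot = np.exp(-((xs - 16.0) ** 2 / (2 * 1.5 ** 2)
                + (ys - 18.0) ** 2 / (2 * 6.0 ** 2)))
# Gaussian weight centred on the nominal spot position
weights = np.exp(-((xs - 16.0) ** 2 + (ys - 16.0) ** 2) / (2 * 4.0 ** 2))

cx, cy = weighted_cog(spot, weights)
```

The weight pulls the estimate toward the nominal position, trading a small bias (here, along the elongation axis) for a lower noise sensitivity.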

Resumo:

The quality of astronomical sites is the first factor to consider in order to obtain the best performance from telescopes. In particular, the efficiency of large telescopes in the UV, IR, radio etc. is critically dependent on atmospheric transparency. It is well known that the random optical effects induced on light propagation by the turbulent atmosphere also limit a telescope's performance. The importance of correlating the main atmospheric physical parameters with the optical quality reachable by large-aperture telescopes is now clear. Sky quality evaluation has improved with the introduction of new techniques and new instrumentation, and with the understanding of the link between the meteorological (or synoptic) parameters and the observational conditions, thanks to the application of the theories of electromagnetic wave propagation in turbulent media: what we now call astroclimatology. At present, site campaigns have evolved and are performed using the classical scheme of optical seeing properties, meteorological parameters, sky transparency, sky darkness and cloudiness. New concepts have been added, related to geophysical properties such as seismicity, microseismicity, local variability of the climate, atmospheric conditions related to ground optical turbulence and ground wind regimes, aerosol presence, and the use of satellite data. The purpose of this project is to provide reliable methods to analyze the atmospheric properties that affect ground-based optical astronomical observations and to correlate them with the main atmospheric parameters generating turbulence and affecting the photometric accuracy.
The first part of the research concerns the analysis and interpretation of long- and short-time-scale meteorological data at two of the most important astronomical sites, located in very different environments: the Paranal Observatory in the Atacama Desert (Chile), and the Observatorio del Roque de Los Muchachos (ORM) located in La Palma (Canary Islands, Spain). The optical properties of airborne dust at ORM have been investigated by collecting outdoor data using a ground-based dust monitor. Because of its dryness, Paranal is a suitable observatory for near-IR observations, so the extinction properties in the spectral range 1.00-2.30 um have been investigated using an empirical method. Furthermore, this PhD research has been developed using several turbulence profilers in the selection of the site for the European Extremely Large Telescope (E-ELT). During the campaigns the properties of the turbulence at different heights at Paranal and at sites located in northern Chile and Argentina have been studied. This made it possible to characterize the surface-layer turbulence at Paranal and its connection with local meteorological conditions.
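As a concrete illustration of how turbulence profiles relate to optical quality, the Fried parameter and the seeing can be computed from a discretised Cn² profile using the standard Kolmogorov relations. The profile values below are invented for illustration and are not measurements from the Paranal or ORM campaigns:

```python
import numpy as np

# Hypothetical discretised Cn^2 profile (illustrative numbers only)
heights = np.array([0.0, 500.0, 2000.0, 8000.0, 12000.0])   # layer altitudes, m
cn2     = np.array([2e-15, 5e-16, 2e-16, 1e-16, 5e-17])      # layer strengths, m^-2/3
dh      = np.array([100.0, 400.0, 2000.0, 4000.0, 3000.0])   # layer thicknesses, m

lam = 500e-9                 # wavelength, m
k = 2 * np.pi / lam          # optical wavenumber

# Fried parameter for Kolmogorov turbulence at zenith:
#   r0 = [0.423 k^2 * integral(Cn^2 dh)]^(-3/5)
J = np.sum(cn2 * dh)
r0 = (0.423 * k ** 2 * J) ** (-3.0 / 5.0)

# FWHM seeing: theta ~ 0.98 * lambda / r0, converted to arcseconds
seeing_arcsec = np.degrees(0.98 * lam / r0) * 3600.0
```

With these assumed values r0 comes out of order 7 cm and the seeing around 1.5 arcsec, i.e. a mediocre night at a good site; the point is only to show how the integrated Cn² drives the achievable resolution.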

Resumo:

The durability of stone building materials is an issue of utmost importance in the field of monument conservation. In order to preserve our built cultural heritage, thorough knowledge of its constituent materials and an understanding of the processes that affect them are indispensable. The main objective of this research was to evaluate the durability of a special stone type, the crystalline stones, in correlation with their intrinsic characteristics, the petrophysical properties. Crystalline stones are differentiated from cemented stones on the basis of textural features. Their most important specific property is their usually low, fissure-like porosity. Stone types of significant monumental importance, such as marble or granite, belong to this group. The materials selected for this investigation are, indeed, a marble (Macael marble, Spain) and a granite (Silvestre Vilachán granite, Spain). In addition, an andesite (Szob andesite, Hungary), also of significant monumental importance, was selected. In this way a wide range of crystalline rocks is covered in terms of petrogenesis: stones of metamorphic, magmatic and volcanic origin, which can be of importance in terms of mineralogical, petrological or physical characteristics. After a detailed characterization of the petrophysical properties of the selected stones, their durability was assessed by means of artificial ageing. The ageing tests applied were: salt crystallization, frost resistance in pure water and in the presence of soluble salts, salt mist, and the action of SO2 in the presence of humidity. The research aimed at understanding the mechanisms of each weathering process and at finding the petrophysical properties most decisive in the degradation of these materials.
Among the several weathering mechanisms, the most important ones were found to be the physical stress due to the crystallization pressure of both salt and ice, the thermal fatigue due to cyclic temperature changes, and the chemical reactions (mostly acidic attack) between the mineral phases and the external fluids. The properties that fundamentally control the degradation processes, and thus the durability of stones, were found to be: the mineralogical and chemical composition; the hydraulic properties, especially water uptake, permeability and drying; the void space structure, especially the void size and aperture size distribution and the connectivity of the porous space; and the thermal and mechanical properties. Because of the complexity of the processes and the high number of determining properties, no mechanisms or characteristics could be identified as typical of crystalline stones. The durability or alterability of each stone type must be assessed according to its own properties and not according to the textural or petrophysical class it belongs to. Finally, a critical review of standardized methods is presented, on the basis of which recommendations are made for the most adequate methodology for the characterization and durability assessment of crystalline stones.
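The crystallization pressure mentioned above is commonly estimated with Correns' equation, P = (RT/V_m) ln(C/C_s). A back-of-the-envelope evaluation for halite at an assumed twofold supersaturation (illustrative values, not measurements from this study) shows why the stress is damaging — it reaches tens of MPa, above the tensile strength of most building stones:

```python
import math

# Correns' equation for crystallization pressure:
#   P = (R * T / V_m) * ln(C / C_s)
R = 8.314               # gas constant, J/(mol K)
T = 293.15              # temperature, K (assumed: ~20 degrees C)
V_m = 2.77e-5           # molar volume of halite (NaCl), m^3/mol
supersaturation = 2.0   # C / C_s, assumed supersaturation ratio

P_pa = (R * T / V_m) * math.log(supersaturation)
P_mpa = P_pa / 1e6      # crystallization pressure in MPa (~60 MPa here)
```

The logarithmic dependence means even modest supersaturations, readily reached during fast drying in small pores, generate stresses comparable to or exceeding the stone's strength.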

Resumo:

Satellite SAR (Synthetic Aperture Radar) interferometry is a valid technique for digital elevation model (DEM) generation, providing metric accuracy even without ancillary data of good quality. Depending on the situation, the interferometric phase can be interpreted both as topography and as a displacement that possibly occurred between the two acquisitions. Once these two components have been separated, it is possible to produce a DEM from the first or a displacement map from the second. InSAR DEM generation in the cryosphere is not a straightforward operation because almost every interferometric pair also contains a displacement component which, even if small, when interpreted as topography during the phase-to-height conversion step can introduce huge errors in the final product. Considering a glacier, and assuming the linearity of its velocity flux, it is therefore necessary to difference at least two pairs in order to isolate the topographic residue only. In the case of an ice shelf, the displacement component in the interferometric phase is determined not only by the flux of the glacier but also by the different heights of the two tides. In fact, even if the two scenes of the interferometric pair are acquired at the same time of day, only the main terms of the tide disappear in the interferogram, while the other, smaller ones do not cancel out completely and therefore correspond to displacement fringes. Given the availability of tidal gauges (or, as an alternative, of an accurate tidal model) it is possible to calculate a tidal correction to be applied to the differential interferogram. It is important to be aware that the tidal correction is applicable only if the position of the grounding line is known, which is often a controversial matter. This thesis describes the methodology applied for the generation of the DEM of the Drygalski ice tongue in Northern Victoria Land, Antarctica.
The displacement has been determined both interferometrically and by considering the coregistration offsets of the two scenes. Particular attention has been devoted to investigating the role of some parameters, such as timing annotations and orbit reliability. Results have been validated in a GIS environment by comparison with GPS displacement vectors (displacement map and InSAR DEM) and ICESat GLAS points (InSAR DEM).
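The double-differencing idea described above — combining two interferometric pairs so that the (assumed linear) displacement term cancels and only a topographic residue survives — can be illustrated with a toy model. All numbers below (baselines, geometry, ice velocity) are invented for the sketch and are unrelated to the actual Drygalski processing:

```python
import numpy as np

# Toy differential-InSAR model over a glacier. Each interferogram's phase
# is a topographic term (scaling with the perpendicular baseline) plus a
# displacement term (scaling with the temporal baseline, assuming a
# constant velocity flux).
lam = 0.0566                             # C-band wavelength, m
height = np.linspace(0.0, 200.0, 50)     # synthetic surface height profile, m
R = 850e3                                # slant range, m (assumed)
theta = np.deg2rad(23.0)                 # look angle (assumed)

def topo_phase(h, b_perp):
    """Flat-Earth-removed topographic phase: 4*pi*B_perp*h / (lam*R*sin(theta))."""
    return 4 * np.pi * b_perp * h / (lam * R * np.sin(theta))

v = 2.0 / 365.0          # ice velocity, m/day (assumed constant)
dt = 1.0                 # 1-day temporal baseline (tandem-like pair)
disp_phase = 4 * np.pi * v * dt / lam    # identical in both pairs

phi_1 = topo_phase(height, b_perp=150.0) + disp_phase
phi_2 = topo_phase(height, b_perp=30.0) + disp_phase

# Differencing the two pairs cancels the linear displacement term and
# leaves a pure topographic residue with effective baseline 150 - 30 = 120 m
phi_diff = phi_1 - phi_2
h_rec = phi_diff * lam * R * np.sin(theta) / (4 * np.pi * 120.0)
```

In this noiseless toy case the recovered heights match the input exactly; in reality the residual tidal terms discussed above remain in `phi_diff` and must be removed with the tidal correction.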

Resumo:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD Thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction to aperture photometry to light-curve production and analysis.
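The aperture photometry step of such a pipeline can be sketched as follows: sum the counts in a circular aperture around the star and subtract a sky level estimated in a surrounding annulus. This is a minimal illustration of the technique on a synthetic frame, not the actual SPSS pipeline code:

```python
import numpy as np

def aperture_photometry(img, x0, y0, r_ap, r_in, r_out):
    """Circular-aperture photometry with a sky annulus: sum pixels within
    r_ap of (x0, y0) and subtract the median sky level measured in the
    annulus r_in < r <= r_out."""
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    r = np.hypot(xx - x0, yy - y0)
    ap = r <= r_ap
    sky = (r > r_in) & (r <= r_out)
    sky_level = np.median(img[sky])
    return img[ap].sum() - sky_level * ap.sum()

# Synthetic star (total flux 5000 counts) on a flat 100-count background
img = np.full((64, 64), 100.0)
ys, xs = np.mgrid[0:64, 0:64]
sigma = 2.0
img += 5000.0 / (2 * np.pi * sigma ** 2) * np.exp(
    -((xs - 32) ** 2 + (ys - 32) ** 2) / (2 * sigma ** 2))

flux = aperture_photometry(img, 32, 32, r_ap=8, r_in=12, r_out=20)
```

With an aperture of 4 sigma the measured flux recovers the injected 5000 counts to well under 1%; the quality-control criteria mentioned above would then flag frames where, e.g., the sky annulus is contaminated.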

Resumo:

The hard X-ray band (10 - 100 keV) has so far been observed only by collimated and coded-aperture-mask instruments, with a sensitivity and an angular resolution about two orders of magnitude lower than those of the current X-ray focusing telescopes operating below 10 - 15 keV. Technological advances in X-ray mirrors and detection systems now make it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new deep view of some of the most energetic phenomena of the Universe. In order to reach a sensitivity of 1 µCrab in the 10 - 40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed to be applied to any high-energy mission for which the shielding and instrument performances are required. It allows the user to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their paths and filtering the simulated background counts as in a real observation in space. Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft proton scattering by the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the results obtained can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to identify the major sources of background in LEO.
All the results are original contributions to the assessment studies of the cited missions, carried out as part of the background groups' activities.
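The last step of such a simulator — filtering the simulated counts as in a real observation — amounts to normalising the surviving events into a background spectrum in counts s⁻¹ cm⁻² keV⁻¹, the unit in which sensitivity requirements are expressed. A minimal sketch with invented event energies and detector parameters (these are not Simbol-X or NHXM results):

```python
import numpy as np

# Made-up list of deposited energies surviving the anticoincidence and
# event-selection filters of a Monte Carlo run (illustrative only)
rng = np.random.default_rng(42)
n_events = 20000
energies = rng.uniform(10.0, 40.0, n_events)   # keV

exposure = 1.0e5     # simulated exposure time, s (assumed)
det_area = 8.0       # detector geometric area, cm^2 (assumed)
bins = np.arange(10.0, 41.0, 1.0)              # 1 keV energy bins

counts, edges = np.histogram(energies, bins=bins)

# Background rate in counts / (s cm^2 keV): the figure compared against
# the mission's background requirement
bkg = counts / (exposure * det_area * np.diff(edges))
```

Dividing by exposure, area and bin width is what turns a raw Geant4 hit list into a spectrum directly comparable across shielding configurations.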

Resumo:

This research addresses, in a unified manner and from a European perspective, the multifaceted phenomena of economic and juridical double taxation, taking the taxation of cross-border dividends as its initial paradigm. Having defined the legal status of double taxation, the thesis argues for its incompatibility with the European legal order and investigates the Community instruments available to achieve the European objective of its elimination. In the absence of positive harmonization, the substantive result is achieved through negative integration. It is shown that the restraint of the Court of Justice when faced with tax policy choices is only a façade, and the openings in the case law for overcoming it are highlighted. These, in brief, are the fundamental steps. The starting point is the evolution of the fundamental freedoms into rights of constitutional rank, which transforms their economic content and legal scope, conferring constitutional status on the values of neutrality and non-restriction. The shift from the prohibition of discrimination to the prohibition of restrictions is then highlighted, noting the failure of the attempt to configure the prohibition of double taxation as an autonomous principle of the European legal order. At the same time, however, it becomes appropriate to re-examine the distinction between economic and juridical double taxation, and to frame double taxation within a single theoretical scheme as a paradigmatic case of restriction of the freedoms. Consequently, the case-law framework of justifications is rationalized. This readily makes it possible to legitimize Community choices on the allocation of taxing powers among Member States and the attribution of responsibility for eliminating the effects of double taxation. In conclusion, then, a European formulation of the balanced allocation of taxing powers in favour of the State of source emerges.
And, alongside it, a Community conception of the ability-to-pay principle, with disruptive implications still to be verified. Methodologically, the analysis focuses critically on the work of the Court of Justice, revealing the strengths and weaknesses of its action, which has laid the foundations for the European response to the problem of double taxation.

Resumo:

The carbonate outcrops of the Monte Conero anticline (Italy) were studied in order to characterize the geometry of the fractures and to establish their influence on the petrophysical properties (hydraulic conductivity) and on the vulnerability to pollution. The outcrops form an analogue for a fractured aquifer and belong to the Maiolica Fm. and the Scaglia Rossa Fm. The geometrical properties of fractures such as orientation, length, spacing and aperture were collected and statistically analyzed. Five types of mechanical fractures were observed: veins, joints, stylolites, breccias and faults. The types of fractures are arranged in different sets and geometric assemblages which form fracture networks. In addition, the fractures were analyzed at the microscale using thin sections. The fracture age relationships proved similar to those observed at the outcrop scale, indicating that at least three geological episodes have occurred at Monte Conero. A conceptual model for fault development was based on the observations of veins and stylolites. The fracture sets were modelled with the code FracSim3D to generate fracture network models. The permeability of a breccia zone was estimated at the microscale by point counting and binary image methods, and at the outcrop scale with Oda's method. Microstructure analysis revealed that only faults and breccias are potential pathways for fluid flow, since all the veins observed are filled with calcite. Accordingly, three scenarios were designed to assess the vulnerability to pollution of the analogue aquifer: the first scenario considers Monte Conero without fractures, the second with all the observed systematic fractures, and the third with open veins, joints and faults/breccias. The fractures influence the carbonate aquifer by increasing its porosity and hydraulic conductivity. The vulnerability to pollution also depends on the presence of karst zones, detritic zones and the material of the vadose zone.
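The point counting applied to binary thin-section images can be illustrated in a few lines: random points are overlaid on the image and the fraction landing on void (fracture) pixels estimates the 2-D porosity. The synthetic "breccia" image below is purely illustrative, not data from the Monte Conero thin sections:

```python
import numpy as np

def point_count_porosity(binary_img, n_points, seed=0):
    """Estimate 2-D porosity by classical point counting: sample n_points
    random positions on a binary image (True = void/fracture, False = rock)
    and return the fraction falling on voids."""
    rng = np.random.default_rng(seed)
    yy = rng.integers(0, binary_img.shape[0], n_points)
    xx = rng.integers(0, binary_img.shape[1], n_points)
    return binary_img[yy, xx].mean()

# Synthetic image: an open fracture band covering 10% of the area
img = np.zeros((200, 200), dtype=bool)
img[:, 90:110] = True            # 20 of 200 columns -> true 2-D porosity 0.10

phi_est = point_count_porosity(img, n_points=5000)
phi_true = img.mean()
```

The counting error shrinks as 1/sqrt(n_points); with 5000 points the estimate is good to a few tenths of a percent of porosity, adequate for comparing breccia zones.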

Resumo:

The research project of this thesis focused on the synthesis of three classes of molecules: β-lactams, profens and α-aminonitriles, using modern organic synthesis techniques, eco-sustainable methodologies and biocatalytic strategies. Profens are a very widespread class of anti-inflammatory drugs, and in particular we developed and optimized a two-step procedure to obtain (S)-profens from racemic 2-arylpropanals. The first step consists in a bioreduction of the aldehydes to the corresponding (S)-2-aryl propanols through a DKR process mediated by the enzyme Horse Liver Alcohol Dehydrogenase. The second, the oxidation to (S)-profens, is promoted by NaClO2 with TEMPO as catalyst. In order to improve the process, in collaboration with the research group of Francesca Paradisi at University College Dublin, we immobilized the HLADH enzyme, obtaining good yields and better enantioselectivity. We also proposed an interesting enzymatic approach to the oxidation of the (S)-2-aryl propanols using a laccase from Trametes versicolor. The β-lactam ring is a very important heterocycle, known to be an interesting pharmacophore. We synthesized new N-methylthio β-lactams, which showed very interesting antibacterial activity against resistant strains of Staphylococcus aureus collected from patients affected by cystic fibrosis. We then conjugated polyphenolic groups to these new β-lactams, obtaining antioxidant and antibacterial molecules, i.e. molecules with dual activity. We also synthesized a new retinoid-β-lactam hybrid that induced differentiation in neuroblastoma cells. We then exploited the ring opening of the monobactam ring by hydrolytic enzymes, with the aim of obtaining desymmetrized chiral β-amino acids such as the monoester of β-aminoglutamic acid. As regards the α-aminonitriles, a Strecker protocol was developed.
The reactions proved very efficient using acetone cyanohydrin in water as the cyanide source, with different aldehydes and ketones and with primary and secondary amines. To develop an asymmetric version of the protocol, we used chiral amines with the aim of obtaining new chiral α-aminonitriles.

Resumo:

With the "Imagini degli dei degli antichi" ("Images of the gods of the ancients"), published in Venice in 1556 and then in several enlarged and illustrated editions, the committed Este gentleman Vincenzo Cartari produced the first, highly successful Italian mythographic manual in the vernacular, widely circulated and translated throughout early modern Europe. Cartari reworks, in a popularizing yet faithful register, traditional Latin sources: the rich "Genealogie deorum gentilium" of Giovanni Boccaccio, the slightly earlier "De deis gentium varia et multiplex historia" of Lilio Gregorio Giraldi, and the curious Ovidian "Fasti", which he himself annotated and translated. Above all, however, he subjects the millennial heritage of classical fables and exegesis, with its Egyptian, Middle Eastern and Saxon openings, to a novel, agile and most vital interpretive key: ekphrasis. The deities and their retinues of minor creatures, legendary anecdotes and identifying attributes follow one another in an iconic, selective treatment. In triumphs steeped in refined Neoplatonic Petrarchism and in the emblematic picta poesis of the Renaissance, only the representable and distinctive aspects of the mythical figures parade by: so that all things pertaining to the ancient figures may be "recounted in full", "with the images of almost all the gods, and the reasons why they were thus depicted". Thus the "Imagini" won the favour of learned readers and elegant courtiers, of painters and ceramists, of poets and artisans. They set up a sort of "user's manual" ready for the poet's ink or the artist's brush, a suggestive collection of "figurative booklets" drawn upon as much by the manner of Paolo Veronese or Giorgio Vasari as by the classicism of the Carracci and Nicolas Poussin. They prove, finally, an erudite summa capable of attracting notes and revisions: the Paduan antiquarian Lorenzo Pignoria, in 1615 and again in 1626, added archaeological and comparative appendices, as interested in the remote kingdom of the pharaohs as in the exotic idols of the Orient and of the New Worlds.

Resumo:

The idea of matching the resources spent in the acquisition and encoding of natural signals strictly to their intrinsic information content has interested nearly a decade of research under the name of compressed sensing. In this doctoral dissertation we develop some extensions and improvements upon this technique's foundations, by modifying the random sensing matrices on which the signals of interest are projected in order to achieve different objectives. Firstly, we propose two methods for the adaptation of sensing matrix ensembles to the second-order moments of natural signals. These techniques leverage the maximisation of different proxies for the quantity of information acquired by compressed sensing, and are efficiently applied in the encoding of electrocardiographic tracks with minimum-complexity digital hardware. Secondly, we focus on the possibility of using compressed sensing as a method to provide a partial, yet cryptanalysis-resistant form of encryption; in this context, we show how a random matrix generation strategy with a controlled amount of perturbations can be used to distinguish between multiple user classes with different quality of access to the encrypted information content. Finally, we explore the application of compressed sensing in the design of a multispectral imager, by implementing an optical scheme that entails a coded aperture array and Fabry-Pérot spectral filters. The signal recoveries obtained by processing real-world measurements show promising results that leave room for an improvement of the sensing matrix calibration problem in the devised imager.
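The acquisition-and-recovery cycle underlying the dissertation can be sketched with a random Gaussian sensing matrix and a textbook greedy recovery algorithm, Orthogonal Matching Pursuit. This illustrates plain compressed sensing only — not the adapted-ensemble, encryption or multispectral-imaging variants developed in the thesis:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from
    y = Phi @ x by picking, at each step, the column most correlated
    with the residual and re-fitting on the selected support."""
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        x_s, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ x_s
    x = np.zeros(Phi.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(1)
n, m, k = 256, 80, 5                  # signal length, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix

x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.standard_normal(k)  # k-sparse test signal

y = Phi @ x_true                      # compressed (m << n) measurements
x_rec = omp(Phi, y, k)
```

With m = 80 random measurements of a 5-sparse length-256 signal, noiseless recovery is exact with overwhelming probability; the sensing-matrix modifications studied in the dissertation act on `Phi` while leaving this recovery stage conceptually unchanged.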