8 results for Elizabeth, N.J., Battle of, 1780--Maps.

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition allows multiple channels of the PMT to be sampled simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been written to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated into a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the FPGA configuration memory required the integration of a flash ISP (In-System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA.
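The multi-gain idea described above, sampling the same PMT signal at several gain factors and keeping the highest-gain channel that is not saturated, can be sketched as follows. The gain values, ADC range and saturation threshold below are illustrative assumptions, not the actual LIRA06 parameters:

```python
# Sketch: merging multiple gain channels of one PMT pulse into a single
# linearised sample. Gains and ADC range are illustrative only.
ADC_FULL_SCALE = 4095          # 12-bit ADC (assumed)
SATURATION = 4000              # treat ADC codes above this as saturated
GAINS = [16.0, 4.0, 1.0]       # high, medium, low gain (hypothetical)

def merge_gain_channels(samples):
    """samples: ADC codes of the same pulse at decreasing gain.
    Returns the estimated pulse amplitude in low-gain units."""
    for code, gain in zip(samples, GAINS):
        if code < SATURATION:
            # First unsaturated (highest-gain) channel wins:
            # best resolution for small signals.
            return code / gain
    # All channels saturated: report the low-gain full scale.
    return ADC_FULL_SCALE / GAINS[-1]

# A small pulse is read from the high-gain channel...
print(merge_gain_channels([800, 200, 50]))      # 800/16 = 50.0
# ...a large pulse falls back to the low-gain channel.
print(merge_gain_channels([4095, 4095, 3000]))  # 3000/1 = 3000.0
```

Selecting per sample rather than per run is what extends the linear dynamic range without sacrificing resolution on small pulses.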
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and then digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from the Rome University and INFN groups, a full readout chain equivalent to that of NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued by the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end while inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations, meant for charge collection and for shielding the analog electronics from digital noise. It integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line at CERN in Geneva, Switzerland.
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface to and from the MAPS chip, in order to integrate it into the general DAQ system. I then worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 µm showed an overall efficiency of about 90% at a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking the multiple scattering effect into account.
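The two relations used above can be made explicit: a binary (hit/no-hit) pixel of pitch p has an expected resolution of p/sqrt(12), and the intrinsic resolution follows from subtracting the track-extrapolation error (telescope resolution plus multiple scattering) in quadrature from the measured residual width. A minimal sketch with illustrative numbers, not the thesis results:

```python
import math

def binary_resolution(pitch_um):
    """Expected resolution of a digital (binary readout) pixel of given pitch."""
    return pitch_um / math.sqrt(12.0)

def intrinsic_resolution(sigma_residual_um, sigma_track_um):
    """Subtract in quadrature the track-extrapolation error (telescope
    resolution plus multiple scattering) from the residual width."""
    return math.sqrt(sigma_residual_um**2 - sigma_track_um**2)

# Illustrative numbers only: a 50 um pitch gives ~14.4 um,
print(round(binary_resolution(50.0), 1))          # 14.4
# and a 16 um residual with a 7 um track error gives ~14.4 um intrinsic.
print(round(intrinsic_resolution(16.0, 7.0), 1))  # 14.4
```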

Relevance:

100.00%

Publisher:

Abstract:

The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys over large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS – Field System) not tailored to single-dish observations, required important modifications, in particular of the guiding software and data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry. ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide the software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of standard single-dish output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed on the Medicina antenna while waiting for the SRT to be completed: an 18–26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver available worldwide at present. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work.
Tests were carried out to verify the system stability and its capabilities, down to sensitivity levels which had never been reached at Medicina with the previous observing techniques and hardware devices. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to quickly cover wide areas of the sky (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project aims at a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, of which I am part, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data reduction pipeline and assess the overall scientific capabilities of the system. The K-band observations, carried out in several sessions between December 2008 and March 2010, were accompanied by the realisation of a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived to check the new analogue backend separately from the multi-feed receiver, and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar cap survey to complete PMN-GB6 and provide an all-sky coverage at 5 GHz).
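The trade-off described above between beam size and sensitivity can be quantified with the diffraction-limited beam width, theta ≈ k·lambda/D. A short sketch; the factor k ≈ 1.22 is a common approximation and in practice depends on the aperture illumination:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def hpbw_arcmin(freq_ghz, dish_m, k=1.22):
    """Approximate half-power beam width of a filled aperture,
    theta ~ k * lambda / D (k ~ 1.2, taper dependent)."""
    lam = C / (freq_ghz * 1e9)
    return math.degrees(k * lam / dish_m) * 60.0

# The 32-m Medicina dish at 21 GHz: a beam of roughly 1.9 arcmin,
print(round(hpbw_arcmin(21.0, 32.0), 2))  # 1.87
# while at 5 GHz the beam is ~4x wider, covering sky faster per scan.
print(round(hpbw_arcmin(5.0, 32.0), 2))   # 7.86
```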

Relevance:

100.00%

Publisher:

Abstract:

The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package; the problem of the aliasing effect in the generation of low-resolution maps; and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code called BolPol, with that of the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyze the TT parity and Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, called Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared with that of the devices used nowadays.
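The masking bias that the pseudo-Cl approach must correct for can be illustrated in a simple one-dimensional analogue. This is plain NumPy, not the BolPol or Cromaster code, and the real pseudo-Cl estimator on the sphere deconvolves a full mode-coupling matrix rather than dividing by the observed sky fraction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
x = rng.standard_normal(n)          # white signal, true power = 1

mask = np.ones(n)
mask[: n // 4] = 0.0                # cut 25% of the "sky"
f_sky = mask.mean()                 # observed fraction = 0.75

# Pseudo-spectrum: the power of the masked data is biased low...
p_pseudo = np.abs(np.fft.rfft(x * mask))**2 / n
# ...and the crudest pseudo-spectrum correction divides by f_sky.
p_corr = p_pseudo / f_sky

print(p_pseudo.mean())  # ~0.75, the masking bias
print(p_corr.mean())    # ~1.0, the true power recovered on average
```

An optimal QML estimator avoids this loss-and-correction step by weighting the data with the inverse covariance, at the cost of a far heavier computation, which is why it is practical only at large angular scales.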

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, new advances in the development of spectroscopy-based methods for the characterization of heritage materials have been achieved. Regarding FTIR spectroscopy, new approaches aimed at exploiting the near- and far-IR regions for the characterization of inorganic or organic materials have been tested. Paint cross-sections have been analysed by FTIR spectroscopy in the NIR range, and an ad hoc chemometric approach has been developed for the elaboration of hyperspectral maps. Moreover, a new method for the characterization of calcite based on the use of grinding curves has been set up, both in the MIR and in the far-IR region. Indeed, calcite is a material widely used in cultural heritage, and this spectroscopic approach is an efficient and rapid tool to distinguish between different calcite samples. Different enhanced vibrational techniques for the characterisation of dyed fibres have been tested. First, a SEIRA (Surface Enhanced Infra-Red Absorption) protocol was optimised, allowing the analysis of colorant micro-extracts thanks to the enhancement produced by the addition of gold nanoparticles. These preliminary studies led to a new enhanced FTIR method, named ATR/RAIRS, which reaches lower detection limits. Regarding Raman microscopy, the research followed two lines, which share the aim of avoiding the use of colloidal solutions. AgI-based supports obtained by deposition on gold-coated glass slides have been developed and tested by spotting colorant solutions. A SERS spectrum can be obtained thanks to the photoreduction that the laser may induce on the silver salt. Moreover, these supports can be used for the TLC separation of a mixture of colorants, and analyses by means of both Raman/SERS and ATR-RAIRS can be successfully carried out. Finally, a photoreduction method for the "on-fibre" analysis of colorants without the need for any extraction has been optimised.
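Chemometric elaboration of hyperspectral maps typically unfolds the data cube (rows × columns × wavelengths) into a pixel-by-wavelength matrix and projects each pixel spectrum onto a few principal components. A minimal PCA sketch on synthetic data, in plain NumPy; the ad hoc approach developed in the thesis is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
h, w, bands = 8, 8, 50          # tiny synthetic hyperspectral cube
cube = rng.standard_normal((h, w, bands))

# Unfold the cube: one row per pixel, one column per wavelength.
X = cube.reshape(-1, bands)
X = X - X.mean(axis=0)          # mean-center the spectra

# PCA via SVD: principal-component scores of each pixel spectrum.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                  # pixel scores on all components

# Refolding the first-component score gives an image that highlights
# spectral variation across the cross-section.
score_map = scores[:, 0].reshape(h, w)
print(score_map.shape)          # (8, 8)
```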

Relevance:

100.00%

Publisher:

Abstract:

Over the last decades the impact of natural disasters on the global environment has become more and more severe. The number of disasters has dramatically increased, as well as their cost to the global economy and the number of people affected. Among natural disasters, flood catastrophes are considered the most costly, devastating, widespread and frequent, because of the tremendous fatalities, injuries, property damage, and economic and social disruption they cause to humankind. In the last thirty years, the world has suffered from severe flooding, and the huge impact of floods has caused hundreds of thousands of deaths, destruction of infrastructure, disruption of economic activity and property losses worth billions of dollars. In this context, satellite remote sensing, along with Geographic Information Systems (GIS), has become a key tool in flood risk management analysis. The use of remote sensing for supporting various aspects of flood risk management was investigated in the present thesis. In particular, the research focused on the use of satellite images for flood mapping and monitoring, damage assessment and risk assessment. The contribution of satellite remote sensing to the delineation of flood-prone zones, the identification of damaged areas and the development of hazard maps was explored with reference to selected case studies.
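A common ingredient of satellite flood mapping is the thresholding of a water index such as McFeeters' NDWI = (Green − NIR)/(Green + NIR). The reflectance values and threshold below are illustrative, and the thesis' specific case-study methods are not reproduced here:

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters):
    water pixels tend toward positive values because water
    absorbs strongly in the near-infrared."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-9)

def flood_mask(green, nir, threshold=0.0):
    """Binary water/flood mask by thresholding NDWI."""
    return ndwi(green, nir) > threshold

# Synthetic 2x2 scene: left column water-like (NIR absorbed),
# right column land-like (NIR reflected).
g   = np.array([[0.30, 0.20], [0.28, 0.25]])
nir = np.array([[0.05, 0.40], [0.06, 0.45]])
print(flood_mask(g, nir))
# [[ True False]
#  [ True False]]
```

Comparing such a mask from a flood-time acquisition with one from a pre-event image is the basic step behind delineating flooded areas and assessing damage.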

Relevance:

100.00%

Publisher:

Abstract:

Background and Aim: Acute cardiac rejection is currently diagnosed by endomyocardial biopsy (EMB), but multiparametric cardiac magnetic resonance (CMR) may be a non-invasive alternative, given its capacity to characterize myocardial structure and function. Our primary aim was to determine the utility of multiparametric CMR in identifying acute graft rejection in paediatric heart transplant recipients. The secondary aim was to compare textural features of parametric maps in cases of rejection versus those without rejection. Methods: Fifteen patients were prospectively enrolled for contrast-enhanced CMR followed by EMB and right heart catheterization. Images were acquired on a 1.5 Tesla scanner, including T1 mapping (modified Look-Locker inversion recovery sequence – MOLLI) and T2 mapping (modified GraSE sequence). The extracellular volume (ECV) was calculated using pre- and post-gadolinium T1 times of blood and myocardium and the patient's hematocrit. Markers of graft dysfunction, including hemodynamic measurements from echocardiography, catheterization and CMR, were collated. Patients were divided into two groups based on the degree of rejection at EMB: no rejection with no change in treatment (Group A) and acute rejection requiring new therapy (Group B). Statistical analysis included Student's t-test and Pearson correlation. Results: Acute rejection was diagnosed in five patients. Mean T1 values were significantly associated with acute rejection. A monotonic, increasing trend was noted in both mean and peak T1 values with increasing degree of rejection. ECV was significantly higher in Group B. There was no difference in T2 signal between the two groups. Conclusion: Multiparametric CMR serves as a noninvasive screening tool during surveillance encounters and may be used to identify those patients who may be at higher risk of rejection and therefore require further evaluation.
Future multicenter studies are necessary to confirm these results and to explore whether multiparametric CMR can decrease the number of surveillance EMBs in paediatric heart transplant recipients.
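The ECV computation mentioned in the Methods follows the standard relation ECV = (1 − Hct) · ΔR1_myocardium / ΔR1_blood, with ΔR1 = 1/T1(post) − 1/T1(pre). A minimal sketch with illustrative, non-patient T1 values:

```python
def ecv(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hct):
    """Extracellular volume fraction from pre-/post-contrast T1 (ms)
    and hematocrit, via the standard relation
    ECV = (1 - Hct) * (dR1_myocardium / dR1_blood),
    where dR1 = 1/T1_post - 1/T1_pre."""
    d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
    d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
    return (1.0 - hct) * d_r1_myo / d_r1_blood

# Illustrative 1.5 T values (not patient data): ECV ~ 0.27,
# in the range usually reported for normal myocardium.
print(round(ecv(1000.0, 450.0, 1600.0, 300.0, 0.40), 3))  # 0.271
```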

Relevance:

100.00%

Publisher:

Abstract:

"Cartographic heritage" is different from "cartographic history". The latter refers to the study of the development, through time, of the surveying and drawing techniques related to maps, i.e. through the different types of cultural environment which formed the background for the creation of maps. The former concerns the whole corpus of ancient maps, together with these different types of cultural environment, which history has handed down to us and which we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer from preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography allow map heritage to be preserved. Moreover, modern geomatic techniques give us new chances of using historical information which would be unachievable on analog supports. In this PhD thesis, the whole digital workflow for the recovery and elaboration of ancient cartography is reported, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage.
It is possible to divide the workflow into three main steps, which reflect the chapter structure of the thesis itself:
• map acquisition: conversion of the ancient map support from analog to digital, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning techniques); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible;
• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; this way it is possible to understand the projection features of the historical map, as well as to evaluate and represent the degree of deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors or by support deformation (usually all errors too large with respect to our standards);
• data elaboration and management in a digital environment, by means of modern software tools: vectorization, giving the map a new and more attractive graphic appearance (for instance, by creating a 3D model), superimposing it on current base maps, comparing it to other maps, and finally inserting it in a GIS or WebGIS environment as a specific layer.
The study is supported by several case histories, each of them interesting from the point of view of at least one digital cartographic elaboration step.
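The georeferencing step described above amounts to estimating a transformation from map (pixel) coordinates to ground coordinates from the control points. A minimal least-squares affine fit on synthetic points; real workflows often use polynomial or projective models and dedicated GIS tools:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src (pixel) points to
    dst (ground) points. src, dst: (n, 2) arrays, n >= 3.
    Returns a 2x3 matrix A such that [x', y'] = A @ [x, y, 1]."""
    n = src.shape[0]
    G = np.hstack([src, np.ones((n, 1))])       # design matrix
    A, *_ = np.linalg.lstsq(G, dst, rcond=None)
    return A.T

# Synthetic control points: the "ground truth" transform scales
# by 2 and shifts by (10, -5).
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src * 2.0 + np.array([10.0, -5.0])

A = fit_affine(src, dst)
pt = A @ np.array([3.0, 4.0, 1.0])   # georeference a new point
print(np.round(pt, 6))               # maps to (16, 3)
```

With more control points than parameters, the residuals of such a fit are precisely what quantifies the deformation of the old map with respect to modern standards.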
The ancient maps taken into account are the following:
• three maps of the Po river delta, made at the end of the 16th century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook in which he explains a new topographical instrument, the squadra mobile (mobile square), invented and used by himself; today all three maps are preserved in the State Archive of Venice;
• the Ichnoscenografia of Bologna by Filippo de' Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic bird's-eye view of the city, but with an ichnographic value as well, as the author himself declares;
• the map of Bologna by the periti Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years later (1711–1712) than the map by de' Gnudi; in this map the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna;
• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure. Moreover, the Po delta maps were analyzed from the georeferencing point of view, and the Cadastre was analyzed with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling.
Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques then offers us new research opportunities in a rich and modern multidisciplinary context.

Relevance:

100.00%

Publisher:

Abstract:

Alternate history (ucronia) is by now a popular and much-studied literary phenomenon, but the same cannot be said of its origins and of its relationship with counterfactual history ("history with ifs"), which dates back to Herodotus and Livy. Only in the nineteenth century did some authors, along largely independent paths, turn alternate history into a genre of fiction: Louis Geoffroy with Napoléon apocryphe (1836), a history "of the conquest of the world and of the universal monarchy", and Charles Renouvier with Uchronie. L'utopie dans l'histoire (1876), a history "of European civilization as it could have been" had Christianity been halted in the second century. These texts entertain complex relations with the literature of their time, of both the realistic and the fantastic genres, as well as with phenomena of another nature: historiography, in its forms and epistemic status, and even more the sense of the possible (or the philosophy of history) derived from the experience of revolution and from the engagement with utopian and social-reform theories. Other texts, produced in England and the United States in the same period, explore the narrative and speculative possibilities of the genre: among them Nathaniel Hawthorne's P.'s Correspondence (1845), George Chesney's The Battle of Dorking (1871) and Edward Hale's Hands Off (1881); up to a 1931 collection, If It Had Happened Otherwise, which anticipated many of the forms and themes of later alternate histories. These works are examined both in the historical and literary context in which they were produced and with the tools of textual analysis. Particular attention is devoted to the reading strategies prescribed by the texts, which subordinate meaning to the mental comparison between the narrated events and the series of actual facts. Theories of counterfactuals developed in other disciplines, such as history and psychology, enrich the understanding of the texts and of their relations with extra-literary phenomena.