9 results for Restraint System Tests.
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
An Adaptive Optics (AO) system is a fundamental requirement for 8m-class telescopes: to reach the maximum resolution these telescopes allow, the atmospheric turbulence must be corrected. Adaptive optics systems let us exploit the full potential of these instruments, extracting as much information as possible from astronomical sources. An AO system has two main components: the wavefront sensor (WFS), which measures the aberrations of the wavefront entering the telescope, and the deformable mirror (DM), which assumes a shape opposite to the one measured by the sensor. The two subsystems are connected by the reconstructor (REC). To work, the REC requires a “common language" between these two main AO components, i.e. a mapping between sensor-space and mirror-space, called the interaction matrix (IM). Therefore, in order to operate correctly, an AO system has one fundamental requirement: the measurement of an IM, which calibrates the whole system. The IM measurement is a milestone for any AO system and must be carried out regardless of the telescope size or class. Usually this calibration step is performed by adding to the telescope an auxiliary artificial light source (e.g. a fiber) that illuminates both the deformable mirror and the sensor, allowing the AO system to be calibrated. For very large telescopes (more than 8m, like the Extremely Large Telescopes, ELTs), fiber-based IM measurement requires challenging optical setups that in some cases are impractical to build. In these cases, new techniques to measure the IM are needed. In this PhD work we investigate a different calibration method that can be applied directly on sky, at the telescope, without any auxiliary source; such a technique can be used to calibrate the AO system of a telescope of any size. We tested this new calibration technique, called the “sinusoidal modulation technique”, on the Large Binocular Telescope (LBT) AO system, which is already a complete AO system with the two main components: a secondary deformable mirror with 672 actuators, and a pyramid wavefront sensor. The first phase of my PhD work was helping to implement the WFS board (containing the pyramid sensor and all the auxiliary optical components), working on both the optical alignment and the testing of some optical components. Thanks to the “solar tower” facility of the Astrophysical Observatory of Arcetri (Firenze), we were able to reproduce an environment very similar to that of the telescope and to test the main LBT AO components: the pyramid sensor and the secondary deformable mirror. This enabled the second phase of my PhD work: measuring the IM with the sinusoidal modulation technique. At first we measured the IM using an auxiliary fiber source to calibrate the system, without any injected disturbance. We then used the same technique to measure the IM as if directly “on sky”, adding an atmospheric disturbance to the AO system. The results obtained in this PhD work, measuring the IM directly in the Arcetri solar tower system, are crucial for future developments: the possibility of acquiring the IM directly on sky means that we can calibrate an AO system even for the extremely large telescope class, where classic IM measurement techniques are problematic and sometimes impossible.
Finally, we must not forget why we need all this: the main aim is to observe the universe. Thanks to this new class of large telescopes, and only by using their full capabilities, we will be able to increase our knowledge of the objects we observe, because we will be able to resolve finer details, discovering, analysing and understanding the behavior of the components of the universe.
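As an illustration of the idea behind a sinusoidal modulation calibration, the following minimal Python sketch shows how one IM column can be recovered by lock-in demodulation of the sensor slopes against a known sinusoidal actuator command, with uncorrelated disturbance averaging out. This is only a toy model under stated assumptions, not the LBT pipeline; all names, frequencies and sizes are illustrative.

    import numpy as np

    def demodulate_im_column(slopes, t, f_drive, amplitude):
        # Lock-in demodulation: project each slope time series onto the
        # known drive sinusoid; disturbance uncorrelated with f_drive
        # averages towards zero as the integration time grows.
        ref = np.sin(2.0 * np.pi * f_drive * t)
        in_phase = 2.0 * (slopes @ ref) / len(t)   # per-slope response
        return in_phase / amplitude                # IM column: slope per unit command

    # Illustrative numbers: 1600 slopes, 4096 samples at 1 kHz, drive at 97 Hz
    fs, n_samples, n_slopes = 1000.0, 4096, 1600
    t = np.arange(n_samples) / fs
    true_column = 0.1 * np.random.randn(n_slopes)          # unknown IM column
    drive = 0.05 * np.sin(2.0 * np.pi * 97.0 * t)          # actuator command
    slopes = np.outer(true_column, drive)                  # sensor response
    slopes += 0.02 * np.random.randn(n_slopes, n_samples)  # stand-in "atmosphere"
    im_column = demodulate_im_column(slopes, t, 97.0, 0.05)

Repeating this for every actuator (distinct drive frequencies allow several actuators in parallel) fills the IM; a reconstructor can then be obtained, e.g., as its pseudo-inverse, np.linalg.pinv(IM).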
Abstract:
The thesis work concerns X-ray spectrometry for both medical and space applications and is divided into two sections. The first section addresses an X-ray spectrometric system designed to study radiological beams and is devoted to the optimization of diagnostic procedures in medicine. A parametric semi-empirical model, capable of efficiently reconstructing diagnostic X-ray spectra on mid-range computers, was developed and tested. In addition, different silicon diode detectors were tested as real-time detectors in order to provide a real-time evaluation of the spectrum during diagnostic procedures. This project contributes to the field by presenting an improved simulation of a realistic X-ray beam emerging from a common X-ray tube, with a complete and detailed spectrum that lends itself to further studies of added filtration, thus providing an optimized beam for different diagnostic applications in medicine. The second section describes the preliminary tests carried out on the first version of an Application Specific Integrated Circuit (ASIC), integrated with a large-area position-sensitive Silicon Drift Detector (SDD), to be used on board future space missions. This technology has been developed for the ESA project LOFT (Large Observatory for X-ray Timing), a new medium-class space mission that the European Space Agency has been assessing since February 2011. The LOFT project was proposed as part of the Cosmic Vision Program (2015-2025).
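As a toy illustration of this kind of parametric spectrum model, the sketch below combines textbook Kramers'-law bremsstrahlung with Beer-Lambert filtration. It is not the thesis' semi-empirical model, and the attenuation curve is a made-up placeholder, not real aluminium data.

    import numpy as np

    def kramers_spectrum(kvp, energies_kev):
        # Unfiltered bremsstrahlung fluence per Kramers' law:
        # N(E) proportional to (E_max/E - 1) below the tube voltage, 0 above.
        return np.clip(kvp / energies_kev - 1.0, 0.0, None)

    def filtered_spectrum(kvp, energies_kev, mu_per_cm, thickness_cm):
        # Beer-Lambert attenuation through an added filter of given thickness.
        return kramers_spectrum(kvp, energies_kev) * np.exp(-mu_per_cm * thickness_cm)

    energies = np.linspace(1.0, 120.0, 500)       # keV grid
    mu_al = 0.2 + 0.5 * (30.0 / energies) ** 3    # placeholder attenuation curve, 1/cm
    spectrum = filtered_spectrum(120.0, energies, mu_al, 0.25)  # 120 kVp, 2.5 mm filter

A real semi-empirical model would add characteristic lines and fit the free parameters against measured spectra, but the filtration study works the same way: multiply the unfiltered spectrum by the exponential transmission of each added filter.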
Abstract:
In this thesis, the application of biotechnological processes based on the microbial metabolic degradation of halogenated compounds has been investigated. Several studies have shown that most of these pollutants can be biodegraded by single bacterial strains or mixed microbial populations via aerobic direct metabolism, or via cometabolism using aromatic or aliphatic hydrocarbons as growth substrates. The enhancement of two specific processes has been studied here, each in its own scenario: 1) the bioremediation, via aerobic cometabolism, of soil contaminated by a highly chlorinated compound, using a mixed microbial population, together with the selection and isolation of a consortium specific to that compound; 2) the implementation of a treatment technology based on the direct metabolism of two pure strains at the exact point source of emission, preventing the dilution and contamination of large volumes of waste fluids polluted by several halogenated compounds and minimizing the environmental impact. In order to verify the ability of these two new biotechnological applications to remove halogenated compounds, and to propose them as a more efficient alternative, continuous and batch tests were set up in the experimental part of this thesis. The results obtained from the continuous tests in the second scenario were supported by microbial analysis via Fluorescence In Situ Hybridisation (FISH) and by a mathematical model of the system. The results showed that both processes, each in its respective scenario, offer an effective solution for the biological treatment of pollution by chlorinated compounds.
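A mathematical model of a continuous biological treatment system typically couples substrate consumption to biomass growth. As a minimal sketch only: these are the standard Monod chemostat equations with made-up parameters, not the model actually developed in the thesis.

    import numpy as np

    def simulate_chemostat(mu_max, ks, yield_xs, dilution, s_in,
                           s0, x0, dt=0.01, t_end=200.0):
        # Monod chemostat: dS/dt = D(S_in - S) - mu(S) X / Y
        #                  dX/dt = (mu(S) - D) X,  mu(S) = mu_max S / (Ks + S)
        steps = int(t_end / dt)
        s, x = s0, x0
        history = np.empty((steps, 2))
        for i in range(steps):          # explicit Euler integration
            mu = mu_max * s / (ks + s)
            s += dt * (dilution * (s_in - s) - mu * x / yield_xs)
            x += dt * (mu - dilution) * x
            history[i] = s, x
        return history

    # Placeholder parameters: mu_max 0.3 1/h, Ks 5 mg/L, Y 0.4, D 0.1 1/h
    trace = simulate_chemostat(0.3, 5.0, 0.4, 0.1, s_in=100.0, s0=100.0, x0=1.0)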
Abstract:
Bread dough, and particularly wheat dough, owing to its viscoelastic behaviour, is probably the most dynamic and complicated rheological system, and its characteristics are very important since they strongly affect the textural and sensorial properties of the final products. The study of dough rheology has been a very challenging task for many researchers, since it can provide a wealth of information about dough formulation, structure and processing; this explains why dough rheology has been a matter of investigation for several decades. In this research, the rheological assessment of doughs and breads was performed using empirical and fundamental methods at both small and large deformation, in order to characterize different types of dough and final products such as bread. In order to study the structural aspects of the food products, image analysis techniques were used to integrate the information coming from the empirical and fundamental rheological measurements. Dough properties were evaluated by texture profile analysis (TPA), dough stickiness (Chen and Hoseney cell) and uniaxial extensibility determination (Kieffer test) using a Texture Analyser; small-deformation rheological measurements were performed on a controlled stress-strain rheometer; the structure of the different doughs was observed using image analysis; and bread characteristics were studied using texture profile analysis (TPA) and image analysis. The objective of this research was to understand whether the different rheological measurements were able to characterize and differentiate the samples analysed, in order to investigate the effect of different formulations and processing conditions on dough and final product from a structural point of view. To this aim, the following materials were prepared and analysed:
- frozen dough made without yeast;
- frozen dough and bread made with frozen dough;
- doughs obtained using different fermentation methods;
- doughs made with Kamut® flour;
- dough and bread made with the addition of ginger powder;
- final products coming from different bakeries.
The influence of sub-zero storage time on the viscoelastic performance of non-fermented and fermented dough, and on the final product (bread), was evaluated using small-deformation and large-deformation methods. In general, the longer the sub-zero storage time, the lower the positive viscoelastic attributes. The effect of fermentation time and of different types of fermentation (straight-dough method, sponge-and-dough procedure and poolish method) on the rheological properties of the doughs was investigated using empirical and fundamental analysis, and image analysis was used to integrate this information through the evaluation of the dough structure. The results of the fundamental rheological tests showed that the incorporation of sourdough (poolish method) provoked changes different from those seen with the other types of fermentation. The positive effect of some ingredients (extra-virgin olive oil and a liposomic lecithin emulsifier) in improving the rheological characteristics of Kamut® dough was confirmed also when the dough was subjected to low temperatures (24 and 48 hours at 4°C).
The small-deformation oscillatory measurements and large-deformation mechanical tests performed provided useful information on the rheological properties of samples made with different amounts of ginger powder, showing that the sample with the highest amount of ginger powder (6%) had worse rheological characteristics than the other samples. Moisture content, specific volume, texture and crumb grain characteristics are the major quality attributes of bread products. The different samples analysed, “Coppia Ferrarese”, “Pane Comune Romagnolo” and “Filone Terra di San Marino”, showed a decrease in crumb moisture and an increase in hardness over the storage time. Parameters such as cohesiveness and springiness, evaluated by TPA, which are indicators of fresh bread quality, decreased during storage. Using empirical rheological tests we found several differences among the samples, due to the different ingredients used in the formulations and the different processes adopted to prepare the samples; but since these products are handmade, the differences could be regarded as added value. In conclusion, small-deformation (in fundamental units) and large-deformation methods played a significant role in monitoring the influence of the different ingredients, processing and storage conditions on dough viscoelastic performance and on the final product. Finally, knowledge of the formulation, processing and storage conditions, together with the evaluation of the structural and rheological characteristics, is fundamental for the study of complex matrices like bakery products, where numerous variables can influence the final quality (e.g. raw materials, bread-making procedure, time and temperature of fermentation and baking).
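For reference, TPA indices such as hardness, cohesiveness and springiness are derived from a double-compression force curve. Below is a minimal sketch of the usual definitions; it is illustrative code under simplifying assumptions, not the Texture Analyser's own software.

    import numpy as np

    def tpa_parameters(force, time, split_idx):
        # Basic TPA indices from a double-compression force curve;
        # split_idx separates the first and second compression cycles.
        f1, f2 = force[:split_idx], force[split_idx:]
        t1, t2 = time[:split_idx], time[split_idx:]
        hardness = f1.max()                    # peak force of the first cycle
        area1 = np.trapz(f1, t1)               # work done in cycle 1
        area2 = np.trapz(f2, t2)               # work done in cycle 2
        cohesiveness = area2 / area1           # ratio of the two works
        # Springiness approximated as the ratio of positive-force durations
        springiness = (f2 > 0).sum() / max((f1 > 0).sum(), 1)
        return hardness, cohesiveness, springiness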
Abstract:
The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys over large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS - Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and the data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry; ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide the software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of single-dish standard output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed, while waiting for the SRT to be completed, on the Medicina antenna: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver available worldwide at present. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work. Tests were carried out to verify the system stability and its capabilities, down to sensitivity levels that had never been reached at Medicina with the previous observing techniques and hardware. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to quickly cover large areas of the sky (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project aims at a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, of which I am part, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data reduction pipeline and assess the overall scientific capabilities of the system. The K-band observations, carried out in several sessions over the December 2008 - March 2010 period, were accompanied by a 5 GHz test survey performed during the summertime, which is not suitable for high-frequency observations.
This activity was conceived to check the new analogue backend separately from the multi-feed receiver, and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar-cap survey intended to complete PMN-GB6 and provide all-sky coverage at 5 GHz).
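As an illustration of the map-making step mentioned above, single-dish OTF samples are commonly gridded onto a regular map with a convolution kernel. The following is a minimal sketch of generic Gaussian-kernel gridding with illustrative names; it is not the actual KNoWS pipeline.

    import numpy as np

    def grid_otf_samples(ra, dec, values, ra_axis, dec_axis, fwhm):
        # Grid OTF samples onto a regular map by Gaussian convolution:
        # each sample contributes to nearby pixels with a weight set by
        # its distance; the weighted sum is normalised at the end.
        sigma = fwhm / 2.3548                     # FWHM -> standard deviation
        map_sum = np.zeros((len(dec_axis), len(ra_axis)))
        map_wgt = np.zeros_like(map_sum)
        for r, d, v in zip(ra, dec, values):
            w = np.exp(-((ra_axis[None, :] - r) ** 2 +
                         (dec_axis[:, None] - d) ** 2) / (2.0 * sigma ** 2))
            map_sum += w * v
            map_wgt += w
        return map_sum / np.maximum(map_wgt, 1e-12)

A kernel of roughly one-third of the beam FWHM is a common compromise between smoothing and noise per pixel; source extraction then runs on the gridded map.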
Abstract:
Flicker is a power quality phenomenon referring to the cyclic instability of light intensity caused by supply-voltage fluctuations, which in turn can be caused by disturbances introduced during power generation, transmission or distribution. The standard EN 61000-4-15, recently adopted by the IEEE as IEEE Standard 1453, relies on the analysis of the supply voltage, which is processed according to a suitable model of the lamp - human eye - brain chain. As for the lamp, an incandescent 60 W, 230 V, 50 Hz source is assumed. As far as the human eye - brain model is concerned, it is represented by the so-called flicker curve. This curve was determined several years ago by statistically analysing the results of tests in which people were subjected to flicker with different combinations of magnitude and frequency. The limitations of this standard approach to flicker evaluation are essentially two. First, the provided index of annoyance, Pst, can be related to actual tiredness of the human visual system only if such an incandescent lamp is used. Moreover, the implemented response to flicker is “subjective”, given that it relies on people's answers about their feelings. In the last 15 years, many scientific contributions have tackled these issues by investigating the possibility of developing a novel model of the eye-brain response to flicker and overcoming the strict dependence of the standard on the kind of light source. In this context, this thesis presents an important contribution towards a new flickermeter. An improved visual-system model using a physiological parameter, the mean value of the pupil diameter, is presented, thus allowing a more “objective” representation of the response to flicker. The system used both to generate flicker and to measure the pupil diameter is illustrated, along with the results of several experiments performed on volunteers. The intent is to demonstrate that the measurement of this geometrical parameter can give reliable information about the response of the human visual system to light flicker.
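For context, the standard flickermeter's final stage condenses the instantaneous flicker sensation into the short-term severity index Pst via smoothed percentiles. Below is a minimal sketch of that statistical stage, assuming the instantaneous flicker sensation time series has already been produced by the lamp - eye - brain filter chain (which this sketch does not implement).

    import numpy as np

    def short_term_flicker_severity(ifs):
        # Pst from the instantaneous flicker sensation (IEC/EN 61000-4-15):
        # P_x is the level exceeded for x% of the observation time.
        p = lambda x: np.percentile(ifs, 100.0 - x)
        p50s = (p(30) + p(50) + p(80)) / 3.0               # smoothed percentiles
        p10s = (p(6) + p(8) + p(10) + p(13) + p(17)) / 5.0
        p3s = (p(2.2) + p(3) + p(4)) / 3.0
        p1s = (p(0.7) + p(1) + p(1.5)) / 3.0
        p01 = p(0.1)
        return np.sqrt(0.0314 * p01 + 0.0525 * p1s + 0.0657 * p3s
                       + 0.28 * p10s + 0.08 * p50s)

Pst = 1 corresponds, by construction, to the flicker level that 50% of observers found irritating with the reference incandescent lamp; a pupil-diameter-based model would replace the subjective flicker curve feeding this stage.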
Abstract:
With life expectancies increasing around the world, populations are ageing and neurodegenerative diseases have become a global issue. For this reason we focused our attention on the two most important neurodegenerative diseases: Parkinson's disease (PD) and Alzheimer's disease (AD). Parkinson's disease is a chronic, progressive neurodegenerative movement disorder of multi-factorial origin. Environmental toxins as well as agricultural chemicals have been associated with PD. It has been observed that N/OFQ contributes both to the neurotoxicity and to the symptoms associated with PD, and that pronociceptin gene expression is up-regulated in the rat substantia nigra (SN) in 6-OHDA- and MPP-induced experimental parkinsonism. First, we investigated the role of the N/OFQ-NOP system in the pathogenesis of PD in an animal model developed using PQ and/or MB. We then studied Alzheimer's disease. This disorder is defined as a progressive neurologic disease of the brain leading to the irreversible loss of neurons and of intellectual abilities, including memory and reasoning, which become severe enough to impede social or occupational functioning. Effective biomarker tests could help prevent such devastating damage from occurring. We used the peripheral blood cells of AD-discordant monozygotic twins in the search for peripheral markers that could reflect the pathology within the brain, and also support the hypothesis that PBMCs might be a useful model of epigenetic gene regulation in the brain. We investigated the mRNA levels of several genes involved in AD pathogenesis, as well as their DNA methylation, by MSP Real-Time PCR. Finally, by Western blotting, we assessed the immunoreactivity levels of histone modifications. Our results support the idea that epigenetic changes assessed in PBMCs can also be useful in neurodegenerative disorders, like AD and PD, enabling the identification of new biomarkers for the development of early diagnostic programs.
Abstract:
Introduction and Background: Multiple system atrophy (MSA) is a sporadic, adult-onset, progressive neurodegenerative disease characterized clinically by parkinsonism, cerebellar ataxia, and autonomic failure. We investigated cognitive functions longitudinally in a group of probable MSA patients, matching the data with sleep parameters. Patients and Methods: 10 patients (7m/3f) underwent a detailed interview, a general and neurological examination, laboratory exams, MRI scans, a cardiovascular reflexes study, a battery of neuropsychological tests, and video-polysomnographic recording (VPSG). Patients were re-evaluated (T1) a mean of 16±5 months (range: 12-28) after the initial evaluation (T0). At T1, the neuropsychological assessment and VPSG were repeated. Results: The mean patient age was 57.8±6.4 years (range: 47-64), with a mean age at disease onset of 53.2±7.1 years (range: 43-61) and a symptom duration at T0 of 60±48 months (range: 12-144). At T0, 7 patients showed no cognitive deficits while 3 patients showed isolated cognitive deficits. At T1, 1 patient worsened, developing multiple cognitive deficits from a normal condition. At both T0 and T1, sleep efficiency was reduced, REM latency was increased, and NREM sleep stages 1-2 were slightly increased. Comparisons between T1 and T0 showed a significant worsening in two tests of attention and no significant differences in the VPSG parameters. No correlation was found between the neuropsychological results and the VPSG findings or RBD duration. Discussion and Conclusions: The majority of our patients showed no cognitive deficits at T0 or T1, while isolated cognitive deficits were present in the remaining patients. Attention is the cognitive function that worsened significantly. Our data confirm previous findings concerning the prevalence, type and evolution of cognitive deficits in MSA. Regarding the development of dementia, our data did not reveal a clear-cut diagnosis of dementia. We confirm a mild alteration of sleep structure. RBD duration does not correlate with the neuropsychological findings.