907 results for PRECISION EXPERIMENTS
Abstract:
Master's degree in Oceanography
Abstract:
Precision horticulture and spatial analysis applied to orchards are a growing and evolving branch of precision agriculture technology. The aim of this discipline is to reduce production costs by monitoring and analysing orchard-derived information to improve crop performance in an environmentally sound manner. Georeferencing and geostatistical analysis, coupled with point-specific data mining, make it possible to devise and implement management decisions tailored to the single orchard. Potential applications include the opportunity to verify in real time, throughout the season, the effectiveness of cultural practices in achieving the production targets in terms of fruit size, number and yield and, in the near future, fruit quality traits. These data will affect not only the pre-harvest stage; their effect will extend to the post-harvest sector of the fruit chain. Chapter 1 provides an updated overview of precision horticulture, while Chapter 2 presents a preliminary spatial statistical analysis of the variability in apple orchards before and after manual thinning, together with an interpretation of this variability and of how it can be managed to maximize orchard performance. Chapter 3 then undertakes a stratification of spatial data into management classes to interpret and manage spatial variation in the orchard. An inverse model approach is also applied to verify whether crop production explains environmental variation. Chapter 4 presents an integration of the techniques adopted before and offers a new key for reading the information gathered within the field. The overall goal of this Dissertation was to probe the feasibility, desirability and effectiveness of a precision approach to fruit growing, following the lines of other areas of agriculture that already adopt this management tool. As existing applications of precision horticulture have shown, crop specificity is an important factor to be accounted for. This work focused on apple because of its importance both in the area where the work was carried out and worldwide.
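To make the stratification idea concrete, here is a minimal sketch (ours, not the thesis's actual procedure) that bins georeferenced yield samples into management classes by simple yield terciles; a real geostatistical workflow would first interpolate the point data (e.g. by kriging) before classifying:

```cpp
// Illustrative sketch: stratify georeferenced yield samples into
// low/medium/high management classes using tercile thresholds.
#include <algorithm>
#include <iostream>
#include <vector>

struct Sample {
    double x, y;   // field coordinates (m)
    double yield;  // e.g. kg per tree
};

int main() {
    std::vector<Sample> samples = {
        {0, 0, 12.1}, {0, 5, 14.8}, {5, 0, 9.3},
        {5, 5, 17.2}, {10, 0, 11.0}, {10, 5, 15.5},
    };

    // Tercile thresholds taken from the sorted yield values.
    std::vector<double> yields;
    for (const auto& s : samples) yields.push_back(s.yield);
    std::sort(yields.begin(), yields.end());
    double t1 = yields[yields.size() / 3];
    double t2 = yields[2 * yields.size() / 3];

    for (const auto& s : samples) {
        const char* cls = s.yield < t1 ? "low"
                        : s.yield < t2 ? "medium" : "high";
        std::cout << "(" << s.x << "," << s.y << ") yield " << s.yield
                  << " -> class " << cls << "\n";
    }
}
```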
Abstract:
This thesis studies the performance of the Drift Tubes Local Trigger System of the CMS detector. CMS is one of the general-purpose experiments that will operate at the Large Hadron Collider at CERN. Results are presented from data collected during the Cosmic Run At Four Tesla (CRAFT) commissioning exercise, a globally coordinated run period in which the full experiment was involved and configured to detect cosmic rays crossing the CMS cavern. These include analyses of the precision and accuracy of the trigger reconstruction mechanism and a measurement of the trigger efficiency. A method to perform system synchronization is also described, together with a comparison of the outcomes of the trigger electronics and its software emulator code.
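In its simplest form, a trigger-efficiency measurement of this kind reduces to binomial counting against a reference selection; a standard estimator (our notation and illustration, not necessarily the thesis's exact method) is

\[ \varepsilon = \frac{N_{\mathrm{trig}}}{N_{\mathrm{ref}}}, \qquad \sigma_{\varepsilon} = \sqrt{\frac{\varepsilon\,(1-\varepsilon)}{N_{\mathrm{ref}}}}, \]

where N_ref counts reconstructed reference events (e.g. cosmic-ray track segments) and N_trig the subset for which the local trigger also fired.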
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition allows multiple channels of the PMT to be sampled simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated into a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the configuration memory of the FPGA required the integration of a flash ISP (In-System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory, the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA. PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to equipment on loan from the University of Roma and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which was able to receive and execute commands issued from the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end while inheriting most of the digital logic present in the current DAQ board discussed in this thesis.

As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise.
The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH). The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities mainly concerned the realization of a firmware interface towards and from the MAPS chip, in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 µm presented an overall efficiency of about 90% at a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking into account the multiple scattering effect.
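The pitch/sqrt(12) figure quoted above is simply the standard deviation of a uniform distribution over one pixel cell: with binary readout and pitch p, a hit is equally likely anywhere within the cell, so

\[ \sigma^2 = \frac{1}{p}\int_{-p/2}^{p/2} x^2 \, dx = \frac{p^2}{12} \quad\Rightarrow\quad \sigma = \frac{p}{\sqrt{12}}. \]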
Abstract:
In recent decades, the increase of industrial activities and of world food demand, and the intensification of natural resource exploitation, directly connected to pollution, have aroused growing public interest in initiatives linked to the regulation of food production, as well as in the institution of modern legislation for consumer protection. This work was planned around some important themes related to the marine environment, collecting and presenting the data obtained from studies on different marine species of commercial interest (Chamelea gallina, Mytilus edulis, Ostrea edulis, Crassostrea gigas, Salmo salar, Gadus morhua). These studies evaluated, from a biochemical and biomolecular point of view, the effects of variations of important physical and chemical parameters (temperature, xenobiotics such as drugs, hydrocarbons and pesticides) on the cells involved in immune defence (haemocytes) and on some important enzymatic systems involved in xenobiotic biotransformation processes (the cytochrome P450 complex) and in the related antioxidant defence processes (superoxide dismutase, catalase, heat shock proteins). Oxygen is essential in the biological response of a living organism. Its consumption in the normal physiological processes of cellular respiration and in the biotransformation of foreign substances leads to the formation of reactive oxygen species (ROS), which are potentially toxic and responsible for damage to biological macromolecules, with consequent worsening of pathologies. Such processes can lead to a qualitative alteration of the derived products, but also to a general state of suffering that in the most serious cases can cause the death of the organism, with important economic repercussions on the output of breeding, fishing and aquaculture. In this study it also seemed interesting to apply alternative methodologies currently in use in the medical field (flow cytometry) and in proteomic studies (two-dimensional electrophoresis, mass spectrometry), with the aim of identifying new biomarkers to place beside the traditional methods for the quality control of food of animal origin. From the results, some relevant aspects can be pointed out for each experiment:
1. The flow cytometric techniques applied to O. edulis and C. gigas could lead to important developments in the search for alternative methods that quickly and precisely identify the origin of a specific sample, helping to counter possible food frauds, in this case related, for example, to the presence of a different species that is qualitatively distinct yet morphologically similar. A concrete perspective for the application of this method in the inspection field has to be confirmed by further laboratory tests, including in vivo experiments to evaluate, in the whole organism, the effect of factors assessed only on haemocytes in vitro. These elements therefore suggest the possibility of adapting flow cytometric methods to the study of animal organisms of food interest before they enter industrial processing, giving useful information about the possible presence of contamination sources that can induce an increase of the immune defence and an alteration of normal cellular parameter values.
2. The immune system of C. gallina showed an interesting dose- and time-dependent response to benzo[a]pyrene (B[a]P) exposure, with a significant decrease of the expression and activity of one of the most important enzymes involved in antioxidant defence in haemocytes and haemolymph. The data obtained are confirmed by several measurements of physiological parameters which, together with the decrease of 7-ethoxyresorufin-O-deethylase activity (EROD, linked to xenobiotic biotransformation processes) during exposure, underline the major effects of B[a]P action. The identification of basal EROD levels supports the possible presence of the CYP1A subfamily in invertebrates, still controversial today, never previously identified in C. gallina and never isolated in immune cells; this study instead confirms it with the identification of a CYP1A-immunopositive protein (CYP1A-IPP). This protein could prove a good biomarker at the base of a simple and quick method giving clear information about the presence of specific pollutants, even at low concentrations, in the environments where these organisms are usually fished before being commercialized.
3. This experiment evaluated the effect of the antibiotic chloramphenicol (CA) on an important species of commercial interest, Chamelea gallina. Chloramphenicol is a drug still used in some developing countries, including in the veterinary field. Controls to evaluate its presence in food products of animal origin can prove ineffective when its concentration falls below the sensitivity limit of the instruments usually employed in this type of analysis. The negative effects of CA on the CYP1A-IPP proteins highlighted in this work seem to be due to the attack of free radicals resulting from the action of the antibiotic, which leads to a meaningful alteration of the biotransformation mechanisms. It seems particularly interesting to pay attention to the close relationship in C. gallina between the SOD/CAT and CYP450 systems, actively involved in detoxification mechanisms, especially in comparison with the few similar works available today on molluscs, a group comprising numerous species that enter the food chain and on which constant controls are necessary to evaluate, rapidly and effectively, the presence of possible contamination.
4. The investigations on fishes (Gadus morhua and Salmo salar) and on a bivalve mollusc (Mytilus edulis) allowed the evaluation of different aspects of the possibility of identifying, through 2DE methodologies, a biomarker for the health of organisms of food interest and consequently for the quality of the final product. In the seafood field these techniques are currently used with reasonable success only for vertebrates (fishes), while their application to invertebrates (molluscs) presents many difficulties. The results obtained in this work have highlighted several problems in the correct identification of proteins isolated from animal organisms for which a complete genomic sequence does not yet exist. This forces some identities to be attributed on the basis of comparison with similar proteins in other animal groups, with the risk of obtaining inaccurate data and, above all, data discordant with those obtained on the same animals by other authors. Nevertheless, the data obtained in this work after MALDI-ToF analysis remain objective, and the collected spectra could be analyzed again in the future, once the genomic databases for the species studied have been updated.
4-A. The investigation of the presence of HSP70 isoforms directly induced by different stress phenomena, such as the presence of B[a]P, used two-dimensional electrophoresis methods in C. gallina, which allowed numerous proteins to be isolated on 2DE gels and several spots to be collected, currently under analysis by MALDI-ToF-MS. This preliminary work has therefore allowed important methodologies to be acquired and improved in the study of cellular parameters and in the proteomic field, methodologies which show great potential not only for application in the medical and veterinary fields but also in food inspection, with connections to toxicology and environmental pollution. This study therefore contributes to the search for new, rapid methodologies that can strengthen inspection strategies, integrating with the existing ones while improving the general background of information on the state of health of the animal organism considered, with the still hypothetical possibility of replacing, in particular cases, the traditional techniques employed in the food field.
Abstract:
This work systematically investigates the fluid-crystalline and crystalline-amorphous phase behavior, the elastic properties, the nucleation behavior and the diffusive behavior of charge-stabilized colloids made of spherical polystyrene and polytetrafluoroethylene particles in aqueous dispersion media at very low background ion concentrations. The corresponding measurements are performed with a novel, self-built combined light-scattering apparatus that unites dynamic light scattering, static light scattering and torsional resonance spectroscopy in a single instrument. The three methods are optimally matched to the investigation of colloidal solids. The elastic behavior of the solids can be described very well by the elasticity theory of atomic crystal systems when a Debye-Hückel potential, in the sense of the Poisson-Boltzmann cell model, is used as the interaction potential. The fluid-crystalline phase boundaries determined here agree well, for the first time, with results from molecular dynamics simulations when the interaction energy determined by torsional resonance spectroscopy is used as input. Besides the equilibrium structure, statements about the solidification kinetics are possible. The observed nucleation behavior can be described well by classical nucleation theory if a background of heterogeneous nucleation is taken into account at low undercooling of the melt. PTFE particles show only slight multiple scattering even at high concentrations. Their use makes possible, for the first time, a systematic investigation of the glass transition in highly charged, charge-stabilized systems. Charge-stabilized colloids differ from the previously studied hard-sphere systems above all in their extreme tendency to crystallize. At high particle concentrations (volume fractions above 10 percent), a glass-like solid can be identified whose physical behavior suggests the existence of a Bernal glass. The glass transition is of a very different character compared with the transitions observed in other colloidal systems and in atomic systems. In the PTFE system used here, no direct route to the glass state from the supersaturated melt is possible because of the long-range, strongly repulsive interaction. The amorphous solid instead forms from a nanocrystalline phase. The nucleation rate increases approximately exponentially with the particle number density over the accessible measurement range, so that the glass state is reached not by suppression of nucleation but by forcing it.
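The Debye-Hückel interaction referred to above is commonly written, for charge-stabilized colloids, as the screened Coulomb (Yukawa) pair potential

\[ V(r) = \frac{Z^{*2} e^2}{4\pi \varepsilon_0 \varepsilon_r} \left( \frac{e^{\kappa a}}{1 + \kappa a} \right)^2 \frac{e^{-\kappa r}}{r}, \]

with effective charge Z*, particle radius a and Debye screening parameter kappa. This is the generic textbook form; the thesis's Poisson-Boltzmann cell-model parametrization may differ in detail.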
Abstract:
The aim of the NA48 experiment at CERN is the measurement of the parameter Re(epsilon'/epsilon) of direct CP violation with a precision of 2x10^-4. Experimentally accessible is the double ratio R, formed from the decays of the KL and KS into two neutral and two charged pions, respectively. To a good approximation, R = 1 - 6 Re(epsilon'/epsilon).
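Written out, the double ratio is

\[ R = \frac{\Gamma(K_L \to \pi^0\pi^0)\,/\,\Gamma(K_S \to \pi^0\pi^0)}{\Gamma(K_L \to \pi^+\pi^-)\,/\,\Gamma(K_S \to \pi^+\pi^-)} \approx 1 - 6\,\mathrm{Re}(\varepsilon'/\varepsilon). \]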
NA48 uses a weighting of the KL events to reduce the sensitivity to the detector acceptance. As a check of the standard analysis used so far, an analysis without event weighting was carried out; its result is presented in this thesis. By abandoning the event weighting, the statistical part of the total error can be significantly reduced. Since the limiting channel is the decay of the long-lived kaon into two neutral pions, using the full number of KL decays is a worthwhile goal. In the course of this work, however, it turned out that the systematic error of the acceptance correction cancels this gain again.
The result of this work for the data from the years 1998 and 1999 without event weighting is
Re(epsilon'/epsilon) = (17.91 ± 4.41 (syst.) ± 1.36 (stat.)) x 10^-4.
This clearly confirms the existence of direct CP violation. The result is compatible with the published NA48 result. The test of the standard NA48 analysis strategy has thus been carried out successfully.
Abstract:
The experiment to test the Gerasimov-Drell-Hearn (GDH) sum rule at the Mainz Microtron served to measure helicity-dependent properties of photoproduction on the proton. Besides the total photoabsorption cross sections, cross sections of partial channels of single-pion, double-pion and eta production could be determined. For the possible double-pion reactions only little information is available at present, so the reaction mechanisms of double-pion photoproduction are still largely not understood. For this reason the investigation of further observables is required. The helicity dependence is such a new observable and can be measured by letting a circularly polarized photon interact with a longitudinally polarized nucleon. The photon spin is 1 and the fermion spin of the nucleon is 1/2; the spins couple to total helicity 3/2 or 1/2, which yields the helicity-dependent cross sections. This dissertation describes the setup of the Mainz GDH experiment and the technical realization of the measurement of helicity-dependent photoabsorption cross sections. The helicity dependence of doubly charged double-pion photoproduction on the proton was investigated with the measured data. This reaction is a two-step process in which the excited nucleon decays resonantly or non-resonantly into the final-state particles. The helicity-dependent data constrained degrees of freedom in double-pion photoproduction, enabling an improvement of the models of the Gent-Mainz and Valencia groups. The data obtained within this work have additionally provided motivation for further interpretation approaches by the mentioned theory groups.
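For reference, the sum rule being tested relates the two helicity cross sections to static properties of the nucleon:

\[ \int_{\nu_0}^{\infty} \frac{\sigma_{3/2}(\nu) - \sigma_{1/2}(\nu)}{\nu}\, d\nu = \frac{2\pi^2 \alpha}{m^2}\, \kappa^2, \]

where nu_0 is the pion production threshold, m the nucleon mass, kappa its anomalous magnetic moment and alpha the fine-structure constant; for the proton the right-hand side is about 205 microbarn.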
Abstract:
In a large number of problems, the high dimensionality of the search space, the vast number of variables and economic constraints limit the ability of classical techniques to reach the optimum of a function, known or unknown. In this thesis we investigate the possibility of combining approaches from advanced statistics with optimization algorithms in such a way as to better explore the combinatorial search space and to increase the performance of the approaches. To this purpose we propose two methods: (i) Model Based Ant Colony Design and (ii) Naïve Bayes Ant Colony Optimization. We test the performance of the two proposed solutions in a simulation study and apply the novel techniques to an application in the field of Enzyme Engineering and Design.
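To make the class of algorithms concrete, here is a minimal sketch of plain pheromone-based combinatorial search (ours, not the thesis's Model Based Ant Colony Design or Naïve Bayes variants, which replace the independent per-variable pheromones below with a fitted statistical model):

```cpp
// Minimal pheromone-driven search in the spirit of Ant Colony Optimization:
// sample candidate solutions from a probabilistic model, then bias the model
// toward the best solutions found.
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const int nVars = 20;    // length of the binary solution vector
    const int nAnts = 30;    // solutions sampled per iteration
    const int nIters = 100;
    const double rho = 0.1;  // pheromone evaporation rate

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> unif(0.0, 1.0);

    // One pheromone value per variable: the probability of setting bit i to 1.
    std::vector<double> tau(nVars, 0.5);

    // Toy objective ("onemax"): count of bits set to 1; stands in for the
    // expensive fitness of a real design problem.
    auto fitness = [](const std::vector<int>& s) {
        return static_cast<long>(std::count(s.begin(), s.end(), 1));
    };

    long bestFit = -1;
    for (int it = 0; it < nIters; ++it) {
        std::vector<int> iterBest;
        long iterBestFit = -1;
        for (int a = 0; a < nAnts; ++a) {
            // Each ant samples a solution from the current pheromone model.
            std::vector<int> s(nVars);
            for (int i = 0; i < nVars; ++i) s[i] = unif(rng) < tau[i] ? 1 : 0;
            long f = fitness(s);
            if (f > iterBestFit) { iterBestFit = f; iterBest = s; }
        }
        bestFit = std::max(bestFit, iterBestFit);
        // Evaporate, then reinforce the model toward the iteration's best ant.
        for (int i = 0; i < nVars; ++i)
            tau[i] = (1.0 - rho) * tau[i] + rho * iterBest[i];
    }
    std::cout << "best fitness: " << bestFit << " / " << nVars << "\n";
}
```

The pheromone vector here plays the role of the probabilistic model; one plausible reading of the thesis titles (an assumption on our part) is that their contribution lies in learning a richer model over good solutions than this independence assumption allows.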
Abstract:
The increasing precision of current and future experiments in high-energy physics requires a corresponding increase in the accuracy of the calculation of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to a higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process in higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the necessary tools to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computation. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems. Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
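A minimal example of the resulting programming model (our illustration, not taken from the thesis): GiNaC expressions are ordinary C++ objects that can be expanded, differentiated and series-expanded directly in the host language:

```cpp
// Small GiNaC demo: symbols and expressions as first-class C++ objects.
#include <ginac/ginac.h>
#include <iostream>

int main() {
    using namespace GiNaC;
    symbol x("x"), m("m");

    // Build and manipulate a symbolic expression.
    ex e = pow(x + m, 3);
    std::cout << "expanded:   " << e.expand() << std::endl;
    std::cout << "derivative: " << e.diff(x) << std::endl;

    // Series expansion around x = 0, truncated at order 3.
    ex s = sin(x).series(x == 0, 3);
    std::cout << "series:     " << s << std::endl;
}
```

Built against the GiNaC and CLN libraries (e.g. g++ demo.cpp -lginac -lcln), this prints the expanded polynomial, its derivative and the truncated sine series.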