932 results for Read Out Driver, Data Acquisition, Electronics, FPGA, ATLAS, IBL, Pixel Detector, LHC, VME
Abstract:
Background: Obstructive sleep apnea (OSA) is a respiratory disease characterized by the collapse of the extrathoracic airway and has important social implications related to accidents and cardiovascular risk. The main objective of the present study is to investigate whether the drop in expiratory flow and the volume expired in 0.2 s during the application of negative expiratory pressure (NEP) are associated with the presence and severity of OSA in a population of professional interstate bus drivers who travel medium and long distances. Methods/Design: An observational, analytic study will be carried out involving adult male employees of an interstate bus company. Those who agree to participate will undergo a detailed patient history and a physical examination including blood pressure, anthropometric data, circumference measurements (hips, waist and neck), tonsil evaluation and the Mallampati index. Moreover, specific questionnaires addressing sleep apnea and excessive daytime sleepiness will be administered. Data acquisition will be completely anonymous. Following the medical examination, the participants will undergo spirometry, the NEP test and standard overnight polysomnography. The NEP test is performed by applying negative pressure at the mouth during expiration. It is a practical test performed while awake and requires little cooperation from the subject. In the absence of expiratory flow limitation, the increase in the pressure gradient between the alveoli and the open upper airway caused by NEP results in an increase in expiratory flow. Discussion: Despite the abundance of scientific evidence, OSA is still underdiagnosed in the general population. In addition, diagnostic procedures are expensive and predictive criteria are still unsatisfactory. Because increased upper airway collapsibility is one of the main determinants of OSA, the response to the application of NEP could be a predictor of this disorder.
Through this study protocol, the expectation is to identify predictive NEP values for different degrees of OSA, contributing to an early diagnosis of this condition and reducing its impact and complications among commercial interstate bus drivers.
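As an illustration of one of the quantities the protocol measures, the volume expired in the first 0.2 s of NEP can be computed by integrating a sampled expiratory-flow signal. This is a minimal sketch: the sampling rate and the flow values below are hypothetical, not taken from the study.

```python
def volume_expired(flow_l_per_s, fs_hz, window_s=0.2):
    """Trapezoidal integration of expiratory flow (L/s) over the
    first `window_s` seconds, returning the expired volume in litres."""
    n = int(window_s * fs_hz)          # number of sample intervals
    dt = 1.0 / fs_hz
    return sum((flow_l_per_s[i] + flow_l_per_s[i + 1]) * dt / 2.0
               for i in range(n))

# Hypothetical signal: constant 0.5 L/s flow sampled at 100 Hz
fs = 100
flow = [0.5] * (fs + 1)
v02 = volume_expired(flow, fs)         # 0.5 L/s * 0.2 s = 0.1 L
```

A real analysis would of course use the measured flow curve and compare the NEP breath against the preceding control breaths.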
Abstract:
This final-year project addresses the update and refactoring of the Hecaton application. The application provides remote monitoring and control of industrial installations through a web interface. To do so, it uses sensors and actuators that, connected through a data-acquisition device to a server system, make it possible to obtain, manipulate and store the received data and events. Hecaton has been developed entirely with free software. The system can also be customized, which makes it usable in all kinds of scenarios, with the user defining the operating rules. This work is the fourth development cycle, as the application was created and extended in three previous projects. In this latest cycle, the technologies and tools that make up the application have been updated. Special emphasis was placed on the redesign of the web interface, adopting the latest web technologies to make it dynamic. In addition, some design errors were corrected and new tools for managing the software project were introduced. It is therefore a software refactoring exercise focused on obtaining an up-to-date project that uses current development methodologies and can continue to be updated in the future.
Abstract:
In fluid dynamics research, pressure measurements are of great importance for defining the flow field acting on aerodynamic surfaces. The experimental approach is in fact fundamental to avoid the complexity of the mathematical models needed to predict fluid phenomena. When in-situ sensors are used to monitor pressure over large domains with highly unsteady flows, classical techniques run into several problems due to transducer cost, intrusiveness, time response and operating range. An interesting approach to satisfying these sensor requirements is to implement a sensor network capable of acquiring pressure data on an aerodynamic surface using a wireless communication system, collecting the pressure data with the lowest possible level of environmental invasion. In this thesis a wireless sensor network for flow-field pressure measurement has been designed, built and tested. To develop the system, a capacitive pressure sensor based on a polymeric membrane, and readout circuitry based on a microcontroller, have been designed, built and tested. The wireless communication was implemented on the Zensys Z-Wave platform, and network and data management have been implemented. Finally, the full embedded system with antenna was created. As a proof of concept, the monitoring of pressure on the top of the mainsail of a sailboat was chosen as the working example.
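As a sketch of the readout step, a capacitive pressure sensor is often linearized around a calibration point. The linear characteristic C = C0 + k*P and all numeric values below are placeholders, not the actual characteristic of the polymeric-membrane sensor developed in the thesis.

```python
def pressure_from_capacitance(c_pf, c0_pf=10.0, k_pf_per_kpa=0.05):
    """Invert a linear calibration C = C0 + k*P for the pressure P (kPa).
    C0 (zero-pressure capacitance) and k (sensitivity) are hypothetical."""
    return (c_pf - c0_pf) / k_pf_per_kpa

# A measured capacitance of 10.5 pF maps back to 10 kPa
p = pressure_from_capacitance(10.5)    # (10.5 - 10.0) / 0.05 = 10.0 kPa
```

In a deployed node this conversion could run on the microcontroller before the sample is handed to the wireless stack.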
Abstract:
The presented study analyses rural landscape changes. In particular, it focuses on understanding the driving forces acting on the rural built environment by means of a statistical spatial model implemented with GIS techniques. It is well known that the study of landscape changes is essential for informed decision making in land planning. A literature review revealed a general lack of studies modelling the rural built environment, hence a theoretical modelling approach for this purpose is needed. Advances in technology and modernity in building construction and agriculture have gradually changed the rural built environment. In addition, urbanization has determined the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two main transformation dynamics affecting the rural built environment can be observed: the conversion of rural buildings and the increase in building numbers. The specific aim of the study is to propose a methodology for developing a spatial model that identifies the driving forces acting on building allocation; indeed, one of the most concerning dynamics nowadays is the irrational expansion of building sprawl across the landscape. The proposed methodology comprises conceptual steps covering the different aspects of spatial-model development: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology for data collection, the most suitable algorithm in relation to the statistical theory and method used, and the calibration and evaluation of the model.
A different combination of factors in various parts of the territory generated conditions more or less favourable to building allocation, and the existence of buildings represents the evidence of such an optimum. Conversely, the absence of buildings expresses a combination of agents that is not suitable for building allocation. The presence or absence of buildings can therefore be adopted as an indicator of these driving conditions, since it expresses the action of the driving forces in the land-suitability sorting process. The existence of a correlation between site selection and hypothetical driving forces, evaluated by means of modelling techniques, provides evidence of which driving forces are involved in the allocation dynamic and an insight into their level of influence on the process. GIS spatial-analysis tools allow the concept of presence and absence to be associated with point features, generating a point process. The presence or absence of buildings at given site locations represents the expression of the interaction of these driving factors. In the case of presences, points represent the locations of real existing buildings; conversely, absences represent locations where buildings do not exist, and are therefore generated by a stochastic mechanism. Possible driving forces are selected and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for explanatory-variable analysis and for the identification of the key driving variables behind the site-selection process for new building allocation. The model developed by following the methodology is applied to a case study to test the validity of the methodology. The study area chosen for this test is the New District of Imola, characterized by a prevailing agricultural production vocation and by intense transformation dynamics.
The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The model is calibrated on spatial data for the periurban and rural parts of the study area over the 1975-2005 period by means of a generalized linear model. The resulting output of the model fit is a continuous grid surface whose cells assume values ranging from 0 to 1, giving the probability of building occurrence across the rural and periurban parts of the study area. The response variable thus assesses the changes in the rural built environment that occurred in this time interval and is correlated with the selected explanatory variables through a generalized linear model using logistic regression. By comparing the probability map obtained from the model with the actual rural building distribution in 2005, the interpretative capability of the model can be evaluated. The proposed model can also be applied to the interpretation of trends in other study areas and over different time intervals, depending on data availability. The use of suitable data in terms of time, information and spatial resolution, and the costs related to data acquisition, pre-processing and survey, are among the most critical aspects of model implementation. Future in-depth studies can focus on using the proposed model to predict short- to medium-range future scenarios for the distribution of the rural built environment in the study area. In order to predict future scenarios it is necessary to assume that the driving forces do not change and that their levels of influence within the model are not far from those assessed for the calibration time interval.
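The presence/absence response described above can be sketched with a one-predictor logistic regression. This is a minimal stand-in for the thesis's GLM: the driving-force variable, its values and the presence labels below are entirely hypothetical, and the fit uses plain gradient ascent rather than a full GLM package.

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit p(presence) = 1/(1+exp(-(b0 + b1*x))) by gradient ascent
    on the Bernoulli log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)          # gradient w.r.t. intercept
            g1 += (y - p) * x      # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical driving force: distance to the nearest road (km);
# presence (1) of a building at sampled points, absence (0) elsewhere.
xs = [0.1, 0.2, 0.3, 0.5, 1.0, 1.5, 2.0, 3.0]
ys = [1,   1,   1,   1,   0,   0,   0,   0]
b0, b1 = fit_logistic(xs, ys)
p_near = 1.0 / (1.0 + math.exp(-(b0 + b1 * 0.2)))   # high probability
p_far  = 1.0 / (1.0 + math.exp(-(b0 + b1 * 2.5)))   # low probability
```

Evaluating the fitted probability on a grid of cells would produce exactly the kind of 0-to-1 probability surface described above, with the sign of each coefficient indicating how the corresponding driving force acts.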
Abstract:
ALICE, an experiment at the CERN LHC, is specialized in analyzing lead-ion collisions. ALICE will study the properties of quark-gluon plasma, a state of matter in which quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE subdetectors, is part of the ITS, which is composed of 6 cylindrical layers with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point; it also aligns the tracks of the particles detected by the more external detectors. The two ITS middle layers contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors. In total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data coming from the front-end electronics through optical-link connections, stores them in a large data FIFO and then sends them to the DAQ system. Each CARLOSrx is composed of two boards: one, called CARLOSrx data, reads the data coming from the SDD detectors and configures the FEE; the other, called CARLOSrx clock, sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which handle the whole SDD readout chain. A description of the software tools needed to test and configure the front-end electronics is presented at the end of the thesis.
Abstract:
The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys over large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS, Field System) not tailored to single-dish observations, required important modifications, in particular of the guiding software and data-acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry; ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide the software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of standard single-dish output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed on the Medicina antenna while waiting for the SRT to be completed: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver available worldwide at present. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work.
Tests were carried out to verify the system stability and its capabilities, down to sensitivity levels that had never been reached at Medicina with the previous observing techniques and hardware devices. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto in fact offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to quickly cover large areas of the sky (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project aims at a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed both for ESCS itself and for the multi-feed/backend system. The KNoWS group, which I am part of, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data-reduction pipeline and assess the scientific capabilities of the overall system. The K-band observations, carried out in several sessions between December 2008 and March 2010, were accompanied by a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived to check the new analogue backend separately from the multi-feed receiver and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar-cap survey to complete PMN-GB6 and provide all-sky coverage at 5 GHz).
Abstract:
Since spring 2004, the Crystal Ball detector has been used at the photon beam of the Mainz Microtron for coincidence experiments investigating the structure of the nucleons. The setup and commissioning of the calorimeter, in particular of the new detector electronics, form one focus of this work. Components were newly constructed or examined for usability and modified where necessary. After successful completion of the setup phase, experiments on the production of $\pi$ and $\eta$ mesons on the proton were carried out with more than 2500 hours of beam time. The second focus of the dissertation is the first measurement of the helicity asymmetry $I^\odot$ in the photoproduction of two neutral pions. To understand the excitation spectrum of the nucleons, experiments with polarized photons and/or polarized targets must be performed. Since models reproduce unpolarized observables comparably well despite different assumptions, the determination of polarization observables sensitive to model differences is indispensable. In contrast to single-pion production, two-pion production features a single-polarization observable that can be measured with circularly polarized photons on an unpolarized proton. This observable was determined as a function of energy and angle in the reactions $\gamma p \rightarrow p \pi^0 \pi^0$ and $\gamma p \rightarrow p \pi^+ \pi^-$. The results deviate strongly from the model predictions.
Abstract:
Thermally induced flows exist both in nature and in industry. Of interest for this research are convection in the Earth's mantle and in glass melting tanks. The material transport taking place there results from differences in density, temperature and chemical concentration within the convecting material. To improve the understanding of these processes, numerical modelling is carried out by numerous research groups. The algorithms used are usually verified through the analysis of laboratory experiments. The focus of this research is the development of a method for determining the three-dimensional temperature distribution for the investigation of thermally induced flows in a test basin. Direct temperature measurement inside the test material or the glass melt, however, influences the flow behaviour. Therefore impedance tomography, which operates without disturbing the flow, is used. The basis of this method is the extended Arrhenius relationship between temperature and specific electrical conductivity. During the laboratory experiments, a viscous polyethylene-glycol-water mixture in a basin is heated from below. Taking scaling into account, the flows generated in this way are an analogue both of the Earth's mantle and of glass melting tanks. The geoelectrical measurements are carried out via several electrodes installed on the basin walls. After the subsequent three-dimensional inversion of the electrical resistances, a model of the distribution of the specific electrical conductivity inside the test basin is obtained. This is converted into a temperature distribution by means of the extended Arrhenius formula.
To demonstrate the suitability of this method for the non-invasive determination of the three-dimensional temperature distribution, additional direct temperature measurements were carried out with several thermocouples on the basin walls and the values were compared. The interior temperatures are essentially well reconstructed, the achieved accuracy depending on the spatial and temporal resolution of the direct-current geoelectrics.
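The conductivity-to-temperature conversion can be illustrated with the plain Arrhenius law sigma = sigma0 * exp(-Ea/(R*T)); the thesis uses an extended form of this relationship, and the activation energy and prefactor below are hypothetical round numbers chosen only to show the inversion.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def temperature_from_conductivity(sigma, sigma0, e_a):
    """Invert the simple Arrhenius law sigma = sigma0 * exp(-E_a/(R*T))
    for the temperature T (K)."""
    return e_a / (R * math.log(sigma0 / sigma))

# Hypothetical material parameters
E_A = 20e3        # activation energy, J/mol
SIGMA0 = 10.0     # pre-exponential conductivity, S/m

# Forward model at 300 K, then recover the temperature from sigma
sigma_meas = SIGMA0 * math.exp(-E_A / (R * 300.0))
T = temperature_from_conductivity(sigma_meas, SIGMA0, E_A)   # 300.0 K
```

Applied cell by cell to the inverted conductivity model, this kind of mapping yields the three-dimensional temperature distribution described above.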
Abstract:
The national context has recently changed with the introduction of the new geodetic system coinciding with the European one (ETRS89, frame ETRF00), realized by the stations of the Rete Dinamica Nazionale. This geodetic system, associated with the UTM_ETRF00 cartographic system, has become mandatory by decree in public administrations. This change has made it possible to obtain much more accurate surveys of cartographic data in absolute ETRF00 coordinates. When data surveyed in this way are used for cartographic updates, they lose their original coordinates and are adapted to the surrounding cartographic features. To design a modernization of cadastral maps and technical maps aimed at allowing updates to be introduced without modifying their original absolute coordinates, the study began by evaluating how to use developments in the structuring of topographic data in the Geotopographic Database, 3D building modelling in the INSPIRE cadastral experiences, and MUDE integration between building projects and their realization. The study then evaluated the NRTK real-time positioning services available in Italy. Experiments were also carried out to verify locally the precision and reliability of the available positioning services. The critical issues of cadastral cartography stem essentially from two facts: it was originally framed in 850 separate systems and subsequently transformed into Roma40 with a very low density of re-measured points; and until 1988 it was updated with non-rigorous, low-quality procedures. To resolve these issues, the use of NRTK surveying was proposed to locally increase the density of re-measured points and re-frame the cadastral maps.
The test, carried out in Bologna, involved a preliminary analysis to identify which fiducial points (Punti Fiduciali) could be considered consistent with the cartographic specifications, and then to use them to locally increase the density of re-measured points. The experimentation allowed the project to be realized and, therefore, future updates to be inserted without modifying the ETRF00 coordinates obtained from the positioning service.
Abstract:
The A4 experiment determines the contribution of the strange quarks to the electromagnetic form factors of the nucleon by measuring parity violation in elastic electron-nucleon scattering. These measurements are carried out with the spin-polarized electron beam of the Mainz Microtron (MAMI) at beam energies between 315 and 1508 MeV. The determination of the beam polarization is indispensable for the analysis of the data in order to extract the physical asymmetry from the measured parity-violating asymmetry. For this reason, the A4 collaboration is developing a novel Compton laser-backscattering polarimeter that allows a non-destructive measurement of the beam polarization in parallel with the running parity experiment. To enable reliable continuous operation, the polarimeter was further developed in the course of this work. The data-acquisition system for the photon and electron detectors was rebuilt and optimized for processing high rates. A novel detector (LYSO) was commissioned for the detection of the backscattered photons. Furthermore, GEANT4 simulations of the detectors were carried out and an analysis framework for the extraction of Compton asymmetries from the backscattering data was developed. The analysis exploits the possibility of energy-tagging the backscattered photons through coincident detection of the scattered electrons (tagging). The differential energy scale introduced by the tagging thus makes a precise determination of the analyzing power possible. In the present work the analyzing power of the polarimeter was determined, so that the product of electron and laser beam polarization can now be measured at a beam current of 20 µA, in parallel with the running parity experiment, with a statistical precision of 1% in 24 hours at 855 MeV and <1% in 12 hours at 1508 MeV.
Combined with the determination of the laser polarization to 1% in a parallel work (Y. Imai), the statistical uncertainty of the beam polarization in the A4 experiment can be reduced from the previous 5% to 1.5% at 1508 MeV. For the parity-violating electron-scattering data at a four-momentum transfer of $Q^2 = 0.6\,(\mathrm{GeV}/c)^2$, the raw asymmetry at the current state of the analysis is $A_{PV}^{raw} = (-20.0 \pm 0.9_{stat}) \cdot 10^{-6}$. For a beam polarization of 80%, one obtains a total error of $1.68 \cdot 10^{-6}$ for $\Delta P_e/P_e = 5\%$. As a result of this work, analysing the data of the Compton laser-backscattering polarimeter will reduce this error by 29% to $1.19 \cdot 10^{-6}$ ($\Delta P_e/P_e = 1.5\%$).
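The quoted error reduction can be reproduced by propagating the statistical error on the raw asymmetry together with the relative polarization uncertainty into the physics asymmetry A_phys = A_raw/P. Combining the two contributions in quadrature is an assumption on our part, but it matches the numbers given above.

```python
import math

def physics_asymmetry_error(a_raw, sigma_stat, pol, dpol_over_pol):
    """Propagate the statistical error on the raw asymmetry and the
    relative polarization uncertainty (in quadrature) into
    A_phys = A_raw / P. Returns (A_phys, sigma_total)."""
    a_phys = a_raw / pol
    rel = math.hypot(sigma_stat / abs(a_raw), dpol_over_pol)
    return a_phys, abs(a_phys) * rel

# Numbers from the abstract: A_raw = (-20.0 +- 0.9_stat) x 10^-6, P = 0.80
_, err_5pct  = physics_asymmetry_error(-20.0e-6, 0.9e-6, 0.80, 0.05)
_, err_15pct = physics_asymmetry_error(-20.0e-6, 0.9e-6, 0.80, 0.015)
# err_5pct  ≈ 1.68e-6  (Delta P/P = 5%)
# err_15pct ≈ 1.19e-6  (Delta P/P = 1.5%), the quoted 29% reduction
```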
Abstract:
We have realized a data-acquisition chain for the use and characterization of APSEL4D, a 32 x 128 Monolithic Active Pixel Sensor developed as a prototype for frontier experiments in high-energy particle physics. In particular, a transition board was realized for the conversion between the chip and FPGA voltage levels and for signal-quality enhancement. A Xilinx Spartan-3 FPGA was used for real-time data processing, chip control and communication with a personal computer through a USB 2.0 port; for this purpose firmware was written in VHDL. Finally, a graphical user interface for online system monitoring, hit display and chip control, based on windows and widgets, was realized in C++ using the dedicated Qt and Qwt libraries. APSEL4D and the full acquisition chain were characterized for the first time with the electron beam of a transmission electron microscope and with 55Fe and 90Sr radioactive sources. In addition, a beam test was performed at the T9 station of the CERN PS, where hadrons with a momentum of 12 GeV/c are available. The very high time resolution of APSEL4D (up to 2.5 Mfps, though used at 6 kfps) was fundamental in realizing a single-electron Young experiment using nanometric double slits obtained by a FIB technique. On high-statistics samples, it was possible to observe the interference and diffraction of single isolated electrons traveling inside a transmission electron microscope. For the first time, the information on the distribution of the arrival times of the single electrons has been extracted.
Abstract:
The charmonium production cross-section was measured using data from pp collisions at s^{1/2} = 7 TeV recorded by the ATLAS experiment at the LHC in 2010. To improve the necessary understanding of the detector, an energy calibration was performed. Using electrons from charmonium decays, the energy scale of the electromagnetic calorimeters was studied at low energies. After applying the calibration, deviations of less than 0.5% were found for the energy measurement compared with energies measured in Monte Carlo simulations. With an integrated luminosity of 2.2 pb^{-1}, a first measurement of the inclusive cross-section for the process pp -> J/psi(e^{+}e^{-}) + X at s^{1/2} = 7 TeV was performed, in the accessible range of transverse momenta p_{T,ee} > 7 GeV and rapidities |y_{ee}| < 2.4. Differential cross-sections were determined for the transverse momentum p_{T,ee} and the rapidity |y_{ee}|. Integrating both distributions yielded for the inclusive cross-section sigma(pp -> J/psi X) BR(J/psi -> e^{+}e^{-}) the values (85.1 +/- 1.9_{stat} +/- 11.2_{syst} +/- 2.9_{lum}) nb and (75.4 +/- 1.6_{stat} +/- 11.9_{syst} +/- 2.6_{lum}) nb, which are compatible within the systematics. Comparisons with ATLAS and CMS measurements of the process pp -> J/psi(mu^{+}mu^{-}) + X showed good agreement. For comparison with theory, predictions from various models at next-to-leading order were combined with contributions at next-to-next-to-leading order. The comparison shows good agreement when the next-to-next-to-leading-order contributions are taken into account.
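The total uncertainty on each cross-section value can be sketched by combining the statistical, systematic and luminosity components in quadrature. This is the standard convention for uncorrelated uncertainties and an assumption here; the thesis may treat correlations between the components differently.

```python
import math

def total_error(stat, syst, lum):
    """Combine uncorrelated uncertainty components in quadrature."""
    return math.sqrt(stat**2 + syst**2 + lum**2)

# Components quoted above, in nb
err1 = total_error(1.9, 11.2, 2.9)   # ≈ 11.7 nb, dominated by the systematics
err2 = total_error(1.6, 11.9, 2.6)   # ≈ 12.3 nb
```

With totals of roughly 11.7 nb and 12.3 nb, the two values 85.1 nb and 75.4 nb are indeed compatible within about one standard deviation, as stated.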
Abstract:
Tractor rollover represents a primary cause of death or serious injury in agriculture and, despite the mandatory Roll-Over Protective Structures (ROPS), which have reduced the number of injuries, tractor accidents are still of great concern. Because of their versatility and wide use, many studies on safety address the stability of tractors, but they often rely on controlled or laboratory tests. The evaluation of tractors working in the field, instead, is a very complex issue because rollover can be influenced by the interaction among operator, tractor and environment. Recent studies are oriented towards evaluating actual working conditions by developing prototypes for driver assistance and data acquisition, and such devices are now produced and sold by manufacturers. A warning device was assessed in this study with the aim of evaluating its performance and collecting data on the different variables influencing tractor dynamics in the field, by continuously monitoring the working conditions of tractors operating at the experimental farm of the University of Bologna. The device consists of accelerometers, a gyroscope, GSM/GPRS and GPS modules for geo-referencing, and a transceiver for the automatic recognition of tractor-connected equipment. A microprocessor processes the data and, through a dedicated algorithm requiring the geometry of the tested tractor, provides information on the level of risk for the operator in terms of probable loss of stability and suggests corrective measures to reduce the potential instability of the tractor.
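A minimal sketch of the kind of stability check such a device can perform: estimate the quasi-static roll angle from two accelerometer components and compare it with a tipping angle derived from the tractor geometry. The tipping angle, the warning threshold and the readings below are hypothetical illustrations, not the commercial device's algorithm.

```python
import math

def roll_angle_deg(a_y, a_z):
    """Quasi-static roll angle from lateral (a_y) and vertical (a_z)
    accelerometer components, in degrees."""
    return math.degrees(math.atan2(a_y, a_z))

def risk_level(roll_deg, tipping_angle_deg, warn_fraction=0.6):
    """Map the current roll angle to a coarse risk level; the static
    tipping angle would come from the tractor geometry data the
    algorithm requires."""
    r = abs(roll_deg) / tipping_angle_deg
    if r >= 1.0:
        return "rollover"
    if r >= warn_fraction:
        return "warning"
    return "safe"

# Hypothetical tractor with a 35-degree static tipping angle,
# traversing a slope: a_y = 4.0, a_z = 8.0 m/s^2
angle = roll_angle_deg(4.0, 8.0)     # ≈ 26.6 degrees
level = risk_level(angle, 35.0)      # "warning"
```

A real implementation would fuse accelerometer and gyroscope data and account for dynamic effects, but the comparison against a geometry-derived limit is the core idea.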
Abstract:
The new stage of the Mainz Microtron, MAMI, at the Institute for Nuclear Physics of the Johannes Gutenberg University, operational since 2007, allows open-strangeness experiments to be performed. Covering the lack of electroproduction data at very low Q^2, the p(e,K+)Lambda and p(e,K+)Sigma0 reactions have been studied at Q^2 = 0.036 (GeV/c)^2 and Q^2 = 0.05 (GeV/c)^2 over a large angular range. Cross-sections at W = 1.75 GeV will be given in angular bins and compared with the predictions of the Saclay-Lyon and Kaon-Maid isobaric models. We conclude that the original Kaon-Maid model, which has large longitudinal couplings of the photon to nucleon resonances, is unphysical. Extensive studies of the suitability of silicon photomultipliers as readout devices for a scintillating-fibre tracking detector, with potential applications in both the positive and negative arms of the spectrometer, are presented as well.
Abstract:
This thesis presents a CMOS amplifier with high common-mode rejection designed in UMC 130 nm technology. The goal is to achieve a high amplification factor for a wide range of biological signals (with frequencies in the range 10 Hz-1 kHz) and to reject the common-mode noise signal. A data-acquisition system is presented, composed of a Delta-Sigma-like modulator and an antenna, which is the core of a portable low-complexity radio system; the amplifier is designed to interface the data-acquisition system with a sensor that acquires the electrical signal. The modulator asynchronously acquires and samples human muscle activity, sending a quasi-digital pattern that encodes the acquired signal. There is only a minor loss of information in translating the muscle activity with this pattern, compared to an encoding technique that uses a standard digital signal via Impulse-Radio Ultra-Wide Band (IR-UWB). The biological signals needed for electromyographic analysis have an amplitude of 10-100 µV and must be highly amplified and separated from the overwhelming 50 mV common-mode noise signal. Various proof-of-concept tests are presented, as well as proof that the design works with different sensors, such as radiation measurement for dosimetry studies.
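A back-of-the-envelope check of the rejection requirement implied by the numbers above: for the residual of the 50 mV common-mode signal to stay below the smallest 10 µV signal of interest, roughly 74 dB of CMRR is needed. The margin factor is our own illustrative parameter, not a specification from the thesis.

```python
import math

def required_cmrr_db(v_cm, v_signal, margin=1.0):
    """CMRR needed so that the residual common-mode contribution is no
    larger than `margin` times the smallest signal of interest."""
    return 20.0 * math.log10(v_cm / (margin * v_signal))

# 50 mV common mode vs. a 10 uV EMG signal (worst case from the abstract)
cmrr = required_cmrr_db(50e-3, 10e-6)   # ≈ 74 dB
```

In practice a design would target a comfortable margin above this floor, since electrode mismatch degrades the effective rejection.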