972 results for Read Out Driver, Data Acquisition, Electronics, FPGA, ATLAS, IBL, Pixel Detector, LHC, VME


Relevance: 100.00%

Abstract:

Thermally induced flows occur both in nature and in industry. Of interest for this research are the convection in the Earth's mantle and in glass melting tanks. The material transport taking place there results from differences in density, temperature and chemical concentration within the convecting material. To improve the understanding of the processes involved, numerical modelling is carried out by numerous research groups, and the algorithms used for it are usually verified by analysing laboratory experiments. The focus of this work is the development of a method for determining the three-dimensional temperature distribution in order to investigate thermally induced flows in an experimental tank. A direct temperature measurement inside the experimental material or the glass melt, however, disturbs the flow behaviour. Therefore impedance tomography, which operates without disturbing the flow, is used. The basis of this method is the extended Arrhenius relationship between temperature and specific electrical conductivity. During the laboratory experiments a viscous polyethylene glycol–water mixture in a tank is heated from below; taking the scaling into account, the flows generated in this way are an analogue both to the Earth's mantle and to glass melting tanks. The geoelectric measurements are made via several electrodes installed on the tank walls. After the subsequent three-dimensional inversion of the electrical resistances, a model of the distribution of the specific electrical conductivity inside the experimental tank is obtained, which is converted into a temperature distribution by means of the extended Arrhenius formula. To demonstrate the suitability of this method for the non-invasive determination of the three-dimensional temperature distribution, additional direct temperature measurements were made with several thermocouples on the tank walls and the values were compared. In essence, the interior temperatures can be reconstructed well, with the achieved accuracy depending on the spatial and temporal resolution of the direct-current geoelectric measurements.
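The exact form of the extended Arrhenius relationship is not given in the abstract; purely as an illustrative sketch, if it reduces to the standard Arrhenius form, the conversion from the reconstructed conductivity model to temperature would read $\sigma(T) = \sigma_0 \exp(-E_a/(k_B T))$, inverted as $T = E_a / (k_B \ln(\sigma_0/\sigma))$, where $\sigma_0$ and the activation energy $E_a$ are material parameters that would have to be calibrated for the polyethylene glycol–water mixture; any additional terms of the extended relation are omitted here.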

Relevance: 100.00%

Abstract:

The national context has recently changed with the introduction of the new geodetic system, coincident with the European one (ETRS89, frame ETRF00) and realized through the stations of the Rete Dinamica Nazionale. This geodetic system, associated with the UTM_ETRF00 map projection, has become mandatory by decree for public administrations. The change has made it possible to survey cartographic data in absolute ETRF00 coordinates with much higher accuracy. When data surveyed in this way are used for cartographic updates, however, they lose their original coordinates and are adapted to the surrounding cartographic features. To design a modernization of the cadastral maps and technical maps aimed at allowing updates to be introduced without modifying their original absolute coordinates, the study began by evaluating how to exploit developments in the structuring of topographic data in the Database Geotopografico, 3D building modelling in the INSPIRE cadastral experience, and the integration, within the MUDE framework, between building projects and their as-built realizations. The study then evaluated the NRTK real-time positioning services available in Italy, and experiments were carried out to verify locally the precision and reliability of these positioning services. The critical issues of the cadastral cartography derive essentially from two facts: it was originally framed in 850 local systems and later transformed into Roma40 with a very low density of re-measured points, and until 1988 it was updated with non-rigorous, low-quality procedures. To resolve these issues, it was therefore proposed to exploit NRTK surveying to increase the local density of re-measured points and re-frame the cadastral maps. The test, carried out in Bologna, involved a preliminary analysis to identify which fiducial points (Punti Fiduciali) could be considered consistent with the cartographic specifications, which were then used to increase the local density of re-measured points. The experiment allowed the project to be realized, so that future updates can be inserted without modifying the ETRF00 coordinates obtained from the positioning service.

Relevance: 100.00%

Abstract:

The A4 experiment determines the strange-quark contribution to the electromagnetic form factors of the nucleon by measuring parity violation in elastic electron–nucleon scattering. These measurements are performed with the spin-polarized electron beam of the Mainz Microtron (MAMI) at beam energies between 315 and 1508 MeV. Determining the degree of beam polarization is essential for the data analysis, in order to extract the physical asymmetry from the measured parity-violating asymmetry. For this reason the A4 collaboration is developing a novel Compton laser-backscattering polarimeter that allows a non-destructive measurement of the beam polarization in parallel with the running parity experiment. To enable reliable continuous operation, the polarimeter was further developed within this work. The data acquisition system for the photon and electron detectors was rebuilt and optimized for handling high rates, and a novel detector (LYSO) was commissioned for the detection of the backscattered photons. In addition, GEANT4 simulations of the detectors were performed and an analysis framework for extracting Compton asymmetries from the backscattering data was developed. The analysis method exploits the possibility of detecting the backscattered photons energy-tagged via coincident detection of the scattered electrons (tagging); the differential energy scale introduced by the energy tagging thus makes a precise determination of the analyzing power possible. In the present work the analyzing power of the polarimeter was determined, so that the product of electron and laser beam polarization can now be measured at a beam current of 20 µA, in parallel with the running parity experiment, with a statistical precision of 1% in 24 hours at 855 MeV and <1% in 12 hours at 1508 MeV. Combined with the determination of the laser polarization to 1% in a parallel work (Y. Imai), the statistical uncertainty of the beam polarization in the A4 experiment can be reduced from previously 5% to 1.5% at 1508 MeV. For the data on parity-violating electron scattering at a four-momentum transfer of $Q^2 = 0.6\,(GeV/c)^2$, the raw asymmetry at the current state of the analysis is $A_{PV}^{raw} = (-20.0 \pm 0.9_{stat}) \cdot 10^{-6}$. For a beam polarization of 80% one obtains a total error of $1.68 \cdot 10^{-6}$ for $\Delta P_e/P_e = 5\%$. As a result of this work, analysing the data of the Compton laser-backscattering polarimeter will reduce this error by 29% to $1.19 \cdot 10^{-6}$ ($\Delta P_e/P_e = 1.5\%$).
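The quoted total errors can be reproduced if one assumes that the physics asymmetry is $A_{PV} = A_{PV}^{raw}/P_e$ and that the statistical and polarization uncertainties add in quadrature, $\Delta A_{PV} = \sqrt{(\Delta A^{raw}_{stat}/P_e)^2 + (A_{PV}\,\Delta P_e/P_e)^2}$. With $A_{PV}^{raw} = -20.0 \cdot 10^{-6}$, $\Delta A^{raw}_{stat} = 0.9 \cdot 10^{-6}$ and $P_e = 0.8$ (so $A_{PV} \approx -25 \cdot 10^{-6}$), this gives $\approx 1.68 \cdot 10^{-6}$ for $\Delta P_e/P_e = 5\%$ and $\approx 1.19 \cdot 10^{-6}$ for $\Delta P_e/P_e = 1.5\%$, matching the values quoted above.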

Relevance: 100.00%

Abstract:

We have realized a data acquisition chain for the use and characterization of APSEL4D, a 32 × 128 monolithic active pixel sensor developed as a prototype for frontier experiments in high-energy particle physics. In particular, a transition board was realized to convert between the chip and FPGA voltage levels and to improve signal quality. A Xilinx Spartan-3 FPGA was used for real-time data processing, chip control and communication with a personal computer through a USB 2.0 port; for this purpose, firmware was written in VHDL. Finally, a graphical user interface for online system monitoring, hit display and chip control, based on windows and widgets, was realized in C++ using the Qt and Qwt libraries. APSEL4D and the full acquisition chain were characterized for the first time with the electron beam of a transmission electron microscope and with 55Fe and 90Sr radioactive sources. In addition, a beam test was performed at the T9 station of the CERN PS, where hadrons with a momentum of 12 GeV/c are available. The very high time resolution of APSEL4D (up to 2.5 Mfps, but used at 6 kfps) was fundamental in realizing a single-electron Young experiment using nanometric double slits obtained by a FIB technique. With high-statistics samples, it was possible to observe the interference and diffraction of single, isolated electrons travelling inside a transmission electron microscope. For the first time, the information on the distribution of the arrival times of the single electrons has been extracted.
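Purely as an illustration of the host-side processing (the actual APSEL4D/FPGA data format is not given in the abstract, so the hit-word layout and names below are hypothetical), a minimal Python sketch of a hit-map accumulator for the 32 × 128 matrix could look like this:

import numpy as np

N_ROWS, N_COLS = 32, 128  # APSEL4D matrix size quoted in the abstract

def decode_hit(word):
    """Decode one hit word into (row, col, timestamp).
    Hypothetical layout: bits 0-4 row, bits 5-11 column, bits 12-31 timestamp;
    the real read-out format is not documented here."""
    row = word & 0x1F
    col = (word >> 5) & 0x7F
    timestamp = (word >> 12) & 0xFFFFF
    return row, col, timestamp

def accumulate_hits(words):
    """Build a 2D occupancy map from a stream of hit words."""
    hit_map = np.zeros((N_ROWS, N_COLS), dtype=np.int64)
    for w in words:
        row, col, _ = decode_hit(w)
        hit_map[row, col] += 1
    return hit_map

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake words standing in for the USB 2.0 data stream.
    fake = [int(rng.integers(0, N_ROWS)) | (int(rng.integers(0, N_COLS)) << 5)
            for _ in range(1000)]
    print(accumulate_hits(fake).sum())  # -> 1000

An equivalent accumulator, implemented in the C++/Qt application, would be what drives the online hit display mentioned above.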

Relevance: 100.00%

Abstract:

The charmonium production cross section was measured using data from pp collisions at √s = 7 TeV recorded by the ATLAS experiment at the LHC in 2010. To improve the required understanding of the detector, an energy calibration was performed. Using electrons from charmonium decays, the energy scale of the electromagnetic calorimeters was studied at low energies; after applying the calibration, deviations of less than 0.5% were found between the measured energies and those obtained in Monte Carlo simulations. With an integrated luminosity of 2.2 pb^{-1}, a first measurement of the inclusive cross section for the process pp -> J/psi(e^{+}e^{-}) + X at √s = 7 TeV was performed, in the accessible range of transverse momenta p_{T,ee} > 7 GeV and rapidities |y_{ee}| < 2.4. Differential cross sections were determined as functions of the transverse momentum p_{T,ee} and the rapidity |y_{ee}|. Integrating the two distributions yielded, for the inclusive cross section sigma(pp -> J/psi X) BR(J/psi -> e^{+}e^{-}), the values (85.1 ± 1.9_{stat} ± 11.2_{syst} ± 2.9_{lumi}) nb and (75.4 ± 1.6_{stat} ± 11.9_{syst} ± 2.6_{lumi}) nb, which are compatible within the systematic uncertainties. Comparisons with ATLAS and CMS measurements of the process pp -> J/psi(mu^{+}mu^{-}) + X showed good agreement. For the comparison with theory, predictions from various models at next-to-leading order were combined with contributions at next-to-next-to-leading order; the comparison shows good agreement when the next-to-next-to-leading-order contributions are taken into account.

Relevance: 100.00%

Abstract:

Tractor rollover represents a primary cause of death or serious injury in agriculture, and despite mandatory Roll-Over Protective Structures (ROPS), which have reduced the number of injuries, tractor accidents remain of great concern. Because of the versatility and wide use of tractors, many safety studies address their stability, but they often rely on controlled or laboratory tests. Evaluating tractors working in the field, by contrast, is a complex issue, because rollover can be influenced by the interaction among operator, tractor and environment. Recent studies have moved towards evaluating actual working conditions by developing prototypes for driver assistance and data acquisition, and such devices are now produced and sold by manufacturers. In this study a warning device was assessed with the aim of evaluating its performance and collecting data on the variables influencing tractor dynamics in the field, by continuously monitoring the working conditions of tractors operating at the experimental farm of the University of Bologna. The device consists of accelerometers, a gyroscope, a GSM/GPRS module, a GPS receiver for geo-referencing, and a transceiver for automatic recognition of the equipment connected to the tractor. A microprocessor processes the data and, through a dedicated algorithm that requires the geometry of the tested tractor, provides information on the operator's level of risk in terms of probable loss of stability and suggests corrective measures to reduce the potential instability of the tractor.
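The abstract does not disclose the device's algorithm; as a rough illustration only, a quasi-static lateral rollover criterion of the kind such an algorithm could build on compares the measured lateral acceleration with a geometric threshold, $a_{y,crit} \approx g\,(t/2)/h_{cg}$ on level ground, where $t$ is the track width and $h_{cg}$ the height of the centre of gravity; on a side slope of angle $\alpha$ the margin shrinks roughly to $a_{y,crit} \approx g\,[(t/2)\cos\alpha - h_{cg}\sin\alpha]/h_{cg}$. A real device would refine such a static bound with the gyroscope data and with the geometry of the attached implement.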

Relevance: 100.00%

Abstract:

The new stage of the Mainz Microtron, MAMI, at the Institute for Nuclear Physics of the Johannes Gutenberg University, operational since 2007, allows open-strangeness experiments to be performed. To address the lack of electroproduction data at very low Q^2, the p(e,K+)Lambda and p(e,K+)Sigma0 reactions have been studied at Q^2 = 0.036 (GeV/c)^2 and Q^2 = 0.05 (GeV/c)^2 over a large angular range. Cross sections at W = 1.75 GeV are given in angular bins and compared with the predictions of the Saclay-Lyon and Kaon-MAID isobaric models. We conclude that the original Kaon-MAID model, which has large longitudinal couplings of the photon to nucleon resonances, is unphysical. Extensive studies of the suitability of silicon photomultipliers as read-out devices for a scintillating-fiber tracking detector, with potential applications in both the positive and negative arms of the spectrometer, are presented as well.

Relevance: 100.00%

Abstract:

This thesis presents a CMOS amplifier with high common-mode rejection designed in UMC 130 nm technology. The goal is to achieve a high amplification factor for a wide range of biological signals (with frequencies in the range of 10 Hz–1 kHz) and to reject the common-mode noise signal. A data acquisition system is presented, composed of a Delta-Sigma-like modulator and an antenna, which forms the core of a portable low-complexity radio system; the amplifier is designed to interface this data acquisition system with a sensor that acquires the electrical signal. The modulator asynchronously acquires and samples human muscle activity, sending a quasi-digital pattern that encodes the acquired signal. Only a minor loss of information occurs when the muscle activity is translated using this pattern, compared with an encoding technique that uses a standard digital signal via Impulse-Radio Ultra-Wide Band (IR-UWB). The biological signals needed for electromyographic analysis have amplitudes of 10–100 µV and must be highly amplified and separated from the overwhelming 50 mV common-mode noise signal. Various proof-of-concept tests are presented, as well as evidence that the design also works with different sensors, such as radiation measurements for dosimetry studies.
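To put these numbers into perspective (a back-of-the-envelope estimate, not a specification from the thesis): with a 10 µV differential signal and a 50 mV common-mode disturbance, the interferer is 5000 times larger than the smallest signal, so the amplifier's common-mode rejection ratio must exceed roughly $20\log_{10}(50\,mV / 10\,\mu V) \approx 74\ dB$ for the common-mode residue at the output to stay below the weakest electromyographic signal.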

Relevance: 100.00%

Abstract:

This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate the hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process, so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test-cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the data required for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multi-cylinder diesel engine have been examined from a model-training perspective. A single-cylinder engine with external air handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a large difference between exhaust and intake manifold pressures (engine ΔP) during transients, it is recommended that transient emission models be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations are made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed. The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh-air flow rates, while the second is driven by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, while cylinder-to-cylinder EGR distribution has been shown to be present but unaccounted for by the ECM. The two modes and the associated phenomena are essential to understanding why transient emission models are calibration dependent and, furthermore, how to choose training data that will result in good model generalization.
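The abstract does not state how transport delays and sensor lags were compensated; one common approach, shown here only as an illustrative sketch and not as the authors' method, is to estimate the delay of a slow channel against a fast reference channel from the peak of their cross-correlation and shift it accordingly:

import numpy as np

def estimate_delay(reference, delayed):
    """Estimate the lag (in samples) of `delayed` relative to `reference`
    from the peak of the cross-correlation."""
    ref = reference - reference.mean()
    sig = delayed - delayed.mean()
    corr = np.correlate(sig, ref, mode="full")
    return int(np.argmax(corr) - (len(ref) - 1))

def align(delayed, lag):
    """Undo an integer-sample delay (circular shift; adequate for a sketch)."""
    return np.roll(delayed, -lag)

if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 1000)
    fast_channel = np.sin(t)               # stand-in for a fast reference channel
    slow_channel = np.roll(np.sin(t), 25)  # stand-in for a lagged emission signal
    lag = estimate_delay(fast_channel, slow_channel)
    print(lag)                             # ~25 samples
    aligned = align(slow_channel, lag)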

Relevance: 100.00%

Abstract:

A search for a heavy standard model Higgs boson decaying via H→ZZ→ℓ+ℓ−νν, where ℓ = e, μ, is presented. It is based on proton-proton collision data at √s = 7 TeV, collected by the ATLAS experiment at the LHC in the first half of 2011 and corresponding to an integrated luminosity of 1.04 fb^{-1}. The data are compared to the expected standard model backgrounds. The data and the background expectations are found to be in agreement, and upper limits are placed on the Higgs boson production cross section over the entire mass window considered; in particular, the production of a standard model Higgs boson is excluded in the region 340

Relevance: 100.00%

Abstract:

This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of the quality of microarray data and the incorporation of quality information in subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the considerable noise and hence random features in the data. For all of these components, access to a flexible and efficient statistical computing environment is essential.
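As a small illustration of one of the low-level steps mentioned above (normalization of fluorescence intensities), a generic global median normalization of two-colour log-ratios can be sketched as follows; this is an illustrative example, not the specific pipeline discussed in the article:

import numpy as np

def normalize_log_ratios(red, green):
    """Median-centre the log2 ratios M = log2(R/G) of a two-colour array.

    A generic global normalization: it assumes most genes are not
    differentially expressed, so the median log-ratio should be zero."""
    m = np.log2(red) - np.log2(green)
    return m - np.median(m)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    green = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)
    # Simulated dye bias: the red channel is systematically ~30% brighter.
    red = green * 1.3 * rng.lognormal(mean=0.0, sigma=0.1, size=10_000)
    m_norm = normalize_log_ratios(red, green)
    print(float(np.median(m_norm)))  # ~0.0 after normalization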

Relevance: 100.00%

Abstract:

Currently, observations of space debris are primarily performed with ground-based sensors. These sensors have a detection limit at some centimetres diameter for objects in Low Earth Orbit (LEO) and at about two decimetres diameter for objects in Geostationary Orbit (GEO). The few space-based debris observations stem mainly from in-situ measurements and from the analysis of returned spacecraft surfaces; both provide information about mostly sub-millimetre-sized debris particles. As a consequence, the population of centimetre- and millimetre-sized debris objects remains poorly understood. The development, validation and improvement of debris reference models drive the need for measurements covering the whole diameter range. In 2003 the European Space Agency (ESA) initiated a study entitled “Space-Based Optical Observation of Space Debris”. The first tasks of the study were to define user requirements and to develop an observation strategy for a space-based instrument capable of observing uncatalogued millimetre-sized debris objects. Only passive optical observations were considered, focusing on mission concepts for the LEO and GEO regions, respectively. Starting from the requirements and the observation strategy, an instrument system architecture and an associated operations concept have been elaborated. The instrument system architecture covers the telescope, camera and onboard processing electronics. The proposed telescope is a folded Schmidt design, characterised by a 20 cm aperture and a large field of view of 6°. The camera design is based on the use of either a frame-transfer charge-coupled device (CCD) or a cooled hybrid sensor with fast read-out; a four-megapixel sensor is foreseen. For the onboard processing, a scalable architecture has been selected. Performance simulations have been executed for the system as designed, focusing on the orbit determination of observed debris particles and on the analysis of the object detection algorithms. In this paper we present some of the main results of the study. A short overview of the user requirements and observation strategy is given. The architectural design of the instrument is discussed, and the main trade-offs are outlined. An insight into the results of the performance simulations is provided.
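A back-of-the-envelope consequence of these numbers (assuming, hypothetically, that the four-megapixel sensor is a square 2048 × 2048 array, which the paper does not state): the 6° field of view corresponds to roughly $6 \times 3600 / 2048 \approx 10.5$ arcsec per pixel, which sets the angular sampling available to the onboard object detection algorithms.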

Relevance: 100.00%

Abstract:

The verification possibilities of dynamically collimated treatment beams with a scanning liquid ionization chamber electronic portal imaging device (SLIC-EPID) are investigated. The ion concentration in the liquid of a SLIC-EPID, and therefore the read-out signal, is determined by the two parameters of a differential equation describing the creation and recombination of the ions. Because of the form of this equation, the portal imaging detector behaves as a nonlinear dynamic system with memory. In this work, the parameters of the differential equation were determined experimentally for the particular chamber in use and for an incident open 6 MV photon beam. The mathematical description of the ion concentration was then used to predict portal images of intensity-modulated photon beams produced by a dynamic delivery technique, the sliding-window approach. Owing to the nature of the differential equation, a mathematical condition for 'reliable leaf motion verification' in the sliding-window technique can be formulated. It is shown that the time constants for both formation and decay of the equilibrium concentration in the chamber are of the order of seconds. To guarantee reliable leaf motion verification, these time constants impose a constraint on the rapidity of the image read-out for a given maximum leaf speed. For a leaf speed of 2 cm/s, a minimum image acquisition frequency of about 2 Hz is required. Current SLIC-EPID systems are usually too slow, since they need about a second to acquire a portal image. However, if the condition is fulfilled, the memory property of the system can be used to reconstruct the leaf motion. It is shown that a simple edge-detection algorithm can be employed to determine the leaf positions. The method is also very robust against image noise.
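The abstract identifies the two parameters but not the explicit equation; a liquid ionization chamber of this kind is commonly described by a creation–recombination rate equation of the form $dn/dt = c(t) - k\,n^2(t)$, where $n$ is the ion concentration, $c(t)$ the creation rate (proportional to the instantaneous dose rate) and $k$ the recombination coefficient, so the two parameters referred to above would correspond to $c$ and $k$. The quadratic recombination term is what makes the detector response nonlinear and gives it memory: the equilibrium concentration $n_{eq} = \sqrt{c/k}$ is approached with the finite time constants of the order of seconds mentioned above.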

Relevance: 100.00%

Abstract:

OBJECT: The localization of any given target in the brain has become a challenging issue because of the increased use of deep brain stimulation to treat Parkinson disease, dystonia, and nonmotor diseases (for example, Tourette syndrome, obsessive-compulsive disorder, and depression). The aim of this study was to develop an automated method of adapting an atlas of the human basal ganglia to the brains of individual patients. METHODS: Magnetic resonance images of the brain specimen were obtained before extraction from the skull and histological processing. Adaptation of the atlas to individual patient anatomy was performed by reshaping the atlas MR images to the images obtained in the individual patient using a hierarchical registration applied to a region of interest centered on the basal ganglia, and then applying the reshaping matrix to the atlas surfaces. RESULTS: Results were evaluated by direct visual inspection of the structures visible on MR images and atlas anatomy, by comparison with intraoperative electrophysiological data, and by comparison with previous atlas studies in patients with Parkinson disease. The method was both robust and accurate, never failing to provide an anatomically reliable atlas-to-patient registration. The registration obtained did not exceed a 1-mm mismatch with the electrophysiological signatures in the region of the subthalamic nucleus. CONCLUSIONS: This registration method applied to the basal ganglia atlas provides a powerful and reliable means of determining deep brain stimulation targets within the basal ganglia of individual patients.

Relevance: 100.00%

Abstract:

This Letter describes a model-independent search for the production of new resonances in photon + jet events using 20 fb^{-1} of proton-proton LHC data recorded with the ATLAS detector at a centre-of-mass energy of √s = 8 TeV. The photon + jet mass distribution is compared to a background model obtained from a fit to the data; no significant deviation from the background-only hypothesis is found. Limits are set at 95% credibility level on generic Gaussian-shaped signals and on two benchmark phenomena beyond the Standard Model: non-thermal quantum black holes and excited quarks. Non-thermal quantum black holes are excluded below masses of 4.6 TeV and excited quarks are excluded below masses of 3.5 TeV.