953 results for DATA QUALITY
Abstract:
This article aims to estimate the population coverage of the Food and Nutrition Surveillance System (SISVAN) at different life stages and to evaluate its operation in the state of São Paulo. The study included 65 municipalities distributed across 14 regions of the state. SISVAN coverage was estimated from the nutritional-status monitoring data available in public reports and from the number of users attending public health services. The total number of users was obtained as the difference between the total number of inhabitants and the number of beneficiaries of private health plans. Most regions showed low coverage (<10%), and about 57% presented coverage between 5 and 10%. Records of nutritional status were dominated by children in all regions of the state. The very low coverage among the elderly is striking, being nonexistent or close to zero in most regions. Despite government efforts to expand and improve SISVAN, nutritional monitoring in the state of São Paulo remains insufficient. This compromises its use in the design of effective food and nutrition policies.
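The coverage estimate described above reduces to simple arithmetic; a minimal sketch, with all figures hypothetical:

```python
def sisvan_coverage(nutritional_records: int, inhabitants: int,
                    private_plan_beneficiaries: int) -> float:
    """Estimate SISVAN coverage (%) as the share of public-health-service
    users whose nutritional status was recorded."""
    # Public-system users = inhabitants minus private-plan beneficiaries.
    public_users = inhabitants - private_plan_beneficiaries
    return 100.0 * nutritional_records / public_users

# Hypothetical municipality: 500,000 inhabitants, 150,000 with private plans,
# 28,000 nutritional-status records -> coverage in the 5-10% band reported.
print(round(sisvan_coverage(28_000, 500_000, 150_000), 1))  # 8.0
```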
Abstract:
In the past few years, several GPS (Global Positioning System) positioning techniques have been developed and/or improved with the goal of obtaining high accuracy and productivity in real time. The reference station network concept, besides enabling quality and reliability in positioning for the scientific and civil GPS community, allows studies of tropospheric refraction modeling in the network region. Among the methods available for transmitting network corrections to users is the VRS (Virtual Reference Station) concept, in which the data of a virtual station are generated near the rover receiver (user). This provides a short baseline and allows the user to accomplish relative positioning with a single-frequency receiver. This paper describes the methodology applied to generate VRS data using different tropospheric models. Comparative tests were conducted across the four seasons with the NWP/INPE (Numerical Weather Prediction/National Institute for Space Research) and Hopfield tropospheric models. To analyse the VRS data quality, the Precise Point Positioning (PPP) method was used, and satisfactory results were found. Mean differences between the NWP/INPE and Hopfield models of 9.75% and 24.2% were obtained for the hydrostatic and wet delays, respectively.
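The Hopfield model mentioned above has a commonly quoted closed form for the zenith delays; the constants and layer-top heights in this sketch are the textbook defaults, not necessarily those used in the paper:

```python
def hopfield_zenith_delays(pressure_hpa, temp_k, e_hpa, h_station_m=0.0):
    """Hopfield zenith tropospheric delays in metres.
    pressure_hpa: total surface pressure; e_hpa: water-vapour partial pressure."""
    # Quartic refractivity profile integrated to the layer tops gives
    # ZD = 1e-6 * N0 * (h_top - h_station) / 5 for each component.
    h_dry = 40136.0 + 148.72 * (temp_k - 273.16)  # dry-layer top height [m]
    h_wet = 11000.0                               # conventional wet-layer top [m]
    zhd = 1.552e-5 * (pressure_hpa / temp_k) * (h_dry - h_station_m)
    zwd = 0.0746 * (e_hpa / temp_k**2) * (h_wet - h_station_m)
    return zhd, zwd

# Standard-atmosphere surface values: roughly 2.3 m dry and 0.1 m wet delay.
zhd, zwd = hopfield_zenith_delays(1013.25, 288.15, 11.0)
print(f"ZHD = {zhd:.3f} m, ZWD = {zwd:.3f} m")
```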
Abstract:
Data from reference stations are widely used in GNSS (Global Navigation Satellite System) positioning, whether in relative positioning or in the network-based positioning concept. Positioning accuracy is directly influenced by errors in the signals collected at these stations. This paper evaluates the quality of these data using temporal series of the multipath indices MP1 and MP2. A statistical study of temporal series comprising 7 years of daily observations from 7 stations of the RBMC (Rede Brasileira de Monitoramento Contínuo) was carried out. To investigate trends and seasonality, a linear regression model, correlograms, and Fourier periodograms were used, together with a harmonic adjustment to identify peaks in the temporal series. Finally, the possible causes of the seasonality found at some stations are discussed. Peaks in MP values were also identified in March and October, mainly at stations located near the geomagnetic equator.
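The trend-and-seasonality screening described above can be sketched as follows; the series, trend, and annual amplitude below are synthetic, not RBMC data:

```python
import numpy as np

def trend_and_dominant_period(mp_series, dt_days=1.0):
    """Fit a linear trend to a daily multipath series (e.g. MP1) and find
    the dominant period in the detrended residuals via an FFT periodogram."""
    t = np.arange(len(mp_series)) * dt_days
    slope, intercept = np.polyfit(t, mp_series, 1)     # linear trend
    residuals = mp_series - (slope * t + intercept)
    power = np.abs(np.fft.rfft(residuals))**2          # periodogram
    freqs = np.fft.rfftfreq(len(residuals), d=dt_days)
    k = np.argmax(power[1:]) + 1                       # skip zero-frequency bin
    return slope, 1.0 / freqs[k]                       # trend, period [days]

# Synthetic check: 7 years of daily data, annual cycle plus a small upward trend.
t = np.arange(7 * 365)
series = 0.4 + 1e-5 * t + 0.05 * np.sin(2 * np.pi * t / 365.25)
slope, period = trend_and_dominant_period(series)
print(f"trend = {slope:.2e} m/day, dominant period = {period:.0f} days")
```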
Abstract:
Graduate program in Animal Science - FCAV
Abstract:
This paper presents the data-quality analysis used in the construction of an observation instrument designed ad hoc: a mixed system of field formats and exhaustive, mutually exclusive (E/ME) category systems whose aim is to code the attack phase in beach handball. The criteria used are: minute, score, finishing zone, and finishing player. Twelve observations of senior men's national teams were coded. The analysis was carried out using consensual agreement (a qualitative approach to data quality), building an error-detection file, and computing Cohen's Kappa, the Kendall Tau-B, Pearson, and Spearman correlation coefficients, together with a generalizability analysis. The correlation coefficients show a minimum value of .993, Cohen's Kappa values stand at .917, and the generalizability indices are optimal. These results ensure that the observation instrument, besides fitting well, allows reliable and precise recording.
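Cohen's Kappa, one of the agreement indices used above, can be computed directly from two coders' category sequences; the codings of the finishing zone below are hypothetical:

```python
from collections import Counter

def cohens_kappa(obs1, obs2):
    """Inter-observer agreement for two coders' category sequences,
    corrected for agreement expected by chance."""
    assert len(obs1) == len(obs2)
    n = len(obs1)
    p_o = sum(a == b for a, b in zip(obs1, obs2)) / n    # observed agreement
    c1, c2 = Counter(obs1), Counter(obs2)
    p_e = sum(c1[k] * c2[k] for k in c1) / n**2          # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of the finishing zone for 10 attack phases.
a = ["Z1", "Z2", "Z1", "Z3", "Z2", "Z1", "Z3", "Z2", "Z1", "Z2"]
b = ["Z1", "Z2", "Z1", "Z3", "Z2", "Z1", "Z3", "Z2", "Z2", "Z2"]
print(round(cohens_kappa(a, b), 3))  # 0.844
```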
Abstract:
For seven years now, the permanent GPS station at Baia Terranova has been acquiring daily data which, suitably processed, contribute to the understanding of Antarctic dynamics and help verify whether global geophysical models fit the area of interest of the permanent GPS station. A literature review showed that a GPS series is subject to many possible perturbations, mainly due to errors in modeling some of the ancillary data needed for processing. Moreover, several analyses have shown that such time series derived from geodetic surveys are affected by different types of noise which, if not properly accounted for, can alter the parameters of interest for the geophysical interpretation of the data. This thesis investigates to what extent such errors can affect the dynamic parameters characterizing the motion of the permanent station, with particular reference to the velocity of the point on which the station is installed and to any periodic signals that can be identified.
Abstract:
Research in art conservation has been developed since the early 1950s, making a significant contribution to the conservation and restoration of cultural heritage artefacts. Indeed, only through profound knowledge of the nature and condition of the constituent materials can suitable decisions on conservation and restoration measures be adopted and preservation practices enhanced. The study of ancient artworks is particularly challenging, as they can be considered heterogeneous and multilayered systems in which numerous interactions between the different components, as well as degradation and ageing phenomena, take place. However, the difficulty of physically separating the different layers, given their thickness (1-200 µm), can result in the inaccurate attribution of the identified compounds to a specific layer. Details can therefore only be analysed when the sample preparation method leaves the layer structure intact, as for example when cross sections are embedded in synthetic resins. Hence, spatially resolved analytical techniques are required, not only to characterize exactly the nature of the compounds but also to obtain precise chemical and physical information about ongoing changes. This thesis focuses on the application of FTIR microspectroscopic techniques to cultural heritage materials. The first section introduces the use of FTIR microscopy in conservation science, with particular attention to sampling criteria and sample preparation methods. The second section evaluates and validates different FTIR microscopic analytical methods applied to the study of various art conservation issues that may be encountered with cultural heritage artefacts: the characterisation of the artistic execution technique (chapter II-1), studies on degradation phenomena (chapter II-2) and, finally, the evaluation of protective treatments (chapter II-3). 
The third and last section is divided into three chapters which underline recent developments in FTIR spectroscopy for the characterisation of paint cross sections, and in particular of thin organic layers: a newly developed preparation method with embedding systems in infrared-transparent salts (chapter III-1), the new opportunities offered by macro-ATR imaging spectroscopy (chapter III-2), and the possibilities achieved with the different FTIR microspectroscopic techniques available today (chapter III-3). In chapter II-1, FTIR microspectroscopy, as a molecular analysis, is presented in an integrated approach with other analytical techniques. The proposed sequence is optimized as a function of the limited quantity of sample available, and this methodology makes it possible to identify the painting materials and to characterise the execution technique adopted and the state of conservation. Chapter II-2 describes the characterisation of degradation products with FTIR microscopy, since the investigation of the ageing processes encountered in old artefacts represents one of the most important issues in conservation research. Metal carboxylates resulting from the interaction between pigments and binding media are characterized using synthesised metal palmitates, and their production is detected on copper-, zinc-, manganese- and lead- (associated with lead carbonate) based pigments dispersed either in oil or in egg tempera. Moreover, significant effects seem to be obtained with iron and cobalt (acceleration of triglyceride hydrolysis). For the first time, manganese carboxylates are also observed on sienna and umber paints. Finally, in chapter II-3, FTIR microscopy is combined with further elemental analyses to characterise and estimate the performance and stability of newly developed treatments, which should better fit conservation-restoration problems. 
In the third section, in chapter III-1, an innovative embedding system in potassium bromide is reported, focusing on the characterisation and localisation of organic substances in cross sections. Not only the identification but also the distribution of proteinaceous, lipidic or resinaceous materials is evidenced directly on different paint cross sections, especially in thin layers of the order of 10 µm. Chapter III-2 describes the use of a conventional diamond ATR accessory coupled with a focal plane array to obtain chemical images of multilayered paint cross sections. A rapid and simple identification of the different compounds is achieved without the use of any infrared microscope objectives. Finally, the latest FTIR techniques available are highlighted in chapter III-3 in a comparative study for the characterisation of paint cross sections. Results in terms of spatial resolution, data quality and chemical information are presented; in particular, a new FTIR microscope equipped with a linear array detector, which permits reducing the spatial resolution limit to approximately 5 µm, provides very promising results and may represent a good alternative to both mapping and imaging systems.
Abstract:
The southern Apennines of Italy have experienced several destructive earthquakes in both historic and recent times. The present-day seismicity, characterized by small-to-moderate magnitude earthquakes, was used as a probe to obtain a deeper knowledge of the fault structures on which the largest earthquakes occurred in the past. With the aim of inferring a three-dimensional seismic image, both the problem of data quality and the selection of a reliable and robust tomographic inversion strategy were addressed. Data quality was ensured by developing optimized procedures for the measurement of P- and S-wave arrival times, through the use of polarization filtering and the application of a refined re-picking technique based on cross-correlation of waveforms. An iterative, linearized, damped tomographic inversion technique combined with a multiscale inversion strategy was adopted. The retrieved P-wave velocity model indicates a strong velocity variation along the direction orthogonal to the Apenninic chain. This variation defines two domains characterized by relatively low and high velocity values. From the comparison of the inferred P-wave velocity model with a portion of a structural section available in the literature, the high-velocity body was correlated with the Apulia carbonate platform, whereas the low-velocity bodies were associated with the basinal deposits. The deduced Vp/Vs ratio is lower than 1.8 in the shallower part of the model, while at depths between 5 km and 12 km it increases up to 2.1 in correspondence with the area of higher seismicity. This confirms that areas characterized by higher values are more prone to generate earthquakes, in response to the presence of fluids and higher pore pressures.
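The cross-correlation re-picking step mentioned above can be illustrated with a minimal sketch; the waveform and sampling interval are synthetic, not the study's data:

```python
import numpy as np

def repick_by_xcorr(reference, trace, dt):
    """Refine an arrival-time pick by cross-correlating a trace against a
    reference waveform; returns the time shift that best aligns them."""
    ref = (reference - reference.mean()) / reference.std()   # normalize
    trc = (trace - trace.mean()) / trace.std()
    cc = np.correlate(trc, ref, mode="full")
    lag = np.argmax(cc) - (len(ref) - 1)   # samples by which trace lags reference
    return lag * dt

# Synthetic check: the same wavelet delayed by 5 samples (dt = 0.01 s).
t = np.linspace(0, 1, 101)
wavelet = np.exp(-((t - 0.3) / 0.05)**2)
delayed = np.roll(wavelet, 5)
print(repick_by_xcorr(wavelet, delayed, 0.01))  # 0.05
```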
Abstract:
Summary of the PhD thesis of Jan Pollmann: This thesis focuses on global-scale measurements of light, reactive non-methane hydrocarbons (NMHC) in the volatility range from ethane to toluene, with a special focus on ethane, propane, isobutane, butane, isopentane and pentane. Even though they occur only at the ppt level (pmol mol-1) in the remote troposphere, these species can yield insight into key atmospheric processes. An analytical method was developed and subsequently evaluated to analyze NMHC from the NOAA ESRL cooperative air sampling network. Potential analytical interferences from other atmospheric trace gases (water vapor and ozone) were carefully examined. The analytical parameters accuracy and precision were analyzed in detail, and it was shown that more than 90% of the data points meet the Global Atmosphere Watch (GAW) data quality objective. Trace gas measurements from 28 measurement stations were used to derive the global atmospheric distribution profile for 4 NMHC (ethane, propane, isobutane, butane). A close comparison of the derived ethane data with previously published reports showed that the Northern Hemispheric ethane background mixing ratio has declined by approximately 30% since 1990; no such change was observed for Southern Hemispheric ethane. The NMHC data and trace gas data supplied by NOAA ESRL were used to estimate local diurnally averaged hydroxyl radical (OH) mixing ratios by variability analysis. The variability-derived OH was found to be in good agreement with directly measured and modeled OH mixing ratios outside the tropics; tropical OH was on average two times higher than predicted by the model. Variability analysis was also used to assess the effect of chlorine radicals on atmospheric oxidation chemistry, and it was found that Cl is probably not of significant relevance on a global scale.
Abstract:
The analysis of K*(892)0 resonance production in Pb-Pb collisions at √sNN = 2.76 TeV with the ALICE detector at the LHC is presented. The analysis is motivated by the interest in measuring the production of short-lived resonances, which can provide insights into the properties of the medium produced in heavy-ion collisions during both its partonic (Quark-Gluon Plasma) and hadronic phases. This particular analysis exploits the particle identification of the ALICE Time-Of-Flight detector. The ALICE experiment is presented, with a focus on the performance of the Time-Of-Flight system. The aspects of calibration and data-quality control are discussed in detail, illustrating the excellent and very stable performance of the system in different collision environments at the LHC. A full analysis of K*0 resonance production is presented, from the resonance reconstruction to the determination of the efficiency and the systematic uncertainty. The results show that the analysis strategy discussed is a valid tool to measure the K*0 up to intermediate momenta. Preliminary results on K*0 resonance production at the LHC are presented and confirm it to be a powerful tool to study the physics of ultra-relativistic heavy-ion collisions.
Abstract:
In the last few years, the resolution of numerical weather prediction (NWP) models has become higher and higher with advances in technology and knowledge. As a consequence, a great amount of initial data has become fundamental for a correct initialization of the models. The potential of radar observations for improving the initial conditions of high-resolution NWP models has long been recognized, and operational applications are becoming more frequent. The fact that many NWP centres have recently put convection-permitting forecast models into operation, many of which assimilate radar data, emphasizes the need for an approach to providing the quality information required to prevent radar errors from degrading the model's initial conditions and, therefore, its forecasts. Environmental risks can be related to various causes: meteorological, seismic, hydrological/hydraulic. Flash floods have a horizontal dimension of 1-20 km and fall within the meso-gamma scale, which can be modeled only with NWP models of the highest resolution, such as the COSMO-2 model. One of the problems of modeling extreme convective events is related to the atmospheric initial conditions: the scale at which atmospheric conditions are assimilated in a high-resolution model is about 10 km, a value too coarse for a correct representation of convective initial conditions. Assimilation of radar data, with its resolution of the order of kilometres every 5 or 10 minutes, can be a solution to this problem. In this contribution, a pragmatic and empirical approach to deriving a radar data-quality description is proposed for use in radar data assimilation, and more specifically in the latent heat nudging (LHN) scheme. The convective capabilities of the COSMO-2 model are then investigated through some case studies. Finally, this work shows some preliminary experiments coupling a high-resolution meteorological model with a hydrological one.
Abstract:
Volatile organic compounds (VOC) are present in the atmosphere only in trace amounts, but nevertheless play an important role in air chemistry: they influence tropospheric ozone, urban smog and the oxidation capacity, and have direct and indirect effects on global climate change. An important class of VOC are the non-methane hydrocarbons (NMHC), which come predominantly from anthropogenic sources. Air chemists therefore need an instrument that measures VOC, including NMHC, with a high time resolution, especially for real-time measurements on board a research aircraft. For this purpose, the system for fast observation of trace organics (FOTOS) was designed, built for deployment on a new high-altitude, long-range scientific aircraft called HALO. FOTOS was subsequently tested in two ground-based measurement campaigns. FOTOS was designed and built with a custom automated three-trap cryogenic sampling system and an adapted, commercially acquired fast GC-MS. The aim of this design was to increase versatility and reduce the potential for interferences, so no chemical drying agents or adsorbent materials were used. FOTOS achieved a sampling frequency of 5.5 minutes while measuring at least 13 different C2- to C5-NMHC. The three-sigma detection limits for n- and iso-pentane were determined to be 2.6 and 2.0 pptv, respectively. Laboratory tests confirmed that FOTOS is a versatile, robust, highly automated, precise, accurate and sensitive instrument, suitable for real-time measurements of VOC at sampling frequencies appropriate for a research aircraft such as HALO. 
To validate the performance of FOTOS, an intercomparison with the GC-FID system at the Hohenpeißenberg Meteorological Observatory, a WMO GAW global station, was carried out from 26 January to 4 February 2010. Thirteen different NMHC were analyzed and compared within the framework of the GAW Data Quality Objectives (DQO). More than 80% of the measurements of six C3- to C5-NMHC met these DQO. This first field campaign highlighted the robustness and measurement accuracy of FOTOS, in addition to the advantage of the higher sampling frequency, even in a ground-based measurement. To demonstrate the capabilities of this instrument in the field, FOTOS measured selected light NMHC during a campaign in the boreal forest, HUMPPA-COPEC 2010. From 12 July to 12 August 2010, an international group of institutes and instruments took part in measurements of physical and chemical properties of the gas and particle phases of the air above the boreal forest at the SMEAR II station near Hyytiälä, Finland. Several main points of interest were identified in the alkane mixing ratios and in the pentane isomer ratio, in particular very distinct periods of low and high variability, three biomass-burning plumes from Russian forest fires, and two days of extremely clean air from the polar region. Comparisons of the NMHC with other anthropogenic tracers revealed several sources of anthropogenic influence at the site and allowed a distinction between local and more distant sources. A minimal natural contribution to the diurnal cycle of NOx was inferred from the correlation of NOx with the alkanes. Air-mass age estimates based on the pentane isomer ratio were complicated by changing source ratios and by the peculiarities of photochemistry during the high-latitude summer. 
These measurements demonstrated the value of measuring light NMHC, even in remote regions, as an additional specific marker of anthropogenic influence.
Abstract:
Nitrogen is an essential nutrient; for humans, animals and plants it is a constituent element of proteins and nucleic acids. Although the majority of the Earth's atmosphere consists of elemental nitrogen (N2, 78%), only a few microorganisms can use it directly. To be useful for higher plants and animals, elemental nitrogen must be converted to a reactive, oxidized form. This conversion happens within the nitrogen cycle through free-living microorganisms, symbiotic Rhizobium bacteria, or lightning. Since the beginning of the 20th century, humans have been able to synthesize reactive nitrogen through the Haber-Bosch process, and as a result the food security of the world population has improved noticeably. On the other hand, the increased nitrogen input results in acidification and eutrophication of ecosystems and in loss of biodiversity, and negative health effects have arisen for humans, such as fine particulate matter and summer smog. Furthermore, reactive nitrogen plays a decisive role in atmospheric chemistry and in the global cycles of pollutants and nutrients.
Nitrogen monoxide (NO) and nitrogen dioxide (NO2) belong to the reactive trace gases and are grouped under the generic term NOx. They are important components of atmospheric oxidative processes and influence the lifetime of various less reactive greenhouse gases. NO and NO2 are generated, among other routes, in combustion processes by oxidation of atmospheric nitrogen, as well as by biological processes within soil. In the atmosphere, NO is converted very quickly into NO2; NO2 is then oxidized to nitrate (NO3-) and to nitric acid (HNO3), which binds to aerosol particles. The bound nitrate is finally washed out of the atmosphere by dry and wet deposition. Catalytic reactions of NOx are an important part of atmospheric chemistry, forming or destroying tropospheric ozone (O3). In the atmosphere, NO, NO2 and O3 are in photostationary equilibrium, which is why one refers to the NO-NO2-O3 triad. In regions with elevated NO concentrations, reactions with air pollutants can form NO2, altering the equilibrium of ozone formation.
The essential nutrient nitrogen is taken up by plants mainly as dissolved NO3- entering the roots. Atmospheric nitrogen is oxidized to NO3- within the soil by bacteria, through nitrogen fixation or ammonium formation and nitrification. Additionally, atmospheric NO2 is taken up directly through the stomata. Inside the apoplast, NO2 is disproportionated to nitrate and nitrite (NO2-), which can enter the plant's metabolic processes. The enzymes nitrate and nitrite reductase convert nitrate and nitrite to ammonium (NH4+). NO2 gas exchange is controlled by pressure gradients inside the leaves, the stomatal aperture, and leaf resistances. Plant stomatal regulation is affected by climate factors such as light intensity, temperature, and water vapor pressure deficit.
This thesis aims to contribute to the understanding of the effects of vegetation on the atmospheric NO2 cycle and to discuss the NO2 compensation point concentration (mcomp,NO2). To this end, NO2 exchange between the atmosphere and spruce (Picea abies) at the leaf level was measured with a dynamic plant chamber system under laboratory and field conditions. Measurements took place during the EGER project (June-July 2008). Additionally, NO2 data collected on oak (Quercus robur) during the ECHO project (July 2003) were analyzed. The measuring system used allowed simultaneous determination of NO, NO2, O3, CO2 and H2O exchange rates. Calculations of NO, NO2 and O3 fluxes were based on the generally small differences (∆mi) measured between the inlet and outlet of the chamber; consequently, a high accuracy and specificity of the analyzer is necessary. To achieve these requirements, a highly specific NO/NO2 analyzer was used and the whole measurement system was optimized for an enduring measurement precision.
Data analysis yielded a significant mcomp,NO2 only if statistical significance of ∆mi was detected; consequently, the significance of ∆mi was used as a data quality criterion. Photochemical reactions of the NO-NO2-O3 triad in the dynamic plant chamber's volume must be considered in the determination of the NO, NO2 and O3 exchange rates, otherwise the deposition velocity (vdep,NO2) and mcomp,NO2 will be overestimated. No significant mcomp,NO2 for spruce could be determined under laboratory conditions, but under field conditions mcomp,NO2 was identified between 0.17 and 0.65 ppb, with vdep,NO2 between 0.07 and 0.42 mm s-1. Analyzing the field data for oak, no NO2 compensation point concentration could be determined; vdep,NO2 ranged between 0.6 and 2.71 mm s-1. There is increasing indication that forests are mainly a sink for NO2 and that potential NO2 emissions are low. Only when assuming high NO soil emissions can more NO2 be formed by reaction with O3 than the plants are able to take up; under these circumstances, forests can be a source of NO2.
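A compensation point of the kind described above amounts to a linear fit of net exchange flux against ambient mixing ratio, with the compensation point at the zero crossing; a minimal sketch with synthetic chamber data (the sign convention, units, and all numbers are illustrative, not the thesis's data):

```python
import numpy as np

def compensation_point(ambient_ppb, flux):
    """Fit the bidirectional-exchange model F = -v_dep * (m_a - m_comp).
    Returns (v_dep, m_comp): deposition velocity and the ambient mixing
    ratio at which the net flux crosses zero (the compensation point)."""
    slope, intercept = np.polyfit(ambient_ppb, flux, 1)
    v_dep = -slope                 # uptake grows with ambient concentration
    m_comp = intercept / v_dep     # flux = 0 at m_a = m_comp
    return v_dep, m_comp

# Synthetic chamber data: v_dep = 0.3 (flux units per ppb),
# compensation point 0.4 ppb, small measurement noise.
rng = np.random.default_rng(0)
m_a = np.linspace(0.1, 2.0, 30)
F = -0.3 * (m_a - 0.4) + rng.normal(0, 0.005, m_a.size)
v_dep, m_comp = compensation_point(m_a, F)
print(f"v_dep ~ {v_dep:.2f}, m_comp ~ {m_comp:.2f} ppb")
```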
Abstract:
At the Mainz Microtron (MAMI), Lambda hypernuclei can be produced in (e,e'K^+) reactions. By detecting the produced kaon in the KAOS spectrometer, reactions in which a hyperon was produced can be tagged. The spectroscopy of charged pions originating from two-body weak decays of light hypernuclei makes it possible to determine the binding energy of the hyperon in the nucleus with high precision. Besides the direct production of hypernuclei, production through the fragmentation of a highly excited continuum state is also possible, so that different hypernuclei can be studied in a single experiment. High-resolution magnetic spectrometers are available for the spectroscopy of the decay pions. To calculate the ground-state mass of the hypernuclei from the pion momentum, the hyperfragment must be stopped in the target before it decays. Based on the known cross section of elementary kaon photoproduction, the expected event rate was calculated. A Monte Carlo simulation was developed that includes the fragmentation process and the stopping of the hyperfragments in the target, using a statistical break-up model to describe the fragmentation. This approach makes it possible to predict the expected decay-pion count rate for hydrogen-4-Lambda hypernuclei. In a pilot experiment in 2011, the detection of hadrons with the KAOS spectrometer at a scattering angle of 0° was demonstrated for the first time at MAMI, with pions detected in coincidence. It turned out that, owing to the high background rates of positrons in KAOS, an unambiguous identification of hypernuclei was not possible in this configuration. Based on these findings, the KAOS spectrometer was modified to act as a dedicated kaon tagger. 
For this purpose, a lead absorber was mounted in the spectrometer, in which positrons are stopped through shower formation. The effect of such an absorber was investigated in a beam test. A simulation based on Geant4 was developed, with which the layout of the absorber and detectors was optimized and which enabled predictions of the impact on data quality. In addition, the simulation was used to generate individual back-tracing matrices for kaons, pions and protons that include the interaction of the particles with the lead wall and thus allow the effects to be corrected. With the improved setup, a production run was carried out in 2012, in which kaons at a 0° scattering angle were successfully detected in coincidence with pions from weak decays. In the momentum spectrum of the decay pions, an excess with a significance corresponding to a p-value of 2.5 x 10^-4 was observed. Based on their momentum, these events can be attributed to the decays of hydrogen-4-Lambda hypernuclei, and the number of detected pions is consistent with the calculated yield.
Abstract:
A global metabolic profiling methodology for human plasma, based on gas chromatography coupled to time-of-flight mass spectrometry (GC-TOFMS), was applied to a human exercise study focused on the effects of beverages containing glucose, galactose, or fructose taken after exercise and throughout a recovery period of 6 h 45 min. One group of 10 well-trained male cyclists performed 3 experimental sessions on separate days (randomized, single center). After performing a standardized depletion protocol on a bicycle, subjects consumed one of three different beverages: maltodextrin (MD)+glucose (2:1 ratio), MD+galactose (2:1), or MD+fructose (2:1), consumed at an average of 1.25 g of carbohydrate (CHO) ingested per minute. Blood was taken straight after exercise and every 45 min during the recovery phase. On the resulting blood plasma, insulin, free fatty acid (FFA) profile, glucose, and GC-TOFMS global metabolic profiling measurements were performed. The resulting profiling data matched the results obtained from the other clinical measurements, with the addition of being able to follow many different metabolites throughout the recovery period. Data quality was assessed: all the labelled internal standards yielded values of <15% CV across all samples (n=335), apart from labelled sucrose, which gave a value of 15.19%. Differences between recovery treatments, including the appearance of galactonic acid from the galactose-based beverage, were also highlighted.
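The CV-based quality check reported above is straightforward to reproduce; the internal-standard peak areas below are hypothetical:

```python
import statistics

def percent_cv(values):
    """Coefficient of variation (%) of an internal standard's responses,
    used as a batch-wide data-quality check in profiling studies."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical peak areas of one labelled internal standard across injections.
areas = [1020, 980, 1005, 995, 1010, 990]
ok = percent_cv(areas) < 15.0   # acceptance threshold used in the study
print(round(percent_cv(areas), 2), ok)
```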