925 results for TRACE validation
Abstract:
A small break loss-of-coolant accident (SBLOCA) is one of the problems investigated in NPP operation. Such an accident can be analyzed using an experimental facility and the TRACE thermal-hydraulic system code. A series of SBLOCA experiments was carried out on the Parallel Channel Test Loop (PACTEL) facility, operated jointly by the Technical Research Centre of Finland (VTT Energy) and Lappeenranta University of Technology (LUT), in order to investigate two-phase phenomena related to VVER-type reactors. The experiments and a TRACE model of the PACTEL facility are described in the paper, together with a description of the TRACE code and its main field equations. In this work, calculations of an SBLOCA series are performed; the thesis then discusses the validation of TRACE and concludes with an assessment of the usefulness and accuracy of the code in calculating small breaks.
Abstract:
The purpose of this thesis is to study the scalability of small break LOCA experiments. The study is performed on the experimental data as well as on the results of thermal-hydraulic computations performed with the TRACE code. The SBLOCA experiments were performed on the PACTEL facility at LUT. The temporal scaling of the results was done by relating the total coolant mass in the system to the initial break mass flow and using the quotient to scale the experiment time. The results showed many similarities in the behaviour of pressure and break mass flow between the experiments.
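The temporal scaling described above amounts to dividing experiment time by a characteristic drain time. A minimal sketch of that quotient; the function and variable names are invented for illustration, not taken from the thesis:

```python
def scaled_time(t_s, coolant_mass_kg, initial_break_flow_kg_s):
    """Scale experiment time by the system's coolant depletion time constant.

    tau = M0 / m_dot_break0 is the time the initial break flow would take
    to drain the whole coolant inventory; dividing experiment time by tau
    makes runs with different break sizes and inventories comparable.
    """
    tau = coolant_mass_kg / initial_break_flow_kg_s
    return t_s / tau
```

With this scaling, two experiments with the same fractional inventory loss line up at the same dimensionless time, even if their break flows differ.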
Abstract:
Environmental pollution continues to be an emerging study field, as there are thousands of anthropogenic compounds mixed in the environment whose possible mechanisms of toxicity and physiological outcomes are of great concern. Developing methods to assess and prioritize the screening of these compounds at trace levels, in order to support regulatory efforts, is therefore very important. A methodology based on solid-phase extraction followed by derivatization and gas chromatography-mass spectrometry analysis was developed for the assessment of four endocrine-disrupting compounds (EDCs) in water matrices: bisphenol A, estrone, 17β-estradiol and 17α-ethinylestradiol. The study was performed simultaneously by two different laboratories in order to evaluate the robustness of the method and to increase the quality control over its application in routine analysis. Validation was done according to the International Conference on Harmonisation recommendations and other international guidelines with specifications for the GC-MS methodology. Matrix-induced chromatographic response enhancement was avoided by using matrix-matched standard calibration solutions, and heteroscedasticity was overcome by applying a weighted least squares linear regression model. Consistent evaluation of key analytical parameters such as extraction efficiency, sensitivity, specificity, linearity, limits of detection and quantification, precision, accuracy and robustness was done in accordance with established acceptance standards. Finally, the application of the optimized method to the assessment of the selected analytes in environmental samples suggested that it is an expeditious methodology for routine analysis of EDC residues in water matrices.
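For the detection and quantification limits mentioned above, ICH-style validation commonly derives them from the calibration line as LOD = 3.3·σ/S and LOQ = 10·σ/S, with σ the residual standard deviation of the regression and S its slope. A hedged sketch of that calculation (not the laboratories' actual code):

```python
import numpy as np

def ich_lod_loq(conc, response):
    """Estimate LOD and LOQ from a linear calibration per ICH Q2:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is the residual
    standard deviation of the calibration fit and S is its slope."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)   # two fitted parameters: slope, intercept
    return 3.3 * sigma / slope, 10 * sigma / slope
```

A perfectly linear calibration gives near-zero limits; real calibration scatter raises both, with LOQ always 10/3.3 times the LOD by construction.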
Abstract:
Within the next decade, the improved version 2 of the Global Ozone Monitoring Experiment (GOME-2), an ultraviolet-visible spectrometer dedicated to the observation of key atmospheric trace species from space, will be launched successively on board three EUMETSAT Polar System (EPS) MetOp satellites. Starting with the launch of MetOp-1, scheduled for summer 2006, the GOME-2 series will extend until 2020 the global monitoring of atmospheric composition pioneered with ERS-2 GOME-1 since 1995 and enhanced with Envisat SCIAMACHY since 2002 and EOS-Aura OMI since 2004. For more than a decade, an international pool of scientific teams active in ground- and space-based ultraviolet-visible remote sensing has contributed to the successful post-launch validation of trace gas data products and the associated maturation of retrieval algorithms for the latter satellites, ensuring that geophysical data products are, or become, reliable and accurate enough for the intended research and applications. Building on this experience, this consortium now plans to develop and carry out appropriate validation of a list of GOME-2 trace gas column data of both tropospheric and stratospheric relevance: nitrogen dioxide (NO2), ozone (O3), bromine monoxide (BrO), chlorine dioxide (OClO), formaldehyde (HCHO), and sulphur dioxide (SO2). The proposed investigation will combine four complementary approaches, resulting in an end-to-end validation of the expected column data products.
Abstract:
In recent years, fish consumption has been increasing in Brazil, largely due to the popularity of Japanese cuisine. No study, however, has previously assessed the presence of inorganic contaminants in species used in the preparation of Japanese food. In this paper, we determined total arsenic, cadmium, chromium, total mercury, and lead contents in 82 fish samples of tuna (Thunnus thynnus), porgy (Pagrus pagrus), snook (Centropomus sp.), and salmon (Salmo salar) marketed in Sao Paulo (Brazil). Samples were mineralized in HNO3/H2O2 for As, Cd, Cr and Pb, and in HNO3/H2SO4/V2O5 for Hg. Inorganic contaminants were determined, after validation of the methodology, using Inductively Coupled Plasma Optical Emission Spectrometry (ICP OES); for Hg, an ICP-coupled hydride generator was used. Concentration ranges for the analyzed elements, in mg kg-1 (wet basis), were as follows: total As (0.11-10.82); Cd (0.005-0.047); Cr (0.008-0.259); Pb (0.026-0.481); and total Hg (0.0077-0.9681). As and Cr levels exceeded the maximum limits allowed by Brazilian law (1 and 0.1 mg kg-1) in 51.2% and 7.3% of the total samples studied, respectively. The most contaminated species were porgy (As: 95%; Cr: 10%) and tuna (As: 91%; Cr: 10%). Weekly intakes of As, Cd, Pb, and Hg were estimated considering a 60 kg adult and a consumption of 350 g of fish per week, with As and Hg presenting the highest dietary contributions, reaching 222% of the provisional tolerable weekly intake (PTWI) for As in porgy and 41% of the PTWI for Hg in tuna. (C) 2010 Elsevier Ltd. All rights reserved.
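The weekly-intake estimation follows directly from the stated assumptions (60 kg adult, 350 g of fish per week). A sketch of the arithmetic; the function name and the default PTWI value are placeholders for illustration, not the figures used in the study:

```python
def pct_ptwi(conc_mg_per_kg, fish_g_per_week=350.0, body_mass_kg=60.0,
             ptwi_mg_per_kg_bw=0.015):
    """Weekly intake of a contaminant as a percentage of its PTWI.

    conc_mg_per_kg    : element concentration in the fish (mg/kg wet weight)
    ptwi_mg_per_kg_bw : provisional tolerable weekly intake in mg per kg of
                        body weight per week (placeholder value; look up the
                        element-specific figure from the relevant authority)
    """
    weekly_intake_mg = conc_mg_per_kg * fish_g_per_week / 1000.0
    tolerable_mg = ptwi_mg_per_kg_bw * body_mass_kg
    return 100.0 * weekly_intake_mg / tolerable_mg
```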
Abstract:
Concentrations of eleven trace elements (Al, As, Cd, Cr, Co, Hg, Mn, Ni, Pb, Se, and Si) were measured in 39 (natural and flavoured) water samples. Determinations were performed using graphite furnace electrothermal atomization for most elements (Al, As, Cd, Cr, Co, Mn, Ni, Pb, and Si); hydride generation was used for Se, and cold vapour generation for Hg. These techniques were coupled to atomic absorption spectrophotometry. The trace element content of still or sparkling natural waters varied from brand to brand. Significant differences between natural still and natural sparkling waters (p<0.001) were only apparent for Mn. The Mann–Whitney U-test was used to search for significant differences between flavoured and natural waters. The concentration of each element was compared with the presence of flavours, preservatives, acidifying agents, fruit juice and/or sweeteners, according to the labelled composition. Flavoured waters were shown to generally have a higher trace element content. The addition of preservatives and acidity regulators had a significant influence on Mn, Co, As and Si contents (p<0.05). Fruit juice could also be correlated with increases in Co and As. Sweeteners did not produce any significant difference in Mn, Co, Se and Si content.
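The Mann–Whitney U-test used for these group comparisons can be sketched as follows. The Mn values are synthetic and the implementation omits tie correction; this is an illustration of the rank-sum statistic, not the study's analysis:

```python
import numpy as np

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic (no tie handling): rank all observations
    together, sum the ranks of one group, and report the smaller U."""
    data = np.concatenate([x, y])
    order = data.argsort()
    ranks = np.empty(len(data))
    ranks[order] = np.arange(1, len(data) + 1)   # ranks 1..N
    r_x = ranks[: len(x)].sum()
    u_x = r_x - len(x) * (len(x) + 1) / 2
    return min(u_x, len(x) * len(y) - u_x)

still = np.array([4.1, 4.8, 5.0, 5.3, 4.6])      # hypothetical Mn, ug/L
sparkling = np.array([6.9, 7.4, 7.1, 6.5, 7.8])  # hypothetical Mn, ug/L
u = mann_whitney_u(still, sparkling)
```

When the two groups are completely separated, as in this toy data, U = 0, its most extreme value; the p-value is then read from the U distribution for the given sample sizes.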
Abstract:
A multi-residue methodology based on solid-phase extraction followed by gas chromatography–tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix-matched standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness.
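The weighted least squares step can be illustrated with a short sketch. The calibration points are hypothetical and the weights w = 1/x² are a common choice for heteroscedastic chromatographic data, not necessarily the one used here:

```python
import numpy as np

# Hypothetical calibration data whose scatter grows with concentration
# (heteroscedasticity), as is typical of chromatographic responses.
conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
resp = np.array([0.52, 1.05, 4.8, 10.4, 48.0, 103.0])

# Ordinary least squares treats all points equally, so the fit is
# dominated by the high-concentration points.
b1_ols, b0_ols = np.polyfit(conc, resp, 1)

# Weighted least squares with w = 1/x^2 counteracts that influence,
# improving accuracy at the low end of the calibration range.
# np.polyfit applies its weights inside the square, so pass sqrt(w).
w = 1.0 / conc**2
b1_wls, b0_wls = np.polyfit(conc, resp, 1, w=np.sqrt(w))
```

The design choice is exactly the one the abstract describes: rather than transforming the data, the regression itself down-weights the large residuals that high concentrations produce.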
Abstract:
Dissertation for the Degree of Master in Technology and Food Safety – Food Quality
Abstract:
Clenbuterol is a β2-agonist with anabolic properties, producing an increase in muscular mass in parallel with a decrease in body fat. For this reason, the use of clenbuterol in sport is forbidden by the World Anti-Doping Agency (WADA). This compound is of particular interest for anti-doping authorities and WADA-accredited laboratories due to recent reports of a risk of unintentional doping following the consumption of meat contaminated with traces of clenbuterol in some countries. In this work, the development and validation of an ultra-high pressure liquid chromatography coupled to electrospray ionization tandem mass spectrometry (UHPLC-ESI-MS/MS) method for the quantification of clenbuterol in human urine is described. The analyte was extracted from urine samples by liquid-liquid extraction (LLE) under basic conditions using tert-butyl methyl ether (TBME) and analyzed by UHPLC-MS/MS with a linear gradient of acetonitrile in only 9 min. The simple and rapid method presented here was validated in compliance with authority guidelines and showed a limit of quantification of 5 pg/mL and a linearity range from 5 pg/mL to 300 pg/mL. Good trueness (85.8-105%), repeatability (5.7-10.6% RSD) and intermediate precision (5.9-14.9% RSD) results were obtained. The method was then applied to real samples from eighteen volunteers, with urine collected after administration of single oral doses (1, 5 and 10 μg) of clenbuterol-enriched yogurts.
Abstract:
The aim of this Master's thesis is the detailed modelling of a pressurizer using the APROS and TRACE thermal-hydraulic codes. The constructed pressurizer models were tested by comparing the calculated results against measurement data from separate-effects tests on pressurizer filling, draining and spray. The main objective of the study is the validation of the APROS pressurizer model, using as reference data suitable pressurizer experiments of the PACTEL ATWS test series as well as the MIT Pressurizer and Neptunus separate-effects tests. In addition, a model of the Loviisa nuclear power plant pressurizer was built and used to simulate a turbine trip transient, in order to determine possible effects on the APROS pressurizer calculation arising from the scale difference between the power plant and the test facilities. In the simulations of the experiments, different nodalizations and modelling options, such as first- and second-order discretization of enthalpy, were tested, and the results produced by APROS and TRACE were compared comprehensively with each other. A significant deficiency was found in the heat transfer correlations of the APROS pressurizer model, and the calculation results were improved considerably by introducing a new wall condensation model. The TRACE simulations performed in this work are part of the international CAMP code development and validation programme of the United States Nuclear Regulatory Commission.
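The effect of first- versus second-order spatial discretization of a transported quantity such as enthalpy can be illustrated with a minimal 1-D advection sketch. This is schematic only, not the APROS or TRACE numerics: first-order upwind is monotone but smears sharp fronts, while a second-order scheme (Lax-Wendroff here) keeps fronts sharper at the cost of oscillations:

```python
import numpy as np

n, c, steps = 100, 0.5, 80                        # cells, CFL number, steps
h = np.where(np.arange(n) < n // 2, 1.0, 0.0)     # sharp enthalpy front

def upwind_step(u):
    # First-order upwind: monotone, bounded, but numerically diffusive.
    return u - c * (u - np.roll(u, 1))

def lax_wendroff_step(u):
    # Second-order Lax-Wendroff: sharper fronts, but over/undershoots.
    return (u - 0.5 * c * (np.roll(u, -1) - np.roll(u, 1))
              + 0.5 * c**2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1)))

u1, u2 = h.copy(), h.copy()
for _ in range(steps):
    u1, u2 = upwind_step(u1), lax_wendroff_step(u2)
```

Both schemes conserve the transported quantity on the periodic grid; the practical trade-off between them is exactly the kind of modelling option compared in the simulations above.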
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools.
We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of the software development environment by tracing unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is a consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language OWL 2 and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners. The third contribution of this thesis is the verification and validation of REST web services, for which we have used model checking techniques with the UPPAAL model checker.
Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
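Skeletons with method pre- and post-conditions for a stateful resource can be pictured roughly as follows. This is a hypothetical sketch with invented names, not the generation tool's actual output:

```python
class BookingError(Exception):
    """Raised when a request violates the behavioral interface."""

class RoomBooking:
    """A stateful REST resource: methods may only fire in allowed states."""

    def __init__(self):
        self.state = "created"

    def pay(self):
        # Precondition: payment is only valid on a freshly created booking
        # (e.g. a PUT on the booking's payment sub-resource).
        if self.state != "created":
            raise BookingError("payment requires state 'created'")
        self.state = "paid"
        # Postcondition: the resource must now be in the 'paid' state.
        assert self.state == "paid"

    def cancel(self):
        # Precondition: only unpaid bookings may be cancelled here
        # (e.g. a DELETE on the booking resource).
        if self.state != "created":
            raise BookingError("cancel requires state 'created'")
        self.state = "cancelled"
        assert self.state == "cancelled"
```

The precondition checks guard the client's request sequence, while the postcondition assertions hold the implementer to the state transition the behavioral model promises.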
Abstract:
This thesis concentrates on the validation of the generic thermal-hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal-hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major difficulty here is not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, challenges system codes in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments, and the validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
Abstract:
Advancing age is associated with several cognitive changes, including a decline in the ability to memorize and/or recall personally experienced events. In parallel, it brings an increase in false memories, i.e., the recall of events that did not actually occur. False memories can have important repercussions in the daily lives of older adults, and it is therefore important to better understand this phenomenon in normal aging. Studies have demonstrated the importance of medial temporal lobe (MTL) function/memory and frontal lobe (FL) function/executive functions in the false memory effect. The first study of this thesis therefore aimed to validate, in French, an adapted version of a method proposed by Glisky, Polster, & Routhieaux (1995) for measuring these cognitive functions (Chapter 2). The factor analysis of this study shows that the neuropsychological scores associated with memory group into one factor, the MTL/memory factor, while those associated with executive functions group into a second factor, the FL/executive functions factor. Bootstrap analyses performed with 1,000 resamples demonstrate the stability of the results for the majority of the scores. The second study of this thesis aimed to shed light on the cognitive mechanisms (MTL/memory and FL/executive functions) as well as the theoretical mechanisms of the increased false memory effect in normal aging (Chapter 3). Fuzzy Trace Theory (FTT; Brainerd & Reyna, 1990) offers explanations of the false memory effect for which MTL/memory seems more important, whereas those proposed by the Activation-Monitoring Theory (AMT; Roediger, Balota, & Watson, 2001) are more closely related to FL/executive functions.
The neuropsychological tests measuring MTL/memory, as well as those measuring FL/executive functions, were administered to 52 older participants (mean age 67.81 years). Based on the preceding validation study, a composite MTL/memory score and a composite FL/executive functions score were calculated for each participant. The participants were first divided into two subgroups, one with high MTL/memory scores (n = 29, mean age 67.45 years) and one with low MTL/memory scores (n = 23, mean age 68.26 years), while statistically controlling for several variables, including the FL/executive functions score. They were then divided into two subgroups, one with high FL/executive functions scores (n = 26, mean age 68.08 years) and one with low FL/executive functions scores (n = 25, mean age 67.36 years), controlling for confounding variables, including the MTL/memory score. The proportions of true and false memory (targets and associative lures) were measured using a Deese-Roediger-McDermott paradigm (DRM; Deese, 1959; Roediger & McDermott, 1995), with recall and recognition combined with a "Remember/Know" procedure (Tulving, 1985), in the 52 older participants as well as in 22 young adults (mean age 24.59 years) matched for years of education. First, to test the FTT hypothesis (Brainerd & Reyna, 1990), these proportions were compared between the young adults and the two subgroups of older participants categorized by MTL/memory score. Then, to test the AMT hypothesis (Roediger et al., 2001), these proportions were compared between the young adults and the two subgroups of older participants categorized by FL/executive functions score. This is the first study to compare these hypotheses directly across numerous measures of true and false memory.
The results show that only MTL/memory modulated the age effect in true memory and, somewhat indirectly, in false memory and in the relationship between true and false recollection. The results then show that only FL/executive functions appear to play a role in the false recognition of associative lures. Moreover, age effects are present in false recall and false recollection of associative lures between young adults and older adults with high cognitive functioning, regardless of the cognitive function studied. These results suggest that factors other than MTL/memory and FL/executive functions must be identified to explain older adults' vulnerability to false memories. The results of this thesis are discussed in light of the theoretical and cognitive hypotheses on false memories (Chapter 4).
Abstract:
CO, O3, and H2O data in the upper troposphere/lower stratosphere (UTLS) measured by the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) on Canada's SCISAT-1 satellite are validated using aircraft and ozonesonde measurements. In the UTLS, validation of chemical trace gas measurements is a challenging task due to small-scale variability in the tracer fields, strong gradients of the tracers across the tropopause, and the scarcity of measurements suitable for validation purposes. Validation based on coincidences therefore suffers from geophysical noise. Two alternative methods for the validation of satellite data are introduced, which avoid the usual need for coincident measurements: tracer-tracer correlations, and vertical tracer profiles relative to tropopause height. Both are increasingly being used for model validation, as they strongly suppress geophysical variability and thereby provide an "instantaneous climatology". This allows comparison of measurements between non-coincident data sets, which yields information about the precision and a statistically meaningful error assessment of the ACE-FTS satellite data in the UTLS. By defining a trade-off factor, we show that the measurement errors can be reduced by including more measurements obtained over a wider longitude range in the comparison, despite the increased geophysical variability. Applying the methods then yields the following upper bounds to the relative differences in the mean found between the ACE-FTS and SPURT aircraft measurements in the upper troposphere (UT) and lower stratosphere (LS), respectively: for CO ±9% and ±12%, for H2O ±30% and ±18%, and for O3 ±25% and ±19%. The relative differences for O3 can be narrowed down by using a larger dataset obtained from ozonesondes, yielding a high bias in the ACE-FTS measurements of 18% in the UT and relative differences of ±8% for measurements in the LS.
When taking into account the smearing effect of the limited vertical spacing between measurements of the ACE-FTS instrument, the relative differences decrease by 5–15% around the tropopause, suggesting a vertical resolution of the ACE-FTS in the UTLS of around 1 km. The ACE-FTS hence offers unprecedented precision and vertical resolution for a satellite instrument, which will allow a new global perspective on UTLS tracer distributions.
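The tropopause-relative comparison method can be sketched with synthetic data: each profile is shifted so that 0 km corresponds to the local tropopause before averaging, which suppresses the variability caused by differing tropopause heights. All names and values here are invented for illustration:

```python
import numpy as np

def tropopause_relative_mean(z_km, tropopause_km, values, bins):
    """Bin a profile by altitude relative to the local tropopause."""
    rel = z_km - tropopause_km
    idx = np.digitize(rel, bins)
    return np.array([values[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(1, len(bins))])

bins = np.arange(-5, 6)                 # km relative to the tropopause
z = np.arange(0.0, 18.0, 0.5)           # measurement altitudes, km

# Synthetic ozone-like profiles: flat in the troposphere, increasing
# above tropopauses at different heights (geophysical variability).
profiles = []
for tp in (9.0, 11.0, 13.0):            # varying tropopause height, km
    o3 = np.where(z < tp, 50.0, 50.0 + 80.0 * (z - tp))   # ppbv, schematic
    profiles.append(tropopause_relative_mean(z, tp, o3, bins))

mean_profile = np.nanmean(np.vstack(profiles), axis=0)
```

Despite the three profiles having tropopauses 4 km apart, the binned means line up: the tropospheric bins stay flat and the gradient appears exactly at 0 km, which is what makes non-coincident data sets comparable bin by bin.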
Abstract:
During SPURT (Spurenstofftransport in der Tropopausenregion, trace gas transport in the tropopause region) we performed measurements of a wide range of trace gases with different lifetimes and sink/source characteristics in the northern hemispheric upper troposphere (UT) and lowermost stratosphere (LMS). A large number of in-situ instruments were deployed on board a Learjet 35A, flying at altitudes up to 13.7 km, at times reaching nearly 380 K potential temperature. Eight measurement campaigns (consisting of a total of 36 flights), distributed over all seasons and typically covering latitudes between 35° N and 75° N in the European longitude sector (10° W–20° E), were performed. Here we present an overview of the project, describing the instrumentation, the meteorological situations encountered during the campaigns and the data set available from SPURT. Measurements were obtained for N2O, CH4, CO, CO2, CFC12, H2, SF6, NO, NOy, O3 and H2O. We illustrate the strength of this new data set by showing mean distributions of the mixing ratios of selected trace gases, using a potential temperature-equivalent latitude coordinate system. The observations reveal that the LMS is most stratospheric in character during spring, with the highest mixing ratios of O3 and NOy and the lowest mixing ratios of N2O and SF6. The lowest mixing ratios of NOy and O3 are observed during autumn, together with the highest mixing ratios of N2O and SF6, indicating a strong tropospheric influence. For H2O, however, the maximum concentrations in the LMS are found during summer, suggesting unique (temperature- and convection-controlled) conditions for this molecule during transport across the tropopause. The SPURT data set is presently the most accurate and complete data set for many trace species in the LMS, and its main value is the simultaneous measurement of a suite of trace gases having different lifetimes and physical-chemical histories.
It is thus very well suited for studies of atmospheric transport, for model validation, and for investigations of seasonal changes in the UT/LMS, as demonstrated in accompanying studies and in studies published elsewhere.