884 results for Validation of analytical methodology
Abstract:
A validation of the burn-up simulation system EVOLCODE 2.0 is presented, based on experimental measurements of U and Pu isotopes and of selected fission-fragment production ratios after a burn-up of around 30 GWd/tU in a Pressurized Light Water Reactor (PWR). The work provides an in-depth analysis of the validation results, including the possible sources of the uncertainties. An uncertainty analysis based on the sensitivity methodology has also been performed, providing the uncertainties in the isotopic content propagated from the cross-section uncertainties. The classical Sensitivity/Uncertainty (S/U) model has been improved to take into account the implicit dependence introduced by the neutron-flux normalization, that is, the effect of the constant power of the reactor. This improved S/U methodology, usually neglected in this kind of study, proves to be an important contribution to explaining some simulation-experiment discrepancies: for the most relevant actinides, the cross-section uncertainties are an important contributor to the simulation uncertainties, of the same order of magnitude as, and sometimes even larger than, the experimental uncertainties and the experiment-simulation differences. Additionally, some hints for the improvement of the JEFF-3.1.1 fission yield library and for the correction of some errata in the experimental data are presented.
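For reference, sensitivity-based propagation of cross-section uncertainties to isotopic concentrations is conventionally expressed through the so-called sandwich rule; the notation below is illustrative and not taken from the paper:

$$\left(\frac{\Delta N_i}{N_i}\right)^{2} \approx \mathbf{S}_i\,\mathbf{C}_\sigma\,\mathbf{S}_i^{T}, \qquad S_{i,j} = \frac{\sigma_j}{N_i}\,\frac{\partial N_i}{\partial \sigma_j},$$

where $\mathbf{S}_i$ collects the relative sensitivities of the concentration $N_i$ of isotope $i$ to each cross section $\sigma_j$, and $\mathbf{C}_\sigma$ is the relative covariance matrix of the cross sections. The flux-normalization effect discussed above enters implicitly: because the reactor power is held constant, a change in any $\sigma_j$ also rescales the flux, and that indirect dependence must be included in $\partial N_i / \partial \sigma_j$.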
Abstract:
The effect of small mistuning on the forced response of a bladed disk is analyzed using a recently introduced methodology: the asymptotic mistuning model. The asymptotic mistuning model is an extremely reduced, simplified model that is derived directly from the full formulation of the mistuned bladed disk using a consistent perturbative procedure based on the relative smallness of the mistuning distortion. A detailed description of the derivation of the asymptotic mistuning model for a realistic bladed disk configuration is presented. The asymptotic mistuning model results for several different mistuning patterns and forcing conditions are compared with those from a high-resolution finite element model. The asymptotic mistuning model produces quantitatively accurate results and, perhaps more importantly, gives precise information about the factors (tuned modes and components of the mistuning pattern) that actually play a role in the vibrational forced response of mistuned bladed disks.
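To illustrate the kind of perturbative reduction involved (generic notation assumed here, not the paper's actual derivation), writing the mistuned stiffness as a small correction of size $\varepsilon \ll 1$ to the tuned one gives, to first order,

$$\mathbf{K} = \mathbf{K}_0 + \varepsilon\,\Delta\mathbf{K}, \qquad \omega_j^{2} \approx \omega_{j,0}^{2} + \varepsilon\,\boldsymbol{\phi}_{j,0}^{T}\,\Delta\mathbf{K}\,\boldsymbol{\phi}_{j,0},$$

so that only the tuned modes $\boldsymbol{\phi}_{j,0}$ whose frequencies lie close to the excitation, together with the corresponding harmonics of the mistuning pattern, survive in the leading-order forced-response problem.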
Abstract:
The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Quickly obtaining the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted-average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity of each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. © 2005 Elsevier B.V. All rights reserved.
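For reference, the three Halstead metrics named above are computed from operator and operand counts of a query; a minimal sketch of the standard formulas (with made-up counts, not the paper's experimental data) is:

```python
import math

def halstead_metrics(n1, n2, N1, N2):
    """Standard Halstead measures from operator/operand counts.

    n1, n2: number of distinct operators and operands
    N1, N2: total occurrences of operators and operands
    """
    length = N1 + N2                          # program length N
    vocabulary = n1 + n2                      # vocabulary n
    volume = length * math.log2(vocabulary)   # volume V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)         # difficulty D
    effort = difficulty * volume              # effort E = D * V
    return {"length": length, "difficulty": difficulty, "effort": effort}

# Hypothetical query with 9 distinct operators (14 uses) and 7 distinct operands (11 uses)
print(halstead_metrics(n1=9, n2=7, N1=14, N2=11))
```

Averaging such per-query values over a representative sample of information requests gives the weighted-average complexity that the methodology compares across instantiations.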
Abstract:
Lipid peroxidation products such as malondialdehyde, 4-hydroxynonenal and F2-isoprostanes are widely used as markers of oxidative stress in vitro and in vivo. This study reports the results of a multi-laboratory validation exercise by COST Action B35 to assess inter-laboratory and intra-laboratory variation in the measurement of lipid peroxidation. Human plasma samples were exposed to UVA irradiation at different doses (0, 15 J, 20 J), encoded and shipped to 15 laboratories, where analyses of malondialdehyde, 4-hydroxynonenal and isoprostanes were conducted. The results demonstrate a low within-day variation and a good correlation between results obtained on two different days. However, high coefficients of variation were observed between the laboratories. Malondialdehyde determined by HPLC was found to be the most sensitive and reproducible lipid peroxidation product in plasma upon UVA treatment. It is concluded that measurement of malondialdehyde by HPLC has good analytical validity for inter-laboratory studies on lipid peroxidation in human EDTA-plasma samples, although it is acknowledged that this may not translate to biological validity.
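The inter- and intra-laboratory variability referred to above is typically summarised as a coefficient of variation (CV); a minimal sketch with invented measurement values (not the study's data) might look like:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: standard deviation as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical malondialdehyde concentrations (µM) for one UVA dose in one laboratory
day1 = [2.10, 2.15, 2.08]
day2 = [2.20, 2.12, 2.18]
print("within-day CV (%):", round(cv_percent(day1), 1))

# Hypothetical per-laboratory means for the same encoded sample
lab_means = [2.1, 2.6, 1.9, 2.4]
print("between-lab CV (%):", round(cv_percent(lab_means), 1))
```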
Abstract:
Background. Previous research has shown that object recognition may develop well into late childhood and adolescence. The present study extends that research and reveals novel differences in holistic and analytic recognition performance in 7- to 12-year-olds compared to that seen in adults. We interpret our data within a hybrid model of object recognition that proposes two parallel routes for recognition (analytic vs. holistic) modulated by attention. Methodology/Principal Findings. Using a repetition-priming paradigm, we found in Experiment 1 that children showed no holistic priming, but only analytic priming. Given that holistic priming might be thought to be more ‘primitive’, we confirmed in Experiment 2 that our surprising finding was not because children’s analytic recognition was merely a result of name repetition. Conclusions/Significance. Our results suggest a developmental primacy of analytic object recognition. By contrast, holistic object recognition skills appear to emerge with a much more protracted trajectory extending into late adolescence.
Abstract:
The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver of their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. Our constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on the integration of the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and exemplify our methodology on an abstraction of the pharmaceutical supply chain.
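As an illustration of the kind of constraint such a framework might enforce at runtime (the vocabulary, property names and data below are invented, not the authors' EPCIS pedigree ontology), a SPARQL ASK query can flag incoming events that lack a mandatory field:

```python
from rdflib import Graph

# Hypothetical rule: every pedigree event must record a business location.
CONSTRAINT = """
PREFIX ex: <http://example.org/pedigree#>
ASK {
    ?event a ex:ObjectEvent .
    FILTER NOT EXISTS { ?event ex:bizLocation ?loc }
}
"""

def violates_constraint(turtle_data: str) -> bool:
    """Return True if any incoming event is missing its bizLocation."""
    g = Graph()
    g.parse(data=turtle_data, format="turtle")
    return bool(g.query(CONSTRAINT).askAnswer)

sample = """
@prefix ex: <http://example.org/pedigree#> .
ex:evt1 a ex:ObjectEvent ; ex:bizLocation ex:warehouse7 .
ex:evt2 a ex:ObjectEvent .
"""
print(violates_constraint(sample))  # True: evt2 has no bizLocation
```

In a streaming deployment, a check of this sort would run inside a Storm bolt over each incoming pedigree rather than over a static file.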
Abstract:
A new mesoscale simulation model for solids dissolution, based on a computationally efficient and versatile digital modelling approach (DigiDiss), is considered and validated against analytical solutions and published experimental data for simple geometries. As the digital model is specifically designed to handle irregular shapes and complex multi-component structures, use of the model is explored for single crystals (sugars) and clusters. Single crystals and the cluster were first scanned using X-ray microtomography to obtain a digital version of their structures. The digitised particles and clusters were then used as structural input to the digital simulation. The same particles were subsequently dissolved in water, and the dissolution process was recorded by a video camera and analysed to yield the overall dissolution times and images of particle size and shape during dissolution. The results demonstrate the ability of the simulation method to reproduce the experimental behaviour from the known chemical and diffusion properties of the constituent phases. The paper discusses how further refinements of the modelling approach will need to include other important effects, such as complex disintegration behaviour (particle ejection) and uncertainties in chemical properties. The nature of the digital modelling approach is well suited to future implementation with high-speed computation using hybrid conventional-processor (CPU) and graphics-processor (GPU) systems.
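The digital model itself is not reproduced here; as a point of comparison, a minimal sketch of the kind of analytical dissolution law such simulations are validated against (a Noyes-Whitney form for a single shrinking sphere, with invented parameter values) is:

```python
import numpy as np

def dissolution_time(m0, rho, D, h, Cs, V, dt=0.01, t_max=600.0):
    """Integrate dm/dt = -(D/h) * A * (Cs - C) for a particle that stays spherical.

    m0: initial mass (kg), rho: solid density (kg/m^3), D: diffusivity (m^2/s),
    h: boundary-layer thickness (m), Cs: solubility (kg/m^3), V: liquid volume (m^3).
    """
    m, t = m0, 0.0
    while m > 0.0 and t < t_max:
        r = (3.0 * m / (4.0 * np.pi * rho)) ** (1.0 / 3.0)    # current radius
        A = 4.0 * np.pi * r ** 2                              # surface area
        C = (m0 - m) / V                                      # bulk concentration
        m = max(m - (D / h) * A * max(Cs - C, 0.0) * dt, 0.0)
        t += dt
    return t  # time to complete dissolution, or t_max if not yet dissolved

# Hypothetical sugar-like crystal (about 1 mm radius) dissolving in 100 mL of water
print(dissolution_time(m0=6.7e-6, rho=1590, D=5e-10, h=3e-5, Cs=2000, V=1e-4))
```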
Abstract:
This paper presents a theoretical model for the vibration analysis of microscale fluid-loaded rectangular isotropic plates, based on Lamb's assumption of fluid-structure interaction and the Rayleigh-Ritz energy method. An analytical solution for this model is proposed, which can be applied to most boundary conditions. Dynamic experimental data for a series of microfabricated silicon plates were obtained using a base-excitation dynamic testing facility. The natural frequencies and mode shapes in the experimental results are in good agreement with the theoretical predictions for the lower-order modes. The presented theoretical and experimental investigations of the vibration characteristics of microscale plates are of particular interest for the design of microplate-based biosensing devices. Copyright © 2009 by ASME.
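Lamb's added-mass argument, which underlies the fluid-loading assumption mentioned above, shifts each in-vacuum natural frequency according to a relation of the form (illustrative notation, not the paper's full derivation):

$$f_{\mathrm{fluid}} = \frac{f_{\mathrm{vac}}}{\sqrt{1+\beta}}, \qquad \beta = \Gamma\,\frac{\rho_f\,a}{\rho_p\,h},$$

where $\rho_f$ and $\rho_p$ are the fluid and plate densities, $a$ is a characteristic in-plane dimension, $h$ is the plate thickness, and $\Gamma$ is a non-dimensional added-virtual-mass factor that depends on the mode shape and boundary conditions.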
Abstract:
The study of volcano deformation data can provide information on magma processes and help assess the potential for future eruptions. In employing inverse deformation modeling on these data, we attempt to characterize the geometry, location and volume/pressure change of a deformation source. Techniques currently used to model sheet intrusions (e.g., dikes and sills) often require significant a priori assumptions about source geometry and can require testing a large number of parameters. Moreover, surface deformations are a non-linear function of the source geometry and location, which requires the use of Monte Carlo inversion techniques and leads to long computation times. Recently, ‘displacement tomography’ models have been used to characterize magma reservoirs by inverting source deformation data for volume changes using a grid of point sources in the subsurface. The computations involved in these models are less intensive, as no assumptions are made about the source geometry and location, and the relationship between the point sources and the surface deformation is linear. In this project, seeking a less computationally intensive technique for fracture sources, we tested whether this displacement tomography method for reservoirs could be used for sheet intrusions. We began by simulating the opening of three synthetic dikes of known geometry and location using an established deformation model for fracture sources. We then sought to reproduce the displacements and volume changes undergone by the fractures using the sources employed in the tomography methodology. Results of this validation indicate that the volumetric point sources are not appropriate for locating fracture sources; however, they may provide useful qualitative information on volume changes occurring in the surrounding rock, and therefore indirectly indicate the source location.
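Because the relationship between point-source volume changes and surface displacements is linear, the tomography step reduces to solving a (typically regularised) linear system; a minimal sketch with an invented Green's-function matrix (not the deformation model used in the study) is:

```python
import numpy as np

# d = G @ m: observed surface displacements d, Green's functions G giving the
# displacement per unit volume change of each subsurface point source, and
# unknown volume changes m on a grid of point sources.
rng = np.random.default_rng(0)
n_obs, n_src = 60, 20
G = rng.normal(size=(n_obs, n_src))                   # placeholder Green's functions
m_true = np.zeros(n_src)
m_true[8:11] = 1.0e6                                  # synthetic dike-like volume change (m^3)
d = G @ m_true + rng.normal(scale=1.0, size=n_obs)    # noisy synthetic data

# Damped (Tikhonov) least-squares inversion for the volume changes
lam = 1.0e-3
m_est = np.linalg.solve(G.T @ G + lam * np.eye(n_src), G.T @ d)
print("recovered total volume change (m^3):", m_est.sum())
```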
Abstract:
Advances in the diagnosis of Mycobacterium bovis infection in wildlife hosts may benefit the development of sustainable approaches to the management of bovine tuberculosis in cattle. In the present study, three laboratories from two different countries participated in a validation trial to evaluate the reliability and reproducibility of a real-time PCR assay for the detection and quantification of M. bovis in environmental samples. The sample panels consisted of negative badger faeces spiked with a dilution series of M. bovis BCG Pasteur, and of field samples of faeces from badgers of unknown infection status taken from badger latrines in areas with high and low incidence of bovine TB (bTB) in cattle. Samples were tested with a previously optimised methodology. The experimental design involved rigorous testing, which highlighted a number of potential pitfalls in the analysis of environmental samples using real-time PCR. Despite minor variation between operators and laboratories, the validation study demonstrated good concordance between the three laboratories: on the spiked panels, the test showed high levels of agreement in terms of positive/negative detection, with high specificity (100%) and high sensitivity (97%) at levels of 10⁵ cells g⁻¹ and above. Quantitative analysis of the data revealed low variability in the recovery of BCG cells between laboratories and operators. On the field samples, the test showed high reproducibility both in terms of positive/negative detection and in the number of cells detected, despite the low number of samples identified as positive by any laboratory. Use of a parallel PCR inhibition control assay revealed negligible levels of PCR-interfering chemicals co-extracted with the DNA. This is the first example of a multi-laboratory validation of a real-time PCR assay for the detection of mycobacteria in environmental samples. Field studies are now required to determine how best to apply the assay for population-level bTB surveillance in wildlife.
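Quantification in assays of this kind is conventionally performed against a standard curve built from the spiked dilution series, converting a quantification cycle (Cq) into a cell count; a minimal, illustrative sketch (all values invented, not the trial's data) is:

```python
import numpy as np

# Standard curve from a spiked dilution series: Cq versus log10(cells per gram)
log10_cells = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7])         # hypothetical Cq values

slope, intercept = np.polyfit(log10_cells, cq, 1)      # Cq = slope*log10(N) + intercept
efficiency = 10.0 ** (-1.0 / slope) - 1.0              # PCR amplification efficiency
print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")

def cells_per_gram(sample_cq):
    """Map an unknown sample's Cq back onto the standard curve."""
    return 10.0 ** ((sample_cq - intercept) / slope)

print(cells_per_gram(27.5))   # estimated cells per gram of faeces
```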