46 results for NUCLEAR DATA COLLECTIONS


Relevance: 90.00%

Abstract:

This work aims to present the main differences in nuclear data uncertainties among three nuclear data libraries: EAF-2007, EAF-2010 and SCALE-6.0, under different neutron spectra: LWR, ADS and DEMO (fusion). To take the neutron spectrum into account, the uncertainty data are collapsed to one group. This is a simple way to see the differences among libraries for a given application, and it also makes visible the effect of the neutron spectrum on different applications. These comparisons are presented only for the (n,fission), (n,gamma) and (n,p) reactions of the main transuranic isotopes (234,235,236,238U, 237Np, 238,239,240,241Pu, 241,242m,243Am, 242,243,244,245,246,247,248Cm, 249Bk, 249,250,251,252Cf), but general comparisons among libraries, taking into account all included isotopes, are presented as well. Other works provide target accuracies for nuclear data uncertainties; here, those targets are compared with the uncertainties in the above libraries. The main results of these comparisons are that EAF-2010 has reduced its uncertainties with respect to EAF-2007 for many isotopes for (n,gamma) and (n,fission), but not for (n,p); SCALE-6.0 gives lower uncertainties for (n,fission) reactions in ADS and PWR applications, but higher uncertainties for (n,p) reactions in all applications. For the (n,gamma) reaction, the number of isotopes with higher uncertainties is quite similar to the number with lower uncertainties when SCALE-6.0 and EAF-2010 are compared. When the effect of the neutron spectra is analysed, the ADS neutron spectrum yields the highest uncertainties for the (n,gamma) and (n,fission) reactions of all libraries.
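
The one-group collapse described above is essentially a flux-weighted average. The sketch below illustrates the idea only; it is not the procedure from the paper, and the function name, the three-group toy data and the fallback to uncorrelated groups when no covariance matrix is supplied are all assumptions of ours.

```python
import numpy as np

def collapse_one_group(sigma_g, phi_g, rel_unc_g, rel_cov=None):
    """Flux-weighted collapse of multigroup cross sections and their
    relative uncertainties to a single group (illustrative sketch)."""
    sigma_g, phi_g = np.asarray(sigma_g), np.asarray(phi_g)
    w = phi_g / phi_g.sum()                  # spectrum weights
    sigma_1g = np.sum(w * sigma_g)           # one-group cross section

    s = w * sigma_g / sigma_1g               # sensitivity of sigma_1g to each group
    if rel_cov is None:                      # assumption: uncorrelated groups
        rel_cov = np.diag(np.asarray(rel_unc_g) ** 2)
    rel_unc_1g = np.sqrt(s @ rel_cov @ s)    # first-order propagation
    return sigma_1g, rel_unc_1g

# Toy 3-group example: the same data collapsed under two different spectra.
sigma = [1.2, 0.8, 2.5]                      # barns
fast, thermal = [0.7, 0.2, 0.1], [0.1, 0.2, 0.7]
print(collapse_one_group(sigma, fast, [0.10, 0.05, 0.20]))
print(collapse_one_group(sigma, thermal, [0.10, 0.05, 0.20]))
```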

Relevance: 90.00%

Abstract:

Fission product yields are fundamental parameters for several nuclear engineering calculations, in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices to produce inventory calculation results that take into account the complete uncertainty data. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties to the original data stored in the JEFF-3.1.2 library. We then focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235U. Calculations were carried out with different codes (ACAB and ALEPH-2) after introducing the new covariance values, and the results were compared with those obtained with the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of decay heat.

Introduction

Nowadays, any engineering calculation performed in the nuclear field should be accompanied by an uncertainty analysis. In such an analysis, different sources of uncertainty are taken into account. Works such as those performed under the UAM project (Ivanov et al., 2013) treat nuclear data as a source of uncertainty, in particular cross-section data, for which uncertainties given in the form of covariance matrices are already provided in the major nuclear data libraries. Meanwhile, fission yield uncertainties were often neglected or treated shallowly, because their effects were considered second order compared to cross-sections (Garcia-Herranz et al., 2010). However, the Working Party on International Nuclear Data Evaluation Co-operation (WPEC)
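
The Monte Carlo sampling step mentioned above can be sketched as follows. This is a minimal illustration assuming multivariate-normal yields and a positive-definite covariance matrix; the function name and the clipping of negative samples are ours, and real treatments may instead use lognormal sampling or renormalise the perturbed yields.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_fission_yields(y_mean, cov, n_samples=1000):
    """Draw correlated fission yield sets from a multivariate normal
    built on a full yield covariance matrix (illustrative sketch)."""
    L = np.linalg.cholesky(cov)                    # cov = L @ L.T
    z = rng.standard_normal((n_samples, len(y_mean)))
    samples = np.asarray(y_mean) + z @ L.T         # one yield set per row
    return np.clip(samples, 0.0, None)             # crude: forbid negative yields

# Each sampled row would feed one inventory run (e.g. ACAB or ALEPH-2);
# the spread of the resulting decay heat gives the propagated uncertainty.
```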

Relevance: 90.00%

Abstract:

A sensitivity analysis of the multiplication factor, keff, to the cross section data has been carried out for the MYRRHA critical configuration in order to identify the most relevant reactions. With these results, a further analysis of the 238Pu and 56Fe cross sections has been performed, comparing the evaluations provided for these nuclides in the JEFF-3.1.2 and ENDF/B-VII.1 libraries. The effect in MYRRHA of the differences between evaluations is then analysed, presenting the source of the differences. Based on these results, recommendations for the 56Fe and 238Pu evaluations are suggested. The calculations have been performed with SCALE6.1 and MCNPX-2.7e.
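
The abstract does not reproduce its formulas, but the standard first-order link between such sensitivities and the keff uncertainty is the "sandwich" rule, sketched below; the function name and the toy numbers are ours.

```python
import numpy as np

def keff_rel_uncertainty(S, rel_cov):
    """Sandwich rule: (delta k / k)^2 = S^T C S, where S holds the relative
    sensitivities of keff to each cross section and C is their relative
    covariance matrix (illustrative sketch)."""
    S = np.asarray(S)
    return float(np.sqrt(S @ rel_cov @ S))

# Toy example: two reactions, 5% and 2% uncertain, mildly anti-correlated.
C = np.array([[0.05**2, -0.0002], [-0.0002, 0.02**2]])
print(keff_rel_uncertainty([0.3, -0.1], C))
```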

Relevance: 90.00%

Abstract:

Propagation of nuclear data uncertainties in reactor calculations is of interest for design purposes and library evaluation. Previous versions of the GRS XSUSA library propagated only neutron cross section uncertainties. We have extended the XSUSA uncertainty assessment capabilities by including the propagation of fission yield and decay data uncertainties, due to their relevance in depletion simulations. We apply this extended methodology to the UAM6 PWR Pin-Cell Burnup Benchmark, which involves uncertainty propagation through burnup.
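
A random-sampling propagation of the kind described can be illustrated with a deliberately tiny stand-in for the depletion solver. Everything below (the single-nuclide "chain", the nominal values and the assumed uncertainties) is a toy of ours, not XSUSA.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_depletion(lam, fy, t=3.0e7):
    """Stand-in for a depletion run: one fission product produced at a rate
    proportional to its yield (fy) and decaying with constant lam (1/s)."""
    production = fy * 1.0e10                             # atoms/s, arbitrary scale
    return production / lam * (1.0 - np.exp(-lam * t))   # atoms at time t

lam0, fy0 = 1.0e-6, 0.06                   # nominal decay constant and yield
rel = {"lam": 0.02, "fy": 0.05}            # assumed relative uncertainties

inventory = [toy_depletion(lam0 * rng.normal(1.0, rel["lam"]),
                           fy0 * rng.normal(1.0, rel["fy"]))
             for _ in range(500)]
print(np.mean(inventory), np.std(inventory, ddof=1))   # propagated spread
```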

Relevance: 90.00%

Abstract:

Replication Data Management (RDM) aims at enabling the use of data collections from several iterations of an experiment. However, RDM faces major challenges in integrating data models and data from empirical study infrastructures that were not designed to cooperate, e.g., data model variation across local data sources. [Objective] In this paper we analyze RDM needs and evaluate conceptual RDM approaches to support replication researchers. [Method] We adapted the ATAM evaluation process to (a) analyze RDM use cases and needs of empirical replication study research groups and (b) compare three conceptual approaches to address these RDM needs: central data repositories with a fixed data model, heterogeneous local repositories, and an empirical ecosystem. [Results] While the central and local approaches have major issues that are hard to resolve in practice, the empirical ecosystem allows bridging current gaps in RDM from heterogeneous data sources. [Conclusions] The empirical ecosystem approach should be explored in diverse empirical environments.

Relevance: 80.00%

Abstract:

Transition probabilities and oscillator strengths of 176 spectral lines of astrophysical interest, arising from the 5d10ns (n = 7,8), 5d10np (n = 6,7), 5d10nd (n = 6,7), 5d105f, 5d105g, 5d10nh (n = 6,7,8), 5d96s2, and 5d96s6p configurations, and radiative lifetimes for 43 levels of Pb IV, have been calculated. These values were obtained in intermediate coupling (IC) using relativistic Hartree–Fock calculations that include core-polarization effects. For the IC calculations, we used the standard method of least-squares fitting to experimental energy levels by means of the Cowan computer code. The inclusion of the 5d107p and 5d105f configurations in these calculations has enabled a complete assignment of the energy levels of Pb IV. The transition probabilities, oscillator strengths, and radiative lifetimes obtained are generally in good agreement with the experimental data.
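
For reference, the quoted quantities are linked by standard spectroscopy relations; these are textbook formulas, not taken from the abstract. Here lambda is in angstroms, A_ki in s^-1, and g denotes the level degeneracies:

```latex
\begin{align}
  \tau_k &= \Bigl(\sum_i A_{ki}\Bigr)^{-1}, \\
  f_{ik} &= 1.4992\times10^{-16}\,\lambda^{2}\,\frac{g_k}{g_i}\,A_{ki}.
\end{align}
```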

Relevance: 80.00%

Abstract:

Determining the isotopic content of spent nuclear fuel as accurately as possible is gaining importance due to its safety and economic implications. Since higher burnups are nowadays achievable through increased initial enrichments, more efficient burnup strategies within the reactor cores and extended irradiation periods, establishing and improving computational methodologies is mandatory in order to carry out reliable criticality and isotopic prediction calculations. Several codes (WIMSD5, SERPENT 1.1.7, SCALE 6.0, MONTEBURNS 2.0 and MCNP-ACAB) and methodologies are tested here and compared against consolidated benchmarks (an OECD/NEA pin cell moderated with light water) with the purpose of validating them and reviewing the state of the isotopic prediction capabilities. These preliminary comparisons suggest what can generally be expected of these codes when applied to real problems. In the present paper, SCALE 6.0 and MONTEBURNS 2.0 are used to model the reported geometries, material compositions and burnup history of cycles 7-11 of the Spanish Vandellós II reactor, and to reproduce the measured isotopic compositions after irradiation and decay times. We compare the measurements with the results of each code for several levels of geometrical modelling detail, using different libraries and cross-section treatment methodologies. The power and flux normalization method implemented in MONTEBURNS 2.0 is discussed, and a new normalization strategy is developed to deal with the selected problems. Further options are included to reproduce the temperature distributions of the materials within the fuel assemblies, and a new code is introduced to automate series of simulations and manage material information between them. In order to have a realistic confidence level in the prediction of the spent fuel isotopic content, we have estimated uncertainties using our MCNP-ACAB system. This depletion code, which combines the neutron transport code MCNP and the inventory code ACAB, propagates the uncertainties in the nuclide inventory, assessing the potential impact of uncertainties in the basic nuclear data: cross-sections, decay data and fission yields.
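
The abstract does not detail the normalization method, but a common scheme in MCNP-based depletion scales the per-source-neutron tallies so that the recoverable fission energy matches the stated power. The sketch below, with names and toy numbers of our choosing, illustrates that idea only.

```python
import numpy as np

MEV_TO_J = 1.602176634e-13   # exact conversion, J per MeV

def source_rate(power_w, fission_rates_per_src, kappa_mev):
    """Source neutrons per second needed so that the per-source-neutron
    fission rates (per nuclide) deposit the requested thermal power."""
    e_per_src = np.sum(np.asarray(fission_rates_per_src)
                       * np.asarray(kappa_mev)) * MEV_TO_J  # J per source neutron
    return power_w / e_per_src

# absolute_flux = source_rate(...) * flux_tally_per_source_neutron
print(source_rate(2.0e4, [0.35, 0.05], [202.3, 211.0]))  # toy U-235/Pu-239 mix
```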

Relevance: 80.00%

Abstract:

Fuel cycles are designed with the aim of obtaining the highest possible amount of energy. Since higher burnup values are being reached, it is necessary to improve our disposal designs, traditionally based on the conservative assumption that they contain fresh fuel. The criticality calculations involved must consider burnup, making the most of the experimental and computational capabilities developed, respectively, to measure and predict the isotopic content of the spent nuclear fuel. These high burnup scenarios encourage a review of the computational tools to find possible weaknesses in the nuclear data libraries, in the methodologies applied and in their applicability range. Experimental measurements of the spent nuclear fuel provide the perfect framework to benchmark the best-known and most established codes, both in industry and in academic research. For the present paper, SCALE 6.0/TRITON and MONTEBURNS 2.0 have been chosen to follow the isotopic content of four samples irradiated in the Spanish Vandellós-II pressurized water reactor up to burnup values ranging from 40 GWd/MTU to 75 GWd/MTU. By comparison with the experimental data reported for these samples, we can probe the applicability of these codes to high burnup problems. We have developed new computational tools within MONTEBURNS 2.0; they make it possible to handle an irradiation history that includes geometrical and positional changes of the samples within the reactor core. This paper describes the irradiation scenario against which the mentioned codes and our capabilities are to be benchmarked.

Relevance: 80.00%

Abstract:

In activation calculations, there are several approaches to quantifying uncertainties: deterministic, by means of sensitivity analysis, and stochastic, by means of Monte Carlo. Here, two different Monte Carlo approaches to nuclear data uncertainty are presented. The first is Total Monte Carlo (TMC); the second is a Monte Carlo sampling of the covariance information included in the nuclear data libraries, used to propagate these uncertainties throughout the activation calculations, which we have named Covariance Uncertainty Propagation (CUP). This work presents both approaches and their differences. They are also compared by means of an activation calculation in which the cross-section uncertainties of 239Pu and 241Pu are propagated through an ADS activation calculation.
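
The difference between the two approaches can be sketched as follows; the function and variable names are ours, and the snippet shows only the CUP sampling step, with TMC summarised in a comment.

```python
import numpy as np

rng = np.random.default_rng(1)

def cup_perturbations(xs_mean, rel_cov, n=500):
    """CUP-style sampling: perturb cross sections with correlated factors
    drawn from the covariance information in the library (sketch)."""
    L = np.linalg.cholesky(rel_cov)
    factors = 1.0 + rng.standard_normal((n, len(xs_mean))) @ L.T
    return np.asarray(xs_mean) * factors   # one perturbed set per row

# TMC, by contrast, regenerates the evaluated file itself for every sample
# (varying nuclear model parameters and re-processing), so no covariance
# matrix is needed; both sample sets then feed repeated activation runs.
```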

Relevance: 80.00%

Abstract:

The aim of this work is to present the Exercise I-1b “pin-cell burn-up benchmark” proposed in the framework of the OECD LWR UAM project. Its objective is to address the uncertainty due to the basic nuclear data, as well as the impact of processing the nuclear and covariance data, in a pin-cell depletion calculation. Four different sensitivity/uncertainty propagation methodologies participate in this benchmark (GRS, NRG, UPM, and SNU&KAERI). The paper describes the main features of the UPM model (a hybrid method) compared with the other methodologies. The requested output provided by UPM is presented and discussed with respect to the results of the other methodologies.

Relevance: 80.00%

Abstract:

Following the processing and validation of JEFF-3.1 performed in 2006 and presented at ND2007, and as a consequence of the latest update of this library (JEFF-3.1.2) in February 2012, a new processing and validation of the JEFF-3.1.2 cross section library is presented in this paper. The processed library, in ACE format at ten different temperatures, was generated with the NJOY-99.364 nuclear data processing system. In addition, NJOY-99 inputs are provided to generate the PENDF, GENDF, MATXSR and BOXER formats. The library has undergone strict QA procedures, being compared with other available libraries (e.g. ENDF/B-VII.1) and with other processing codes such as the PREPRO-2000 codes. A set of 119 criticality benchmark experiments taken from ICSBEP-2010 has been used for validation purposes.

Relevance: 80.00%

Abstract:

The calculation of the effective delayed neutron fraction, beff, with Monte Carlo codes is a complex task due to the requirement of properly considering the adjoint weighting of delayed neutrons. Nevertheless, several techniques have been proposed to circumvent this difficulty and obtain accurate Monte Carlo results for beff without the need to explicitly determine the adjoint flux. In this paper, we review some of these techniques; namely, we have analyzed two variants of what we call the k-eigenvalue technique, and other techniques based on different interpretations of the physical meaning of the adjoint weighting. To test the validity of these techniques, we have implemented them in the MCNPX code and benchmarked them against a range of critical and subcritical systems for which either experimental or deterministic values of beff are available. Furthermore, several nuclear data libraries have been used in order to assess the impact of nuclear data uncertainty on the calculated value of beff.
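
The abstract leaves the k-eigenvalue technique unspecified. One widely used variant, often called the prompt method, estimates beff from two ordinary eigenvalue runs, one with all neutrons (k) and one with delayed neutron production switched off (k_p); whether this matches either of the paper's two variants is our assumption.

```latex
\[
  \beta_{\mathrm{eff}} \;\approx\; 1 - \frac{k_p}{k}
\]
```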

Relevance: 80.00%

Abstract:

Best-estimate analysis of rod ejection transients requires 3D kinetics core simulators. If they use cross-section libraries compiled in multidimensional tables, the interpolation errors originated when the core simulator computes the cross sections from the table values are a source of uncertainty in k-effective calculations that should be accounted for. Those errors depend on the grid covering the domain of state variables and, in contrast with other sources of uncertainty such as those due to nuclear data, can easily be reduced by choosing an optimized grid distribution. The present paper assesses the impact of the grid structure on a PWR rod ejection transient analysis using the coupled neutron-kinetics/thermal-hydraulics COBAYA3/COBRA-TF system. For this purpose, the OECD/NEA PWR MOX/UO2 core transient benchmark has been chosen, as material compositions and geometries are available, allowing the use of lattice codes to generate libraries with different grid structures. Since a complete nodal cross-section library is also provided as part of the benchmark specifications, the effects of the library generation on the transient behavior are also analyzed. The results showed large discrepancies between the benchmark library and our own-generated libraries when compared with the benchmark participants' solutions. The origin of the discrepancies was found to lie in the nodal cross sections provided in the benchmark.
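
As an illustration of where the interpolation error comes from, below is a minimal bilinear lookup over two typical state variables (fuel temperature and moderator density); the actual simulator's variables and interpolation scheme may differ, and all names and numbers here are ours.

```python
import numpy as np

def bilinear_xs(grid_tf, grid_dm, table, tf, dm):
    """Bilinear interpolation of a tabulated cross section versus fuel
    temperature (tf) and moderator density (dm); a coarser grid means a
    larger interpolation error (sketch)."""
    i = np.clip(np.searchsorted(grid_tf, tf) - 1, 0, len(grid_tf) - 2)
    j = np.clip(np.searchsorted(grid_dm, dm) - 1, 0, len(grid_dm) - 2)
    x = (tf - grid_tf[i]) / (grid_tf[i + 1] - grid_tf[i])
    y = (dm - grid_dm[j]) / (grid_dm[j + 1] - grid_dm[j])
    return ((1 - x) * (1 - y) * table[i, j] + x * (1 - y) * table[i + 1, j]
            + (1 - x) * y * table[i, j + 1] + x * y * table[i + 1, j + 1])

tf_grid = np.array([500.0, 900.0, 1300.0])     # K
dm_grid = np.array([0.65, 0.70, 0.75])         # g/cm3
xs_tab = np.array([[1.10, 1.14, 1.18],
                   [1.05, 1.09, 1.13],
                   [1.01, 1.05, 1.09]])
print(bilinear_xs(tf_grid, dm_grid, xs_tab, 1000.0, 0.72))
```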

Relevance: 30.00%

Abstract:

Spatial Data Infrastructures (SDIs) have become a methodological and technological benchmark enabling distributed access to historical-cartographic archives. However, it is essential to offer enhanced virtual tools that imitate the processes and methodologies currently carried out by librarians, historians and academics in existing map libraries around the world. These virtual processes must be supported by a generic framework for managing, querying, and accessing distributed georeferenced resources and other content types such as scientific data or information. The authors have designed and developed support tools providing enriched browsing, measurement and geometrical analysis capabilities, and dynamic querying methods, based on SDI foundations. The DIGMAP engine and the IBERCARTO collection enable access to georeferenced historical-cartographic archives. Based on lessons learned from the CartoVIRTUAL and DynCoopNet projects, a generic service architecture scheme is proposed. In this way, it is possible to integrate virtual map rooms and SDI technologies, supporting researchers in the historical and social domains.

Relevance: 30.00%

Abstract:

T activity in LiPb mock-up material irradiated in Frascati: measurement and MCNP results