978 results for NUCLEAR DATA COLLECTIONS
Abstract:
Fission fragment mass distributions were measured in heavy-ion induced fission of 238U. The mass distributions changed drastically with incident energy. The results are explained by a change in the ratio of fusion to quasifission with nuclear orientation. A calculation based on a fluctuation-dissipation model reproduced the mass distributions and their incident-energy dependence. The fusion probability was determined in this analysis. Evaporation residue cross sections were calculated with a statistical model for the reactions 30Si+238U and 34S+238U using the fusion probability obtained in the entrance channel. The results agree with the measured cross sections of 263,264Sg and 267,268Hs, produced by 30Si+238U and 34S+238U, respectively. It is also suggested that sub-barrier energies can be used for heavy-element synthesis.
Abstract:
Alveolar echinococcosis, caused by the tapeworm Echinococcus multilocularis, is one of the most severe parasitic diseases in humans and represents one of the 17 neglected diseases prioritised by the World Health Organisation (WHO) in 2012. Considering the major medical and veterinary importance of this parasite, the phylogeny of the genus Echinococcus is of considerable importance; yet, despite numerous efforts with both mitochondrial and nuclear data, it has remained unresolved. The genus is clearly complex, and this is one of the reasons for the incomplete understanding of its taxonomy. Although taxonomic studies have recognised E. multilocularis as a separate entity from the Echinococcus granulosus complex and other members of the genus, it would be premature to draw firm conclusions about the taxonomy of the genus before the phylogeny of the whole genus is fully resolved. The recent sequencing of E. multilocularis and E. granulosus genomes opens new possibilities for performing in-depth phylogenetic analyses. In addition, whole genome data provide the possibility of inferring phylogenies based on a large number of functional genes, i.e. genes that trace the evolutionary history of adaptation in E. multilocularis and other members of the genus. Moreover, genomic data open new avenues for studying the molecular epidemiology of E. multilocularis: genotyping studies with larger panels of genetic markers allow the genetic diversity and spatial dynamics of parasites to be evaluated with greater precision. There is an urgent need for international coordination of genotyping of E. multilocularis isolates from animals and human patients. This could be fundamental for a better understanding of the transmission of alveolar echinococcosis and for designing efficient healthcare strategies.
Abstract:
Today's digital libraries (DLs) archive vast amounts of information in the form of text, videos, images, data measurements, etc. User access to DL content can rely on similarity between metadata elements, or on similarity between the data itself (content-based similarity). We consider the problem of exploratory search in large DLs of time-oriented data. We propose a novel approach for overview-first exploration of data collections based on user-selected metadata properties. In a 2D layout, entities of the selected property are arranged according to their similarity with respect to the underlying data content. The display is enhanced by compact summarizations of the underlying data elements and forms the basis for exploratory navigation of users in the data space. The approach is proposed as an interface for visual exploration, leading users to discover interesting relationships between data items based on the content-based similarity between the data items and their respective metadata labels. We apply the method to real data sets from the earth observation community, showing its applicability and usefulness.
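A minimal sketch of the overview idea, assuming classical multidimensional scaling as the projection (the abstract does not name one) and invented time series as the data content:

```python
# Sketch: 2D overview layout of metadata entities from content-based
# similarity between their time series. Data and the MDS projection
# are illustrative choices, not necessarily the paper's exact method.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# One representative time series per metadata entity (invented data).
entities = {f"station-{i}": rng.standard_normal(100).cumsum() for i in range(8)}
names = list(entities)

# Content-based dissimilarity: Euclidean distance between the series.
n = len(names)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        dist[i, j] = np.linalg.norm(entities[names[i]] - entities[names[j]])

# 2D layout that preserves the dissimilarities as well as possible.
xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(dist)
for name, (x, y) in zip(names, xy):
    print(f"{name}: ({x:7.2f}, {y:7.2f})")
```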
Abstract:
The software Pan2Applic is a tool to convert files or folders of files (ascii/tab-separated data files with or without metaheader), downloaded from PANGAEA via the search engine or the data warehouse to formats as used by applications, e.g. for visualization or further processing. It may also be used to convert files or zip-archives as downloaded from CD-ROM data collections, published in the WDC-MARE Reports series. Pan2Applic is distributed as freeware for the operating systems Microsoft Windows, Apple OS X and Linux.
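A minimal sketch of the conversion task such a tool automates, assuming the common PANGAEA .tab layout with a metaheader enclosed in /* ... */ followed by a tab-separated table; Pan2Applic itself handles more input variants and output formats:

```python
# Sketch: strip a PANGAEA metaheader (enclosed in /* ... */) and load
# the remaining tab-separated table. Assumes the common .tab layout;
# the real tool supports many more options.
import csv
import io

def read_pangaea_tab(path):
    with open(path, encoding="utf-8") as fh:
        text = fh.read()
    if text.startswith("/*"):               # drop the metaheader block
        _, _, text = text.partition("*/")
    rows = list(csv.reader(io.StringIO(text.lstrip()), delimiter="\t"))
    return rows[0], rows[1:]                # column names, data rows

# Usage (hypothetical file name):
# header, data = read_pangaea_tab("dataset.tab")
```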
Abstract:
Transition probabilities and oscillator strengths of 176 spectral lines of astrophysical interest arising from the 5d10ns (n = 7,8), 5d10np (n = 6,7), 5d10nd (n = 6,7), 5d105f, 5d105g, 5d10nh (n = 6,7,8), 5d96s2, and 5d96s6p configurations, as well as radiative lifetimes for 43 levels of Pb IV, have been calculated. These values were obtained in intermediate coupling (IC) using relativistic Hartree–Fock calculations including core-polarization effects. For the IC calculations, we use the standard method of least-squares fitting of experimental energy levels by means of the Cowan computer code. The inclusion of the 5d107p and 5d105f configurations in these calculations has facilitated a complete assignment of the energy levels of Pb IV. The transition probabilities, oscillator strengths, and radiative lifetimes obtained are generally in good agreement with the experimental data.
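As a reminder of how the tabulated quantities relate, a short sketch of the standard conversion between oscillator strength and transition probability, and of the resulting radiative lifetime; the line data below are placeholders, not the Pb IV results of the paper:

```python
# Standard relations linking the tabulated quantities:
#   A_ki = 6.6702e15 * (g_i * f_ik) / (g_k * lambda**2)   [lambda in Angstrom]
#   tau_k = 1 / sum_i A_ki
def a_from_f(f_ik, g_i, g_k, wavelength_angstrom):
    """Transition probability A_ki (s^-1) from absorption oscillator
    strength f_ik, statistical weights g_i (lower) and g_k (upper),
    and wavelength in Angstrom."""
    return 6.6702e15 * g_i * f_ik / (g_k * wavelength_angstrom**2)

def radiative_lifetime(a_values):
    """Radiative lifetime (s) of an upper level from the transition
    probabilities of all its decay channels."""
    return 1.0 / sum(a_values)

# Hypothetical line: f = 0.30, g_i = 2, g_k = 4, lambda = 1000 A.
a = a_from_f(0.30, 2, 4, 1000.0)
print(f"A_ki = {a:.3e} s^-1, tau = {radiative_lifetime([a]):.3e} s")
```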
Abstract:
Determining the isotopic content of spent nuclear fuel as accurately as possible is gaining importance due to its safety and economic implications. Since higher burnups are nowadays achievable through increased initial enrichments, more efficient burnup strategies within the reactor cores and the extension of irradiation periods, establishing and improving computational methodologies is mandatory in order to carry out reliable criticality and isotopic prediction calculations. Several codes (WIMSD5, SERPENT 1.1.7, SCALE 6.0, MONTEBURNS 2.0 and MCNP-ACAB) and methodologies are tested here and compared against consolidated benchmarks (an OECD/NEA pin cell moderated with light water) with the purpose of validating them and reviewing the state of isotopic prediction capabilities. These preliminary comparisons suggest what can generally be expected of these codes when applied to real problems. In the present paper, SCALE 6.0 and MONTEBURNS 2.0 are used to model the same reported geometries, material compositions and burnup history of the Spanish Vandellós II reactor cycles 7-11 and to reproduce the measured isotopic compositions after irradiation and decay times. We analyze comparisons between the measurements and each code's results for several levels of geometrical modelling detail, using different libraries and cross-section treatment methodologies. The power and flux normalization method implemented in MONTEBURNS 2.0 is discussed, and a new normalization strategy is developed to deal with the selected and similar problems; further options are included to reproduce the temperature distributions of the materials within the fuel assemblies, and a new code is introduced to automate series of simulations and manage material information between them. In order to reach a realistic confidence level in the prediction of spent fuel isotopic content, we have estimated uncertainties using our MCNP-ACAB system. This depletion code, which combines the neutron transport code MCNP and the inventory code ACAB, propagates the uncertainties in the nuclide inventory, assessing the potential impact of uncertainties in the basic nuclear data: cross sections, decay data and fission yields.
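A minimal sketch of the uncertainty propagation idea behind such a depletion system, reduced to a single nuclide with a one-group cross section; all physical values are invented for illustration:

```python
# Sketch: sample the basic nuclear data (here one cross section) and
# push each sample through a depletion step, then look at the spread
# of the inventory. Numbers are illustrative, not evaluated data.
import numpy as np

rng = np.random.default_rng(1)

N0 = 1.0e24            # initial nuclide density (arbitrary units)
phi = 3.0e14           # one-group flux (n/cm^2/s), assumed constant
lam = 1.0e-9           # decay constant (1/s), hypothetical
t = 3.0e7              # irradiation time (s), roughly one year

sigma_mean = 50.0e-24  # capture cross section (cm^2), hypothetical
sigma_rel_unc = 0.05   # 5 % relative uncertainty, hypothetical

# Sample the cross section and deplete: dN/dt = -(sigma*phi + lam) N.
sigmas = rng.normal(sigma_mean, sigma_rel_unc * sigma_mean, size=10_000)
N_end = N0 * np.exp(-(sigmas * phi + lam) * t)

print(f"mean inventory:   {N_end.mean():.4e}")
print(f"rel. uncertainty: {N_end.std() / N_end.mean():.2%}")
```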
Abstract:
Fuel cycles are designed with the aim of extracting as much energy as possible. Since higher burnup values are reached, it is necessary to improve our disposal designs, traditionally based on the conservative assumption that they contain fresh fuel. The criticality calculations involved must take burnup into account, making the most of the experimental and computational capabilities developed, respectively, to measure and to predict the isotopic content of spent nuclear fuel. These high-burnup scenarios encourage a review of the computational tools to find possible weaknesses in the nuclear data libraries, in the methodologies applied and in their applicability range. Experimental measurements of spent nuclear fuel provide the perfect framework for benchmarking the best-known and most established codes in both industry and academic research. For the present paper, SCALE 6.0/TRITON and MONTEBURNS 2.0 have been chosen to follow the isotopic content of four samples irradiated in the Spanish Vandellós-II pressurized water reactor up to burnup values ranging from 40 GWd/MTU to 75 GWd/MTU. By comparison with the experimental data reported for these samples, we probe the applicability of these codes to high-burnup problems. We have developed new computational tools within MONTEBURNS 2.0; they make it possible to handle an irradiation history that includes geometrical and positional changes of the samples within the reactor core. This paper describes the irradiation scenario against which the mentioned codes and our capabilities are benchmarked.
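A minimal sketch of how an irradiation history with sample relocation might be represented; the steps below are invented, not the Vandellós-II history:

```python
# Sketch: an irradiation history as a sequence of steps, each with its
# own power level and sample position. A real driver would rebuild the
# transport model for each position and call the depletion code.
from dataclasses import dataclass

@dataclass
class IrradiationStep:
    days: float        # step length
    power_mw: float    # assembly power during the step (0 = decay)
    position: str      # sample location during the step

history = [
    IrradiationStep(350.0, 17.5, "assembly A, cycle 7"),
    IrradiationStep(30.0, 0.0, "spent fuel pool (outage)"),
    IrradiationStep(340.0, 16.8, "assembly B, cycle 8"),
]

for step in history:
    print(f"{step.days:6.1f} d at {step.power_mw:5.1f} MW in {step.position}")
```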
Abstract:
In activation calculations, there are several approaches to quantifying uncertainties: deterministic, by means of sensitivity analysis, and stochastic, by means of Monte Carlo. Here, two different Monte Carlo approaches to nuclear data uncertainty are presented. The first is Total Monte Carlo (TMC). The second is Monte Carlo sampling of the covariance information included in the nuclear data libraries to propagate these uncertainties throughout the activation calculations; we call this approach Covariance Uncertainty Propagation (CUP). This work presents both approaches and their differences, and compares them on an ADS activation calculation in which the cross-section uncertainties of 239Pu and 241Pu are propagated.
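A minimal sketch of the CUP idea, assuming a two-reaction problem with an invented covariance matrix and a toy response in place of the ADS activation calculation:

```python
# Sketch: draw correlated cross-section samples from a covariance
# matrix and propagate each sample through the (here, toy) activation
# calculation. Covariance values are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)

# Mean one-group cross sections for two reactions (hypothetical, barns).
mean = np.array([1.8, 0.36])

# Relative covariance (hypothetical): 4 % and 8 % uncertainties with a
# 0.3 correlation between the two reactions.
rel_cov = np.array([[0.04**2, 0.3 * 0.04 * 0.08],
                    [0.3 * 0.04 * 0.08, 0.08**2]])
cov = rel_cov * np.outer(mean, mean)

samples = rng.multivariate_normal(mean, cov, size=5_000)

# Toy response: activity proportional to the sum of reaction rates.
response = samples.sum(axis=1)
print(f"response uncertainty: {response.std() / response.mean():.2%}")
```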
Abstract:
The aim of this work is to present Exercise I-1b, the "pin-cell burn-up benchmark" proposed in the framework of the OECD LWR UAM. Its objective is to address the uncertainty due to basic nuclear data as well as the impact of processing the nuclear and covariance data in a pin-cell depletion calculation. Four different sensitivity/uncertainty propagation methodologies participate in this benchmark (GRS, NRG, UPM, and SNU&KAERI). The paper describes the main features of the UPM model (a hybrid method) compared with the other methodologies. The requested output provided by UPM is presented and discussed with respect to the results of the other methodologies.
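For context, a minimal sketch of the deterministic "sandwich rule" propagation step that sensitivity/uncertainty methodologies of this kind build on; sensitivities and covariances are invented placeholders:

```python
# Sketch of the sandwich rule: var(R) = S^T M S, with S the sensitivity
# vector of response R to the nuclear data parameters and M their
# covariance matrix. All numbers are invented for illustration.
import numpy as np

# Sensitivities of, e.g., k-inf to three data parameters (hypothetical).
S = np.array([0.45, -0.12, 0.08])

# Relative covariance matrix of those parameters (hypothetical).
M = np.array([[2.5e-5, 1.0e-6, 0.0],
              [1.0e-6, 4.0e-5, 0.0],
              [0.0,    0.0,    9.0e-6]])

var = S @ M @ S
print(f"relative uncertainty on the response: {np.sqrt(var):.3%}")
```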
Abstract:
Following the processing and validation of JEFF-3.1 performed in 2006 and presented at ND2007, and as a consequence of the latest update of this library (JEFF-3.1.2) in February 2012, a new processing and validation of the JEFF-3.1.2 cross-section library is presented in this paper. The processed library in ACE format at ten different temperatures was generated with the NJOY-99.364 nuclear data processing system. In addition, NJOY-99 inputs are provided to generate the PENDF, GENDF, MATXSR and BOXER formats. The library has undergone strict QA procedures, being compared with other available libraries (e.g. ENDF/B-VII.1) and with processing codes such as the PREPRO-2000 codes. A set of 119 criticality benchmark experiments taken from ICSBEP-2010 has been used for validation purposes.
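A minimal sketch of the validation step, computing C/E ratios and deviations in pcm; the three cases are placeholders, not actual ICSBEP-2010 results:

```python
# Sketch: compare calculated k-eff values (C) against benchmark
# experimental values (E) as C/E and deviations in pcm.
benchmarks = [
    # (name, calculated k-eff, experimental k-eff) -- placeholders
    ("case-1", 0.99872, 1.00000),
    ("case-2", 1.00135, 1.00020),
    ("case-3", 0.99940, 0.99980),
]

for name, calc, exp in benchmarks:
    ce = calc / exp
    pcm = (calc - exp) * 1.0e5
    print(f"{name}: C/E = {ce:.5f}  ({pcm:+.0f} pcm)")
```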
Abstract:
The calculation of the effective delayed neutron fraction, βeff, with Monte Carlo codes is a complex task due to the requirement of properly considering the adjoint weighting of delayed neutrons. Nevertheless, several techniques have been proposed to circumvent this difficulty and obtain accurate Monte Carlo results for βeff without the need to explicitly determine the adjoint flux. In this paper, we review some of these techniques; namely, we have analyzed two variants of what we call the k-eigenvalue technique, and other techniques based on different interpretations of the physical meaning of the adjoint weighting. To test the validity of these techniques we have implemented them in the MCNPX code and benchmarked them against a range of critical and subcritical systems for which either experimental or deterministic values of βeff are available. Furthermore, several nuclear data libraries have been used in order to assess the impact of nuclear data uncertainty on the calculated value of βeff.
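A minimal sketch of one common variant of the k-eigenvalue technique (the prompt-k ratio), with placeholder eigenvalues rather than results from the paper:

```python
# Sketch: two eigenvalue runs, one with all neutrons (k) and one with
# prompt neutrons only (k_p, e.g. suppressing delayed neutrons via the
# TOTNU card in MCNPX), give the estimate
#   beta_eff ~= 1 - k_p / k.
k_total = 1.00000   # prompt + delayed neutrons (placeholder)
k_prompt = 0.99310  # prompt neutrons only (placeholder)

beta_eff = 1.0 - k_prompt / k_total
print(f"beta_eff ~ {beta_eff:.5f} ({beta_eff * 1e5:.0f} pcm)")
```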
Abstract:
Best-estimate analysis of rod ejection transients requires 3D kinetics core simulators. If they use cross-section libraries compiled in multidimensional tables, interpolation errors, originated when the core simulator computes the cross sections from the table values, are a source of uncertainty in k-effective calculations that should be accounted for. Those errors depend on the grid covering the domain of state variables and, in contrast with other sources of uncertainty such as those due to nuclear data, can be easily reduced by choosing an optimized grid distribution. The present paper assesses the impact of the grid structure on a PWR rod ejection transient analysis using the coupled neutron-kinetics/thermal-hydraulics COBAYA3/COBRA-TF system. For this purpose, the OECD/NEA PWR MOX/UO2 core transient benchmark has been chosen, as material compositions and geometries are available, allowing the use of lattice codes to generate libraries with different grid structures. Since a complete nodal cross-section library is also provided as part of the benchmark specifications, the effects of the library generation on the transient behavior are also analyzed. Results showed large discrepancies with respect to the benchmark participants' solutions, both when using the benchmark library and when using our own generated libraries. The origin of the discrepancies was found to lie in the nodal cross sections provided in the benchmark.
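A minimal sketch of the interpolation step in question, assuming a two-variable (fuel temperature, moderator density) grid with invented cross-section values; real libraries use more state variables and finer, optimized grids:

```python
# Sketch: bilinear interpolation of a cross section tabulated on a
# (fuel temperature, moderator density) grid, as a core simulator does
# at each node. Grid and values are invented for illustration.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

t_fuel = np.array([560.0, 900.0, 1320.0])       # K
rho_mod = np.array([0.66, 0.71, 0.75])          # g/cm^3
# Tabulated one-group absorption cross section (hypothetical, 1/cm).
xs_table = np.array([[0.0120, 0.0118, 0.0117],
                     [0.0124, 0.0122, 0.0121],
                     [0.0128, 0.0126, 0.0124]])

interp = RegularGridInterpolator((t_fuel, rho_mod), xs_table)

# The simulator evaluates off-grid states; a coarse grid makes this
# interpolation a visible source of error in k-effective.
print(interp([[750.0, 0.69]]))
```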
Abstract:
We thank all the supporting team members involved in the translation procedures and data collections. Research was supported by Polish NCN Grant 2011/03/N/HS6/05112 (K.K.) and Chinese NNSF Grant 31200788 (C.X.).
Abstract:
Closed miscibility gaps in ternary liquid mixtures, at constant temperature and pressure, are obtained when phase separation occurs only in the ternary region, whilst all binary mixtures involved in the system are completely miscible. This type of behaviour, although not very frequent, has been observed for a certain number of systems. Nevertheless, we have found no information about the applicability of common activity coefficient models, such as NRTL and UNIQUAC, to these types of ternary systems. Moreover, none of the island-type systems published in the most common liquid–liquid equilibrium data collections is correlated with any model. In this paper, the applicability of the NRTL equation to model the LLE of island-type systems is assessed using topological concepts related to the Gibbs stability test. A first attempt to correlate experimental LLE data for two island-type ternary systems is also presented.
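A minimal sketch of the NRTL activity coefficients for a ternary mixture, the quantity underlying the LLE correlation discussed above; the tau and alpha values are invented placeholders, not fitted parameters:

```python
# Sketch: NRTL activity coefficients,
#   ln g_i = sum_j(tau_ji G_ji x_j)/sum_k(G_ki x_k)
#          + sum_j [x_j G_ij / sum_k(G_kj x_k)]
#                  * (tau_ij - sum_m(x_m tau_mj G_mj)/sum_k(G_kj x_k)),
# with G_ij = exp(-alpha_ij tau_ij).
import numpy as np

def nrtl_ln_gamma(x, tau, alpha):
    """ln(gamma_i) for all components; x: mole fractions, tau: tau[i,j],
    alpha: nonrandomness parameters (alpha[i,i] = 0)."""
    G = np.exp(-alpha * tau)
    n = len(x)
    ln_g = np.zeros(n)
    for i in range(n):
        s_i = G[:, i] @ x                      # sum_k G_ki x_k
        term1 = (tau[:, i] * G[:, i]) @ x / s_i
        term2 = 0.0
        for j in range(n):
            s_j = G[:, j] @ x
            term2 += x[j] * G[i, j] / s_j * (
                tau[i, j] - (x * tau[:, j] * G[:, j]).sum() / s_j)
        ln_g[i] = term1 + term2
    return ln_g

x = np.array([0.3, 0.3, 0.4])                  # hypothetical composition
tau = np.array([[0.0, 1.2, 0.8],
                [0.5, 0.0, 1.5],
                [1.0, 0.7, 0.0]])              # hypothetical tau_ij
alpha = np.where(np.eye(3) == 1, 0.0, 0.3)     # common choice alpha = 0.3
print(np.exp(nrtl_ln_gamma(x, tau, alpha)))    # activity coefficients
```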
Abstract:
One of the main concerns is the nature of the missing values. Consider the extremes for simplicity: if values are missing at random, no special treatment is needed; but if missingness shows structure that covaries with substantive variables, decisions have to be made. There are, in fact, several options to take. So far this concerns one country and one mode. But when surveys go cross-cultural (or, more precisely, across nation states) and mix modes, many questions arise. For example, a simple one: what are we comparing? Reports and books usually go straight into variable distributions and coefficient comparisons. This is possible because the analyst presumes a "tabula rasa" effect from the data collection procedures. But frequently this is not the real situation. This paper exposes the imprint of mixed-mode missingness in international surveys. This will help in evaluating how to deal with this problem, and in considering the real meaning of observed cross-national differences.
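A minimal sketch of the kind of diagnostic this argument calls for, on simulated data with invented variable names: check whether missingness in one item covaries with a substantive variable.

```python
# Sketch: does missingness in "income" covary with "age"? Data are
# simulated; a real analysis would use the survey's own variables.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 2_000

df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "income": rng.normal(30_000, 8_000, n),
})
# Simulate non-random missingness: older respondents skip the item.
drop = rng.random(n) < (df["age"] - 18) / 120
df.loc[drop, "income"] = np.nan

# Compare a substantive variable across missing / observed groups.
miss = df["income"].isna()
print(df.groupby(miss)["age"].mean())  # a gap signals structured missingness
```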