963 results for laser communications satellite-based laser submerged platform Monte Carlo simulation
Abstract:
Direct Simulation Monte Carlo (DSMC) is a powerful numerical method to study rarefied gas flows such as cometary comae and has been used by several authors over the past decade to study cometary outflow. However, the investigation of the parameter space in simulations can be time consuming since 3D DSMC is computationally highly intensive. For the target of ESA's Rosetta mission, comet 67P/Churyumov-Gerasimenko, we have identified to what extent modifications of several parameters influence the 3D flow and gas temperature fields, and have attempted to establish the reliability of inferences about the initial conditions from in situ and remote sensing measurements. A large number of DSMC runs have been completed with varying input parameters. In this work, we present the simulation results and assess the sensitivity of the solutions to certain inputs. It is found that among the cases of water outgassing, the surface production rate distribution is the most influential variable for the flow field.
Abstract:
Object Kinetic Monte Carlo (OKMC) models allow for the study of the evolution of irradiation-induced damage on time scales comparable to those achieved experimentally. Therefore, the essential OKMC parameters can be validated through comparison with experiments. However, this validation is not trivial, since a large number of parameters is necessary, including migration energies of point defects and their clusters, binding energies of point defects in clusters, as well as interaction radii. This is particularly cumbersome when describing an alloy such as the Fe–Cr system, which is of interest for fusion energy applications. In this work we describe an OKMC model for Fe–Cr alloys in the dilute limit. The parameters used in the model come either from density functional theory calculations or from empirical interatomic potentials. The model is used to reproduce isochronal resistivity recovery experiments on electron-irradiated dilute Fe–Cr alloys performed by Abe and Kuramoto. The comparison between the calculated results and the experiments reveals that an important parameter is the capture radius between substitutional Cr and self-interstitial Fe atoms. A parametric study of the effect of the capture radius on the simulated recovery curves is presented.
Abstract:
All meta-analyses should include a heterogeneity analysis. Even so, it is not easy to decide whether a set of studies is homogeneous or heterogeneous because of the low statistical power of the statistics used (usually the Q test). Objective: Determine a set of rules enabling SE researchers to find out, based on the characteristics of the experiments to be aggregated, whether or not heterogeneity can be accurately detected. Method: Evaluate the statistical power of heterogeneity detection methods using a Monte Carlo simulation process. Results: The Q test is not powerful when the meta-analysis contains up to a total of about 200 experimental subjects and the effect size difference is less than 1. Conclusions: The Q test cannot be used as a decision-making criterion for meta-analysis in small-sample settings such as SE. Random effects models should be used instead of fixed effects models, and caution should be exercised when applying Q-test-mediated decomposition into subgroups.
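The power study described above can be sketched with a small Monte Carlo experiment: simulate many meta-analyses of k studies whose true effects differ, compute Cochran's Q on each, and count rejections. The study sizes, effect values and simulation counts below are illustrative assumptions, not the paper's actual design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def q_test_power(k=5, n_per_group=20, delta=0.5, n_sim=2000, alpha=0.05):
    """Estimate the power of Cochran's Q test to detect heterogeneity.

    Roughly half of the k studies have true standardized effect `delta`,
    the rest have effect 0, so the set is genuinely heterogeneous.
    """
    rejections = 0
    for _ in range(n_sim):
        d = np.empty(k)
        var_d = np.empty(k)
        for i in range(k):
            true_effect = delta if i % 2 == 0 else 0.0
            ctrl = rng.normal(0.0, 1.0, n_per_group)
            trt = rng.normal(true_effect, 1.0, n_per_group)
            sp = np.sqrt((ctrl.var(ddof=1) + trt.var(ddof=1)) / 2)
            d[i] = (trt.mean() - ctrl.mean()) / sp           # Cohen's d
            var_d[i] = 2 / n_per_group + d[i] ** 2 / (4 * n_per_group)
        w = 1 / var_d                                        # inverse-variance weights
        d_bar = np.sum(w * d) / np.sum(w)                    # fixed-effect pooled estimate
        q = np.sum(w * (d - d_bar) ** 2)                     # Cochran's Q statistic
        if q > stats.chi2.ppf(1 - alpha, df=k - 1):
            rejections += 1
    return rejections / n_sim

# 5 studies x 40 subjects = 200 subjects total, effect difference 0.5 < 1
power = q_test_power()
print(f"Estimated power of the Q test: {power:.2f}")
```

With these (assumed) settings the rejection rate stays far from the conventional 0.80 target, which is the kind of evidence the abstract's conclusion rests on.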
Abstract:
The uncertainty propagation in fuel cycle calculations due to Nuclear Data (ND) is an important issue for: • present fuel cycles (e.g. the high-burnup fuel programme) • new fuel cycle designs (e.g. fast breeder reactors and ADS). Different error propagation techniques can be used: • sensitivity analysis • the Response Surface Method • the Monte Carlo technique. In this paper, the impact of ND uncertainties on the decay heat and radiotoxicity is assessed in two applications: • the Fission Pulse Decay Heat calculation (FPDH) • the conceptual design of the European Facility for Industrial Transmutation (EFIT).
Abstract:
This study applies a methodology for obtaining derived frequency curves (of maximum discharged flows and maximum water levels reached) within a Monte Carlo simulation environment, for their inclusion in a dam risk analysis model. Their behaviour is compared against that of frequency curves obtained with traditionally used techniques.
Abstract:
This thesis analyzes the factors affecting performance evaluation in positron emission tomography (PET) imaging, focusing on preclinical scanners. It explores the possibilities of standard assessment protocols in the following respects: their use as tools to validate Monte Carlo simulation programs, their usefulness as a method for comparing scanners, and their validity in the study of the effect of alternative radioisotopes on image quality. Initially we study the methods of performance evaluation oriented to validating PET simulations. For this we present the GAMOS program as a simulation framework and show the results of its validation based on the NEMA NU 4-2008 standard for preclinical PET scanners. This has been accomplished by comparing simulated results against experimental acquisitions on the ClearPET scanner, describing the methodology for the evaluation and selection of the NEMA parameters. This section also mentions the contributions developed in GAMOS for PET applications, such as the inclusion of tools for image reconstruction. Furthermore, the evaluation of the ClearPET scanner is used to compare its performance against another preclinical scanner, specifically the rPET-1 system. This is the first complete NEMA NU 4 characterization study of both systems. At the same time, we analyze how the significant design differences between these two systems, especially the axial size of the field of view and the detector configuration, affect their performance characteristics.
68Ga is one of the unconventional radioisotopes in PET imaging whose use is currently increasing significantly; however, it presents the disadvantage of a long positron range (the distance traveled by the emitted positron before annihilating with an electron). Besides the positron range, additional gamma photon emission is another physical property of PET radioisotopes that can affect the reconstructed image quality, as happens with the isotope 48V. In this thesis we assess these effects through studies of spatial resolution and image quality. Finally, we analyze the scope of the NEMA NU 4-2008 protocol for carrying out such studies, adapting it and proposing possible modifications.
Abstract:
This study characterises the abatement effect of large dams with fixed-crest spillways under extreme design flood conditions. In contrast to previous studies using specific hydrographs for flow into the reservoir and simplifications to obtain analytical solutions, an automated tool was designed for calculations based on a Monte Carlo simulation environment, which integrates models that represent the different physical processes in watersheds with areas of 150-2000 km2. The tool was applied to 21 sites that were uniformly distributed throughout continental Spain, with 105 fixed-crest dam configurations. This tool allowed a set of hydrographs to be obtained as an approximation for the hydrological forcing of a dam and the characterisation of the response of the dam to this forcing. For all cases studied, we obtained a strong linear correlation between the peak flow entering the reservoir and the peak flow discharged by the dam, and a simple general procedure was proposed to characterise the peak-flow attenuation behaviour of the reservoir. Additionally, two dimensionless coefficients were defined to relate the variables governing both the generation of the flood and its abatement in the reservoir. Using these coefficients, a model was defined to allow for the estimation of the flood abatement effect of a reservoir based on the available information. This model should be useful in the hydrological design of spillways and the evaluation of the hydrological safety of dams. Finally, the proposed procedure and model were evaluated and representative applications were presented.
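The workflow of such a Monte Carlo environment can be sketched in miniature: sample annual maximum inflow peaks, route each through a toy attenuation relation, and read derived frequency laws off the routed peaks. The Gumbel parameters and the attenuation formula below are invented stand-ins, not the study's watershed or routing models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Gumbel parameters for annual maximum inflow peaks (m3/s)
loc, scale = 800.0, 250.0
n_years = 10_000                          # Monte Carlo years simulated

# Sample annual maximum inflow peaks via inverse-CDF Gumbel sampling
u = rng.uniform(size=n_years)
q_in = loc - scale * np.log(-np.log(u))

# Toy fixed-crest routing: an attenuation coefficient that varies mildly
# with flood magnitude (a stand-in for level-pool routing of hydrographs)
attenuation = 0.75 + 0.1 * np.tanh((q_in - loc) / (4 * scale))
q_out = attenuation * q_in

# The strong linear relation between inflow and discharged peaks
r = np.corrcoef(q_in, q_out)[0, 1]

# Derived frequency law: empirical return periods of the routed peaks
q_sorted = np.sort(q_out)[::-1]
T = (n_years + 1) / np.arange(1, n_years + 1)   # Weibull plotting positions
for target in (10, 100, 1000):
    q_t = np.interp(target, T[::-1], q_sorted[::-1])
    print(f"T = {target:>4} yr: routed peak ~ {q_t:,.0f} m3/s")
print(f"corr(q_in, q_out) = {r:.3f}")
```

Even this toy version reproduces the qualitative finding quoted in the abstract: the routed peak is an almost perfectly linear function of the inflow peak.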
Abstract:
The new Spanish Building Acoustics Regulation establishes values and limits for the different acoustic magnitudes, whose fulfillment can be verified by means of field measurements. In this sense, an essential aspect of a field measurement is to report the measured magnitude together with its associated uncertainty. In the calculation of the uncertainty, it is very common to follow the uncertainty propagation method described in the Guide to the Expression of Uncertainty in Measurement (GUM). Another option is a numerical calculation based on the distribution propagation method by means of Monte Carlo simulation; indeed, several publications have already developed this latter method using different software programs. In the present work, we used Excel to carry out a Monte Carlo simulation for the calculation of the uncertainty associated with the different magnitudes derived from field measurements following ISO 140-4, 140-5 and 140-7. We compare the results with the ones obtained by the uncertainty propagation method. Although both methods give similar values, some small differences have been observed. Among the arguments that explain such differences are the asymmetry of the probability distributions associated with the input magnitudes and the overestimation of the uncertainty by the GUM method.
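The two approaches being compared can be illustrated on a standardized level difference DnT = L1 - L2 + 10·log10(T/T0), an ISO 140-4 style quantity: GUM propagates first-order sensitivities, while Monte Carlo propagates the full distributions. All input values and uncertainties below are hypothetical, and the sketch uses Python rather than the Excel implementation of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Hypothetical inputs for an airborne sound insulation measurement:
L1, u_L1 = 78.0, 0.6      # source-room level (dB) and standard uncertainty
L2, u_L2 = 45.0, 0.8      # receiving-room level (dB)
T, u_T = 0.9, 0.08        # reverberation time (s); reference T0 = 0.5 s
T0 = 0.5

# --- GUM: first-order uncertainty propagation for
#     DnT = L1 - L2 + 10*log10(T/T0)
c_T = 10 / (np.log(10) * T)                   # sensitivity coefficient dDnT/dT
u_gum = np.sqrt(u_L1**2 + u_L2**2 + (c_T * u_T)**2)

# --- Monte Carlo: propagate the input distributions directly
s_L1 = rng.normal(L1, u_L1, N)
s_L2 = rng.normal(L2, u_L2, N)
s_T = rng.normal(T, u_T, N)
dnt = s_L1 - s_L2 + 10 * np.log10(s_T / T0)
u_mc = dnt.std(ddof=1)

print(f"GUM: u(DnT) = {u_gum:.3f} dB   Monte Carlo: u(DnT) = {u_mc:.3f} dB")
```

With nearly Gaussian inputs and a mildly nonlinear model the two figures agree closely; the small residual discrepancies grow when the input distributions become asymmetric, which is the effect the abstract points to.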
Abstract:
The aim of this work is to optimize a Monte Carlo (MC) kernel for intraoperative electron radiation therapy (IOERT) so that it is compatible with intraoperative usage, and to integrate it within an existing IOERT-dedicated treatment planning system (TPS).
Abstract:
We propose a new method for ranking alternatives in multicriteria decision-making problems when there is imprecision concerning the alternative performances, component utility functions and weights. We assume decision makers' preferences are represented by an additive multiattribute utility function, in which weights can be modeled by independent normal variables, fuzzy numbers, value intervals or an ordinal relation. The approaches are based either on dominance measures or on exploring the weight space in order to describe which ratings would make each alternative the preferred one. On the one hand, the approaches based on dominance measures compute the minimum utility difference among pairs of alternatives and then derive a measure by which to rank the alternatives. On the other hand, the approaches based on exploring the weight space compute confidence factors describing the reliability of the analysis. These methods are compared using Monte Carlo simulation.
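The weight-space-exploration idea can be sketched as follows: sample weight vectors from the simplex, evaluate the additive utility of every alternative under each sample, and report how often each alternative comes out on top as its confidence factor. The utility matrix and uniform (Dirichlet) weight sampling below are illustrative assumptions, not the paper's elicitation model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical component utilities: rows = alternatives, cols = criteria,
# already scaled to [0, 1]
U = np.array([
    [0.9, 0.3, 0.6],
    [0.5, 0.8, 0.7],
    [0.6, 0.6, 0.5],
])
names = ["A1", "A2", "A3"]

# Explore the weight space: draw weight vectors uniformly from the simplex
# w1 + w2 + w3 = 1 (Dirichlet(1,1,1)) and record which alternative attains
# the highest additive utility under each draw.
n_sim = 50_000
w = rng.dirichlet(np.ones(U.shape[1]), size=n_sim)   # shape (n_sim, 3)
winners = np.argmax(w @ U.T, axis=1)                 # index of best alternative

for i, name in enumerate(names):
    conf = np.mean(winners == i)
    print(f"{name}: preferred under {conf:.1%} of sampled weightings")
```

Ordinal information on weights would be handled the same way, by rejecting (or re-ordering) samples that violate the stated ranking before the argmax step.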
Abstract:
Kinetic Monte Carlo (KMC) is a widely used technique to simulate the evolution of radiation damage inside solids. Despite the fact that this technique was developed several decades ago, there is no established and easily accessible simulation tool for researchers interested in this field, unlike in the case of molecular dynamics or density functional theory calculations. In fact, scientists must develop their own tools or use unmaintained ones in order to perform these types of simulations. To fulfil this need, we have developed MMonCa, the Modular Monte Carlo simulator. MMonCa has been developed using professional C++ programming techniques and has been built on top of an interpreted language, resulting in a powerful yet flexible, robust yet customizable, and easily accessible modern simulator. Both non-lattice and lattice KMC modules have been developed. We present in this conference, for the first time, the MMonCa simulator. Along with other (more detailed) contributions in this meeting, the versatility of MMonCa for studying a number of problems in different materials (particularly Fe and W) subject to a wide range of conditions will be shown. Regarding KMC simulations, we have studied neutron-generated cascade evolution in Fe (as a model material). Starting from a Frenkel pair distribution, we have followed the defect evolution up to 450 K. Comparison with previous simulations and experiments shows excellent agreement. Furthermore, we have studied a more complex system (He-irradiated W:C) using a previous parametrization [1]. He irradiation at 4 K followed by isochronal annealing steps up to 500 K has been simulated with MMonCa. The He energy was 400 eV or 3 keV. In the first case, no damage is associated with the He implantation, whereas in the second a significant Frenkel pair concentration (evolving into complex clusters) is associated with the He ions.
We have been able to explain He desorption both in the absence and in the presence of Frenkel pairs, and we have also applied MMonCa to high He doses and fluxes at elevated temperatures. He migration and trapping dominate the kinetics of He desorption. These processes will be discussed and compared with experimental results. [1] C.S. Becquart et al., J. Nucl. Mater. 403 (2010) 75
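The core loop of a KMC defect-evolution code of this kind is the standard residence-time algorithm: compute Arrhenius rates for all thermally activated events, pick one in proportion to its rate, and advance the clock by an exponentially distributed increment. The species, prefactors and migration energies below are illustrative numbers, not MMonCa's actual parametrization, and the sketch only advances the clock over migration events without modelling reactions.

```python
import math
import random

random.seed(3)

K_B = 8.617e-5                      # Boltzmann constant (eV/K)

def arrhenius(prefactor, e_m, temperature):
    """Jump frequency (1/s) of a thermally activated event."""
    return prefactor * math.exp(-e_m / (K_B * temperature))

# Illustrative migration parameters: prefactor (1/s), migration energy (eV)
species = {
    "vacancy":           (6.0e12, 0.67),
    "self-interstitial": (6.0e12, 0.34),
}
population = {"vacancy": 120, "self-interstitial": 120}   # Frenkel pairs

def kmc_step(temperature):
    """One residence-time KMC step: choose an event, advance the clock."""
    rates = {s: population[s] * arrhenius(nu0, em, temperature)
             for s, (nu0, em) in species.items()}
    total = sum(rates.values())
    # Select an event with probability proportional to its rate
    r = random.random() * total
    acc = 0.0
    for s, rate in rates.items():
        acc += rate
        chosen = s
        if r <= acc:
            break
    # Time increment drawn from an exponential distribution with rate `total`
    dt = -math.log(1.0 - random.random()) / total
    return chosen, dt

t = 0.0
for _ in range(1000):
    event, dt = kmc_step(temperature=300.0)
    t += dt
print(f"Simulated {t:.3e} s of defect migration at 300 K")
```

In an isochronal annealing simulation this loop is simply rerun at each temperature step, with the event catalogue extended by trapping, clustering and desorption events.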
Abstract:
Minor actinides (MAs) transmutation is a main design objective of advanced nuclear systems such as Generation IV Sodium Fast Reactors (SFRs). In advanced fuel cycles, MA contents in final high-level waste packages are main contributors to short-term heat production as well as to long-term radiotoxicity. Therefore, MA transmutation would have an impact on repository designs and would reduce the environmental burden of nuclear energy. In order to predict such consequences, Monte Carlo (MC) transport codes are used in reactor design tasks, and they are important complements and references for routinely used deterministic computational tools. In this paper, two promising Monte Carlo transport-coupled depletion codes, EVOLCODE and SERPENT, are used to examine the impact of MA burning strategies in a 3600 MWth SFR core. The core concept proposed for MA loading in two configurations is the result of an optimization effort upon a preliminary reference design to reduce the reactivity insertion caused by sodium voiding, one of the main concerns of this technology. The objective of this paper is twofold. Firstly, the efficiencies of the two core configurations for MA transmutation are addressed and evaluated in terms of actinide mass changes and reactivity coefficients, and the results are compared with those without MA loading. Secondly, a comparison of the two codes is provided; the discrepancies in the results are quantified and discussed.
Abstract:
In activation calculations, there are several approaches to quantifying uncertainties: deterministic, by means of sensitivity analysis, and stochastic, by means of Monte Carlo. Here, two different Monte Carlo approaches to nuclear data uncertainty are presented. The first is Total Monte Carlo (TMC); the second is a Monte Carlo sampling of the covariance information included in the nuclear data libraries to propagate these uncertainties throughout the activation calculations. We name this last approach Covariance Uncertainty Propagation (CUP). This work presents both approaches and their differences. They are also compared by means of an activation calculation in which the cross-section uncertainties of 239Pu and 241Pu are propagated in an ADS activation calculation.
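The CUP idea can be sketched on a toy one-step activation problem: sample correlated nuclear data from a covariance matrix and push each sample through the activation calculation, so the spread of the outputs gives the propagated uncertainty. All numbers below (cross section, decay constant, covariance, flux) are illustrative values, not evaluated nuclear data, and the model is far simpler than an ADS calculation.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy nuclear data: [capture cross section (barn), decay constant (1/s)]
mean = np.array([2.0, 1.0e-6])
rel_cov = np.array([[0.04**2,           0.3 * 0.04 * 0.05],
                    [0.3 * 0.04 * 0.05, 0.05**2]])   # 4% and 5%, corr. 0.3
cov = rel_cov * np.outer(mean, mean)                 # absolute covariance

flux = 1.0e14          # neutron flux (n/cm^2/s)
n_target = 1.0e20      # target atoms
t_irr = 86_400.0       # one day of irradiation (s)

def end_of_irradiation_activity(sigma_barn, lam):
    """Activity (Bq) of the product after constant irradiation for t_irr."""
    production = n_target * sigma_barn * 1.0e-24 * flux   # reactions/s
    return production * (1.0 - np.exp(-lam * t_irr))

# CUP-style propagation: sample (sigma, lambda) pairs from the covariance
# information and evaluate the activation model for each sample
samples = rng.multivariate_normal(mean, cov, size=20_000)
acts = end_of_irradiation_activity(samples[:, 0], samples[:, 1])

a0 = end_of_irradiation_activity(*mean)
rel_u = acts.std(ddof=1) / acts.mean()
print(f"Nominal activity: {a0:.3e} Bq")
print(f"Propagated relative uncertainty: {rel_u:.1%}")
```

TMC differs in where the sampling happens: instead of sampling from library covariances, whole nuclear data files are randomly generated and the full calculation is repeated for each file.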
Abstract:
We introduce a dominance intensity measuring method to derive a ranking of alternatives, dealing with incomplete information in multi-criteria decision-making problems on the basis of multi-attribute utility theory (MAUT) and fuzzy sets theory. We consider the situation where there is imprecision concerning decision-makers' preferences, and imprecise weights are represented by trapezoidal fuzzy numbers. The proposed method is based on the dominance values between pairs of alternatives. These values can be computed by linear programming, as an additive multi-attribute utility model is used to rate the alternatives. Dominance values are then transformed into dominance intensity measures, which are used to rank the alternatives under consideration. Distances between fuzzy numbers, based on the generalization of the left and right fuzzy numbers, are utilized to account for fuzzy weights. An example concerning the selection of intervention strategies to restore an aquatic ecosystem contaminated by radionuclides illustrates the approach. Monte Carlo simulation techniques have been used to show that the proposed method performs well for different imprecision levels in terms of a hit ratio and a rank-order correlation measure.
Abstract:
Introducing cover crops (CC) interspersed with intensively fertilized crops in rotation has the potential to reduce nitrate leaching. This paper evaluates various strategies involving CC between maize crops and compares the economic and environmental results with respect to a typical maize-fallow rotation. The comparison is performed through stochastic (Monte Carlo) simulation models of farms' profits, using probability distribution functions (pdfs) of yield and N fertilizer saving fitted with data collected from various field trials, and pdfs of crop prices and the cost of fertilizer fitted from statistical sources. Stochastic dominance relationships are obtained to rank the most profitable strategies from a farm financial perspective. A two-criterion comparison scheme is proposed to rank alternative strategies based on farm profit and nitrate leaching levels, taking the maize-fallow rotation as the baseline scenario. The results show that when CC biomass is sold as forage instead of being kept in the soil, greater profit and less nitrate leaching are achieved than in the baseline scenario. While the fertilizer saving is lower if the CC is sold than if it is kept in the soil, the revenue obtained from the sale of the CC compensates for the reduced fertilizer saving. The results show that CC may provide a double dividend of greater profit and reduced nitrate leaching in intensive irrigated cropping systems in Mediterranean regions.
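The stochastic-simulation-plus-dominance approach can be sketched as follows: sample profits for the baseline and a CC strategy from assumed pdfs, then check first-order stochastic dominance by comparing empirical CDFs. Every distribution, cost and revenue figure below is a hypothetical stand-in for the pdfs fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Hypothetical profit components (EUR/ha)
price = rng.normal(180.0, 25.0, n)        # maize price (EUR/t)
yield_t = rng.normal(12.0, 1.5, n)        # maize yield (t/ha)
base_costs = 1500.0

# Baseline: maize-fallow rotation
profit_fallow = price * yield_t - base_costs

# CC sold as forage: extra CC cost, forage revenue and fertilizer saving.
# Common random numbers: the same price/yield draws are reused, so the
# comparison between strategies is paired.
cc_cost = 180.0
cc_revenue = rng.uniform(200.0, 320.0, n)
fert_saving = rng.uniform(20.0, 60.0, n)
profit_cc_sold = profit_fallow - cc_cost + cc_revenue + fert_saving

# Empirical first-order stochastic dominance: strategy A dominates B if
# the CDF of A lies at or below the CDF of B everywhere
grid = np.linspace(min(profit_fallow.min(), profit_cc_sold.min()),
                   max(profit_fallow.max(), profit_cc_sold.max()), 200)
cdf_fallow = np.searchsorted(np.sort(profit_fallow), grid) / n
cdf_cc = np.searchsorted(np.sort(profit_cc_sold), grid) / n
dominates = bool(np.all(cdf_cc <= cdf_fallow + 1e-12))

print(f"Mean profit: fallow {profit_fallow.mean():.0f} EUR/ha, "
      f"CC sold {profit_cc_sold.mean():.0f} EUR/ha")
print(f"CC-sold strategy first-order dominates the baseline: {dominates}")
```

The second criterion of the paper, nitrate leaching, would enter as another simulated output per strategy, turning the ranking into the two-criterion comparison the abstract describes.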