19 results for imported methodologies


Relevance:

20.00%

Publisher:

Abstract:

The prediction of tritium production is required for sample handling procedures, safety and maintenance, and licensing of the International Fusion Materials Irradiation Facility (IFMIF).


- Need for tritium production
- Neutronic objectives
- The Frascati experiment
- Measurements of tritium activity


Determining the isotopic content of spent nuclear fuel as accurately as possible is gaining importance due to its safety and economic implications. Since higher burnups are nowadays achievable through increased initial enrichments, more efficient burnup strategies within the reactor cores and the extension of irradiation periods, establishing and improving computational methodologies is mandatory in order to carry out reliable criticality and isotopic prediction calculations. Several codes (WIMSD5, SERPENT 1.1.7, SCALE 6.0, MONTEBURNS 2.0 and MCNP-ACAB) and methodologies are tested here and compared against consolidated benchmarks (the OECD/NEA light-water-moderated pin cell) with the purpose of validating them and reviewing the state of isotopic prediction capabilities. These preliminary comparisons suggest what can generally be expected of these codes when applied to real problems. In the present paper, SCALE 6.0 and MONTEBURNS 2.0 are used to model the reported geometries, material compositions and burnup history of cycles 7-11 of the Spanish Vandellós II reactor and to reproduce the isotopic compositions measured after irradiation and decay. We compare the measurements with the results of each code for several levels of geometrical modelling detail, using different libraries and cross-section treatment methodologies. The power and flux normalization method implemented in MONTEBURNS 2.0 is discussed and a new normalization strategy is developed to deal with the selected problems; further options are included to reproduce the temperature distributions of the materials within the fuel assemblies, and a new code is introduced to automate series of simulations and manage material information between them. In order to reach a realistic confidence level in the prediction of spent fuel isotopic content, we have estimated uncertainties using our MCNP-ACAB system. This depletion code, which combines the neutron transport code MCNP and the inventory code ACAB, propagates uncertainties to the nuclide inventory, assessing the potential impact of uncertainties in the basic nuclear data: cross-sections, decay data and fission yields.
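The uncertainty propagation scheme described above can be illustrated with a minimal sketch: sample the uncertain cross-section, solve a toy depletion (Bateman) chain deterministically for each sample, and take statistics over the resulting inventories. The three-nuclide chain and all rate constants below are invented for illustration; the actual MCNP-ACAB system works with full transport solutions and evaluated nuclear data libraries.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Toy 3-nuclide chain: A --(capture)--> B --(decay)--> C.
# All rates are illustrative, not evaluated nuclear data.
phi = 1e-9          # flux with barn conversion folded in (1/s), assumed
sigma_A = 10.0      # capture cross-section of A (arbitrary units), assumed
lam_B = 1e-6        # decay constant of B (1/s), assumed
t = 1e6             # irradiation time (s)

def burnup_matrix(sigma):
    """Bateman matrix M of dN/dt = M N for the toy chain."""
    r = sigma * phi
    return np.array([[-r,    0.0,   0.0],
                     [ r,  -lam_B,  0.0],
                     [0.0,  lam_B,  0.0]])

N0 = np.array([1.0, 0.0, 0.0])  # start with pure nuclide A

# Propagate a 5% (1-sigma) relative uncertainty on sigma_A by Monte Carlo
# sampling: one matrix exponential solve per sampled cross-section.
samples = np.array([
    expm(burnup_matrix(rng.normal(sigma_A, 0.05 * sigma_A)) * t) @ N0
    for _ in range(500)
])

mean = samples.mean(axis=0)
std = samples.std(axis=0)
print("mean inventory:", mean)
print("std inventory :", std)
```

The same pattern (sample nuclear data, rerun the deterministic depletion, collect inventory statistics) scales to cross-sections, decay data and fission yields, at the cost of one depletion solve per sample.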


At the beginning of the 1990s, ontology development was similar to an art: ontology developers did not have clear guidelines on how to build ontologies, only some design criteria to be followed. Work on principles, methods and methodologies, together with supporting technologies and languages, turned ontology development into an engineering discipline, the so-called Ontology Engineering. Ontology Engineering refers to the set of activities that concern the ontology development process and the ontology life cycle, the methods and methodologies for building ontologies, and the tool suites and languages that support them. Thanks to the work done in the Ontology Engineering field, the development of ontologies within and between teams has increased and improved, as has the possibility of reusing ontologies in other developments and in final applications. Currently, ontologies are widely used in (a) Knowledge Engineering, Artificial Intelligence and Computer Science, (b) applications related to knowledge management, natural language processing, e-commerce, intelligent information integration, information retrieval, database design and integration, bio-informatics and education, and (c) the Semantic Web, the Semantic Grid and the Linked Data initiative. In this paper, we provide an overview of Ontology Engineering, mentioning the most outstanding and widely used methodologies, languages and tools for building ontologies. In addition, we include some words on how all these elements can be used in the Linked Data initiative.


This communication presents an overview of their first results and innovative methodologies, focusing on their possibilities and limitations for the reconstruction of recent floods and paleofloods around the world.


A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a 2D fast implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
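The ordered-subsets update at the core of the algorithm can be sketched as follows. The toy system matrix below is a random sparse non-negative matrix rather than a Monte Carlo-modelled one, and no scanner symmetries are exploited; the sketch only illustrates the OSEM iteration over a sparse system matrix, with all sizes and data invented.

```python
import numpy as np
from scipy.sparse import random as sprandom

rng = np.random.default_rng(1)

# Toy system matrix: rows = detector bins (LORs), columns = image voxels.
# In a real scanner this matrix would be precalculated (e.g. with Monte
# Carlo) and stored sparsely; here it is just random and non-negative.
n_bins, n_voxels = 240, 64
A = sprandom(n_bins, n_voxels, density=0.1, random_state=1, format="csr")

true_img = rng.uniform(0.5, 2.0, n_voxels)   # hypothetical activity map
y = A @ true_img                             # noiseless projections

def osem(A, y, n_subsets=4, n_iter=10):
    """Ordered-subsets EM: cycle MLEM updates over row (projection) subsets."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for rows in subsets:
            As = A[rows]
            sens = np.asarray(As.sum(axis=0)).ravel()     # subset sensitivity
            proj = As @ x                                 # forward projection
            ratio = np.divide(y[rows], proj,
                              out=np.zeros_like(proj), where=proj > 0)
            x *= (As.T @ ratio) / np.maximum(sens, 1e-12) # multiplicative update
    return x

recon = osem(A, y)
print("mean absolute error:", np.abs(recon - true_img).mean())
```

Because only the non-zero elements of each row block are touched, the cost per update is proportional to the number of stored matrix entries, which is what makes the sparse precalculated-matrix approach fast in practice.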


Escalators and moving walkways are multibody systems whose basic design is more than a century old. The developed methodology allows any subsystem of both systems to be studied and improved. In addition, new concepts can be developed and tested without the need and cost of building a real prototype. CITEF (Railway Technologies Research Centre) has been modelling escalators for more than four years. Several complex and innovative models have been developed to characterize static, kinematic and dynamic escalator behaviour. The high number of mechanical elements that make up an escalator complicates the modelling task, so methodologies and tools have been developed to automate these tasks and save computational time. The developed methodologies have been validated by comparing real measurements with the simulated outputs of a dynamic model.


This article surveys the current agent-oriented methodologies. It discusses the approaches that have been followed (mainly extending existing object-oriented and knowledge engineering methodologies), the suitability of these approaches for agent modelling, and some conclusions drawn from the survey.


This paper shows the results of research aimed at formulating a general model to support the implementation and management of an urban road pricing scheme. After preliminary work to define the state of the art in sustainable urban mobility strategies, the problem was set up theoretically in terms of transport economics, introducing the concept of external costs, translated into the principle of pricing for the use of public infrastructure. The research is based on the definition of a set of direct and indirect indicators that qualify urban areas by land use, mobility, environmental and economic conditions. These indicators were calculated for a selected set of typical urban areas in Europe on the basis of the results of a survey carried out by means of a specific questionnaire. Once the most typical and interesting applications of the road pricing concept had been identified, in cities such as London (Congestion Charging), Milan (Ecopass), Stockholm (Congestion Tax) and Rome (ZTL), a large benchmarking exercise and the cross-analysis of direct and indirect indicators allowed a simple general model, guidelines and key requirements to be defined for the implementation of a pricing-based traffic restriction scheme in a generic urban area. The model was finally applied to the design of a road pricing scheme for a particular area of Madrid, and to the quantification of the expected results of its implementation from a land use, mobility, environmental and economic perspective.


The occurrence of fatigue has been extensively researched in steel and other metallic materials; it is, however, not as broadly understood in concrete. This produces a lack of uniformity in the approach to and process of verifying concrete structures for the ultimate limit state of fatigue. As more research is conducted and more becomes known about the parameters which cause, propagate and indirectly affect fatigue in concrete, these findings are incorporated into design guides around the world. Nevertheless, this ultimate limit state verification is not addressed equally by the various design governing bodies. This report presents a baseline understanding of the phenomenon of fatigue, what causes it, and what loading or material conditions amplify or reduce the likelihood of fatigue failure. Four different design codes are presented and their verification processes examined, compared and evaluated both qualitatively and quantitatively. Using a wind turbine tower as a case study, the report presents calculated results following the verification processes prescribed in the respective reference codes.


The latest video coding standards, like HEVC (High Efficiency Video Coding, approved in January 2013), require devices able to support a high computational load for their implementation. Since a single Digital Signal Processor (DSP) is no longer sufficient, multicore devices have recently appeared on the market. However, due to their novelty, the working methodology for producing solutions for these configurations is at a very early stage, since currently most of the work needs to be performed manually. The objective is therefore to find methodologies that ease this process. The study has focused on extending a methodology, still under development, for the generation of solutions for PCs and embedded systems. The RVC (Reconfigurable Video Coding) and HEVC standards have been employed, as well as DSPs from Texas Instruments. The study tries to address all the factors that influence both the development and deployment of these new video decoder implementations, ranging from the tools used to aspects of algorithm partitioning, without causing a drop in application performance. The results of this study are a description of the employed methodology, a characterization of the software migration process and performance measurements for the HEVC standard in an RVC-based implementation.


Fuel cycles are designed with the aim of obtaining the highest possible amount of energy. Since higher burnup values are reached, it is necessary to improve disposal designs, traditionally based on the conservative assumption that they contain fresh fuel. The criticality calculations involved must take burnup into account, making the most of the experimental and computational capabilities developed to measure and predict, respectively, the isotopic content of spent nuclear fuel. These high burnup scenarios encourage a review of the computational tools to find possible weaknesses in the nuclear data libraries, in the methodologies applied and in their applicability range. Experimental measurements of spent nuclear fuel provide the perfect framework to benchmark the best-established codes in both industry and academic research. For the present paper, SCALE 6.0/TRITON and MONTEBURNS 2.0 have been chosen to follow the isotopic content of four samples irradiated in the Spanish Vandellós-II pressurized water reactor up to burnup values ranging from 40 GWd/MTU to 75 GWd/MTU. By comparison with the experimental data reported for these samples, we can probe the applicability of these codes to high burnup problems. We have developed new computational tools within MONTEBURNS 2.0 that make it possible to handle an irradiation history including geometrical and positional changes of the samples within the reactor core. This paper describes the irradiation scenario against which the mentioned codes and our capabilities are to be benchmarked.


The ex ante quantification of impacts is compulsory when establishing a Rural Development Program (RDP) in the European Union. The purpose of this paper is therefore to learn how to perform it better. To this end, all 88 European 2007-2013 RDPs and all of their available ex ante evaluations were analyzed. Results show that less than 50% of the RDPs quantify all the impact indicators, and that the most widely used methodology that allows the quantification of all impact indicators is Input-Output analysis. Two main difficulties are cited for not accomplishing the impact quantification: the heterogeneity of the actors and factors involved in the program impacts, and the lack of the needed information. These difficulties should be addressed by using new methods that can approach the complexity of the programs, and by better planning that facilitates gathering the needed information.
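The Input-Output methodology mentioned above rests on the Leontief model: the total (direct plus indirect) sectoral output x needed to satisfy a final demand d is x = (I - A)^{-1} d, where A is the technical-coefficients matrix. A minimal sketch with invented numbers:

```python
import numpy as np

# Toy 3-sector economy (say agriculture, industry, services).
# A[i, j] = input from sector i needed per unit of output of sector j.
# All coefficients are illustrative, not taken from any real RDP.
A = np.array([[0.10, 0.05, 0.02],
              [0.20, 0.30, 0.10],
              [0.05, 0.10, 0.15]])

# Extra final demand injected by a hypothetical rural development measure.
delta_demand = np.array([10.0, 2.0, 1.0])  # monetary units

# Leontief inverse: total output needed to satisfy the extra demand.
L = np.linalg.inv(np.eye(3) - A)
delta_output = L @ delta_demand

# Impact indicators follow from sectoral intensities, e.g. jobs per unit
# of output (intensities below are assumed for illustration).
employment_intensity = np.array([0.8, 0.3, 0.5])
extra_jobs = employment_intensity @ delta_output

print("extra output by sector:", delta_output)
print("total employment impact:", extra_jobs)
```

Because the Leontief inverse captures all supply-chain rounds, delta_output is never smaller than the injected demand itself, which is why this method can quantify every impact indicator for which a sectoral intensity vector exists.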


Road transport is one of the major contributors to fuel consumption and emissions in Spain. Consequently, assessing the environmental impacts of road traffic is essential for climate change mitigation and energy efficiency programs. However, one of the key challenges for policy makers and transport planners consists of implementing consistent emissions assessment methodologies, applying mitigation strategies, and knowing their effectiveness. Current state-of-the-art emissions assessment methodologies estimate emissions at different levels and for different periods, using different approaches. Nevertheless, these studies are one-off exercises and usually apply different methodologies for analysing different strategies or policies, without regarding the assessment as a whole. This doctoral thesis provides knowledge and methodologies for analysing policies designed to reduce road traffic emissions, using Spain as a case study. The research consists of two main parts: i) the development and application of methodologies for analysing the key factors and policies driving the GHG emissions of road transport in Spain, from a national perspective; and ii) the development and application of a road traffic emissions model for assessing operational and infrastructure strategies of the interurban road network at segment level. In summary, this thesis demonstrates the appropriateness of using different tools to analyse road traffic emissions at different levels: from nationwide mitigation and energy efficiency policies, to strategies focused on the operation of interurban traffic and infrastructure.
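A segment-level emissions model of the kind described can be sketched as an average-speed emission-factor calculation: for each road segment, emissions = EF(speed) × traffic flow × segment length. The emission-factor curve and the network below are made-up placeholders, not COPERT/HBEFA factors or data from the thesis.

```python
# Minimal average-speed emission model for interurban road segments.
# Both the factor curve and the network are invented for illustration.

def emission_factor_gpkm(speed_kmh: float) -> float:
    """Illustrative U-shaped CO2 factor (g/veh-km): worst at low and high speeds."""
    return 90.0 + 0.015 * (speed_kmh - 70.0) ** 2

def segment_emissions_kg(flow_veh_h: float, length_km: float,
                         speed_kmh: float, hours: float = 1.0) -> float:
    """Emissions of one segment over a period: EF(speed) * vehicle-kilometres."""
    veh_km = flow_veh_h * hours * length_km
    return emission_factor_gpkm(speed_kmh) * veh_km / 1000.0  # g -> kg

# Hypothetical network: (flow veh/h, length km, mean speed km/h) per segment.
segments = [(1800, 4.2, 110), (2400, 2.5, 85), (900, 6.0, 60)]

total = sum(segment_emissions_kg(f, l, v) for f, l, v in segments)
print(f"network CO2 over one hour: {total:.1f} kg")
```

An operational strategy (e.g. a speed limit change) is then assessed by recomputing the same sum with the new per-segment speeds and comparing totals.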


IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address the respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequence) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. equipment failures, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses.
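The coupling of frequency and consequence can be illustrated with a minimal sketch: a stochastic disturbance (the random time at which a cooling system fails) drives a deterministic plant transient (a linear heat-up until a fixed repair time), and Monte Carlo sampling over the disturbance yields the probability that the deterministic consequence exceeds a safety limit. Every number below is invented for illustration and does not come from any real plant model.

```python
import random

random.seed(42)

# Stochastic side: failure time of a cooling system, exponentially
# distributed. Deterministic side: linear temperature rise while
# cooling is lost, capped by a fixed repair time. All values assumed.
FAILURE_RATE = 1.0 / 500.0   # failures per hour
REPAIR_TIME = 24.0           # hours until cooling is recovered
HEATUP_RATE = 8.0            # deg C per hour without cooling
T0, T_LIMIT = 300.0, 450.0   # initial and limit temperatures (deg C)
MISSION = 1000.0             # mission time (hours)

def deterministic_peak(t_fail: float) -> float:
    """Deterministic consequence: peak temperature given the failure time."""
    if t_fail > MISSION:
        return T0                       # no failure within the mission
    uncooled = min(REPAIR_TIME, MISSION - t_fail)
    return T0 + HEATUP_RATE * uncooled  # linear heat-up until repair

# Couple the probabilistic (frequency) and deterministic (consequence)
# sides by sampling the disturbance and running the plant response.
n = 20000
exceed = 0
for _ in range(n):
    t_fail = random.expovariate(FAILURE_RATE)  # stochastic disturbance
    if deterministic_peak(t_fail) > T_LIMIT:
        exceed += 1

print(f"P(peak > {T_LIMIT} C) ~ {exceed / n:.3f}")
```

In a real IDPSA analysis the one-line heat-up model is replaced by a full thermal-hydraulic transient simulation, but the coupling structure — sample the stochastic disturbances, run the deterministic plant response, aggregate frequency-consequence pairs — is the same.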