937 results for methodologies
Abstract:
Consequence analysis is a key aspect of anchoring the assessment of landslide impacts to present and long-term development planning. Although several approaches have been developed over the last decade, some are difficult to apply in practice, mainly because of the lack of reliable data on historical damage or on damage functions. In this paper, two possible consequence indicators based on a combination of descriptors of the exposure of the elements at risk are proposed in order to map the potential impacts of landslides and highlight the most vulnerable areas. The first index maps the physical vulnerability due to landslides; the second maps both the direct damage (physical, structural, functional) and the indirect damage (socio-economic impacts) of landslide hazards. The indexes have been computed for the 200 km² Barcelonnette Basin (South French Alps), and their potential applications are discussed.
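A minimal sketch of how a composite consequence index of the kind described in the abstract above might be aggregated from exposure descriptors of the elements at risk; the descriptor names, weights, and aggregation rule are hypothetical illustrations, not those used in the Barcelonnette study.

```python
# Hypothetical sketch: aggregate exposure descriptors of elements at risk
# into a composite consequence index per map unit. Descriptor names and
# weights are illustrative only.

DESCRIPTOR_WEIGHTS = {
    "buildings_exposed": 0.4,      # physical/structural exposure
    "roads_exposed": 0.2,          # functional exposure (network disruption)
    "population_exposed": 0.3,     # socio-economic exposure
    "economic_activity": 0.1,      # indirect socio-economic exposure
}

def consequence_index(descriptors: dict) -> float:
    """Weighted sum of normalised descriptors (each expected in [0, 1])."""
    return sum(DESCRIPTOR_WEIGHTS[name] * value
               for name, value in descriptors.items()
               if name in DESCRIPTOR_WEIGHTS)

# Example map unit: moderately exposed buildings and roads, few people.
unit = {"buildings_exposed": 0.6, "roads_exposed": 0.4,
        "population_exposed": 0.1, "economic_activity": 0.2}
print(round(consequence_index(unit), 2))  # 0.37
```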
Abstract:
Virus-specific CD4(+) T cells play a major role in viral infections such as hepatitis C virus (HCV) infection. Viral clearance is associated with vigorous and multi-specific CD4(+) T-cell responses, whereas chronic infection is associated with weak or absent T-cell responses. Most of these studies have used functional assays to analyze virus-specific CD4(+) T-cell responses; however, these and other detection methods have various limitations. Therefore, the important question of whether virus-specific CD4(+) T cells are completely absent or primarily impaired in specific effector functions during chronic infection has yet to be analyzed in detail. A novel assay, in which virus-specific CD4(+) T-cell frequencies are determined by de novo CD154 (CD40 ligand) expression in response to viral antigens, can help to overcome some of the limitations of functional assays and the restrictions of multimer-based methods. This and other currently established methods for the detection of HCV-specific CD4(+) T cells are discussed in this review.
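A minimal sketch of how the readout of a CD154-upregulation assay is commonly turned into an antigen-specific frequency, by subtracting the unstimulated background from the antigen-stimulated sample; the function name and event counts are hypothetical and not taken from the review.

```python
# Hypothetical sketch: estimate the antigen-specific CD4+ T-cell frequency
# from a CD154 (CD40L) upregulation assay by subtracting the unstimulated
# background from the antigen-stimulated sample. Figures are illustrative.

def antigen_specific_frequency(cd154_pos_stimulated: int,
                               cd4_total_stimulated: int,
                               cd154_pos_unstimulated: int,
                               cd4_total_unstimulated: int) -> float:
    """Background-subtracted frequency of CD154+ cells among CD4+ T cells."""
    stimulated = cd154_pos_stimulated / cd4_total_stimulated
    background = cd154_pos_unstimulated / cd4_total_unstimulated
    return max(stimulated - background, 0.0)

# Example: 120 CD154+ events among 100,000 CD4+ cells after antigen
# stimulation, versus 20 CD154+ events among 100,000 unstimulated cells.
freq = antigen_specific_frequency(120, 100_000, 20, 100_000)
print(f"{freq:.4%}")  # 0.1000% of CD4+ T cells
```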
Abstract:
The objective of the following chapters is to help address these data quality issues by improving research on the measurement of abortion incidence and abortion-related morbidity. To do so, they provide overviews of existing methods of, and approaches to, estimating abortion incidence and morbidity. The volume supplies detailed descriptions and examples of key methods. Its goal is to provide a clear understanding of the relative merits of available study designs for quantifying abortion incidence and abortion-related morbidity. Information on methodologies will greatly assist researchers worldwide in carrying out studies on these topics, particularly in settings where abortion is legally restricted.
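As a rough illustration of one widely used indirect study design for settings with unreliable direct reporting (not necessarily one of the specific methods in this volume), incidence can be estimated by scaling the number of women treated for complications by a multiplier; all figures below are hypothetical.

```python
# Illustrative sketch of a multiplier-based indirect estimate of abortion
# incidence: scale the number of women treated for complications by the
# estimated number of abortions per treated complication. All figures are
# hypothetical.

def estimated_incidence(treated_complications: int,
                        multiplier: float,
                        women_15_44: int) -> tuple[float, float]:
    """Return (total abortions, rate per 1,000 women aged 15-44)."""
    total = treated_complications * multiplier
    rate_per_1000 = total / women_15_44 * 1000
    return total, rate_per_1000

total, rate = estimated_incidence(treated_complications=8_000,
                                  multiplier=5.0,
                                  women_15_44=1_200_000)
print(total, round(rate, 1))  # 40000.0 33.3
```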
Abstract:
Fil: Sánchez, Leandro Enrique. Universidad Nacional de La Plata. Facultad de Humanidades y Ciencias de la Educación. Instituto de Investigaciones en Humanidades y Ciencias Sociales (UNLP-CONICET); Argentina.
Abstract:
Predicting tritium production is required for sample-handling procedures, safety and maintenance, and the licensing of the International Fusion Materials Irradiation Facility (IFMIF).
Abstract:
- Need for tritium production
- Neutronic objectives
- The Frascati experiment
- Measurements of tritium activity
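A minimal sketch of the arithmetic linking a predicted tritium inventory to a measurable activity, using A = λN with the tritium half-life of about 12.32 years; the inventory value is purely illustrative, not an IFMIF prediction.

```python
import math

# Hypothetical sketch: convert a predicted tritium inventory (number of
# produced T atoms) into an activity in Bq using A = lambda * N, with the
# tritium half-life of about 12.32 years. The inventory is illustrative.

HALF_LIFE_S = 12.32 * 365.25 * 24 * 3600         # tritium half-life in seconds
DECAY_CONST = math.log(2) / HALF_LIFE_S          # lambda [1/s]

def tritium_activity_bq(n_atoms: float) -> float:
    """Activity in becquerel for a given number of tritium atoms."""
    return DECAY_CONST * n_atoms

n_atoms = 1.0e18                                  # illustrative inventory
print(f"{tritium_activity_bq(n_atoms):.3e} Bq")   # ~1.783e+09 Bq
```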
Abstract:
Determining spent nuclear fuel isotopic content as accurately as possible is gaining importance because of its safety and economic implications. Since higher burn-ups are nowadays achievable through higher initial enrichments, more efficient burn-up strategies within the reactor cores and the extension of irradiation periods, establishing and improving computational methodologies is mandatory in order to carry out reliable criticality and isotopic prediction calculations. Several codes (WIMSD5, SERPENT 1.1.7, SCALE 6.0, MONTEBURNS 2.0 and MCNP-ACAB) and methodologies are tested here and compared against consolidated benchmarks (the OECD/NEA light-water-moderated pin cell) with the purpose of validating them and reviewing the state of isotopic prediction capabilities. These preliminary comparisons suggest what can generally be expected of these codes when applied to real problems. In the present paper, SCALE 6.0 and MONTEBURNS 2.0 are used to model the same reported geometries, material compositions and burn-up history of cycles 7-11 of the Spanish Vandellós II reactor and to reproduce measured isotopic compositions after irradiation and decay times. We compare the measurements with each code's results for several levels of geometric modelling detail, using different libraries and cross-section treatment methodologies. The power and flux normalization method implemented in MONTEBURNS 2.0 is discussed, and a new normalization strategy is developed for the selected and similar problems; further options are included to reproduce the temperature distributions of the materials within the fuel assemblies, and a new code is introduced to automate series of simulations and manage material information between them. In order to obtain a realistic confidence level in the prediction of spent fuel isotopic content, we have estimated uncertainties using our MCNP-ACAB system. This depletion code, which combines the neutron transport code MCNP and the inventory code ACAB, propagates uncertainties in the nuclide inventory, assessing the potential impact of uncertainties in the basic nuclear data: cross-sections, decay data and fission yields.
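A minimal sketch of the generic power normalization used in MCNP-based depletion, where per-source-particle tallies are scaled to an absolute flux using the thermal power, the recoverable energy per fission, and the tallied fission rate; this is the textbook scheme, not the MONTEBURNS 2.0 implementation or the new strategy developed in the paper, and all tally values are hypothetical.

```python
# Illustrative sketch of a standard power normalization for MCNP-based
# depletion: MCNP tallies are per source particle, so an absolute flux is
# obtained by scaling with the thermal power, the recoverable energy per
# fission, and the tallied fission rate.

EV_TO_J = 1.602176634e-19
Q_FISSION_J = 200.0e6 * EV_TO_J        # ~200 MeV recoverable per fission (approx.)

def normalized_flux(power_w: float,
                    flux_per_source: float,
                    fissions_per_source: float) -> float:
    """Absolute flux [n/cm^2/s] from per-source-particle MCNP tallies."""
    source_rate = power_w / (Q_FISSION_J * fissions_per_source)  # particles/s
    return flux_per_source * source_rate

# Hypothetical pin-cell values: 60 kW of pin power, a tallied flux of 0.07
# n/cm^2 per source particle, and 0.4 fissions per source particle.
print(f"{normalized_flux(6.0e4, 0.07, 0.4):.3e}")  # ~3.28e+14 n/cm^2/s
```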
Abstract:
In the early 1990s, ontology development was more like an art: ontology developers had no clear guidelines on how to build ontologies, only some design criteria to follow. Work on principles, methods and methodologies, together with supporting technologies and languages, turned ontology development into an engineering discipline, the so-called Ontology Engineering. Ontology Engineering refers to the set of activities that concern the ontology development process and the ontology life cycle, the methods and methodologies for building ontologies, and the tool suites and languages that support them. Thanks to the work done in the Ontology Engineering field, the development of ontologies within and between teams has increased and improved, as has the possibility of reusing ontologies in other developments and in final applications. Currently, ontologies are widely used in (a) Knowledge Engineering, Artificial Intelligence and Computer Science; (b) applications related to knowledge management, natural language processing, e-commerce, intelligent information integration, information retrieval, database design and integration, bio-informatics, and education; and (c) the Semantic Web, the Semantic Grid, and the Linked Data initiative. In this paper, we provide an overview of Ontology Engineering, mentioning the most outstanding and most widely used methodologies, languages, and tools for building ontologies. In addition, we briefly comment on how all these elements can be used in the Linked Data initiative.
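A minimal sketch of the kind of artifact such work produces, using the Python rdflib library (an assumption; the overview does not prescribe a toolset): a tiny ontology fragment with a class hierarchy and one property, serialized as Turtle, the form in which ontologies are typically published as Linked Data. The namespace and terms are hypothetical.

```python
# Minimal sketch (assumes the third-party rdflib library): define a tiny
# ontology fragment and serialize it as Turtle for publication as Linked Data.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/onto#")   # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# A small class hierarchy and one object property.
g.add((EX.Person, RDF.type, OWL.Class))
g.add((EX.Researcher, RDF.type, OWL.Class))
g.add((EX.Researcher, RDFS.subClassOf, EX.Person))
g.add((EX.worksOn, RDF.type, OWL.ObjectProperty))
g.add((EX.worksOn, RDFS.label, Literal("works on", lang="en")))

print(g.serialize(format="turtle"))
```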
Abstract:
This communication presents an overview of their first results and innovative methodologies, focusing on their possibilities and limitations for the reconstruction of recent floods and paleofloods around the world.