15 results for Laws and scientometric indicators

at Universidad Politécnica de Madrid


Relevance: 100.00%

Abstract:

Nowadays, developers of web application mashups face an overwhelming variety of web services. Choosing appropriate web services to achieve specific goals therefore requires a certain amount of knowledge as well as expertise. To support users in choosing appropriate web services, it is important not only to match their search criteria against a dataset of possible choices but also to rank the results according to their relevance, thus minimizing the time the choice takes. We therefore investigated six ranking approaches empirically and compared them to each other. Moreover, we examined how these ranking algorithms can be combined linearly to maximize the quality of their output.
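The linear combination of rankers described above can be sketched as follows; the ranker names, raw scores and weights are invented for illustration, since the abstract does not give them.

```python
# Hedged sketch: combine several ranking algorithms' scores linearly.
# The rankers, services, scores and weights below are hypothetical.

def normalize(scores):
    """Min-max normalize a dict of service -> raw score into [0, 1]."""
    lo, hi = min(scores.values()), max(scores.values())
    span = hi - lo or 1.0
    return {s: (v - lo) / span for s, v in scores.items()}

def combine_rankings(rankers, weights):
    """Linearly combine normalized score dicts; weights should sum to 1."""
    combined = {}
    for scores, w in zip(rankers, weights):
        for service, v in normalize(scores).items():
            combined[service] = combined.get(service, 0.0) + w * v
    # Return services ordered by combined relevance, best first.
    return sorted(combined, key=combined.get, reverse=True)

# Three hypothetical rankers scoring the same web services.
popularity = {"maps": 90, "weather": 40, "geocode": 70}
text_match = {"maps": 0.2, "weather": 0.9, "geocode": 0.5}
freshness  = {"maps": 5,   "weather": 30,  "geocode": 10}

ranking = combine_rankings([popularity, text_match, freshness],
                           weights=[0.6, 0.3, 0.1])
```

Normalizing each ranker's output first keeps any single ranker's scale from dominating the linear mix.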

Relevance: 100.00%

Abstract:

This paper presents the main results of a comparative evaluation of several acoustical parameters against users' perception of urban sounds. The study was carried out in three open spaces with different environmental characteristics but similar objective conditions of urban noise. The subjective evaluation was done by means of a survey conducted simultaneously with the objective measurements. The results of the crossed analysis confirmed that in environments with similar noise levels there is not always a direct correlation between the objective indicators and people's acoustic comfort. To predict the acoustical quality of the soundscape it is necessary to consider aspects such as the background noise and the perception of natural or technological sounds as complements to the general sound level.
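A minimal sketch of the kind of crossed analysis described, correlating an objective indicator with mean subjective comfort ratings; all figures are invented, not the study's data.

```python
# Hedged sketch: Pearson correlation between an objective indicator
# (equivalent noise level, dB) and subjective comfort ratings.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Three sites with similar noise levels but different comfort ratings,
# mimicking the finding that levels alone do not predict comfort.
leq_db  = [65.0, 66.0, 65.5]
comfort = [3.8, 2.1, 4.0]   # 1 (poor) .. 5 (good), invented
r = pearson_r(leq_db, comfort)
```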

Relevance: 100.00%

Abstract:

Sustainability is an adjective used to characterize agriculture according to the degree of fulfillment of goals related to agro-ecological, environmental and socio-economic dimensions. Sustainability has a dynamic, temporal character: in absolute terms there is no final value, because it changes as its dimensions change. Spain is one of the main agricultural countries of the European Union, both in terms of cropland and the value of its production. The object of this study is to present a methodology for sustainability accounting to be incorporated into national statistics, and to assess its performance over the years. The data sources used were the statistics of the Department of Agriculture and other databases. We present a set of sustainability indicators and their evaluation over a time series of at least 30 years. The trend analysis shows the evolution of the indicators' numerical values in terms of efficiency: physical units used per unit of product or per euro of its value. The crops analyzed were wheat, barley, maize, sunflower, sugar beet, wine grape, olive oil, citrus, melon and tomato. Physical indicators were land, water, energy, erosion, soil organic matter and carbon balance; socio-economic indicators were final agricultural production, prices, income, employment and fertilizer use. In general, all crops increased their productive efficiency, more so under irrigation than on dry land. Spanish agriculture's carbon sequestration capacity has multiplied by five in the last seventy years, as a result of the increase in crop productivity, in terms of total biomass, and of changes in soil management techniques. For the livestock sector, data on pork, broilers and laying hens showed an improvement in efficiency and in economic indicators.
Overall, we can say that the Spanish agriculture and livestock subsectors show a tendency towards sustainability, their main threats being extreme meteorological factors and the instability of today's markets.
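The efficiency-trend idea above (physical units used per unit of product, tracked over a time series) can be illustrated with a least-squares slope; the crop, indicator and figures below are hypothetical.

```python
# Hedged sketch: trend of an efficiency indicator over a series of years,
# via an ordinary least-squares slope. All figures are invented.

def trend_slope(years, values):
    """Ordinary least-squares slope of values against years."""
    n = len(years)
    mx, my = sum(years) / n, sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, values))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

years = [1980, 1990, 2000, 2010]
# Hypothetical water-use efficiency for maize, kg grain per m3 of water.
efficiency = [0.9, 1.1, 1.4, 1.6]
slope = trend_slope(years, efficiency)  # positive slope = improving efficiency
```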

Relevance: 100.00%

Abstract:

This paper presents the results of research aimed at formulating a general model for supporting the implementation and management of an urban road pricing scheme. After preliminary work to establish the state of the art in sustainable urban mobility strategies, the problem was set up theoretically in terms of transport economics, introducing the concept of external costs, duly translated into the principle of pricing for the use of public infrastructure. The research is based on the definition of a set of direct and indirect indicators that characterize urban areas by land use, mobility, environmental and economic conditions. These indicators were calculated for a selected set of typical urban areas in Europe on the basis of the results of a survey carried out by means of a specific questionnaire. Once the most typical and interesting applications of the road pricing concept had been identified, in cities such as London (Congestion Charging), Milan (Ecopass), Stockholm (Congestion Tax) and Rome (ZTL), an extensive benchmarking exercise and the cross-analysis of direct and indirect indicators made it possible to define a simple general model, guidelines and key requirements for implementing a pricing-based traffic restriction scheme in a generic urban area. The model was finally applied to the design of a road pricing scheme for a particular area of Madrid, and to the quantification of the expected results of its implementation from a land use, mobility, environmental and economic perspective.

Relevance: 100.00%

Abstract:

After more than 40 years of life, software evolution should be considered a mature field. However, despite such a long history, many research questions remain open, and controversial studies about the validity of the laws of software evolution are common. During the first part of these 40 years, the laws themselves evolved to adapt to changes in both the research and the software industry environments. This process of adaptation to new paradigms, standards, and practices stopped about 15 years ago, when the laws were revised for the last time; yet most of the controversial studies have appeared during this latter period. Based on a systematic and comprehensive literature review, in this paper we describe how and when the laws, and the software evolution field, evolved. We also address the current state of affairs regarding the validity of the laws, how they are perceived by the research community, and the developments and challenges that are likely to occur in the coming years.

Relevance: 100.00%

Abstract:

Leaf nitrogen and leaf surface area influence the exchange of gases between terrestrial ecosystems and the atmosphere, and play a significant role in the global cycles of carbon, nitrogen and water. The purpose of this study is to use field-based and satellite remote-sensing-based methods to assess leaf nitrogen pools in five diverse European agricultural landscapes located in Denmark, Scotland (United Kingdom), Poland, the Netherlands and Italy. REGFLEC (REGularized canopy reFLECtance) is an advanced image-based inverse canopy radiative transfer modelling system which has shown proficiency for regional mapping of leaf area index (LAI) and leaf chlorophyll (CHLl) using remote sensing data. In this study, high-spatial-resolution (10–20 m) remote sensing images acquired from the multispectral sensors aboard the SPOT (Satellite Pour l'Observation de la Terre) satellites were used to assess the capability of REGFLEC for mapping spatial variations in LAI and CHLl and their relation to leaf nitrogen (Nl) data in five diverse European agricultural landscapes. REGFLEC is based on physical laws and includes an automatic model parameterization scheme which makes the tool independent of field data for model calibration. In this study, REGFLEC performance was evaluated using LAI measurements and non-destructive measurements (using a SPAD meter) of leaf-scale CHLl and Nl concentrations in 93 fields representing crop- and grasslands of the five landscapes. Furthermore, empirical relationships between field measurements (LAI, CHLl and Nl) and five spectral vegetation indices (the Normalized Difference Vegetation Index, the Simple Ratio, the Enhanced Vegetation Index-2, the Green Normalized Difference Vegetation Index, and the green chlorophyll index) were used to assess field data coherence and to serve as a comparison basis for assessing REGFLEC model performance.
The field measurements showed strong vertical CHLl gradient profiles in 26% of fields, which affected REGFLEC performance as well as the relationships between spectral vegetation indices (SVIs) and field measurements. When the range of surface types increased, the REGFLEC results were in better agreement with field data than the empirical SVI regression models. Selecting only homogeneous canopies with uniform CHLl distributions as reference data for evaluation, REGFLEC was able to explain 69% of LAI observations (rmse = 0.76), 46% of measured canopy chlorophyll contents (rmse = 719 mg m−2) and 51% of measured canopy nitrogen contents (rmse = 2.7 g m−2). Better results were obtained for individual landscapes, except for Italy, where REGFLEC performed poorly due to a lack of dense vegetation canopies at the time of satellite recording; the presence of vegetation is needed to parameterize the REGFLEC model. Combining REGFLEC- and SVI-based model results to minimize errors for a "snapshot" assessment of total leaf nitrogen pools in the five landscapes, results varied from 0.6 to 4.0 t km−2. Differences in leaf nitrogen pools between landscapes are attributed to seasonal variations, extents of agricultural area, species variations, and spatial variations in nutrient availability. In order to facilitate a substantial assessment of variations in Nl pools and their relation to landscape-based nitrogen and carbon cycling processes, time series of satellite data are needed. The upcoming Sentinel-2 satellite mission will provide new multiple narrowband data opportunities at high spatio-temporal resolution, which are expected to further improve remote sensing capabilities for mapping LAI, CHLl and Nl.
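The five spectral vegetation indices named above have standard formulations in terms of green, red and near-infrared reflectance; a minimal sketch, with invented reflectance values:

```python
# Hedged sketch: standard formulations of the five vegetation indices
# named in the abstract, computed from band reflectances (values invented).

def vegetation_indices(green, red, nir):
    """Return the five SVIs from surface reflectances in three bands."""
    return {
        "NDVI": (nir - red) / (nir + red),            # Normalized Difference VI
        "SR": nir / red,                              # Simple Ratio
        "EVI2": 2.5 * (nir - red) / (nir + 2.4 * red + 1.0),  # Enhanced VI-2
        "GNDVI": (nir - green) / (nir + green),       # Green NDVI
        "CI_green": nir / green - 1.0,                # green chlorophyll index
    }

# Illustrative reflectances for a dense green canopy.
vi = vegetation_indices(green=0.10, red=0.05, nir=0.45)
```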

Relevance: 100.00%

Abstract:

Sustainability and the food-water-environment nexus. Food-water linkages in global agro-economic models. The CAPRI water module. Potential to jointly assess food and water policies. Pilot case study. Further development.

Relevance: 100.00%

Abstract:

This paper analyses the relationship between productive efficiency and online social networks (OSN) in Spanish telecommunications firms. A data envelopment analysis (DEA) is used, incorporating several indicators of firms' social media activity. A super-efficiency analysis and bootstrapping techniques are performed to increase the model's robustness and accuracy. A logistic regression model is then applied to characterise the factors and drivers of good performance in OSN. The results reveal the company's ability to absorb and utilise OSNs as a key factor in improving productive efficiency. The paper presents a model for assessing the strategic performance of a firm's presence and activity in OSN.
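A minimal sketch of the final modelling step, a logistic regression fit by gradient descent; the feature (an OSN activity level) and the efficiency labels are invented, and the paper's actual estimation procedure is not specified here.

```python
# Hedged sketch: logistic regression characterising a driver of good
# performance. Pure-Python batch/SGD fit; data are invented.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Per-sample gradient descent on log-loss; returns (bias, weights)."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi)))
            err = p - yi
            b -= lr * err
            for j in range(n_feat):
                w[j] -= lr * err * xi[j]
    return b, w

# Feature: normalized OSN activity; label: 1 = efficient firm (invented).
X = [[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 1, 1, 1]
b, w = fit_logistic(X, y)
predict = lambda x: sigmoid(b + w[0] * x)
```

A positive fitted weight would indicate that OSN activity is associated with higher odds of being an efficient firm, which is the shape of the paper's conclusion.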

Relevance: 100.00%

Abstract:

The development of the group-work competence is fundamental in university teaching. It should be evaluated, but this means analysing several factors across a large number of students, which is complex and demands considerable effort. To ease this evaluation, it is possible to use the logs of students' interactions stored in Learning Management Systems. This work describes the development of a Learning Analytics system used as a tool to analyse the individual evidence produced by the members of a working group. The tool is supported by a theoretical model that links the empirical evidence observed for each student with indicators obtained from both the individual and the cooperative actions of team members in the working forums.
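A minimal sketch of the indicator-extraction step, assuming forum logs reduce to (student, action) events; the event format and student names are invented, and a real system would read the platform's database.

```python
# Hedged sketch: derive simple per-student indicators from forum logs.
from collections import defaultdict

def forum_indicators(log):
    """log: list of (student, action) events from working forums.
    Returns per-student counts of posts and replies as simple indicators."""
    ind = defaultdict(lambda: {"posts": 0, "replies": 0})
    for student, action in log:
        if action == "post":
            ind[student]["posts"] += 1
        elif action == "reply":
            ind[student]["replies"] += 1
    return dict(ind)

# Invented event log for one working group.
log = [("ana", "post"), ("ana", "reply"), ("luis", "reply"),
       ("ana", "post"), ("marta", "post"), ("luis", "reply")]
indicators = forum_indicators(log)
```

Counts of posts versus replies give a crude split between individual contribution and cooperative interaction, which is the distinction the model above draws.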

Relevance: 100.00%

Abstract:

This study analyses the structure of air traffic and its distribution among the different countries of the European Union, as well as traffic with an origin or destination in non-EU countries. The data sources are Eurostat statistics and actual flight information from EUROCONTROL. Relevant variables such as the number of flights, passengers or cargo tonnes and production indicators (RPKs) are used together with fuel consumption and CO2 emissions data. Segmenting air traffic by distance permits an assessment of air transport's competition with surface transport modes. The results show a clear concentration of traffic, in terms of RPKs, in the five largest countries (France, Germany, Italy, Spain and the UK). In terms of distance, the 500–1000 km segment within the EU has more flights, passengers, RTKs and CO2 emissions than longer-distance segments. On the environmental side, the distribution of CO2 emissions among the EU Member States is presented, together with fuel efficiency parameters. In general, a direct relationship between RPKs and CO2 emissions is observed for all countries and all distance bands. Consideration is given to the uptake of alternative fuels. Segmenting CO2 emissions by distance band and aircraft type reveals which flights contribute the most to overall EU CO2 emissions. Finally, projections of future CO2 emissions are estimated according to three different air traffic growth and biofuel introduction scenarios.
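The distance-band segmentation can be illustrated by computing grams of CO2 per RPK for each band; all traffic and emission figures below are invented, not the study's data.

```python
# Hedged sketch: fuel-efficiency indicator (g CO2 per RPK) by distance band.
# Traffic and emission figures are invented for illustration.

def efficiency_by_band(bands):
    """bands: dict band -> (rpk_millions, co2_tonnes).
    Returns dict band -> grams of CO2 per RPK."""
    return {band: co2 * 1e6 / (rpk * 1e6)  # tonnes->grams over millions of RPK
            for band, (rpk, co2) in bands.items()}

traffic = {
    "<500 km":     (1200, 150_000),
    "500-1000 km": (5400, 540_000),
    ">1000 km":    (9800, 830_000),
}
g_per_rpk = efficiency_by_band(traffic)
```

With these invented figures, short-haul flights come out less fuel-efficient per RPK, reflecting the larger share of the fuel-intensive take-off and climb phases on short sectors.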

Relevance: 100.00%

Abstract:

The main scope of this research is to identify and evaluate solutions for redesigning the parcel delivery logistics process to achieve a higher level of quality and lower operational costs, energy consumption and air pollution. The study starts from an analysis of the delivery process managed by a leading company operating in Rome. The main delivery flows, personnel and fleet management costs, quality performance and environmental impacts are investigated, and the results of this analysis are benchmarked against other European situations. On this basis, a set of operational measures potentially able to tackle the objectives is identified and assessed by means of a simulation approach. The assessment is based on environmental and economic indicators that allow new and reference scenarios to be compared from the viewpoints of the key players: the operator, the customer and society. Moreover, the operational measures are combined into alternative packages by looking for the sets capable of maximizing the benefits for the key players. The methodology, tested on the Rome case study, is general and flexible enough to be extended to the parcel delivery problem in different urban contexts, as well as to similar urban distribution problems (e.g. press, food, security, school deliveries).
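Combining measures into packages and scoring them for the key players could be sketched as below; the measures, their indicator effects and the weights are all hypothetical, and a simple additive model stands in for the paper's simulation-based assessment.

```python
# Hedged sketch: enumerate packages of operational measures and score them
# on operator cost, customer quality and emissions. All numbers invented;
# interactions between measures are ignored (additive toy model).
from itertools import combinations

MEASURES = {  # indicator deltas: (operator_cost, customer_quality, emissions)
    "night_delivery": (-0.10, +0.05, -0.15),
    "cargo_bikes":    (+0.05, +0.02, -0.30),
    "parcel_lockers": (-0.20, -0.05, -0.10),
}

def package_score(package, weights=(0.4, 0.3, 0.3)):
    """Weighted benefit: lower cost, higher quality, lower emissions are good."""
    cost = sum(MEASURES[m][0] for m in package)
    quality = sum(MEASURES[m][1] for m in package)
    emissions = sum(MEASURES[m][2] for m in package)
    w_op, w_cu, w_so = weights
    return -w_op * cost + w_cu * quality - w_so * emissions

packages = [set(c) for r in (1, 2, 3) for c in combinations(MEASURES, r)]
best = max(packages, key=package_score)
```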

Relevance: 100.00%

Abstract:

Nitrogen (N) deposition has doubled the natural N inputs received by ecosystems through biological N fixation and is currently a global problem affecting the Mediterranean regions. We evaluated the relationships between increased atmospheric N deposition and biogeochemical indicators related to soil chemical factors and cryptogam species across semiarid central, southern, and eastern Spain. The cryptogam species studied were the biocrust-forming species Pleurochaete squarrosa (moss) and Cladonia foliacea (lichen). Sampling sites were chosen in Quercus coccifera (kermes oak) shrublands and Pinus halepensis (Aleppo pine) forests to cover a range of inorganic N deposition representative of the levels found in the Iberian Peninsula (between 4.4 and 8.1 kg N ha(-1) year(-1)). We extended the ambient N deposition gradient by including experimental plots to which N had been added for 3 years at rates of 10, 20, and 50 kg N ha(-1) year(-1). Overall, N deposition (extant plus simulated) increased soil inorganic N availability and caused soil acidification. Nitrogen deposition increased phosphomonoesterase (PME) enzyme activity and the PME/nitrate reductase (NR) ratio in both species, whereas NR activity was reduced only in the moss. The responses of PME and NR activities were attributed to an induced nitrogen-to-phosphorus imbalance and to N saturation, respectively. When only ambient N deposition was considered, soil organic C and N contents were positively related to N deposition, a response driven by the pine forests. The PME/NR ratios of the moss were better predictors of N deposition rates in shrublands than PME or NR activities alone, whereas no correlation between N deposition and lichen physiology was observed. We conclude that integrative physiological measurements, such as PME/NR ratios, measured on sensitive species such as P. squarrosa, can provide useful data for national-scale biomonitoring programs, whereas soil acidification and soil C and N storage could be useful as additional corroborating ecosystem indicators of chronic N pollution.
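The use of the moss PME/NR ratio as a predictor of N deposition can be sketched as a simple linear fit; the data points are invented, not the study's measurements.

```python
# Hedged sketch: linear fit of N deposition against the PME/NR ratio.
# All data points are invented for illustration.

def least_squares(xs, ys):
    """Return (intercept, slope) of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

pme_nr = [1.0, 1.5, 2.0, 2.5]   # hypothetical PME/NR activity ratios
n_dep  = [4.4, 5.5, 6.8, 8.1]   # kg N ha-1 yr-1 (range taken from the text)
intercept, slope = least_squares(pme_nr, n_dep)
predicted = intercept + slope * 1.8  # predicted deposition at ratio 1.8
```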

Relevance: 100.00%

Abstract:

In this paper we focus on the selection of safeguards in a fuzzy risk analysis and management methodology for information systems (IS). Assets are connected by dependency relationships, and a failure of one asset may affect other assets. After computing the impact and risk indicators associated with previously identified threats, we identify and apply safeguards to reduce risks in the IS by minimizing the transmission probabilities of failures throughout the asset network. However, as safeguards have associated costs, the aim is to select the safeguards that minimize cost while keeping the risk within acceptable levels. To do this, we propose a dynamic programming-based method that incorporates simulated annealing to tackle the resulting optimization problems.
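A minimal sketch of the optimization step, assuming an additive risk-reduction model in place of the paper's fuzzy network propagation; the costs, risk reductions and threshold are invented, and a penalty term stands in for the risk constraint.

```python
# Hedged sketch: simulated annealing over subsets of safeguards, minimizing
# cost subject to a risk threshold (penalized). Toy additive risk model.
import math, random

SAFEGUARDS = [  # (cost, risk_reduction) - invented
    (10, 0.30), (15, 0.45), (7, 0.20), (20, 0.55), (5, 0.10),
]
BASE_RISK, MAX_RISK = 1.0, 0.40

def risk(state):
    return max(0.0, BASE_RISK - sum(r for s, (_, r) in zip(state, SAFEGUARDS) if s))

def energy(state):
    cost = sum(c for s, (c, _) in zip(state, SAFEGUARDS) if s)
    penalty = 1000.0 * max(0.0, risk(state) - MAX_RISK)  # infeasibility penalty
    return cost + penalty

def anneal(steps=5000, t0=50.0, seed=1):
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in SAFEGUARDS]
    best = state[:]
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9        # linear cooling schedule
        cand = state[:]
        cand[rng.randrange(len(cand))] ^= True   # flip one safeguard
        if (energy(cand) <= energy(state)
                or rng.random() < math.exp((energy(state) - energy(cand)) / t)):
            state = cand
        if energy(state) < energy(best):
            best = state[:]
    return best

best = anneal()
```

The penalty makes infeasible subsets expensive, so the annealer settles on cheap safeguard sets that keep total risk under the threshold.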

Relevance: 100.00%

Abstract:

During the last decades, sustainable agriculture has been the subject of considerable academic interest and debate, not only in conceptual terms but also in methodological ones. The persistence of food insecurity and the deterioration of natural resources in many regions of the world has led to numerous initiatives focused on revitalizing peasant agriculture, and to renewed discussions of the role of agriculture as an engine for development, environmental conservation and poverty alleviation. Therefore, to assess mountain farming systems we must consider the food dimension and take into account the specificities of mountain systems as the foundation of sustainability. When evaluating the contribution of alternative technological and management proposals to the sustainability and food security of peasant farming systems in the Mesoamerican highlands, three research questions arise: Is the sustainability of peasant farming systems being evaluated taking into account climate variability, the participation of farmers and temporal dynamics? Can we identify common trends in these systems and extrapolate the results to other areas? Are the alternative proposals that have been implemented unequivocally positive?
This work presents three sustainability evaluations that try to highlight the challenges and opportunities currently facing mountain farming systems in Mesoamerica. First, we evaluate three agricultural management systems over two contrasting weather years. We found that during the year with heavy rains and moderate temperatures, the low-input systems, based on organic fertilizers and crop rotation, obtained better results on ecological indicators, and similar results on economic and social indicators, than the high-chemical-input system. In the second year, with early frosts and a winter drought, productivity declined in all systems, but the most diversified systems (in terms of the maize varieties grown and the sowing of other crops) withstood the climatic adversities better. Second, we evaluate farmers' perceptions to determine the key factors driving the sustainability and the food and nutritional security of their systems. The critical points identified by the farmers (landholding size and slope of cropland) significantly affect economic questions but do not explain the existing nutritional imbalances. We compared two contrasting hamlets according to their energy and protein supply: Limón Timoté (LT), which did not present food problems, and Limón Peña Blanca (LP), which did exhibit food insecurity. The results showed that food and nutritional security is linked to the sustainability of the systems, and specifically to the sustainability attributes of self-reliance and equity. Although the more marginal and less accessible communities exhibited more food insecurity, the variability between groups was very high, showing that food and nutritional security is part of a complex array of self-sufficiency strategies linked to the idiosyncrasies of each household. Third, we evaluated the performance of farmer field schools (FFSs) in improving the sustainability and food security of a peasant mountain system.
To appreciate the long-term impact of this methodology, we studied three communities where FFSs had been implemented eight, five and three years earlier. We found that the impact was gradual, since the community that first implemented FFSs scored highest. The impact of FFSs was rapid and persistent for indicators related to participation, access to basic services and conservation of natural resources. The study demonstrates a clear potential of FFSs for the overall improvement of the sustainability and food security of these systems; however, a direct relationship was observed between the increase in agricultural production and the use of external inputs, which may be a critical point for sustainable ideals.

Relevance: 100.00%

Abstract:

Deterministic safety analysis (DSA) is the procedure used to design safety-related systems, structures and components in nuclear power plants. DSA is based on computational simulations of a set of hypothetical accidents representative of the installation, called design basis scenarios (DBS). Regulatory bodies specify a set of safety magnitudes that must be calculated in the simulations, and establish regulatory acceptance criteria (RAC), which are constraints that the values of those magnitudes must satisfy. DSA methodologies can be of two types: conservative or realistic. Conservative methodologies use markedly pessimistic, and therefore relatively simple, predictive models and hypotheses; they do not need to include an uncertainty analysis of their results. Realistic methodologies are based on realistic, generally mechanistic, hypotheses and predictive models, supplemented with an uncertainty analysis of their main results. They are also called BEPU ("Best Estimate Plus Uncertainty") methodologies, and in them uncertainty is represented, essentially, in probabilistic terms. For conservative methodologies, the RAC are simply constraints on the calculated values of the safety magnitudes, which must remain confined within an "acceptance region" of their range. For BEPU methodologies, the RAC cannot be so simple, because the safety magnitudes are now uncertain variables. This thesis develops the way to introduce uncertainty into the RAC. Essentially, confinement within the same acceptance region, established by the regulator, is maintained; but strict compliance is no longer demanded, only a high level of certainty. In the adopted formalism, this is understood as a "high level of probability", and that probability refers to the computational uncertainty of the safety magnitudes.
That uncertainty can be regarded as originating in the inputs to the computational model and propagated through the model. The uncertain inputs include the initial and boundary conditions of the calculation and the empirical model parameters, which are used to incorporate the uncertainty due to model imperfection. Compliance with the RAC is therefore required with a probability no lower than a value P0, close to 1 and defined by the regulator (the probability or coverage level). However, the computational uncertainty of the magnitude is not the only uncertainty present. Even if a model (its basic equations) is known perfectly, the input-output mapping it produces is known only imperfectly (unless the model is very simple). The uncertainty due to ignorance about the action of the model is called epistemic; it can also be described as uncertainty about the propagation. The consequence is that the probability of compliance with the RAC cannot be known perfectly; it is itself an uncertain magnitude. This justifies another term used here for this epistemic uncertainty: meta-uncertainty. The RAC must incorporate both types of uncertainty: that of the calculated safety magnitude (here called aleatory) and that of the calculated probability (called epistemic, or meta-uncertainty). Both uncertainties can be introduced in two ways: separately or combined. In either case, the RAC becomes a probabilistic criterion. If the uncertainties are separated, a second-order probability is used; if they are combined, a single probability is used. If a second-order probability is employed, the regulator must impose a second compliance level, referring to the epistemic uncertainty. It is called the regulatory confidence level, and must be a number close to 1. The pair formed by the two regulatory levels (probability and confidence) is called the regulatory tolerance level.
The thesis argues that the best way to construct the BEPU RAC is by separating the uncertainties, for two reasons. First, experts advocate the separate treatment of aleatory and epistemic uncertainty. Second, the separated RAC is (except in exceptional cases) more conservative than the combined RAC. The BEPU RAC is nothing but a hypothesis about a probability distribution, and it is checked statistically. The thesis classifies the statistical methods for checking the BEPU RAC into 3 categories, according to whether they are based on the construction of tolerance regions, on quantile estimation, or on probability estimation (whether of compliance or of exceedance of regulatory limits). Following recently proposed terminology, the first two categories correspond to Q-methods and the third to P-methods. The purpose of the classification is not to make an inventory of the methods in each category, which are numerous and varied, but to relate the categories to one another and to cite the most widely used methods and those best regarded from the regulatory point of view. Special mention is made of the method most used to date: Wilks' nonparametric method, together with its extension by Wald to the multidimensional case. Its homologous P-method, the Clopper-Pearson interval, typically ignored in the BEPU field, is described. In this context, the problem of the computational cost of uncertainty analysis is discussed. The Wilks, Wald and Clopper-Pearson methods require the random sample used to have a minimum size, which grows with the required tolerance level. The sample size is an indicator of computational cost, because each sample element is a value of the safety magnitude that requires a run of the predictive models.
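As a numerical companion (not from the thesis text itself): the minimum Wilks sample size for a one-sided, first-order tolerance bound follows from requiring 1 − p0^n ≥ c, and the matching Clopper-Pearson lower confidence bound when all n runs satisfy the criterion is (1 − c)^(1/n).

```python
# Hedged sketch: Wilks minimum sample size (one-sided, first order) and the
# Clopper-Pearson lower bound for the all-runs-pass case.
import math

def wilks_n(p0, c):
    """Smallest n with 1 - p0**n >= c, i.e. the maximum of n runs serves as
    an upper tolerance bound at probability level p0, confidence level c."""
    return math.ceil(math.log(1.0 - c) / math.log(p0))

def clopper_pearson_lower_all_pass(n, c):
    """Lower c-confidence bound on the compliance probability given
    n successes in n trials (zero observed failures)."""
    return (1.0 - c) ** (1.0 / n)

n = wilks_n(0.95, 0.95)                      # the classic 95/95 requirement
p_lower = clopper_pearson_lower_all_pass(n, 0.95)
```

With n = 59 runs and no failures, the Clopper-Pearson lower bound just exceeds 0.95, illustrating the correspondence between the Q-method (Wilks) and its homologous P-method noted above.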
Se hace especial énfasis en el coste computacional cuando la magnitud de seguridad es multidimensional; es decir, cuando el CRA es un criterio múltiple. Se demuestra que, cuando las distintas componentes de la magnitud se obtienen de un mismo cálculo, el carácter multidimensional no introduce ningún coste computacional adicional. Se prueba así la falsedad de una creencia habitual en el ámbito BEPU: que el problema multidimensional sólo es atacable desde la extensión de Wald, que tiene un coste de computación creciente con la dimensión del problema. En el caso (que se da a veces) en que cada componente de la magnitud se calcula independientemente de las demás, la influencia de la dimensión en el coste no se puede evitar. Las primeras metodologías BEPU hacían la propagación de incertidumbres a través de un modelo sustitutivo (metamodelo o emulador) del modelo predictivo o código. El objetivo del metamodelo no es su capacidad predictiva, muy inferior a la del modelo original, sino reemplazar a éste exclusivamente en la propagación de incertidumbres. Para ello, el metamodelo se debe construir con los parámetros de input que más contribuyan a la incertidumbre del resultado, y eso requiere un análisis de importancia o de sensibilidad previo. Por su simplicidad, el modelo sustitutivo apenas supone coste computacional, y puede estudiarse exhaustivamente, por ejemplo mediante muestras aleatorias. En consecuencia, la incertidumbre epistémica o metaincertidumbre desaparece, y el criterio BEPU para metamodelos se convierte en una probabilidad simple. En resumen: el regulador aceptará con más facilidad los métodos estadísticos que menos hipótesis necesiten; los exactos más que los aproximados; los no paramétricos más que los paramétricos, y los frecuentistas más que los bayesianos. El criterio BEPU se basa en una probabilidad de segundo orden. 
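La idea del metamodelo puede esbozarse así (el «código», el límite y las distribuciones son hipotéticos, a modo de ejemplo): se ajusta un sustituto polinómico con unas pocas ejecuciones del código, y la incertidumbre se propaga con una muestra aleatoria grande a través del sustituto, que es barato de evaluar:

```python
import numpy as np

rng = np.random.default_rng(1)

def codigo(x):
    # Código predictivo hipotético, supuestamente costoso de ejecutar
    return 1000.0 + 40.0 * x + 5.0 * x**2

# Pocas ejecuciones del código para ajustar el metamodelo
x_entren = np.linspace(-3.0, 3.0, 15)
y_entren = codigo(x_entren)
coefs = np.polyfit(x_entren, y_entren, deg=2)  # metamodelo polinómico

# Propagación exhaustiva de incertidumbre a través del metamodelo
x_mc = rng.normal(0.0, 1.0, 1_000_000)
y_mc = np.polyval(coefs, x_mc)
prob_cumplimiento = np.mean(y_mc <= 1200.0)
```

Al poder usarse una muestra tan grande como se quiera, la incertidumbre de muestreo sobre `prob_cumplimiento` se hace despreciable, y el criterio BEPU para el metamodelo se reduce a una probabilidad simple; queda, eso sí, el error de sustitución del metamodelo respecto al código original.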
La probabilidad de que las magnitudes de seguridad estén en la región de aceptación no sólo puede asimilarse a una probabilidad de éxito o a un grado de cumplimiento del CRA. También tiene una interpretación métrica: representa una distancia (dentro del recorrido de las magnitudes) desde la magnitud calculada hasta los límites reguladores de aceptación. Esta interpretación da pie a una definición que propone esta Tesis: la de margen de seguridad probabilista. Dada una magnitud de seguridad escalar con un límite superior de aceptación, se define el margen de seguridad (MS) entre dos valores A y B de la misma como la probabilidad de que A sea menor que B, obtenida a partir de las incertidumbres de A y B. La definición probabilista de MS tiene varias ventajas: es adimensional, puede combinarse de acuerdo con las leyes de la probabilidad y es fácilmente generalizable a varias dimensiones. Además, no es simétrico: en general, MS(A, B) ≠ MS(B, A). El término margen de seguridad puede aplicarse a distintas situaciones: distancia de una magnitud calculada a un límite regulador (margen de licencia); distancia del valor real de la magnitud a su valor calculado (margen analítico); distancia desde un límite regulador hasta el valor umbral de daño a una barrera (margen de barrera). Esta idea de representar distancias (en el recorrido de magnitudes de seguridad) mediante probabilidades puede aplicarse al estudio del conservadurismo. El margen analítico puede interpretarse como el grado de conservadurismo (GC) de la metodología de cálculo. Utilizando la probabilidad, se puede cuantificar el conservadurismo de límites de tolerancia de una magnitud, y se pueden establecer indicadores de conservadurismo que sirvan para comparar diferentes métodos de construcción de límites y regiones de tolerancia. Un tópico que nunca se ha abordado de manera rigurosa es el de la validación de metodologías BEPU. 
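La definición de margen de seguridad probabilista admite una estimación directa por Monte Carlo. Un esbozo mínimo (las distribuciones de A y B son supuestas, a modo de ejemplo):

```python
import numpy as np

rng = np.random.default_rng(2)

# Incertidumbres hipotéticas de dos valores A y B de la magnitud de seguridad
A = rng.normal(1100.0, 30.0, 100_000)  # p. ej., valor calculado
B = rng.normal(1200.0, 10.0, 100_000)  # p. ej., límite de aceptación incierto

# Margen de seguridad probabilista: MS(A, B) = P(A < B)
ms_AB = np.mean(A < B)
ms_BA = np.mean(B < A)  # la definición no es simétrica: MS(B, A) = 1 - MS(A, B)
```

El resultado es adimensional y está en (0, 1), con independencia de las unidades de la magnitud, lo que permite combinar márgenes de distintas magnitudes según las leyes de la probabilidad.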
Como cualquier otro instrumento de cálculo, una metodología, antes de poder aplicarse a análisis de licencia, tiene que validarse, mediante la comparación entre sus predicciones y los valores reales de las magnitudes de seguridad. Tal comparación sólo puede hacerse en escenarios de accidente para los que existan valores medidos de las magnitudes de seguridad, y eso ocurre, básicamente, en instalaciones experimentales. El objetivo último del establecimiento de los CRA consiste en verificar que se cumplen para los valores reales de las magnitudes de seguridad, y no sólo para sus valores calculados. En la Tesis se demuestra que una condición suficiente para este objetivo último es la conjunción del cumplimiento de dos criterios: el CRA BEPU de licencia y un criterio análogo, pero aplicado a validación. El criterio de validación debe demostrarse en escenarios experimentales y extrapolarse a plantas nucleares. El criterio de licencia exige un valor mínimo (P0) del margen probabilista de licencia; el criterio de validación exige un valor mínimo del margen analítico (el GC). Esos niveles mínimos son básicamente complementarios: cuanto mayor es uno, menor es el otro. La práctica reguladora actual impone un valor alto al margen de licencia, y eso supone que el GC exigido es pequeño. Adoptar valores menores para P0 supone una menor exigencia sobre el cumplimiento del CRA y, en cambio, una mayor exigencia sobre el GC de la metodología. Es importante destacar que cuanto mayor sea el valor mínimo del margen (de licencia o analítico), mayor es el coste computacional para demostrarlo. Así que los esfuerzos computacionales también son complementarios: si uno de los niveles es alto (lo que aumenta la exigencia en el cumplimiento del criterio correspondiente), aumenta el coste computacional. Si se adopta un valor medio de P0, el GC exigido también es medio, con lo que la metodología no tiene que ser muy conservadora, y el coste computacional total (licencia más validación) puede optimizarse. 
ABSTRACT Deterministic Safety Analysis (DSA) is the procedure used in the design of safety-related systems, structures and components of nuclear power plants (NPPs). DSA is based on computational simulations of a set of hypothetical accidents of the plant, named Design Basis Scenarios (DBS). Nuclear regulatory authorities require the calculation of a set of safety magnitudes, and define the regulatory acceptance criteria (RAC) that must be fulfilled by them. Methodologies for performing DSA can be categorized as conservative or realistic. Conservative methodologies make use of pessimistic models and assumptions, and are relatively simple. They do not need an uncertainty analysis of their results. Realistic methodologies are based on realistic (usually mechanistic) predictive models and assumptions, and need to be supplemented with uncertainty analyses of their results. They are also termed BEPU (“Best Estimate Plus Uncertainty”) methodologies, and are typically based on a probabilistic representation of the uncertainty. For conservative methodologies, the RAC are simply the restriction of calculated values of safety magnitudes to “acceptance regions” defined on their range. For BEPU methodologies, the RAC cannot be so simple, because the safety magnitudes are now uncertain. In the present Thesis, the inclusion of uncertainty in the RAC is studied. Basically, the restriction to the acceptance region must be fulfilled “with a high certainty level”. Specifically, a high probability of fulfillment is required. The calculation uncertainty of the magnitudes is considered as propagated from the inputs through the predictive model. Uncertain inputs include model empirical parameters, which store the uncertainty due to the model imperfection. The fulfillment of the RAC is required with a probability not less than a value P0 close to 1 and defined by the regulator (probability or coverage level). Calculation uncertainty is not the only one involved. Even if a model (i.e. 
the basic equations) is perfectly known, the input-output mapping it produces is imperfectly known (unless the model is very simple). This ignorance is called epistemic uncertainty, and it is associated with the propagation process; in fact, it is propagated to the probability of fulfilling the RAC. Another term used in the Thesis for this epistemic uncertainty is metauncertainty. The RAC must include the two types of uncertainty: one for the calculation of the magnitude (aleatory uncertainty); the other one, for the calculation of the probability (epistemic uncertainty). The two uncertainties can be taken into account in a separate fashion, or can be combined. In either case the RAC becomes a probabilistic criterion. If the uncertainties are separated, a second-order probability is used; if both are combined, a single probability is used. In the first case, the regulator must define a level of fulfillment for the epistemic uncertainty, termed the regulatory confidence level, as a value close to 1. The pair of regulatory levels (probability and confidence) is termed the regulatory tolerance level. The Thesis concludes that the adequate way of setting the BEPU RAC is by separating the uncertainties. There are two reasons to do so: experts recommend the separation of aleatory and epistemic uncertainty; and the separated RAC is in general more conservative than the combined RAC. The BEPU RAC is a hypothesis on a probability distribution, and must be statistically tested. The Thesis classifies the statistical methods to verify the RAC fulfillment into 3 categories: methods based on tolerance regions, on quantile estimators and on probability (of success or failure) estimators. The former two have been termed Q-methods, whereas those in the third category are termed P-methods. The purpose of our categorization is not to make an exhaustive survey of the very numerous existing methods. 
Rather, the goal is to relate the three categories and examine the most used methods from a regulatory standpoint. The most widely used method, due to Wilks, deserves special mention, together with its extension to multidimensional variables (due to Wald). The P-method counterpart of Wilks’ is the Clopper-Pearson interval, typically ignored in the BEPU realm. The problem of the computational cost of an uncertainty analysis is tackled. Wilks’, Wald’s and Clopper-Pearson’s methods require a minimum sample size, which is a growing function of the tolerance level. The sample size is an indicator of the computational cost, because each element of the sample must be calculated with the predictive models (codes). When the RAC is a multiple criterion, the safety magnitude becomes multidimensional. When all its components are outputs of the same calculation, the multidimensional character does not introduce additional computational cost. In this way, an extended belief in the BEPU realm, stating that the multi-D problem can only be tackled with the Wald extension, is proven to be false. When the components of the magnitude are independently calculated, the influence of the problem dimension on the cost cannot be avoided. The earliest BEPU methodologies performed the uncertainty propagation through a surrogate model of the code, also termed emulator or metamodel. The goal of a metamodel is not predictive capability, clearly inferior to that of the original code, but the capacity to propagate uncertainties at a lower computational cost. The emulator must contain the input parameters contributing the most to the output uncertainty, and this requires a previous importance analysis. The surrogate model is practically inexpensive to run, so that it can be exhaustively analyzed through Monte Carlo. Therefore, the epistemic uncertainty due to sampling is reduced to almost zero, and the BEPU RAC for metamodels includes a simple probability. 
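As a sketch of the P-method route (the numbers are illustrative, not taken from the Thesis), the Clopper-Pearson interval gives an exact confidence interval for the fulfillment probability from the number of code runs falling inside the acceptance region; it can be computed from Beta distribution quantiles, e.g. with SciPy:

```python
from scipy.stats import beta

def clopper_pearson(k, n, conf=0.95):
    """Exact (Clopper-Pearson) two-sided confidence interval for a
    success probability, from k successes observed in n independent runs."""
    alpha = 1.0 - conf
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2.0, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1.0 - alpha / 2.0, k + 1, n - k)
    return lo, hi

# 59 code runs, all of them inside the acceptance region; the lower
# endpoint of the two-sided 90% interval is the one-sided 95% lower bound
lo, hi = clopper_pearson(59, 59, conf=0.90)
```

With 59 runs and zero failures, the one-sided 95% lower bound on the fulfillment probability is about 0.95, recovering from the P-method side the Wilks 95/95 sample size of 59.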
The regulatory authority will tend to accept the use of statistical methods which need a minimum of assumptions: exact, nonparametric and frequentist methods rather than approximate, parametric and bayesian methods, respectively. The BEPU RAC is based on a second-order probability. The probability of the safety magnitudes being inside the acceptance region is a success probability and can be interpreted as a fulfillment degree of the RAC. Furthermore, it has a metric interpretation, as a distance (in the range of magnitudes) from the calculated values of the magnitudes to the acceptance regulatory limits. A probabilistic definition of safety margin (SM) is proposed in the Thesis. The SM from a value A to another value B of a safety magnitude is defined as the probability that A is less severe than B, obtained from the uncertainties of A and B. The probabilistic definition of SM has several advantages: it is nondimensional, it ranges in the interval (0,1) and it can be easily generalized to multiple dimensions. Furthermore, probabilistic SMs are combined according to the laws of probability. And a basic property: probabilistic SMs are not symmetric. There are several types of SM: the distance from a calculated value to a regulatory limit (licensing margin); from the real value to the calculated value of a magnitude (analytical margin); or from the regulatory limit to the damage threshold (barrier margin). These representations of distances (in the magnitudes’ range) as probabilities can be applied to the quantification of conservativeness. The analytical margin can be interpreted as the degree of conservativeness (DG) of the computational methodology. Conservativeness indicators are established in the Thesis, useful in the comparison of different methods of constructing tolerance limits and regions. There is a topic which has not been rigorously tackled to date: the validation of BEPU methodologies. 
Before being applied in licensing, methodologies must be validated, on the basis of comparisons between their predictions and real values of the safety magnitudes. Real data are obtained, basically, in experimental facilities. The ultimate goal of establishing RAC is to verify that real values (aside from calculated values) fulfill them. In the Thesis it is proved that a sufficient condition for this goal is the conjunction of 2 criteria: the BEPU RAC and an analogous criterion for validation. And this last criterion must be proved in experimental scenarios and extrapolated to NPPs. The licensing RAC requires a minimum value (P0) of the probabilistic licensing margin; the validation criterion requires a minimum value of the analytical margin (i.e., of the DG). These minimum values are basically complementary; the higher one of them, the lower the other one. Current regulatory practice sets a high value on the licensing margin, so that the required DG is low. The possible adoption of lower values for P0 would imply a weaker requirement on the RAC fulfillment and, on the other hand, a stronger requirement on the conservativeness of the methodology. It is important to highlight that a higher minimum value of the licensing or analytical margin requires a higher computational cost to demonstrate it. Therefore, the computational efforts are also complementary. If a medium value of P0 is adopted, the required DG is also medium, and the methodology does not need to be very conservative. The total computational effort (licensing plus validation) could be optimized.