962 results for data availability


Relevance: 60.00%

Abstract:

The Government of Japan, through the Institute for Cetacean Research (Tokyo), has established a DNA register for whales taken under special permit or otherwise destined for commercial markets (IWC 2005; IWC 2010a). The functionality of this DNA register, for the purposes of traceability/trackability, is critical to the current negotiations on the future of the IWC (IWC 2010b). Here we request access to the DNA register for three species of whales (fin, sei and Antarctic minke) for the purposes of tracking the origins of whale products purchased at commercial outlets in Seoul, South Korea, and Santa Monica, US, as described in Baker et al. (2010). The attached proposal was included as Supplementary Material to this published article and submitted for consideration to the IWC Data Availability Group (DAG) on 12 April 2010. However, the DAG declined to forward the proposal to the data holders, recommending that we "wait until the Scientific Committee has reviewed the proposed DNA register/market sampling text in the draft Consensus Decision in accordance with the Commission's instructions and then reported to the Commission itself" (email 16 May 2010). We assume that this will take place at SC/62 in Agadir and request that this proposal be considered for endorsement by the DNA subcommittee.

Relevance: 60.00%

Abstract:

The fall of the Berlin Wall opened the way for a reform path – the transition process – that led ten former socialist countries in Central and South-Eastern Europe to knock at the EU's door. However, at the time of EU accession several economic and structural weaknesses remained. A tendency towards convergence between the new Member States (NMS) and the EU average income level emerged, together with a spread of inequality at the sub-regional level, mainly driven by the backwardness of the agricultural and rural areas. Considerable progress has been made in evaluating policies for rural areas, but a shared definition of rurality is still missing. Numerous indicators have been calculated for assessing the effectiveness of the Common Agricultural Policy (CAP) and the Rural Development Policy. Previous analyses of the Central and Eastern European countries found that the characteristics of the most backward areas were insufficiently addressed by the policies enacted; low data availability and accountability at the sub-regional level, together with deficiencies in institutional planning and implementation, represented an obstacle to targeting policies and payments. The following pages aim at providing a basis for understanding the connections between the peculiarities of the transition process, the current development performance of the NMS and the role of the EU, with particular attention to agricultural and rural areas. Applying a mixed methodological approach (multivariate statistics, non-parametric methods, spatial econometrics), this study contributes to the identification of rural areas and to the analysis of the changes that occurred during EU membership in Hungary, assessing the effect of the introduction of the CAP and its contribution to the convergence of Hungarian agricultural and rural areas. The author believes that more targeted – and therefore more efficient – policies for agricultural and rural areas require a deeper knowledge of their structural and dynamic characteristics.
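One of the tools named above, spatial econometrics, typically starts from a spatial-autocorrelation statistic such as global Moran's I computed over sub-regional units. The following is a minimal, generic sketch of that statistic, with an invented contiguity matrix and indicator values; it is not taken from the thesis' data or code.

```python
# Global Moran's I for spatial autocorrelation of a sub-regional indicator,
# using a made-up rook-contiguity matrix of four regions in a row.
import numpy as np

def morans_i(x: np.ndarray, w: np.ndarray) -> float:
    z = np.asarray(x, dtype=float) - np.mean(x)
    numerator = (w * np.outer(z, z)).sum()      # weighted cross-products
    return len(x) / w.sum() * numerator / (z @ z)

w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)       # neighbourhood (contiguity) weights
income = np.array([10.0, 11.0, 20.0, 21.0])     # hypothetical sub-regional indicator

print(f"Moran's I = {morans_i(income, w):.2f}")  # positive: spatial clustering
```

A clearly positive value indicates that similar (for instance, lagging rural) regions cluster in space, which is the kind of pattern the identification of rural areas builds on.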

Relevance: 60.00%

Abstract:

Heart diseases are the leading cause of death worldwide, for both men and women. However, the ionic mechanisms underlying many cardiac arrhythmias and genetic disorders are not completely understood, leading to limited efficacy of the currently available therapies and leaving many open questions for cardiac electrophysiologists. On the other hand, experimental data availability is still a great issue in this field: most experiments are performed in vitro and/or using animal models (e.g. rabbit, dog and mouse), even when the final aim is to better understand the electrical behaviour of the in vivo human heart in either physiological or pathological conditions. Computational modelling constitutes a primary tool in cardiac electrophysiology: in silico simulations, based on the available experimental data, may help to understand the electrical properties of the heart and the ionic mechanisms underlying a specific phenomenon. Once validated, mathematical models can be used for making predictions and testing hypotheses, thus suggesting potential therapeutic targets. This PhD thesis aims to apply computational cardiac modelling of the human single-cell action potential (AP) to three clinical scenarios, in order to gain new insights into the ionic mechanisms involved in the electrophysiological changes observed in vitro and/or in vivo. The first context is blood electrolyte variations, which may occur in patients due to different pathologies and/or therapies. In particular, we focused on extracellular Ca2+ and its effect on the AP duration (APD). The second context is haemodialysis (HD) therapy: in addition to blood electrolyte variations, patients undergo many other changes during HD, e.g. in heart rate, cell volume, pH, and sympatho-vagal balance. The third context is human hypertrophic cardiomyopathy (HCM), a genetic disorder characterised by an increased arrhythmic risk and still lacking a specific pharmacological treatment.
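As an illustration of the in silico workflow described here (define a cell model as ordinary differential equations, integrate it, then derive AP markers), the sketch below uses the two-variable FitzHugh-Nagumo model as a stand-in; the human AP models used in the thesis are far more detailed, and the parameters below are generic textbook values, not human cardiomyocyte data.

```python
# Toy excitable-cell model (FitzHugh-Nagumo) as a stand-in for detailed human
# action-potential models; parameters are generic, not cardiomyocyte-specific.
import numpy as np
from scipy.integrate import solve_ivp

def fhn(t, y, a=0.7, b=0.8, tau=12.5, i_stim=0.5):
    v, w = y
    dv = v - v**3 / 3.0 - w + i_stim   # fast "membrane potential" variable
    dw = (v + a - b * w) / tau         # slow recovery variable
    return [dv, dw]

sol = solve_ivp(fhn, (0.0, 200.0), [-1.0, 1.0], max_step=0.1)

# Crude AP marker: fraction of simulated time spent above a firing threshold.
frac_above = float(np.mean(sol.y[0] > 0.0))
print(f"fraction of time above threshold: {frac_above:.2f}")
```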

Relevance: 60.00%

Abstract:

Climate monitoring requires an operational, spatio-temporal analysis of climate variability. With the goal of producing ready-to-use maps on a regular basis, it is helpful to show at a glance the spatial variability of the climate elements and their changes over time. For the current and recent years, the Deutscher Wetterdienst developed a standard procedure for producing such maps. The mapping method varies for the different climate elements, depending on the data basis, the natural variability and the availability of in-situ data. Within the analysis of spatio-temporal variability in this dissertation, different interpolation methods are applied to the mean temperature of the five decades of the years 1951-2000 for a relatively large area, Region VI of the World Meteorological Organization (Europe and the Middle East). Climatologically, the region covers a rather heterogeneous study area from Greenland in the north-west to Syria in the south-east. The central aim of the dissertation is to develop a method for the spatial interpolation of mean decadal temperature values for Region VI. This method is intended to be suitable for operational monthly climate mapping in the future. The unified method should be transferable to other climate elements and, with the appropriate software, applicable anywhere. Two central databases are used in this dissertation: so-called CLIMAT data over land and ship data over the sea. Essentially, the transfer of the temperature point values to the area by spatial interpolation is carried out in three steps. The first step involves a multiple regression that reduces the station values to a common level using the four predictors geographical latitude, elevation above sea level, annual temperature amplitude and thermal continentality. In the second step, the reduced temperature values, the so-called residuals, are interpolated with the radial basis function method from the group of neural network models (NNM). In the last step, the interpolated temperature grids are scaled back to their original level by inverting the multiple regression from step one with the help of the four predictors. For all station values, the difference between the value estimated by the interpolation and the true measured value is computed and reported via the geostatistical measure of the root mean square error (RMSE). The central advantages are the faithful reproduction of the values, the absence of generalization and the avoidance of interpolation islands. The developed procedure is transferable to other climate elements such as precipitation, snow depth or sunshine duration.
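The three steps can be prototyped in a few lines. The sketch below uses synthetic station data and generic library routines (scikit-learn for the multiple regression, SciPy's radial basis functions) rather than the operational DWD implementation; all values are invented.

```python
# Minimal sketch of the three-step scheme: regression reduction, RBF
# interpolation of residuals, inverse regression. Data are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(42)
n = 300
coords = rng.uniform([-60.0, 30.0], [50.0, 80.0], size=(n, 2))   # lon, lat
predictors = np.column_stack([
    coords[:, 1],                      # geographical latitude
    rng.uniform(0.0, 2000.0, n),       # elevation above sea level [m]
    rng.uniform(10.0, 40.0, n),        # annual temperature amplitude [K]
    rng.uniform(0.0, 100.0, n),        # thermal continentality index
])
t_obs = 40.0 - 0.6 * predictors[:, 0] - 0.0065 * predictors[:, 1] \
        + rng.normal(0.0, 0.8, n)      # synthetic decadal mean temperature [deg C]

train, test = np.arange(n) < 250, np.arange(n) >= 250

# Step 1: multiple regression reduces station values to a common level.
reg = LinearRegression().fit(predictors[train], t_obs[train])
residuals = t_obs[train] - reg.predict(predictors[train])

# Step 2: interpolate the residuals with radial basis functions.
rbf = RBFInterpolator(coords[train], residuals, kernel="thin_plate_spline")

# Step 3: add the regression part back at the target locations.
t_est = rbf(coords[test]) + reg.predict(predictors[test])

# Validation with the RMSE between estimated and measured station values.
rmse = float(np.sqrt(np.mean((t_est - t_obs[test]) ** 2)))
print(f"RMSE at held-out stations: {rmse:.2f} K")
```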

Relevance: 60.00%

Abstract:

Advances in information technology and global data availability have opened the door for assessments of sustainable development at a truly macro scale. It is now fairly easy to conduct a study of sustainability using the entire planet as the unit of analysis; this is precisely what this work set out to accomplish. The study began by examining some of the best known composite indicator frameworks developed to measure sustainability at the country level today. Most of these were found to value human development factors and a clean local environment, but to gravely overlook consumption of (remote) resources in relation to nature’s capacity to renew them, a basic requirement for a sustainable state. Thus, a new measuring standard is proposed, based on the Global Sustainability Quadrant approach. In a two‐dimensional plot of nations’ Human Development Index (HDI) vs. their Ecological Footprint (EF) per capita, the Sustainability Quadrant is defined by the area where both dimensions satisfy the minimum conditions of sustainable development: an HDI score above 0.8 (considered ‘high’ human development), and an EF below the fair Earth‐share of 2.063 global hectares per person. After developing methods to identify those countries that are closest to the Quadrant in the present‐day and, most importantly, those that are moving towards it over time, the study tackled the question: what indicators of performance set these countries apart? To answer this, an analysis of raw data, covering a wide array of environmental, social, economic, and governance performance metrics, was undertaken. The analysis used country rank lists for each individual metric and compared them, using the Pearson Product Moment Correlation function, to the rank lists generated by the proximity/movement relative to the Quadrant measuring methods. The analysis yielded a list of metrics which are, with a high degree of statistical significance, associated with proximity to – and movement towards – the Quadrant; most notably: Favorable for sustainable development: use of contraception, high life expectancy, high literacy rate, and urbanization. Unfavorable for sustainable development: high GDP per capita, high language diversity, high energy consumption, and high meat consumption. A momentary gain, but a burden in the long‐run: high carbon footprint and debt. These results could serve as a solid stepping stone for the development of more reliable composite index frameworks for assessing countries’ sustainability.
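A minimal sketch of the Quadrant logic follows: flag countries that satisfy both thresholds (HDI above 0.8, EF below 2.063 gha per person) and correlate a metric's country ranking with a ranking by distance to the Quadrant. The country values and the simple distance measure are illustrative assumptions, not the study's actual data or proximity method.

```python
# Sustainability Quadrant membership plus a rank correlation between a
# performance metric and distance to the Quadrant (all values invented).
import numpy as np
from scipy.stats import pearsonr, rankdata

hdi = np.array([0.94, 0.76, 0.88, 0.70, 0.81])
ef = np.array([8.0, 1.9, 4.5, 1.1, 2.0])           # global hectares per capita
metric = np.array([78.0, 45.0, 60.0, 30.0, 55.0])  # e.g. literacy rate

in_quadrant = (hdi > 0.8) & (ef < 2.063)

# Simple distance to the Quadrant: zero inside, Euclidean shortfall outside.
dist = np.hypot(np.maximum(0.8 - hdi, 0.0), np.maximum(ef - 2.063, 0.0))

r, p = pearsonr(rankdata(dist), rankdata(metric))
print("inside Quadrant:", in_quadrant)
print(f"rank correlation with distance: r = {r:.2f} (p = {p:.2f})")
```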

Relevance: 60.00%

Abstract:

Information management and geoinformation systems (GIS) have become indispensable in a large majority of protected areas all over the world. These tools are used for management purposes as well as for research and in recent years have become even more important for visitor information, education and communication. This study is divided into two parts: the first part provides a general overview of GIS and information management in a selected number of national park organizations. The second part lists and evaluates the needs of evolving large protected areas in Switzerland. The results show a wide use of GIS and information management tools in well established protected areas. The more isolated use of singular GIS tools has increasingly been replaced by an integrated geoinformation management. However, interview partners pointed out that human resources for GIS in most parks are limited. The interviews also highlight uneven access to national geodata. The view of integrated geoinformation management is not yet fully developed in the park projects in Switzerland. Short-term needs, such as software and data availability, motivate a large number of responses collected within an exhaustive questionnaire. Nevertheless, the need for coordinated action has been identified and should be followed up. The park organizations in North America show how an effective coordination and cooperation might be organized.

Relevance: 60.00%

Abstract:

The management of warehouse inventories in companies has to meet substantial requirements regarding data availability, security and consistency. Today this is ensured by centralized data storage in warehouse management systems. On the other hand, in many areas (e.g. material flow and transport control, production control) a development trend towards decentralized control strategies can be observed, which promise increased flexibility and reduced complexity. Within a project funded by the Deutsche Forschungsgemeinschaft (DFG), this contribution presents and discusses concepts for the distributed design of warehouse management systems.

Relevance: 60.00%

Abstract:

Extending phenological records into the past is essential for understanding past ecological change and evaluating the effects of climate change on ecosystems. A growing body of historical phenological information is now available for Europe, North America, and Asia. In East Asia, long-term phenological series are still relatively scarce. This study extracted plant phenological observations from old diaries covering the period 1834–1962. A spring phenology index (SPI) for the modern period (1963–2009) was defined as the mean flowering time of three shrubs (first flowering of Amygdalus davidiana and Cercis chinensis, 50% of full flowering of Paeonia suffruticosa), according to data availability. Applying calibrated transfer functions from the modern period to the historical data, we reconstructed a continuous SPI time series across eastern China from 1834 to 2009. In the most recent 30 years, the SPI is 2.1–6.3 days earlier than during any other consecutive 30-year period before 1970. A moving linear trend analysis shows that the advancing trend of the SPI over the past three decades reaches upward of 4.1 d/decade, which exceeds all trends observed in any earlier 30-year period. In addition, the SPI series correlates significantly with spring (February to April) temperatures in the study area, with an increase in spring temperature of 1°C advancing the SPI by 3.1 days. These shifts of the SPI provide important information regarding regional vegetation-climate relationships, and they are helpful for assessing the long-term impacts of climate change on biophysical systems and biodiversity.
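The SPI construction and its temperature sensitivity reduce to a mean over the indicator species followed by a linear regression. The sketch below uses synthetic flowering dates built with roughly the reported -3.1 d/°C sensitivity, so the numbers are illustrative only, not the study's data.

```python
# Build an SPI from per-species flowering dates and regress it on Feb-Apr
# mean temperature; all series are synthetic.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1963, 2010)
spring_t = 7.5 + 0.03 * (years - years[0]) + rng.normal(0, 0.8, years.size)  # deg C

# Flowering day of year for the three indicator species, with a built-in
# sensitivity of about -3.1 days per degree of spring warming.
species_base = np.array([92.0, 97.0, 112.0])
flowering_doy = species_base + (-3.1) * (spring_t[:, None] - spring_t.mean()) \
                + rng.normal(0, 2.0, (years.size, 3))

spi = flowering_doy.mean(axis=1)             # spring phenology index per year
slope = np.polyfit(spring_t, spi, 1)[0]
print(f"SPI sensitivity: {slope:.1f} days per deg C")
```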

Relevance: 60.00%

Abstract:

One of the current challenges in evolutionary ecology is understanding the long-term persistence of contemporary-evolving predator–prey interactions across space and time. To address this, we developed an extension of a multi-locus, multi-trait eco-evolutionary individual-based model that incorporates several interacting species in explicit landscapes. We simulated eco-evolutionary dynamics of multiple species food webs with different degrees of connectance across soil-moisture islands. A broad set of parameter combinations led to the local extinction of species, but some species persisted, and this was associated with (1) high connectance and omnivory and (2) ongoing evolution, due to multi-trait genetic variability of the embedded species. Furthermore, persistence was highest at intermediate island distances, likely because of a balance between predation-induced extinction (strongest at short island distances) and the coupling of island diversity by top predators, which by travelling among islands exert global top-down control of biodiversity. In the simulations with high genetic variation, we also found widespread trait evolutionary changes indicative of eco-evolutionary dynamics. We discuss how the ever-increasing computing power and high-resolution data availability will soon allow researchers to start bridging the in vivo–in silico gap.
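Connectance, one of the two factors associated with persistence here, is simply the number of realized feeding links divided by the number of possible links (L/S²). The toy adjacency-matrix example below is unrelated to the simulated food webs of the study.

```python
# Connectance of a toy food web from its adjacency matrix (row eats column).
import numpy as np

adjacency = np.array([
    [0, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
], dtype=int)

s = adjacency.shape[0]          # number of species
l = int(adjacency.sum())        # number of realized links
print(f"S = {s}, L = {l}, connectance = {l / s**2:.2f}")
```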

Relevance: 60.00%

Abstract:

Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the climate policy process, and project future climate change. Present-day analysis requires the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. Here we describe datasets and a methodology developed by the global carbon cycle science community to quantify all major components of the global carbon budget, including their uncertainties. We discuss changes compared to previous estimates, consistency within and among components, and methodology and data limitations. Based on energy statistics, we estimate that the global emissions of CO2 from fossil fuel combustion and cement production were 9.5 ± 0.5 PgC yr−1 in 2011, 3.0% above 2010 levels. We project these emissions will increase by 2.6% (1.9–3.5%) in 2012 based on projections of Gross World Product and recent changes in the carbon intensity of the economy. Global net CO2 emissions from land-use change, including deforestation, are more difficult to update annually because of data availability, but combined evidence from land cover change data, fire activity in regions undergoing deforestation, and models suggests that those net emissions were 0.9 ± 0.5 PgC yr−1 in 2011. The global atmospheric CO2 concentration is measured directly and reached 391.38 ± 0.13 ppm at the end of 2011, increasing by 1.70 ± 0.09 ppm yr−1 or 3.6 ± 0.2 PgC yr−1 in 2011. Estimates from four ocean models suggest that the ocean CO2 sink was 2.6 ± 0.5 PgC yr−1 in 2011, implying a global residual terrestrial CO2 sink of 4.1 ± 0.9 PgC yr−1. All uncertainties are reported as ±1 sigma (68% confidence, assuming Gaussian error distributions, that the real value lies within the given interval), reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. This paper is intended to provide a baseline to keep track of annual carbon budgets in the future.
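The residual terrestrial sink quoted above follows from closing the budget with the other four terms. A quick check of that arithmetic, with uncertainties combined in quadrature, reproduces the reported values up to the rounding of the published inputs.

```python
# Closing the 2011 global carbon budget from the figures quoted above
# (values in PgC/yr; uncertainties are 1-sigma, combined in quadrature).
import math

fossil, d_fossil = 9.5, 0.5        # fossil fuel and cement emissions
luc, d_luc = 0.9, 0.5              # net land-use change emissions
atm, d_atm = 3.6, 0.2              # atmospheric growth
ocean, d_ocean = 2.6, 0.5          # ocean sink

land = fossil + luc - atm - ocean
d_land = math.sqrt(d_fossil**2 + d_luc**2 + d_atm**2 + d_ocean**2)
print(f"residual terrestrial sink: {land:.1f} +/- {d_land:.1f} PgC/yr")
# -> 4.2 +/- 0.9; the abstract reports 4.1 +/- 0.9, the small difference
#    coming from the rounding of the published input values.
```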

Relevance: 60.00%

Abstract:

The Gravity field and steady-state Ocean Circulation Explorer (GOCE) was the first Earth Explorer core mission of the European Space Agency. It was launched on March 17, 2009 into a Sun-synchronous dusk-dawn orbit and re-entered the Earth's atmosphere on November 11, 2013. The satellite altitude was between 255 and 225 km during the measurement phases. The European GOCE Gravity Consortium is responsible for the Level 1b to Level 2 data processing in the frame of the GOCE High-level Processing Facility (HPF). The Precise Science Orbit (PSO) is one Level 2 product, which was produced under the responsibility of the Astronomical Institute of the University of Bern within the HPF. This PSO product has been continuously delivered during the entire mission. Regular checks guaranteed a high consistency and quality of the orbits. A correlation between solar activity, GPS data availability and the quality of the orbits was found; the accuracy of the kinematic orbit suffers primarily from this. Improvements in modelling the range corrections at the retro-reflector array for the SLR measurements were made and implemented in the independent SLR validation of the GOCE PSO products. The satellite laser ranging (SLR) validation finally states an orbit accuracy of 2.42 cm for the kinematic and 1.84 cm for the reduced-dynamic orbits over the entire mission. The common-mode accelerations from the GOCE gradiometer were not used for the official PSO product, but in addition to the operational HPF work a study was performed to investigate to what extent common-mode accelerations improve the reduced-dynamic orbit determination results. The accelerometer data may be used to derive realistic constraints for the empirical accelerations estimated in the reduced-dynamic orbit determination, which already improves the orbit quality. On top of that, the accelerometer data may further improve the orbit quality if realistic constraints and state-of-the-art background models such as gravity field and ocean tide models are used for the reduced-dynamic orbit determination.
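The SLR validation figures quoted above are, in essence, the RMS of observed-minus-computed range residuals at the laser stations. A generic sketch of that metric with synthetic residuals (not GOCE data) is shown below.

```python
# RMS of synthetic SLR residuals (observed minus computed ranges, in metres),
# as a generic stand-in for the orbit validation metric described above.
import numpy as np

residuals_m = np.random.default_rng(7).normal(0.0, 0.02, 500)
rms_cm = 100.0 * float(np.sqrt(np.mean(residuals_m**2)))
print(f"SLR residual RMS: {rms_cm:.2f} cm")
```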

Relevance: 60.00%

Abstract:

The European Registry for Patients with Mechanical Circulatory Support (EUROMACS) was founded on 10 December 2009 on the initiative of Roland Hetzer (Deutsches Herzzentrum Berlin, Berlin, Germany) and Jan Gummert (Herz- und Diabeteszentrum Nordrhein-Westfalen, Bad Oeynhausen, Germany), together with 15 other founding international members. It aims to promote scientific research to improve the care of end-stage heart failure patients with a ventricular assist device or a total artificial heart as long-term mechanical circulatory support. Likewise, the organization aims to provide and maintain a registry of device implantation data and long-term follow-up of patients with mechanical circulatory support. Hence, EUROMACS affiliated itself with Dendrite Clinical Systems Ltd to offer its members a software tool that allows input and analysis of patient clinical data on a daily basis. EUROMACS facilitates further scientific studies by offering research groups access to any available data, with patients and centres anonymized. Furthermore, EUROMACS aims to stimulate cooperation with clinical and research institutions and with the peer associations involved to further its aims. EUROMACS is the only European-based registry for patients with mechanical circulatory support, with a rapidly increasing institutional and individual membership. Because of the expeditious data input, the European Association for Cardio-Thoracic Surgery saw the need to optimize the data availability and the significance of the registry to improve the care of patients with mechanical circulatory support and its potential contribution to scientific intents; hence the beginning of their alliance in 2012. This first annual report is designed to provide an overview of EUROMACS' structure, its activities, a first data collection and an insight into its scientific contributions.

Relevance: 60.00%

Abstract:

Atoll islands are subject to a variety of processes that influence their geomorphological development. Analysis of historical shoreline changes using remotely sensed images has become an efficient approach to both quantify past changes and estimate future island response. However, the detection of long-term changes in beach width is challenging mainly for two reasons: first, data availability is limited for many remote Pacific islands. Second, beach environments are highly dynamic and strongly influenced by seasonal or episodic shoreline oscillations. Consequently, remote-sensing studies on beach morphodynamics of atoll islands deal with dynamic features covered by a low sampling frequency. Here we present a study of beach dynamics for nine islands on Takú Atoll, Papua New Guinea, over a seven-decade period. A considerable chronological gap between aerial photographs and satellite images was addressed by applying a new method that reweighted positions of the beach limit by identifying "outlier" shoreline positions. On top of natural beach variability observed along the reweighted beach sections, we found that one third of the analyzed islands show a statistically significant decrease in reweighted beach width since 1943. The total loss of beach area for all islands corresponds to 44% of the initial beach area. Variable shoreline trajectories suggest that changes in beach width on Takú Atoll are dependent on local control (that is, human activity and longshore sediment transport). Our results show that remote imagery with a low sampling frequency may be sufficient to characterize prominent morphological changes in planform beach configuration of reef islands.
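The idea of downweighting outlier shoreline positions before fitting a long-term trend can be illustrated with a generic robust regression. The Huber estimator and the beach-width values below are stand-ins, not the reweighting method or data of the Takú Atoll study.

```python
# Robust trend in beach width with an outlier-resistant (Huber) regression;
# the time series is invented, including one deliberately anomalous survey.
import numpy as np
from sklearn.linear_model import HuberRegressor

years = np.array([1943, 1974, 1983, 1996, 2004, 2009, 2012]).reshape(-1, 1)
beach_width_m = np.array([38.0, 36.0, 51.0, 31.0, 29.0, 27.0, 26.0])  # 1983 is an outlier

model = HuberRegressor().fit(years, beach_width_m)
print(f"robust beach-width trend: {model.coef_[0]:.2f} m per year")
```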

Relevance: 60.00%

Abstract:

In just a few years cloud computing has become a very popular paradigm and a business success story, with storage being one of the key features. To achieve high data availability, cloud storage services rely on replication. In this context, one major challenge is data consistency. In contrast to traditional approaches that are mostly based on strong consistency, many cloud storage services opt for weaker consistency models in order to achieve better availability and performance. This comes at the cost of a high probability of stale data being read, as the replicas involved in the reads may not always have the most recent write. In this paper, we propose a novel approach, named Harmony, which adaptively tunes the consistency level at run-time according to the application requirements. The key idea behind Harmony is an intelligent estimation model of stale reads, allowing the system to elastically scale up or down the number of replicas involved in read operations to maintain a low (possibly zero) tolerable fraction of stale reads. As a result, Harmony can meet the desired consistency of the applications while achieving good performance. We have implemented Harmony and performed extensive evaluations with the Cassandra cloud storage system on the Grid'5000 testbed and on Amazon EC2. The results show that Harmony can achieve good performance without exceeding the tolerated number of stale reads. For instance, in contrast to the static eventual consistency used in Cassandra, Harmony reduces the stale data being read by almost 80% while adding only minimal latency. Meanwhile, it improves the throughput of the system by 45% compared to the strong consistency model in Cassandra, while maintaining the desired consistency requirements of the applications.
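The adaptive core of such an approach can be sketched as a small control loop that widens or narrows the set of replicas contacted per read according to an estimated stale-read fraction. The function and thresholds below are illustrative and much simpler than Harmony's estimation model or its Cassandra integration.

```python
# Illustrative control loop: contact more replicas per read while the
# estimated stale-read fraction exceeds the application's tolerance,
# relax back towards eventual consistency when there is slack.
def choose_read_replicas(stale_estimate: float, tolerance: float,
                         current: int, total_replicas: int) -> int:
    if stale_estimate > tolerance and current < total_replicas:
        return current + 1          # stronger reads: one more replica
    if stale_estimate < 0.5 * tolerance and current > 1:
        return current - 1          # cheaper reads: fewer replicas suffice
    return current

# Example: 3 replicas, the application tolerates at most 2% stale reads.
replicas_to_read = 1
for estimate in [0.005, 0.03, 0.05, 0.01, 0.002]:
    replicas_to_read = choose_read_replicas(estimate, 0.02, replicas_to_read, 3)
    print(f"stale estimate {estimate:.3f} -> read from {replicas_to_read} replica(s)")
```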

Relevance: 60.00%

Abstract:

There is a growing concern about catastrophes of natural origin that are yet to come, and studies are therefore being carried out in almost every branch of science. Even though it is not the only motive, fear that upcoming events might jeopardize human activities is the main one. A forking effect is therefore heavily present, even in such basic concepts as what is to be considered or how it should be named and catalogued; as a consequence, methods for understanding natural risks also show great differences, and a truly multidisciplinary approach has seldom been followed. Some efforts have been made to create a common understanding of the matter; the "Floods Directive" or, more recently, the Inspire Directive are two examples. The insurance and reinsurance sector is an important actor among the many involved. Its interest lies in the fact that, eventually, it pays most of the bill, if not all of it. But how much that bill could amount to is not an easy question to answer even in a very specific case, and yet it is the question constantly posed by decision makers at all levels. This document summarizes research activities that have been carried out in order to lay solid ground to build on, implementing numerical approaches capable of coping with some of the most relevant issues found in almost all natural risk studies and testing concepts pragmatically. To do so, an experimental site was selected according to different criteria, such as population density, the ease of providing clear geographical boundaries, the presence of three of the most important geological processes (floods, earthquakes and volcanism) and data availability. The model proposed here takes advantage of very different data sources for the assessment of hazard, pointing out how a multidisciplinary approach is needed, and uses a single unified, independent, consistent and homogeneous (not objective-driven) catalogue for assessing property value. Data is exploited differently for each hazard type, but the underlying concepts remain the same. During this research, a deep detachment was found between actual losses and hazard probabilities, contrary to what has been thought to be the most likely behaviour of natural hazards, proving that risk studies have a very limited lifespan. Partially because of this finding, the model in this study addresses scenarios with a fixed probability of occurrence, as opposed to the usually proposed study of a continuous hazard function. Another reason for studying scenarios was to force the model to provide a credible figure of maximum damage once a set of parameters, such as the spatial location of an event and its probability, was fixed, so that the "worst case" of a given return period could be found.
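Working with scenarios of a given return period ties directly to simple exceedance arithmetic: the chance that a T-year event occurs at least once within a planning horizon of n years. The sketch below shows only that generic conversion, not the thesis' model.

```python
# Probability of at least one occurrence of a T-year event within n years,
# assuming independent years (a generic illustration, not the thesis model).
def prob_at_least_once(return_period_years: float, horizon_years: int) -> float:
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** horizon_years

# Example: a 100-year flood scenario over a 30-year horizon -> about 26%.
print(f"{prob_at_least_once(100, 30):.1%}")
```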