179 results for Calibration uncertainty

in CentAUR: Central Archive, University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made more straightforward the synoptic measurement of water surface elevations along flood waterlines, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1 in 5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. A result of this was that there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. 
The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may lead to an increased onus being placed on the model developer in the production of a valid model.
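The GLUE weighting described above can be sketched as follows; this is a minimal illustration, assuming each model run supplies a binary wet/dry map plus modelled heights at the observed waterline points. The function names are hypothetical, and the linear weighting is only one of many likelihood choices used in GLUE analyses:

```python
import numpy as np

def waterline_rmse(obs_heights, model_heights):
    """RMS height difference between corresponding points on the
    observed and modelled waterlines (the height-based measure)."""
    d = np.asarray(obs_heights, dtype=float) - np.asarray(model_heights, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def glue_inundation_map(runs, obs_heights, threshold):
    """GLUE-style weighting: runs whose waterline RMSE is below
    `threshold` are behavioural; their binary wet/dry maps are combined
    into a weighted inundation-probability map. `runs` is a list of
    (wet_dry_map, model_heights) pairs."""
    maps, weights = [], []
    for wet_dry, heights in runs:
        rmse = waterline_rmse(obs_heights, heights)
        if rmse < threshold:                      # behavioural run
            maps.append(np.asarray(wet_dry, dtype=float))
            weights.append(threshold - rmse)      # simple linear weighting
    if not maps:
        raise ValueError("no behavioural runs below this threshold")
    w = np.array(weights) / np.sum(weights)
    return np.tensordot(w, np.stack(maps), axes=1)
```

A more sensitive measure rejects more runs at a given threshold, which is exactly how the height-based criterion restricts the acceptable parameter range.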

Relevance:

30.00%

Publisher:

Abstract:

A new dynamic model of water quality, Q2, has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q2 for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q2 model has been applied here to the River Thames, but it has a broad utility for evaluating other systems in Europe and around the world.
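A generalized sensitivity analysis in the Hornberger-Spear style can be sketched as follows; this is a minimal illustration with hypothetical names, splitting Monte Carlo runs into behavioural and non-behavioural groups by score and ranking parameters by the two-sample Kolmogorov-Smirnov distance between the groups' marginal distributions:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: maximum distance
    between the empirical CDFs of samples `a` and `b`."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

def gsa_sensitivity(params, scores, quantile=0.5):
    """Rank parameters by KS distance between behavioural and
    non-behavioural groups. `params` is (n_runs, n_params); runs with
    `scores` above the chosen quantile count as behavioural."""
    params = np.asarray(params, dtype=float)
    scores = np.asarray(scores, dtype=float)
    behav = scores >= np.quantile(scores, quantile)
    return [ks_statistic(params[behav, j], params[~behav, j])
            for j in range(params.shape[1])]
```

A parameter with a KS distance near 1 strongly controls model behaviour; one near 0 is insensitive and hence non-identifiable from the data.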

Relevance:

30.00%

Publisher:

Abstract:

New radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98. The new calibration data sets extend an additional 2000 yr, from 0-26 cal kyr BP (Before Present, 0 cal BP = AD 1950), and provide much higher resolution, greater precision, and more detailed structure than IntCal98. For the Marine04 curve, dendrochronologically-dated tree-ring samples, converted with a box diffusion model to marine mixed-layer ages, cover the period from 0-10.5 cal kyr BP. Beyond 10.5 cal kyr BP, high-resolution marine data become available from foraminifera in varved sediments and U/Th-dated corals. The marine records are corrected with site-specific C-14 reservoir age information to provide a single global marine mixed-layer calibration from 10.5-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a random walk model, which takes into account the uncertainty in both the calendar age and the C-14 age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed here. The tree-ring data sets, sources of uncertainty, and regional offsets are presented in detail in a companion paper by Reimer et al. (this issue).

Relevance:

30.00%

Publisher:

Abstract:

A new calibration curve for the conversion of radiocarbon ages to calibrated (cal) ages has been constructed and internationally ratified to replace IntCal98, which extended from 0-24 cal kyr BP (Before Present, 0 cal BP = AD 1950). The new calibration data set for terrestrial samples extends from 0-26 cal kyr BP, but with much higher resolution beyond 11.4 cal kyr BP than IntCal98. Dendrochronologically-dated tree-ring samples cover the period from 0-12.4 cal kyr BP. Beyond the end of the tree rings, data from marine records (corals and foraminifera) are converted to the atmospheric equivalent with a site-specific marine reservoir correction to provide terrestrial calibration from 12.4-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a coherent statistical approach based on a random walk model, which takes into account the uncertainty in both the calendar age and the C-14 age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The tree-ring data sets, sources of uncertainty, and regional offsets are discussed here. The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed in brief, but details are presented in Hughen et al. (this issue a). We do not make a recommendation for calibration beyond 26 cal kyr BP at this time; however, potential calibration data sets are compared in another paper (van der Plicht et al., this issue).
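The random walk construction can be illustrated with a one-dimensional state-space sketch: the underlying curve is modelled as a random walk observed with noise, and a Kalman filter plus RTS smoother recovers it. This is a deliberate simplification of the published method, which also propagates calendar-age uncertainty; names and the diffuse initialization are illustrative:

```python
import numpy as np

def random_walk_curve(obs, obs_var, q):
    """Kalman filter plus RTS smoother for a random-walk state observed
    with noise: x_t = x_{t-1} + w_t, w_t ~ N(0, q); y_t = x_t + v_t,
    v_t ~ N(0, obs_var[t]). Returns the smoothed curve estimate."""
    n = len(obs)
    xf = np.zeros(n); pf = np.zeros(n)   # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)   # one-step predictions
    x, p = obs[0], 1e9                   # diffuse prior on the start
    for t in range(n):
        xp[t], pp[t] = x, p + (q if t > 0 else 0.0)
        k = pp[t] / (pp[t] + obs_var[t])           # Kalman gain
        x = xp[t] + k * (obs[t] - xp[t])
        p = (1.0 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs = xf.copy()
    for t in range(n - 2, -1, -1):                 # RTS backward pass
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs
```

The process variance `q` plays the role of the random walk's step size: small `q` gives a stiff curve, large `q` lets the curve follow the noisy determinations closely.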

Relevance:

30.00%

Publisher:

Abstract:

Water quality models generally require a relatively large number of parameters to define their functional relationships, and since prior information on parameter values is limited, these are commonly defined by fitting the model to observed data. In this paper, the identifiability of water quality parameters and the associated uncertainty in model simulations are investigated. A modification to the water quality model 'Quality Simulation Along River Systems' is presented in which an improved flow component is used within the existing water quality model framework. The performance of the model is evaluated in an application to the Bedford Ouse river, UK, using a Monte-Carlo analysis toolbox. The essential framework of the model proved to be sound, and calibration and validation performance was generally good. However, some supposedly important water quality parameters associated with algal activity were found to be completely insensitive, and hence non-identifiable, within the model structure, while others (nitrification and sedimentation) had optimum values at or close to zero, indicating that those processes were not detectable from the data set examined.

Relevance:

30.00%

Publisher:

Abstract:

Improvements in the resolution of satellite imagery have enabled extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means for calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR guided snake algorithm is used to determine an outline of a flood event in 2006 on the River Dee, North Wales, UK, using a 12.5 m ERS-2 SAR image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation recorded as the LiDAR DEM elevation at each point. Taking a planar water surface from the gauged upstream to downstream water elevations as an approximation, the water surface elevations at points along this flooded extent are compared to their 'expected' value. The pattern of errors between the two shows a roughly normal distribution; however, when plotted against coordinates there is obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing errors to the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set will be selected. For each simulation a t-test is used to quantify the fit between modelled and observed water surface elevations. The points chosen for use in this t-test are selected based on their error. The selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error.
The identification of significant error (RMSE = 0.8 m) between approximate expected and actual observed elevations from the remotely sensed data emphasises the limitations of using these data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
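A minimal version of the paired test used to score each simulation might look like this; the function name is illustrative, and the paper's point-selection criteria are not reproduced here:

```python
import math

def paired_t_statistic(observed, modelled):
    """Paired t-statistic on modelled-minus-observed water surface
    elevations: tests whether the mean error differs from zero."""
    d = [m - o for o, m in zip(observed, modelled)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)
```

A simulation whose t-statistic is close to zero has no systematic elevation bias relative to the satellite-derived waterline points.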

Relevance:

30.00%

Publisher:

Abstract:

This study presents a new simple approach for combining empirical with raw (i.e., not bias corrected) coupled model ensemble forecasts in order to make more skillful interval forecasts of ENSO. A Bayesian normal model has been used to combine empirical and raw coupled model December SST Niño-3.4 index forecasts started at the end of the preceding July (5-month lead time). The empirical forecasts were obtained by linear regression between December and the preceding July Niño-3.4 index values over the period 1950–2001. Coupled model ensemble forecasts for the period 1987–99 were provided by ECMWF, as part of the Development of a European Multimodel Ensemble System for Seasonal to Interannual Prediction (DEMETER) project. Empirical and raw coupled model ensemble forecasts alone have similar mean absolute error forecast skill score, compared to climatological forecasts, of around 50% over the period 1987–99. The combined forecast gives an increased skill score of 74% and provides a well-calibrated and reliable estimate of forecast uncertainty.
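In its simplest conjugate form, the Bayesian normal combination reduces to precision weighting of the two forecasts. The published scheme also calibrates the ensemble against observations via regression before combining; this sketch omits that step, and the names are illustrative:

```python
def combine_normal_forecasts(mu_emp, var_emp, mu_mod, var_mod):
    """Combine two normal forecasts (empirical regression treated as the
    prior, coupled-model ensemble as the likelihood): the posterior is
    normal with precision-weighted mean and summed precision."""
    precision = 1.0 / var_emp + 1.0 / var_mod
    mean = (mu_emp / var_emp + mu_mod / var_mod) / precision
    return mean, 1.0 / precision
```

Because precisions add, the combined forecast always has a smaller variance than either source alone, which is how the combination yields a sharper yet still well-calibrated interval forecast.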

Relevance:

30.00%

Publisher:

Abstract:

The performance of flood inundation models is often assessed using satellite observed data; however these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency a new method of evaluating flood inundation model performance is developed by using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
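The spatial-dependency test at the heart of the subsampling scheme can be sketched as a minimal Moran's I with binary distance-band weights; the band value and names are illustrative, and refinements such as row standardisation and permutation p-values are omitted:

```python
import numpy as np

def morans_i(values, coords, band=200.0):
    """Moran's I spatial autocorrelation of `values` at 2-D `coords`,
    using binary weights of 1 for point pairs closer than `band`
    (e.g. metres along the flood margin)."""
    v = np.asarray(values, dtype=float)
    xy = np.asarray(coords, dtype=float)
    z = v - v.mean()
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    w = ((d > 0) & (d < band)).astype(float)   # exclude self-pairs
    return (len(v) / w.sum()) * (z @ w @ z) / (z @ z)
```

Subsamples of waterline elevations whose Moran's I is not significantly different from zero are treated as spatially independent and used for calibration.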

Relevance:

30.00%

Publisher:

Abstract:

Producing projections of future crop yields requires careful thought about the appropriate use of atmosphere-ocean global climate model (AOGCM) simulations. Here we describe and demonstrate multiple methods for ‘calibrating’ climate projections using an ensemble of AOGCM simulations in a ‘perfect sibling’ framework. Crucially, this type of analysis assesses the ability of each calibration methodology to produce reliable estimates of future climate, which is not possible just using historical observations. This type of approach could be more widely adopted for assessing calibration methodologies for crop modelling. The calibration methods assessed include the commonly used ‘delta’ (change factor) and ‘nudging’ (bias correction) approaches. We focus on daily maximum temperature in summer over Europe for this idealised case study, but the methods can be generalised to other variables and other regions. The calibration methods, which are relatively easy to implement given appropriate observations, produce more robust projections of future daily maximum temperatures and heat stress than using raw model output. The choice over which calibration method to use will likely depend on the situation, but change factor approaches tend to perform best in our examples. Finally, we demonstrate that the uncertainty due to the choice of calibration methodology is a significant contributor to the total uncertainty in future climate projections for impact studies. We conclude that utilising a variety of calibration methods on output from a wide range of AOGCMs is essential to produce climate data that will ensure robust and reliable crop yield projections.
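The two calibration approaches named above can be sketched in their simplest mean-shift form; function names are illustrative, and operational versions usually work on quantiles or day-of-year climatologies rather than plain means:

```python
import numpy as np

def change_factor(obs, model_hist, model_fut):
    """'Delta' calibration: apply the model's projected change in the
    mean to the observed series."""
    return np.asarray(obs, dtype=float) + (np.mean(model_fut) - np.mean(model_hist))

def bias_correction(obs, model_hist, model_fut):
    """'Nudging' calibration: shift the future model series by the
    model's mean bias against historical observations."""
    return np.asarray(model_fut, dtype=float) + (np.mean(obs) - np.mean(model_hist))
```

For a pure shift in the mean the two coincide; they diverge once variability changes, since the delta approach keeps the observed day-to-day variability while nudging keeps the model's.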

Relevance:

30.00%

Publisher:

Abstract:

We present a new speleothem record of atmospheric Δ14C between 28 and 44 ka that offers considerable promise for resolving some of the uncertainty associated with existing radiocarbon calibration curves for this time period. The record is based on a comprehensive suite of AMS 14C ages, using new low-blank protocols, and U–Th ages using high precision MC-ICPMS procedures. Atmospheric Δ14C was calculated by correcting 14C ages with a constant dead carbon fraction (DCF) of 22.7 ± 5.9%, based on a comparison of stalagmite 14C ages with the IntCal04 (Reimer et al., 2004) calibration curve between 15 and 11 ka. The new Δ14C speleothem record shows similar structure and amplitude to that derived from Cariaco Basin foraminifera (Hughen et al., 2004, 2006), and the match is further improved if the latter is tied to the most recent Greenland ice core chronology (Svensson et al., 2008). These data are, however, in conflict with a previously published 14C data set for a stalagmite record from the Bahamas, GB-89-24-1 (Beck et al., 2001), which likely suffered from 14C analytical blank subtraction issues in the older part of the record. The new Bahamas speleothem Δ14C data do not show the extreme shifts between 44 and 40 ka reported in the previous study (Beck et al., 2001). Causes for the observed structure in derived atmospheric Δ14C variation based on the new speleothem data are investigated with a suite of simulations using an earth system model of intermediate complexity. Data-model comparison indicates that the major fluctuations in atmospheric Δ14C during marine isotope stage 3 are primarily a function of changes in geomagnetic field intensity, although ocean–atmosphere system reorganisation also played a supporting role.
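The dead carbon correction amounts to rescaling the measured 14C activity before converting to Δ14C. A minimal sketch of the standard conversion, assuming dead carbon simply dilutes the initial activity by a constant fraction (the constants are the conventional Libby and true 14C mean lives; names are illustrative):

```python
import math

LIBBY_MEAN_LIFE = 8033.0   # yr, conventional 14C (Libby) mean life
TRUE_MEAN_LIFE = 8267.0    # yr, based on the 5730-yr true half-life

def delta14c(c14_age, cal_age, dcf=0.227):
    """Atmospheric Delta-14C (permil) from a speleothem 14C age and an
    independent (U-Th) calendar age, correcting the measured activity
    for a constant dead carbon fraction `dcf`."""
    f_measured = math.exp(-c14_age / LIBBY_MEAN_LIFE)
    f_atm = f_measured / (1.0 - dcf)      # undo dead-carbon dilution
    return (f_atm * math.exp(cal_age / TRUE_MEAN_LIFE) - 1.0) * 1000.0
```

The ±5.9% uncertainty on the DCF propagates directly into the reconstructed Δ14C, which is why the DCF calibration window against IntCal04 matters.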

Relevance:

30.00%

Publisher:

Abstract:

The CWRF is developed as a climate extension of the Weather Research and Forecasting model (WRF) by incorporating numerous improvements in the representation of physical processes and integration of external (top, surface, lateral) forcings that are crucial to climate scales, including interactions between land, atmosphere, and ocean; convection and microphysics; cloud, aerosol, and radiation; and system consistency throughout all process modules. This extension inherits all WRF functionalities for numerical weather prediction while enhancing the capability for climate modeling. As such, CWRF can be applied seamlessly to weather forecasting and climate prediction. The CWRF is built with a comprehensive ensemble of alternative parameterization schemes for each of the key physical processes, including surface (land, ocean), planetary boundary layer, cumulus (deep, shallow), microphysics, cloud, aerosol, and radiation, and their interactions. This facilitates the use of an optimized physics ensemble approach to improve weather or climate prediction along with a reliable uncertainty estimate. The CWRF also emphasizes the societal service capability to provide impact-relevant information by coupling with detailed models of terrestrial hydrology, coastal ocean, crop growth, air quality, and a recently expanded interactive water quality and ecosystem model. This study provides a general CWRF description and basic skill evaluation based on a continuous integration for the period 1979–2009 as compared with that of WRF, using a 30-km grid spacing over a domain that includes the contiguous United States plus southern Canada and northern Mexico. In addition to advantages of greater application capability, CWRF improves performance in radiation and terrestrial hydrology over WRF and other regional models. Precipitation simulation, however, remains a challenge for all of the tested models.

Relevance:

30.00%

Publisher:

Abstract:

Flash floods pose a significant danger to life and property. Unfortunately, in arid and semiarid environments runoff generation shows complex non-linear behavior with strong spatial and temporal non-uniformity. As a result, the predictions made by physically-based simulations in semiarid areas are subject to great uncertainty, and a failure in the predictive behavior of existing models is common. Thus better descriptions of physical processes at the watershed scale need to be incorporated into hydrological model structures. For example, terrain relief has been systematically considered static in flood modelling at the watershed scale. Here, we show that the integrated effect of small distributed relief variations originated through concurrent hydrological processes within a storm event was significant on the watershed scale hydrograph. We model these observations by introducing dynamic formulations of two relief-related parameters at diverse scales: maximum depression storage, and roughness coefficient in channels. In the final (a posteriori) model structure these parameters are allowed to be either time-constant or time-varying. The case under study is a convective storm in a semiarid Mediterranean watershed with ephemeral channels and high agricultural pressures (the Rambla del Albujón watershed; 556 km²), which showed a complex multi-peak response. First, to obtain quasi-sensible simulations in the (a priori) model with time-constant relief-related parameters, a spatially distributed parameterization was strictly required. Second, a generalized likelihood uncertainty estimation (GLUE) inference applied to the improved model structure, and conditioned on observed nested hydrographs, showed that accounting for dynamic relief-related parameters led to improved simulations.
The discussion is finally broadened by considering the use of the calibrated model both to analyze the sensitivity of the watershed to storm motion and to attempt the flood forecasting of a stratiform event with highly different behavior.

Relevance:

30.00%

Publisher:

Abstract:

As satellite technology develops, satellite rainfall estimates are likely to become ever more important in the world of food security. It is therefore vital to be able to identify the uncertainty of such estimates and for end users to be able to use this information in a meaningful way. This paper presents new developments in the methodology of simulating satellite rainfall ensembles from thermal infrared satellite data. Although the basic sequential simulation methodology has been developed in previous studies, it was not suitable for use in regions with more complex terrain and limited calibration data. Developments in this work include the creation of a multithreshold, multizone calibration procedure, plus investigations into the causes of an overestimation of low rainfall amounts and the best way to take into account clustered calibration data. A case study of the Ethiopian highlands has been used as an illustration.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC's accepted values produced slightly better fits than the literature values. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm's movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs.
We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
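The rejection-ABC loop described above is short enough to sketch in full; this is a generic illustration with hypothetical names, not the authors' earthworm IBM setup:

```python
import random

def rejection_abc(simulate, priors, observed, n_draws=10000, keep=0.01):
    """Rejection-ABC: draw parameter sets from their priors, simulate,
    and keep the fraction `keep` of draws whose summary statistics lie
    closest (Euclidean distance) to the observations."""
    draws = []
    for _ in range(n_draws):
        theta = [p() for p in priors]            # sample each prior
        sim = simulate(theta)
        dist = sum((s - o) ** 2 for s, o in zip(sim, observed)) ** 0.5
        draws.append((dist, theta))
    draws.sort(key=lambda d: d[0])
    return [theta for _, theta in draws[:max(1, int(keep * n_draws))]]
```

The retained parameter sets approximate the posterior; their per-parameter spread gives the credible intervals, and parameters whose retained values span the whole prior are the "unnarrowed" ones the abstract describes.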