120 results for Model Output Statistics


Relevance: 90.00%

Abstract:

This paper introduces a new neurofuzzy model construction and parameter estimation algorithm from observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace, to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, whereby it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality problem. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
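
As a rough illustration of the rule-ranking idea described above, the sketch below scores candidate fuzzy rules by an A-optimality-style criterion: the trace of the inverse information matrix of the membership-weighted regressors. It is a minimal numpy sketch under our own assumptions (Gaussian memberships, a single input, illustrative centres); the function names are ours and it does not reproduce the paper's extended Gram-Schmidt algorithm.

```python
import numpy as np

def gaussian_membership(x, centre, width):
    """Gaussian fuzzy membership of input samples x for one candidate rule."""
    return np.exp(-0.5 * ((x - centre) / width) ** 2)

def rule_identifiability(X, memberships):
    """A-optimality-style score for one rule: trace of the inverse information
    matrix of the membership-weighted regressors.  Smaller values indicate a
    better-conditioned (more identifiable) rule."""
    Phi = memberships[:, None] * X                     # weighting applied to regressors
    M = Phi.T @ Phi                                    # information matrix of the rule subspace
    return np.trace(np.linalg.inv(M + 1e-9 * np.eye(M.shape[1])))

# toy data: one input, regressors [x, 1]
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
X = np.column_stack([x, np.ones_like(x)])

# candidate rules = Gaussian memberships at different centres, ranked by score
centres = [-2.0, 0.0, 2.0]
scores = [rule_identifiability(X, gaussian_membership(x, c, 1.0)) for c in centres]
print(sorted(zip(centres, scores), key=lambda cs: cs[1]))
```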

Relevance: 90.00%

Abstract:

We compared output from 3 dynamic process-based models (DMs: ECOSSE, MILLENNIA and the Durham Carbon Model) and 9 bioclimatic envelope models (BCEMs; including BBOG ensemble and PEATSTASH) ranging from simple threshold to semi-process-based models. Model simulations were run at 4 British peatland sites using historical climate data and climate projections under a medium (A1B) emissions scenario from the 11-RCM (regional climate model) ensemble underpinning UKCP09. The models showed that blanket peatlands are vulnerable to projected climate change; however, predictions varied between models as well as between sites. All BCEMs predicted a shift from presence to absence of a climate associated with blanket peat, where the sites with the lowest total annual precipitation were closest to the presence/absence threshold. DMs showed a more variable response. ECOSSE predicted a decline in the net C sink and a shift to a net C source by the end of this century. The Durham Carbon Model predicted a smaller decline in the net C sink strength, but no shift to net C source. MILLENNIA predicted a slight overall increase in the net C sink. In contrast to the BCEM projections, the DMs predicted that the sites with the coolest temperatures and greatest total annual precipitation showed the largest change in carbon sinks. In this model inter-comparison, the greatest variation in model output in response to climate change projections was not between the BCEMs and DMs but between the DMs themselves, because of different approaches to modelling soil organic matter pools and decomposition amongst other processes. The difference in the sign of the response has major implications for future climate feedbacks, climate policy and peatland management. Enhanced data collection, in particular monitoring peatland response to current change, would significantly improve model development and projections of future change.

Relevance: 90.00%

Abstract:

It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are functions of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of the ability of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with extremes from the MIRA dataset. The results suggest that the model reproduces the number and spatial distribution of rainfall extremes with some accuracy, but that mean rainfall and rainfall variability are underestimated (overestimated) over wet (dry) regions of southern Africa.
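
A minimal sketch of the kind of daily-timescale comparison described, assuming a single grid cell and a simple percentile-of-wet-days definition of an extreme (the paper's own threshold choice is not given here); the synthetic gamma-distributed series merely stand in for the MIRA and model data.

```python
import numpy as np

def extreme_day_stats(daily_rain, wet_threshold=1.0, pct=95.0):
    """Count and average daily totals at or above the pct-th percentile of wet days
    (days with more than wet_threshold mm)."""
    wet = daily_rain[daily_rain > wet_threshold]
    if wet.size == 0:
        return 0, float("nan")
    threshold = np.percentile(wet, pct)
    extremes = daily_rain[daily_rain >= threshold]
    return extremes.size, float(extremes.mean())

rng = np.random.default_rng(1)
observed = rng.gamma(shape=0.4, scale=8.0, size=3650)   # stand-in for a MIRA daily series
modelled = rng.gamma(shape=0.5, scale=6.0, size=3650)   # stand-in for model output

print("observed extremes (count, mean mm):", extreme_day_stats(observed))
print("modelled extremes (count, mean mm):", extreme_day_stats(modelled))
```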

Relevance: 90.00%

Abstract:

This paper investigates the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models.
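
The following sketch illustrates the general Monte Carlo procedure described, comparing the output variance of a crude aggregated residual appraisal with a slightly more disaggregated one; the distributions, cost rates and scheme are illustrative assumptions, not the paper's inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo draws

# Illustrative input distributions (not the paper's assumptions)
sale_value = rng.normal(3000, 300, N)      # £/m2 gross development value
build_cost = rng.normal(1800, 180, N)      # £/m2 construction cost
floor_area = 5_000                          # m2, fixed scheme size

# Simple, aggregated appraisal: profit taken as a share of the gross margin
profit_margin = 0.20
residual_simple = floor_area * (sale_value - build_cost) * (1 - profit_margin)

# More disaggregated appraisal: profit as a share of value, fees and finance separate
fees = 0.10 * build_cost * floor_area
finance = rng.normal(0.06, 0.01, N) * build_cost * floor_area
residual_disagg = (floor_area * sale_value * (1 - profit_margin)
                   - floor_area * build_cost - fees - finance)

for name, r in [("simple", residual_simple), ("disaggregated", residual_disagg)]:
    print(f"{name:14s} mean={r.mean():,.0f}  std={r.std():,.0f}")
```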

Relevance: 90.00%

Abstract:

The temporal variability of the atmosphere through which radio waves pass in the technique of differential radar interferometry can seriously limit the accuracy with which the method can measure surface motion. A forward, nested mesoscale model of the atmosphere can be used to simulate the variable water content along the radar path and the resultant phase delays. Using this approach we demonstrate how to correct an interferogram of Mount Etna in Sicily associated with an eruption in 2004-5. The regional mesoscale model (Unified Model) used to simulate the atmosphere at higher resolutions consists of four nested domains increasing in resolution (12, 4, 1, 0.3 km), sitting within the analysis version of a global numerical model that is used to initiate the simulation. Using the high resolution 3D model output we compute the surface pressure, temperature and the water vapour, liquid and solid water contents, enabling the dominant hydrostatic and wet delays to be calculated at specific times corresponding to the acquisition of the radar data. We can also simulate the second-order delay effects due to liquid water and ice.
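
For orientation, the sketch below computes zenith hydrostatic and wet delays for a single model column, using the widely quoted Saastamoinen expression and commonly used refractivity constants; it is a simplification of the path-delay calculation described (no slant mapping, and the second-order liquid and ice terms are omitted).

```python
import numpy as np

def zenith_hydrostatic_delay(p_surf_hpa, lat_deg, height_m):
    """Saastamoinen zenith hydrostatic delay in metres from surface pressure."""
    return (0.0022768 * p_surf_hpa
            / (1.0 - 0.00266 * np.cos(2 * np.radians(lat_deg)) - 0.28e-6 * height_m))

def zenith_wet_delay(e_hpa, temp_k, dz_m, k2p=22.1, k3=3.739e5):
    """Zenith wet delay in metres from wet refractivity integrated over layers.
    e_hpa, temp_k: water vapour pressure and temperature on model levels;
    dz_m: layer thicknesses.  k2', k3 are commonly used constants."""
    n_wet = k2p * e_hpa / temp_k + k3 * e_hpa / temp_k**2   # refractivity (N units)
    return 1e-6 * np.sum(n_wet * dz_m)

# toy single column: moist near the surface, drying aloft
levels = 20
e = np.linspace(12.0, 0.1, levels)        # hPa water vapour pressure
t = np.linspace(290.0, 220.0, levels)     # K
dz = np.full(levels, 500.0)               # m per layer

zhd = zenith_hydrostatic_delay(p_surf_hpa=1013.0, lat_deg=37.7, height_m=0.0)
zwd = zenith_wet_delay(e, t, dz)
print(f"hydrostatic delay = {zhd:.3f} m, wet delay = {zwd:.3f} m")
```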

Relevance: 90.00%

Abstract:

A simple four-dimensional assimilation technique, called Newtonian relaxation, has been applied to the Hamburg climate model (ECHAM), to enable comparison of model output with observations for short periods of time. The prognostic model variables vorticity, divergence, temperature, and surface pressure have been relaxed toward European Center for Medium-Range Weather Forecasts (ECMWF) global meteorological analyses. Several experiments have been carried out, in which the values of the relaxation coefficients have been varied to find out which values are most usable for our purpose. To be able to use the method for validation of model physics or chemistry, good agreement of the model-simulated mass and wind fields is required. In addition, the model physics should not be disturbed too strongly by the relaxation forcing itself. Both aspects have been investigated. Good agreement with basic observed quantities, like wind, temperature, and pressure is obtained for most simulations in the extratropics. Derived variables, like precipitation and evaporation, have been compared with ECMWF forecasts and observations. Agreement for these variables is smaller than for the basic observed quantities. Nevertheless, considerable improvement is obtained relative to a control run without assimilation. Differences between tropics and extratropics are smaller than for the basic observed quantities. Results also show that precipitation and evaporation are affected by a sort of continuous spin-up which is introduced by the relaxation: the bias (ECMWF-ECHAM) is increasing with increasing relaxation forcing. In agreement with this result we found that with increasing relaxation forcing the vertical exchange of tracers by turbulent boundary layer mixing and, to a lesser extent, by convection, is reduced.
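
Newtonian relaxation itself amounts to adding a restoring tendency proportional to the departure of the model state from the analysis. The toy scalar example below shows the update and the effect of the relaxation coefficient; the "physics" and the coefficient values are illustrative, not ECHAM's.

```python
def nudged_step(state, physics_tendency, analysis, dt, relax_coeff):
    """One explicit time step with Newtonian relaxation: the model tendency is
    augmented by a term pulling the state toward the analysis with strength
    relax_coeff (1/s).  relax_coeff = 0 recovers the free-running model."""
    return state + dt * (physics_tendency(state) + relax_coeff * (analysis - state))

# toy scalar "model" that drifts warm, relaxed toward a fixed analysis value
physics = lambda T: 1.0e-5 * (300.0 - T) + 2.0e-6   # K/s, with a small warm bias
analysis_T = 288.0                                   # K, stand-in for an ECMWF analysis
dt = 1800.0                                          # s

T_free, T_nudged = 285.0, 285.0
for _ in range(48 * 10):                             # ten days of half-hour steps
    T_free = nudged_step(T_free, physics, analysis_T, dt, relax_coeff=0.0)
    T_nudged = nudged_step(T_nudged, physics, analysis_T, dt, relax_coeff=1e-4)
print(f"free run: {T_free:.2f} K, nudged: {T_nudged:.2f} K")
```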

Relevance: 90.00%

Abstract:

This work proposes a unified neurofuzzy modelling scheme. To begin with, the initial fuzzy base construction method is based on fuzzy clustering utilising a Gaussian mixture model (GMM) combined with the analysis of covariance (ANOVA) decomposition in order to obtain more compact univariate and bivariate membership functions over the subspaces of the input features. The mean and covariance of the Gaussian membership functions are found by the expectation maximisation (EM) algorithm with the merit of revealing the underlying density distribution of system inputs. The resultant set of membership functions forms the basis of the generalised fuzzy model (GFM) inference engine. The model structure and parameters of this neurofuzzy model are identified via supervised subspace orthogonal least squares (OLS) learning. Finally, instead of providing a deterministic class label as model output, as is conventional, a logistic regression model is applied to present the classifier's output, in which the sigmoid type of logistic transfer function scales the outputs of the neurofuzzy model to the class probability. Experimental validation results are presented to demonstrate the effectiveness of the proposed neurofuzzy modelling scheme.
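
A compressed sketch of the overall pipeline, assuming scikit-learn's EM-fitted Gaussian mixture as the membership-generating stage and a plain logistic regression as the sigmoid output stage; it omits the ANOVA decomposition and the supervised subspace OLS structure selection that the scheme actually uses.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

# toy two-class data standing in for the system inputs
X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# Step 1: an EM-fitted Gaussian mixture supplies the membership functions
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0).fit(X)
memberships = gmm.predict_proba(X)          # responsibility of each component per sample

# Step 2: a logistic (sigmoid) output stage maps the fuzzy basis to class probability
clf = LogisticRegression(max_iter=1000).fit(memberships, y)
proba = clf.predict_proba(memberships)[:, 1]
print("training accuracy:", ((proba > 0.5) == y).mean())
```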

Relevance: 90.00%

Abstract:

Neurovascular coupling in response to stimulation of the rat barrel cortex was investigated using concurrent multichannel electrophysiology and laser Doppler flowmetry. The data were used to build a linear dynamic model relating neural activity to blood flow. Local field potential time series were subjected to current source density analysis, and the time series of a layer IV sink of the barrel cortex was used as the input to the model. The model output was the time series of the changes in regional cerebral blood flow (CBF). We show that this model can provide an excellent fit to the CBF responses for stimulus durations of up to 16 s. The structure of the model consisted of two coupled components representing vascular dilation and constriction. The complex temporal characteristics of the CBF time series were reproduced by the relatively simple balance of these two components. We show that the impulse response obtained under the 16-s duration stimulation condition generalised to provide a good prediction of the data from the shorter-duration stimulation conditions. Furthermore, by optimising three out of the total of nine model parameters, the variability in the data can be well accounted for over a wide range of stimulus conditions. By establishing linearity, classic system analysis methods can be used to generate and explore a range of equivalent model structures (e.g., feed-forward or feedback) to guide the experimental investigation of the control of vascular dilation and constriction following stimulation.
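
To make the two-component structure concrete, the sketch below convolves a boxcar "neural" input with an impulse response built from a fast positive (dilatory) kernel minus a slower negative (constricting) kernel; the kernel shapes and gains are illustrative and are not the fitted parameter values reported in the paper.

```python
import numpy as np

def gamma_kernel(t, shape, scale):
    """Unnormalised gamma-shaped kernel evaluated at times t (s), normalised to unit sum."""
    k = t ** (shape - 1) * np.exp(-t / scale)
    return k / k.sum()

dt = 0.1                                   # s
t = np.arange(0, 25, dt)

# Two opposing components: fast dilation minus a slower, weaker constriction
dilate = gamma_kernel(t, shape=3.0, scale=0.8)
constrict = gamma_kernel(t, shape=5.0, scale=2.5)
impulse_response = dilate - 0.6 * constrict

# Neural input: boxcar standing in for a 16-s stimulation current-source-density sink
stimulus = np.zeros_like(t)
stimulus[(t >= 2) & (t < 18)] = 1.0

# Linear model: CBF change is the convolution of the input with the impulse response
cbf_response = np.convolve(stimulus, impulse_response)[: t.size] * dt
print("peak CBF change (arbitrary units):", cbf_response.max())
```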

Relevance: 90.00%

Abstract:

We present an assessment of how tropical cyclone activity might change due to the influence of increased atmospheric carbon dioxide concentrations, using the UK's High Resolution Global Environment Model (HiGEM) with N144 resolution (~90 km in the atmosphere and ~40 km in the ocean). Tropical cyclones are identified using a feature tracking algorithm applied to model output. Tropical cyclones from idealized 30-year 2×CO2 (2CO2) and 4×CO2 (4CO2) simulations are compared to those identified in a 150-year present-day simulation, which is separated into a 5-member ensemble of 30-year integrations. Tropical cyclones are shown to decrease in frequency globally by 9% in the 2CO2 and 26% in the 4CO2. Tropical cyclones only become more intense in the 4CO2; however, uncoupled time-slice experiments reveal an increase in intensity in the 2CO2. An investigation into the large-scale environmental conditions, known to influence tropical cyclone activity in the main development regions, is used to determine the response of tropical cyclone activity to increased atmospheric CO2. A weaker Walker circulation and a reduction in zonally averaged regions of updrafts lead to a shift in the location of tropical cyclones in the northern hemisphere. A decrease in mean ascent at 500 hPa contributes to the reduction of tropical cyclones in the 2CO2 in most basins. The larger reduction of tropical cyclones in the 4CO2 arises from further reduction of mean ascent at 500 hPa and a large enhancement of vertical wind shear, especially in the southern hemisphere, North Atlantic and North East Pacific.

Relevance: 90.00%

Abstract:

Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
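
The in-filling idea in the first worked example can be illustrated with an ordinary least-squares predictor-predictand regression: calibrate on the surviving days, then reconstruct the deliberately degraded days. The sketch below uses synthetic data and a plain linear fit, which is a simplification of SDSM-DC's stochastic weather-generator machinery.

```python
import numpy as np

rng = np.random.default_rng(7)
n_days = 2000

# Synthetic large-scale predictors and a local predictand (daily temperature)
predictors = rng.normal(size=(n_days, 3))
true_coeffs = np.array([2.0, -1.0, 0.5])
temperature = 15.0 + predictors @ true_coeffs + rng.normal(0, 1.0, n_days)

# Deliberately degrade the record, as in the paper's first worked example
missing = rng.random(n_days) < 0.2
observed = temperature.copy()
observed[missing] = np.nan

# Calibrate the predictor-predictand regression on the surviving days only
keep = ~missing
X = np.column_stack([np.ones(keep.sum()), predictors[keep]])
coeffs, *_ = np.linalg.lstsq(X, observed[keep], rcond=None)

# In-fill the gaps from the calibrated relationship
X_missing = np.column_stack([np.ones(missing.sum()), predictors[missing]])
filled = observed.copy()
filled[missing] = X_missing @ coeffs

rmse = np.sqrt(np.mean((filled[missing] - temperature[missing]) ** 2))
print(f"in-fill RMSE against the withheld truth: {rmse:.2f} °C")
```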

Relevance: 90.00%

Abstract:

In recent years several methodologies have been developed to combine and interpret ensembles of climate models with the aim of quantifying uncertainties in climate projections. Constrained climate model forecasts have been generated by combining various choices of metrics used to weight individual ensemble members, with diverse approaches to sampling the ensemble. The forecasts obtained are often significantly different, even when based on the same model output. Therefore, a climate model forecast classification system can serve two roles: to provide a way for forecast producers to self-classify their forecasts; and to provide information on the methodological assumptions underlying the forecast generation and its uncertainty when forecasts are used for impacts studies. In this review we propose a possible classification system based on choices of metrics and sampling strategies. We illustrate the impact of some of the possible choices in the uncertainty quantification of large scale projections of temperature and precipitation changes, and briefly discuss possible connections between climate forecast uncertainty quantification and decision making approaches in the climate change context.
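
As a toy illustration of how the choice of metric enters such a classification, the sketch below weights ensemble members by one arbitrary skill metric (inverse squared historical error) and compares the weighted and unweighted projections; the metric, the numbers and the members are all invented for illustration and do not correspond to any of the reviewed methods.

```python
import numpy as np

rng = np.random.default_rng(3)
n_members = 12

# Synthetic members: a historical error against observations (the "metric")
# and a projected warming for each member
historical_error = rng.uniform(0.2, 1.5, n_members)       # K, RMSE vs observations
projected_warming = rng.normal(3.0, 0.8, n_members)       # K by 2100

# One possible metric choice: weights inversely proportional to squared error
weights = 1.0 / historical_error**2
weights /= weights.sum()

unweighted = projected_warming.mean()
weighted = np.sum(weights * projected_warming)
print(f"unweighted mean: {unweighted:.2f} K, metric-weighted mean: {weighted:.2f} K")
```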

Relevance: 90.00%

Abstract:

The canopy interception capacity is a small but key part of the surface hydrology, which affects the amount of water intercepted by vegetation and therefore the partitioning of evaporation and transpiration. However, little research with climate models has been done to understand the effects of a range of possible canopy interception capacity parameter values. This is in part due to the assumption that it does not significantly affect climate. Near-global evapotranspiration products now make evaluation of canopy interception capacity parameterisations possible. We use a range of canopy water interception capacity values from the literature to investigate the effect on climate within the climate model HadCM3. We find that mean temperature is affected by up to -0.64 K globally and -1.9 K regionally. These temperature impacts are predominantly due to changes in the evaporative fraction and top of atmosphere albedo. In the tropics, the variations in evapotranspiration affect precipitation, significantly enhancing rainfall. Comparing the model output to measurements, we find that the default canopy interception capacity parameterisation overestimates canopy interception loss (i.e. canopy evaporation) and underestimates transpiration. Overall, decreasing canopy interception capacity improves the evapotranspiration partitioning in HadCM3, though the measurement literature more strongly supports an increase. The high sensitivity of climate to the parameterisation of canopy interception capacity is partially due to the high number of light rain-days in the climate model, which means that interception is overestimated. This work highlights the hitherto underestimated importance of canopy interception capacity in climate model hydroclimatology and the need to acknowledge the role of precipitation representation limitations in determining parameterisations.
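
A minimal bucket-model sketch of canopy interception, using a commonly quoted linear-in-LAI capacity (the constants are illustrative, not HadCM3's parameter values), shows why many light rain-days make the interception loss sensitive to the assumed capacity.

```python
import numpy as np

def canopy_step(store, rainfall, pot_evap, lai, c0=0.5, c1=0.05):
    """One daily canopy water balance step (all quantities in mm).
    The capacity follows a commonly used linear-in-LAI form; constants illustrative."""
    capacity = c0 + c1 * lai
    store = store + rainfall
    throughfall = max(store - capacity, 0.0)      # water the canopy cannot hold
    store -= throughfall
    interception_loss = min(store, pot_evap)      # re-evaporation from the wet canopy
    store -= interception_loss
    return store, throughfall, interception_loss

# many light rain-days keep the canopy store topped up, so a larger capacity
# raises the interception loss at the expense of throughfall
rng = np.random.default_rng(5)
rain = rng.exponential(0.8, 365)                  # mm/day, mostly light events
store, total_loss = 0.0, 0.0
for r in rain:
    store, _, loss = canopy_step(store, r, pot_evap=1.5, lai=4.0)
    total_loss += loss
print(f"annual interception loss = {total_loss:.0f} mm of {rain.sum():.0f} mm rainfall")
```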

Relevance: 80.00%

Abstract:

1. We compared the baseline phosphorus (P) concentrations inferred by diatom-P transfer functions and export coefficient models at 62 lakes in Great Britain to assess whether the techniques produce similar estimates of historical nutrient status.
2. There was a strong linear relationship between the two sets of values over the whole total P (TP) gradient (2-200 µg TP L⁻¹). However, a systematic bias was observed, with the diatom model producing the higher values in 46 lakes (of which values differed by more than 10 µg TP L⁻¹ in 21). The export coefficient model gave the higher values in 10 lakes (of which the values differed by more than 10 µg TP L⁻¹ in only 4).
3. The difference between baseline and present-day TP concentrations was calculated to compare the extent of eutrophication inferred by the two sets of model output. There was generally poor agreement between the amounts of change estimated by the two approaches. The discrepancy in both the baseline values and the degree of change inferred by the models was greatest in the shallow and more productive sites.
4. Both approaches were applied to two lakes in the English Lake District where long-term P data exist, to assess how well the models track measured P concentrations since approximately 1850. There was good agreement between the pre-enrichment TP concentrations generated by the models. The diatom model paralleled the steeper rise in maximum soluble reactive P (SRP) more closely than the gradual increase in annual mean TP in both lakes. The export coefficient model produced a closer fit to observed annual mean TP concentrations for both sites, tracking the changes in total external nutrient loading.
5. A combined approach is recommended, with the diatom model employed to reflect the nature and timing of the in-lake response to changes in nutrient loading, and the export coefficient model used to establish the origins and extent of changes in the external load and to assess potential reduction in loading under different management scenarios.
6. However, caution must be exercised when applying these models to shallow lakes, where the export coefficient model TP estimate will not include internal P loading from lake sediments and where the diatom TP inferences may over-estimate TP concentrations because of the high abundance of benthic taxa, many of which are poor indicators of trophic state.
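
For reference, the export coefficient approach amounts to summing land-use areas multiplied by per-hectare export coefficients (plus livestock terms) and converting the resulting load to an in-lake concentration. The sketch below uses invented coefficients and a crude dilution-and-retention step; real applications calibrate the coefficients regionally and use established lake-response models.

```python
# Illustrative export coefficients (kg P per hectare per year); real applications
# calibrate these per region and land-use category
export_coeff = {"arable": 0.65, "grassland": 0.30, "woodland": 0.02, "urban": 0.83}
livestock_coeff_kg_per_head = 0.35          # kg P per animal per year reaching water

def catchment_p_load(areas_ha, livestock_head):
    """Total annual diffuse P export (kg/yr) from land-use areas and livestock numbers."""
    land = sum(export_coeff[use] * a for use, a in areas_ha.items())
    return land + livestock_coeff_kg_per_head * livestock_head

def lake_tp_ug_per_l(load_kg, outflow_m3_per_yr, retention=0.4):
    """Crude in-lake TP estimate: the load diluted by outflow, less a retained fraction."""
    return (1 - retention) * load_kg * 1e9 / outflow_m3_per_yr / 1e3  # µg/L

areas_ha = {"arable": 800, "grassland": 2500, "woodland": 600, "urban": 100}
load = catchment_p_load(areas_ha, livestock_head=3000)
print(f"load = {load:.0f} kg P/yr, lake TP = "
      f"{lake_tp_ug_per_l(load, outflow_m3_per_yr=2.0e7):.0f} µg/L")
```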

Relevance: 80.00%

Abstract:

An investigation using the Stepping Out model of early hominin dispersal out of Africa is presented here. The late arrival of early hominins into Europe, as deduced from the fossil record, is shown to be consistent with a poor ability of these hominins to survive in the Eurasian landscape. The present study also extends the understanding of modelling results from the original study by Mithen and Reed (2002. Stepping out: a computer simulation of hominid dispersal from Africa. J. Hum. Evol. 43, 433-462). The representation of climate and vegetation patterns has been improved through the use of climate model output. This study demonstrates that interpretative confidence may be strengthened, and new insights gained, when climate models and hominin dispersal models are integrated.

Relevance: 80.00%

Abstract:

This paper analyses historic records of agricultural land use and management for England and Wales between 1931 and 1991 and uses export coefficient modelling to hindcast the impact of these practices on the rates of diffuse nitrogen (N) and phosphorus (P) export to water bodies for each of the major geo-climatic regions of England and Wales. Key trends indicate the importance of animal agriculture as a contributor to the total diffuse agricultural nutrient loading on waters, and the need to bring these sources under control if conditions suitable for sustaining 'Good Ecological Status' under the Water Framework Directive are to be generated. The analysis highlights the importance of measuring changes in nutrient loading in relation to the catchment-specific baseline state for different water bodies. The approach is also used to forecast the likely impact of broad regional scale scenarios on nutrient export to waters and highlights the need to take sensitive land out of production, introduce ceilings on fertilizer use and stocking densities, and impose controls on agricultural practice in higher risk areas where intensive agriculture is combined with a low intrinsic nutrient retention capacity, although the uncertainties associated with the modelling applied at this scale should be taken into account in the interpretation of model output. The paper advocates the need for a two-tiered approach to nutrient management, combining broad regional policies with targeted management in high risk areas at the catchment and farm scale.