Abstract:
Results are presented from a new web application called OceanDIVA - Ocean Data Intercomparison and Visualization Application. This tool reads hydrographic profiles and ocean model output and presents the data on either depth levels or isotherms for viewing in Google Earth, or as probability density functions (PDFs) of regional model-data misfits. As part of the CLIVAR Global Synthesis and Observations Panel, an intercomparison of water mass properties of various ocean syntheses has been undertaken using OceanDIVA. Analysis of model-data misfits reveals significant differences between the water mass properties of the syntheses, such as the ability to capture mode water properties.
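As a rough illustration of the misfit-PDF idea described in this abstract, the sketch below bins model-minus-observation temperature differences into an empirical probability density. The function and variable names are assumptions for illustration, not part of OceanDIVA itself.

```python
import numpy as np

def misfit_pdf(model_temp, observed_temp, bins=50):
    """Histogram-based PDF of model-minus-observation misfits at collocated points."""
    misfit = np.asarray(model_temp, dtype=float) - np.asarray(observed_temp, dtype=float)
    misfit = misfit[np.isfinite(misfit)]                 # drop missing collocations
    density, edges = np.histogram(misfit, bins=bins, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])             # bin midpoints for plotting
    return centres, density

# Example: compare synthesis output with hydrographic profiles on one depth level
centres, density = misfit_pdf(model_temp=[10.2, 10.5, 9.8],
                              observed_temp=[10.0, 10.1, 10.3])
```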
Abstract:
Severe wind storms are one of the major natural hazards in the extratropics and inflict substantial economic damage and even casualties. Insured storm-related losses depend on (i) the frequency, nature and dynamics of storms, (ii) the vulnerability of the values at risk, (iii) the geographical distribution of these values, and (iv) the particular conditions of the risk transfer. It is thus of great importance to assess the impact of climate change on future storm losses. To this end, the current study employs—to our knowledge for the first time—a coupled approach, using output from high-resolution regional climate model scenarios for the European sector to drive an operational insurance loss model. An ensemble of coupled climate-damage scenarios is used to provide an estimate of the inherent uncertainties. Output of two state-of-the-art global climate models (HadAM3, ECHAM5) is used for present (1961–1990) and future climates (2071–2100, SRES A2 scenario). These serve as boundary data for two nested regional climate models with sophisticated gust parametrizations (CLM, CHRM). For validation and calibration purposes, an additional simulation is undertaken with the CHRM driven by the ERA40 reanalysis. The operational insurance model (Swiss Re) uses a European-wide damage function, an average vulnerability curve for all risk types, and contains the actual value distribution of a complete European market portfolio. The coupling between climate and damage models is based on daily maxima of 10 m gust winds, and the strategy adopted consists of three main steps: (i) development and application of a pragmatic selection criterion to retrieve significant storm events, (ii) generation of a probabilistic event set using a Monte Carlo approach in the hazard module of the insurance model, and (iii) calibration of the simulated annual expected losses against a historical loss database. The climate models considered agree regarding an increase in the intensity of extreme storms in a band across central Europe (stretching from southern UK and northern France to Denmark, northern Germany and into eastern Europe). This effect increases with event strength, and rare storms show the largest climate change sensitivity, but are also beset with the largest uncertainties. Wind gusts decrease over northern Scandinavia and southern Europe. The highest intra-ensemble variability is simulated for Ireland, the UK, the Mediterranean, and parts of eastern Europe. The resulting changes in European-wide losses over the 110-year period are positive for all layers and all model runs considered and amount to 44% (annual expected loss), 23% (10-year loss), 50% (30-year loss), and 104% (100-year loss). There is a disproportionate increase in losses for rare high-impact events. The changes result from increases in both the severity and frequency of wind gusts. Considerable geographical variability of the expected losses exists, with Denmark and Germany experiencing the largest loss increases (116% and 114%, respectively). All countries considered except Ireland (−22%) experience some loss increases. Some ramifications of these results for the socio-economic sector are discussed, and future avenues for research are highlighted. The technique introduced in this study and its application to realistic market portfolios offer exciting prospects for future research on the impact of climate change that is relevant for policy makers, scientists and economists.
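The loss-layer terminology above (annual expected loss; 10-, 30- and 100-year losses) can be illustrated with a small empirical sketch. This is a deliberately simplified stand-in, not the Swiss Re hazard or vulnerability model.

```python
import numpy as np

def loss_layers(annual_losses, return_periods=(10, 30, 100)):
    """Annual expected loss and empirical return-period losses from simulated annual losses."""
    losses = np.sort(np.asarray(annual_losses, dtype=float))[::-1]   # descending
    n = len(losses)
    expected = losses.mean()                                          # annual expected loss
    layers = {}
    for rp in return_periods:
        # empirical exceedance: the loss exceeded on average once every rp years
        rank = max(int(round(n / rp)) - 1, 0)
        layers[f"{rp}-year loss"] = losses[rank]
    return expected, layers
```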
Abstract:
Cross-hole anisotropic electrical and seismic tomograms of fractured metamorphic rock have been obtained at a test site where extensive hydrological data were available. A strong correlation between electrical resistivity anisotropy and seismic compressional-wave velocity anisotropy has been observed. Analysis of core samples from the site reveals that the shale-rich rocks have fabric-related average velocity anisotropy of between 10% and 30%. The cross-hole seismic data are consistent with these values, indicating that the observed anisotropy might be principally due to the inherent rock fabric rather than to aligned sets of open fractures. One region with velocity anisotropy greater than 30% has been modelled as aligned open fractures within an anisotropic rock matrix, and this model is consistent with the available fracture density and hydraulic transmissivity data from the boreholes and the cross-hole resistivity tomography data. However, in general the study highlights the uncertainties that can arise, due to the relative influence of rock fabric and fluid-filled fractures, when using geophysical techniques for hydrological investigations.
Abstract:
This paper describes a new bio-indicator method for assessing wetland ecosystem health: as such, the study is particularly relevant to current legislation such as the EU Water Framework Directive, which provides a baseline of the current status of surface waters. Seven wetland sites were monitored across northern Britain, with model construction data for predicting eco-hydrological relationships collected from five sites during 1999. Two new sites and one repeat site were monitored during 2000 to provide model test data. The main growing season for the vegetation, and hence the sampling period, was May-August during both years. Seasonal mean concentrations of nitrate (NO3-) in surface and soil water samples during 1999 ranged from 0.01 to 14.07 mg N l⁻¹, with a mean value of 1.01 mg N l⁻¹. During 2000, concentrations ranged from trace level (<0.01 mg N l⁻¹) to 9.43 mg N l⁻¹, with a mean of 2.73 mg N l⁻¹. Surface and soil-water nitrate concentrations did not influence plant species composition significantly across representative tall herb fen and mire communities. Predictive relationships were found between nitrate concentrations and structural characteristics of the wetland vegetation, and a model was developed which predicted nitrate concentrations from measures of plant diversity, canopy structure and density of reproductive structures. Two further models, which predicted stem density and density of reproductive structures respectively, utilised nitrate concentration as one of the independent predictor variables. Where appropriate, the models were tested using data collected during 2000. This approach is complementary to species-based monitoring, representing a useful and simple tool to assess ecological status in target wetland systems, and has potential for bio-indication purposes.
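A hedged sketch of the kind of predictive relationship reported above: a multiple linear regression of nitrate concentration on vegetation-structure measures. The predictor names are placeholders for illustration, not the paper's fitted model.

```python
import numpy as np

def fit_nitrate_model(diversity, canopy_height, repro_density, nitrate):
    """Ordinary least squares fit: nitrate ~ intercept + vegetation-structure measures."""
    X = np.column_stack([np.ones(len(nitrate)),
                         np.asarray(diversity, dtype=float),
                         np.asarray(canopy_height, dtype=float),
                         np.asarray(repro_density, dtype=float)])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(nitrate, dtype=float), rcond=None)
    return coeffs   # [intercept, b_diversity, b_canopy, b_repro]

def predict_nitrate(coeffs, diversity, canopy_height, repro_density):
    return (coeffs[0] + coeffs[1] * diversity
            + coeffs[2] * canopy_height + coeffs[3] * repro_density)
```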
Abstract:
The water quality of the Pang and Lambourn, tributaries of the River Thames in south-eastern England, is described in relation to spatial and temporal dimensions. The river waters are supplied mainly from Chalk-fed aquifer sources and are, therefore, of a calcium-bicarbonate type. The major, minor and trace element chemistry of the rivers is controlled by a combination of atmospheric and pollutant inputs from agriculture and sewage sources superimposed on a background water quality signal linked to geological sources. Water quality does not vary greatly over time or space. However, in detail, there are differences in water quality between the Pang and Lambourn and between sites along the Pang and the Lambourn. These differences reflect hydrological processes, water flow pathways and water quality input fluxes. The Pang's pattern of water quality change is more variable than that of the Lambourn. The flow hydrograph also shows both a cyclical and 'uniform' pattern characteristic of aquifer drainage with, superimposed, a series of 'flashier' spiked responses characteristic of karstic systems. The Lambourn, in contrast, shows simpler features without the 'flashier' responses. The results are discussed in relation to the newly developed UK community programme LOCAR dealing with Lowland Catchment Research. A descriptive and box model structure is provided to describe the key features of water quality variations in relation to soil, unsaturated and groundwater flows and storage both away from and close to the river.
Abstract:
Acid mine drainage (AMD) is a widespread environmental problem associated with both working and abandoned mining operations. As part of an overall strategy to determine a long-term treatment option for AMD, a pilot passive treatment plant was constructed in 1994 at Wheal Jane Mine in Cornwall, UK. The plant consists of three separate systems, each containing aerobic reed beds, an anaerobic cell and rock filters, and represents the largest European experimental facility of its kind. The systems differ only in the type of pretreatment used to increase the pH of the influent minewater (pH <4): lime dosed (LD), anoxic limestone drain (ALD) and lime free (LF), which receives no form of pretreatment. Historical data (1994-1997) indicate median Fe reduction between 55% and 92%, sulphate removal in the range of 3-38% and removal of target metals (cadmium, copper and zinc) to below detection limits, depending on pretreatment and flow rates through the system. A new model to simulate the processes and dynamics of the wetland systems is described, as well as the application of the model to experimental data collected at the pilot plant. The model is process based, and utilises reaction kinetic approaches based on experimental microbial techniques rather than an equilibrium approach to metal precipitation. The model is dynamic and utilises numerical integration routines to solve a set of differential equations that describe the behaviour of 20 variables over the 17 pilot plant cells on a daily basis. The model outputs at each cell boundary are evaluated and compared with the measured data, and the model is demonstrated to provide a good representation of the complex behaviour of the wetland system for a wide range of variables.
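To make the process-based, kinetic modelling approach concrete, the sketch below integrates a first-order removal equation for dissolved iron in a single wetland cell at daily resolution. The rate constant, flow and volume are invented for illustration; the actual model tracks about 20 variables over 17 cells.

```python
import numpy as np
from scipy.integrate import odeint

def cell_dynamics(fe, t, inflow_conc, flow_rate, volume, k_removal):
    # d[Fe]/dt = dilution from inflow minus first-order microbial/precipitation removal
    return (flow_rate / volume) * (inflow_conc - fe) - k_removal * fe

t = np.arange(0, 30)                          # 30 days at daily resolution
fe = odeint(cell_dynamics, y0=50.0, t=t,
            args=(50.0, 10.0, 200.0, 0.2))    # mg/l, m3/day, m3, 1/day (illustrative)
```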
Abstract:
Climate change science is increasingly concerned with methods for managing and integrating sources of uncertainty from emission storylines, climate model projections, and ecosystem model parameterizations. In tropical ecosystems, regional climate projections and modeled ecosystem responses vary greatly, leading to a significant source of uncertainty in global biogeochemical accounting and possible future climate feedbacks. Here, we combine an ensemble of IPCC-AR4 climate change projections for the Amazon Basin (eight general circulation models) with alternative ecosystem parameter sets for the dynamic global vegetation model, LPJmL. We evaluate LPJmL simulations of carbon stocks and fluxes against flux tower and aboveground biomass datasets for individual sites and the entire basin. Variability in LPJmL model sensitivity to future climate change is primarily related to light and water limitations through biochemical and water-balance-related parameters. Temperature-dependent parameters related to plant respiration and photosynthesis appear to be less important than vegetation dynamics (and their parameters) for determining the magnitude of ecosystem response to climate change. Variance partitioning approaches reveal that relationships between uncertainty from ecosystem dynamics and climate projections are dependent on geographic location and the targeted ecosystem process. Parameter uncertainty from the LPJmL model does not affect the trajectory of ecosystem response for a given climate change scenario and the primary source of uncertainty for Amazon 'dieback' results from the uncertainty among climate projections. Our approach for describing uncertainty is applicable for informing and prioritizing policy options related to mitigation and adaptation where long-term investments are required.
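A minimal sketch of the variance-partitioning idea mentioned above, assuming a factorial ensemble (climate model by parameter set). The array layout and the simple main-effects decomposition are assumptions, not the study's exact method.

```python
import numpy as np

def partition_variance(response):
    """response[i, j]: simulated ecosystem change for climate model i and parameter set j."""
    response = np.asarray(response, dtype=float)
    grand = response.mean()
    climate_effect = response.mean(axis=1) - grand      # one value per climate projection
    param_effect = response.mean(axis=0) - grand        # one value per parameter set
    total = np.var(response)
    var_climate = np.var(climate_effect)
    var_param = np.var(param_effect)
    residual = total - var_climate - var_param
    return {"climate": var_climate / total,
            "parameters": var_param / total,
            "interaction/residual": residual / total}
```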
Abstract:
The impacts of climate change on crop productivity are often assessed using simulations from a numerical climate model as an input to a crop simulation model. The precision of these predictions reflects the uncertainty in both models. We examined how uncertainty in a climate model (HadAM3) and a crop model (the General Large-Area Model for annual crops, GLAM) affects the mean and standard deviation of crop yield simulations in present-day and doubled carbon dioxide (CO2) climates by perturbation of parameters in each model. The climate sensitivity parameter (lambda, the equilibrium response of global mean surface temperature to doubled CO2) was used to define the control climate. Observed 1966-1989 mean yields of groundnut (Arachis hypogaea L.) in India were simulated well by the crop model using the control climate and climates with values of lambda near the control value. The simulations were used to measure the contribution to uncertainty of key crop and climate model parameters. The standard deviation of yield was more affected by perturbation of climate parameters than crop model parameters in both the present-day and doubled CO2 climates. Climate uncertainty was higher in the doubled CO2 climate than in the present-day climate. Crop transpiration efficiency was key to crop model uncertainty in both present-day and doubled CO2 climates. The response of crop development to mean temperature contributed little uncertainty in the present-day simulations but was among the largest contributors under doubled CO2. The ensemble methods used here to quantify physical and biological uncertainty offer a method to improve model estimates of the impacts of climate change.
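The perturbed-parameter comparison described above can be sketched as follows; simulate_yield stands in for the coupled HadAM3/GLAM chain and is purely a placeholder, not the authors' code.

```python
import numpy as np

def parameter_spread(simulate_yield, baseline, perturbations):
    """Spread of simulated yield when each parameter is perturbed one at a time.

    perturbations: dict mapping parameter name -> list of alternative values.
    """
    spreads = {}
    for name, values in perturbations.items():
        yields = []
        for v in values:
            params = dict(baseline, **{name: v})   # perturb a single parameter
            yields.append(simulate_yield(**params))
        spreads[name] = np.std(yields)             # contribution to yield uncertainty
    return spreads
```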
Abstract:
The farm-level success of Bt-cotton in developing countries is well documented. However, the literature has only recently begun to recognise the importance of accounting for the effects of the technology on production risk, in addition to the mean effect estimated by previous studies. The risk effects of the technology are likely very important to smallholder farmers in the developing world due to their risk-aversion. We advance the emergent literature on Bt-cotton and production risk by using panel data methods to control for possible endogeneity of Bt-adoption. We estimate two models, the first a fixed-effects version of the Just and Pope model with additive individual and time effects, and the second a variation of the model in which inputs and variety choice are allowed to affect the variance of the time effect and its correlation with the idiosyncratic error. The models are applied to panel data on smallholder cotton production in India and South Africa. Our results suggest a risk-reducing effect of Bt-cotton in India, but an inconclusive picture in South Africa.
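For readers unfamiliar with the Just and Pope framework, a minimal two-step sketch (not the paper's fixed-effects estimator) follows: the mean production function is estimated first, and the variance (risk) function is then recovered from the squared residuals, so a negative coefficient on the Bt indicator in the second step would suggest a risk-reducing effect.

```python
import numpy as np

def just_pope(X, y):
    """Two-step Just-Pope estimator: mean function, then log-variance function."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)          # mean production function
    resid = y - X1 @ beta
    log_sq = np.log(resid ** 2 + 1e-12)                     # guard against zero residuals
    alpha, *_ = np.linalg.lstsq(X1, log_sq, rcond=None)     # risk (variance) function
    return beta, alpha
```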
Abstract:
This note presents a robust method for estimating response surfaces that consist of linear response regimes and a linear plateau. The linear response-and-plateau model has fascinated production scientists since von Liebig (1855) and, as Upton and Dalton indicated some years ago in this Journal, the response-and-plateau model seems to fit the data in many empirical studies. The estimation algorithm evolves from a Bayesian implementation of a switching-regression (finite mixtures) model and demonstrates routine application of Gibbs sampling and data augmentation, techniques that are now in widespread application in other disciplines.
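The linear response-and-plateau shape can be fitted in many ways; the sketch below uses a simple grid search over the plateau join point as a deliberately simpler stand-in for the Bayesian switching-regression and Gibbs-sampling approach described above.

```python
import numpy as np

def fit_linear_plateau(x, y, n_grid=100):
    """Least-squares fit of y = a + b*min(x, knot), searching the knot on a grid."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    best = None
    for knot in np.linspace(x.min(), x.max(), n_grid):
        z = np.minimum(x, knot)                      # response is flat beyond the knot
        X = np.column_stack([np.ones_like(z), z])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if best is None or sse < best[0]:
            best = (sse, knot, beta)
    return best[1], best[2]                          # plateau point and (intercept, slope)
```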
Abstract:
The assumption that ignoring irrelevant sound in a serial recall situation is identical to ignoring a non-target channel in dichotic listening is challenged. Dichotic listening is open to moderating effects of working memory capacity (Conway et al., 2001) whereas irrelevant sound effects (ISE) are not (Beaman, 2004). A right ear processing bias is apparent in dichotic listening, whereas the bias is to the left ear in the ISE (Hadlington et al., 2004). Positron emission tomography (PET) imaging data (Scott et al., 2004, submitted) show bilateral activation of the superior temporal gyrus (STG) in the presence of intelligible, but ignored, background speech and right hemisphere activation of the STG in the presence of unintelligible background speech. It is suggested that the right STG may be involved in the ISE and a particularly strong left ear effect might occur because of the contralateral connections in audition. It is further suggested that left STG activity is associated with dichotic listening effects and may be influenced by working memory span capacity. The relationship of this functional and neuroanatomical model to known neural correlates of working memory is considered.
Abstract:
A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximal model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule base energy level. A locally regularized orthogonal least squares algorithm, combined with D-optimality for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
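The core of the orthogonal-decomposition step is forward orthogonal least squares selection via Gram-Schmidt; the condensed sketch below shows that step only, omitting the paper's locally regularized cost and D-optimality weighting.

```python
import numpy as np

def forward_ols(P, y, n_terms):
    """Greedy forward selection of columns of P (candidate rules/bases) to explain y."""
    P = np.asarray(P, dtype=float)
    y = np.asarray(y, dtype=float)
    selected, Q = [], []
    for _ in range(n_terms):
        best_err, best_j, best_q = -np.inf, None, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            q = P[:, j].copy()
            for qk in Q:                                   # orthogonalise against chosen terms
                q -= (qk @ P[:, j]) / (qk @ qk + 1e-12) * qk
            err = (q @ y) ** 2 / ((q @ q) * (y @ y) + 1e-12)   # error reduction ratio
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        selected.append(best_j)
        Q.append(best_q)
    return selected
```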
Abstract:
Several theories of the mechanisms linking perception and action require that the links are bidirectional, but there is a lack of consensus on the effects that action has on perception. We investigated this by measuring visual event-related brain potentials to observed hand actions while participants prepared responses that were spatially compatible (e.g., both were on the left side of the body) or incompatible and action type compatible (e.g., both were finger taps) or incompatible, with observed actions. An early enhanced processing of spatially compatible stimuli was observed, which is likely due to spatial attention. This was followed by an attenuation of processing for both spatially and action type compatible stimuli, likely to be driven by efference copy signals that attenuate processing of predicted sensory consequences of actions. Attenuation was not response-modality specific; it was found for manual stimuli when participants prepared manual and vocal responses, in line with the hypothesis that action control is hierarchically organized. These results indicate that spatial attention and forward model prediction mechanisms have opposite, but temporally distinct, effects on perception. This hypothesis can explain the inconsistency of recent findings on action-perception links and thereby supports the view that sensorimotor links are bidirectional. Such effects of action on perception are likely to be crucial, not only for the control of our own actions but also in sociocultural interaction, allowing us to predict the reactions of others to our own actions.
Abstract:
This paper introduces a new fast, effective and practical model structure construction algorithm for a mixture of experts network system utilising only process data. The algorithm is based on a novel forward constrained regression procedure. Given a full set of the experts as potential model bases, the structure construction algorithm, formed on the forward constrained regression procedure, selects the most significant model base one by one so as to minimise the overall system approximation error at each iteration, while the gate parameters in the mixture of experts network system are accordingly adjusted so as to satisfy the convex constraints required in the derivation of the forward constrained regression procedure. The procedure continues until a proper system model is constructed that utilises some or all of the experts. A pruning algorithm of the consequent mixture of experts network system is also derived to generate an overall parsimonious construction algorithm. Numerical examples are provided to demonstrate the effectiveness of the new algorithms. The mixture of experts network framework can be applied to a wide variety of applications ranging from multiple model controller synthesis to multi-sensor data fusion.
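A hedged sketch of the forward constrained regression idea: experts are added one at a time, and at each step the newly selected expert is blended with the current mixture by a convex weight chosen to minimise the squared error. This is a simplified stand-in, not the paper's full gating-parameter update.

```python
import numpy as np

def forward_constrained_mixture(expert_outputs, y, n_select):
    """expert_outputs: matrix with one column of predictions per expert; y: target."""
    y = np.asarray(y, dtype=float)
    mixture = np.zeros_like(y)
    chosen = []
    for _ in range(n_select):
        best = None
        for j in range(expert_outputs.shape[1]):
            f = np.asarray(expert_outputs[:, j], dtype=float)
            d = f - mixture
            denom = d @ d
            # optimal convex blending weight for the new expert, clipped to [0, 1]
            alpha = np.clip((d @ (y - mixture)) / denom, 0.0, 1.0) if denom > 0 else 0.0
            cand = (1 - alpha) * mixture + alpha * f
            sse = np.sum((y - cand) ** 2)
            if best is None or sse < best[0]:
                best = (sse, j, cand)
        chosen.append(best[1])
        mixture = best[2]
    return chosen, mixture
```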
Abstract:
Techniques for modelling urban microclimates and urban block surface temperatures are desired by urban planners and architects for strategic urban design at the early design stages. This paper introduces a simplified mathematical model for urban simulations (UMsim), including urban surface temperatures and microclimates. The nodal network model has been developed by integrating a coupled thermal and airflow model. Direct solar radiation, diffuse radiation, reflected radiation, long-wave radiation, heat convection in air and heat transfer in the exterior walls and ground within the complex have been taken into account. The relevant equations have been solved using the finite difference method on the Matlab platform. Comparisons have been conducted between the data produced by the simulation and data from an urban experimental study carried out in a real architectural complex on the campus of Chongqing University, China, in July 2005 and January 2006. The results show satisfactory agreement between the two sets of data. UMsim can be used to simulate microclimates, in particular the surface temperatures of urban blocks; it can therefore be used to assess the impact of urban surface properties on urban microclimates. UMsim will be able to produce robust data and images of urban environments for sustainable urban design.
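To illustrate the nodal-network, finite-difference approach, the sketch below performs one explicit time step of a surface energy balance for a single node. The coefficients, the time step, and the omission of long-wave exchange are assumptions for illustration, not UMsim's actual formulation.

```python
def step_surface_temperature(T_surf, T_air, T_wall, solar_gain, dt,
                             h_conv=10.0, k_cond=1.5, thickness=0.2,
                             rho_c=2.0e6, absorptivity=0.7):
    """One explicit time step of a per-unit-area surface energy balance."""
    q_solar = absorptivity * solar_gain                 # absorbed shortwave, W/m2
    q_conv = h_conv * (T_air - T_surf)                  # convection with the air node
    q_cond = k_cond / thickness * (T_wall - T_surf)     # conduction to the wall node
    dT = dt * (q_solar + q_conv + q_cond) / (rho_c * thickness)
    return T_surf + dT

# Example: surface at 30 C, air at 28 C, wall interior at 26 C, 600 W/m2 solar, 60 s step
T_next = step_surface_temperature(30.0, 28.0, 26.0, solar_gain=600.0, dt=60.0)
```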