934 results for calibration of rainfall-runoff models
Abstract:
We performed an ensemble of twelve five-year experiments using a coupled climate-carbon-cycle model with scenarios of prescribed atmospheric carbon dioxide concentration; CO2 was instantaneously doubled or quadrupled at the start of the experiments. Within these five years, the climate feedback is not significantly influenced by the effects of climate change on the carbon system. However, rapid changes take place, within much less than a year, due to the physiological effect of CO2 on plant stomatal conductance, leading to an adjustment in the shortwave cloud radiative effect over land through a reduction in low cloud cover. This causes a 10% enhancement of the radiative forcing due to CO2, which increases the equilibrium warming by 0.4 and 0.7 K for doubling and quadrupling, respectively. The implications for the calibration of energy-balance models are discussed.
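The quoted warming increments can be sanity-checked with a back-of-envelope energy-balance calculation. The 3.7 W m^-2 doubling forcing and the ~1 K per W m^-2 climate sensitivity parameter used below are canonical illustrative values assumed for this sketch, not values taken from the paper:

```python
# Zero-dimensional energy-balance sketch: delta_T = lambda * delta_F.
# F_2x and lam are assumed canonical values, not figures from the study.
F_2x = 3.7          # W m^-2, radiative forcing for doubled CO2 (assumed)
lam = 1.0           # K per (W m^-2), climate sensitivity parameter (assumed)
enhancement = 0.10  # the 10% forcing enhancement reported in the abstract

extra_warming_2x = lam * enhancement * F_2x        # doubling
extra_warming_4x = lam * enhancement * (2 * F_2x)  # quadrupling (~twice the forcing)
print(round(extra_warming_2x, 2), round(extra_warming_4x, 2))
```

With these assumed values the sketch yields roughly 0.4 and 0.7 K of extra equilibrium warming, consistent with the figures quoted in the abstract.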
Abstract:
Although the accuracy of digital elevation models (DEMs) can be quantified and measured in different ways, each measure is influenced by three main factors: terrain character, sampling strategy and interpolation method. These factors, and their interactions, are discussed. The generation of DEMs from digitised contours is emphasised because this is the major source of DEMs, particularly within member countries of the OEEPE. Such DEMs often exhibit unwelcome artifacts, depending on the interpolation method employed. The origin and magnitude of these effects, and how they can be reduced to improve the accuracy of the DEMs, are also discussed.
Abstract:
A new method of measuring the total conductivity of atmospheric air is described. It depends on determining the electrical relaxation time of a horizontal wire, mounted between two insulators, which is initially grounded and then allowed to charge freely. The total air conductivity derived is compared with that from an ion mobility spectrometer. Results from the two techniques agreed to within 1.2 fS m(-1). (c) 2006 American Institute of Physics.
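The standard relation underlying such relaxation-time measurements is sigma = epsilon_0 / tau. A minimal sketch of the conversion; the 600 s relaxation time below is an illustrative value, not one reported in the paper:

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, F m^-1

def total_conductivity(tau_seconds):
    """Total air conductivity (S/m) from the electrical relaxation time of the wire."""
    return EPSILON_0 / tau_seconds

# An illustrative relaxation time of ~600 s gives a conductivity of ~14.8 fS/m,
# in the typical range for near-surface atmospheric air.
sigma = total_conductivity(600.0)
print(sigma * 1e15, "fS/m")
```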
Abstract:
Improvements in the resolution of satellite imagery have enabled the extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means of calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR-guided snake algorithm is used to determine an outline of a 2006 flood event on the River Dee, North Wales, UK, from a 12.5 m ERS-1 image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation is recorded as the LiDAR DEM elevation at each point. Approximating the water surface as a plane between the gauged upstream and downstream water elevations, the water surface elevations at points along the flooded extent are compared with their 'expected' values. The errors between the two are roughly normally distributed; however, when plotted against coordinates they show obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing the errors with the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed-data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a t-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this t-test are selected based on their error, and the selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error.
The identification of significant error (RMSE = 0.8 m) between the approximate expected elevations and the actual observed elevations from the remotely sensed data emphasises the limitations of using these data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
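As a rough sketch of the kind of comparison described above, the RMSE between expected and observed elevations and a paired t-statistic between modelled and observed elevations can be computed as follows. The function names and data are hypothetical, and the paper's exact test formulation may differ:

```python
import math
from statistics import mean, stdev

def rmse(observed, expected):
    """Root-mean-square error between two equal-length elevation series (m)."""
    return math.sqrt(mean((o - e) ** 2 for o, e in zip(observed, expected)))

def paired_t(modelled, observed):
    """Paired t-statistic on the elevation differences: mean(d) / (sd(d)/sqrt(n))."""
    d = [m - o for m, o in zip(modelled, observed)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

# Hypothetical elevations (m) at a few flood-margin points:
obs = [10.2, 10.5, 10.1, 9.9]
exp_planar = [10.0, 10.3, 10.4, 10.1]
print(rmse(obs, exp_planar))
```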
Abstract:
Satellite observations of flood events have been used to calibrate and validate flood inundation models, providing valuable information on the spatial extent of the flood. Improvements in the resolution of this satellite imagery have enabled indirect remote sensing of water levels by using an underlying LiDAR DEM to extract the water surface elevation at the flood margin. In addition to comparison of the spatial extent, this now allows direct comparison between modelled and observed water surface elevations. Using a 12.5 m ERS-1 image of a 2006 flood event on the River Dee, North Wales, UK, both of these data types are extracted and each is assessed for its value in the calibration of flood inundation models. A LiDAR-guided snake algorithm is used to extract an outline of the flood from the satellite image. From the extracted outline, a binary grid of wet/dry cells is created at the same resolution as the model, from which the spatial extents of the modelled and observed flood can be compared using a measure of fit between the two binary patterns of flooding. Water heights are extracted at points approximately 100 m apart along the extracted outline, and Student's t-test is used to compare modelled and observed water surface elevations. A LISFLOOD-FP model of the catchment is set up using LiDAR topographic data resampled to the 12.5 m resolution of the satellite image, and calibration of the model's friction parameter is undertaken using each of the two approaches. Comparison between the two approaches highlights the sensitivity of the spatial measure of fit to uncertainty in the observed data and the potential drawbacks of using the spatial extent when parts of the flood are contained by the topography.
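A measure of fit between binary wet/dry patterns is commonly defined as the number of cells wet in both grids divided by the number of cells wet in either. The abstract does not specify the exact measure used, so the sketch below assumes this common form:

```python
def fit_measure(model_wet, obs_wet):
    """Spatial measure of fit between two binary wet/dry grids, given as
    flattened sequences of 0/1 cells.  F = wet-in-both / wet-in-either.
    This is a common choice in the flood-modelling literature; the paper's
    exact measure is an assumption here."""
    both = sum(m and o for m, o in zip(model_wet, obs_wet))
    either = sum(m or o for m, o in zip(model_wet, obs_wet))
    return both / either if either else 1.0  # two all-dry grids agree perfectly

# Hypothetical 1x3 grids: model floods two cells, observation floods one of them.
print(fit_measure([1, 1, 0], [1, 0, 0]))
```

F ranges from 0 (no overlap) to 1 (identical flood extents), which is what makes it convenient for ranking friction-parameter sets.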
Abstract:
Carbendazim is highly toxic to earthworms and is used as a standard control substance when running field-based trials of pesticides, but results using carbendazim are highly variable. In the present study, the impacts on earthworms of the timing of rainfall events following carbendazim application were investigated. Lumbricus terrestris were maintained in soil columns to which carbendazim and then deionized water (a rainfall substitute) were applied. Carbendazim was applied at 4 kg/ha, the rate recommended in pesticide field trials. Three rainfall regimes were investigated: initial and delayed heavy rainfall, 24 h and 6 d after carbendazim application respectively, and frequent rainfall every 48 h. Earthworm mortality and the movement of carbendazim through the soil were assessed 14 d after carbendazim application. No detectable movement of carbendazim occurred through the soil in any of the treatments or controls. Mortality in the initial heavy and frequent rainfall treatments was significantly higher (approximately 55%) than in the delayed rainfall treatment (approximately 25%). This was attributed to reduced bioavailability of carbendazim in the latter treatment, due to a prolonged period of sorption of carbendazim to soil particles before the rainfall events. The impact of carbendazim application on earthworm surface activity was assessed using video cameras. Carbendazim applications significantly reduced surface activity due to avoidance behavior of the earthworms. Surface activity reductions were smallest in the delayed rainfall treatment, again due to the reduced bioavailability of the carbendazim. Given that rainfall events shape the response of earthworms to carbendazim applications, details of rainfall events preceding and following applications during field trials should be recorded at a higher level of resolution than is currently practiced under standard International Organization for Standardization protocols.
Abstract:
This chapter introduces agent-based models (ABMs), their construction, and the pros and cons of their use. Although relatively new, ABMs have great potential for use in ecotoxicological research, their primary advantage being the realistic simulations that can be constructed and, in particular, their explicit handling of space and time. Examples of their use in ecotoxicology are provided, drawn primarily from different implementations of the ALMaSS system. These examples demonstrate how multiple stressors, landscape structure, toxicological detail, animal behavior, and socioeconomic effects can and should be taken into account when constructing simulations for risk assessment. As in ecological systems, the system-level behavior of an ABM is not simply the mean of the component responses but the sum of the often nonlinear interactions between components in the system; hence this modeling approach opens the door to implementing and testing much more realistic and holistic ecotoxicological models than are currently used.
Abstract:
With the current concern over climate change, descriptions of how rainfall patterns are changing over time can be useful. Observations of daily rainfall over the last few decades provide information on these trends. Generalized linear models are typically used to model patterns in the occurrence and intensity of rainfall. These models describe rainfall patterns for an average year but are more limited when describing long-term trends, particularly when these are potentially non-linear. Generalized additive models (GAMs) provide a framework for modelling non-linear relationships by fitting smooth functions to the data. This paper describes how GAMs can extend the flexibility of models describing seasonal patterns and long-term trends in the occurrence and intensity of daily rainfall, using data from Mauritius from 1962 to 2001. Smoothed estimates from the models provide useful graphical descriptions of changing rainfall patterns over the last 40 years at this location. GAMs are particularly helpful when exploring non-linear relationships in the data. Care is needed to ensure the choice of smooth functions is appropriate for the data and modelling objectives. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
The reliable assessment of the quality of protein structural models is fundamental to the progress of structural bioinformatics. The ModFOLD server provides access to two accurate techniques for the global and local prediction of the quality of 3D models of proteins. The first, ModFOLD, is a fast Model Quality Assessment Program (MQAP) used for the global assessment of either single or multiple models. The second, ModFOLDclust, is a more intensive method that carries out clustering of multiple models and provides per-residue local quality assessment.
Abstract:
We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. Two are conventional models, namely a multi-level model and a model based upon an approximate likelihood; the third is a newly developed profile likelihood model, which might be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that with the multi-level approach, in the case of baseline heterogeneity, the number of clusters or components is considerably over-estimated. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide further evidence, two simulation studies were carried out. The profile likelihood can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour compared with the multi-level model. Copyright (C) 2006 John Wiley & Sons, Ltd.
Abstract:
Preparations containing resistant starch type 2 (RS2) and type 3 (RS3) were digested using a batch (a) and a dynamic (b) in vitro model. In addition, indigestible fractions obtained in vivo from ileostomy patients were used (c). These samples were subsequently fermented with human feces using batch and dynamic in vitro methods, and the fermentation supernatants were used to treat Caco-2 cells. Cytotoxicity, anti-genotoxicity against hydrogen peroxide (comet assay) and the effect on barrier function, measured by trans-epithelial electrical resistance, were determined. Dynamically fermented samples led to high cytotoxic activity, probably due to additional compounds added during in vitro fermentation; as a consequence, only batch-fermented samples were investigated further. Batch fermentation of RS resulted in anti-genotoxic activity, with a 9-30% decrease in DNA damage for all samples except RS2-b. It is assumed that changes in RS2 structure caused by dynamic digestion resulted in a different fermentation profile that produced no anti-genotoxic effect. Additionally, in vitro batch fermentation of RS improved integrity across the intestinal barrier by approximately 22% for all samples. We have demonstrated that batch in vitro fermentation of differently pre-digested RS2 and RS3 preparations can inhibit the initiation and promotion stages of colon carcinogenesis in vitro.
Abstract:
Hidden Markov models (HMMs) have been successfully applied to modelling and classification problems from many different areas in recent years. An important step in using HMMs is the initialisation of the model parameters, since the subsequent learning of the HMM's parameters depends on these initial values. This initialisation should take into account knowledge about the problem being addressed, and can also employ optimisation techniques to estimate the best initial parameters under a given cost function and, consequently, the best log-likelihood. This paper proposes initialising Hidden Markov Model parameters using the optimisation algorithm Differential Evolution, with the aim of obtaining the best log-likelihood.
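A minimal sketch of the classic DE/rand/1/bin scheme that could drive such an initialisation. In the paper's setting the cost function would be the negative log-likelihood of the HMM for candidate initial parameters; here a toy quadratic stands in for it, and all control-parameter values are illustrative:

```python
import random

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9, gens=100, seed=0):
    """DE/rand/1/bin minimiser over box constraints `bounds` = [(lo, hi), ...].
    For HMM initialisation, `cost` would evaluate the negative log-likelihood
    of an HMM built from the candidate parameter vector (assumption, not the
    paper's exact setup)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [cost(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct population members, none equal to i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # mutation
                    v = min(max(v, lo), hi)                      # clip to bounds
                else:
                    v = pop[i][j]                                # inherit gene
                trial.append(v)
            tc = cost(trial)
            if tc <= costs[i]:  # greedy selection
                pop[i], costs[i] = trial, tc
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

# Toy usage: minimise a 3-dimensional quadratic in place of an HMM likelihood.
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
```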
Abstract:
This paper presents an approach for automatic classification of pulsed Terahertz (THz), or T-ray, signals, highlighting their potential in biomedical, pharmaceutical and security applications. T-ray classification systems supply a wealth of information about test samples and make possible the discrimination of heterogeneous layers within an object. In this paper, a novel technique involving the use of Auto Regressive (AR) and Auto Regressive Moving Average (ARMA) models on the wavelet transforms of measured T-ray pulse data is presented. Two example applications are examined: the classification of normal human bone (NHB) osteoblasts against human osteosarcoma (HOS) cells, and the identification of six different powder samples. A variety of model types and orders are used to generate descriptive features for subsequent classification. Wavelet-based de-noising with soft threshold shrinkage is applied to the measured T-ray signals prior to modeling. For classification, a simple Mahalanobis distance classifier is used. After feature extraction, classification accuracy for cancerous and normal cell types is 93%, whereas for powders it is 98%.
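The final classification step can be sketched as follows. For simplicity this assumes a diagonal covariance per class, whereas the paper's Mahalanobis classifier may use the full covariance matrix; the feature vectors are hypothetical stand-ins for AR/ARMA coefficients:

```python
import math
from statistics import mean, pvariance

def mahalanobis_diag(x, class_samples):
    """Mahalanobis distance from feature vector x to a class of training
    vectors, assuming a diagonal covariance (a simplification of the full
    Mahalanobis distance used in the paper)."""
    dims = list(zip(*class_samples))               # per-feature columns
    mu = [mean(col) for col in dims]               # per-feature means
    var = [pvariance(col) or 1e-12 for col in dims]  # guard against zero variance
    return math.sqrt(sum((xi - m) ** 2 / v for xi, m, v in zip(x, mu, var)))

def classify(x, classes):
    """Assign x to the class with the smallest Mahalanobis distance."""
    return min(classes, key=lambda name: mahalanobis_diag(x, classes[name]))

# Hypothetical two-feature training sets for two classes:
classes = {
    "NHB": [[0.0, 0.0], [0.1, 0.1], [-0.1, -0.1], [0.05, -0.05]],
    "HOS": [[5.0, 5.0], [5.1, 4.9], [4.9, 5.1], [5.05, 5.0]],
}
print(classify([0.2, 0.1], classes))
```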
Abstract:
Tracer gas techniques have been the most appropriate experimental method for determining airflows and ventilation rates in houses. However, current trends towards reducing greenhouse gas effects have prompted the need for alternative techniques, such as passive sampling. In this research, passive sampling techniques using solutions of volatile organic compounds (VOCs) and solid phase microextraction (SPME) fibres were used to demonstrate their potential to fulfil these requirements. These passive sampling techniques were calibrated against tracer gas decay techniques and against measurements from a standard orifice plate. Two constant sources of volatile organic compounds were diffused into two sections of a humidity chamber and sampled using SPME fibres. Reproducible results were obtained from a total of four SPME fibres (two in each section). Emission rates and air movement from one section to the other were predicted using the algorithms developed. Comparison of the SPME fibre technique with the tracer gas technique and the orifice plate measurements showed similar results with good precision and accuracy. With these fibres, infiltration rates can be measured as time-weighted averages over periods from 10 minutes up to several days, rather than as grab samples. Key words: passive samplers, solid phase microextraction fibre, tracer gas techniques, airflow, air infiltration, houses.
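For reference, the tracer gas decay technique against which the SPME fibres were calibrated infers the air change rate N from exponential concentration decay, C(t) = C0 * exp(-N t). A minimal sketch with illustrative numbers (not measurements from the study):

```python
import math

def air_change_rate(c0, ct, hours):
    """Air change rate (per hour) from tracer-gas concentration decay:
    C(t) = C0 * exp(-N * t)  =>  N = ln(C0 / Ct) / t."""
    return math.log(c0 / ct) / hours

# Illustrative example: tracer concentration halving from 50 ppm to 25 ppm
# over 2 h implies N = ln(2) / 2, roughly 0.35 air changes per hour.
N = air_change_rate(50.0, 25.0, 2.0)
print(N)
```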