128 results for mismatched uncertainties
Abstract:
Although uncertainties in material properties have been addressed in the design of flexible pavements, most current modeling techniques assume that pavement layers are homogeneous. The paper addresses the influence of the spatial variability of the resilient moduli of pavement layers by evaluating the effect of the variance and correlation length on the pavement responses to loading. The integration of the spatially varying log-normal random field with the finite-difference method has been achieved through an exponential autocorrelation function. The variation in the correlation length was found to have a marginal effect on the mean values of the critical strains and a noticeable effect on the standard deviation, which decreases as the correlation length decreases. This reduction in the variance arises from spatial averaging over the softer and stiffer zones generated by spatial variability. The increase in the mean value of critical strains with decreasing correlation length, although minor, illustrates that pavement performance is adversely affected by the presence of spatially varying layers. The study also confirmed that the higher the variability in the pavement layer moduli, introduced through a higher value of the coefficient of variation (COV), the higher the variability in the pavement response. The study concludes that ignoring spatial variability by modeling the pavement layers as homogeneous layers with very short correlation lengths can result in the underestimation of the critical strains and thus an inaccurate assessment of the pavement performance. (C) 2014 American Society of Civil Engineers.
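As an illustration of the simulation ingredient described in this abstract, the sketch below generates a one-dimensional log-normal random field of resilient modulus with an exponential autocorrelation function, via Cholesky factorization of the correlation matrix. The grid size, mean modulus, COV and correlation length are illustrative values, not those of the paper, and the paper's finite-difference pavement response step is not reproduced.

import numpy as np

def lognormal_field_1d(n_cells, dx, mean, cov, corr_length, rng):
    # Sample a 1-D log-normal random field with an exponential autocorrelation.
    # Log-normal parameters from the target mean and coefficient of variation (COV).
    sigma_ln = np.sqrt(np.log(1.0 + cov ** 2))
    mu_ln = np.log(mean) - 0.5 * sigma_ln ** 2

    # Exponential autocorrelation of the underlying Gaussian field.
    x = np.arange(n_cells) * dx
    rho = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)

    # Correlated standard-normal sample via Cholesky factorization (small jitter for stability).
    L = np.linalg.cholesky(rho + 1e-10 * np.eye(n_cells))
    g = L @ rng.standard_normal(n_cells)
    return np.exp(mu_ln + sigma_ln * g)

rng = np.random.default_rng(0)
# Illustrative values: 60 cells of 0.05 m, mean modulus 300 MPa, COV 0.3, correlation length 0.5 m.
E = lognormal_field_1d(60, 0.05, 300.0, 0.3, 0.5, rng)
print(E.mean().round(1), E.std().round(1))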
Abstract:
Developments in the statistical extreme value theory, which allow non-stationary modeling of changes in the frequency and severity of extremes, are explored to analyze changes in return levels of droughts for the Colorado River. The transient future return levels (conditional quantiles) derived from regional drought projections using appropriate extreme value models are compared with those from observed naturalized streamflows. The time of detection is computed as the time at which significant differences exist between the observed and future extreme drought levels, accounting for the uncertainties in their estimates. Projections from multiple climate model-scenario combinations are considered; no uniform pattern of changes in drought quantiles is observed across all the projections. While some projections indicate shifting to another stationary regime, for many projections which are found to be non-stationary, detection of change in tail quantiles of droughts occurs within the 21st century with no unanimity in the time of detection. Earlier detection is observed for drought levels with a higher probability of exceedance. (C) 2014 Elsevier Ltd. All rights reserved.
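The following sketch illustrates the kind of non-stationary extreme value model this abstract refers to: a GEV distribution with a linear trend in the location parameter, fitted by maximum likelihood, from which a time-conditional return level (conditional quantile) can be read off. It uses synthetic block maxima and SciPy; the paper's specific drought variable, model selection and detection-time calculation are not reproduced.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def nonstationary_gev_nll(params, x, t):
    # Negative log-likelihood of a GEV with a linear trend in the location parameter.
    mu0, mu1, log_sigma, xi = params
    mu = mu0 + mu1 * t
    # scipy's genextreme uses shape c = -xi relative to the usual GEV convention.
    return -np.sum(genextreme.logpdf(x, c=-xi, loc=mu, scale=np.exp(log_sigma)))

def conditional_return_level(params, t, T):
    # Return level with exceedance probability 1/T in the year indexed by t.
    mu0, mu1, log_sigma, xi = params
    return genextreme.ppf(1.0 - 1.0 / T, c=-xi, loc=mu0 + mu1 * t, scale=np.exp(log_sigma))

# Illustrative synthetic annual extremes with an imposed trend.
rng = np.random.default_rng(1)
t = np.arange(60)
x = genextreme.rvs(c=-0.1, loc=100 + 0.5 * t, scale=15, random_state=rng)

res = minimize(nonstationary_gev_nll, x0=[x.mean(), 0.0, np.log(x.std()), 0.1],
               args=(x, t), method="Nelder-Mead")
print(round(conditional_return_level(res.x, t=59, T=50), 1))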
Abstract:
High wind poses a number of hazards in different areas such as structural safety, aviation, and wind energy-where low wind speed is also a concern, pollutant transport, to name a few. Therefore, usage of a good prediction tool for wind speed is necessary in these areas. Like many other natural processes, behavior of wind is also associated with considerable uncertainties stemming from different sources. Therefore, to develop a reliable prediction tool for wind speed, these uncertainties should be taken into account. In this work, we propose a probabilistic framework for prediction of wind speed from measured spatio-temporal data. The framework is based on decompositions of spatio-temporal covariance and simulation using these decompositions. A novel simulation method based on a tensor decomposition is used here in this context. The proposed framework is composed of a set of four modules, and the modules have flexibility to accommodate further modifications. This framework is applied on measured data on wind speed in Ireland. Both short-and long-term predictions are addressed.
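The paper's tensor-decomposition simulator is not reproduced here; as a simpler, hedged illustration of simulating from a decomposition of the spatio-temporal covariance, the sketch below uses an eigendecomposition (Karhunen-Loeve style) of an empirical covariance matrix estimated from synthetic records, then draws new realizations that preserve the mean and second-order structure. This is a Gaussian approximation only; measured wind speed is generally non-Gaussian.

import numpy as np

# data: wind-speed records arranged as (n_samples, n_sites * n_times);
# synthetic values stand in here for the measured Irish records.
rng = np.random.default_rng(2)
data = rng.gamma(shape=2.0, scale=3.0, size=(500, 24))

mean = data.mean(axis=0)
cov = np.cov(data, rowvar=False)

# Eigendecomposition of the covariance matrix.
w, V = np.linalg.eigh(cov)
w = np.clip(w, 0.0, None)             # guard against tiny negative eigenvalues
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

k = 10                                # number of retained modes
B = V[:, :k] * np.sqrt(w[:k])         # modal basis scaled by sqrt(eigenvalues)

# Simulate new realizations that approximately reproduce the covariance structure.
xi = rng.standard_normal((1000, k))
simulated = mean + xi @ B.T
print(simulated.shape)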
Abstract:
Advances in forest carbon mapping have the potential to greatly reduce uncertainties in the global carbon budget and to facilitate effective emissions mitigation strategies such as REDD+ (Reducing Emissions from Deforestation and Forest Degradation). Though broad-scale mapping is based primarily on remote sensing data, the accuracy of resulting forest carbon stock estimates depends critically on the quality of field measurements and calibration procedures. The mismatch in spatial scales between field inventory plots and larger pixels of current and planned remote sensing products for forest biomass mapping is of particular concern, as it has the potential to introduce errors, especially if forest biomass shows strong local spatial variation. Here, we used 30 large (8-50 ha) globally distributed permanent forest plots to quantify the spatial variability in aboveground biomass density (AGBD, in Mg ha(-1)) at spatial scales ranging from 5 to 250 m (0.025-6.25 ha), and to evaluate the implications of this variability for calibrating remote sensing products using simulated remote sensing footprints. We found that local spatial variability in AGBD is large for standard plot sizes, averaging 46.3% for replicate 0.1 ha subplots within a single large plot, and 16.6% for 1 ha subplots. AGBD showed weak spatial autocorrelation at distances of 20-400 m, with autocorrelation higher in sites with higher topographic variability and statistically significant in half of the sites. We further show that when field calibration plots are smaller than the remote sensing pixels, the high local spatial variability in AGBD leads to a substantial "dilution" bias in calibration parameters, a bias that cannot be removed with standard statistical methods. Our results suggest that topography should be explicitly accounted for in future sampling strategies and that much care must be taken in designing calibration schemes if remote sensing of forest carbon is to achieve its promise.
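A minimal sketch of the subplot-aggregation calculation behind the quoted variability figures: starting from a gridded AGBD map of a single large plot (synthetic values here, not data from the 30 study plots), it aggregates cells into blocks of increasing size and reports the coefficient of variation among blocks.

import numpy as np

def cv_at_scale(agbd_grid, cell_size_m, block_m):
    # Coefficient of variation of mean AGBD over square blocks of side block_m.
    n = block_m // cell_size_m
    rows = (agbd_grid.shape[0] // n) * n
    cols = (agbd_grid.shape[1] // n) * n
    trimmed = agbd_grid[:rows, :cols]
    blocks = trimmed.reshape(rows // n, n, cols // n, n).mean(axis=(1, 3))
    return blocks.std(ddof=1) / blocks.mean()

# Illustrative 25 ha "plot" on a 20 m grid (toy log-normal AGBD values, Mg/ha).
rng = np.random.default_rng(3)
agbd = rng.lognormal(mean=np.log(250), sigma=0.5, size=(25, 25))

for block in (20, 100):   # roughly 0.04 ha and 1 ha blocks
    print(block, "m blocks:", round(100 * cv_at_scale(agbd, 20, block), 1), "% CV")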
Abstract:
The estimation of water and solute transit times in catchments is crucial for predicting the response of hydrosystems to external forcings (climatic or anthropogenic). The hydrogeochemical signatures of tracers (either natural or anthropogenic) in streams have been widely used to estimate transit times in catchments as they integrate the various processes at stake. However, most of these tracers are well suited for catchments with mean transit times lower than about 4-5 years. Since the second half of the 20th century, the intensification of agriculture has led to a general increase of the nitrogen load in rivers. As nitrate is mainly transported by groundwater in agricultural catchments, this signal can be used to estimate transit times greater than several years, even if nitrate is not a conservative tracer. Conceptual hydrological models can be used to estimate catchment transit times provided their consistency is demonstrated, based on their ability to simulate the stream chemical signatures at various time scales and catchment internal processes such as N storage in groundwater. The objective of this study was to assess whether a conceptual lumped model was able to simulate the observed patterns of nitrogen concentration at various time scales, from seasonal to pluriannual, and thus whether it was relevant for estimating the nitrogen transit times in headwater catchments. A conceptual lumped model, representing shallow groundwater flow as two parallel linear stores with double porosity, and riparian processes by a constant nitrogen removal function, was applied to two paired agricultural catchments which belong to the Research Observatory ORE AgrHys. The Generalized Likelihood Uncertainty Estimation (GLUE) approach was used to estimate parameter values and uncertainties. The model performance was assessed on (i) its ability to simulate the contrasted patterns of stream flow and stream nitrate concentrations at seasonal and inter-annual time scales, (ii) its ability to simulate the patterns observed in groundwater at the same temporal scales, and (iii) the consistency of long-term simulations using the calibrated model with the general pattern of the nitrate concentration increase in the region since the beginning of the intensification of agriculture in the 1960s. The simulated nitrate transit times were found to be more sensitive to climate variability than to parameter uncertainty, and average values were found to be consistent with results from other studies in the same region involving modeling and groundwater dating. This study shows that a simple model can be used to simulate the main dynamics of nitrogen in an intensively polluted catchment and then be used to estimate the transit times of these pollutants in the system, which is crucial to guide the design and assessment of mitigation plans. (C) 2015 Elsevier B.V. All rights reserved.
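A schematic GLUE loop consistent with the description above is sketched below. The hydrological model (run_model) and the prior sampler are placeholders for the two-store double-porosity model and its parameter ranges, which are not reproduced; the Nash-Sutcliffe efficiency is used here as the informal likelihood measure, and the 0.5 behavioural threshold is illustrative.

import numpy as np

def nse(sim, obs):
    # Nash-Sutcliffe efficiency, used here as the informal GLUE likelihood measure.
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def glue(run_model, obs, prior_sampler, n_sets=2000, threshold=0.5, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    params = prior_sampler(n_sets, rng)               # (n_sets, n_params) Monte Carlo sample
    sims = np.array([run_model(p) for p in params])   # one simulated series per parameter set
    scores = np.array([nse(s, obs) for s in sims])

    keep = scores > threshold                         # retain "behavioural" parameter sets only
    weights = scores[keep] - threshold
    weights = weights / weights.sum()

    # Likelihood-weighted 5-95% prediction bounds at each time step.
    lower, upper = [], []
    for t in range(obs.size):
        order = np.argsort(sims[keep, t])
        vals, cdf = sims[keep, t][order], np.cumsum(weights[order])
        lower.append(vals[np.searchsorted(cdf, 0.05)])
        upper.append(vals[np.searchsorted(cdf, 0.95)])
    return params[keep], np.array(lower), np.array(upper)

# Toy exponential-recession model and uniform priors standing in for the paper's
# two-store double-porosity model and its parameter ranges (both hypothetical here).
def run_model(p, n_time=100):
    k, scale = p
    t = np.arange(n_time)
    return scale * np.exp(-k * (t % 20))

def prior_sampler(n, rng):
    return np.column_stack([rng.uniform(0.05, 0.5, n), rng.uniform(0.5, 2.0, n)])

rng = np.random.default_rng(6)
obs = run_model([0.2, 1.0]) + 0.05 * rng.standard_normal(100)
beh_params, lower, upper = glue(run_model, obs, prior_sampler, rng=rng)
print(beh_params.shape, lower[:3].round(2), upper[:3].round(2))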
Abstract:
Quantifying the distributional behavior of extreme events is crucial in hydrologic design. Intensity-Duration-Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. Major sources of uncertainty in the IDF relationships are insufficient quantity and quality of data, which lead to parameter uncertainty in the distribution fitted to the data, and the uncertainty that results from using multiple GCMs. It is important to study these uncertainties and propagate them to the future for an accurate assessment of future return levels. The objective of this study is to quantify the uncertainties arising from the parameters of the distribution fitted to the data and from the multiple GCM models using a Bayesian approach. The posterior distribution of the parameters is obtained from Bayes' rule, and the parameters are transformed to obtain return levels for a specified return period. The Markov Chain Monte Carlo (MCMC) method using the Metropolis-Hastings algorithm is used to obtain the posterior distribution of parameters. Twenty-six CMIP5 GCMs along with four RCP scenarios are considered for studying the effects of climate change and to obtain projected IDF relationships for the case study of Bangalore city in India. GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique along with the parameter uncertainty. Scale invariance theory is employed for obtaining short duration return levels from daily data. It is observed that the uncertainty in short duration rainfall return levels is high when compared to the longer durations. Further, it is observed that parameter uncertainty is large compared to the model uncertainty. (C) 2015 Elsevier Ltd. All rights reserved.
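The sketch below illustrates the Bayesian step described above: a random-walk Metropolis-Hastings sampler for the parameters of a GEV distribution fitted to annual maximum rainfall, from which a posterior sample of the T-year return level is obtained. Priors, step sizes and chain length are illustrative; the REA weighting of GCMs and the scale-invariance disaggregation to short durations are not shown.

import numpy as np
from scipy.stats import genextreme

def log_posterior(theta, x):
    # Log-posterior of GEV parameters (mu, log sigma, xi) with weak normal priors.
    mu, log_sigma, xi = theta
    loglik = np.sum(genextreme.logpdf(x, c=-xi, loc=mu, scale=np.exp(log_sigma)))
    logprior = -0.5 * (mu / 1e3) ** 2 - 0.5 * (log_sigma / 10.0) ** 2 - 0.5 * (xi / 0.5) ** 2
    return loglik + logprior

def metropolis_hastings(x, n_iter=20000, step=(2.0, 0.05, 0.05), seed=0):
    rng = np.random.default_rng(seed)
    step = np.asarray(step)
    theta = np.array([x.mean(), np.log(x.std()), 0.1])    # starting point
    lp = log_posterior(theta, x)
    chain = []
    for _ in range(n_iter):
        proposal = theta + step * rng.standard_normal(3)  # random-walk proposal
        lp_prop = log_posterior(proposal, x)
        if np.log(rng.uniform()) < lp_prop - lp:          # Metropolis accept/reject
            theta, lp = proposal, lp_prop
        chain.append(theta.copy())
    return np.array(chain[n_iter // 2:])                  # discard burn-in

def return_levels(chain, T):
    # Posterior sample of the T-year return level from the retained parameter draws.
    mu, log_sigma, xi = chain.T
    return genextreme.ppf(1.0 - 1.0 / T, c=-xi, loc=mu, scale=np.exp(log_sigma))

# Synthetic annual maximum daily rainfall (mm) standing in for an observed record.
rng = np.random.default_rng(7)
annual_max = genextreme.rvs(c=-0.1, loc=80.0, scale=20.0, size=50, random_state=rng)
posterior = metropolis_hastings(annual_max)
rl100 = return_levels(posterior, T=100)
print(np.percentile(rl100, [2.5, 50, 97.5]).round(1))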
Abstract:
Most of the cities in India have been undergoing rapid development in recent decades, and many rural localities are being transformed into urban hotspots. These developments bring associated land use/land cover (LULC) changes that affect the runoff response of catchments, which is often evident in the form of increased runoff peaks, volume and velocity in the drain network. Often, most of the existing storm water drains are in a dilapidated state owing to improper maintenance or inadequate design. The drains are conventionally designed using procedures that are based on some anticipated future conditions. Further, values of parameters/variables associated with the design of the network are traditionally considered to be deterministic. In reality, however, the parameters/variables have uncertainty due to natural and/or inherent randomness. There is a need to consider these uncertainties in designing a storm water drain network that can effectively convey the discharge. The present study evaluates the performance of an existing storm water drain network in Bangalore, India, through reliability analysis by the Advanced First-Order Second-Moment (AFOSM) method. In the reliability analysis, the parameters considered to be random variables are the roughness coefficient, slope and conduit dimensions. Performance of the existing network is evaluated considering three failure modes. The first failure mode occurs when runoff exceeds the capacity of the storm water drain network; the second failure mode occurs when the actual flow velocity in the storm water drain network exceeds the maximum allowable velocity for erosion control; and the third failure mode occurs when the minimum flow velocity is less than the minimum allowable velocity for deposition control. In the analysis, runoff generated from subcatchments of the study area and flow velocities in the storm water drains are estimated using the Storm Water Management Model (SWMM). Results from the study are presented and discussed. The reliability values are low under the three failure modes, indicating a need to redesign several of the conduits to improve their reliability. This study finds use in devising plans for expansion of the Bangalore storm water drain system. (C) 2015 The Authors. Published by Elsevier B.V.
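As an illustration of the AFOSM calculation, the sketch below computes the Hasofer-Lind reliability index for a capacity-type failure mode (conduit capacity less than a design discharge) using Manning's equation for a rectangular conduit, treating roughness, slope and conduit dimensions as independent normal random variables. The statistics and design discharge are illustrative, and the SWMM-based runoff and velocity estimates of the study are not reproduced.

import numpy as np

def capacity(x):
    # Manning capacity (m^3/s) of a rectangular conduit: x = [n, S, b, h].
    n, S, b, h = x
    A, P = b * h, b + 2 * h
    return (1.0 / n) * A * (A / P) ** (2.0 / 3.0) * np.sqrt(S)

def g(x, q_design):
    return capacity(x) - q_design          # failure when capacity < design discharge

def num_grad(f, x, eps=1e-6):
    out = np.zeros_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x); dx[i] = eps
        out[i] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return out

def afosm_beta(mu, sigma, q_design, n_iter=50):
    # Hasofer-Lind reliability index for independent normal variables (AFOSM).
    u = np.zeros_like(mu)
    for _ in range(n_iter):
        x = mu + sigma * u                                        # back to physical space
        gx = g(x, q_design)
        gu = num_grad(lambda xx: g(xx, q_design), x) * sigma      # chain rule dG/du
        u = (gu @ u - gx) / (gu @ gu) * gu                        # design-point iteration (HL-RF)
    return np.linalg.norm(u)

# Illustrative means and standard deviations for [Manning n, slope, width (m), depth (m)].
mu = np.array([0.015, 0.002, 1.2, 1.0])
sigma = np.array([0.002, 0.0004, 0.05, 0.05])
print("reliability index beta =", round(afosm_beta(mu, sigma, q_design=1.5), 2))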
Abstract:
We explore the prospects for observing CP violation in the minimal supersymmetric extension of the Standard Model (MSSM) with six CP-violating parameters, three gaugino mass phases and three phases in trilinear soft supersymmetry-breaking parameters, using the CPsuperH code combined with a geometric approach to maximise CP-violating observables subject to the experimental upper bounds on electric dipole moments. We also implement CP-conserving constraints from Higgs physics, flavour physics and the upper limits on the cosmological dark matter density and spin-independent scattering. We study possible values of observables within the constrained MSSM (CMSSM), the non-universal Higgs model (NUHM), the CPX scenario and a variant of the phenomenological MSSM (pMSSM). We find values of the CP-violating asymmetry A(CP) in b -> s gamma decay that may be as large as 3%, so future measurements of A(CP) may provide independent information about CP violation in the MSSM. We find that CP-violating MSSM contributions to the B-s meson mass mixing term Delta M-Bs are in general below the present upper limit, which is dominated by theoretical uncertainties. If these could be reduced, Delta M-Bs could also provide an interesting and complementary constraint on the six CP-violating MSSM phases, enabling them all to be determined experimentally, in principle. We also find that CP violation in the h(2,3) tau(+) tau(-) and h(2,3) t tbar couplings can be quite large, and so may offer interesting prospects for future pp, e(+) e(-), mu(+) mu(-) and gamma gamma colliders.
Abstract:
Climate change is most likely to introduce an additional stress to already stressed water systems in developing countries. Climate change is inherently linked with the hydrological cycle and is expected to cause significant alterations in regional water resources systems, necessitating measures for adaptation and mitigation. Increasing temperatures, for example, are likely to change precipitation patterns, resulting in alterations of regional water availability, the evapotranspirative water demand of crops and vegetation, extremes of floods and droughts, and water quality. A comprehensive assessment of regional hydrological impacts of climate change is thus necessary. Global climate model simulations provide future projections of the climate system taking into consideration changes in external forcings, such as atmospheric carbon dioxide and aerosols, especially those resulting from anthropogenic emissions. However, such simulations are typically run at a coarse scale and are not equipped to reproduce regional hydrological processes. This paper summarizes recent research on the assessment of climate change impacts on regional hydrology, addressing the scale and physical-process mismatch issues. Particular attention is given to changes in water availability, irrigation demands and water quality. This paper also includes a description of the methodologies developed to address uncertainties in the projections resulting from incomplete knowledge about the future evolution of human-induced emissions and from using multiple climate models. Approaches for investigating possible causes of historically observed changes in regional hydrological variables are also discussed. Illustrations of all the above-mentioned methods are provided for Indian regions with a view to specifically aiding water management in India.
Abstract:
Bioenergy deployment offers significant potential for climate change mitigation, but also carries considerable risks. In this review, we bring together perspectives of various communities involved in the research and regulation of bioenergy deployment in the context of climate change mitigation: land-use and energy experts, land-use and integrated assessment modelers, human geographers, ecosystem researchers, climate scientists and two different strands of life-cycle assessment experts. We summarize technological options, outline the state-of-the-art knowledge on various climate effects, provide an update on estimates of technical resource potential and comprehensively identify sustainability effects. Cellulosic feedstocks, increased end-use efficiency, improved land carbon-stock management and residue use, and, when fully developed, BECCS appear as the most promising options, depending on development costs, implementation, learning, and risk management. Combined heat and power, efficient biomass cookstoves and small-scale power generation for rural areas can help to promote energy access and sustainable development, along with reduced emissions. We estimate the sustainable technical potential as up to 100 EJ: high agreement; 100-300 EJ: medium agreement; above 300 EJ: low agreement. Stabilization scenarios indicate that bioenergy may supply from 10 to 245 EJ yr(-1) to global primary energy supply by 2050. Models indicate that, if technological and governance preconditions are met, large-scale deployment (>200 EJ), together with BECCS, could help to keep global warming below 2 degrees C of preindustrial levels; but such high deployment of land-intensive bioenergy feedstocks could also lead to detrimental climate effects and negatively impact ecosystems, biodiversity and livelihoods. The integration of bioenergy systems into agriculture and forest landscapes can improve land and water use efficiency and help address concerns about environmental impacts. We conclude that the high variability in pathways, uncertainties in technological development and ambiguity in political decisions render forecasts on deployment levels and climate effects very difficult. However, uncertainty about projections should not preclude pursuing beneficial bioenergy options.
Abstract:
Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative alpha-entropies (denoted I-alpha), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative alpha-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimizers of these relative alpha-entropies on closed and convex sets are shown to exist. Such minimizations generalize the maximum Renyi or Tsallis entropy principle. The minimizing probability distribution (termed forward I-alpha-projection) for a linear family is shown to obey a power-law. Other results in connection with statistical inference, namely subspace transitivity and iterated projections, are also established. In a companion paper, a related minimization problem of interest in robust statistics that leads to a reverse I-alpha-projection is studied.
Abstract:
Groundwater management involves conflicting objectives, as maximization of discharge contradicts the criteria of minimum pumping cost and minimum piping cost. In addition, the available data contain uncertainties such as market fluctuations, variations in the water levels of wells and variations in groundwater policies. A fuzzy model is to be evolved to tackle the uncertainties, and a multiobjective optimization is to be conducted to simultaneously satisfy the contradicting objectives. Towards this end, a multiobjective fuzzy optimization model is evolved. To obtain the upper and lower bounds of the individual objectives, particle swarm optimization (PSO) is adopted. The analytic element method (AEM) is employed to obtain the operating potentiometric head. In this study, a multiobjective fuzzy optimization model considering three conflicting objectives is developed using the PSO and AEM methods for obtaining a sustainable groundwater management policy. The developed model is applied to a case study, and it is demonstrated that the compromise solution satisfies all the objectives with adequate levels of satisfaction. Sensitivity analysis is carried out by varying the parameters, and it is shown that the effect of any such variation is quite significant. Copyright (c) 2015 John Wiley & Sons, Ltd.
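A minimal particle swarm optimizer of the kind used to bracket the individual objectives is sketched below. The objective shown is a toy pumping-cost surrogate; the AEM groundwater model, the fuzzy membership functions and the multiobjective compromise step of the paper are not reproduced.

import numpy as np

def pso_minimize(f, lb, ub, n_particles=40, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    # Minimal particle swarm optimizer, used here to bracket one objective at a time.
    rng = np.random.default_rng(seed)
    dim = lb.size
    x = rng.uniform(lb, ub, size=(n_particles, dim))          # particle positions
    v = np.zeros_like(x)                                      # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lb, ub)                            # keep particles within bounds
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Toy pumping-cost surrogate in place of the AEM-based groundwater model (hypothetical).
cost = lambda q: np.sum((q - 3.0) ** 2) + 10.0
best_q, best_cost = pso_minimize(cost, lb=np.zeros(4), ub=np.full(4, 10.0))
print(best_q.round(2), round(best_cost, 2))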
Abstract:
Quantifying the isolated and integrated impacts of land use (LU) and climate change on streamflow is challenging as well as crucial to optimally manage water resources in river basins. This paper presents a simple hydrologic modeling-based approach to segregate the impacts of land use and climate change on the streamflow of a river basin. The upper Ganga basin (UGB) in India is selected as the case study to carry out the analysis. Streamflow in the river basin is modeled using a calibrated variable infiltration capacity (VIC) hydrologic model. The approach involves the development of three scenarios to understand the influence of land use and climate on streamflow. The first scenario assesses the sensitivity of streamflow to land use changes under invariant climate. The second scenario determines the change in streamflow due to change in climate assuming constant land use. The third scenario estimates the combined effect of changing land use and climate on the streamflow of the basin. Based on the results obtained from the three scenarios, quantification of the isolated impacts of land use and climate change on streamflow is addressed. Future projections of climate are obtained from dynamically downscaled simulations of six general circulation models (GCMs) available from the Coordinated Regional Downscaling Experiment (CORDEX) project. Uncertainties associated with the GCMs and emission scenarios are quantified in the analysis. Results for the case study indicate that streamflow is highly sensitive to change in urban areas and moderately sensitive to change in cropland areas. However, variations in streamflow generally reproduce the variations in precipitation. The combined effect of land use and climate on streamflow is observed to be more pronounced than their individual impacts in the basin. It is observed from the isolated effects of land use and climate change that climate has the more dominant impact on streamflow in the region. The approach proposed in this paper is applicable to any river basin to isolate the impacts of land use change and climate change on streamflow.
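The scenario bookkeeping described above can be summarized with simple differencing, as in the sketch below; the streamflow values are hypothetical placeholders for model output, not results from the paper.

# Scenario bookkeeping for isolating land-use and climate impacts on mean streamflow.
# The Q_* values are hypothetical basin-average flows (m^3/s) from a calibrated
# hydrologic model such as VIC, not results from the paper.
Q_base     = 520.0   # baseline land use, baseline climate
Q_lu       = 548.0   # changed land use, baseline climate        (scenario 1)
Q_clim     = 472.0   # baseline land use, changed climate        (scenario 2)
Q_combined = 498.0   # changed land use and changed climate      (scenario 3)

delta_lu = Q_lu - Q_base                      # isolated land-use effect
delta_clim = Q_clim - Q_base                  # isolated climate effect
delta_combined = Q_combined - Q_base          # combined effect
interaction = delta_combined - (delta_lu + delta_clim)
print(delta_lu, delta_clim, delta_combined, interaction)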
Abstract:
The spatial error structure of daily precipitation derived from the latest version 7 (v7) Tropical Rainfall Measuring Mission (TRMM) level 2 data products is studied through comparison with the Asian Precipitation Highly Resolved Observational Data Integration Toward Evaluation of the Water Resources (APHRODITE) data over a subtropical region of the Indian subcontinent for the seasonal rainfall over 6 years from June 2002 to September 2007. The data products examined include v7 data from the TRMM Microwave Imager (TMI) radiometer and the precipitation radar (PR), namely, 2A12, 2A25, and 2B31 (combined data from PR and TMI). The spatial distribution of uncertainty from these data products was quantified based on performance metrics derived from the contingency table. For the seasonal daily precipitation over a subtropical basin in India, the 2A12 data product showed greater skill in detecting and quantifying the volume of rainfall when compared with the 2A25 and 2B31 data products. Error characterization using various error models revealed that random errors from multiplicative error models were homoscedastic and that they better represented rainfall estimates from the 2A12 algorithm. Error decomposition techniques performed to disentangle systematic and random errors verify that the multiplicative error model representing rainfall from the 2A12 algorithm successfully estimated a greater percentage of systematic error than the 2A25 or 2B31 algorithms. Results verify that although the radiometer-derived 2A12 rainfall data are known to suffer from many sources of uncertainty, spatial analysis over the case study region of India testifies that the 2A12 rainfall estimates are in very good agreement with the reference estimates for the data period considered.
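The contingency-table performance metrics mentioned above can be computed as in the sketch below, where daily satellite and reference rainfall series are classified as rain/no-rain at a threshold. The series are synthetic stand-ins for the gridded TRMM and APHRODITE data, and the multiplicative error modeling and error decomposition steps are not reproduced.

import numpy as np

def contingency_metrics(satellite, reference, threshold=1.0):
    # Detection skill of a satellite daily-rainfall product against a reference,
    # using a rain/no-rain threshold in mm/day.
    sat_rain = satellite >= threshold
    ref_rain = reference >= threshold

    hits = np.sum(sat_rain & ref_rain)
    false_alarms = np.sum(sat_rain & ~ref_rain)
    misses = np.sum(~sat_rain & ref_rain)

    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    csi = hits / (hits + misses + false_alarms)     # critical success index
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    return dict(POD=pod, FAR=far, CSI=csi, bias=bias)

# Synthetic daily series standing in for satellite and reference rainfall.
rng = np.random.default_rng(4)
ref = rng.gamma(0.6, 8.0, size=2000) * (rng.random(2000) < 0.45)
sat = ref * rng.lognormal(0.0, 0.5, size=2000) + (rng.random(2000) < 0.05) * rng.gamma(0.5, 5.0, 2000)
print(contingency_metrics(sat, ref))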
Abstract:
Response analysis of a linear structure with uncertainties in both structural parameters and external excitation is considered here. When such an analysis is carried out using the spectral stochastic finite element method (SSFEM), the computational cost often tends to be prohibitive due to the rapid growth of the number of spectral bases with the number of random variables and the order of expansion. For instance, if the excitation contains a random frequency, or if it is a general random process, then a good approximation of these excitations using polynomial chaos expansion (PCE) involves a large number of terms, which leads to very high cost. To address this issue of high computational cost, a hybrid method is proposed in this work. In this method, first the random eigenvalue problem is solved using the weak formulation of SSFEM, which involves solving a system of deterministic nonlinear algebraic equations to estimate the PCE coefficients of the random eigenvalues and eigenvectors. Then the response is estimated using a Monte Carlo (MC) simulation, where the modal bases are sampled from the PCE of the random eigenvectors estimated in the previous step, followed by a numerical time integration. It is observed through numerical studies that the proposed method successfully reduces the computational burden compared with either a pure SSFEM or a pure MC simulation, and is more accurate than a perturbation method. The computational gain improves as the problem size in terms of degrees of freedom grows. It also improves as the time span of interest reduces.
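A schematic of the second stage of the hybrid method (sampling modal quantities from their PCE and handing them to a deterministic modal time integration) is sketched below, assuming hypothetical PCE coefficients in a single-germ Hermite chaos. The weak-form SSFEM solve that would actually produce these coefficients, and the full multivariate chaos of the paper, are not reproduced.

import numpy as np

def hermite_basis(xi, order):
    # Probabilists' Hermite polynomial chaos basis in a single standard-normal germ xi.
    psi = [np.ones_like(xi), xi]
    for n in range(1, order):
        psi.append(xi * psi[n] - n * psi[n - 1])      # He_{n+1} = xi He_n - n He_{n-1}
    return np.stack(psi[: order + 1])

def sample_modes(lam_coeffs, phi_coeffs, n_samples, rng):
    # Sample eigenvalues / eigenvectors from their PCE (coefficients assumed known).
    order = lam_coeffs.shape[1] - 1
    xi = rng.standard_normal(n_samples)
    psi = hermite_basis(xi, order)                    # shape (order+1, n_samples)
    lam = lam_coeffs @ psi                            # (n_modes, n_samples)
    phi = np.einsum("mdk,ks->mds", phi_coeffs, psi)   # (n_modes, n_dof, n_samples)
    return lam, phi

# Hypothetical PCE coefficients for a 2-DOF system, 2 modes, chaos order 2
# (in the method of the paper these would come from the weak-form random eigenvalue solve).
lam_c = np.array([[90.0, 8.0, 0.5], [400.0, 25.0, 1.5]])
phi_c = np.array([[[0.6, 0.02, 0.0], [0.8, -0.02, 0.0]],
                  [[0.8, 0.03, 0.0], [-0.6, 0.01, 0.0]]])

rng = np.random.default_rng(5)
lam, phi = sample_modes(lam_c, phi_c, n_samples=1000, rng=rng)
# Each pair (lam[:, s], phi[:, :, s]) would then feed a deterministic modal
# time integration (e.g., Newmark) under the sampled excitation.
print(lam.mean(axis=1).round(1), phi.shape)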