189 results for Köppen climate classification

at Indian Institute of Science - Bangalore - India


Relevance: 30.00%

Abstract:

Downscaling to station-scale hydrologic variables from large-scale atmospheric variables simulated by general circulation models (GCMs) is usually necessary to assess the hydrologic impact of climate change. This work presents CRF-downscaling, a new probabilistic downscaling method that represents the daily precipitation sequence as a conditional random field (CRF). The conditional distribution of the precipitation sequence at a site, given the daily atmospheric (large-scale) variable sequence, is modeled as a linear chain CRF. CRFs make no independence assumptions on the observations, which gives them flexibility in using high-dimensional feature vectors. Maximum likelihood parameter estimation for the model is performed using limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization. Maximum a posteriori estimation is used to determine the most likely precipitation sequence for a given set of atmospheric input variables using the Viterbi algorithm. Direct classification of dry/wet days, as well as estimation of precipitation amounts, is achieved within a single modeling framework. The model is used to project the future cumulative distribution function of precipitation. Uncertainty in precipitation prediction is addressed through a modified Viterbi algorithm that predicts the n most likely sequences. The model is applied for downscaling monsoon (June-September) daily precipitation at eight sites in the Mahanadi basin in Orissa, India, using the MIROC3.2 medium-resolution GCM. The predicted distributions at all sites show an increase in the number of wet days and in wet-day precipitation amounts. A comparison of current and future predicted probability density functions for daily precipitation shows a change in the shape of the density function, with decreasing probability of lower precipitation and increasing probability of higher precipitation.
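The Viterbi step described above can be sketched for a toy two-state (dry = 0, wet = 1) linear chain. The emission and transition scores below are illustrative stand-ins, not the paper's fitted CRF feature weights:

```python
import numpy as np

def viterbi(emission, transition):
    """Most likely state sequence for a linear chain.
    emission: (T, S) log-scores of each state per time step;
    transition: (S, S) log-scores of moving between states."""
    T, S = emission.shape
    score = np.zeros((T, S))
    back = np.zeros((T, S), dtype=int)
    score[0] = emission[0]
    for t in range(1, T):
        # cand[prev, cur]: score of ending in `cur` coming from `prev`
        cand = score[t - 1][:, None] + transition + emission[t][None, :]
        back[t] = np.argmax(cand, axis=0)
        score[t] = np.max(cand, axis=0)
    # trace back the best path from the best final state
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# toy dry(0)/wet(1) example with persistent weather (log-probabilities)
emission = np.log(np.array([[0.9, 0.1], [0.6, 0.4], [0.2, 0.8], [0.3, 0.7]]))
transition = np.log(np.array([[0.7, 0.3], [0.3, 0.7]]))
path = viterbi(emission, transition)
```

For the n most likely sequences mentioned above, the backtracking would keep the top n candidates per state instead of a single one.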

Relevance: 30.00%

Abstract:

Representation and quantification of uncertainty in climate change impact studies are difficult tasks. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D-S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules with different approaches to modeling conflict. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications.
These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
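Dempster's rule of combination, the core of the D-S evidence synthesis described above, can be sketched as follows. The two basic probability assignments over {drought, normal, wet} are invented for illustration and do not come from the study's SSFI-4 data:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments
    (dicts mapping frozensets of hypotheses to mass)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # normalize by 1 - K, redistributing the conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# toy bpas from two "GCMs" over the frame {drought, normal, wet}
D, N, W = "drought", "normal", "wet"
m1 = {frozenset({D}): 0.6, frozenset({D, N}): 0.3, frozenset({D, N, W}): 0.1}
m2 = {frozenset({D}): 0.5, frozenset({N, W}): 0.4, frozenset({D, N, W}): 0.1}
m = dempster_combine(m1, m2)
belief_drought = m.get(frozenset({D}), 0.0)
```

The conflict mass K is the total product mass on the empty set; the classical rule redistributes it by normalizing with 1 - K, which is exactly the behaviour that the alternative combination rules mentioned above modify.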

Relevance: 20.00%

Abstract:

Forests play a critical role in addressing climate change concerns in the broader context of global change and sustainable development. Forests are linked to climate change in three ways: i) forests are a source of greenhouse gas (GHG) emissions; ii) forests offer mitigation opportunities to stabilise GHG concentrations; iii) forests are impacted by climate change. This paper reviews studies related to climate change and forests in India: first, the studies estimating carbon inventory for the Indian land use change and forestry (LUCF) sector, then the different models and mitigation potential estimates for the LUCF sector in India. Finally, it reviews the studies on the impact of climate change on forest ecosystems in India, identifying the implications for net primary productivity and biodiversity. The paper highlights data, modelling and research gaps relevant to the GHG inventory, mitigation potential, and vulnerability and impact assessments for the forest sector in India.

Relevance: 20.00%

Abstract:

The accelerated rate of increase in atmospheric CO2 concentration in recent years has revived the idea of stabilizing the global climate through geoengineering schemes. The majority of the proposed geoengineering schemes attempt to reduce the amount of solar radiation absorbed by our planet. Climate modelling studies of these so-called 'sunshade geoengineering schemes' show that global warming from increasing concentrations of CO2 can be mitigated by intentionally manipulating the amount of sunlight absorbed by the climate system. These studies also suggest that the residual changes could be large on regional scales, so that climate change may not be mitigated on a local basis. More recent modelling studies have shown that these schemes could lead to a slow-down in the global hydrological cycle. Other problems, such as changes in the terrestrial carbon cycle and ocean acidification, remain unsolved by sunshade geoengineering schemes. In this article, I review the proposed geoengineering schemes and results from climate models, and discuss why geoengineering is not the best option to deal with climate change.

Relevance: 20.00%

Abstract:

Remote sensing provides a lucid and effective means of crop coverage identification. Crop coverage identification is an important technique, as it provides vital information on the type and extent of crops cultivated in a particular area. This information has immense potential in planning further cultivation activities and in the optimal use of the available fertile land. As the frontiers of space technology advance, the knowledge derived from satellite data has also grown in sophistication. Image classification forms the core of the solution to the crop coverage identification problem, and no single classifier satisfactorily handles all the basic crop cover mapping problems of a cultivated region. We present in this paper the experimental results of multiple classification techniques for the problem of crop cover mapping of a cultivated region, with a detailed comparison of algorithms inspired by the social behaviour of insects and a conventional statistical method: the Maximum Likelihood Classifier (MLC), Particle Swarm Optimisation (PSO) and Ant Colony Optimisation (ACO). High-resolution satellite imagery has been used for the experiments.
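Of the three classifiers compared, the conventional statistical baseline, the Gaussian Maximum Likelihood Classifier, is the easiest to sketch. The two-class, two-band spectral data below is synthetic, and the PSO/ACO variants are omitted:

```python
import numpy as np

def fit_mlc(X, y):
    """Per-class mean and covariance for a Gaussian maximum
    likelihood classifier; X: (n, d) pixel feature vectors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
    return params

def predict_mlc(X, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, cov = params[c]
        diff = X - mu
        inv = np.linalg.inv(cov)
        # log N(x | mu, cov), dropping the shared (2*pi)^{-d/2} constant
        ll = -0.5 * np.einsum("ij,jk,ik->i", diff, inv, diff)
        ll -= 0.5 * np.log(np.linalg.det(cov))
        scores.append(ll)
    return np.array(classes)[np.argmax(scores, axis=0)]

rng = np.random.default_rng(0)
# two well-separated synthetic "crop" spectral classes in 2 bands
X = np.vstack([rng.normal([2, 2], 0.3, (50, 2)), rng.normal([5, 5], 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
pred = predict_mlc(X, fit_mlc(X, y))
```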

Relevance: 20.00%

Abstract:

A complete list of homogeneous operators in the Cowen-Douglas class B-n(D) is given. This classification is obtained from an explicit realization of all the homogeneous Hermitian holomorphic vector bundles on the unit disc under the action of the universal covering group of the bi-holomorphic automorphism group of the unit disc.

Relevance: 20.00%

Abstract:

A global climate model experiment is performed to evaluate the effect of irrigation on temperatures in several major irrigated regions of the world. The Community Atmosphere Model, version 3.3, was modified to represent irrigation for the fraction of each grid cell equipped for irrigation according to datasets from the Food and Agriculture Organization. Results indicate substantial regional differences in the magnitude of irrigation-induced cooling, which are attributed to three primary factors: differences in the extent of the irrigated area, differences in the simulated soil moisture for the control simulation (without irrigation), and the nature of the cloud response to irrigation. The last factor appeared especially important for the dry season in India, although further analysis with other models and observations is needed to verify this feedback. Comparison with observed temperatures revealed substantially lower biases in several regions for the simulation with irrigation than for the control, suggesting that the lack of irrigation may be an important component of temperature bias in this model, or that irrigation compensates for other biases. The results of this study should help translate findings from past regional efforts, which have largely focused on the United States, to regions in the developing world that in many cases continue to experience significant expansion of irrigated land.

Relevance: 20.00%

Abstract:

Ninety-two strong-motion earthquake records from the California region, U.S.A., have been statistically studied using principal component analysis in terms of twelve important standardized strong-motion characteristics. The first two principal components account for about 57 per cent of the total variance. Based on these two components the earthquake records are classified into nine groups in a two-dimensional principal component plane. Also a unidimensional engineering rating scale is proposed. The procedure can be used as an objective approach for classifying and rating future earthquakes.
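The first step of such an analysis (standardize the characteristics, then project records onto the first two principal components) can be sketched with synthetic data of the same shape, 92 records by 12 characteristics; the variance fractions of the real dataset are of course not reproduced here:

```python
import numpy as np

def pca_plane(X):
    """Project standardized records onto the first two principal
    components; returns (scores, fraction of variance explained)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    corr = np.cov(Z, rowvar=False)           # correlation-like matrix
    vals, vecs = np.linalg.eigh(corr)
    order = np.argsort(vals)[::-1]           # largest eigenvalues first
    vals, vecs = vals[order], vecs[:, order]
    explained = vals[:2].sum() / vals.sum()
    return Z @ vecs[:, :2], explained

rng = np.random.default_rng(1)
# 92 synthetic records of 12 correlated "strong-motion characteristics",
# driven by 3 hidden factors plus noise
base = rng.normal(size=(92, 3))
X = base @ rng.normal(size=(3, 12)) + 0.2 * rng.normal(size=(92, 12))
scores, explained = pca_plane(X)
```

The two columns of `scores` give each record's coordinates in the two-dimensional principal component plane used for grouping.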

Relevance: 20.00%

Abstract:

Increasing concentrations of atmospheric CO2 decrease the stomatal conductance of plants and thus suppress canopy transpiration. The climate response to this CO2-physiological forcing is investigated using the Community Atmosphere Model version 3.1 coupled to the Community Land Model version 3.0. In response to the physiological effect of doubling CO2, simulations show a decrease in canopy transpiration of 8%, a mean warming of 0.1 K over the land surface, and negligible changes in the hydrological cycle. These climate responses are much smaller than those found in previous modeling studies. This is largely a result of unrealistic partitioning of evapotranspiration in our model control simulation, with a greatly underestimated contribution from canopy transpiration and overestimated contributions from canopy and soil evaporation. This study highlights the importance of a realistic simulation of the hydrological cycle, especially the individual components of evapotranspiration, in reducing the uncertainty in our estimation of the climatic response to CO2-physiological forcing. Citation: Cao, L., G. Bala, K. Caldeira, R. Nemani, and G. Ban-Weiss (2009), Climate response to physiological forcing of carbon dioxide simulated by the coupled Community Atmosphere Model (CAM3.1) and Community Land Model (CLM3.0).

Relevance: 20.00%

Abstract:

This paper presents the site classification of the Bangalore Mahanagar Palike (BMP) area using geophysical data and the evaluation of spectral acceleration at ground level using a probabilistic approach. Site classification has been carried out using experimental data from the shallow geophysical method of Multichannel Analysis of Surface Waves (MASW). One-dimensional (1-D) MASW surveys have been carried out at 58 locations and the respective velocity profiles obtained. The average shear wave velocity over 30 m depth (Vs(30)) has been calculated and used for the site classification of the BMP area as per NEHRP (National Earthquake Hazards Reduction Program). Based on the Vs(30) values, the major part of the BMP area can be classified as "site class D" and "site class C". A smaller portion of the study area, in and around Lalbagh Park, is classified as "site class B". Further, probabilistic seismic hazard analysis has been carried out to map the seismic hazard in terms of spectral acceleration (S-a) at rock and ground level, considering the site classes and the six seismogenic sources identified. The mean annual rate of exceedance and cumulative probability hazard curve for S-a have been generated. The quantified hazard values in terms of spectral acceleration for short and long periods are mapped for rock, site class C and site class D with 10% probability of exceedance in 50 years on a grid size of 0.5 km. In addition, the Uniform Hazard Response Spectrum (UHRS) at surface level has been developed for 5% damping and 10% probability of exceedance in 50 years for rock, site class C and site class D. These spectral accelerations and uniform hazard spectra can be used to assess the design force for important structures and to develop the design spectrum.
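The Vs(30) computation and the NEHRP-style lookup can be sketched as follows; the layered profile is invented, and the class boundaries used are the commonly quoted simplified Vs30 ranges, not a reproduction of the full NEHRP provisions:

```python
def vs30(thicknesses, velocities):
    """Time-averaged shear-wave velocity of the top 30 m:
    Vs30 = 30 / sum(h_i / v_i) over layers down to 30 m depth."""
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses, velocities):
        h = min(h, 30.0 - depth)          # truncate the profile at 30 m
        if h <= 0:
            break
        travel_time += h / v
        depth += h
    return depth / travel_time

def nehrp_class(v):
    """Simplified NEHRP site classes by Vs30 (m/s)."""
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"
    if v > 180:  return "D"
    return "E"

# hypothetical layered profile: 5 m at 200 m/s, 10 m at 350 m/s, rest at 600 m/s
v = vs30([5, 10, 40], [200, 350, 600])
site = nehrp_class(v)
```

For this profile the travel-time average works out to about 382 m/s, which falls in the 360-760 m/s band, i.e. site class C.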

Relevance: 20.00%

Abstract:

This paper presents a Chance-constraint Programming approach for constructing maximum-margin classifiers which are robust to interval-valued uncertainty in training examples. The methodology ensures that uncertain examples are classified correctly with high probability by employing chance-constraints. The main contribution of the paper is to pose the resultant optimization problem as a Second Order Cone Program by using large deviation inequalities due to Bernstein. Apart from the support and mean of the uncertain examples, these Bernstein-based relaxations make no further assumptions about the underlying uncertainty. Classifiers built using the proposed approach are less conservative, yield higher margins and hence are expected to generalize better than existing methods. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle interval-valued uncertainty than state-of-the-art methods.
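The Bernstein-based chance constraints themselves require a Second Order Cone solver, but the deterministic worst case they generalize is easy to sketch: for an example known only to lie in an interval box, the classification constraint must hold at the least favourable point of the box. The hyperplane and box below are illustrative, not from the paper:

```python
import numpy as np

def worst_case_margin(w, b, lo, hi, y):
    """Worst-case value of y * (w.x + b) over the interval box [lo, hi].
    A positive value means the example is classified correctly for
    every realization of the interval uncertainty."""
    center = (lo + hi) / 2.0
    radius = (hi - lo) / 2.0
    # the minimizer pushes each coordinate against the sign of y * w
    return y * (np.dot(w, center) + b) - np.sum(np.abs(w) * radius)

w, b = np.array([1.0, 1.0]), -2.5
# positive example known only up to the box [1.5, 2.5] x [1.5, 2.5]
lo, hi = np.array([1.5, 1.5]), np.array([2.5, 2.5])
m = worst_case_margin(w, b, lo, hi, y=+1)
```

A chance constraint relaxes this hard worst-case requirement to "correct with probability at least 1 - epsilon", which is what the Bernstein inequalities bound.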

Relevance: 20.00%

Abstract:

Hydrologic impacts of climate change are usually assessed by downscaling the General Circulation Model (GCM) output of large-scale climate variables to local-scale hydrologic variables. Such an assessment is characterized by uncertainty resulting from the ensembles of projections generated with multiple GCMs, which is known as intermodel or GCM uncertainty. Ensemble averaging with the assignment of weights to GCMs based on model evaluation is one of the methods to address such uncertainty and is used in the present study for regional-scale impact assessment. GCM outputs of large-scale climate variables are downscaled to subdivisional-scale monsoon rainfall. Weights are assigned to the GCMs on the basis of model performance and model convergence, which are evaluated with the Cumulative Distribution Functions (CDFs) generated from the downscaled GCM output (for both 20th Century [20C3M] and future scenarios) and observed data. The ensemble averaging approach, with the assignment of weights to GCMs, is characterized by the uncertainty caused by partial ignorance, which stems from the nonavailability of the outputs of some GCMs for a few scenarios (in the Intergovernmental Panel on Climate Change [IPCC] data distribution center for Assessment Report 4 [AR4]). This uncertainty is modeled with imprecise probability, i.e., the probability is represented as an interval gray number. Furthermore, the CDF generated with one GCM is entirely different from that generated with another, and therefore the use of multiple GCMs results in a band of CDFs. Representing this band of CDFs with a single-valued weighted mean CDF may be misleading. Such a band of CDFs can only be represented with an envelope that contains all the CDFs generated with a number of GCMs. The imprecise CDF represents such an envelope, which not only contains the CDFs generated with all the available GCMs but also to an extent accounts for the uncertainty resulting from the missing GCM output.
This concept of imprecise probability is also validated in the present study. The imprecise CDFs of monsoon rainfall are derived for three 30-year time slices, 2020s, 2050s and 2080s, with A1B, A2 and B1 scenarios. The model is demonstrated with the prediction of monsoon rainfall in Orissa meteorological subdivision, which shows a possible decreasing trend in the future.
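The envelope construction described above can be sketched with empirical CDFs: at each rainfall value, the imprecise CDF is bounded below and above by the pointwise extremes across GCMs. The three gamma-distributed "GCM" samples here are synthetic placeholders:

```python
import numpy as np

def cdf_envelope(samples_by_gcm, grid):
    """Lower/upper envelope of empirical CDFs from several GCMs,
    evaluated on a common grid of rainfall values."""
    cdfs = []
    for s in samples_by_gcm:
        s = np.sort(s)
        # empirical CDF: fraction of samples <= each grid point
        cdfs.append(np.searchsorted(s, grid, side="right") / len(s))
    cdfs = np.array(cdfs)
    return cdfs.min(axis=0), cdfs.max(axis=0)

rng = np.random.default_rng(2)
# synthetic monsoon rainfall (mm) from three "GCMs" with shifted scales
gcms = [rng.gamma(shape=4, scale=m, size=500) for m in (18, 20, 22)]
grid = np.linspace(0, 200, 101)
lower, upper = cdf_envelope(gcms, grid)
```

Any single weighted-mean CDF would lie inside the band `[lower, upper]`; the envelope itself is what the interval (gray-number) probability represents.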

Relevance: 20.00%

Abstract:

We propose a novel technique for robust voiced/unvoiced segment detection in noisy speech, based on local polynomial regression. The local polynomial model is well-suited for voiced segments in speech. The unvoiced segments are noise-like and do not exhibit any smooth structure. This property of smoothness is used to devise a new metric called the variance ratio metric, which, after thresholding, indicates the voiced/unvoiced boundaries with 75% accuracy at 0 dB global signal-to-noise ratio (SNR). A novelty of our algorithm is that it processes the signal continuously, sample by sample, rather than frame by frame. Simulation results on the TIMIT speech database (downsampled to 8 kHz) for various SNRs are presented to illustrate the performance of the new algorithm. Results indicate that the algorithm is robust even at high noise levels.
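The variance ratio idea can be sketched as below. Note that this toy version works window by window with an ordinary polynomial fit, whereas the algorithm described above operates continuously, sample by sample; the model order and window length here are illustrative choices, not the paper's:

```python
import numpy as np

def variance_ratio(x, order=2, win=40):
    """Windowed ratio of local-polynomial residual variance to signal
    variance: small for smooth (voiced-like) segments, near 1 for
    noise-like (unvoiced) ones."""
    ratios = []
    n = np.arange(win)
    for start in range(0, len(x) - win + 1, win):
        seg = x[start:start + win]
        coeffs = np.polyfit(n, seg, order)   # local polynomial fit
        resid = seg - np.polyval(coeffs, n)
        ratios.append(resid.var() / seg.var())
    return np.array(ratios)

rng = np.random.default_rng(3)
t = np.arange(400)
voiced = np.sin(2 * np.pi * t[:200] / 80)    # smooth, periodic segment
unvoiced = rng.normal(size=200)              # noise-like segment
r = variance_ratio(np.concatenate([voiced, unvoiced]))
```

Thresholding `r` (here anywhere between roughly 0.1 and 0.8) separates the smooth first half from the noise-like second half.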

Relevance: 20.00%

Abstract:

This paper presents two algorithms for smoothing and feature extraction for fingerprint classification. Deutsch's(2) thinning algorithm (rectangular array) is used to thin the digitized fingerprint (binary version). A simple algorithm is also suggested for classifying the fingerprints. Experimental results obtained using these algorithms are presented.
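Deutsch's algorithm itself is not reproduced here, but a classical thinning pass from the same family, the Zhang-Suen algorithm, illustrates the idea: iteratively delete boundary pixels whose 8-neighbourhood pattern shows they can be removed without breaking connectivity, until only a one-pixel-wide skeleton remains:

```python
import numpy as np

def zhang_suen_thin(img):
    """Zhang-Suen thinning of a binary image (1 = ridge pixel)."""
    img = img.copy().astype(np.uint8)
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for i in range(1, img.shape[0] - 1):
                for j in range(1, img.shape[1] - 1):
                    if img[i, j] != 1:
                        continue
                    # neighbours P2..P9, clockwise from north
                    p = [img[i-1, j], img[i-1, j+1], img[i, j+1], img[i+1, j+1],
                         img[i+1, j], img[i+1, j-1], img[i, j-1], img[i-1, j-1]]
                    b = sum(p)                      # nonzero neighbours
                    # a = number of 0 -> 1 transitions around the ring
                    a = sum(p[k] == 0 and p[(k + 1) % 8] == 1 for k in range(8))
                    if not (2 <= b <= 6 and a == 1):
                        continue
                    if step == 0 and p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0:
                        to_delete.append((i, j))
                    elif step == 1 and p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0:
                        to_delete.append((i, j))
            for i, j in to_delete:              # delete after the sub-pass
                img[i, j] = 0
            changed = changed or bool(to_delete)
    return img

# thick 3-pixel-wide vertical bar as a stand-in for a ridge
bar = np.zeros((12, 9), dtype=np.uint8)
bar[2:10, 3:6] = 1
thin = zhang_suen_thin(bar)
```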