946 results for markov random field


Relevance: 80.00%

Abstract:

We study sensor networks with energy harvesting nodes. The energy generated at a node can be stored in a buffer. A sensor node periodically senses a random field and generates a packet. These packets are stored in a queue and transmitted using the energy available at the node at that time. For such networks we develop efficient energy management policies. First, for a single node, we obtain policies that are throughput optimal, i.e., the data queue stays stable for the largest possible data rate. Next, we obtain energy management policies which minimize the mean delay in the queue. We also compare the performance of several easily implementable suboptimal policies. A greedy policy is identified which, in the low SNR regime, is throughput optimal and also minimizes mean delay. Finally, using the results for a single node, we develop efficient MAC policies.
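
The greedy policy described here lends itself to a compact simulation. The sketch below is illustrative only: the rate function g(T) = 0.5 ln(1 + T) and the harvesting and arrival statistics are assumptions, not values from the paper.

    import math, random

    # Toy simulation of a greedy policy: in every slot the node spends all
    # stored energy T_k = E_k, with an assumed rate function g(T) = 0.5*ln(1+T).
    def simulate(slots=100_000, mean_harvest=1.0, mean_arrival=0.2):
        energy, queue, served = 0.0, 0.0, 0.0
        for _ in range(slots):
            energy += random.expovariate(1.0 / mean_harvest)  # harvested energy
            rate = 0.5 * math.log(1.0 + energy)               # bits sendable this slot
            energy = 0.0                                      # greedy: use all energy
            sent = min(queue, rate)
            queue += random.expovariate(1.0 / mean_arrival) - sent
            served += sent
        return served / slots, queue

    throughput, backlog = simulate()
    print(f"throughput ~ {throughput:.3f} bits/slot, final backlog ~ {backlog:.1f}")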

Relevance: 80.00%

Abstract:

The behaviour of laterally loaded piles is considerably influenced by uncertainties in soil properties; hence, probabilistic models for assessment of the allowable lateral load are necessary. Cone penetration test (CPT) data are often used to determine the soil strength parameters from which the allowable lateral load of the pile is computed. In the present study, the maximum lateral displacement and moment of the pile are obtained using the coefficient of subgrade reaction approach, considering nonlinear soil behaviour in undrained clay. The coefficient of subgrade reaction is related to the undrained shear strength of the soil, which can be obtained from CPT data. The soil medium is modelled as a one-dimensional random field along the depth, described by the standard deviation and scale of fluctuation of the undrained shear strength. Inherent soil variability, measurement uncertainty and transformation uncertainty are taken into consideration. The statistics of the maximum lateral deflection and moment are obtained using the first-order second-moment technique. Hasofer-Lind reliability indices for component and system failure criteria, based on the allowable lateral displacement and the moment capacity of the pile section, are evaluated. The geotechnical database from the Konaseema site in India is used as a case example. It is shown that a reliability-based design approach for pile foundations, considering the spatial variability of soil, permits a rational choice of allowable lateral loads.
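
As a rough illustration of the reliability computation involved, the sketch below finds a Hasofer-Lind index as the shortest distance from the origin to a limit-state surface in standard normal space. The limit-state function, means and standard deviations are invented for illustration and are not from the study.

    import numpy as np
    from scipy.optimize import minimize

    mu  = np.array([25.0, 2.0])   # assumed means: undrained strength (kPa), load factor
    sig = np.array([5.0, 0.4])    # assumed standard deviations

    def g(u):                     # limit state in u-space, x = mu + sig * u
        s_u, load = mu + sig * u
        return 0.025 - 0.012 * load / (s_u / 25.0)  # allowable minus computed deflection (m)

    # beta = min ||u|| subject to g(u) = 0 (distance to the failure surface)
    res = minimize(lambda u: u @ u, x0=np.zeros(2),
                   constraints={"type": "eq", "fun": g})
    print(f"Hasofer-Lind beta ~ {np.sqrt(res.fun):.2f}")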

Relevance: 80.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
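
As a small illustration of the first modeling step mentioned here, the sketch below learns simple-cell-like linear filters from image patches with ICA. The random placeholder images and the scikit-learn estimator are my choices, not the thesis code; on real natural images the learned rows resemble Gabor filters.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    images = rng.standard_normal((10, 256, 256))   # stand-in for natural images

    def sample_patches(imgs, n=10_000, size=16):
        patches = np.empty((n, size * size))
        for i in range(n):
            img = imgs[rng.integers(len(imgs))]
            r, c = rng.integers(0, 256 - size, size=2)
            patches[i] = img[r:r + size, c:c + size].ravel()
        return patches - patches.mean(axis=1, keepdims=True)  # remove DC component

    ica = FastICA(n_components=64, whiten="unit-variance", max_iter=500)
    ica.fit(sample_patches(images))
    filters = ica.components_   # one learned receptive field per row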

Relevance: 80.00%

Abstract:

In recent years, spatial variability modeling of soil parameters using random field theory has gained distinct importance in geotechnical analysis. In the present study, the commercially available finite difference code FLAC 5.0 is used to model permeability as a spatially correlated, log-normally distributed random variable, and its influence on steady state seepage flow and on slope stability is studied. For a 5.0 m high cohesive-frictional soil slope of 30 degrees, with coefficients of variation (CoV) from 60 to 90% in the permeability values and correlation distances in the range 0.5-15 m, parametric studies using Monte Carlo simulations are performed to examine three aspects: (i) the effect of stochastic soil permeability on the statistics of seepage flow, in comparison to the analytic (Dupuit's) solution available for uniform permeability; (ii) the strain and deformation pattern; and (iii) the stability of the slope assessed in terms of the factor of safety (FS). The results are useful for understanding the role of permeability variations in slope stability analysis under different slope conditions and material properties. (C) 2009 Elsevier B.V. All rights reserved.
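
A minimal sketch of the random-field input described here: a one-dimensional log-normally distributed permeability profile with an assumed exponential correlation model, generated by Cholesky factorization of the correlation matrix. The mean, CoV and correlation distance are illustrative values within the ranges cited above.

    import numpy as np

    def lognormal_field(n=100, dz=0.25, mean_k=1e-6, cov=0.75, corr_dist=5.0, seed=0):
        z = np.arange(n) * dz
        # correlation of the underlying Gaussian field (exponential model)
        rho = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / corr_dist)
        sig_ln = np.sqrt(np.log(1.0 + cov**2))          # lognormal -> normal parameters
        mu_ln = np.log(mean_k) - 0.5 * sig_ln**2
        L = np.linalg.cholesky(rho + 1e-10 * np.eye(n)) # factor for correlated draws
        g = np.random.default_rng(seed).standard_normal(n)
        return np.exp(mu_ln + sig_ln * (L @ g))         # permeability profile (m/s)

    k = lognormal_field()
    print(k.mean(), k.std() / k.mean())   # sample mean and CoV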

Relevance: 80.00%

Abstract:

In this paper we address the problem of distributed transmission of functions of correlated sources over a fast fading multiple access channel (MAC). This is a basic building block in a hierarchical sensor network used to estimate a random field, where the cluster head is interested only in estimating a function of the observations. The observations are transmitted to the cluster head through a fast fading MAC. We provide sufficient conditions for lossy transmission when the encoders and decoders are provided with partial information about the channel state. Furthermore, side information may be available at the encoders and the decoder. Various previous studies are recovered as special cases. Efficient joint source-channel coding schemes are discussed for the transmission of discrete and continuous alphabet sources to recover the function values.

Relevance: 80.00%

Abstract:

Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including those due to the choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date, the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest in terms of changes in the frequencies of occurrence of the leading modes of variability, and hence stationarity of DSRs is not a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework in which, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable obtained by training a conditional random field (CRF) model on each natural cluster are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster ("cluster linking") and scaled by the GCM performance with respect to the associated cluster for the present period ("frequency scaling"). The D-S theory was chosen for its ability to express beliefs in hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The methodology is tested for predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change. Significantly improved agreement between GCM predictions is seen with cluster linking and frequency scaling, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
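
The core of the evidence combination step is Dempster's rule; the sketch below implements it for a small frame of discernment with invented masses. The paper's weighted variant additionally scales the masses by cluster frequencies and GCM performance before combining.

    from itertools import product

    # Dempster's rule of combination; focal elements are frozensets over the frame.
    def dempster_combine(m1, m2):
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb            # mass assigned to the empty set
        return {s: w / (1.0 - conflict) for s, w in combined.items()}

    frame = frozenset({"dry", "normal", "wet"})
    m_gcm1 = {frozenset({"dry"}): 0.6, frame: 0.4}            # evidence from GCM 1
    m_gcm2 = {frozenset({"dry", "normal"}): 0.5, frame: 0.5}  # evidence from GCM 2
    print(dempster_combine(m_gcm1, m_gcm2))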

Relevance: 80.00%

Abstract:

We study a sensor node with an energy harvesting source. In any slot, the sensor node is in one of two modes: Wake or Sleep. The generated energy is stored in a buffer. The sensor node senses a random field and generates a packet when it is awake. These packets are stored in a queue and transmitted in the wake mode using the energy available in the energy buffer. We obtain energy management policies which minimize a linear combination of the mean queue length and the mean data loss rate. We then obtain two easily implementable suboptimal policies and compare their performance to that of the optimal policy. Next, we extend the Throughput Optimal policy developed in our previous work to sensors with two modes. With this policy, we can increase the throughput substantially and stabilize the data queue by allowing the node to sleep in some slots and to drop some generated packets. This policy requires minimal statistical knowledge of the system. We also modify the policy to decrease the switching costs.
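
A toy version of such a two-mode trade-off can be simulated directly. In the sketch below the node sleeps (losing the generated packet) whenever its stored energy is below a threshold, and the cost is an assumed linear combination of mean queue length and loss rate; all parameters are invented and this is not the paper's optimal policy.

    import random

    def cost(threshold, slots=200_000, p_harvest=0.6, w_loss=0.5):
        energy, queue, total_q, dropped = 0, 0, 0, 0
        for _ in range(slots):
            energy += random.random() < p_harvest   # Bernoulli energy arrivals
            if energy >= threshold:                 # wake mode
                queue += 1                          # sense and enqueue a packet
                if energy >= 1 and queue > 0:
                    energy -= 1
                    queue -= 1                      # transmit one packet
            else:
                dropped += 1                        # sleep: generated packet lost
            total_q += queue
        return total_q / slots + w_loss * dropped / slots

    best = min(range(1, 6), key=cost)
    print("best sleep threshold:", best)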

Relevance: 80.00%

Abstract:

We study wireless multihop energy harvesting sensor networks employed for random field estimation. The sensors sense the random field and generate data that is to be sent to a fusion node for estimation. Each sensor has an energy harvesting source and can operate in two modes: Wake and Sleep. We consider the problem of obtaining jointly optimal power control, routing and scheduling policies that ensure a fair utilization of network resources. This problem has a high computational complexity. Therefore, we develop a computationally efficient suboptimal approach to obtain good solutions to this problem. We study the optimal solution and the performance of the suboptimal approach through numerical examples.

Relevance: 80.00%

Abstract:

Representation and quantification of uncertainty in climate change impact studies is a difficult task. Several sources of uncertainty arise in studies of the hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from the various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows a probability mass to be allocated to sets or intervals, and it can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D-S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach is used here for information synthesis with various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using the various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in the projected frequencies of the SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
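
The Bayesian alternative mentioned here is conjugate when the class frequencies are given a Dirichlet prior: the posterior given multinomial ensemble counts is again Dirichlet. The sketch below uses invented counts purely for illustration.

    import numpy as np

    classes = ["extreme", "severe", "moderate", "normal", "wet"]
    counts = np.array([4, 7, 12, 20, 5])        # illustrative ensemble class counts
    posterior = np.ones(len(classes)) + counts  # uniform Dirichlet prior + counts

    samples = np.random.default_rng(0).dirichlet(posterior, size=10_000)
    lo, hi = np.percentile(samples, [2.5, 97.5], axis=0)
    for name, m, l, h in zip(classes, samples.mean(axis=0), lo, hi):
        print(f"{name}: mean {m:.3f}, 95% interval [{l:.3f}, {h:.3f}]")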

Relevance: 80.00%

Abstract:

Nonconservatively loaded columns which have stochastically distributed material property values and stochastic loadings in space are considered. Young's modulus and mass density are treated as random fields. The support stiffness coefficient and the tip follower load are treated as random variables. The fluctuations of the external and distributed loadings are taken to constitute a random field. A variational formulation is adopted to derive the differential equation and boundary conditions. The non-self-adjoint operators are used at the boundary of the regularity domain. The statistics of the vibration frequencies and modes are obtained using the standard perturbation method, by treating the fluctuations as stochastic perturbations; linear dependence of the vibration and stability parameters on the property and loading fluctuations is assumed. Bounds for the statistics of the vibration frequencies are obtained. The critical load is first evaluated for the averaged problem and the corresponding eigenvalue statistics are sought; the frequency equation is then employed to transform the eigenvalue statistics into critical load statistics. The general procedure is specialized to the Beck, Leipholz and Pfluger columns. For the Pfluger column, nonlinear transformations are avoided by directly expressing the critical load statistics in terms of the input variable statistics.
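
The backbone of such perturbation analyses is the first-order correction to an eigenvalue of a perturbed operator. In the notation below (mine, not the paper's), the operator is L(\epsilon) = L_0 + \epsilon L_1 and \psi_0 is the left eigenfunction of the unperturbed problem:

    % First-order perturbation of the eigenproblem L(\epsilon)\,\phi = \lambda\,M\,\phi:
    \lambda \approx \lambda_0 + \epsilon\,\lambda_1, \qquad
    \lambda_1 = \frac{\langle \psi_0,\, L_1 \phi_0 \rangle}
                     {\langle \psi_0,\, M \phi_0 \rangle}, \qquad
    \operatorname{Var}[\lambda] \approx \epsilon^2\,\mathrm{E}\!\left[\lambda_1^2\right]

For a self-adjoint problem \psi_0 = \phi_0; the follower-load problems considered here are non-self-adjoint, so the distinct left eigenfunction is required.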

Relevance: 80.00%

Abstract:

The Leipholz column, in which the Young's modulus and mass per unit length are stochastic processes and the distributed tangential follower load also behaves stochastically, is considered. The non-self-adjoint differential equation and boundary conditions have random field coefficients. The standard perturbation method is employed, and the non-self-adjoint operators are used within the regularity domain. The full covariance structure of the free vibration eigenvalues and critical loads is derived in terms of the second order properties of the input random fields characterizing the system parameter fluctuations. The mean value of the critical load is calculated from the averaged problem and the corresponding eigenvalue statistics are sought; a transformation through the frequency equation then yields the load parameter statistics. A numerical study incorporating commonly observed correlation models is reported, illustrating the full potential of the derived expressions.

Relevance: 80.00%

Abstract:

In this paper, we consider the application of belief propagation (BP) to achieve near-optimal signal detection in large multiple-input multiple-output (MIMO) systems at low complexity. Large-MIMO architectures based on spatial multiplexing (V-BLAST) as well as non-orthogonal space-time block codes (STBC) from cyclic division algebra (CDA) are considered. We adopt graphical models based on Markov random fields (MRF) and factor graphs (FG). In the MRF-based approach, we use pairwise compatibility functions, although the graphical models of MIMO systems are fully/densely connected. In the FG approach, we employ a Gaussian approximation (GA) of the multi-antenna interference, which significantly reduces the complexity while achieving very good performance in large dimensions. We show that (i) both the MRF and FG based BP approaches exhibit large-system behavior, achieving increasingly close to optimal performance as the number of dimensions grows, and (ii) damping of the messages/beliefs significantly improves the bit error performance.
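
The damping mentioned in (ii) replaces each new message with a convex combination of the fresh update and the previous value. In the sketch below, bp_update is a stand-in for a full MRF/FG message-passing sweep, demonstrated on a deliberately oscillatory fixed-point map that the damped iteration settles.

    import numpy as np

    def damped_bp(bp_update, m0, alpha=0.4, iters=30):
        m = m0
        for _ in range(iters):
            m = (1 - alpha) * bp_update(m) + alpha * m  # damping with factor alpha
        return m

    # Undamped iteration of this map rings around m = 0.5; damping converges.
    update = lambda m: 1.0 / (1.0 + np.exp(-4.0 * (0.5 - m)))
    print(damped_bp(update, np.array([0.9])))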

Relevance: 80.00%

Abstract:

In a typical sensor network scenario, the goal is to monitor a spatio-temporal process through a number of inexpensive sensing nodes, the key parameter being the fidelity at which the process has to be estimated at distant locations. We study such a scenario in which multiple encoders transmit their correlated data at finite rates to a distant, common decoder. In particular, we derive inner and outer bounds on the rate region for the random field to be estimated with a given mean distortion.

Relevance: 80.00%

Abstract:

Many downscaling techniques have been developed in the past few years for projecting station-scale hydrological variables from the large-scale atmospheric variables simulated by general circulation models (GCMs), in order to assess the hydrological impacts of climate change. This article compares the performance of three downscaling methods, viz. the conditional random field (CRF), K-nearest neighbour (KNN) and support vector machine (SVM) methods, in downscaling precipitation in the Punjab region of India, which belongs to the monsoon regime. The CRF model is a recently developed method for downscaling hydrological variables in a probabilistic framework, while the SVM model is a popular machine learning tool valued for its ability to generalize and to capture nonlinear relationships between predictors and predictand. The KNN model is an analogue-type method that queries days similar to a given feature vector from the training data and classifies future days by random sampling from a weighted set of the K closest training examples. The models are applied to downscaling monsoon (June to September) daily precipitation at six locations in Punjab. Model performance with respect to the reproduction of various statistics, such as dry and wet spell length distributions, the daily rainfall distribution, and intersite correlations, is examined. It is found that the CRF and KNN models perform slightly better than the SVM model in reproducing most daily rainfall statistics. These models are then used to project future precipitation at the six locations. Output from the Canadian GCM (CGCM3) for three scenarios, viz. A1B, A2 and B1, is used for the projection of future precipitation. The projections show a change in the probability density function of daily rainfall amount and changes in the wet and dry spell distributions of daily precipitation. Copyright (C) 2011 John Wiley & Sons, Ltd.
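
The KNN analogue step described here is short enough to sketch directly; the predictor vectors and rainfall series below are random stand-ins for the actual GCM output and station data.

    import numpy as np

    rng = np.random.default_rng(0)
    X_train = rng.standard_normal((5000, 8))   # stand-in GCM predictor vectors
    y_train = rng.gamma(0.3, 10.0, 5000)       # stand-in daily rainfall (mm)

    def knn_downscale(x, K=20):
        d = np.linalg.norm(X_train - x, axis=1)          # distances to training days
        idx = np.argsort(d)[:K]                          # K nearest analogue days
        w = 1.0 / np.arange(1, K + 1)                    # inverse-rank weights
        return y_train[rng.choice(idx, p=w / w.sum())]   # resample one analogue

    print(knn_downscale(rng.standard_normal(8)))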

Relevance: 80.00%

Abstract:

We study a sensor node with an energy harvesting source. The generated energy can be stored in a buffer. The sensor node periodically senses a random field and generates a packet. These packets are stored in a queue and transmitted using the energy available at that time. We obtain energy management policies that are throughput optimal, i.e., the data queue stays stable for the largest possible data rate. Next, we obtain energy management policies which minimize the mean delay in the queue. We also compare the performance of several easily implementable suboptimal energy management policies. A greedy policy is identified which, in the low SNR regime, is throughput optimal and also minimizes the mean delay.