31 results for likelihood ratio test

at Indian Institute of Science - Bangalore - India


Relevance: 100.00%

Abstract:

This paper considers the problem of spectrum sensing, i.e., the detection by a cognitive radio of whether or not a primary user is transmitting data. The Bayesian framework is adopted, with the performance measure being the probability of detection error. A decentralized setup is considered, in which N sensors each use M observations to arrive at individual decisions, which are combined at a fusion center to form the overall decision. The unknown fading channel between the primary user and the cognitive radios makes the individual decision rule computationally complex; hence, a generalized likelihood ratio test (GLRT)-based approach is adopted. Analysis of the probabilities of false alarm and missed detection of the proposed method reveals that the error exponent with respect to M is zero. Also, the fusion of the N individual decisions offers a diversity advantage, similar to diversity reception in communication systems, and a tight bound on the error exponent is presented. Through an analysis in the low-power regime, the number of observations needed to achieve a given probability of error is determined as a function of the received power. Monte Carlo simulations confirm the accuracy of the analysis.
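
A minimal illustrative sketch of the decentralized setup described above: each of the N cognitive radios forms a per-sensor energy statistic from its M samples (a common stand-in when the fading gain is unknown), and the fusion center combines the binary decisions by majority vote. This is a simplified stand-in for the paper's GLRT-based detector, and every parameter value below is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def sense_once(N=10, M=50, snr_db=-5.0, primary_on=True, threshold=1.2):
    """Return the fused decision (True = primary user declared present)."""
    snr = 10 ** (snr_db / 10)
    decisions = []
    for _ in range(N):
        h = rng.normal() + 1j * rng.normal()               # unknown fading gain
        s = np.sqrt(snr / 2) * (rng.normal(size=M) + 1j * rng.normal(size=M))
        noise = (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)
        x = h * s + noise if primary_on else noise
        energy = np.mean(np.abs(x) ** 2)                   # per-sensor statistic
        decisions.append(energy > threshold)
    return sum(decisions) > N // 2                         # majority-vote fusion

# Rough error probabilities via Monte Carlo
p_miss = np.mean([not sense_once(primary_on=True) for _ in range(2000)])
p_fa = np.mean([sense_once(primary_on=False) for _ in range(2000)])
print(f"P_miss ~ {p_miss:.3f}, P_fa ~ {p_fa:.3f}")
```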

Relevance: 100.00%

Abstract:

This paper presents the formulation and performance analysis of four techniques for detection of a narrowband acoustic source in a shallow range-independent ocean using an acoustic vector sensor (AVS) array. The array signal vector is not known, owing to the unknown location of the source. Hence, all detectors are based on a generalized likelihood ratio test (GLRT), which involves estimation of the array signal vector. One non-parametric and three parametric (model-based) signal estimators are presented. It is shown that there is a strong correlation between detector performance and the mean-square signal estimation error. Theoretical expressions for the probability of false alarm and the probability of detection are derived for all the detectors, and the theoretical predictions are compared with simulation results. It is shown that the detection performance of an AVS array with a certain number of sensors is equal to or slightly better than that of a conventional acoustic pressure sensor array with three times as many sensors.
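
A rough sketch of the "estimate the signal vector, then detect" idea: since the array signal vector is unknown, a non-parametric estimate (here the dominant eigenvector of the sample covariance) is plugged into the test statistic. This generic stand-in is not any of the paper's four AVS-array detectors; the array size, SNR and snapshot count are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_snapshots = 12, 200

def detector_statistic(source_present, snr=0.5):
    s = rng.normal(size=n_sensors) + 1j * rng.normal(size=n_sensors)
    s /= np.linalg.norm(s)                                  # unknown signal vector
    a = np.sqrt(snr) * (rng.normal(size=n_snapshots) + 1j * rng.normal(size=n_snapshots))
    noise = (rng.normal(size=(n_sensors, n_snapshots))
             + 1j * rng.normal(size=(n_sensors, n_snapshots))) / np.sqrt(2)
    X = (np.outer(s, a) if source_present else 0) + noise
    R = X @ X.conj().T / n_snapshots                        # sample covariance
    s_hat = np.linalg.eigh(R)[1][:, -1]                     # estimated signal vector
    return np.real(s_hat.conj() @ R @ s_hat)                # energy along the estimate

h0 = [detector_statistic(False) for _ in range(500)]
h1 = [detector_statistic(True) for _ in range(500)]
threshold = np.quantile(h0, 0.99)                           # ~1% false-alarm rate
print("P_detect ~", np.mean(np.array(h1) > threshold))
```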

Relevance: 100.00%

Abstract:

Selection of relevant features is an open problem in brain-computer interface (BCI) research. The features extracted from brain signals are sometimes high dimensional, which in turn affects the accuracy of the classifier. Selecting the most relevant features improves classifier performance and reduces the computational cost of the system. In this study, we used a combination of Bacterial Foraging Optimization and Learning Automata to determine the best subset of features from a given motor imagery electroencephalography (EEG) based BCI dataset. We employed the Discrete Wavelet Transform to obtain a high-dimensional feature set and classified it using a Distance Likelihood Ratio Test. Our proposed feature selector produced an accuracy of 80.291% in 216 seconds.
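
A minimal wrapper-style feature-selection sketch. The study combines Bacterial Foraging Optimization with Learning Automata and classifies with a Distance Likelihood Ratio Test; here a plain random-subset search and an LDA classifier stand in for both, purely to illustrate the wrapper idea on synthetic data. Every name and number below is illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 64))        # 120 trials, 64 candidate features
y = rng.integers(0, 2, size=120)      # two motor-imagery classes
X[y == 1, :5] += 1.0                  # only the first 5 features are informative

def score(subset):
    """Cross-validated accuracy of an LDA classifier on the chosen features."""
    return cross_val_score(LinearDiscriminantAnalysis(), X[:, subset], y, cv=5).mean()

best_subset, best_acc = None, -np.inf
for _ in range(200):                  # random search over feature subsets
    k = int(rng.integers(3, 15))
    subset = rng.choice(X.shape[1], size=k, replace=False)
    acc = score(subset)
    if acc > best_acc:
        best_subset, best_acc = subset, acc

print(f"best accuracy {best_acc:.3f} with {len(best_subset)} features")
```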

Relevance: 100.00%

Abstract:

Speech enhancement in stationary noise is addressed within the ideal channel selection framework. To estimate the binary mask, we propose to classify each time-frequency (T-F) bin of the noisy signal as speech or noise using Discriminative Random Fields (DRF). The DRF objective contains two terms: an enhancement function and a smoothing term. On each T-F bin, we propose an enhancement function based on a likelihood ratio test for speech presence, while an Ising model is used as the smoothing function to enforce spectro-temporal continuity in the estimated binary mask. The smoothing function, applied over successive iterations, is found to reduce musical noise compared with using the enhancement function alone. The binary mask is inferred from the noisy signal using the Iterated Conditional Modes (ICM) algorithm. Sentences from the NOIZEUS corpus are evaluated at signal-to-noise ratios (SNR) from 0 dB to 15 dB in four additive noise settings: white Gaussian noise, car noise, street noise, and pink noise. The reconstructed speech obtained with the proposed technique is evaluated in terms of average segmental SNR, Perceptual Evaluation of Speech Quality (PESQ), and Mean Opinion Score (MOS).
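
A much-simplified sketch of the mask-estimation idea: a per-bin likelihood-ratio term (noise-only versus speech-plus-noise, with both power levels assumed known) plus an Ising smoothness term, optimized greedily with Iterated Conditional Modes. The STFT front end, the exact DRF potentials and the NOIZEUS evaluation are all omitted; the synthetic spectrogram and every parameter are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
F, T = 64, 100
noise_var, speech_var = 1.0, 4.0
true_mask = rng.random((F, T)) < 0.3                     # bins where speech is present
power = noise_var * rng.exponential(size=(F, T))
power += np.where(true_mask, speech_var * rng.exponential(size=(F, T)), 0.0)

# Per-bin log-likelihood ratio of "speech" vs "noise" for exponentially distributed power
llr = (np.log(noise_var / (noise_var + speech_var))
       + power / noise_var - power / (noise_var + speech_var))

def icm(llr, beta=0.8, iters=10):
    mask = llr > 0                                        # initial hard decision
    for _ in range(iters):
        for f in range(F):
            for t in range(T):
                nb = [mask[f2, t2]
                      for f2, t2 in ((f - 1, t), (f + 1, t), (f, t - 1), (f, t + 1))
                      if 0 <= f2 < F and 0 <= t2 < T]
                # Ising term rewards agreement with neighbouring bins
                gain = llr[f, t] + beta * (2 * sum(nb) - len(nb))
                mask[f, t] = gain > 0
    return mask

print("mask accuracy:", np.mean(icm(llr) == true_mask))
```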

Relevance: 90.00%

Abstract:

This paper considers the problem of weak-signal detection in the presence of navigation data bits for Global Navigation Satellite System (GNSS) receivers. Typically, a set of partial coherent integration outputs is non-coherently accumulated to combat the effects of model uncertainties such as the presence of navigation data bits and/or frequency uncertainty, resulting in a sub-optimal test statistic. In this work, the test statistic for weak-signal detection in the presence of navigation data bits is derived from the likelihood ratio. It is highlighted that averaging the likelihood-ratio-based test statistic over the prior distributions of the unknown data bits and the carrier phase uncertainty leads to the conventional Post Detection Integration (PDI) technique for detection. To improve performance in the presence of model uncertainties, a novel cyclostationarity-based sub-optimal PDI technique is proposed. The test statistic is analytically characterized and shown to be robust to the presence of navigation data bits and to frequency, phase, and noise uncertainties. Monte Carlo simulation results illustrate the validity of the theoretical results and the superior performance offered by the proposed detector in the presence of model uncertainties.
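
A minimal sketch of the conventional non-coherent Post Detection Integration baseline mentioned above: coherent sums over short blocks (so an unknown data-bit sign flip stays within a block), followed by non-coherent accumulation of their squared magnitudes. The paper's cyclostationarity-based PDI variant is not reproduced, and all block sizes, amplitudes and thresholds are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def pdi_statistic(K=20, Mc=100, amp=0.08, signal_present=True):
    """Sum over K blocks of |coherent sum over Mc samples|^2."""
    stat = 0.0
    for _ in range(K):
        bit = rng.choice([-1.0, 1.0])                 # unknown navigation data bit
        s = amp * bit if signal_present else 0.0
        x = s + (rng.normal(size=Mc) + 1j * rng.normal(size=Mc)) / np.sqrt(2)
        stat += np.abs(np.sum(x)) ** 2
    return stat

h1 = [pdi_statistic(signal_present=True) for _ in range(2000)]
h0 = [pdi_statistic(signal_present=False) for _ in range(2000)]
threshold = np.quantile(h0, 0.99)                     # ~1% false-alarm threshold
print("P_detect ~", np.mean(np.array(h1) > threshold))
```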

Relevance: 80.00%

Abstract:

Merton's model views equity as a call option on the assets of the firm; the asset value is thus only partially observed, through the equity. Using nonlinear filtering, an explicit expression for the likelihood ratio of the underlying parameters is obtained in terms of the nonlinear filter. Since the evolution of the filter itself depends on the parameters in question, this does not permit direct maximum likelihood estimation, but it does pave the way for estimating the parameters via the `Expectation-Maximization' method. (C) 2010 Elsevier B.V. All rights reserved.
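
The first sentence is the standard Merton observation: with firm asset value V, debt face value D maturing at T, risk-free rate r and asset volatility sigma, equity is priced as a European call on V. A small sketch of that pricing identity follows (it is not the paper's filtering/EM estimator, and the numbers are illustrative).

```python
import numpy as np
from scipy.stats import norm

def merton_equity(V, D, r, sigma, T):
    """Equity value = Black-Scholes call on the firm's assets with strike D."""
    d1 = (np.log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return V * norm.cdf(d1) - D * np.exp(-r * T) * norm.cdf(d2)

print(merton_equity(V=120.0, D=100.0, r=0.03, sigma=0.25, T=1.0))
```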

Relevance: 80.00%

Abstract:

Low-density parity-check (LDPC) codes are a class of linear block codes decoded by running the belief propagation (BP) algorithm, or its log-likelihood ratio form (LLR-BP), over the factor graph of the code. One disadvantage of LDPC codes is the onset of an error floor at high signal-to-noise ratios, caused by trapping sets. In this paper, we propose a two-stage decoder to deal with different types of trapping sets. Oscillating trapping sets are handled by the first stage of the decoder, and elementary trapping sets are handled by the second stage. Simulation results on the regular PEG (504,252,3,6) code and the irregular PEG (1024,518,15,8) code show that the proposed two-stage decoder performs significantly better than the standard decoder.
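
For reference, a compact single-stage LLR-BP (sum-product) decoder over the factor graph of a toy parity-check matrix, illustrating the standard decoding the paper builds on; the proposed two-stage trapping-set handling is not shown. The tiny H matrix and channel settings are invented, not a PEG code.

```python
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def decode_llr_bp(channel_llr, H, max_iters=20):
    m, n = H.shape
    msg_v2c = np.tile(channel_llr, (m, 1)) * H            # variable-to-check messages
    for _ in range(max_iters):
        t = np.tanh(np.clip(msg_v2c, -20, 20) / 2.0)      # check-to-variable (tanh rule)
        msg_c2v = np.zeros_like(msg_v2c)
        for i in range(m):
            idx = np.flatnonzero(H[i])
            for j in idx:
                others = [k for k in idx if k != j]
                msg_c2v[i, j] = 2.0 * np.arctanh(np.prod(t[i, others]))
        total = channel_llr + msg_c2v.sum(axis=0)          # posterior LLRs
        hard = (total < 0).astype(int)
        if not np.any(H @ hard % 2):                       # all parity checks satisfied
            return hard
        msg_v2c = (np.tile(total, (m, 1)) - msg_c2v) * H   # extrinsic variable-to-check
    return hard

# All-zero codeword sent with BPSK over AWGN: channel LLR = 2*y/sigma^2
sigma = 0.7
y = 1.0 + sigma * np.random.default_rng(0).normal(size=H.shape[1])
print(decode_llr_bp(2 * y / sigma**2, H))
```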

Relevance: 80.00%

Abstract:

We consider cooperative spectrum sensing for cognitive radios. We develop an energy-efficient detector with low detection delay using sequential hypothesis testing. A Sequential Probability Ratio Test (SPRT) is used both at the local nodes and at the fusion center. We also analyse the performance of this algorithm and compare it with simulations. Modelling uncertainties in the distribution parameters are considered, and slow fading, with and without perfect channel state information at the cognitive radios, is taken into account.
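
A minimal single-sensor SPRT sketch under standard Gaussian assumptions: accumulate log-likelihood ratios and stop when Wald's thresholds, set from target false-alarm and miss probabilities, are crossed. The distributed local-node/fusion-center scheme of the paper is not reproduced; all parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    upper = np.log((1 - beta) / alpha)      # accept H1 when the LLR exceeds this
    lower = np.log(beta / (1 - alpha))      # accept H0 when the LLR falls below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr >= upper:
            return "H1", n                  # decision and number of samples used
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

rng = np.random.default_rng(0)
print(sprt(rng.normal(loc=1.0, scale=1.0, size=1000)))   # data drawn under H1
```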

Relevance: 80.00%

Abstract:

This paper considers cooperative spectrum sensing in cognitive radios. In previous work we developed DualSPRT, a distributed algorithm for cooperative spectrum sensing that uses a Sequential Probability Ratio Test (SPRT) at the cognitive radios as well as at the fusion center. This algorithm works well but is not optimal. In this paper we propose an improved algorithm, SPRT-CSPRT, motivated by cumulative sum (CUSUM) procedures, and analyse it theoretically. We also modify the algorithm to handle uncertainties in SNRs and fading.
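
A minimal CUSUM sketch, included only because SPRT-CSPRT is motivated by cumulative-sum procedures: the statistic accumulates log-likelihood ratios but is clipped at zero, so it reacts quickly once the primary user appears. This generic CUSUM is not the paper's SPRT-CSPRT, and all values are illustrative.

```python
import numpy as np

def cusum_detect(x, mu0=0.0, mu1=1.0, sigma=1.0, threshold=8.0):
    g = 0.0
    for n, xi in enumerate(x, start=1):
        llr = ((xi - mu0) ** 2 - (xi - mu1) ** 2) / (2 * sigma ** 2)
        g = max(0.0, g + llr)               # clipped at zero, never goes negative
        if g > threshold:
            return n                        # alarm: change declared at sample n
    return None

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(1, 1, 200)])  # change at sample 200
print("alarm at sample", cusum_detect(x))
```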

Relevance: 80.00%

Abstract:

This paper considers cooperative spectrum sensing algorithms for cognitive radios that focus on reducing the number of samples required to make a reliable detection. We propose algorithms based on decentralized sequential hypothesis testing, in which the cognitive radios sequentially collect observations, make local decisions, and send them to the fusion center for further processing to reach a final decision on spectrum usage. The reporting channel between the cognitive radios and the fusion center is modelled, more realistically, as a Multiple Access Channel (MAC) with receiver noise. Furthermore, the communication used for reporting is limited, thereby reducing the communication cost. We start with an algorithm in which the fusion center uses an SPRT-like (Sequential Probability Ratio Test) procedure and theoretically analyse its performance. Asymptotically, its performance is close to that of the optimal centralized test without fusion center noise. We then modify this algorithm to improve its performance at practical operating points, and finally generalize these algorithms to handle uncertainties in SNR and fading. (C) 2014 Elsevier B.V. All rights reserved.
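
A rough end-to-end sketch of the decentralized setup: each cognitive radio runs a local SPRT on its own samples and, once decided, transmits +1 (H1) or -1 (H0) in every reporting slot; the fusion center observes the sum of these transmissions through a noisy multiple-access channel and applies a simple SPRT-like cumulative test. This is an illustrative stand-in for the paper's algorithms, not a reproduction, and every parameter is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
N, mu1, A, B = 5, 0.7, 15.0, -15.0        # sensors, mean under H1, fusion thresholds

def run(primary_on):
    local_llr = np.zeros(N)
    local_dec = np.zeros(N)               # 0 = undecided, +1/-1 once decided
    fc_sum = 0.0
    for t in range(1, 2000):
        x = rng.normal(mu1 if primary_on else 0.0, 1.0, size=N)
        local_llr += mu1 * x - mu1 ** 2 / 2                   # Gaussian mean-shift LLR
        local_dec = np.where(local_dec != 0, local_dec,
                             np.where(local_llr > 4, 1,
                                      np.where(local_llr < -4, -1, 0)))
        mac_rx = local_dec.sum() + rng.normal()               # noisy MAC output
        fc_sum += mac_rx
        if fc_sum > A:
            return "H1", t
        if fc_sum < B:
            return "H0", t
    return "undecided", t

print(run(primary_on=True))
print(run(primary_on=False))
```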

Relevance: 80.00%

Abstract:

Restricted Boltzmann Machines (RBMs) can be used either as classifiers or as generative models. The quality of a generative RBM is measured through the average log-likelihood on test data. Owing to the high computational complexity of evaluating the partition function, exact calculation of the test log-likelihood is very difficult. In recent years, several estimation methods have been suggested for approximate computation of the test log-likelihood. In this paper we present an empirical comparison of the main estimation methods, namely the AIS algorithm for estimating the partition function, the CSL method for directly estimating the log-likelihood, and the RAISE algorithm that combines these two ideas.
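
For a small enough RBM, the partition function can be computed exactly by summing exp(-F(v)) over all visible configurations, which is how estimators such as AIS, CSL and RAISE are usually sanity-checked. A tiny illustrative example with random weights follows (none of the estimators themselves are implemented here).

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n_vis, n_hid = 8, 6
W = 0.1 * rng.normal(size=(n_vis, n_hid))
b, c = 0.1 * rng.normal(size=n_vis), 0.1 * rng.normal(size=n_hid)

def free_energy(v):
    """F(v) = -b.v - sum_j softplus(c_j + W_j.v) for a binary RBM."""
    return -(v @ b) - np.sum(np.log1p(np.exp(v @ W + c)))

# Exact log Z by brute force over all 2^n_vis visible vectors (feasible only for tiny models)
all_v = np.array(list(product([0, 1], repeat=n_vis)), dtype=float)
log_Z = np.log(np.sum(np.exp(-np.array([free_energy(v) for v in all_v]))))

v_test = rng.integers(0, 2, size=n_vis).astype(float)
print("test log-likelihood:", -free_energy(v_test) - log_Z)
```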

Relevance: 40.00%

Abstract:

This article presents frequentist inference for accelerated life test data of series systems with independent log-normal component lifetimes. The means of the component log-lifetimes are assumed to depend on the stress variables through a linear stress translation function that can accommodate the standard stress translation functions in the literature. An expectation-maximization algorithm is developed to obtain the maximum likelihood estimates of the model parameters. The maximum likelihood estimates are then further refined by the bootstrap, which is also used to draw inferences about the component and system reliability metrics at usage stresses. The developed methodology is illustrated by analyzing a real as well as a simulated dataset. A simulation study is also carried out to judge the effectiveness of the bootstrap. It is found that, in this model, application of the bootstrap results in a significant improvement over the simple maximum likelihood estimates.
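
A stripped-down sketch of the bootstrap step for a single log-normal lifetime (no stress covariates, no censoring and no EM step): fit by maximum likelihood, then use a parametric bootstrap to bias-correct a reliability estimate and attach a confidence interval. Purely illustrative; none of the values come from the article.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
data = rng.lognormal(mean=2.0, sigma=0.5, size=40)        # observed lifetimes

def fit(x):
    logs = np.log(x)
    return logs.mean(), logs.std()                        # MLEs of (mu, sigma)

def reliability(mu, sigma, t0=5.0):
    """P(lifetime > t0) for a log-normal lifetime."""
    return 1.0 - norm.cdf((np.log(t0) - mu) / sigma)

mu_hat, sig_hat = fit(data)
boot = np.array([reliability(*fit(rng.lognormal(mu_hat, sig_hat, size=len(data))))
                 for _ in range(2000)])                   # parametric bootstrap

point = reliability(mu_hat, sig_hat)
bias_corrected = 2 * point - boot.mean()                  # simple bias correction
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"R(5) ~ {bias_corrected:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```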

Relevance: 30.00%

Abstract:

The effect of the test gas on the flow field around a 120 degrees apex-angle blunt cone has been investigated in a shock tunnel at a nominal Mach number of 5.75. The shock standoff distance around the blunt cone was measured by an electrical discharge technique using both carbon dioxide and air as test gases. The forebody laminar convective heat transfer to the blunt cone was measured with platinum thin-film sensors in both air and carbon dioxide environments. An increase of 10 to 15% in the measured heat transfer values was observed with carbon dioxide as the test gas in comparison to air. The measured thickness of the shock layer along the stagnation streamline was 3.57 +/- 0.17 mm in air and 3.29 +/- 0.26 mm in carbon dioxide. The computed thicknesses of the shock layer for air and carbon dioxide were 3.98 mm and 3.02 mm, respectively. The observed increase in the measured heat transfer rates in carbon dioxide compared to air is attributed to the higher density ratio across the bow shock wave and the reduced shock layer thickness.
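
The last sentence attributes the higher heating in carbon dioxide to the larger density ratio across the bow shock; a quick check of that ratio under the calorically perfect gas assumption is sketched below (normal-shock relation; the specific-heat ratios are nominal room-temperature values and real-gas and vibrational effects are ignored).

```python
def density_ratio(mach, gamma):
    """rho2/rho1 across a normal shock for a calorically perfect gas."""
    return ((gamma + 1) * mach ** 2) / ((gamma - 1) * mach ** 2 + 2)

M = 5.75
print("air (gamma = 1.40):", round(density_ratio(M, 1.40), 2))   # ~5.2
print("CO2 (gamma = 1.29):", round(density_ratio(M, 1.29), 2))   # ~6.5
```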

Relevance: 30.00%

Abstract:

This paper addresses the behaviour of compacted expansive soils under swell-shrink cycles. Laboratory cyclic swell-shrink tests were conducted on compacted specimens of two expansive soils at surcharge pressures of 6.25, 50.00, and 100.00 kPa. The void ratio and water content of the specimens were determined at several intermediate stages during swelling, up to the end of swelling, and during shrinkage, up to the end of shrinkage, in order to trace the water content versus void ratio paths over an increasing number of swell-shrink cycles. The test results showed that the swell-shrink path became reversible once the soil reached an equilibrium stage at which the vertical deformations during swelling and shrinkage were the same; this usually occurred after about four swell-shrink cycles. The swelling and shrinkage path of each specimen subjected to full swelling - full shrinkage cycles showed an S-shaped curve (two curvilinear portions and a linear portion). However, the swelling and shrinkage path traced only part of the S-shaped curve when the specimen was subjected to full swelling - partial shrinkage cycles. More than 80% of the total volumetric change and more than 50% of the total vertical deformation occurred in the central linear portion of the S-shaped curve. The volumetric change was essentially parallel to the saturation line within a degree-of-saturation range of 50-80% for the equilibrium cycle. The primary value of these swell-shrink paths is to provide information regarding the void ratio change that would occur for a given change in water content under any possible swell-shrink pattern. It is suggested that these swell-shrink paths can be established with a limited number of laboratory tests.