87 results for Probabilities.


Relevance: 10.00%

Abstract:

Although extremely long low-density parity-check (LDPC) codes are well known to perform exceptionally well for error correction, short-length codes are preferable in practical applications. However, short-length LDPC codes suffer from performance degradation owing to graph-based impairments such as short cycles, trapping sets and stopping sets in the bipartite graph of the LDPC matrix. In particular, performance degradation at moderate to high Eb/N0 is caused by oscillations in the bit-node a posteriori probabilities induced by short cycles and trapping sets in the bipartite graph. In this study, a computationally efficient algorithm is proposed to improve the performance of short-length LDPC codes at moderate to high Eb/N0. The algorithm makes use of the information generated by the belief propagation (BP) algorithm in earlier iterations, before a decoding failure occurs. Using this information, a reliability-based estimate is computed for each bit node to supplement the BP algorithm. The proposed algorithm gives an appreciable coding gain compared with BP decoding for LDPC codes with code rates of 1/2 or less. The coding gains are modest to significant for regular LDPC codes optimised for bipartite-graph conditioning, and much larger for unoptimised codes. Hence, the algorithm is useful for relaxing some stringent constraints on the graphical structure of the LDPC code and for developing hardware-friendly designs.
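
As a rough illustration of the idea, the sketch below (with a hypothetical `run_bp_iteration` callback standing in for one iteration of a BP decoder) stores the bit-node posterior LLRs from earlier iterations and, on decoding failure, falls back to a reliability estimate built from their time average; the paper's actual estimation rule may differ.

```python
import numpy as np

def bp_with_reliability_fallback(run_bp_iteration, H, max_iters=50):
    """Hedged sketch: supplement BP with a reliability estimate built from
    bit-node posterior LLRs of earlier iterations.  `run_bp_iteration` is a
    hypothetical callback returning the posterior LLR of every bit after one
    BP iteration on the code with parity-check matrix H."""
    llr_history = []
    for _ in range(max_iters):
        llrs = run_bp_iteration()            # posterior LLRs, shape (n,)
        llr_history.append(llrs)
        hard = (llrs < 0).astype(int)        # tentative hard decision
        if not np.any(H @ hard % 2):         # all parity checks satisfied
            return hard                      # BP converged
    # BP failed: bits whose LLR oscillated across iterations are the ones
    # disturbed by short cycles / trapping sets; re-estimate them from the
    # time-averaged (smoothed) LLR instead of the final iteration only.
    avg_llr = np.array(llr_history).mean(axis=0)
    return (avg_llr < 0).astype(int)
```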

Relevance: 10.00%

Abstract:

We report an experimental study of the recently formulated entropic Leggett-Garg inequality (ELGI) by Usha Devi et al. [Phys. Rev. A 87, 052103 (2013)]. This inequality places a bound on the statistical outcomes of measurements of dynamical observables describing a macrorealistic system. Such a bound is not necessarily obeyed by quantum systems, and it therefore provides an important way to distinguish quantumness from classical behavior. Here we study the ELGI using a two-qubit nuclear magnetic resonance system. To perform the noninvasive measurements required for the ELGI study, we prepare the system qubit in a maximally mixed state and use the "ideal negative result measurement" procedure with the help of an ancilla qubit. The experimental results show a clear violation of the ELGI by more than four standard deviations, in agreement with the predictions of quantum theory. The violation of the ELGI is attributed to the fact that certain joint probabilities are not legitimate in the quantum scenario, in the sense that they do not reproduce all the marginal probabilities. Using a three-qubit system, we also demonstrate that three-time joint probabilities do not reproduce certain two-time marginal probabilities.
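
For context, an entropic inequality of this kind can be checked numerically from measured joint probabilities. The sketch below assumes the three-time information-deficit form D = H(Q2|Q1) + H(Q3|Q2) - H(Q3|Q1) >= 0 for a macrorealistic system; the function names and the treatment of outcomes are illustrative, not the experiment's exact procedure.

```python
import numpy as np

def cond_entropy(p_joint):
    """H(B|A) in bits from a joint distribution p_joint[a, b]."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -p_joint * np.log2(np.where(p_joint > 0, p_joint / p_a, 1.0))
    return np.nansum(h)

def information_deficit(p12, p23, p13):
    """D = H(Q2|Q1) + H(Q3|Q2) - H(Q3|Q1); D >= 0 for a macrorealistic
    system, so a negative value signals an ELGI violation (illustrative)."""
    return cond_entropy(p12) + cond_entropy(p23) - cond_entropy(p13)
```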

Relevance: 10.00%

Abstract:

The pressure dependence of the Cl-35 nuclear quadrupole resonance (NQR) frequency and the temperature and pressure variation of the spin-lattice relaxation time (T1) were investigated in 3,4-dichlorophenol. T1 was measured in the temperature range 77-300 K. Furthermore, the NQR frequency and T1 were measured as a function of pressure up to 5 kbar at 300 K. The temperature dependence of the average torsional lifetimes of the molecules and the transition probabilities W1 and W2 for the Δm = ±1 and Δm = ±2 transitions were also obtained. A nonlinear variation of the NQR frequency with pressure was observed, and the pressure coefficients were found to be positive. A thermodynamic analysis of the data was carried out to determine the constant-volume temperature coefficients of the NQR frequency. The torsional frequencies evaluated from the NQR data were compared with those obtained from IR spectra; on selecting the appropriate IR mode, good agreement with the NQR-derived torsional frequency is observed. This approach is a good illustration of the complementary nature of IR data in relation to NQR studies of compounds in the solid state.
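
For reference, the constant-volume temperature coefficient in such a thermodynamic analysis is commonly separated from the measured isobaric and isothermal coefficients through the standard identity below, where α is the volume thermal expansion coefficient and κ_T the isothermal compressibility; this is a textbook relation quoted for context, not a result of the study, and it isolates the explicit temperature dependence (torsional averaging) from the implicit volume effect probed by the pressure measurements.

```latex
\left(\frac{\partial \nu}{\partial T}\right)_{V}
  = \left(\frac{\partial \nu}{\partial T}\right)_{P}
  + \frac{\alpha}{\kappa_{T}}\left(\frac{\partial \nu}{\partial P}\right)_{T}
```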

Relevance: 10.00%

Abstract:

Estimating program worst-case execution time (WCET) accurately and efficiently is a challenging task. Several programs exhibit phase behavior, wherein cycles per instruction (CPI) varies in phases during execution. Recent work has suggested using the phases of such programs to estimate WCET with minimal instrumentation. However, the suggested model uses a function of the mean CPI that carries no probabilistic guarantees. We propose to use Chebyshev's inequality, which applies to any arbitrary distribution of CPI samples, to probabilistically bound the CPI of a phase. Applying Chebyshev's inequality to phases that exhibit high CPI variation leads to pessimistic upper bounds. We propose a mechanism that refines such phases into sub-phases based on program counter (PC) signatures collected through profiling, and that also allows the user to control the variance of CPI within a sub-phase. We describe a WCET analyzer built along these lines and evaluate it with standard WCET and embedded benchmark suites on two different architectures for three chosen probabilities, p = {0.9, 0.95, 0.99}. For p = 0.99, refinement based on PC signatures alone reduces the average pessimism of the WCET estimate by 36% (77%) on Arch1 (Arch2). Compared to Chronos, an open-source static WCET analyzer, the average improvement in estimates obtained by refinement is 5% (125%) on Arch1 (Arch2). On limiting the variance of CPI within a sub-phase to {50%, 10%, 5%, 1%} of its original value, the average accuracy of the WCET estimate improves further to {9%, 11%, 12%, 13%}, respectively, on Arch1. On Arch2, the average accuracy of the WCET estimate improves to 159% when the CPI variance is limited to 50% of its original value, and the improvement is marginal beyond that point.
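
A minimal sketch of the Chebyshev step, assuming profiled CPI samples for a phase and illustrative helper names: since P(|X - mu| >= k*sigma) <= 1/k^2 for any distribution, choosing 1/k^2 = 1 - p gives an upper bound on CPI that is exceeded with probability at most 1 - p.

```python
import numpy as np

def cpi_upper_bound(cpi_samples, p=0.99):
    """Chebyshev-style upper bound on the CPI of a phase.
    With 1/k^2 = 1 - p, the bound mu + k*sigma is exceeded with probability
    at most 1 - p, whatever the distribution of the profiled CPI samples.
    High CPI variance inflates sigma, which is what motivates refining the
    phase into sub-phases."""
    mu = np.mean(cpi_samples)
    sigma = np.std(cpi_samples)
    k = 1.0 / np.sqrt(1.0 - p)
    return mu + k * sigma

def phase_wcet(cpi_samples, instruction_count, cycle_time_ns, p=0.99):
    """Illustrative per-phase WCET contribution from the CPI bound."""
    return cpi_upper_bound(cpi_samples, p) * instruction_count * cycle_time_ns
```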

Relevance: 10.00%

Abstract:

Synfire waves are propagating spike packets in synfire chains, which are feedforward chains embedded in random networks. Although synfire waves have proved to be an effective quantification of network activity with clear relations to network structure, their utility is largely limited to feedforward networks with low background activity. To overcome these shortcomings, we describe a novel generalisation of synfire waves and define a 'synconset wave' as the cascade of first spikes within a synchronisation event. Synconset waves occur in 'synconset chains', which are feedforward chains embedded in possibly heavily recurrent networks with heavy background activity. We probed the utility of synconset waves using simulations of single-compartment neuron network models with biophysically realistic conductances, and demonstrated that the spread of synconset waves follows directly from the network connectivity matrix and is modulated by top-down inputs and the resultant oscillations. Such synconset profiles lend intuitive insights into network organisation in terms of connection probabilities between various network regions rather than an adjacency matrix. To test this intuition, we develop a Bayesian likelihood function that quantifies the probability that an observed synfire wave was caused by a given network. We further demonstrate its utility in the inverse problem of identifying the network that caused a given synfire wave. This method was effective even in highly subsampled networks where only a small subset of neurons was accessible, showing its utility in the experimental estimation of connectomes in real neuronal networks. Together, we propose synconset chains/waves as an effective framework for understanding the impact of network structure on function, and as a step towards developing physiology-driven network identification methods. Finally, as synconset chains extend the utility of synfire chains to arbitrary networks, we suggest applications of our framework to several aspects of network physiology, including cell assemblies, population codes, and oscillatory synchrony.
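
Purely as a hypothetical illustration of such a likelihood (not the paper's formulation), the sketch below scores a candidate connection-probability matrix by the log-probability that each neuron recruited at one step of the wave received input from at least one neuron that fired in the preceding step; all names and the propagation model are assumptions.

```python
import numpy as np

def log_likelihood_of_wave(conn_prob, wave):
    """Hypothetical, simplified scoring of a candidate network.
    conn_prob[i, j]: probability that neuron i projects to neuron j.
    wave: list of lists; wave[t] holds the neurons whose first spike
    occurred in time-bin t of the observed synconset wave."""
    loglik = 0.0
    for t in range(1, len(wave)):
        for j in wave[t]:
            # probability that at least one earlier-firing neuron drives j
            p_miss = np.prod([1.0 - conn_prob[i, j] for i in wave[t - 1]])
            loglik += np.log(max(1.0 - p_miss, 1e-12))
    return loglik

# Comparing this score across candidate connectivity matrices addresses the
# inverse problem: pick the network that makes the observed wave most likely.
```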

Relevance: 10.00%

Abstract:

This paper addresses the problem of finding optimal power control policies for wireless energy harvesting sensor (EHS) nodes with automatic repeat request (ARQ)-based packet transmissions. The EHS harvests energy from the environment according to a Bernoulli process and is required to operate within the constraint of energy neutrality. The EHS obtains partial channel state information (CSI) at the transmitter through the link-layer ARQ protocol, via the ACK/NACK feedback messages, and uses it to adapt the transmission power for the packet (re)transmission attempts. The underlying wireless fading channel is modeled as a finite-state Markov chain with known transition probabilities. The goal of the power management policy is thus to determine the best power setting for the current packet transmission attempt, so as to optimize a long-run objective such as the expected outage probability. The problem is addressed in a decision-theoretic framework by casting it as a partially observable Markov decision process (POMDP). Owing to the large size of the state space, the exact solution to the POMDP is computationally expensive. Hence, two popular approximate solutions are considered, which yield good power management policies for the transmission attempts. Monte Carlo simulation results illustrate the efficacy of the approach and show that the approximate solutions significantly outperform conventional approaches.
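
A minimal sketch of the belief-state update such a POMDP controller would maintain over the channel states, assuming an illustrative ACK/NACK observation model (the probability of an ACK in each channel state at the chosen power level):

```python
import numpy as np

def update_belief(belief, T, p_ack_given_state, ack):
    """One POMDP belief update over the finite channel states.
    belief: current distribution over channel states, shape (S,)
    T[s, s']: known channel transition probabilities
    p_ack_given_state[s']: probability of an ACK if the packet was sent in
      state s' at the chosen power level (illustrative observation model)
    ack: True if an ACK was received, False for a NACK."""
    predicted = belief @ T                        # prior for the next slot
    obs_lik = p_ack_given_state if ack else 1.0 - p_ack_given_state
    posterior = predicted * obs_lik
    return posterior / posterior.sum()

# The power-management policy then maps this belief (together with the
# residual energy, under the energy-neutrality constraint) to a power level.
```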

Relevance: 10.00%

Abstract:

The effects of the initial height on the temporal persistence probability of steady-state height fluctuations in up-down symmetric linear models of surface growth are investigated. We study the (1 + 1)-dimensional Family model and the (1 + 1)- and (2 + 1)-dimensional larger curvature (LC) model. Both the Family and LC models have up-down symmetry, so the positive and negative persistence probabilities in the steady state, averaged over all values of the initial height h0, are equal to each other. However, these two probabilities are not equal if one considers a fixed nonzero value of h0. Plots of the positive persistence probability for negative initial height versus time exhibit power-law behavior if the magnitude of the initial height is larger than the interface width at saturation. By symmetry, the negative persistence probability for positive initial height also exhibits the same behavior. The persistence exponent that describes this power-law decay decreases as the magnitude of the initial height is increased. The dependence of the persistence probability on the initial height, the system size, and the discrete sampling time is found to exhibit scaling behavior.
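
As a small illustration, a persistence exponent θ can be read off from such data by fitting P(t) ∝ t^(-θ) on a log-log scale over the power-law window (e.g., for initial heights larger than the saturation width); the arrays below are placeholders for the measured values.

```python
import numpy as np

def persistence_exponent(times, persistence_prob):
    """Estimate theta from P(t) ~ t^(-theta) by a log-log linear fit.
    times, persistence_prob: arrays restricted to the power-law window."""
    slope, _ = np.polyfit(np.log(times), np.log(persistence_prob), 1)
    return -slope
```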

Relevance: 10.00%

Abstract:

We develop an approximate analytical technique for evaluating the performance of multi-hop networks based on beacon-less CSMA/CA as standardised in IEEE 802.15.4, a popular standard for wireless sensor networks. The network comprises sensor nodes, which generate measurement packets, and relay nodes, which only forward packets. We consider a detailed stochastic process at each node and analyse this process taking into account the interaction with neighbouring nodes via certain unknown variables (e.g., channel sensing rates and collision probabilities). By coupling these per-node analyses, we obtain fixed-point equations that can be solved numerically for the unknown variables, thereby yielding approximations of time-average performance measures such as packet discard probabilities and average queueing delays. Different analyses arise for networks without hidden nodes and networks with hidden nodes. We apply this approach to the performance analysis of tree networks rooted at a data sink. Finally, we validate our analysis technique against simulations.
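
A generic sketch of the numerical step, assuming the coupled per-node equations are available as a callback `update_map` that recomputes the vector of unknowns (sensing rates, collision probabilities, and so on); damped successive substitution is one simple way to solve such fixed-point systems.

```python
import numpy as np

def solve_fixed_point(update_map, x0, damping=0.5, tol=1e-8, max_iters=10_000):
    """Solve x = update_map(x) by damped successive substitution.
    update_map: callback encoding the coupled per-node equations, mapping the
    current vector of unknown variables to its re-computed value.  Damping
    helps the iteration converge when the map is not a contraction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        x_new = (1 - damping) * x + damping * np.asarray(update_map(x))
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")
```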

Relevance: 10.00%

Abstract:

Real-world biological systems such as the human brain are inherently nonlinear and difficult to model. However, most previous studies have employed either linear models or parametric nonlinear models to investigate brain function. In this paper, we propose a novel application of a recurrence-based nonlinear measure of phase synchronization, the correlation between probabilities of recurrence (CPR), to the study of connectivity in the brain. Being nonparametric, this method makes very few assumptions, making it suitable for investigating brain function in a data-driven way. We demonstrate CPR's utility on multichannel electroencephalographic (EEG) signals. Brain connectivity obtained using the thresholded CPR matrix of multichannel EEG signals showed clear differences in the number and pattern of connections between (a) epileptic seizure and pre-seizure states and (b) eyes-open and eyes-closed states. The corresponding brain headmaps provide meaningful insights about synchronization in the brain in those states. K-means clustering of connectivity parameters of CPR and linear correlation obtained from global epileptic seizure and pre-seizure data showed significantly larger cluster-centroid distances for CPR than for linear correlation, demonstrating the superior ability of CPR to discriminate seizure from pre-seizure. In the case of focal epilepsy, the headmap clearly enables us to identify the focus of the epilepsy, which provides certain diagnostic value. (C) 2013 Elsevier Ltd. All rights reserved.
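
A minimal sketch of the CPR measure as it is commonly defined for recurrence-based synchronization analysis: compute each signal's probability of recurrence after a lag τ and correlate the two curves. The embedding and threshold choices are simplified here and are assumptions, not the paper's exact settings.

```python
import numpy as np

def recurrence_probability(x, eps, max_lag):
    """p(tau): fraction of state pairs (x_i, x_{i+tau}) closer than eps.
    x: 1-D signal or embedded state vectors along axis 0; max_lag < len(x)."""
    x = np.atleast_2d(np.asarray(x, dtype=float).T).T   # ensure shape (N, d)
    p = np.empty(max_lag)
    for tau in range(1, max_lag + 1):
        d = np.linalg.norm(x[:-tau] - x[tau:], axis=1)
        p[tau - 1] = np.mean(d < eps)
    return p

def cpr(x, y, eps_x, eps_y, max_lag=200):
    """Correlation between the probabilities of recurrence of two signals."""
    px = recurrence_probability(x, eps_x, max_lag)
    py = recurrence_probability(y, eps_y, max_lag)
    return np.corrcoef(px, py)[0, 1]     # Pearson correlation of p(tau) curves
```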

Relevance: 10.00%

Abstract:

Anthropogenic fires in seasonally dry tropical forests are a regular occurrence during the dry season. Forest managers in India, who presently follow a fire suppression policy in such forests, would benefit from a system for assessing the potential fire risk on a particular day. We examined the relationship between weather variables (seasonal rainfall, relative humidity, temperature) and days of fire during the dry seasons of 2004-2010, based on MODIS fire incidence data for the seasonally dry tropical forests of Mudumalai in the Western Ghats, southern India. Logistic regression analysis showed that high probabilities of a fire day, indicating successful ignition of litter and grass fuel on the forest floor, were associated with low levels of early dry season rainfall, low daily average relative humidity, and high daily average temperatures. These weather conditions are representative of low moisture levels in fine fuels, suggesting that the occurrence of fire is moderated by environmental conditions that reduce the flammability of fine fuels in the dry tropics. We propose a quantitative framework for assessing the risk of a fire day to assist forest managers in anticipating fire occurrences in this seasonally dry tropical forest, and possibly in others across South Asia.
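
A minimal sketch of such a logistic model of fire-day probability, with entirely hypothetical covariate values (early-season rainfall, relative humidity, temperature) and labels:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical daily records: [early-season rainfall (mm), mean RH (%),
# mean temperature (C)] and whether a fire was detected that day (1 = fire).
X = np.array([[12.0, 65.0, 27.5],
              [ 0.0, 38.0, 33.1],
              [ 3.5, 45.0, 31.0],
              [20.0, 70.0, 26.0]])
y = np.array([0, 1, 1, 0])

model = LogisticRegression().fit(X, y)
p_fire_today = model.predict_proba([[1.0, 40.0, 32.0]])[:, 1]
print(f"estimated probability of a fire day: {p_fire_today[0]:.2f}")
```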

Relevance: 10.00%

Abstract:

Complex biological systems such as the human brain can be expected to be inherently nonlinear and hence difficult to model. Most previous studies investigating brain function have used either linear models or parametric nonlinear models. In this paper, we propose a novel application of a recurrence-based nonlinear measure of phase synchronization, the correlation between probabilities of recurrence (CPR), to the study of seizures in the brain. The advantage of this nonparametric method is that it makes very few assumptions, making it possible to investigate brain function in a data-driven way. We demonstrate the utility of the CPR measure for studying phase synchronization in multichannel seizure EEG recorded from patients with global as well as focal epilepsy. For global epilepsy, brain synchronization obtained using the thresholded CPR matrix of multichannel EEG signals showed clear differences between epileptic seizure and pre-seizure states. Brain headmaps obtained for the seizure and pre-seizure cases provide meaningful insights about synchronization in the brain in those states. In the case of focal epilepsy, the headmap clearly enables us to identify the focus of the epilepsy, which provides certain diagnostic value. Comparative studies with linear correlation show that the nonlinear CPR measure outperforms the linear correlation measure. (C) 2014 Elsevier Ltd. All rights reserved.
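
As an illustration of the thresholding step that underlies the headmaps, the sketch below turns a pairwise CPR matrix of EEG channels into a binary connectivity graph and counts connections per channel; the threshold value is an assumption.

```python
import numpy as np

def connectivity_from_cpr(cpr_matrix, threshold=0.8):
    """Binary connectivity from a symmetric CPR matrix of EEG channels.
    An edge is kept where CPR exceeds the chosen threshold (assumed value)."""
    adj = (np.asarray(cpr_matrix) > threshold).astype(int)
    np.fill_diagonal(adj, 0)                  # ignore self-connections
    degree = adj.sum(axis=1)                  # number of connections per channel
    return adj, degree

# Comparing `degree` between seizure and pre-seizure epochs gives the change
# in the number and pattern of connections that the headmaps visualize.
```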

Relevance: 10.00%

Abstract:

The status of the endemic and endangered lion-tailed macaque (Macaca silenus) has not been properly assessed in several regions of the Western Ghats of southern India. We conducted a study in Parambikulam Forest Reserve in the state of Kerala to determine the distribution, demography, and status of lion-tailed macaques. We laid 5 km^2 grid cells on the map of the study area (644 km^2) and made four replicated walks in each grid cell using GPS. We gathered data on lion-tailed macaque group locations, demography, and site covariates, including trail length, duration of walk, proportion of evergreen forest, height of the tallest trees, and a human disturbance index. We also performed occupancy modeling using PRESENCE ver. 3.0. We estimated a minimum of 17 groups of macaques in these hills. Low detection and occupancy probabilities indicated a low density of lion-tailed macaques in the study area. Height of the tallest trees correlated positively, whereas human disturbance and the proportion of evergreen forest correlated negatively, with occupancy in grid cells. We also used data from earlier studies carried out in the surrounding Anamalai Tiger Reserve and Nelliyampathy Hills to discuss the conservation status in the larger Anamalai Hills landscape. This landscape harbors an estimated population of 1108 lion-tailed macaques, which is about one third of the entire estimated wild population of this species. A conservation plan for this landscape could be used as a model for conservation in other regions of the Western Ghats.
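
For context, occupancy software such as PRESENCE fits likelihoods of the general form sketched below (a single-season model with constant occupancy ψ and detection probability p, shown without the covariate links used in the study); it is an illustration, not the study's exact model.

```python
import numpy as np

def occupancy_neg_log_likelihood(params, detection_histories):
    """Single-season occupancy model with constant psi (occupancy) and
    p (detection).  detection_histories: 0/1 array with one row per grid
    cell and one column per replicated walk."""
    psi = 1 / (1 + np.exp(-params[0]))        # parameters on the logit scale
    p = 1 / (1 + np.exp(-params[1]))
    ll = 0.0
    for h in detection_histories:
        p_hist = psi * np.prod(p ** h * (1 - p) ** (1 - h))
        if h.sum() == 0:                      # cell where nothing was detected
            p_hist += 1 - psi
        ll += np.log(p_hist)
    return -ll

# Minimizing this (e.g., with scipy.optimize.minimize) yields detection and
# occupancy estimates; covariates such as tree height or disturbance enter
# through the logit links in the full model.
```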

Relevance: 10.00%

Abstract:

The structural annotation of proteins for which sequence-search methods detect no homologs of known 3D structure is a major challenge today. We propose an original method that computes the conditional probabilities for the amino-acid sequence of a protein to fit known protein 3D structures using a structural alphabet known as Protein Blocks (PBs). PBs constitute a library of 16 local structural prototypes that approximate every part of protein backbone structures. They are used to encode 3D protein structures into 1D PB sequences and to capture sequence-to-structure relationships. Our method relies on amino-acid occurrence matrices, one for each PB, to score global and local threading of query amino-acid sequences onto protein folds encoded as PB sequences. It does not use any information from residue contacts or sequence-search methods, nor any explicit incorporation of the hydrophobic effect. The performance of the method was assessed with independent test datasets derived from SCOP 1.75A. With a Z-score cutoff that achieved 95% specificity (i.e., fewer than 5% false positives), global and local threading showed sensitivities of 64.1% and 34.2%, respectively. We further tested its performance on 57 difficult CASP10 targets that had no known homologs in the PDB: 38 compatible templates were identified by our approach, and 66% of these hits yielded correctly predicted structures. The method scales up well and offers promising perspectives for structural annotation at the genomic level. It has been implemented as a web server that is freely available at http://www.bo-protscience.fr/forsa.
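
A minimal sketch of a global threading score of this kind: a 16 × 20 log-odds matrix (derived from the per-PB amino-acid occurrence counts; a placeholder here) is summed over aligned positions of the query sequence and the template's PB sequence. The matrix values, names, and scoring details are assumptions, not the method's exact implementation.

```python
import numpy as np

PB_STATES = "abcdefghijklmnop"                 # the 16 Protein Blocks
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"           # the 20 standard amino acids

def threading_score(query_seq, template_pb_seq, log_odds):
    """Global fit of an amino-acid sequence to a PB-encoded fold.
    log_odds: array of shape (16, 20) with the log-odds of each amino acid
    given each PB (learned from occurrence counts; placeholder here)."""
    assert len(query_seq) == len(template_pb_seq)
    score = 0.0
    for aa, pb in zip(query_seq, template_pb_seq):
        score += log_odds[PB_STATES.index(pb), AMINO_ACIDS.index(aa)]
    return score

# A Z-score can then be obtained by comparing `score` against the score
# distribution of shuffled or random sequences of the same length.
```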

Relevance: 10.00%

Abstract:

Large-scale estimates of the area of terrestrial surface waters have greatly improved over time, in particular through the development of multi-satellite methodologies, but the generally coarse spatial resolution (tens of km) of global observations is still inadequate for many ecological applications. The goal of this study is to introduce a new, globally applicable downscaling method and to demonstrate its applicability for deriving fine-resolution results from coarse global inundation estimates. The downscaling procedure predicts the location of surface water cover with an inundation probability map that was generated by bagged decision trees using globally available topographic and hydrographic information from the SRTM-derived HydroSHEDS database and trained on the wetland extent of the GLC2000 global land cover map. We applied the downscaling technique to the Global Inundation Extent from Multi-Satellites (GIEMS) dataset to produce a new high-resolution inundation map at a pixel size of 15 arc-seconds, termed GIEMS-D15. GIEMS-D15 represents three states of land surface inundation extent: mean annual minimum (total area, 6.5 x 10^6 km^2), mean annual maximum (12.1 x 10^6 km^2), and long-term maximum (17.3 x 10^6 km^2); the latter depicts the largest surface water area of any global map to date. While the accuracy of GIEMS-D15 reflects distribution errors introduced by the downscaling process as well as errors from the original satellite estimates, the overall accuracy is good yet spatially variable. A comparison against regional wetland cover maps generated from independent observations shows that the results adequately represent large floodplains and wetlands. GIEMS-D15 offers a higher-resolution delineation of inundated areas than previously available for the assessment of global freshwater resources and the study of large floodplain and wetland ecosystems. The technique of applying inundation probabilities also allows for coupling with coarse-scale hydro-climatological model simulations. (C) 2014 Elsevier Inc. All rights reserved.
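
A hedged sketch of the two steps, with placeholder covariates and data: train bagged decision trees to predict a per-pixel inundation probability, then fill each coarse cell's inundated fraction with its highest-probability fine pixels. The feature set, training rule, and allocation scheme here are assumptions, not the paper's exact configuration.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

# Placeholder training set: per-pixel covariates (e.g., elevation, slope,
# distance to drainage from HydroSHEDS) and wetland labels from GLC2000.
X_train = np.random.rand(500, 3)
y_train = (X_train[:, 0] < 0.3).astype(int)     # toy labelling rule

prob_model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50)
prob_model.fit(X_train, y_train)

def downscale_cell(fine_pixel_covariates, coarse_inundated_fraction):
    """Assign a coarse GIEMS cell's inundated fraction to the fine pixels
    with the highest predicted inundation probability."""
    p = prob_model.predict_proba(fine_pixel_covariates)[:, 1]
    n_wet = int(round(coarse_inundated_fraction * len(p)))
    wet = np.zeros(len(p), dtype=int)
    wet[np.argsort(p)[::-1][:n_wet]] = 1        # flag top-probability pixels
    return wet
```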

Relevance: 10.00%

Abstract:

Multi-temporal land-use information was derived from two decades of remote sensing data and simulated for 2012 and 2020 with Cellular Automata (CA), considering scenarios, change probabilities (through a Markov chain), and Multi-Criteria Evaluation (MCE). Agents and constraints were considered for modeling the urbanization process. Agents were normalized through fuzzification, and priority weights were assigned through Analytic Hierarchy Process (AHP) pairwise comparison for each factor (in the MCE) to derive behavior-oriented transition rules for each land-use class. The simulation shows good agreement with the classified data. Fuzzy logic and AHP helped in clearly analyzing the effects of the agents of growth, and CA-Markov proved to be a powerful tool for modelling, helping to capture and visualize the spatiotemporal patterns of urbanization. This provides a rapid land-evaluation framework with the essential insights into the urban trajectory needed for effective, sustainable city planning.
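
A minimal sketch of the Markov-chain step under these assumptions (class codes and maps are placeholders; the CA allocation, fuzzy/AHP weighting, and MCE rules are not shown): estimate transition probabilities by cross-tabulating two classified maps and project class proportions forward.

```python
import numpy as np

def transition_matrix(map_t1, map_t2, n_classes):
    """Row-stochastic land-use transition probabilities estimated by
    cross-tabulating two co-registered classified maps (arrays of class
    codes 0..n_classes-1)."""
    map_t1, map_t2 = np.asarray(map_t1), np.asarray(map_t2)
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(map_t1.ravel(), map_t2.ravel()):
        counts[a, b] += 1
    row = counts.sum(axis=1, keepdims=True)
    return counts / np.where(row == 0, 1, row)

def project(proportions, P, steps=1):
    """Project land-use class proportions `steps` periods ahead (Markov
    step); the CA then allocates these quantities spatially using the MCE
    suitability maps."""
    return proportions @ np.linalg.matrix_power(P, steps)
```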