113 results for "penalized likelihood"


Relevance: 10.00%

Abstract:

The Generalized Distributive Law (GDL) is a message passing algorithm which can efficiently solve a certain class of computational problems, and which includes as special cases the Viterbi algorithm, the BCJR algorithm, the fast Fourier transform, and turbo and LDPC decoding algorithms. In this paper, GDL-based maximum-likelihood (ML) decoding of Space-Time Block Codes (STBCs) is introduced, and a sufficient condition for an STBC to admit low GDL decoding complexity is given. Fast-decoding and multigroup decoding are the two algorithms used in the literature to ML decode STBCs with low complexity. An algorithm that combines the advantages of these two is called Conditional ML (CML) decoding. It is shown in this paper that the GDL decoding complexity of any STBC is upper bounded by its CML decoding complexity, and that there exist codes for which the GDL complexity is strictly less than the CML complexity. Explicit examples of two such families of STBCs are given in this paper. Thus the CML is in general suboptimal in reducing the ML decoding complexity of a code, and one should design codes with low GDL complexity rather than low CML complexity.
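The complexity saving that the GDL formalizes can be seen in a toy marginalization: pushing a sum inside a product of local kernels turns an O(|Q|^2) computation per output value into two O(|Q|) message-passing steps. The kernels f12 and f23 below are made up for illustration; this is a sketch of the distributive law itself, not an STBC decoder.

```python
import itertools

# Marginalize f(x1, x2, x3) = f12(x1, x2) * f23(x2, x3) over (x2, x3),
# first by brute force and then via the distributive law.
Q = range(4)  # common alphabet for all variables
f12 = {(a, b): (a + 2 * b + 1) for a in Q for b in Q}
f23 = {(b, c): (3 * b + c + 1) for b in Q for c in Q}

def marginal_brute(x1):
    # direct sum over (x2, x3): O(|Q|^2) terms per output value
    return sum(f12[x1, b] * f23[b, c] for b, c in itertools.product(Q, Q))

def marginal_gdl(x1):
    # distributive law: sum_b f12(x1, b) * (sum_c f23(b, c)),
    # i.e. a precomputed O(|Q|) "message" from the f23 node
    msg = {b: sum(f23[b, c] for c in Q) for b in Q}
    return sum(f12[x1, b] * msg[b] for b in Q)

results = [(marginal_brute(x1), marginal_gdl(x1)) for x1 in Q]
```

Both routes give identical marginals; only the operation count differs, which is exactly the saving message-passing decoders exploit on larger factor graphs.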


It has been shown recently that the maximum rate of a 2-real-symbol (single-complex-symbol) maximum likelihood (ML) decodable square space-time block code (STBC) with unitary weight matrices is 2a/2^a complex symbols per channel use (cspcu) for 2^a transmit antennas [1]. These STBCs are obtained from Unitary Weight Designs (UWDs). In this paper, we show that the maximum rates for 3- and 4-real-symbol (2-complex-symbol) ML decodable square STBCs from UWDs, for 2^a transmit antennas, are 3(a-1)/2^a and 4(a-1)/2^a cspcu, respectively. STBCs achieving these maximum rates are constructed. A set of sufficient conditions on the signal set, required for these codes to achieve full diversity, is derived, along with expressions for their coding gain.
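For concreteness, the maximum-rate expressions quoted above can be evaluated directly. The helper below simply encodes those formulas (for 2^a transmit antennas); it is an illustration of the stated rates, not anything from the paper's constructions.

```python
# Maximum rate (cspcu) of g-real-symbol ML decodable square STBCs from UWDs
# on 2**a transmit antennas, per the expressions quoted in the abstract.
def max_rate(a, g):
    if g == 2:
        return 2 * a / 2 ** a          # single-complex-symbol decodable
    if g in (3, 4):
        return g * (a - 1) / 2 ** a    # 3- and 4-real-symbol decodable
    raise ValueError("formula only stated for g = 2, 3, 4")

# Example: a = 2, i.e. 2**2 = 4 transmit antennas
rates_4tx = {g: max_rate(2, g) for g in (2, 3, 4)}
```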


For a family/sequence of Space-Time Block Codes (STBCs) C1, C2, ..., with increasing numbers of transmit antennas N_i and rates R_i complex symbols per channel use (cspcu), i = 1, 2, ..., the asymptotic normalized rate is defined as lim_{i→∞} R_i/N_i. A family of STBCs is said to be asymptotically good if the asymptotic normalized rate is non-zero, i.e., when the rate scales as a non-zero fraction of the number of transmit antennas, and asymptotically optimal if the asymptotic normalized rate is 1, the maximum possible value. In this paper, we construct a new class of full-diversity STBCs that have the least maximum-likelihood (ML) decoding complexity among all known codes for any number of transmit antennas N > 1 and rates R > 1 cspcu. For a large set of (R, N) pairs, the new codes have lower ML decoding complexity than the codes already available in the literature. Among the new codes, the class of full-rate codes (R = N) is asymptotically optimal and fast-decodable, and for N > 5 has lower ML decoding complexity than all other families of asymptotically optimal, fast-decodable, full-diversity STBCs available in the literature. The construction of the new STBCs is facilitated by the following further contributions of this paper: (i) construction of a new class of asymptotically good, full-diversity multigroup ML decodable codes that not only includes STBCs for a larger set of antennas, but also either matches in rate or contains as a proper subset all other high-rate or asymptotically good, delay-optimal, multigroup ML decodable codes available in the literature; (ii) construction of a new class of fast-group-decodable codes (codes that combine the low ML decoding complexity properties of multigroup ML decodable codes and fast-decodable codes) for all even numbers of transmit antennas and rates 1 < R ≤ 5/4; (iii) given a design with full-rank linear dispersion matrices, we show that a full-diversity STBC can be constructed from this design by encoding the real symbols independently using only regular PAM constellations.
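The asymptotic-normalized-rate definition above is easy to check numerically for a few illustrative (made-up) code families:

```python
# For a family with rates R_i on N_i antennas, the asymptotic normalized
# rate is lim R_i / N_i; we just inspect the sequence for large i.
def normalized_rates(family, terms=60):
    # family(i) -> (R_i, N_i); return the sequence R_i / N_i
    return [R / N for R, N in (family(i) for i in range(1, terms + 1))]

full_rate = lambda i: (2 ** i, 2 ** i)   # R = N: asymptotically optimal (ratio 1)
fixed_rate = lambda i: (1, 2 * i)        # constant rate: not asymptotically good
half_rate = lambda i: (i, 2 * i)         # R = N/2: asymptotically good, ratio 1/2
```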


Land cover (LC) and land use (LU) dynamics induced by human and natural processes play a major role in global as well as regional patterns of landscapes, influencing biodiversity, hydrology, ecology and climate. Changes in LC features resulting in forest fragmentation have posed direct threats to biodiversity, endangering the sustainability of ecological goods and services. Habitat fragmentation is of added concern, as the residual spatial patterns mitigate or exacerbate edge effects. LU dynamics are obtained by classifying temporal remotely sensed satellite imagery of different spatial and spectral resolutions. This paper reviews five different image classification algorithms using spatio-temporal data of a temperate watershed in Himachal Pradesh, India. The Gaussian Maximum Likelihood classifier was found to be apt for analysing spatial patterns at regional scale, based on accuracy assessment through error matrices and ROC (receiver operating characteristic) curves. The LU information thus derived was then used to assess spatial changes from temporal data using principal component analysis and correspondence analysis based image differencing. Forest area dynamics were further studied by analysing the different types of fragmentation through forest fragmentation models. The computed forest fragmentation and landscape metrics show a decline of interior intact forests with a substantial increase in patch forest during 1972-2007.


Background & objectives: There is a need to develop an affordable and reliable tool for hearing screening of neonates in resource-constrained, medically underserved areas of developing nations. This study evaluates a strategy of health-worker-based screening of neonates using a low-cost mechanical calibrated noisemaker, followed up with parental monitoring of age-appropriate auditory milestones, for detecting severe-profound hearing impairment in infants by 6 months of age. Methods: A trained health worker under the supervision of a qualified audiologist screened 425 neonates, of whom 20 had confirmed severe-profound hearing impairment. Mechanical calibrated noisemakers of 50, 60, 70 and 80 dB (A) were used to elicit the behavioural responses. The parents of screened neonates were instructed to monitor the normal language and auditory milestones till 6 months of age. This strategy was validated against a reference standard consisting of a battery of tests, namely auditory brain stem response (ABR), otoacoustic emissions (OAE) and behavioural assessment at 2 years of age. Bayesian prevalence-weighted measures of screening were calculated. Results: The sensitivity and specificity were high, with the fewest false-positive referrals for the 70 and 80 dB (A) noisemakers. All the noisemakers had 100 per cent negative predictive value. The 70 and 80 dB (A) noisemakers had high positive likelihood ratios of 19 and 34, respectively. The differences between pre- and post-test positive probabilities were 43 and 58 for the 70 and 80 dB (A) noisemakers, respectively. Interpretation & conclusions: In a controlled setting, health workers with primary education can be trained to use a mechanical calibrated noisemaker made of locally available material to reliably screen for severe-profound hearing loss in neonates. The monitoring of auditory responses could be done by informed parents. Multi-centre field trials of this strategy need to be carried out to examine the feasibility of community health care workers using it in resource-constrained settings of developing nations to implement an effective national neonatal hearing screening programme.
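The prevalence-weighted screening measures reported above all derive from a 2x2 confusion table. A minimal sketch, with illustrative counts chosen to mimic the study's setting (20 impaired among 425 neonates) rather than its actual data:

```python
# Standard screening measures from a 2x2 table of
# true/false positives and negatives.
def screening_measures(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),          # positive predictive value
        "NPV": tn / (tn + fn),          # negative predictive value
        "LR+": sens / (1 - spec),       # positive likelihood ratio
    }

# Illustrative counts: perfect sensitivity, a handful of false referrals.
m = screening_measures(tp=20, fp=12, fn=0, tn=393)
```

With these made-up counts the positive likelihood ratio comes out near 34, matching the scale of the values the abstract reports for the louder noisemakers.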


The constraint complexity of a graphical realization of a linear code is the maximum dimension of the local constraint codes in the realization. The treewidth of a linear code is the least constraint complexity of any of its cycle-free graphical realizations. This notion provides a useful parameterization of the maximum-likelihood decoding complexity for linear codes. In this paper, we show the surprising fact that for maximum distance separable codes and Reed-Muller codes, treewidth equals trelliswidth, which, for a code, is defined to be the least constraint complexity (or branch complexity) of any of its trellis realizations. From this, we obtain exact expressions for the treewidth of these codes, which constitute the only known explicit expressions for the treewidth of algebraic codes.
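The state-complexity profile underlying these notions can be computed by brute force for a small code: at cut i the state-space dimension is s_i = k - dim(past subcode) - dim(future subcode). The sketch below does this for the (8,4) Reed-Muller code RM(1,3) in its standard coordinate order; it illustrates only the definition, not the paper's optimization over cycle-free realizations.

```python
import itertools

# Generator matrix of RM(1,3): the all-ones row plus the three
# coordinate functions on F_2^3.
G = [[1, 1, 1, 1, 1, 1, 1, 1],
     [0, 1, 0, 1, 0, 1, 0, 1],
     [0, 0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 0, 1, 1, 1, 1]]
k, n = len(G), len(G[0])

# Enumerate all 2^k codewords.
codewords = set()
for coeffs in itertools.product((0, 1), repeat=k):
    cw = tuple(sum(c * g for c, g in zip(coeffs, col)) % 2 for col in zip(*G))
    codewords.add(cw)

def subcode_dim(first, last):
    # dimension of the subcode supported entirely on coordinates [first, last)
    count = sum(1 for cw in codewords
                if all(cw[j] == 0 for j in range(n) if not first <= j < last))
    return count.bit_length() - 1   # count is always a power of two

# State-space dimension at every cut of the minimal trellis (this order).
profile = [k - subcode_dim(0, i) - subcode_dim(i, n) for i in range(n + 1)]
```

The profile peaks at 3 for this order; trellis-width additionally minimizes such peaks over realizations, which this brute-force sketch does not attempt.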


The magnetorotational instability (MRI) is a crucial mechanism of angular momentum transport in a variety of astrophysical accretion disks. In systems accreting at well below the Eddington rate, such as the central black hole in the Milky Way (Sgr A*), the plasma in the disk is essentially collisionless. We present a nonlinear study of the collisionless MRI using first-principles particle-in-cell plasma simulations. We focus on local two-dimensional (axisymmetric) simulations, deferring more realistic three-dimensional simulations to future work. For simulations with net vertical magnetic flux, the MRI continuously amplifies the magnetic field, B, until the Alfvén velocity, v_A, is comparable to the speed of light, c (independent of the initial value of v_A/c). This is consistent with the lack of saturation of MRI channel modes in analogous axisymmetric MHD simulations. The amplification of the magnetic field by the MRI generates a significant pressure anisotropy in the plasma (with the pressure perpendicular to B being larger than the parallel pressure). We find that this pressure anisotropy in turn excites mirror modes, and that the volume-averaged pressure anisotropy remains near the threshold for mirror mode excitation. Particle energization is due to both reconnection and viscous heating associated with the pressure anisotropy. Reconnection produces a distinctive power-law component in the energy distribution function of the particles, indicating the likelihood of non-thermal ion and electron acceleration in collisionless accretion disks. This has important implications for interpreting the observed emission, from the radio to the gamma-rays, of systems such as Sgr A*.


Trypanosomatids cause deadly diseases in humans. Of the various biochemical pathways in trypanosomatids, glycolysis has received special attention because it is sequestered in peroxisome-like organelles critical for the survival of the parasites. This study focuses on phosphoglycerate kinase (PGK) from Leishmania spp., which exists in two isoforms, the cytoplasmic PGKB and the glycosomal PGKC, differing in their biochemical properties. Computational analysis predicted the likelihood of a transmembrane helix only in the glycosomal isoform PGKC, of approximate length 20 residues within the 62-residue extension, ending at arginine residues R471 and R472. From experimental studies using circular dichroism and NMR with deuterated sodium dodecyl sulfate, we find that the transmembrane helix spans residues 448 +/- 2 to 476 in Leishmania mexicana PGKC. The significance of this observation is discussed in the context of glycosomal transport and substrate tunneling.
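As a generic illustration of how a transmembrane-helix likelihood is flagged computationally: a sliding-window hydropathy average (here the standard Kyte-Doolittle scale) highlights stretches of ~20 hydrophobic residues. This is not the authors' prediction pipeline, and the toy sequences below are made up.

```python
# Kyte-Doolittle hydropathy values per amino acid.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def hydropathy_windows(seq, w=19):
    # mean hydropathy over every window of w residues
    return [sum(KD[a] for a in seq[i:i + w]) / w for i in range(len(seq) - w + 1)]

def has_tm_segment(seq, w=19, cutoff=1.6):
    # windows averaging above ~1.6 are classic membrane-spanning candidates
    return any(h > cutoff for h in hydropathy_windows(seq, w))

soluble = "MKTAYIAKQRQDNSETGSQEKQNDARKEQ"                 # toy hydrophilic tail
membrane = "DDKE" + "LIVLAILVAGLIVLLAGIVLA" + "RRKE"      # toy hydrophobic core
```

Note that the basic residues flanking the toy "membrane" core echo the abstract's observation that the predicted helix ends at arginines.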


We reconsider standard uniaxial fatigue test data obtained from handbooks. Many S-N curve fits to such data represent the median life and exclude load-dependent variance in life. Presently available approaches for incorporating probabilistic aspects explicitly within the S-N curves have some shortcomings, which we discuss. We propose a new linear S-N fit with a prespecified failure probability, load-dependent variance, and reasonable behavior at extreme loads. We fit our parameters using maximum likelihood, show the reasonableness of the fit using Q-Q plots, and obtain standard error estimates via Monte Carlo simulations. The proposed fitting method may be used for obtaining S-N curves from the same data as already available, with the same mathematical form, but in cases in which the failure probability is smaller, say, 10 % instead of 50 %, and in which the fitted line is not parallel to the 50 % (median) line.
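A minimal sketch of such a fit, assuming log-life is linear in stress with a load-proportional noise standard deviation sigma(s) = c*s: under this variance model the ML line reduces to weighted least squares with weights 1/s^2, and c follows from the standardized residuals. The data below are synthetic, not handbook data.

```python
import math
import random

random.seed(1)
a_true, b_true, c_true = 12.0, -0.02, 0.002
stress = list(range(200, 600, 10)) * 3        # three replicates per load
logN = [a_true + b_true * s + random.gauss(0, c_true * s) for s in stress]

def ml_fit(s, y):
    # ML for y_i ~ N(a + b*s_i, (c*s_i)^2): weighted LS with weights 1/s_i^2
    w = [1.0 / si ** 2 for si in s]
    W = sum(w)
    sw = sum(wi * si for wi, si in zip(w, s))
    yw = sum(wi * yi for wi, yi in zip(w, y))
    sww = sum(wi * si * si for wi, si in zip(w, s))
    syw = sum(wi * si * yi for wi, si, yi in zip(w, s, y))
    b = (syw - sw * yw / W) / (sww - sw * sw / W)
    a = (yw - b * sw) / W
    # ML estimate of c from the standardized residuals
    c2 = sum(((yi - a - b * si) / si) ** 2 for si, yi in zip(s, y)) / len(s)
    return a, b, math.sqrt(c2)

a_hat, b_hat, c_hat = ml_fit(stress, logN)

def sn_quantile(s, z_p=-1.2816):
    # S-N line for a prespecified failure probability (z_p: 10% here),
    # not parallel to the median line because the noise scales with load
    return a_hat + b_hat * s + z_p * c_hat * s
```

Because sigma grows with s, the 10% line diverges from the median line at high loads, which is precisely the "not parallel" behavior described above.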


The rapid disruption of tropical forests probably imperils global biodiversity more than any other contemporary phenomenon(1-3). With deforestation advancing quickly, protected areas are increasingly becoming final refuges for threatened species and natural ecosystem processes. However, many protected areas in the tropics are themselves vulnerable to human encroachment and other environmental stresses(4-9). As pressures mount, it is vital to know whether existing reserves can sustain their biodiversity. A critical constraint in addressing this question has been that data describing a broad array of biodiversity groups have been unavailable for a sufficiently large and representative sample of reserves. Here we present a uniquely comprehensive data set on changes over the past 20 to 30 years in 31 functional groups of species and 21 potential drivers of environmental change, for 60 protected areas stratified across the world's major tropical regions. Our analysis reveals great variation in reserve "health": about half of all reserves have been effective or performed passably, but the rest are experiencing an erosion of biodiversity that is often alarmingly widespread taxonomically and functionally. Habitat disruption, hunting and forest-product exploitation were the strongest predictors of declining reserve health. Crucially, environmental changes immediately outside reserves seemed nearly as important as those inside in determining their ecological fate, with changes inside reserves strongly mirroring those occurring around them. These findings suggest that tropical protected areas are often intimately linked ecologically to their surrounding habitats, and that a failure to stem broad-scale loss and degradation of such habitats could sharply increase the likelihood of serious biodiversity declines.


Real-time image reconstruction is essential for improving the temporal resolution of fluorescence microscopy. A number of unavoidable processes, such as optical aberration, noise and scattering, degrade image quality, thereby making image reconstruction an ill-posed problem. Maximum likelihood is an attractive technique for data reconstruction, especially when the problem is ill-posed. The iterative nature of the maximum likelihood technique precludes real-time imaging. Here we propose and demonstrate a compute unified device architecture (CUDA) based fast computing engine for real-time 3D fluorescence imaging. A maximum performance boost of 210x is reported. The easy availability of powerful computing engines is a boon and may accelerate the realization of real-time 3D fluorescence imaging. http://dx.doi.org/10.1063/1.4754604
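For Poisson image statistics, the iterative ML reconstruction alluded to above is commonly realized by the Richardson-Lucy update x ← x * A^T(y / Ax). A 1-D toy version of that iteration (a sketch of the algorithm, not the authors' CUDA engine):

```python
def convolve(x, k):
    # zero-padded correlation with a short kernel (symmetric k here)
    r = len(k) // 2
    n = len(x)
    return [sum(x[i + j - r] * k[j] for j in range(len(k))
                if 0 <= i + j - r < n) for i in range(n)]

def richardson_lucy(y, k, iters=200):
    # multiplicative ML update for Poisson data: x <- x * A^T(y / Ax)
    k_flipped = k[::-1]
    x = [1.0] * len(y)                     # flat non-negative start
    for _ in range(iters):
        blurred = convolve(x, k)
        ratio = [yi / max(bi, 1e-12) for yi, bi in zip(y, blurred)]
        correction = convolve(ratio, k_flipped)
        x = [xi * ci for xi, ci in zip(x, correction)]
    return x

kernel = [0.25, 0.5, 0.25]                 # toy point-spread function
truth = [0.0] * 8 + [10.0] + [0.0] * 8     # a single bright point
blurry = convolve(truth, kernel)
restored = richardson_lucy(blurry, kernel)
```

Each iteration costs two convolutions over the whole image, which is why the method maps so naturally onto a GPU: every output element is an independent, data-parallel reduction.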


We propose an iterative data reconstruction technique specifically designed for multi-dimensional multi-color fluorescence imaging. A Markov random field is employed (for modeling the multi-color image field) in conjunction with the classical maximum likelihood method. It is noted that the ill-posed nature of the inverse problem associated with multi-color fluorescence imaging forces iterative data reconstruction. Reconstruction of three-dimensional (3D) two-color images (obtained from nanobeads and cultured cell samples) shows a significant reduction in the background noise (improved signal-to-noise ratio) with an impressive overall improvement in the spatial resolution (approximately 250 nm) of the imaging system. The proposed data reconstruction technique may find immediate application in 3D in vivo and in vitro multi-color fluorescence imaging of biological specimens. http://dx.doi.org/10.1063/1.4769058
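To sketch how an MRF prior couples with a likelihood term in this kind of estimation, the toy below runs iterated conditional modes on a 1-D two-label field with a Gaussian likelihood and a Potts smoothness prior. This is a deliberately simplified stand-in for the paper's 3D two-color reconstruction, with hand-picked observations rather than real data.

```python
means = {0: 0.0, 1: 1.0}     # per-color mean intensity (likelihood model)
beta, sigma = 1.2, 0.4       # Potts smoothness weight, noise std

truth = [0] * 5 + [1] * 5
# Hand-picked noisy observations; indices 2 and 7 are outliers that a
# likelihood-only (pure ML) labeling gets wrong.
obs = [0.10, -0.05, 0.85, 0.05, 0.10, 0.95, 1.10, 0.15, 0.90, 1.05]

def icm(obs, sweeps=10):
    labels = [0 if o < 0.5 else 1 for o in obs]   # likelihood-only start
    for _ in range(sweeps):
        for i in range(len(obs)):
            def energy(l):
                # Gaussian data term plus Potts penalty on disagreeing neighbors
                data = (obs[i] - means[l]) ** 2 / (2 * sigma ** 2)
                nbrs = sum(1 for j in (i - 1, i + 1)
                           if 0 <= j < len(obs) and labels[j] != l)
                return data + beta * nbrs
            labels[i] = min((0, 1), key=energy)
    return labels

restored_labels = icm(obs)
```

The MRF term is exactly what suppresses the isolated mislabels that the likelihood alone produces, which mirrors the background-noise reduction described above.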


Low-density parity-check (LDPC) codes are a class of linear block codes that are decoded by running the belief propagation (BP) algorithm, or its log-likelihood ratio variant (LLR-BP), over the factor graph of the code. One of the disadvantages of LDPC codes is the onset of an error floor at high values of signal-to-noise ratio, caused by trapping sets. In this paper, we propose a two-stage decoder to deal with different types of trapping sets. Oscillating trapping sets are handled by the first stage of the decoder, and elementary trapping sets are handled by the second stage. Simulation results on the regular PEG (504,252,3,6) code and the irregular PEG (1024,518,15,8) code show that the proposed two-stage decoder performs significantly better than the standard decoder.
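A minimal sketch of the LLR-BP (sum-product) decoder named above, run on a toy 6-bit parity-check code over a BPSK/AWGN channel; the H matrix and channel values are illustrative, not the PEG codes from the paper.

```python
import math

# Toy (6,3) parity-check matrix.
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def llr_bp(channel_llr, H, iters=20):
    m, n = len(H), len(H[0])
    # variable-to-check messages, initialized to the channel LLRs
    V = [[channel_llr[j] if H[i][j] else 0.0 for j in range(n)] for i in range(m)]
    for _ in range(iters):
        # check-to-variable messages via the tanh rule
        C = [[0.0] * n for _ in range(m)]
        for i in range(m):
            for j in range(n):
                if H[i][j]:
                    prod = 1.0
                    for k in range(n):
                        if H[i][k] and k != j:
                            prod *= math.tanh(V[i][k] / 2)
                    prod = min(max(prod, -0.999999), 0.999999)
                    C[i][j] = 2 * math.atanh(prod)
        # total LLRs and updated variable-to-check messages
        total = [channel_llr[j] + sum(C[i][j] for i in range(m) if H[i][j])
                 for j in range(n)]
        for i in range(m):
            for j in range(n):
                if H[i][j]:
                    V[i][j] = total[j] - C[i][j]
    return [0 if t >= 0 else 1 for t in total]

# all-zero codeword sent as BPSK(+1); bit 2 is pushed toward an error
sigma2 = 0.5
received = [0.9, 1.1, -0.2, 1.0, 0.8, 1.2]
llrs = [2 * r / sigma2 for r in received]   # channel LLRs for BPSK over AWGN
decoded = llr_bp(llrs, H)
```

A hard decision on the channel outputs alone would flip bit 2; the check-node messages pull its LLR back positive, which is the basic mechanism a trapping set defeats on larger graphs.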


Spatial modulation (SM) and space shift keying (SSK) are relatively new modulation techniques that are attractive in multi-antenna communications. Single-carrier (SC) systems can avoid the peak-to-average power ratio (PAPR) problem encountered in multicarrier systems. In this paper, we study SM and SSK signaling in cyclic-prefixed SC (CPSC) systems on MIMO-ISI channels. We present a diversity analysis of MIMO-CPSC systems under SSK and SM signaling. Our analysis shows that the diversity order achieved by the (n_t, n_r) SSK scheme and the (n_t, n_r, Θ_M) SM scheme in MIMO-CPSC systems under maximum-likelihood (ML) detection is n_r, where n_t and n_r denote the number of transmit and receive antennas and Θ_M denotes the modulation alphabet of size M. Bit error rate (BER) simulation results validate this predicted diversity order. Simulation results also show that MIMO-CPSC with SM and SSK achieves much better performance than MIMO-OFDM with SM and SSK.
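A sketch of joint ML detection for spatial modulation: one of n_t antennas is active per channel use and carries a QPSK symbol, and the detector searches over (antenna index, symbol) pairs. This is a toy flat-fading model, not the CPSC/ISI setting analyzed in the paper.

```python
import random

random.seed(3)
n_t, n_r = 4, 2
qpsk = [c / abs(c) for c in (1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j)]

def randc():
    # unit-variance complex Gaussian channel coefficient
    return complex(random.gauss(0, 0.7071), random.gauss(0, 0.7071))

H = [[randc() for _ in range(n_t)] for _ in range(n_r)]   # n_r x n_t channel

def ml_detect(y):
    # joint ML search over the active-antenna index and the QPSK symbol:
    # minimize sum_r |y_r - H[r][a] * s|^2 over all (a, s)
    return min(((a, s) for a in range(n_t) for s in qpsk),
               key=lambda cand: sum(abs(y[r] - H[r][cand[0]] * cand[1]) ** 2
                                    for r in range(n_r)))

# noiseless sanity check: every (antenna, symbol) hypothesis is recovered
ok = all(ml_detect([H[r][a] * s for r in range(n_r)]) == (a, s)
         for a in range(n_t) for s in qpsk)
```

Each channel use conveys log2(n_t) bits in the antenna index plus log2(M) bits in the symbol, which is the appeal of SM; SSK is the special case that drops the symbol and signals with the antenna index alone.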


A low-complexity, essentially-ML decoding technique for the Golden code and the three-antenna Perfect code was introduced by Sirianunpiboon, Howard and Calderbank. Though no theoretical analysis of the decoder was given, simulations showed that this decoding technique has almost maximum-likelihood (ML) performance. Inspired by this technique, in this paper we introduce two new low-complexity decoders for Space-Time Block Codes (STBCs): the Adaptive Conditional Zero-Forcing (ACZF) decoder and the ACZF decoder with successive interference cancellation (ACZF-SIC), which include as a special case the decoding technique of Sirianunpiboon et al. We show that both the ACZF and ACZF-SIC decoders are capable of achieving full diversity, and we give a set of sufficient conditions for an STBC to achieve full diversity with these decoders. We then show that the Golden code, the three- and four-antenna Perfect codes, the three-antenna Threaded Algebraic Space-Time code and the four-antenna rate-2 code of Srinath and Rajan are all full-diversity ACZF/ACZF-SIC decodable with complexity strictly less than that of their ML decoders. Simulations show that the proposed decoding method performs identically to ML decoding for all these five codes. These STBCs, along with the proposed decoding algorithm, have the least decoding complexity and best error performance among all known codes for these numbers of transmit antennas. We further provide a lower bound on the complexity of full-diversity ACZF/ACZF-SIC decoding. All five codes listed above achieve this lower bound and hence are optimal in terms of minimizing the ACZF/ACZF-SIC decoding complexity. Both the ACZF and ACZF-SIC decoders are amenable to sphere decoding implementation.
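The conditional zero-forcing idea can be sketched generically: split the symbols into two groups, enumerate hypotheses for group A only, zero-force (least-squares solve and quantize) group B for each hypothesis, and keep the pair with the best ML metric. The toy real-valued linear model below is illustrative and is not one of the specific STBCs above.

```python
import itertools
import random

random.seed(7)
PAM = (-3, -1, 1, 3)
nA, nB, n = 2, 2, 6          # symbols per group, receive dimension

# Random effective "dispersion" matrices for the two symbol groups.
GA = [[random.gauss(0, 1) for _ in range(nA)] for _ in range(n)]
GB = [[random.gauss(0, 1) for _ in range(nB)] for _ in range(n)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def lstsq2(M, y):
    # least-squares solve for a 2-column matrix via the 2x2 normal equations
    a = sum(row[0] ** 2 for row in M)
    b = sum(row[0] * row[1] for row in M)
    d = sum(row[1] ** 2 for row in M)
    r0 = sum(row[0] * yi for row, yi in zip(M, y))
    r1 = sum(row[1] * yi for row, yi in zip(M, y))
    det = a * d - b * b
    return [(d * r0 - b * r1) / det, (a * r1 - b * r0) / det]

def quant(v):
    return [min(PAM, key=lambda p: (p - x) ** 2) for x in v]

def aczf_decode(y):
    best, best_metric = None, float("inf")
    for xA in itertools.product(PAM, repeat=nA):      # enumerate group A only
        resid = [yi - gi for yi, gi in zip(y, matvec(GA, xA))]
        xB = quant(lstsq2(GB, resid))                 # zero-force group B
        err = [ri - gi for ri, gi in zip(resid, matvec(GB, xB))]
        metric = sum(e ** 2 for e in err)             # ML metric of the pair
        if metric < best_metric:
            best, best_metric = (list(xA), xB), metric
    return best

tx_A, tx_B = [1, -3], [3, -1]
y_clean = [a + b for a, b in zip(matvec(GA, tx_A), matvec(GB, tx_B))]
decoded = aczf_decode(y_clean)
```

Only |PAM|^2 = 16 group-A hypotheses are enumerated here, against 256 for an exhaustive joint search, which is the complexity saving the ACZF family exploits on real STBCs.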