186 results for Burg, A.
Abstract:
Thesis (doctoral)--Rheinische Friedrich-Wilhelms-Universität, Bonn.
Abstract:
This paper considers the applicability of the least mean fourth (LMF) power gradient adaptation criterion, which offers an 'advantage' for signals corrupted by Gaussian noise whose power is not known. The proposed method, used as an adaptive spectral estimator, is found to provide better performance than least mean square (LMS) adaptation at the same (or even lower) speed of convergence for signals having a sufficiently high signal-to-Gaussian-noise ratio. The results include a comparison of the performance of the LMS tapped delay line, LMF tapped delay line, LMS lattice and LMF lattice algorithms, with Burg's block data method as a reference. Signals such as noisy sinusoids and stochastic signals such as the EEG are considered in this study.
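As background for the comparison, a minimal sketch of the two update rules (generic gradient-descent forms, not the paper's tapped-delay-line or lattice implementations; the step sizes `mu` are arbitrary choices):

```python
import numpy as np

def lms_step(w, x, d, mu=1e-2):
    """One LMS update: descend the mean-square error E[e^2]."""
    e = d - w @ x
    return w + mu * e * x            # update proportional to e

def lmf_step(w, x, d, mu=1e-4):
    """One LMF update: descend the mean-fourth error E[e^4]."""
    e = d - w @ x
    return w + mu * (e ** 3) * x     # update proportional to e^3

# Toy identification of a 4-tap filter observed in Gaussian noise.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2, 0.1])
w_lms, w_lmf = np.zeros(4), np.zeros(4)
for _ in range(5000):
    x = rng.standard_normal(4)
    d = w_true @ x + 0.05 * rng.standard_normal()
    w_lms, w_lmf = lms_step(w_lms, x, d), lmf_step(w_lmf, x, d)
```

Cubing the error de-emphasizes small noise-driven residuals relative to LMS, which is the intuition behind the advantage claimed above at high signal-to-noise ratio.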
Abstract:
The initial boundary value problem for the Burgers equation in the domain x ≥ 0, t > 0 with a flux boundary condition at x = 0 has been solved exactly. The behaviour of the solution as t tends to infinity is studied and the "asymptotic profile at infinity" is obtained. In addition, the uniqueness of the solution of the initial boundary value problem is proved and its inviscid limit as ε → 0 is obtained.
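For concreteness, one plausible statement of such a problem (illustrative; the exact flux condition q(t) and initial data u_0 used in the paper are assumptions here):

```latex
% Viscous Burgers equation on the half-line, with the natural flux
% (u^2/2 - \varepsilon u_x) prescribed at the boundary x = 0.
\begin{align*}
  u_t + u\,u_x &= \varepsilon\,u_{xx}, & x &\ge 0,\ t > 0,\\
  u(x,0) &= u_0(x), & x &\ge 0,\\
  \tfrac{1}{2}u^2(0,t) - \varepsilon\,u_x(0,t) &= q(t), & t &> 0.
\end{align*}
```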
Abstract:
We address the problem of robust formant tracking in continuous speech in the presence of additive noise. We propose a new approach based on mixture modeling of the formant contours. Our approach consists of two main steps: (i) computation of a pyknogram based on multiband amplitude-modulation/frequency-modulation (AM/FM) decomposition of the input speech; and (ii) statistical modeling of the pyknogram using mixture models. We experiment with both the Gaussian mixture model (GMM) and the Student's-t mixture model (tMM) and show that the latter is more robust with respect to handling outliers in the pyknogram data, parameter selection, accuracy, and smoothness of the estimated formant contours. Experimental results on simulated data as well as noisy speech data show that the proposed tMM-based approach is also robust to additive noise. We present performance comparisons with a recently proposed adaptive filterbank technique and the classical Burg spectral estimator, which show that the proposed technique is more robust to noise.
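A rough sketch of step (ii), using scikit-learn's GaussianMixture as a stand-in (the paper's tMM replaces the Gaussian components with heavier-tailed Student's-t components; the function name and toy frame below are hypothetical):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def formant_candidates(pyknogram_freqs, n_formants=4):
    """Fit a mixture to one frame's pyknogram frequency samples and
    read the sorted component means as candidate formant frequencies."""
    X = np.asarray(pyknogram_freqs, dtype=float).reshape(-1, 1)  # Hz
    gmm = GaussianMixture(n_components=n_formants, random_state=0).fit(X)
    return np.sort(gmm.means_.ravel())

# Toy frame: energy concentrated near 500/1500/2500/3500 Hz plus outliers.
rng = np.random.default_rng(1)
frame = np.concatenate([rng.normal(f, 60, 200) for f in (500, 1500, 2500, 3500)]
                       + [rng.uniform(0, 4000, 40)])   # broadband outliers
print(formant_candidates(frame))
```

The heavy tails of Student's-t components down-weight exactly the kind of broadband outliers injected above, which is the robustness argument made in the abstract.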
Abstract:
Individuals with genetic defects in CD40 ligand (CD40L) or in the B-cell antigen receptor coreceptor molecules CD19 and CD81 suffer from an antibody deficiency. Nevertheless, these patients carry low levels of memory B cells and serum antibodies.
Abstract:
Immunoglobulin superfamily (IgSF) domains are conserved structures present in many proteins in eukaryotes and prokaryotes. These domains readily accommodate sequence variation, as is most clearly illustrated by the variable regions of immunoglobulins (Igs) and T cell receptors (TRs). We studied an antibody-deficient patient suffering from recurrent respiratory infections and with impaired antibody responses to vaccinations. The patient's B cells showed impaired Ca(2+) influx upon stimulation with anti-IgM and lacked detectable CD19 membrane expression. CD19 sequence analysis revealed a homozygous missense mutation resulting in a tryptophan-to-cysteine (W52C) amino acid change. The affected tryptophan, CONSERVED-TRP 41, is located on the C-strand of the first extracellular IgSF domain of CD19 and was found to be highly conserved, not only in mammalian CD19 proteins but in nearly all characterized IgSF domains. Furthermore, this tryptophan is present in all Ig and TR variable domains and was not mutated in 117 Ig class-switched transcripts of B cells from controls, despite an overall 10% amino acid change frequency. In vitro complementation studies and CD19 western blotting of the patient's B cells demonstrated that the mutated protein remained immaturely glycosylated. This first missense mutation resulting in CD19 deficiency demonstrates the crucial role of a highly conserved tryptophan in the proper folding or stability of IgSF domains.
Abstract:
An extension of approximate computing, significance-based computing exploits applications' inherent error resilience and offers a new structural paradigm that strategically relaxes full computational precision to provide significant energy savings with minimal performance degradation.
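A toy illustration of the idea (hypothetical code, not a technique from the paper): a dot product in which only the most significant partial products keep high precision, while the rest are coarsely quantized:

```python
import numpy as np

def quantize(x, bits):
    """Uniform quantization to the given number of fractional bits."""
    step = 2.0 ** -bits
    return np.round(x / step) * step

def significance_weighted_dot(a, b, hi_bits=16, lo_bits=4, hi_frac=0.1):
    """Toy significance-based dot product: spend precision (and thus
    energy, on real hardware) only on the most significant terms."""
    prod = a * b
    order = np.argsort(-np.abs(prod))            # most significant first
    k = max(1, int(hi_frac * prod.size))
    hi, lo = order[:k], order[k:]
    return quantize(prod[hi], hi_bits).sum() + quantize(prod[lo], lo_bits).sum()

rng = np.random.default_rng(3)
a, b = rng.standard_normal(1000), rng.standard_normal(1000)
print(significance_weighted_dot(a, b), float(a @ b))   # close, not equal
```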
Abstract:
Polar codes are one of the most recent advancements in coding theory and have attracted significant interest. While they are provably capacity-achieving over various channels, they have seen limited practical application. Unfortunately, the serial nature of successive cancellation based decoders hinders fine-grained adaptation of the decoding complexity to design constraints and operating conditions. In this paper, we propose a systematic method for enabling complexity-performance trade-offs by constructing polar codes through an optimization problem that minimizes complexity under a suitably defined mutual information based performance constraint. Moreover, a low-complexity greedy algorithm is proposed to solve the optimization problem efficiently for very large code lengths.
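For background, a sketch of the standard Bhattacharyya-parameter construction over a binary erasure channel, which produces the per-bit-channel reliabilities such constructions start from (the paper's complexity-constrained optimization and greedy solver are not reproduced here):

```python
import numpy as np

def bec_bhattacharyya(n_levels, eps=0.5):
    """Bhattacharyya parameters of the N = 2**n_levels bit-channels
    obtained by polarizing a BEC with erasure probability eps."""
    z = np.array([eps])
    for _ in range(n_levels):
        znew = np.empty(2 * z.size)
        znew[0::2] = 2 * z - z ** 2   # W^- : degraded bit-channels
        znew[1::2] = z ** 2           # W^+ : upgraded bit-channels
        z = znew
    return z

def information_set(N, K, eps=0.5):
    """Indices of the K most reliable bit-channels (smallest Z);
    the remaining N-K indices are frozen."""
    z = bec_bhattacharyya(int(np.log2(N)), eps)
    return np.sort(np.argsort(z)[:K])

print(information_set(N=16, K=8))   # e.g. a rate-1/2 length-16 code
```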
Abstract:
Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs, using simulations that can take weeks to months to complete. For example, designers of special-purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for complex algorithms such as the Low-Density Parity-Check (LDPC) decoders used in modern communication systems. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to graphics processing units (GPUs) and FPGAs. Depending on the simulation requirements, the ideal architecture can vary. In this paper we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations, thereby significantly reducing architectural exploration and design time. OpenCL-based parallel kernels are used without modification or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, to map the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code has been used to target these three different platforms. We show that, depending on the design parameters to be explored and on the dimension and phase of the design, the GPU or the FPGA may be the more suitable target, providing different acceleration factors. For example, although simulations typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.
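An illustrative sketch of the single-source idea using pyopencl on the host (the kernel and host code are hypothetical, not from the paper): a fixed-point quantization kernel of the kind used in bitwidth exploration, built from one source for whatever device the platform exposes, with FPGA targeting left to a tool such as SOpenCL:

```python
import numpy as np
import pyopencl as cl

# One OpenCL kernel source, reusable unmodified across CPU/GPU drivers;
# it saturates decoder LLRs and rounds them to a parameterized bitwidth.
KERNEL = """
__kernel void quantize_llr(__global const float *in, __global float *out,
                           const int frac_bits, const float clip) {
    int i = get_global_id(0);
    float step = 1.0f / (float)(1 << frac_bits);
    float v = clamp(in[i], -clip, clip);   // saturate to the dynamic range
    out[i] = round(v / step) * step;       // round to the chosen precision
}
"""

ctx = cl.create_some_context()        # selects an available CPU or GPU device
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL).build()

llrs = np.random.randn(1 << 16).astype(np.float32)
mf = cl.mem_flags
d_in = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=llrs)
d_out = cl.Buffer(ctx, mf.WRITE_ONLY, llrs.nbytes)
prog.quantize_llr(queue, llrs.shape, None, d_in, d_out,
                  np.int32(4), np.float32(8.0))   # sweep these in exploration
out = np.empty_like(llrs)
cl.enqueue_copy(queue, out, d_out)
```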
Abstract:
In this paper, we investigate the impact of circuit misbehavior due to parametric variations and voltage scaling on the performance of wireless communication systems. Our study reveals the inherent error resilience of such systems and argues that sufficiently reliable operation can be maintained even in the presence of unreliable circuits and manufacturing defects. We further show how the selective application of more robust circuit design techniques is sufficient to deal with high defect rates at low overhead and to improve energy efficiency with negligible system performance degradation.
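A toy Monte Carlo experiment in the same spirit (entirely illustrative; the uncoded BPSK link and the sign-flip fault model are assumptions, not the paper's system):

```python
import numpy as np

def ber(snr_db, fault_rate, n=200_000, seed=2):
    """Uncoded BPSK over AWGN with a fraction of received samples
    sign-flipped to mimic upsets in an unreliable datapath."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n)
    tx = 1.0 - 2.0 * bits                      # 0 -> +1, 1 -> -1
    sigma = 10.0 ** (-snr_db / 20.0)
    rx = tx + sigma * rng.standard_normal(n)
    faulty = rng.random(n) < fault_rate        # circuit-induced errors
    rx[faulty] = -rx[faulty]
    return np.mean((rx < 0) != bits.astype(bool))

for p in (0.0, 1e-4, 1e-3):
    print(p, ber(snr_db=8.0, fault_rate=p))    # error rate degrades gracefully
```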