13 results for Error-correcting codes (Information theory)

in National Center for Biotechnology Information - NCBI


Relevance:

100.00%

Publisher:

Abstract:

A molecular model of poorly understood hydrophobic effects is heuristically developed using the methods of information theory. Because primitive hydrophobic effects can be tied to the probability of observing a molecular-sized cavity in the solvent, the probability distribution of the number of solvent centers in a cavity volume is modeled on the basis of the two moments available from the density and radial distribution of oxygen atoms in liquid water. The modeled distribution then yields the probability that no solvent centers are found in the cavity volume. This model is shown to account quantitatively for the central hydrophobic phenomena of cavity formation and association of inert gas solutes. The connection of information theory to statistical thermodynamics provides a basis for clarification of hydrophobic effects. The simplicity and flexibility of the approach suggest that it should permit applications to conformational equilibria of nonpolar solutes and hydrophobic residues in biopolymers.
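The construction described above — model the occupancy distribution of a cavity from its first two moments, then read off the probability that the cavity is empty — can be sketched as a discrete maximum-entropy fit of the form p(n) ∝ exp(λ₁n + λ₂n²). The moment values below are illustrative round numbers, not measured water data.

```python
import math

def maxent_occupancy(m1, m2, nmax=30, iters=200, damp=0.5):
    """Max-ent distribution p(n) ∝ exp(l1*n + l2*n^2) on n = 0..nmax,
    matched to the first two moments <n> = m1 and <n^2> = m2 by a
    damped Newton iteration on the Lagrange multipliers."""
    ns = range(nmax + 1)
    l1 = l2 = 0.0
    for _ in range(iters):
        a = [l1 * n + l2 * n * n for n in ns]
        amax = max(a)                              # shift to avoid overflow
        w = [math.exp(x - amax) for x in a]
        Z = sum(w)
        p = [x / Z for x in w]
        e1, e2, e3, e4 = (sum((n ** k) * p[n] for n in ns) for k in (1, 2, 3, 4))
        # Jacobian of the moment map = covariance matrix of (n, n^2)
        j11, j12, j22 = e2 - e1 * e1, e3 - e1 * e2, e4 - e2 * e2
        det = j11 * j22 - j12 * j12
        r1, r2 = m1 - e1, m2 - e2                  # moment residuals
        l1 += damp * (j22 * r1 - j12 * r2) / det
        l2 += damp * (j11 * r2 - j12 * r1) / det
    return p

p = maxent_occupancy(3.0, 12.0)      # hypothetical <n> = 3, <n^2> = 12
beta_mu = -math.log(p[0])            # cavity-formation work in units of kT
```

The quantity `-ln p(0)` is the free-energy cost of emptying the cavity, the central hydrophobic observable the model accounts for.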

Relevance:

100.00%

Publisher:

Abstract:

Neuronal responses are conspicuously variable. We focus on one particular aspect of that variability: the precision of action potential timing. We show that for common models of noisy spike generation, elementary considerations imply that such variability is a function of the input, and can be made arbitrarily large or small by a suitable choice of inputs. Our considerations are expected to extend to virtually any mechanism of spike generation, and we illustrate them with data from the visual pathway. Thus, a simplification usually made in the application of information theory to neural processing is violated: noise is not independent of the message. However, we also show the existence of error-correcting topologies, which can achieve better timing reliability than their components.
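The input-dependence of spike-time precision can be illustrated with a drift-diffusion (noisy integrate-to-threshold) toy model of spike generation: the same noise level produces small timing jitter when the input drives the voltage to threshold quickly, and large jitter when it does so slowly. The threshold, noise amplitude, and input slopes below are illustrative assumptions, not fitted to the visual-pathway data.

```python
import math
import random
import statistics

def first_passage_time(drift, sigma, rng, theta=1.0, dt=0.001):
    """Time for a noisy integrator V' = drift + sigma*noise to hit theta."""
    v, t = 0.0, 0.0
    while v < theta:
        v += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return t

def timing_jitter(drift, sigma=0.2, trials=200, seed=0):
    """Standard deviation of the spike time across repeated trials."""
    rng = random.Random(seed)
    return statistics.stdev(first_passage_time(drift, sigma, rng)
                            for _ in range(trials))

slow = timing_jitter(0.5)   # weak input: large timing variability
fast = timing_jitter(5.0)   # strong input, same noise: tight timing
```

For this model the first-passage-time variance scales as θσ²/drift³, so the choice of input alone can make the "noise" in spike timing arbitrarily large or small, exactly the point of the abstract.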

Relevance:

100.00%

Publisher:

Abstract:

Averaged event-related potential (ERP) data recorded from the human scalp reveal electroencephalographic (EEG) activity that is reliably time-locked and phase-locked to experimental events. We report here the application of a method based on information theory that decomposes one or more ERPs recorded at multiple scalp sensors into a sum of components with fixed scalp distributions and sparsely activated, maximally independent time courses. Independent component analysis (ICA) decomposes ERP data into a number of components equal to the number of sensors. The derived components have distinct but not necessarily orthogonal scalp projections. Unlike dipole-fitting methods, the algorithm does not model the locations of their generators in the head. Unlike methods that remove second-order correlations, such as principal component analysis (PCA), ICA also minimizes higher-order dependencies. Applied to detected—and undetected—target ERPs from an auditory vigilance experiment, the algorithm derived ten components that decomposed each of the major response peaks into one or more ICA components with relatively simple scalp distributions. Three of these components were active only when the subject detected the targets, three other components only when the target went undetected, and one in both cases. Three additional components accounted for the steady-state brain response to a 39-Hz background click train. Major features of the decomposition proved robust across sessions and changes in sensor number and placement. This method of ERP analysis can be used to compare responses from multiple stimuli, task conditions, and subject states.
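The key contrast drawn above — PCA removes only second-order correlations, while ICA also minimizes higher-order dependencies — can be demonstrated on a synthetic two-channel mixture. This sketch whitens the data and then searches for the rotation that maximizes non-Gaussianity (|excess kurtosis|), a simplified stand-in for the full ICA algorithm used in the paper; the sources, mixing matrix, and sample size are all illustrative assumptions, not ERP data.

```python
import math
import random

rng = random.Random(1)
N = 4000
# two independent, non-Gaussian "sources" (synthetic stand-ins)
s1 = [rng.uniform(-1.0, 1.0) for _ in range(N)]                          # sub-Gaussian
s2 = [rng.expovariate(1.0) * rng.choice((-1.0, 1.0)) for _ in range(N)]  # super-Gaussian
# a fixed 2x2 mixing matrix plays the role of the unknown scalp projections
x1 = [0.6 * a + 0.8 * b for a, b in zip(s1, s2)]
x2 = [0.9 * a - 0.3 * b for a, b in zip(s1, s2)]

def center(v):
    m = sum(v) / len(v)
    return [u - m for u in v]

def corr(a, b):
    a, b = center(a), center(b)
    return (sum(p * q for p, q in zip(a, b))
            / math.sqrt(sum(p * p for p in a) * sum(q * q for q in b)))

x1, x2 = center(x1), center(x2)

# whitening: remove second-order correlations (this is all PCA can do)
c11 = sum(a * a for a in x1) / N
c22 = sum(b * b for b in x2) / N
c12 = sum(a * b for a, b in zip(x1, x2)) / N
tr, det = c11 + c22, c11 * c22 - c12 * c12
root = math.sqrt(tr * tr - 4.0 * det)
lam1, lam2 = (tr + root) / 2.0, (tr - root) / 2.0

def unit(v):
    n = math.hypot(v[0], v[1])
    return (v[0] / n, v[1] / n)

e1 = unit((c12, lam1 - c11))   # eigenvectors of the 2x2 covariance
e2 = unit((c12, lam2 - c11))
z1 = [(e1[0] * a + e1[1] * b) / math.sqrt(lam1) for a, b in zip(x1, x2)]
z2 = [(e2[0] * a + e2[1] * b) / math.sqrt(lam2) for a, b in zip(x1, x2)]

# ICA step: whitening leaves an unknown rotation; choose the angle that
# maximizes |excess kurtosis| — the higher-order structure PCA ignores
def kurt(v):
    m2 = sum(u * u for u in v) / len(v)
    m4 = sum(u ** 4 for u in v) / len(v)
    return m4 / (m2 * m2) - 3.0

def rotate(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [c * a + s * b for a, b in zip(z1, z2)]

best = max((k * math.pi / 360.0 for k in range(360)),
           key=lambda th: abs(kurt(rotate(th))))
u1 = rotate(best)
u2 = rotate(best + math.pi / 2.0)   # the orthogonal component
```

The recovered components `u1`, `u2` match the original sources up to sign and scale, which whitening alone cannot achieve.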

Relevance:

100.00%

Publisher:

Abstract:

In the last decade, two tools, one drawn from information theory and the other from artificial neural networks, have proven particularly useful in many different areas of sequence analysis. The work presented herein indicates that these two approaches can be joined in a general fashion to produce a very powerful search engine that is capable of locating members of a given nucleic acid sequence family in either local or global sequence searches. This program can, in turn, be queried for its definition of the motif under investigation, ranking each base in context for its contribution to membership in the motif family. In principle, the method used can be applied to any binding motif, including both DNA and RNA sequence families, given sufficient family size.
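The information-theoretic half of such a search engine — building a frequency model from a sequence family, measuring the information content of the motif, and ranking each base in context by its contribution to membership — can be sketched as follows. The alignment is a hypothetical toy family, and the neural-network half of the method is not reproduced here.

```python
import math

# toy alignment of a hypothetical nucleic acid sequence family
sites = ["TATAAT", "TATGAT", "TACAAT", "TATATT", "TATAAT", "CATAAT"]
BASES = "ACGT"
L = len(sites[0])

# per-position base frequencies with a small pseudocount (0.25 per base)
counts = [{b: 0.25 for b in BASES} for _ in range(L)]
for s in sites:
    for i, b in enumerate(s):
        counts[i][b] += 1.0
freqs = [{b: c[b] / (len(sites) + 1.0) for b in BASES} for c in counts]

def info(i):
    """Information content of position i in bits: 2 - H(i), uniform background."""
    return 2.0 + sum(f * math.log2(f) for f in freqs[i].values() if f > 0)

r_seq = sum(info(i) for i in range(L))   # total information of the motif

def score(seq):
    """Rank a candidate site: summed log-odds contribution of each base."""
    return sum(math.log2(freqs[i][b] / 0.25) for i, b in enumerate(seq))
```

Sliding `score` along a genome gives the local-search mode described in the abstract; positive scores mark candidate family members, and the per-position terms show which bases carry the membership signal.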

Relevance:

100.00%

Publisher:

Abstract:

Defects in the XPG DNA repair endonuclease gene can result in the cancer-prone disorders xeroderma pigmentosum (XP) or the XP–Cockayne syndrome complex. While the XPG cDNA sequence was known, determination of the genomic sequence was required to understand its different functions. In cells from normal donors, we found that the genomic sequence of the human XPG gene spans 30 kb, contains 15 exons that range from 61 to 1074 bp and 14 introns that range from 250 to 5763 bp. Analysis of the splice donor and acceptor sites using an information theory-based approach revealed three splice sites with low information content, which are components of the minor (U12) spliceosome. We identified six alternatively spliced XPG mRNA isoforms in cells from normal donors and from XPG patients: partial deletion of exon 8, partial retention of intron 8, two with alternative exons (in introns 1 and 6) and two that retained complete introns (introns 3 and 9). The amount of alternatively spliced XPG mRNA isoforms varied in different tissues. Most alternative splice donor and acceptor sites had a relatively high information content, but one has the U12 spliceosome sequence. A single nucleotide polymorphism has allele frequencies of 0.74 for 3507G and 0.26 for 3507C in 91 donors. The human XPG gene contains multiple splice sites with low information content in association with multiple alternatively spliced isoforms of XPG mRNA.
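The "low information content" criterion used above can be illustrated with an individual-information score for one site against a donor-site frequency model: sites with low or negative scores are the questionable ones. The alignment and test sites below are hypothetical toy sequences, not the XPG splice sites.

```python
import math

# hypothetical 9-mer donor-site alignment (illustrative only)
donors = ["CAGGTAAGT", "AAGGTGAGT", "CAGGTAAGA",
          "CAGGTGAGG", "AAGGTAAGT", "CAGGTATGT"]
BASES = "ACGT"
L = len(donors[0])

# per-position frequency model with pseudocounts
f = [{b: 0.25 for b in BASES} for _ in range(L)]
for s in donors:
    for i, b in enumerate(s):
        f[i][b] += 1.0
total = len(donors) + 1.0
f = [{b: col[b] / total for b in BASES} for col in f]

def ri(site):
    """Individual information (bits) of one site under the model."""
    return sum(math.log2(f[i][b] / 0.25) for i, b in enumerate(site))

strong = ri("CAGGTAAGT")   # consensus-like site: high information
weak = ri("CAGATCCAT")     # hypothetical divergent site: low information
is_weak = weak < 0.0       # negative Ri flags a questionable splice site
```

In the paper's analysis, sites flagged this way turned out to be components of the minor (U12) spliceosome or to accompany alternative splicing.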

Relevance:

40.00%

Publisher:

Abstract:

Multielectrode recording techniques were used to record ensemble activity from 10 to 16 simultaneously active CA1 and CA3 neurons in the rat hippocampus during performance of a spatial delayed-nonmatch-to-sample task. Extracted sources of variance were used to assess the nature of two different types of errors that accounted for 30% of total trials. The two types of errors included ensemble “miscodes” of sample phase information and errors associated with delay-dependent corruption or disappearance of sample information at the time of the nonmatch response. Statistical assessment of trial sequences and associated “strength” of hippocampal ensemble codes revealed that miscoded error trials always followed delay-dependent error trials in which encoding was “weak,” indicating that the two types of errors were “linked.” It was determined that the occurrence of weakly encoded, delay-dependent error trials initiated an ensemble encoding “strategy” that increased the chances of being correct on the next trial and avoided the occurrence of further delay-dependent errors. Unexpectedly, the strategy involved “strongly” encoding response position information from the prior (delay-dependent) error trial and carrying it forward to the sample phase of the next trial. This produced a miscode type error on trials in which the “carried over” information obliterated encoding of the sample phase response on the next trial. Application of this strategy, irrespective of outcome, was sufficient to reorient the animal to the proper between trial sequence of response contingencies (nonmatch-to-sample) and boost performance to 73% correct on subsequent trials. The capacity for ensemble analyses of strength of information encoding combined with statistical assessment of trial sequences therefore provided unique insight into the “dynamic” nature of the role hippocampus plays in delay type memory tasks.

Relevance:

30.00%

Publisher:

Abstract:

The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit.
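The Shannon limit against which this progress is measured can be computed directly: for a rate-R code on the real AWGN channel, reliable communication requires Eb/N0 ≥ (2^(2R) − 1)/(2R), which approaches ln 2 ≈ −1.59 dB as R → 0. A minimal sketch:

```python
import math

def ebno_limit_db(rate):
    """Minimum Eb/N0 (dB) for reliable rate-R signaling on the AWGN
    channel, from R <= (1/2) log2(1 + 2R * Eb/N0)."""
    lin = (2.0 ** (2.0 * rate) - 1.0) / (2.0 * rate)
    return 10.0 * math.log10(lin)

half_rate = ebno_limit_db(0.5)               # exactly 0 dB at R = 1/2
ultimate = 10.0 * math.log10(math.log(2.0))  # R -> 0 limit, about -1.59 dB
```

Interleaved concatenated (turbo-style) codes of the kind discussed in the paper operate within a fraction of a dB of these numbers.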

Relevance:

30.00%

Publisher:

Abstract:

We examine decision making in two-person extensive form game trees using nine treatments that vary matching protocol, payoffs, and payoff information. Our objective is to establish replicable principles of cooperative versus noncooperative behavior that involve the use of signaling, reciprocity, and backward induction strategies, depending on the availability of dominated direct punishing strategies and the probability of repeated interaction with the same partner. Contrary to the predictions of game theory, we find substantial support for cooperation under complete information even in various single-play treatments.
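The backward-induction benchmark that the observed cooperation contradicts can be computed mechanically on any finite tree. The tree and payoffs below are illustrative, not the paper's nine treatments; the point is that the subgame-perfect outcome can be Pareto-dominated by a cooperative path.

```python
# Backward induction (subgame perfection) on a toy two-person
# extensive-form game. An internal node is (player, {move: subtree});
# a leaf is a payoff pair (payoff_player0, payoff_player1).

def backward_induct(node):
    """Payoff pair reached when both players best-respond at every node."""
    player, rest = node
    if isinstance(rest, dict):                       # decision node
        outcomes = {m: backward_induct(t) for m, t in rest.items()}
        best = max(outcomes, key=lambda m: outcomes[m][player])
        return outcomes[best]
    return node                                      # leaf payoff pair

# trust-game-like tree: player 0 can stop (safe) or pass to player 1,
# who can cooperate (good for both) or defect (best for player 1 alone)
game = (0, {
    "stop": (10, 10),
    "pass": (1, {"cooperate": (15, 25), "defect": (5, 40)}),
})

spe = backward_induct(game)   # noncooperative prediction: (10, 10)
```

Cooperation ("pass" then "cooperate") yields (15, 25), which Pareto-dominates the subgame-perfect (10, 10); the experiments found subjects reaching such cooperative outcomes even in single-play treatments.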

Relevance:

30.00%

Publisher:

Abstract:

A theory is provided for the detection efficiency of diffuse light whose frequency is modulated by an acoustical wave. We derive expressions for the speckle pattern of the modulated light, as well as an expression for the signal-to-noise ratio for the detector. The aim is to develop a new imaging technology for detection of tumors in humans. The acoustic wave is focused into a small geometrical volume, which provides the spatial resolution for the imaging. The wavelength of the light wave can be selected to provide information regarding the kind of tumor.

Relevance:

30.00%

Publisher:

Abstract:

The dichotomy between two groups of workers on neuroelectrical activity is retarding progress. To study the interrelations between neuronal unit spike activity and compound field potentials of cell populations is both unfashionable and technically challenging. Neither of the mutual disparagements is justified: that spikes are to higher functions as the alphabet is to Shakespeare and that slow field potentials are irrelevant epiphenomena. Spikes are not the basis of the neural code but of multiple codes that coexist with nonspike codes. Field potentials are mainly information-rich signs of underlying processes, but sometimes they are also signals for neighboring cells, that is, they exert influence. This paper concerns opportunities for new research with many channels of wide-band (spike and slow wave) recording. A wealth of structure in time and three-dimensional space is different at each scale—micro-, meso-, and macroactivity. The depth of our ignorance is emphasized to underline the opportunities for uncovering new principles. We cannot currently estimate the relative importance of spikes and synaptic communication vs. extrasynaptic graded signals. In spite of a preponderance of literature on the former, we must consider the latter as probably important. We are in a primitive stage of looking at the time series of wide-band voltages in the compound, local field, potentials and of choosing descriptors that discriminate appropriately among brain loci, states (functions), stages (ontogeny, senescence), and taxa (evolution). This is not surprising, since the brains in higher species are surely the most complex systems known. They must be the greatest reservoir of new discoveries in nature. The complexity should not deter us, but a dose of humility can stimulate the flow of imaginative juices.

Relevance:

30.00%

Publisher:

Abstract:

Computational neuroscience has contributed significantly to our understanding of higher brain function by combining experimental neurobiology, psychophysics, modeling, and mathematical analysis. This article reviews recent advances in a key area: neural coding and information processing. It is shown that synapses are capable of supporting computations based on highly structured temporal codes. Such codes could provide a substrate for unambiguous representations of complex stimuli and be used to solve difficult cognitive tasks, such as the binding problem. Unsupervised learning rules could generate the circuitry required for precise temporal codes. Together, these results indicate that neural systems perform a rich repertoire of computations based on action potential timing.

Relevance:

30.00%

Publisher:

Abstract:

RNA viruses evolve rapidly. One source of this ability to rapidly change is the apparently high mutation frequency in RNA virus populations. A high mutation frequency is a central tenet of the quasispecies theory. A corollary of the quasispecies theory postulates that, given their high mutation frequency, animal RNA viruses may be susceptible to error catastrophe, where they undergo a sharp drop in viability after a modest increase in mutation frequency. We recently showed that the important broad-spectrum antiviral drug ribavirin (currently used to treat hepatitis C virus infections, among others) is an RNA virus mutagen, and we proposed that ribavirin's antiviral effect is by forcing RNA viruses into error catastrophe. However, a direct demonstration of error catastrophe has not been made for ribavirin or any RNA virus mutagen. Here we describe a direct demonstration of error catastrophe by using ribavirin as the mutagen and poliovirus as a model RNA virus. We demonstrate that ribavirin's antiviral activity is exerted directly through lethal mutagenesis of the viral genetic material. A 99.3% loss in viral genome infectivity is observed after a single round of virus infection in ribavirin concentrations sufficient to cause a 9.7-fold increase in mutagenesis. Compiling data on both the mutation levels and the specific infectivities of poliovirus genomes produced in the presence of ribavirin, we have constructed a graph of error catastrophe showing that normal poliovirus indeed exists at the edge of viability. These data suggest that RNA virus mutagens may represent a promising new class of antiviral drugs.
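The qualitative link between mutation frequency and specific infectivity can be sketched with a standard Poisson lethality model: if mutations per genome are Poisson with mean μ and a fraction q of them are lethal, the infectious fraction is exp(−qμ). The baseline mutation load and lethal fraction below are assumed round numbers, not the paper's measurements.

```python
import math

def infectious_fraction(mu, lethal_frac):
    """Fraction of genomes carrying zero lethal mutations when the
    mutations per genome are Poisson-distributed with mean mu."""
    return math.exp(-mu * lethal_frac)

mu0, q = 1.0, 0.5   # hypothetical baseline mutations/genome and lethal fraction
base = infectious_fraction(mu0, q)
drug = infectious_fraction(9.7 * mu0, q)   # 9.7-fold mutagenesis, as in the abstract
loss = 1.0 - drug / base                   # relative loss of specific infectivity
```

Even with these modest assumed parameters, a 9.7-fold rise in mutagenesis wipes out most of the specific infectivity, which is the error-catastrophe behavior the paper demonstrates directly.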

Relevance:

30.00%

Publisher:

Abstract:

We develop a unifying theory of hypoxia tolerance based on information from two cell-level models (brain cortical cells and isolated hepatocytes) from the highly anoxia-tolerant aquatic turtle and from other, more hypoxia-sensitive systems. We propose that the response of hypoxia-tolerant systems to oxygen lack occurs in two phases (defense and rescue). The first lines of defense against hypoxia include a balanced suppression of ATP-demand and ATP-supply pathways; this regulation stabilizes adenylate pools at new steady-state levels even while ATP turnover rates greatly decline. The ATP demands of ion pumping are down-regulated by generalized "channel" arrest in hepatocytes and by "spike" arrest in neurons. Hypoxic ATP demands of protein synthesis are down-regulated, probably by translational arrest. In hypoxia-sensitive cells this translational arrest seems irreversible, but hypoxia-tolerant systems activate "rescue" mechanisms if the period of oxygen lack is extended, by preferentially regulating the expression of several proteins. In these cells, a cascade of processes underpinning hypoxia rescue and defense begins with an oxygen sensor (a heme protein) and a signal-transduction pathway, which leads to significant gene-based metabolic reprogramming (the rescue process), with maintained down-regulation of energy-demand and energy-supply pathways in metabolism throughout the hypoxic period. This recent work begins to clarify how normoxic maintenance ATP turnover rates can be drastically (10-fold) down-regulated to a new hypometabolic steady state, which is a prerequisite for surviving prolonged hypoxia or anoxia. The implications of these developments are extensive in biology and medicine.
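The "balanced suppression" argument can be made concrete with a toy ATP-pool model: cutting supply and demand together preserves the steady-state adenylate level at a 10-fold lower turnover, while cutting supply alone collapses the pool. All rate constants here are illustrative assumptions, not measured fluxes.

```python
# Toy ATP-pool model of the defense phase: constant supply flux,
# first-order demand, integrated with a simple Euler scheme.
def atp_trajectory(supply, demand, atp0=5.0, dt=0.01, steps=5000):
    """Final ATP level; steady state is supply / demand."""
    atp = atp0
    for _ in range(steps):
        atp += dt * (supply - demand * atp)   # production minus use
        atp = max(atp, 0.0)
    return atp

normoxic   = atp_trajectory(supply=10.0, demand=2.0)   # steady state 5.0
balanced   = atp_trajectory(supply=1.0,  demand=0.2)   # both cut 10-fold
unbalanced = atp_trajectory(supply=1.0,  demand=2.0)   # only supply falls
```

The balanced case holds the pool at the normoxic level while total flux drops 10-fold, mirroring the hypometabolic steady state the theory identifies as the prerequisite for surviving prolonged anoxia; suppressing supply without matching demand drains the pool instead.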