978 results for Neural stimulation.


Relevance:

20.00%

Publisher:

Abstract:

Background: The ability to recreate an optimal cellular microenvironment is critical to understand neuronal behavior and functionality in vitro. An organized neural extracellular matrix (nECM) promotes neural cell adhesion, proliferation and differentiation. Here, we expanded previous observations on the ability of nECM to support in vitro neuronal differentiation, with the following goals: (i) to recreate complex neuronal networks of embryonic rat hippocampal cells, and (ii) to achieve improved levels of dopaminergic differentiation of subventricular zone (SVZ) neural progenitor cells. Methods: Hippocampal cells from E18 rat embryos were seeded on PLL- and nECM-coated substrates. Neurosphere cultures were prepared from the SVZ of P4-P7 rat pups, and differentiation of neurospheres assayed on PLL- and nECM-coated substrates. Results: When seeded on nECM-coated substrates, both hippocampal cells and SVZ progenitor cells showed neural expression patterns that were similar to their poly-L-lysine-seeded counterparts. However, nECM-based cultures of both hippocampal neurons and SVZ progenitor cells could be maintained for longer times as compared to poly-L-lysine-based cultures. As a result, nECM-based cultures gave rise to a more branched neurite arborization of hippocampal neurons. Interestingly, the prolonged differentiation time of SVZ progenitor cells in nECM allowed us to obtain a purer population of dopaminergic neurons. Conclusions: We conclude that nECM-based coating is an efficient substrate to culture neural cells at different stages of differentiation. In addition, neural ECM-coated substrates increased neuronal survival and neuronal differentiation efficiency as compared to cationic polymers such as poly-L-lysine.

Abstract:

This paper analyzes the use of artificial neural networks (ANNs) for predicting the received power/path loss in both outdoor and indoor links. The approach followed has been a combined use of ANNs and ray-tracing, the latter allowing the identification and parameterization of the so-called dominant path. A complete description of the process for creating and training an ANN-based model is presented with special emphasis on the training process. More specifically, we will be discussing various techniques to arrive at valid predictions focusing on an optimum selection of the training set. A quantitative analysis based on results from two narrowband measurement campaigns, one outdoors and the other indoors, is also presented.
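As a rough illustration of the kind of model the paper describes, the sketch below trains a small feed-forward network to regress a path-loss target from synthetic dominant-path features. The features (path length and a wall-interaction count) and the log-distance target are invented here purely for illustration; the actual work derives its inputs from ray-tracing and fits measured data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dominant-path features: path length (m) and number of wall
# interactions. The target follows a log-distance path-loss trend plus noise.
n = 512
length = rng.uniform(5.0, 200.0, size=(n, 1))
walls = rng.integers(0, 5, size=(n, 1)).astype(float)
path_loss = 40.0 + 20.0 * np.log10(length) + 3.0 * walls + rng.normal(0, 1, (n, 1))

# Standardize inputs and target so a small MLP trains stably.
X = np.hstack([length, walls])
X = (X - X.mean(0)) / X.std(0)
y = (path_loss - path_loss.mean()) / path_loss.std()

# One-hidden-layer network trained by full-batch gradient descent on MSE.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # predicted (standardized) path loss
    err = pred - y
    # Backpropagate the mean-squared-error gradient.
    gW2 = h.T @ err / n; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / n; gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(f"training MSE (standardized units): {mse:.3f}")
```

The paper's emphasis on selecting the training set matters precisely because a network like this interpolates well only inside the region its training links cover.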

Abstract:

Neurons in the songbird forebrain nucleus HVc are highly sensitive to auditory temporal context and have some of the most complex auditory tuning properties yet discovered. HVc is crucial for learning, perceiving, and producing song, thus it is important to understand the neural circuitry and mechanisms that give rise to these remarkable auditory response properties. This thesis investigates these issues experimentally and computationally.

Extracellular studies reported here compare the auditory context sensitivity of neurons in HVc with neurons in the afferent areas of field L. These demonstrate that there is a substantial increase in auditory temporal context sensitivity from the areas of field L to HVc. Whole-cell recordings of HVc neurons from acute brain slices are described which show that excitatory synaptic transmission between HVc neurons involves the release of glutamate and the activation of both AMPA/kainate and NMDA-type glutamate receptors. Additionally, widespread inhibitory interactions exist between HVc neurons that are mediated by postsynaptic GABA_A receptors. Intracellular recordings of HVc auditory neurons in vivo provide evidence that HVc neurons encode information about temporal structure using a variety of cellular and synaptic mechanisms, including syllable-specific inhibition, excitatory post-synaptic potentials with a range of different time courses, burst-firing, and song-specific hyperpolarization.

The final part of this thesis presents two computational approaches for representing and learning temporal structure. The first method uses computational elements that are analogous to temporal combination-sensitive neurons in HVc. A network of these elements can learn using local information and lateral inhibition. The second method presents a more general framework that allows a network to discover mixtures of temporal features in a continuous stream of input.
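A minimal sketch of the lateral-inhibition idea behind the first method, with invented numbers: each unit is suppressed in proportion to the summed activity of its competitors, so only the strongest response survives. This is a generic winner-take-all illustration, not the thesis's actual learning rule.

```python
import numpy as np

def lateral_inhibition(activations, strength=0.5, steps=20):
    """Iteratively suppress each unit by the summed activity of its
    competitors; a simple stand-in for the lateral inhibition used to
    sharpen responses in a network of competing feature detectors."""
    a = np.asarray(activations, dtype=float)
    for _ in range(steps):
        inhibition = strength * (a.sum() - a)   # input from all other units
        a = np.maximum(a - inhibition, 0.0)     # rectified update
    return a

# Three units respond to overlapping temporal features; competition leaves
# only the strongest one active.
out = lateral_inhibition([1.0, 0.8, 0.3])
print(out)
```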

Abstract:

The brain is perhaps the most complex system to have ever been subjected to rigorous scientific investigation. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine, to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterize this system. However, to understand and interpret these data will also require substantial strides in inferential and statistical techniques. This dissertation attempts to meet this need, extending and applying the modern tools of latent variable modeling to problems in neural data analysis.

It is divided into two parts. The first begins with an exposition of the general techniques of latent variable modeling. A new, extremely general, optimization algorithm is proposed - called Relaxation Expectation Maximization (REM) - that may be used to learn the optimal parameter values of arbitrary latent variable models. This algorithm appears to alleviate the common problem of convergence to local, sub-optimal, likelihood maxima. REM leads to a natural framework for model size selection; in combination with standard model selection techniques the quality of fits may be further improved, while the appropriate model size is automatically and efficiently determined. Next, a new latent variable model, the mixture of sparse hidden Markov models, is introduced, and approximate inference and learning algorithms are derived for it. This model is applied in the second part of the thesis.
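The REM algorithm itself is not reproduced here, but the expectation-maximization updates it builds on can be sketched for the simplest latent variable model, a two-component Gaussian mixture, where the latent variable is which component generated each point. All data and starting values below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D data drawn from two Gaussian components.
x = np.concatenate([rng.normal(-2.0, 0.7, 300), rng.normal(3.0, 1.0, 200)])

# Standard EM for a two-component Gaussian mixture. REM adds a relaxation
# schedule on top of updates like these to escape poor local maxima; that
# refinement is not shown here.
pi = np.array([0.5, 0.5])          # mixing weights
mu = np.array([-1.0, 1.0])         # component means
var = np.array([1.0, 1.0])         # component variances
for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibility-weighted data.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print(mu)  # means recovered near the true values of -2 and 3
```

From this starting point EM converges to the generating parameters; the local-maximum problem REM targets appears with worse initializations or larger models.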

The second part brings the technology of part I to bear on two important problems in experimental neuroscience. The first is known as spike sorting; this is the problem of separating the spikes from different neurons embedded within an extracellular recording. The dissertation offers the first thorough statistical analysis of this problem, which then yields the first powerful probabilistic solution. The second problem addressed is that of characterizing the distribution of spike trains recorded from the same neuron under identical experimental conditions. A latent variable model is proposed. Inference and learning in this model leads to new principled algorithms for smoothing and clustering of spike data.
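A toy version of the spike-sorting problem, with invented template waveforms: project the recorded spike shapes onto their top principal components and cluster them by shape. The dissertation's probabilistic solution replaces the ad-hoc 2-means step below with a principled mixture model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical template waveforms (32 samples each) standing in for
# spikes from two different neurons seen on the same extracellular electrode.
t = np.linspace(0, 1, 32)
templates = np.stack([np.exp(-((t - 0.3) / 0.08) ** 2),
                      -0.8 * np.exp(-((t - 0.5) / 0.12) ** 2)])
labels_true = rng.integers(0, 2, 200)
spikes = templates[labels_true] + rng.normal(0, 0.05, (200, 32))

# Project onto the top two principal components of the waveforms.
centered = spikes - spikes.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
feats = centered @ vt[:2].T

# Simple 2-means clustering, initialized at the extremes along PC1.
centers = feats[[feats[:, 0].argmin(), feats[:, 0].argmax()]].copy()
for _ in range(20):
    d = np.linalg.norm(feats[:, None] - centers, axis=2)
    assign = d.argmin(axis=1)
    for k in range(2):
        centers[k] = feats[assign == k].mean(axis=0)

# The clustering should match the true labels up to a permutation of ids.
agreement = max((assign == labels_true).mean(), (assign != labels_true).mean())
print(f"agreement with true labels: {agreement:.2f}")
```

This clean separation relies on well-separated templates and low noise; overlapping spikes and waveform variability are exactly what makes the full probabilistic treatment in the dissertation necessary.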