3 results for scientific computation
in CaltechTHESIS
Abstract:
The brain is perhaps the most complex system to have ever been subjected to rigorous scientific investigation. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine, to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterize this system. However, to understand and interpret these data will also require substantial strides in inferential and statistical techniques. This dissertation attempts to meet this need, extending and applying the modern tools of latent variable modeling to problems in neural data analysis.
It is divided into two parts. The first begins with an exposition of the general techniques of latent variable modeling. A new, extremely general optimization algorithm, called Relaxation Expectation Maximization (REM), is proposed that may be used to learn the optimal parameter values of arbitrary latent variable models. This algorithm appears to alleviate the common problem of convergence to local, sub-optimal likelihood maxima. REM leads to a natural framework for model size selection; in combination with standard model selection techniques, the quality of fits may be further improved while the appropriate model size is automatically and efficiently determined. Next, a new latent variable model, the mixture of sparse hidden Markov models, is introduced, and approximate inference and learning algorithms are derived for it. This model is applied in the second part of the thesis.
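As a point of reference for the latent variable framework the abstract describes (this is standard Expectation Maximization, not the dissertation's REM algorithm, whose relaxation scheme is detailed in the thesis itself), a minimal EM fit for a one-dimensional Gaussian mixture can be sketched as:

```python
import numpy as np

def em_gmm(x, k, n_iter=50):
    """Standard EM for a 1-D Gaussian mixture: the baseline that REM extends."""
    n = len(x)
    # Simple spread-out initialization via data quantiles
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / n
    return pi, mu, var
```

Plain EM of this form is exactly what can stall in local likelihood maxima, which is the failure mode REM is proposed to alleviate.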
The second part brings the technology of part I to bear on two important problems in experimental neuroscience. The first is known as spike sorting; this is the problem of separating the spikes from different neurons embedded within an extracellular recording. The dissertation offers the first thorough statistical analysis of this problem, which then yields the first powerful probabilistic solution. The second problem addressed is that of characterizing the distribution of spike trains recorded from the same neuron under identical experimental conditions. A latent variable model is proposed. Inference and learning in this model lead to new principled algorithms for smoothing and clustering of spike data.
Abstract:
We develop new algorithms which combine the rigorous theory of mathematical elasticity with the geometric underpinnings and computational attractiveness of modern tools in geometry processing. We develop a simple elastic energy based on the Biot strain measure, which improves on state-of-the-art methods in geometry processing. We use this energy within a constrained optimization problem to, for the first time, provide surface parameterization tools which guarantee injectivity and bounded distortion, are user-directable, and which scale to large meshes. With the help of some new generalizations in the computation of matrix functions and their derivatives, we extend our methods to a large class of hyperelastic stored energy functions quadratic in piecewise analytic strain measures, including the Hencky (logarithmic) strain. This opens up a wide range of possibilities for robust and efficient nonlinear elastic simulation and geometry processing by elastic analogy.
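The Biot strain mentioned here is S - I, where S = sqrt(F^T F) is the stretch tensor of the deformation gradient F; since the eigenvalues of S are the singular values of F, a rotation-invariant energy quadratic in the Biot strain reduces to a function of those singular values. A minimal sketch of one common quadratic form, with assumed Lamé-style coefficients `mu` and `lam` (the dissertation's exact formulation may differ):

```python
import numpy as np

def biot_energy(F, mu=1.0, lam=1.0):
    """Energy density quadratic in the Biot strain S - I, with S = sqrt(F^T F).

    Evaluated via the singular values of F: the principal Biot strains
    are sigma_i - 1, so pure rotations (all sigma_i = 1) cost zero energy.
    """
    sigma = np.linalg.svd(F, compute_uv=False)
    e = sigma - 1.0                      # principal Biot strains
    return mu * np.dot(e, e) + 0.5 * lam * e.sum() ** 2
```

Working on singular values is what connects this energy family to the matrix-function machinery the abstract mentions; swapping `sigma - 1.0` for `np.log(sigma)` gives the analogous Hencky (logarithmic) strain energy.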
Abstract:
A variety of neural signals have been measured as correlates of consciousness. In particular, late current sinks in layer 1, distributed activity across the cortex, and feedback processing have all been implicated. What are the physiological underpinnings of these signals? What computational role do they play in the brain? Why do they correlate with consciousness? This thesis begins to answer these questions by focusing on the pyramidal neuron. As the primary communicator of long-range feedforward and feedback signals in the cortex, the pyramidal neuron is set up to play an important role in establishing distributed representations. Additionally, the dendritic extent, reaching layer 1, is well situated to receive feedback inputs and contribute to current sinks in the upper layers. An investigation of pyramidal neuron physiology is therefore necessary to understand how the brain creates, and potentially uses, the neural correlates of consciousness. An important part of this thesis is establishing the computational role that dendritic physiology plays. In order to do this, a combined experimental and modeling approach is used.
This thesis begins with single-cell experiments in layer 5 and layer 2/3 pyramidal neurons. In both cases, dendritic nonlinearities are characterized and found to be integral regulators of neural output. Particular attention is paid to calcium spikes and NMDA spikes, which both exist in the apical dendrites, considerable distances from the spike initiation zone. These experiments are then used to create detailed multicompartmental models. These models are used to test hypotheses regarding the spatial distribution of membrane channels, to quantify the effects of certain experimental manipulations, and to establish the computational properties of the single cell. We find that pyramidal neuron physiology can implement a coincidence detection mechanism. Further abstraction of these models reveals potential mechanisms for spike time control, frequency modulation, and tuning. Finally, a set of experiments is carried out to establish the effect of long-range feedback inputs onto the pyramidal neuron. A final discussion then explores a potential way in which the physiology of pyramidal neurons can establish distributed representations, and contribute to consciousness.
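The coincidence detection the abstract alludes to is often described as dendritic input arriving close in time to a backpropagating somatic spike, which can trigger an apical calcium spike and burst firing. A deliberately toy abstraction of that idea (a fixed coincidence window in milliseconds, nothing like the thesis's detailed multicompartmental models) might look like:

```python
def coincidence_detect(soma_spikes, dend_inputs, window=10.0):
    """Toy coincidence detector: a dendritic input arriving within
    `window` ms of a somatic spike is flagged as a coincidence (burst).

    soma_spikes, dend_inputs: lists of event times in ms.
    Returns the dendritic input times that coincided with a somatic spike.
    """
    bursts = []
    for t_d in dend_inputs:
        if any(abs(t_d - t_s) <= window for t_s in soma_spikes):
            bursts.append(t_d)
    return bursts
```

For example, `coincidence_detect([5.0, 100.0], [12.0, 60.0])` flags only the input at 12 ms, since the one at 60 ms falls outside the window of both somatic spikes.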