913 results for Kernel Smoothing


Relevance:

100.00%

Publisher:

Abstract:

We present a new subcortical structure shape modeling framework using heat kernel smoothing constructed with the Laplace-Beltrami eigenfunctions. The cotan discretization is used to numerically obtain the eigenfunctions of the Laplace-Beltrami operator along the surface of subcortical structures of the brain. The eigenfunctions are then used to construct the heat kernel, which smooths out measurement noise along the surface. The proposed framework is applied to investigate the influence of age (38-79 years) and gender on amygdala and hippocampus shape. We detected a significant age effect on the hippocampus, in accordance with previous studies. In addition, we detected a significant gender effect on the amygdala. Since we did not find any such differences with traditional volumetric methods, our results demonstrate the benefit of the proposed framework over volumetric approaches.
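The core operation is an eigenfunction expansion: project the surface signal onto the Laplace-Beltrami eigenfunctions, attenuate each coefficient by exp(-λt), and reconstruct. The sketch below illustrates this with NumPy/SciPy; it is not the authors' implementation, and a ring-graph Laplacian stands in for the cotan-discretized operator, with `heat_kernel_smooth` and the bandwidth `t` being illustrative names.

```python
import numpy as np
from scipy.sparse import csgraph
from scipy.linalg import eigh

def heat_kernel_smooth(signal, eigvals, eigvecs, t=1.0):
    """Smooth a per-vertex signal with the heat kernel exp(-t * Laplacian),
    expanded in the (here: graph-Laplacian) eigenbasis."""
    coeffs = eigvecs.T @ signal        # project signal onto eigenfunctions
    coeffs *= np.exp(-t * eigvals)     # attenuate high-frequency components
    return eigvecs @ coeffs            # reconstruct the smoothed signal

# Toy example: a ring graph as a stand-in for a cotan-discretized surface mesh.
n = 100
adjacency = np.zeros((n, n))
for i in range(n):
    adjacency[i, (i + 1) % n] = adjacency[(i + 1) % n, i] = 1.0
laplacian = csgraph.laplacian(adjacency, normed=False)
eigvals, eigvecs = eigh(laplacian)     # eigenpairs of the discrete Laplacian

noisy = np.sin(np.linspace(0, 2 * np.pi, n)) + 0.3 * np.random.randn(n)
smoothed = heat_kernel_smooth(noisy, eigvals, eigvecs, t=5.0)
```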

Relevance:

60.00%

Publisher:

Abstract:

The focus of the paper is the nonparametric estimation of an instrumental regression function P defined by conditional moment restrictions stemming from a structural econometric model: E[Y - P(Z) | W] = 0, involving endogenous variables Y and Z and instruments W. The function P is the solution of an ill-posed inverse problem, and we propose an estimation procedure based on Tikhonov regularization. The paper analyses identification and overidentification of this model and presents asymptotic properties of the estimated nonparametric instrumental regression function.
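The ill-posed problem is stabilized by Tikhonov regularization; in finite dimensions, the regularized solution of A x ≈ b is x_α = (AᵀA + αI)⁻¹Aᵀb. The sketch below shows only this generic Tikhonov step on a toy ill-conditioned design, not the paper's estimator for the conditional-expectation operators; `tikhonov_solve` and the value of `alpha` are assumptions for illustration.

```python
import numpy as np

def tikhonov_solve(A, b, alpha):
    """Tikhonov-regularized solution of the ill-posed system A x ~ b:
    x_alpha = argmin ||A x - b||^2 + alpha * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Toy ill-conditioned operator: the regularized solution stays stable
# even though the naive least-squares solution is very sensitive to noise.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0, 1, 50), 12, increasing=True)  # nearly collinear columns
x_true = rng.normal(size=12)
b = A @ x_true + 0.01 * rng.normal(size=50)
x_reg = tikhonov_solve(A, b, alpha=1e-3)
```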

Relevance:

60.00%

Publisher:

Abstract:

We provide new evidence on sea surface temperature (SST) variations and paleoceanographic/paleoenvironmental changes over the past 1500 years for the north Aegean Sea (NE Mediterranean). The reconstructions are based on multiproxy analyses obtained from the high-resolution (decadal to multidecadal) marine record M2 retrieved from the Athos basin. Reconstructed SSTs show an increase from ca. 850 to 950 AD and from ca. 1100 to 1300 AD. A cooling phase of almost 1.5 °C is observed from ca. 1600 AD to 1700 AD. This seems to have been the starting point of a continuous SST warming trend until the end of the reconstructed period, interrupted by two prominent cooling events at 1832 ± 15 AD and 1995 ± 1 AD. Application of adaptive kernel smoothing suggests that the current warming in the reconstructed SSTs of the north Aegean might be unprecedented in the context of the past 1500 years. Internal variability in atmospheric/oceanic circulation systems as well as external forcing such as solar radiation and volcanic activity could have affected temperature variations in the north Aegean Sea over the past 1500 years. The marked temperature drop of approximately 2 °C at 1832 ± 15 AD could be related to the 1809 AD 'unknown' and the 1815 AD Tambora volcanic eruptions. Paleoenvironmental proxy indices of the M2 record show enhanced riverine/continental inputs in the northern Aegean after ca. 1450 AD. The paleoclimatic evidence derived from the M2 record is combined with a socio-environmental study of the history of the north Aegean region. We show that the cultivation of temperature-sensitive crops, i.e. walnut, vine and olive, co-occurred with stable and warmer temperatures, while its end coincided with a significant episode of cooler temperatures. Periods of agricultural growth in Macedonia coincide with periods of warmer and more stable SSTs, but further exploration is required in order to identify the causal links behind the observed phenomena. The Black Death likely caused major changes in agricultural activity in the north Aegean region, as reflected in the pollen data from land sites of Macedonia and the M2 proxy reconstructions. Finally, we conclude that the early modern peaks in mountain vegetation in the Rhodope and Macedonia highlands, visible also in the M2 record, were very likely climate-driven.

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Background: With the development of DNA hybridization microarray technologies, it is now possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses. Therefore, choosing a suitable approach for normalization can be critical and deserves judicious consideration. Results: Here, we considered three commonly used normalization approaches, namely Loess, Splines, and Wavelets, and two non-parametric regression methods that had not yet been used for normalization, namely kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that kernel smoothing is the worst-performing normalization technique, while no practical differences were observed between Loess, Splines, and Wavelets. Conclusion: In light of these results, Support Vector Regression is favored for microarray normalization due to its robustness in estimating the normalization curve.
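For intuition, two-channel normalization typically fits a smooth bias curve of the log-ratio M against the mean log-intensity A and subtracts it. The sketch below does this with scikit-learn's SVR on synthetic data; it is a minimal illustration of the idea, not the benchmarking code of the paper, and the function name and parameter values are assumptions.

```python
import numpy as np
from sklearn.svm import SVR

def svr_normalize(A, M, C=1.0, epsilon=0.1, gamma="scale"):
    """Estimate the intensity-dependent bias curve M ~ f(A) with support
    vector regression and return the normalized log-ratios M - f(A)."""
    model = SVR(kernel="rbf", C=C, epsilon=epsilon, gamma=gamma)
    model.fit(A.reshape(-1, 1), M)
    return M - model.predict(A.reshape(-1, 1))

# Toy two-channel data: A = mean log-intensity, M = log-ratio with a smooth bias.
rng = np.random.default_rng(1)
A = rng.uniform(6, 14, size=2000)
M = 0.4 * np.sin(A) + rng.normal(scale=0.2, size=2000)   # bias + noise, no true signal
M_norm = svr_normalize(A, M)
```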

Relevance:

60.00%

Publisher:

Abstract:

Stochastic models for three-dimensional particles have many applications in the applied sciences. Lévy-based particle models are a flexible approach to particle modelling. The structure of the random particles is given by a kernel smoothing of a Lévy basis. The models are easy to simulate, but statistical inference procedures have not yet received much attention in the literature. The kernel is not always identifiable, and we suggest one approach to remedy this problem. We propose a method to draw inference about the kernel from data often used in local stereology and study the performance of our approach in a simulation study.
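A minimal simulation sketch of the idea (not the authors' model): the radial function of a 2D star-shaped particle is obtained by kernel smoothing the atoms of a compound-Poisson Lévy basis on the circle. The kernel form, the exponential jump distribution, and all parameter values are assumptions made for illustration.

```python
import numpy as np

def simulate_particle(n_points=360, rate=30, bandwidth=0.4, seed=0):
    """Simulate a 2D star-shaped random particle whose radial function is a
    kernel smoothing of a compound-Poisson Levy basis on the circle."""
    rng = np.random.default_rng(seed)
    theta = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    n_atoms = rng.poisson(rate)
    atom_angles = rng.uniform(0, 2 * np.pi, n_atoms)   # atom locations of the basis
    atom_weights = rng.exponential(1.0, n_atoms)       # positive jump sizes
    # Gaussian-shaped kernel on the circle (assumed kernel form)
    d = np.abs(theta[:, None] - atom_angles[None, :])
    d = np.minimum(d, 2 * np.pi - d)                   # wrap angular distances
    kernel = np.exp(-0.5 * (d / bandwidth) ** 2)
    radius = 1.0 + kernel @ atom_weights / max(n_atoms, 1)
    return theta, radius

theta, radius = simulate_particle()
boundary = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])
```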

Relevance:

60.00%

Publisher:

Abstract:

The Data Envelopment Analysis (DEA) efficiency score obtained for an individual firm is a point estimate without any confidence interval around it. In recent years, researchers have resorted to bootstrapping in order to generate empirical distributions of efficiency scores. This procedure assumes that all firms have the same probability of getting an efficiency score from any specified interval within the [0,1] range. We propose a bootstrap procedure that empirically generates the conditional distribution of efficiency for each individual firm given systematic factors that influence its efficiency. Instead of resampling directly from the pooled DEA scores, we first regress these scores on a set of explanatory variables not included at the DEA stage and bootstrap the residuals from this regression. These pseudo-efficiency scores incorporate the systematic effects of unit-specific factors along with the contribution of the randomly drawn residual. Data from the U.S. airline industry are utilized in an empirical application.
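A minimal sketch of the proposed second stage, assuming the DEA scores have already been computed elsewhere: regress the scores on firm-level covariates, resample the regression residuals, and add them back to the fitted values to obtain firm-specific pseudo-efficiency distributions. Function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def conditional_bootstrap(scores, X, n_boot=1000, seed=0):
    """Second-stage bootstrap sketch: regress DEA scores on firm-level
    covariates X, then resample the residuals to build a conditional
    distribution of pseudo-efficiency scores for each firm."""
    rng = np.random.default_rng(seed)
    X1 = np.column_stack([np.ones(len(scores)), X])      # add intercept
    beta, *_ = np.linalg.lstsq(X1, scores, rcond=None)
    fitted = X1 @ beta
    resid = scores - fitted
    draws = rng.choice(resid, size=(n_boot, len(scores)), replace=True)
    pseudo = np.clip(fitted + draws, 0.0, 1.0)           # keep scores in [0, 1]
    return pseudo                                         # shape: (n_boot, n_firms)

# Hypothetical inputs: DEA scores from a prior stage and two covariates per firm.
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 2))
scores = np.clip(0.8 + 0.05 * X[:, 0] + 0.05 * rng.normal(size=50), 0, 1)
pseudo_scores = conditional_bootstrap(scores, X)
```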

Relevance:

60.00%

Publisher:

Abstract:

We here present a compilation of planktic and benthic 14C reservoir ages for the Last Glacial Maximum (LGM) and early deglacial from 11 key sites of global ocean circulation in the Atlantic and Indo-Pacific Ocean. The ages were obtained by 14C plateau tuning, a robust technique to derive both an absolute chronology for marine sediment records and a high-resolution record of changing reservoir/ventilation ages (Delta14C values) for surface and deep waters by comparing the suite of planktic 14C plateaus of a sediment record with that of the atmospheric 14C record (Sarnthein et al., 2007, doi:10.1029/173GM13). Results published thus far used U/Th-dated corals, the Cariaco planktic record, and speleothems as the atmospheric 14C reference (Fairbanks et al., 2005, doi:10.1016/j.quascirev.2005.04.007; Hughen et al., 2006, doi:10.1016/j.quascirev.2006.03.014; Beck et al., 2001, doi:10.1023/A:1008175728826). We have now used the varve-counted atmospheric 14C record of Lake Suigetsu terrestrial macrofossils (Ramsey et al., 2012, doi:10.1126/science.1226660) to recalibrate the boundary ages and reservoir ages of the seven published records directly to an atmospheric 14C record. In addition, the results for four new cores and further planktic results for four published records are given. The main conclusions from the new compilation are: (1) The Suigetsu atmospheric 14C record on its varve-counted time scale reflects all 14C plateaus, their internal structures, and the relative lengths previously identified, but implies a rise in the average 14C plateau age by 200-700 14C yr during LGM and early deglacial times. (2) Based on different 14C ages of coeval atmospheric and planktic 14C plateaus, marine surface water Delta14C may have temporarily dropped to an equivalent of ~0 yr in low-latitude lagoon waters, but reached >2500 14C yr both in stratified subpolar waters and in upwelled waters such as in the South China Sea. These values differ significantly from a widely assumed constant global planktic Delta14C value of 400 yr. (3) Suites of deglacial planktic Delta14C values are closely reproducible in 14C records measured at neighboring core sites. (4) Apparent deep-water 14C ventilation ages (equivalents of benthic Delta14C), deduced from the sum of planktic Delta14C and coeval benthic-planktic 14C differences, vary from 500 to >5000 yr in LGM and deglacial ocean basins.

Relevance:

60.00%

Publisher:

Abstract:

Within the regression framework, we show how different levels of nonlinearity influence the instantaneous firing rate prediction of single neurons. Nonlinearity can be achieved in several ways. In particular, we can enrich the predictor set with basis expansions of the input variables (enlarging the number of inputs) or train a simple but different model for each area of the data domain. Spline-based models are popular within the first category. Kernel smoothing methods fall into the second category. Whereas the first choice is useful for globally characterizing complex functions, the second is very handy for temporal data and is able to include inner-state subject variations. Also, interactions among stimuli are considered. We compare state-of-the-art firing rate prediction methods with some more sophisticated spline-based nonlinear methods: multivariate adaptive regression splines and sparse additive models. We also study the impact of kernel smoothing. Finally, we explore the combination of various local models in an incremental learning procedure. Our goal is to demonstrate that appropriate nonlinearity treatment can greatly improve the results. We test our hypothesis on both synthetic data and real neuronal recordings in cat primary visual cortex, giving a plausible explanation of the results from a biological perspective.
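As one concrete ingredient, an instantaneous firing rate can be obtained by Gaussian kernel smoothing of the spike train. The sketch below shows only this step on synthetic spikes; it is not the comparison pipeline of the paper (MARS, sparse additive models, incremental local models), and the bandwidth value is an arbitrary assumption.

```python
import numpy as np

def kernel_firing_rate(spike_times, t_grid, bandwidth=0.05):
    """Estimate an instantaneous firing rate (spikes/s) by Gaussian kernel
    smoothing of the spike train, evaluated on a time grid (seconds)."""
    diffs = t_grid[:, None] - np.asarray(spike_times)[None, :]
    weights = np.exp(-0.5 * (diffs / bandwidth) ** 2)
    weights /= bandwidth * np.sqrt(2 * np.pi)     # normalize each kernel to unit area
    return weights.sum(axis=1)

# Toy spike train: a homogeneous Poisson process whose rate is recovered by the smoother.
rng = np.random.default_rng(3)
spikes = np.sort(rng.uniform(0, 2.0, size=rng.poisson(60)))
t = np.linspace(0, 2.0, 400)
rate = kernel_firing_rate(spikes, t, bandwidth=0.05)
```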

Relevance:

60.00%

Publisher:

Abstract:

Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general, and, although the introduction of a penalty is probably the most popular type, it is just one of many forms of regularization. In this dissertation, we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification, and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing learning, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques for modeling the response of biological neurons. Supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing structures of Gaussian Bayesian networks using L1-regularization as a filter.
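A minimal example of the central tool, L1-regularization for sparse regression, using scikit-learn's Lasso on synthetic data with a sparse ground truth; this only illustrates how the L1 penalty selects a subset of the inputs and is not taken from the dissertation.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sparse ground truth: only 3 of 50 inputs are relevant.
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 50))
beta = np.zeros(50)
beta[[3, 17, 42]] = [2.0, -1.5, 1.0]
y = X @ beta + 0.5 * rng.normal(size=200)

# L1 penalty: minimize ||y - X w||^2 / (2n) + alpha * ||w||_1
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)   # indices of the retained inputs
```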

Relevance:

60.00%

Publisher:

Abstract:

We provide new evidence on sea surface temperature (SST) variations and paleoceanographic/paleoenvironmental changes over the past 1500 years for the north Aegean Sea (NE Mediterranean). The reconstructions are based on multiproxy analyses obtained from the high-resolution (decadal to multi-decadal) marine record M2 retrieved from the Athos basin. Reconstructed SSTs show an increase from ca. 850 to 950 AD and from ca. 1100 to 1300 AD. A cooling phase of almost 1.5 °C is observed from ca. 1600 AD to 1700 AD. This seems to have been the starting point of a continuous SST warming trend until the end of the reconstructed period, interrupted by two prominent cooling events at 1832 ± 15 AD and 1995 ± 1 AD. Application of adaptive kernel smoothing suggests that the current warming in the reconstructed SSTs of the north Aegean might be unprecedented in the context of the past 1500 years. Internal variability in atmospheric/oceanic circulation systems as well as external forcing such as solar radiation and volcanic activity could have affected temperature variations in the north Aegean Sea over the past 1500 years. The marked temperature drop of approximately 2 °C at 1832 ± 15 AD could be related to the 1809 AD 'unknown' and the 1815 AD Tambora volcanic eruptions. Paleoenvironmental proxy indices of the M2 record show enhanced riverine/continental inputs in the northern Aegean after ca. 1450 AD. The paleoclimatic evidence derived from the M2 record is combined with a socio-environmental study of the history of the north Aegean region. We show that the cultivation of temperature-sensitive crops, i.e. walnut, vine and olive, co-occurred with stable and warmer temperatures, while its end coincided with a significant episode of cooler temperatures. Periods of agricultural growth in Macedonia coincide with periods of warmer and more stable SSTs, but further exploration is required in order to identify the causal links behind the observed phenomena. The Black Death likely caused major changes in agricultural activity in the north Aegean region, as reflected in the pollen data from land sites of Macedonia and the M2 proxy reconstructions. Finally, we conclude that the early modern peaks in mountain vegetation in the Rhodope and Macedonia highlands, visible also in the M2 record, were very likely climate-driven.

Relevance:

40.00%

Publisher:

Abstract:

Purpose: To quantify to what extent the new registration method DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra) may reduce the required smoothing kernel width, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated-atrophy approach was employed to explore the roles of smoothing kernel, group size, and their interactions on VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8-10 mm for groups of 25, at P < 0.05 with family-wise error correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, but at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency toward smaller kernels for larger groups. Importantly, kernel selection was also affected by the threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
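For reference, the smoothing step in a VBM pipeline applies an isotropic Gaussian kernel specified by its FWHM in millimetres, with sigma = FWHM / (2 * sqrt(2 * ln 2)). The sketch below shows this conversion with SciPy on a hypothetical gray-matter map; it is illustrative only and not the simulated-atrophy pipeline of the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_fwhm(volume, fwhm_mm, voxel_size_mm):
    """Apply an isotropic Gaussian smoothing kernel specified by its FWHM in mm,
    as typically done before VBM statistics (sigma = FWHM / (2*sqrt(2*ln 2)))."""
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    sigma_vox = sigma_mm / np.asarray(voxel_size_mm, dtype=float)  # per-axis sigma in voxels
    return gaussian_filter(volume, sigma=sigma_vox)

# Hypothetical gray-matter probability map on a 1.5 mm isotropic grid.
gm = np.random.default_rng(5).random((80, 96, 80))
gm_6mm = smooth_fwhm(gm, fwhm_mm=6.0, voxel_size_mm=(1.5, 1.5, 1.5))
```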

Relevance:

30.00%

Publisher:

Abstract:

Medical image segmentation finds application in computer-aided diagnosis, computer-guided surgery, measuring tissue volumes, and locating tumors and pathologies. One approach to segmentation is to use active contours, or snakes. Active contours start from an initialization (often manually specified) and are guided by image-dependent forces to the object boundary. Snakes may also be guided by gradient vector fields associated with an image. The first main result in this direction is that of Xu and Prince, who proposed the notion of gradient vector flow (GVF), which is computed iteratively. We propose a new formalism to compute the vector flow based on the notion of bilateral filtering of the gradient field associated with the edge map, which we refer to as the bilateral vector flow (BVF). The range kernel we employ differs from the one used in the standard Gaussian bilateral filter. The advantage of the BVF formalism is that smooth gradient vector flow fields with enhanced edge information can be computed noniteratively. The quality of image segmentation turned out to be on par with that obtained using GVF and, in some cases, better.
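To convey the idea of bilaterally filtering a gradient field, the sketch below smooths a 2D edge-map gradient using spatial weights from pixel distance and range weights from gradient differences. It uses a plain Gaussian range kernel, whereas the paper deliberately employs a different range kernel, so this is an illustration of the general mechanism rather than the BVF method itself; all names and parameters are assumptions.

```python
import numpy as np

def bilateral_filter_gradient(gx, gy, radius=3, sigma_s=2.0, sigma_r=0.2):
    """Bilaterally smooth a 2D gradient field (gx, gy): spatial weights from
    pixel distance, range weights from the difference in gradient vectors.
    Illustrative only; the BVF paper uses a different range kernel."""
    h, w = gx.shape
    out_x, out_y = np.zeros_like(gx), np.zeros_like(gy)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    gx_p = np.pad(gx, radius, mode="edge")
    gy_p = np.pad(gy, radius, mode="edge")
    for i in range(h):
        for j in range(w):
            wx = gx_p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            wy = gy_p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            diff2 = (wx - gx[i, j]) ** 2 + (wy - gy[i, j]) ** 2
            wgt = spatial * np.exp(-diff2 / (2 * sigma_r**2))
            wsum = wgt.sum()
            out_x[i, j] = (wgt * wx).sum() / wsum
            out_y[i, j] = (wgt * wy).sum() / wsum
    return out_x, out_y

# Toy edge-map gradient of a small synthetic image with a square object.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
gy, gx = np.gradient(img)
bvf_x, bvf_y = bilateral_filter_gradient(gx, gy)
```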