909 results for kernel estimator
Abstract:
Among the types of remote sensing acquisitions, optical images are certainly one of the most widely relied upon data sources for Earth observation. They provide detailed measurements of the electromagnetic radiation reflected or emitted by each pixel in the scene. Through a process termed supervised land-cover classification, this makes it possible to distinguish objects at the surface of our planet automatically yet accurately. In this respect, when producing a land-cover map of the surveyed area, the availability of training examples representative of each thematic class is crucial for the success of the classification procedure. However, in real applications, due to several constraints on the sample collection process, labeled pixels are usually scarce. When analyzing an image for which those key samples are unavailable, a viable solution consists in resorting to the ground truth data of other, previously acquired images. This option is attractive, but several factors such as atmospheric, ground and acquisition conditions can cause radiometric differences between the images, thereby hindering the transfer of knowledge from one image to another. The goal of this Thesis is to supply remote sensing image analysts with suitable processing techniques to ensure a robust portability of classification models across different images. The ultimate purpose is to map the land-cover classes over large spatial and temporal extents with minimal ground information. To overcome, or simply quantify, the observed shifts in the statistical distribution of the spectra of the materials, we study four approaches drawn from the field of machine learning. First, we propose a strategy to intelligently sample the image of interest, collecting labels only for the most useful pixels. This iterative routine is based on a constant evaluation of the pertinence to the new image of the initial training data, which actually belong to a different image. Second, we present an approach that reduces the radiometric differences among the images by projecting the respective pixels into a common new data space. We analyze a kernel-based feature extraction framework suited for such problems, showing that, after this relative normalization, the cross-image generalization ability of a classifier is greatly increased. Third, we test a new data-driven measure of distance between probability distributions to assess the distortions caused by differences in the acquisition geometry affecting series of multi-angle images. We also gauge the portability of classification models through the sequences. In both exercises, the efficacy of classic physically and statistically based normalization methods is discussed. Finally, we explore a new family of approaches based on sparse representations of the samples to reciprocally convert the data spaces of two images. The projection function bridging the images allows the synthesis of new pixels with more similar characteristics, ultimately facilitating land-cover mapping across images.
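As an illustration of the relative-normalization idea, the following minimal sketch projects pixels of two images into a shared kernel feature space before training a classifier; it uses generic kernel PCA rather than the thesis's specific framework, and all array names and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC

# Illustrative data: rows are pixels, columns are spectral bands.
rng = np.random.default_rng(0)
X_source = rng.normal(0.0, 1.0, (500, 6))              # labeled source-image pixels
y_source = (X_source[:, 0] > 0).astype(int)            # toy labels
X_target = X_source + rng.normal(0.3, 0.2, (500, 6))   # radiometrically shifted target

# Project both images into a common nonlinear feature space.
kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.1)
Z = kpca.fit_transform(np.vstack([X_source, X_target]))
Z_source, Z_target = Z[:500], Z[500:]

# A classifier trained on the source features is then applied to the target image.
clf = SVC(kernel="linear").fit(Z_source, y_source)
target_labels = clf.predict(Z_target)
```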
Abstract:
In the domain of bilateral assignment games, we present an axiomatization of the nucleolus as the unique solution satisfying consistency with respect to the derived game defined by Owen (1992) and monotonicity of the sectors' complaints with respect to their cardinality. As a consequence, we obtain a geometric characterization of the nucleolus by means of a bisection property stronger than the one satisfied by the points of the kernel (Maschler et al., 1979).
Abstract:
Recently, interest in applying long-memory models, above all ARFIMA models, to economic variables has grown considerably. Undoubtedly, the most widely used method for estimating these models in economic analysis is the one proposed by Geweke and Porter-Hudak (GPH), even though recent work has shown that, in certain cases, this estimator carries a very substantial bias. We therefore propose an extension of this estimator based on the exponential model introduced by Bloomfield, which makes it possible to correct this bias. We then analyze and compare the behavior of both estimators in samples of moderate size and show that the proposed estimator attains a smaller mean squared error than the GPH estimator.
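For reference, the GPH estimator regresses the log periodogram at the first m Fourier frequencies on log(4 sin^2(lambda/2)); the memory parameter d is minus the slope. A minimal sketch, with the common bandwidth m = n^(1/2) assumed as an illustrative default:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Geweke-Porter-Hudak log-periodogram estimate of the memory parameter d."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if m is None:
        m = int(np.sqrt(n))            # common illustrative bandwidth choice
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / n          # Fourier frequencies
    # Periodogram at the first m Fourier frequencies.
    dft = np.fft.fft(x - x.mean())
    I = np.abs(dft[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    X = np.log(4.0 * np.sin(lam / 2.0) ** 2)
    slope = np.polyfit(X, np.log(I), 1)[0]
    return -slope                      # d_hat is minus the regression slope

# Example: white noise should give d close to 0.
print(gph_estimate(np.random.default_rng(1).normal(size=2048)))
```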
Abstract:
This work extends the notion of bilateral symmetrically bargained equilibrium introduced by Rochford (1983) to multilateral assignment games. A payoff vector corresponding to a symmetric multilateral bargaining equilibrium (SMB) is a core imputation guaranteeing that every agent is in equilibrium with respect to a bargaining process among all agents, based on what each of them could receive, and use as a threat, in an optimal matching different from the one actually formed. We prove that, for multilateral assignment games, the set of SMB is always nonempty and that, unlike in the bilateral case, it does not always coincide with the kernel (Davis and Maschler, 1965). Finally, we answer a question left open by Rochford (1982) by introducing a kernel-like set that, together with the core, allows us to characterize the set of SMB.
Abstract:
Losses of productivity of flooded rice in the State of Rio Grande do Sul, Brazil, may occur in the Coastal Plains and in the Southern region due to the use of saline water from coastal rivers, ponds and the Laguna dos Patos lagoon, and the sensitivity of the plants varies according to their stage of development. The purpose of this research was to evaluate rice grain yield and its components, spikelet sterility and the phenological development of rice at different levels of salinity applied during different periods of its cycle. The experiment was conducted in a greenhouse, in pots filled with 11 dm³ of an Albaqualf. The salinity levels of 0.3 (control), 0.75, 1.5, 3.0 and 4.5 dS m⁻¹ were maintained in the water layer by adding a sodium chloride solution, except for the control, during different periods of rice development: tillering initiation to panicle initiation; tillering initiation to full flowering; tillering initiation to physiological maturity; panicle initiation to full flowering; panicle initiation to physiological maturity; and full flowering to physiological maturity. The number of panicles per pot, the number of spikelets per panicle, the 1,000-kernel weight, the spikelet sterility, the grain yield and the phenology were evaluated. All characteristics were negatively affected, in a quadratic manner, by increased salinity in all periods of rice development. Among the yield components evaluated, the one most closely related to rice grain yield was spikelet sterility.
Abstract:
We propose an iterative procedure to minimize the sum-of-squares function which avoids the nonlinearity inherent in estimating the first-order moving average parameter and provides a closed form for the estimator. The asymptotic properties of the method are discussed, and the consistency of the linear least squares estimator is proved for the invertible case. We perform various Monte Carlo experiments in order to compare the sample properties of the linear least squares estimator with its nonlinear counterpart for the conditional and unconditional cases. Some examples are also discussed.
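A closed-form linear least squares estimate of the MA(1) parameter can be sketched in the Hannan-Rissanen spirit: a long autoregression first proxies the unobserved innovations, and a second OLS step then yields theta in closed form. This is a generic two-step linear scheme, not necessarily the paper's exact iteration; the lag length p is an illustrative choice.

```python
import numpy as np

def linear_ma1(y, p=20):
    """Two-step linear LS estimate of theta in y_t = e_t + theta * e_{t-1}."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    n = y.size
    # Step 1: long AR(p) by OLS to approximate the innovations e_t.
    Y = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    phi, *_ = np.linalg.lstsq(Y, y[p:], rcond=None)
    e = np.zeros(n)
    e[p:] = y[p:] - Y @ phi
    # Step 2: regress y_t on the lagged residual proxy e_{t-1}.
    theta = np.dot(y[p + 1:], e[p:-1]) / np.dot(e[p:-1], e[p:-1])
    return theta

# Example: simulate an invertible MA(1) with theta = 0.5.
rng = np.random.default_rng(2)
e = rng.normal(size=5000)
y = e[1:] + 0.5 * e[:-1]
print(linear_ma1(y))
```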
Abstract:
This research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, the simple k-nearest neighbor algorithm is considered. PNN is a neural-network reformulation of well-known nonparametric principles: probability density modeling using a kernel density estimator, combined with Bayes-optimal or maximum a posteriori decision rules. PNN is well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to different environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper both simulated and real data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the algorithms applied.
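The PNN principle described above admits a very compact sketch: each class density is a Parzen (Gaussian kernel) estimate over that class's training points, and a sample is assigned to the class maximizing prior times density. The bandwidth sigma and the toy data below are illustrative assumptions.

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    """Probabilistic neural network: Parzen density per class + Bayes rule."""
    classes = np.unique(y_train)
    scores = np.empty((X_test.shape[0], classes.size))
    for i, c in enumerate(classes):
        Xc = X_train[y_train == c]
        # Squared distances from every test point to every training point of class c.
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        density = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
        prior = Xc.shape[0] / X_train.shape[0]
        scores[:, i] = prior * density
    return classes[np.argmax(scores, axis=1)]

# Example: two Gaussian blobs in 2-D.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.repeat([0, 1], 100)
print(pnn_classify(X, y, np.array([[0.0, 0.0], [3.0, 3.0]])))
```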
Abstract:
One of the most important statistical tools for monitoring and analyzing the short-term evolution of economic activity is the availability of estimates of the quarterly evolution of the GDP components, on both the supply and the demand side. The need for this information with a short publication lag makes it essential to use temporal-disaggregation methods that convert annual information into quarterly figures. The most widely applied method, since it solves this problem very elegantly within a statistically optimal estimation framework, is the Chow-Lin method. However, this method does not guarantee that the quarterly GDP estimates obtained from the supply side and from the demand side coincide, making it necessary to subsequently apply some reconciliation method. In this paper we develop a multivariate extension of the Chow-Lin method that solves the problem of estimating the quarterly values optimally, subject to a set of constraints. One of the potential applications of this method, which we call the restricted Chow-Lin method, is precisely the joint estimation of quarterly values for each of the GDP components, on both the demand and the supply side, conditioned on both quarterly GDP estimates being equal, thus avoiding the need to apply reconciliation methods afterwards.
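For context, the univariate Chow-Lin step that the paper extends can be sketched as GLS of the annual series on temporally aggregated quarterly indicators, followed by distribution of the annual residuals. In this sketch the AR(1) parameter rho of the quarterly disturbance is fixed rather than estimated, and all data are illustrative.

```python
import numpy as np

def chow_lin(y_annual, X_quarterly, rho=0.75):
    """Chow-Lin temporal disaggregation with AR(1) quarterly residuals."""
    n_a, n_q = y_annual.size, X_quarterly.shape[0]
    # Aggregation matrix: each annual value is the sum of 4 quarters.
    C = np.kron(np.eye(n_a), np.ones((1, 4)))
    # AR(1) covariance (up to scale, which cancels) of the quarterly disturbance.
    V = rho ** np.abs(np.subtract.outer(np.arange(n_q), np.arange(n_q)))
    W = np.linalg.inv(C @ V @ C.T)
    CX = C @ X_quarterly
    beta = np.linalg.solve(CX.T @ W @ CX, CX.T @ W @ y_annual)
    resid = y_annual - CX @ beta
    # GLS fit plus the distributed annual residuals.
    return X_quarterly @ beta + V @ C.T @ W @ resid

# Example: 10 years, one quarterly indicator plus a constant.
rng = np.random.default_rng(4)
Xq = np.column_stack([np.ones(40), np.cumsum(rng.normal(1, 0.1, 40))])
ya = np.kron(np.eye(10), np.ones((1, 4))) @ (Xq @ np.array([2.0, 1.5]))
print(chow_lin(ya, Xq)[:8])
```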
Abstract:
In this paper we study, taking as theoretical reference the economic model of crime (Becker, 1968; Ehrlich, 1973), the socioeconomic and demographic determinants of crime in Spain, paying attention to the role of provincial peculiarities. We estimate a crime equation using a panel dataset of Spanish provinces (NUTS3) for the period 1993 to 1999, employing the system GMM estimator. Empirical results suggest that the lagged crime rate and the clear-up rate are correlated with all typologies of crime considered. Property crimes are better explained by socioeconomic variables (GDP per capita, GDP growth rate, and percentage of the population with a high school or university degree), while demographic factors reveal important and significant influences, in particular for crimes against the person. These results are obtained using an instrumental variable approach that takes advantage of the dynamic properties of our dataset to control for both measurement errors in crime data and joint endogeneity of the explanatory variables.
Abstract:
Uniform-price assignment games are introduced as those assignment markets whose core reduces to a segment. In these games, competitive prices are uniform for all active agents, although products may be non-homogeneous. A characterization in terms of the assignment matrix is given. The only assignment markets in which all submarkets are uniform are the Böhm-Bawerk horse markets. We prove that for uniform-price assignment games the kernel, or set of symmetrically pairwise bargained allocations, either coincides with the core or reduces to the nucleolus.
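The primitive of an assignment game is the matrix of pairwise joint gains, and the worth of the grand coalition is the value of an optimal matching, which scipy computes directly. A minimal sketch with an illustrative 3x3 matrix (the characterization of uniform-price markets itself is not implemented here):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Illustrative buyer-seller joint-gain matrix of an assignment game.
A = np.array([[5.0, 8.0, 2.0],
              [7.0, 9.0, 6.0],
              [2.0, 3.0, 0.0]])

# Optimal matching: maximizes total gain (the worth of the grand coalition).
rows, cols = linear_sum_assignment(A, maximize=True)
print(list(zip(rows, cols)), A[rows, cols].sum())
```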
Abstract:
This paper reviews some of the classical applications of the bootstrap to survival analysis. We first consider the bootstrap estimator of the variance and the bias-corrected median estimator for the Kaplan-Meier estimator of the survival function. We then consider some more recent developments, such as methods for constructing confidence bands for the estimated survival function and approximate tests for comparing survival functions. In both situations the bootstrap proves very useful for approximating the required critical values.
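The basic resampling scheme behind these applications is easy to sketch: draw (time, event) pairs with replacement and recompute the Kaplan-Meier functional on each replicate. Below, a minimal numpy version for the bootstrap variance of the median survival time; the simulated data and replicate count are illustrative.

```python
import numpy as np

def km_median(times, events):
    """Median survival time from the Kaplan-Meier estimator (distinct times assumed)."""
    order = np.argsort(times)
    t, d = times[order], events[order]
    at_risk = np.arange(t.size, 0, -1)     # number at risk just before each time
    surv = np.cumprod(1.0 - d / at_risk)   # Kaplan-Meier survival curve
    below = np.nonzero(surv <= 0.5)[0]
    return t[below[0]] if below.size else np.inf

rng = np.random.default_rng(5)
times = rng.exponential(10.0, 200)
events = (rng.uniform(size=200) < 0.8).astype(float)   # roughly 20% censoring

# Bootstrap: resample (time, event) pairs and recompute the median each time.
idx = rng.integers(0, 200, size=(1000, 200))
reps = np.array([km_median(times[i], events[i]) for i in idx])
print(reps.var())   # bootstrap estimate of the variance of the KM median
```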
Abstract:
In groundwater applications, Monte Carlo methods are employed to model the uncertainty on geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered for estimating the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised that both employ the difference between approximate and exact medoid solutions, but differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which the single realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the multiscale finite-volume (MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques to select a subset of realizations.
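The select-then-correct logic can be sketched generically: cluster the approximate responses, run the exact model only on each cluster medoid, and correct the full ensemble with an error model fitted on the medoids. The clustering method (KMeans) and the error model (a global mean-error shift) below are simplified placeholders, not the paper's exact DKM pipeline or its MsFV model.

```python
import numpy as np
from sklearn.cluster import KMeans

def run_exact(r):            # placeholder for the expensive exact model
    return r * 1.1 + 0.2     # illustrative "true" response

rng = np.random.default_rng(6)
approx = rng.normal(0.0, 1.0, (200, 30))   # one approximate response curve per realization

# Cluster realizations by their approximate responses.
k = 8
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(approx)

# Medoid of each cluster: the realization closest to its cluster center.
medoids = np.array([np.argmin(((approx - c) ** 2).sum(axis=1)
                              + 1e9 * (km.labels_ != i))
                    for i, c in enumerate(km.cluster_centers_)])

# Exact model only on the medoids; global error model = mean medoid error.
errors = np.array([run_exact(approx[m]) - approx[m] for m in medoids])
corrected = approx + errors.mean(axis=0)   # bias-corrected full ensemble
```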
Abstract:
Adequate modeling of water infiltration into the soil is fundamental for estimating water movement, water erosion, and aquifer recharge and contamination. This work presents a model for estimating water infiltration into the soil (GAML-c), based on the Green-Ampt-Mein-Larson model, which describes the geometry and displacement of the wetting front in the soil. Experimental tests were conducted on a Red-Yellow Latosol to evaluate GAML-c under four different scenarios: soil hydraulic conductivity equal to the steady infiltration rate (Tie) and maximum soil water content equal to the water content in the transmission zone (θw) (TW); hydraulic conductivity equal to that of the saturated soil (K0) and maximum soil water content equal to θw (KW); hydraulic conductivity equal to Tie and maximum soil water content equal to the saturated water content (θs) (TS); and hydraulic conductivity equal to K0 and maximum soil water content equal to θs (KS). The GAML-c model under the TW scenario was the best estimator of the soil moisture profile, yielding acceptable estimates of water infiltration.
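The Green-Ampt core of GAML-c relates the infiltration rate f to cumulative infiltration F through f = K(1 + psi*dtheta/F); under the TW scenario, K would be set to Tie and dtheta computed from θw. A minimal explicit time-stepping sketch with illustrative parameter values:

```python
import numpy as np

def green_ampt(K, psi, dtheta, t_end=3600.0, dt=1.0, F0=1e-4):
    """Explicit integration of dF/dt = K * (1 + psi*dtheta/F)."""
    steps = int(t_end / dt)
    F = np.empty(steps)
    F[0] = F0                                    # small initial cumulative infiltration (cm)
    for i in range(1, steps):
        f = K * (1.0 + psi * dtheta / F[i - 1])  # infiltration rate (cm/s)
        F[i] = F[i - 1] + f * dt
    return F

# Illustrative values: K in cm/s, wetting-front suction head psi in cm, dtheta dimensionless.
F = green_ampt(K=2.0e-4, psi=11.0, dtheta=0.25)
print(F[-1])  # cumulative infiltration after one hour (cm)
```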
Abstract:
We investigate the depinning transition occurring in dislocation assemblies. In particular, we consider the cases of regularly spaced pileups and low-angle grain boundaries interacting with a disordered stress landscape provided by solute atoms, or by other immobile dislocations present in nonactive slip systems. Using linear elasticity, we compute the stress originated by small deformations of these assemblies and the corresponding energy cost in two and three dimensions. Contrary to the case of isolated dislocation lines, which are usually approximated as elastic strings with an effective line tension, the deformations of a dislocation assembly cannot be described by local elastic interactions with a constant tension or stiffness. A nonlocal elastic kernel results as a consequence of long-range interactions between dislocations. In light of this result, we revise statistical depinning theories of dislocation assemblies and compare the theoretical results with numerical simulations and experimental data.
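In Fourier space the contrast with the line-tension picture can be stated compactly: a local elastic string costs energy growing as q^2, while long-range interactions between dislocations yield a kernel linear in |q| at small wave vectors. The following display is an illustrative schematic form, not the paper's detailed result:

```latex
E[h] \simeq \frac{1}{2} \int \frac{dq}{2\pi} \, K(q) \, |h(q)|^{2},
\qquad
K_{\mathrm{local}}(q) \propto q^{2},
\qquad
K_{\mathrm{nonlocal}}(q) \propto |q|.
```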
Abstract:
We point out that using the heat kernel on a cone to compute the first quantum correction to the entropy of Rindler space does not yield the correct temperature dependence. In order to obtain the physics at arbitrary temperature one must compute the heat kernel in a geometry with different topology (without a conical singularity). This is done in two ways, which are shown to agree with computations performed by other methods. Also, we discuss the ambiguities in the regularization procedure and their physical consequences.