150 results for Image space
at the Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with an entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas with different statistical characteristics, thus avoiding the large residuals produced by algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a result with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope, and it can also be applied to ground-based images.
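A minimal sketch of the space-variant-hyperparameter idea, assuming a weight map built from the segmentation labels and a multiplicative MAP update with an entropy prior; this is schematic rather than the exact FMAPE recursion, and the function names are hypothetical.

```python
import numpy as np
from scipy.signal import fftconvolve

def hyperparameter_map(segmentation, alpha_per_region):
    """Build a space-variant hyperparameter image from a segmentation:
    every pixel of region `label` gets the weight chosen for that region."""
    alpha = np.zeros(segmentation.shape, dtype=float)
    for label, a in alpha_per_region.items():
        alpha[segmentation == label] = a
    return alpha

def map_entropy_step(f, data, psf, alpha, prior):
    """One multiplicative MAP update with an entropy prior whose weight
    `alpha` varies pixel by pixel (schematic, not the exact FMAPE update).
    `psf` is assumed normalized to unit sum."""
    model = fftconvolve(f, psf, mode="same")
    ratio = fftconvolve(data / np.maximum(model, 1e-12),
                        psf[::-1, ::-1], mode="same")        # Richardson-Lucy ratio term
    entropy_grad = np.log(np.maximum(f, 1e-12) / np.maximum(prior, 1e-12))
    return f * ratio * np.exp(-alpha * entropy_grad)         # stronger pull toward the prior where alpha is large
```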
Abstract:
This article reports a lossless data hiding scheme for digital images in which the data hiding capacity is determined either by a minimum acceptable subjective quality or by the demanded capacity. In the proposed method, data are hidden within the image prediction errors; well-known prediction algorithms such as the median edge detector (MED), gradient adjusted prediction (GAP) and the Jiang prediction are tested for this purpose. First, the histogram of the prediction errors of the image is computed; then, based on the required capacity or desired image quality, the prediction error values with frequencies larger than this capacity are shifted. The empty space created by this shift is used for embedding the data. Experimental results show the distinct superiority of the prediction error histogram over the conventional image histogram itself, owing to the much narrower spectrum of the former. We have also devised an adaptive method for hiding data, in which subjective quality is traded for data hiding capacity: the positive and negative error values are chosen such that the sum of their frequencies in the histogram is just above the given capacity or above a certain quality.
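A hedged sketch of the embedding mechanism, using the MED predictor named in the abstract; it embeds on the positive side of the error histogram only and omits the causal-context and overflow bookkeeping that a real reversible scheme requires.

```python
import numpy as np

def med_predict(img):
    """Median edge detector (MED) prediction from the left (a), upper (b)
    and upper-left (c) neighbours of each pixel."""
    x = img.astype(int)
    a = np.roll(x, 1, axis=1)                       # left neighbour
    b = np.roll(x, 1, axis=0)                       # upper neighbour
    c = np.roll(np.roll(x, 1, axis=0), 1, axis=1)   # upper-left neighbour
    pred = np.where(c >= np.maximum(a, b), np.minimum(a, b),
           np.where(c <= np.minimum(a, b), np.maximum(a, b), a + b - c))
    pred[0, :], pred[:, 0] = x[0, :], x[:, 0]       # border pixels left unpredicted
    return pred

def embed(img, bits, peak=0):
    """Shift prediction errors above `peak` up by one to empty the bin
    peak+1, then embed one bit per pixel whose error equals `peak`."""
    err = img.astype(int) - med_predict(img)
    out = img.astype(int)
    out[err > peak] += 1                            # histogram shift creates the empty bin
    carriers = np.flatnonzero(err == peak)          # embeddable positions, raster order
    for pos, bit in zip(carriers, bits):
        out.flat[pos] += int(bit)                   # error becomes peak (bit 0) or peak+1 (bit 1)
    return np.clip(out, 0, 255).astype(np.uint8)
```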
Abstract:
This work proposes the detection of red peaches in orchard images based on the definition of different linear color models in the RGB vector color space. The classification and segmentation of the pixels of the image are then performed by comparing the color distance from each pixel to the different previously defined linear color models. The proposed methodology has been tested with images obtained in a real orchard under natural light. The peach variety in the orchard was the paraguayo (Prunus persica var. platycarpa) peach with red skin. The segmentation results showed that the area of the red peaches in the images was detected with an average error of 11.6%: 19.7% under bright illumination; 8.2% under low illumination; 8.6% for occlusion up to 33%; 12.2% for occlusion between 34% and 66%; and 23% for occlusion above 66%. Finally, a methodology was proposed to estimate the diameter of the fruits based on an ellipsoidal fit. A first diameter was obtained using all the contour pixels, and a second diameter was obtained after rejecting some of the contour pixels. Comparing the two diameter estimates enables a rough estimate of the range of the fruit occlusion percentage.
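A small sketch of the pixel classification step, assuming each linear color model is a line in RGB space defined by two reference colors; the function name and decision rule are illustrative, not the paper's exact implementation.

```python
import numpy as np

def distance_to_color_line(pixels, p0, p1):
    """Euclidean distance in RGB space from each pixel of an HxWx3 image
    to the line through the reference colors p0 and p1 (one linear color model)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = (p1 - p0) / np.linalg.norm(p1 - p0)        # unit direction of the line
    v = pixels.reshape(-1, 3).astype(float) - p0   # vectors from p0 to the pixels
    proj = v @ d                                   # signed length of the projection
    closest = np.outer(proj, d)                    # closest point on the line (relative to p0)
    return np.linalg.norm(v - closest, axis=1).reshape(pixels.shape[:2])

# a pixel is labelled "red peach" when its distance to the red-peach line is
# smaller than its distance to every background line (leaves, sky, branches)
```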
Abstract:
This article describes the procedures carried out to register two images geometrically and automatically, taking the first one as the reference image. The results obtained with three methods are compared. The first method is classical registration in the spatial domain by maximizing the cross-correlation (MCC) [1]. The second method applies MCC registration together with a multiscale analysis based on wavelet transforms [2]. The third method is a variant of the previous one that lies halfway between the two. For each method, an estimate of the coefficients of the transformation relating the two images is obtained. The second image is then transformed in each case and georeferenced with respect to the first. Finally, quantitative measures are proposed that allow the results obtained with each method to be discussed and compared.
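A translation-only sketch of registration by maximizing the cross-correlation (MCC), with a plain decimation pyramid standing in for the wavelet-based multiscale analysis of the second method; estimating the full transformation coefficients and the georeferencing step are beyond this sketch.

```python
import numpy as np

def translation_by_mcc(ref, img):
    """Integer shift (dy, dx) that maximizes the cross-correlation between
    `ref` and `img`, computed in the Fourier domain."""
    R = np.fft.fft2(ref - ref.mean())
    I = np.fft.fft2(img - img.mean())
    corr = np.real(np.fft.ifft2(np.conj(R) * I))        # peaks where img ~ ref shifted
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]       # map large positive shifts
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]       # back to negative offsets
    return int(dy), int(dx)

def coarse_to_fine(ref, img, levels=3):
    """Refine the shift estimate from coarse (decimated) to full resolution."""
    dy = dx = 0
    for k in reversed(range(levels)):
        warped = np.roll(img, (-dy, -dx), axis=(0, 1))  # undo the shift found so far
        sdy, sdx = translation_by_mcc(ref[::2**k, ::2**k], warped[::2**k, ::2**k])
        dy, dx = dy + sdy * 2**k, dx + sdx * 2**k
    return dy, dx
```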
Abstract:
An algebraic decay rate is derived which bounds the time required for velocities to equilibrate in a spatially homogeneous flow-through model representing the continuum limit of a gas of particles interacting through slightly inelastic collisions. This rate is obtained by reformulating the dynamical problem as the gradient flow of a convex energy on an infinite-dimensional manifold. An abstract theory is developed for gradient flows in length spaces, which shows how degenerate convexity (or even non-convexity), if uniformly controlled, will quantify contractivity (limit expansivity) of the flow.
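For context, the standard contraction estimate for the gradient flow of a λ-geodesically convex energy in this metric setting is sketched below; the degenerate case λ = 0, when uniformly controlled, is what leads to algebraic rather than exponential rates. This is the generic statement, not the paper's sharpened result.

\[
\frac{1}{2}\frac{d}{dt}\, d^2(x_t, y) + \frac{\lambda}{2}\, d^2(x_t, y) \;\le\; E(y) - E(x_t)
\quad\Longrightarrow\quad
d(x_t, y_t) \;\le\; e^{-\lambda t}\, d(x_0, y_0).
\]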
Abstract:
We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric on the set of probability measures. Existence, uniqueness, boundedness of moments and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t → ∞ to this diffusive equilibrium in this distance, which metrizes the weak convergence of measures. We then prove a uniform-in-time bound on Sobolev norms of the solution, provided the initial data has a finite norm in the corresponding Sobolev space. These results are then combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹ norm, as well as in various Sobolev norms.
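A sketch of the interpolation step mentioned at the end: exponential convergence in L¹ combined with a uniform-in-time bound in a high-order Sobolev space H^k gives exponential decay, at a reduced rate, in intermediate Sobolev norms, for some exponent θ ∈ (0,1) depending on s, k and the dimension (the exact exponents used in the paper are not reproduced here):

\[
\|f(t)-f_\infty\|_{H^s}
\;\le\; C\,\|f(t)-f_\infty\|_{L^1}^{\theta}\,\|f(t)-f_\infty\|_{H^k}^{1-\theta}
\;\le\; C'\, e^{-\theta\lambda t}, \qquad 0 \le s < k .
\]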
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
"See the abstract at the beginning of the document in the attached file".
Abstract:
Report on the scientific sojourn at the Department of Information Technology (INTEC) of Ghent University, Belgium, from January to June 2007. All-Optical Label Swapping (AOLS) is a key technology for the implementation of All-Optical Packet Switching (AOPS) nodes in the future optical Internet. The capital expenditure of deploying AOLS increases with the size of the label spaces (i.e. the number of labels used), since a special optical device is needed for each label recognized at every node. Label space sizes are affected by the way in which demands are routed: while shortest-path routing leads to the use of fewer labels but high link utilization, minimum interference routing leads to the opposite. This project studies and proposes All-Optical Label Stacking (AOLStack), an extension of the AOLS architecture that aims at reducing label spaces while easing the compromise with link utilization. An Integer Linear Program is proposed with the objective of analyzing how AOLStack softens the aforementioned trade-off. Furthermore, a heuristic aimed at finding good solutions in polynomial time is proposed as well. Simulation results show that AOLStack either a) reduces the label spaces with a small increase in link utilization or, equivalently, b) makes better use of the residual bandwidth to decrease the number of labels even further.
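In generic form, the trade-off explored by such an Integer Linear Program can be written as below, with binary path-choice variables x_{d,p}, demand bandwidths b_d, link capacities c_e and a maximum-utilization variable U; this is only an illustrative bi-criteria formulation, not the AOLStack model of the report (counting the distinct labels L_v that node v must recognize would require additional binary variables).

\[
\min \;\; \sum_{v \in V} L_v + \beta\, U
\quad\text{s.t.}\quad
\sum_{p \in P_d} x_{d,p} = 1 \;\;\forall d,\qquad
\frac{1}{c_e}\sum_{d}\sum_{p \ni e} b_d\, x_{d,p} \le U \;\;\forall e,\qquad
x_{d,p} \in \{0,1\}.
\]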
Abstract:
The purpose of this contribution is to draw a picture of the (uneven) distribution of economic activities across the states of the European Union (EU) and the consequences it entails. We briefly summarize the most salient and recent contributions. Then, in the light of economic geography theory, we discuss the economic and social advantages and disadvantages associated with a core-periphery structure. In this sense, particular attention is paid to the EU financial system of Structural Funds and the effects they have produced. Finally, we formulate some suggestions, drawing on the EU experience, that could be of interest for current Brazilian regional policy.
Abstract:
JPEG2000 is an image compression standard that applies a wavelet transform followed by uniform dead-zone quantization of the coefficients. Wavelet coefficients exhibit both statistical and visual dependencies. The statistical dependencies are taken into account in the JPEG2000 scheme; the visual dependencies, however, are not. In this work we aim to find a representation better adapted to the visual system than the one JPEG2000 provides directly. To obtain it, we use divisive normalization of the coefficients, a technique that has already proven effective both for the statistical decorrelation of coefficients and perceptually. Ideally, we would like to map the coefficients to a space of values in which a larger coefficient value implies a larger visual contribution, and to code in that space. In practice, however, we want our coding system to be integrated into a standard. For this reason we use JPEG2000, an ITU standard that allows a choice of the distortion measure used in coding, and we use the distortion in the domain of normalized coefficients as the distortion measure to decide which data are sent first.
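A brief sketch of divisive normalization applied to wavelet detail subbands, assuming local energy pooling with a uniform window; the pooling weights, exponent and wavelet ('bior4.4' as a 9/7-like filter) are illustrative choices, not necessarily those of the work.

```python
import numpy as np
import pywt                             # PyWavelets, assumed available
from scipy.ndimage import uniform_filter

def divisive_normalization(coeffs, b=0.1, gamma=2.0, size=3):
    """Divisively normalize a wavelet subband: each coefficient is divided
    by a local pooling of neighbouring coefficient energies (a common form
    of the transform; pooling weights and exponents may differ here)."""
    energy = uniform_filter(np.abs(coeffs) ** gamma, size=size)
    return coeffs / (b + energy) ** (1.0 / gamma)

# example: normalize the detail subbands of one DWT level
img = np.random.rand(256, 256)
cA, (cH, cV, cD) = pywt.dwt2(img, "bior4.4")
cH_n, cV_n, cD_n = (divisive_normalization(c) for c in (cH, cV, cD))
```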
Abstract:
This paper aims at providing a Bayesian parametric framework to tackle the accessibility problem across space in urban theory. By adopting continuous variables in a probabilistic setting, we are able to relate the distribution density to Kendall's tau index and replicate the general issues related to the role of proximity in a more general context. In addition, by referring to the Beta and Gamma distributions, we are able to introduce a differentiation feature in each spatial unit without incurring any a priori definition of territorial units. We also provide an empirical application of our theoretical setting to study the density distribution of the population across Massachusetts.
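An illustrative sketch, on synthetic data, of the ingredients named in the abstract: Kendall's tau as the rank association between density and distance, and Beta/Gamma fits as the parametric description of the density distribution; the variable names and numbers are hypothetical.

```python
import numpy as np
from scipy import stats

# hypothetical data: population density per spatial unit and distance to an urban centre
rng = np.random.default_rng(0)
distance = rng.uniform(0, 50, size=200)                      # km, illustrative units
density = np.exp(-0.05 * distance) + 0.1 * rng.random(200)   # decreasing with distance

# rank-based association between density and distance (accessibility proxy)
tau, p_value = stats.kendalltau(density, distance)

# parametric description of the density distribution across units
a, b, loc, scale = stats.beta.fit(density / (density.max() * 1.01))  # Beta fit on rescaled data
shape, loc_g, scale_g = stats.gamma.fit(density)                     # Gamma fit
print(f"Kendall tau = {tau:.3f}, Beta a={a:.2f} b={b:.2f}, Gamma k={shape:.2f}")
```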