Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray-level images are presented, and as a new application, they are applied to gray-level image compression. Both transforms extend the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been adapted to calculate a chessboard-like distance transform with integer values (DTOCS) and a real-valued distance transform (EDTOCS) on gray-level images. Both the DTOCS and the EDTOCS require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image that defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3 to 10 times, and different types of kernels can be adopted. It is important to note that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance algorithms (GRAYMAT, etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map, in which the weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The EDTOCS differs from the DTOCS in that it calculates these gray-level differences in a different way.
It propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8-kernels method, is also presented. Several decompressed images are shown. The best results are obtained using the Delaunay triangulation, and the obtained image quality equals that of the DCT images with a 4 x 4
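A minimal sketch of the two-pass idea, assuming the usual DTOCS local distance |gray(p) − gray(q)| + 1 between 8-neighbours (an unoptimized illustration, not the thesis code):

```python
import numpy as np

def dtocs(gray, seed_mask, n_iter=3):
    """Two-pass chessboard-like distance transform on a gray-level image.

    The local step cost between 8-neighbours p and q is |gray[p]-gray[q]| + 1,
    so the result is a chessboard distance map weighted by gray-level
    differences.  gray: 2-D integer image; seed_mask: boolean array,
    True where the distance is zero.  For complicated images the two
    passes are repeated n_iter times.
    """
    INF = np.iinfo(np.int64).max // 2
    d = np.where(seed_mask, 0, INF).astype(np.int64)
    h, w = gray.shape
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # neighbours already visited
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]       # ... in a reverse raster scan
    for _ in range(n_iter):
        for ys, xs, offs in ((range(h), range(w), fwd),
                             (range(h - 1, -1, -1), range(w - 1, -1, -1), bwd)):
            for y in ys:
                for x in xs:
                    for dy, dx in offs:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            cost = abs(int(gray[y, x]) - int(gray[ny, nx])) + 1
                            if d[ny, nx] + cost < d[y, x]:
                                d[y, x] = d[ny, nx] + cost
    return d
```

On a flat image the gray differences vanish and the map reduces to the ordinary chessboard distance from the seed region, as the abstract's description implies.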
Abstract:
We present parallel characterizations of two different values in the framework of restricted cooperation games. The restrictions are introduced as a finite sequence of partitions defined on the player set, each of them being coarser than the previous one, hence forming a structure of different levels of a priori unions. On the one hand, we consider a value first introduced in Ref. [18], which extends the Shapley value to games with different levels of a priori unions. On the other hand, we introduce another solution for the same type of games, which extends the Banzhaf value in the same manner. We characterize these two values using logically comparable properties.
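The Shapley value that both constructions extend can be recalled concretely; a brute-force sketch over all player orderings (exponential in the number of players, for illustration only; the coalition function `v` and the names are ours, not from the paper):

```python
from itertools import permutations
from math import factorial

def shapley_value(players, v):
    """Shapley value by averaging each player's marginal contribution
    over all orderings of the player set.  `v` maps a frozenset
    coalition to its worth, with v(frozenset()) == 0."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            # marginal contribution of p when joining this coalition
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    n_orders = factorial(len(players))
    return {p: phi[p] / n_orders for p in phi}
```

In the three-player majority game (a coalition is worth 1 iff it has at least two members) each player receives 1/3, the familiar symmetric outcome; the values in the abstract modify this averaging to respect the levels of a priori unions.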
Abstract:
The author studies random walk estimators for radiosity with generalized absorption probabilities; that is, a path either dies or survives on a patch according to an arbitrary probability. The estimators studied so far, the infinite path length estimator and the finite path length one, can be considered particular cases. Practical applications of random walks with generalized probabilities are given. A necessary and sufficient condition for the existence of the variance is given, together with heuristics to be used in practical cases. The optimal probabilities are also found for the case where one is interested in the whole scene; they are equal to the reflectivities.
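A sketch of a gathering random walk with generalized survival probabilities, for a toy scene (all names are ours; the score is reweighted by rho/q so that an arbitrary survival probability q still gives an unbiased estimate, and q = rho recovers the classical choice):

```python
import random

def radiosity_walk(i, E, rho, F, q, rng):
    """One gathering random walk estimating the radiosity of patch i.

    B solves B_i = E_i + rho_i * sum_j F[i][j] * B_j (E: emission,
    rho: reflectivity, F: form-factor matrix).  q[i] is an arbitrary
    survival probability on patch i -- the 'generalized absorption
    probability'; reweighting by rho_i / q_i keeps the estimator unbiased.
    """
    score, weight = 0.0, 1.0
    while True:
        score += weight * E[i]
        if rng.random() >= q[i]:          # the walk is absorbed on patch i
            return score
        weight *= rho[i] / q[i]           # survival: reweight the path
        # choose the next patch according to the form-factor row F[i]
        r, j, acc = rng.random(), 0, F[i][0]
        while acc < r and j < len(F[i]) - 1:
            j += 1
            acc += F[i][j]
        i = j

def estimate_radiosity(i, E, rho, F, q, n_walks=20000, seed=1):
    rng = random.Random(seed)
    return sum(radiosity_walk(i, E, rho, F, q, rng)
               for _ in range(n_walks)) / n_walks
```

For a two-patch scene facing itself (F = [[0,1],[1,0]], rho = 0.5, E = (1, 0)) the analytic radiosity of the emitting patch is 4/3, which the estimator reproduces.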
Abstract:
We study cooperative and competitive solutions for a many-to-many generalization of the assignment game of Shapley and Shubik (1972). We consider the Core, three other notions of group stability and two alternative definitions of competitive equilibrium. We show that (i) each group stable set is closely related to the Core of certain games defined using a proper notion of blocking and (ii) each group stable set contains the set of payoff vectors associated with the two definitions of competitive equilibrium. We also show that all six solutions maintain a strictly nested structure. Moreover, each solution can be identified with a set of matrices of (discriminated) prices which indicate how gains from trade are distributed among buyers and sellers. In all cases such matrices arise as solutions of a system of linear inequalities. Hence, all six solutions have the same properties from a structural and computational point of view.
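For the one-to-one Shapley-Shubik game underlying this generalization, the closing remark can be made concrete: Core membership amounts to a system of linear inequalities. A brute-force sketch (helper names are ours; `a[i][j]` is the gain of buyer-seller pair (i, j)):

```python
from itertools import permutations

def optimal_matching_value(a):
    """Maximum total gain over one-to-one matchings of buyers to sellers
    (brute force over permutations, for a square gain matrix)."""
    n = len(a)
    return max(sum(a[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def in_core(u, v, a, tol=1e-9):
    """Core of the assignment game: the payoffs (u for buyers, v for
    sellers) split the optimal matching value, and no buyer-seller pair
    can block, i.e. u_i + v_j >= a_ij for every pair -- a system of
    linear inequalities, as noted in the abstract."""
    if abs(sum(u) + sum(v) - optimal_matching_value(a)) > tol:
        return False
    return all(u[i] + v[j] >= a[i][j] - tol
               for i in range(len(u)) for j in range(len(v)))
```

The same inequality description, with group rather than pairwise blocking, is what makes all six solutions comparable computationally.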
Abstract:
This work is devoted to the development of a numerical method for convection-diffusion dominated problems with a reaction term, covering both non-stiff and stiff chemical reactions. The technique is based on unifying Eulerian-Lagrangian schemes (the particle transport method) within the framework of operator splitting. In the computational domain, a particle set is assigned to solve the convection-reaction subproblem along the characteristic curves created by the convective velocity. At each time step, the convection, diffusion and reaction terms are solved separately, by assuming that each phenomenon occurs separately, in a sequential fashion. Moreover, adaptivity and projection techniques are used to add particles in regions of high gradients (steep fronts) and discontinuities, and to transfer the solution from the particle set onto the grid points, respectively. The numerical results show that the particle transport method improves the solutions of CDR problems. Nevertheless, the method is time-consuming compared with other classical techniques, e.g. the method of lines. Despite this disadvantage, the particle transport method can be used to simulate problems that involve moving steep/smooth fronts, such as the separation of two or more elements in the system.
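A minimal 1-D grid sketch of the splitting sequence described above (convection, then diffusion, then reaction); the thesis solves the convection-reaction part with an adaptive particle set, which this fixed-grid illustration does not attempt:

```python
import math

def cdr_split_step(u, dx, dt, vel, D, react):
    """One operator-splitting step for u_t + vel*u_x = D*u_xx + react(u)
    on a periodic 1-D grid: convection is handled in a Lagrangian way
    (values follow the characteristics x - vel*dt), then diffusion by an
    explicit finite difference, then the reaction ODE by forward Euler."""
    n = len(u)
    # 1) convection: trace characteristics back and interpolate linearly
    shift = vel * dt / dx
    k = math.floor(shift)
    frac = shift - k
    conv = [(1 - frac) * u[(i - k) % n] + frac * u[(i - k - 1) % n]
            for i in range(n)]
    # 2) diffusion: explicit second difference (stable for D*dt/dx**2 <= 1/2)
    r = D * dt / dx**2
    diff = [conv[i] + r * (conv[(i + 1) % n] - 2 * conv[i] + conv[(i - 1) % n])
            for i in range(n)]
    # 3) reaction: one forward-Euler step of du/dt = react(u)
    return [w + dt * react(w) for w in diff]
```

For a stiff reaction term the forward-Euler substep would be replaced by an implicit or exact ODE solve, which is precisely where the splitting pays off: only the reaction substep needs the stiff treatment.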
Abstract:
This thesis studies properties of transforms based on parabolic scaling, such as the curvelet, contourlet, shearlet and Hart-Smith transforms. Essentially, two different questions are considered: how these transforms can characterize Hölder regularity, and how the non-linear approximation of a piecewise smooth function converges. In the study of Hölder regularity, several theorems are presented that relate the regularity of a function f : R² → R to the decay properties of its transform. Of particular interest is the case where a function has lower regularity along some line segment than elsewhere. Theorems are presented that give estimates for the direction and location of this line and for the regularity of the function. Numerical demonstrations also suggest that similar theorems would hold for more general shapes of the segment of low regularity. Theorems related to uniform and pointwise Hölder regularity are presented as well. Although none of the theorems presented gives a full characterization of regularity, the sufficient and necessary conditions are very similar. Another theme of the thesis is the study of the convergence of non-linear M-term approximation of functions that are discontinuous along some curves and otherwise smooth. Under particular smoothness assumptions, it is well known that the squared L² approximation error is O(M^-2 (log M)^3) for curvelet, shearlet or contourlet bases. Here it is shown that, assuming higher smoothness properties, the log factor can be removed, even if the function is still discontinuous.
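The benchmark rate cited above is the curvelet approximation result of Candès and Donoho; in LaTeX form, for f that is C² away from a C² discontinuity curve and f_M its best M-term curvelet approximation,

```latex
\| f - f_M \|_{L^2}^2 \;\le\; C\, M^{-2} (\log M)^3, \qquad M \to \infty,
```

compared with the rate O(M^-1) for wavelet bases and the optimal rate O(M^-2) for an ideal adaptive approximation of such cartoon-like functions; the thesis's contribution is removing the (log M)^3 factor under higher smoothness assumptions.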
Abstract:
The objective of this thesis is to develop and study the Differential Evolution algorithm for multi-objective optimization with constraints. Differential Evolution is an evolutionary algorithm that has gained popularity because of its simplicity and good observed performance. Multi-objective evolutionary algorithms have become popular because they can produce a set of compromise solutions during the search process to approximate the Pareto-optimal front. The starting point for this thesis was an idea of how Differential Evolution, with simple changes, could be extended to optimization with multiple constraints and objectives. This approach is implemented, experimentally studied, and further developed in the work; development and study concentrate on the multi-objective optimization aspect. The main outcomes of the work are versions of a method called Generalized Differential Evolution, which aim to improve the performance of the method in multi-objective optimization. A diversity preservation technique is developed that is effective and efficient compared to previous diversity preservation techniques. The thesis also studies the influence of the control parameters of Differential Evolution in multi-objective optimization, and proposals for initial control parameter value selection are given. Overall, the work contributes to the diversity preservation of solutions in multi-objective optimization.
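For reference, the single-objective DE/rand/1/bin scheme that Generalized Differential Evolution builds on can be sketched in a few lines (a textbook version, not the thesis's multi-objective method; parameter names F, CR, NP follow common DE usage):

```python
import random

def differential_evolution(f, bounds, np_=20, F=0.8, CR=0.9, gens=200, seed=0):
    """Basic DE/rand/1/bin minimizing f: list[float] -> float.
    bounds is a list of (lo, hi) per decision variable."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            # mutation base and difference vectors from three distinct others
            a, b, c = rng.sample([k for k in range(np_) if k != i], 3)
            jrand = rng.randrange(dim)        # at least one mutated coordinate
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))  # clamp to the box
            fc = f(trial)
            if fc <= cost[i]:                 # greedy one-to-one selection
                pop[i], cost[i] = trial, fc
    best = min(range(np_), key=cost.__getitem__)
    return pop[best], cost[best]
```

The "simple changes" mentioned in the abstract concern exactly the selection step: in the multi-objective, constrained setting the scalar comparison `fc <= cost[i]` is replaced by constraint-domination and Pareto-domination rules.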
Abstract:
Affective states influence subsequent attention allocation. We evaluated emotional negativity bias modulation by reappraisal in patients with generalized anxiety disorder (GAD) relative to normal controls. Event-related potential (ERP) recordings were obtained, and changes in P200 and P300 amplitudes in response to negative or neutral words were noted after decreasing negative emotion or establishing a neutral condition. We found that in GAD patients only, the mean P200 amplitude after negative word presentation was much higher than after the presentation of neutral words. In normal controls, after downregulation of negative emotion, the mean P300 amplitude in response to negative words was much lower than after neutral words, and this was significant in both the left and right regions. In GAD patients, the negative bias remained prominent and was not affected by reappraisal at the early stage. Reappraisal was observed to have a lateralized effect at the late stage.
Abstract:
The generalized maximum likelihood method was used to determine binary interaction parameters between carbon dioxide and components of orange essential oil. Vapor-liquid equilibrium was modeled with the Peng-Robinson and Soave-Redlich-Kwong equations, using a methodology proposed in 1979 by Asselineau, Bogdanic and Vidal. Experimental vapor-liquid equilibrium data on binary mixtures of carbon dioxide and compounds usually found in orange essential oil were used to test the model. These systems were chosen to demonstrate that the maximum likelihood method produces binary interaction parameters for cubic equations of state capable of satisfactorily describing phase equilibrium, even for a binary system such as ethanol/CO2. The results corroborate that both the Peng-Robinson and the Soave-Redlich-Kwong equations can be used to describe phase equilibrium for systems of orange essential oil components with CO2.
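The role of the fitted binary interaction parameters k_ij can be illustrated with the standard Peng-Robinson expressions and van der Waals one-fluid mixing rules (a sketch using textbook constants; the component data and function names are ours, not from the paper):

```python
import math

R = 8.314462618  # J/(mol K)

def pr_pure(Tc, Pc, omega, T):
    """Peng-Robinson pure-component parameters a(T) [Pa m^6/mol^2]
    and b [m^3/mol] from critical properties and acentric factor."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - math.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    return a, b

def pr_mixture(comps, x, T, kij):
    """Van der Waals one-fluid mixing rules with binary interaction
    parameters kij[i][j] -- the quantities fitted by the maximum
    likelihood procedure.  comps: list of (Tc, Pc, omega) tuples,
    x: mole fractions summing to 1."""
    ab = [pr_pure(Tc, Pc, w, T) for Tc, Pc, w in comps]
    a_mix = sum(x[i] * x[j]
                * math.sqrt(ab[i][0] * ab[j][0]) * (1 - kij[i][j])
                for i in range(len(x)) for j in range(len(x)))
    b_mix = sum(x[i] * ab[i][1] for i in range(len(x)))
    return a_mix, b_mix
```

The cross term a_ij = sqrt(a_i a_j)(1 − k_ij) is the only place k_ij enters, which is why a single fitted scalar per binary pair can tune the predicted phase envelope.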
Abstract:
In this paper several additional GMM specification tests are studied. A first test is a Chow-type test for structural parameter stability of GMM estimates; the test is inspired by the fact that "taste and technology" parameters are uncovered. The second set of specification tests are VAR encompassing tests. It is assumed that the DGP has a finite VAR representation. The moment restrictions which are suggested by economic theory and exploited in the GMM procedure represent one possible characterization of the DGP; the VAR is a different but compatible characterization of the same DGP. The idea of the VAR encompassing tests is to compare parameter estimates of the Euler conditions and VAR representations of the DGP obtained separately with parameter estimates of the Euler conditions and VAR representations obtained jointly. There are several ways to construct joint systems, which are discussed in the paper. Several applications are also discussed.
Abstract:
The goal of this paper is to contribute to the economic literature on ethnic and cultural diversity by proposing a new index that is informationally richer and more flexible than the commonly used ‘ethno-linguistic fractionalization’ (ELF) index. We characterize a measure of diversity among individuals that takes as a primitive the individuals, as opposed to ethnic groups, and uses information on the extent of similarity among them. Compared to existing indices, our measure does not require that individuals are pre-assigned to exogenously determined categories or groups. We show that our generalized index is a natural extension of ELF and is also simple to compute. We also provide an empirical illustration of how our index can be operationalized and what difference it makes as compared to the standard ELF index. This application pertains to the pattern of fractionalization in the United States.
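The contrast between the group-based and individual-based indices can be made concrete; a minimal sketch (our own notation: `shares` are group population shares, `sim` a pairwise similarity matrix), showing that the generalized index reduces to ELF when similarity is the 0/1 indicator of belonging to the same group:

```python
def elf(shares):
    """Standard ethno-linguistic fractionalization: the probability that
    two randomly drawn individuals belong to different groups."""
    return 1.0 - sum(s * s for s in shares)

def generalized_elf(sim):
    """Individual-based generalization: sim[i][j] in [0, 1] is the
    similarity between individuals i and j, with sim[i][i] = 1.  The
    index is one minus the average similarity over all ordered pairs,
    so no pre-assignment of individuals to groups is required."""
    n = len(sim)
    return 1.0 - sum(sim[i][j] for i in range(n)
                     for j in range(n)) / (n * n)
```

With graded similarities (e.g. linguistic distances between individuals) the generalized index uses information that the group-based ELF discards, which is the flexibility the paper emphasizes.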
Abstract:
Alienation is a key problem in Haitian literature. This theme, taken up many times, is renewed in the works of the writer Gary Victor, particularly in the novels À l'angle des rues parallèles and Je sais quand Dieu vient se promener dans mon jardin. The purpose of this thesis is to study the alienation caused by the assimilation of the various discourses circulating in society. Victor does not merely show collective and individual alienation and the different mechanisms that produce this "madness". He sets his narratives in a chaotic universe where the interdiscursive modalities of the representation of places and milieus reinforce this impression of almost generalized alienation, and the maddest characters ultimately appear to be the most lucid. Notions borrowed from sociocriticism serve as the theoretical framework for this research. The thesis consists of five chapters. The first four analyze the discourses presented as the sources of the characters' alienation. Chapter one is devoted to the discourses arising from consumer society, which no longer apply only to objects but to the management of interpersonal relations. The second chapter examines the question of religious beliefs, whether traditional indigenous beliefs or the Catholic religion, and shows how they can be potentially dangerous for those who are too credulous and how they can become a weapon for ill-intentioned people. The third chapter studies how political and historical discourses have become pernicious reference points for Haitian society and how those in power use them to manipulate the people.
The fourth chapter addresses the perverse effects of the various discourses of knowledge in the human sciences, particularly those of philosophy and psychoanalysis. It shows the disastrous consequences they can entail when they are turned into immutable principles. The last chapter analyzes some modalities of this novel representation of alienation. Hostile places, violent characters and the use of literary and cinematic references that mark the social imagination are among the techniques Victor employs. This chapter also brings out the various figures that express resistance to this demented universe. Our reading of Victor's novels leads to a reflection on the definition of the popular novel, in relation to modernity as defined by Alexis Nouss. As Gary Victor's work suggests, this genre, the object of much criticism and often defined as mere entertainment for readers, can also help to prevent the drift of societies that have lost their bearings.