66 results for fractal segmentation
Abstract:
A version of Matheron’s discrete Gaussian model is applied to cell composition data. The examples are for map patterns of felsic metavolcanics in two different areas. Q-Q plots of the model for cell values representing the proportion of 10 km x 10 km cell area underlain by this rock type are approximately linear, and the line of best fit can be used to estimate the parameters of the model. It is also shown that felsic metavolcanics in the Abitibi area of the Canadian Shield can be modeled as a fractal.
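The parameter-estimation step described above — fitting a line to a normal Q-Q plot — can be sketched numerically. This is a minimal illustration with synthetic cell proportions; the variable names and values are assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic cell values: proportion of each cell underlain by the rock type
p = np.clip(rng.normal(0.2, 0.05, 200), 1e-3, 1 - 1e-3)

# numerical Q-Q plot: sorted data against standard-normal quantiles
probs = (np.arange(1, 201) - 0.5) / 200
q_theor = np.quantile(rng.standard_normal(200_000), probs)
q_emp = np.sort(p)

# line of best fit: the slope estimates the scale (sigma),
# the intercept estimates the location (mu)
sigma, mu = np.polyfit(q_theor, q_emp, 1)
```

If the Q-Q plot is approximately linear, the intercept and slope of the fitted line recover the model's location and scale parameters, which is the estimation route the abstract describes.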
Abstract:
In the context of the round table, the following topics related to image colour processing will be discussed: a historical point of view (the studies of Aguilonius, Gerritsen, Newton and Maxwell); the CIE standard (Commission Internationale de l'Éclairage); colour models (RGB, HSI, etc.); colour segmentation based on the HSI model; industrial applications; summary and discussion. At the end, video images will be presented showing the robustness of colour compared to B/W images.
Abstract:
Photo-mosaicing techniques have become popular for seafloor mapping in various marine science applications. However, the common methods cannot accurately map regions with high relief and topographical variations. Ortho-mosaicing borrowed from photogrammetry is an alternative technique that enables taking into account the 3-D shape of the terrain. A serious bottleneck is the volume of elevation information that needs to be estimated from the video data, fused, and processed for the generation of a composite ortho-photo that covers a relatively large seafloor area. We present a framework that combines the advantages of dense depth-map and 3-D feature estimation techniques based on visual motion cues. The main goal is to identify and reconstruct certain key terrain feature points that adequately represent the surface with minimal complexity in the form of piecewise planar patches. The proposed implementation utilizes local depth maps for feature selection, while tracking over several views enables 3-D reconstruction by bundle adjustment. Experimental results with synthetic and real data validate the effectiveness of the proposed approach.
Abstract:
A recent trend in digital mammography is computer-aided diagnosis systems, which are computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity is significantly decreased as the density of the breast increases. This dependence is method specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was evaluated on a large set of digitised mammograms, using different classifiers and a leave-one-out methodology. Results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
Abstract:
Colour image segmentation based on the hue component presents some problems due to the physical process of image formation. One of these problems is colour clipping, which appears when at least one of the sensor components is saturated. We have designed a system, which works for a trained set of colours, to recover the chromatic information of those pixels on which colour has been clipped. The chromatic correction method is based on the fact that hue and saturation are invariant to uniform scaling of the three RGB components. The proposed method has been validated by means of a specific colour image processing board that has allowed its execution in real time. We show experimental results of the application of our method.
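The invariance property the correction method relies on can be checked directly. This is a minimal sketch using the standard HSI-model formulas for hue and saturation; the function and pixel values below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def hue_sat(r, g, b):
    # hue and saturation of a pixel under the HSI model (standard formulas)
    h = np.arctan2(np.sqrt(3.0) * (g - b), 2.0 * r - g - b)
    s = 1.0 - 3.0 * min(r, g, b) / (r + g + b)
    return h, s

h1, s1 = hue_sat(0.6, 0.3, 0.1)      # original pixel
h2, s2 = hue_sat(0.3, 0.15, 0.05)    # same pixel with RGB uniformly scaled by 0.5
# h1 == h2 and s1 == s2: hue and saturation survive uniform scaling,
# which is what allows chromatic information to be recovered after clipping
```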
Abstract:
The objectives of the project are divided into three blocks: first, to perform an automatic segmentation of the contour in an image containing a central mass; next, from the contour found, to characterise the mass; and finally, using the previous characteristics, to classify the mass as benign or malignant. The project uses Matlab as the programming tool, specifically the image-processing functions of the Image Processing Toolbox (native to Matlab) and the classifiers of PRTools from the Delft University of Technology.
Abstract:
In this paper, we explore the connection between labor market segmentation in two sectors, a modern protected formal sector and a traditional, unprotected, informal sector, and overeducation in a developing country. Informality is thought to have negative consequences, primarily through poorer working conditions, lack of social security, and low levels of productivity throughout the economy. This paper considers an aspect that has not been previously addressed, namely the fact that informality might also affect how workers match their actual education with that required to perform their job. We use micro-data from Colombia to test the relationship between overeducation and informality. Empirical results suggest that, once the endogeneity of employment choice has been accounted for, formal male workers are less likely to be overeducated. Interestingly, the propensity of being overeducated among women does not seem to be closely related to the employment choice.
Abstract:
Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as “functional data” and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra. The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques in order to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory: clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only. A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
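The distinction between the two metrics (shape-and-level versus shape-only) can be sketched with a centred log-ratio (clr) transform, a standard device in compositional data analysis. The trajectories, helper names, and the particular level-shift construction below are hypothetical illustrations, not the essay's actual procedure.

```python
import numpy as np

def clr(comp):
    # centred log-ratio: maps compositions to unconstrained coordinates
    g = np.exp(np.log(comp).mean(axis=-1, keepdims=True))
    return np.log(comp / g)

def traj_dist(a, b, shape_only=False):
    A, B = clr(a), clr(b)
    if shape_only:
        # remove each trajectory's mean so only shape differences remain
        A = A - A.mean(axis=0)
        B = B - B.mean(axis=0)
    return float(np.sqrt(((A - B) ** 2).sum()))

base = np.array([[0.2, 0.3, 0.5],
                 [0.3, 0.3, 0.4],
                 [0.4, 0.3, 0.3]])
# perturb every composition by the same fixed composition: a pure level shift
shifted = base * np.array([2.0, 1.0, 1.0])
shifted /= shifted.sum(axis=1, keepdims=True)

d_full = traj_dist(base, shifted)                    # sees the level shift
d_shape = traj_dist(base, shifted, shape_only=True)  # ignores it
```

A pure level shift leaves the shape-only distance at zero while the shape-and-level distance is strictly positive, which is exactly the behaviour the two proposed metrics are meant to separate.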
Abstract:
It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on the breast tissue characteristics, where a dense breast drastically reduces detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System mammographic density assessment.
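Step 3, the Bayesian combination of classifiers, can be illustrated with a product-of-posteriors rule under a uniform prior — a common naive-Bayes-style combiner. The numbers, the two-classifier setup, and the three tissue classes below are hypothetical, not taken from the paper.

```python
import numpy as np

# posterior class probabilities from two hypothetical classifiers,
# over illustrative classes [fatty, glandular, dense]
p1 = np.array([0.5, 0.3, 0.2])
p2 = np.array([0.6, 0.2, 0.2])
prior = np.array([1.0, 1.0, 1.0]) / 3.0

# independent-opinion-pool combination:
# P(c | x1, x2) is proportional to P(c | x1) * P(c | x2) / P(c)
combined = p1 * p2 / prior
combined /= combined.sum()       # renormalise to a probability vector
label = int(np.argmax(combined)) # 0 -> fatty in this toy example
```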
Abstract:
The speed of front propagation in fractals is studied by using (i) the reduction of the reaction-transport equation into a Hamilton-Jacobi equation and (ii) the local-equilibrium approach. Different equations proposed for describing transport in fractal media, together with logistic reaction kinetics, are considered. Finally, we analyze the main features of wave fronts resulting from this dynamic process, i.e., why they are accelerated and what is the exact form of this acceleration.
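For the homogeneous baseline that these fractal results generalise — logistic (FKPP) kinetics with ordinary diffusion — the classical front speed is v = 2·sqrt(D·r). A minimal finite-difference check of that baseline is sketched below; the grid, parameters, and measurement times are illustrative assumptions, not from the paper.

```python
import numpy as np

# u_t = D u_xx + r u (1 - u): the front speed should approach 2*sqrt(D*r) = 2
D, r = 1.0, 1.0
dx, dt = 0.5, 0.05
x = np.arange(0.0, 200.0, dx)
u = (x < 10.0).astype(float)     # step initial condition

def front_pos(u):
    # position where the profile crosses 0.5
    return x[np.argmin(np.abs(u - 0.5))]

pos = {}
for step in range(1, 1201):
    lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx ** 2
    lap[0] = lap[-1] = 0.0       # pin the ends (crude boundaries)
    u = u + dt * (D * lap + r * u * (1.0 - u))
    if step in (600, 1200):      # t = 30 and t = 60
        pos[step] = front_pos(u)

v = (pos[1200] - pos[600]) / 30.0   # measured speed between the two times
```

The measured v sits close to 2, slightly below because pulled fronts converge to their asymptotic speed only logarithmically in time; the fractal-media equations the paper considers modify this baseline and produce the accelerated fronts it analyses.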
Abstract:
The front speed problem for nonuniform reaction rate and diffusion coefficient is studied by using singular perturbation analysis, the geometric approach of Hamilton-Jacobi dynamics, and the local speed approach. Exact and perturbed expressions for the front speed are obtained in the limit of large times. For linear and fractal heterogeneities, the analytic results have been compared with numerical results, exhibiting good agreement. Finally we reach a general expression for the speed of the front in the case of smooth and weak heterogeneities.
Abstract:
In the last few years, some of the visionary concepts behind the virtual physiological human began to be demonstrated on various clinical domains, showing great promise for improving healthcare management. In the current work, we provide an overview of image- and biomechanics-based techniques that, when put together, provide a patient-specific pipeline for the management of intracranial aneurysms. The derivation and subsequent integration of morphological, morphodynamic, haemodynamic and structural analyses allow us to extract patient-specific models and information from which diagnostic and prognostic descriptors can be obtained. Linking such new indices with relevant clinical events should bring new insights into the processes behind aneurysm genesis, growth and rupture. The development of techniques for modelling endovascular devices such as stents and coils allows the evaluation of alternative treatment scenarios before the intervention takes place and could also contribute to the understanding and improved design of more effective devices. A key element to facilitate the clinical take-up of all these developments is their comprehensive validation. Although a number of previously published results have shown the accuracy and robustness of individual components, further efforts should be directed to demonstrate the diagnostic and prognostic efficacy of these advanced tools through large-scale clinical trials.
Abstract:
In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs that shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There still remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first one is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact this theory allows us to reduce the number of parameters to be adjusted, and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition we propose here an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
Abstract:
This study analyses the determinants of the rate of temporary employment in various OECD countries using both macro-level data drawn from the OECD and EUROSTAT databases, as well as micro-level data drawn from the 8th wave of the European Household Panel. Comparative analysis is set out to test different explanations originally formulated for the Spanish case. The evidence suggests that the overall distribution of temporary employment in advanced economies does not seem to be explicable by the characteristics of national productive structures. This evidence seems at odds with previous interpretations based on segmentation theories. As an alternative explanation, two types of supply-side factors are tested: crowding-out effects and educational gaps in the workforce. The former seem non-significant, whilst the effects of the latter disappear after controlling for the levels of institutional protection in standard employment during the 1980s. Multivariate analysis shows that only this latter institutional variable, together with the degree of coordinated centralisation of the collective bargaining system, seems to have a significant impact on the distribution of temporary employment in the countries examined. On the basis of this observation, an explanation of the very high levels of temporary employment observed in Spain is proposed. This explanation is consistent with both country-specific and comparative evidence.
Abstract:
In this paper we portray the features of the Catalan textiles labour market in a period of technological change. Supply and demand for labour, as well as a gendered view of living standards, are presented. A first set of results is that labour supply adjusts to changes in labour demand through the spread of new demographic attitudes. In this respect we imply that labour economic agents (or the labour population) were able to modify the economic condition of their children. A second set of results refers to living standards and income distribution inequality. In this respect we see that unemployment and protectionism were the main sources breeding income inequality. A third set of results deals with the extreme labour market segmentation according to gender. Since women's real wages did not obey an economic rationale, we conclude that women were outside the labour market.