951 results for gray level probability density functions
Abstract:
This paper proposes a content based image retrieval (CBIR) system using the local colour and texture features of selected image sub-blocks and the global colour and shape features of the image. The image sub-blocks are roughly identified by segmenting the image into partitions of different configuration, finding the edge density in each partition using edge thresholding and morphological dilation, and finding the corner density in each partition. The colour and texture features of the identified regions are computed from the histograms of the quantized HSV colour space and the Gray Level Co-occurrence Matrix (GLCM) respectively. A combined colour and texture feature vector is computed for each region. The shape features are computed from the Edge Histogram Descriptor (EHD). The Euclidean distance measure is used for computing the distance between the features of the query and target images. Experimental results show that the proposed method provides better retrieval results than several existing methods.
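A minimal numpy sketch of this kind of colour-texture feature extraction follows. The quantisation levels, bin counts and GLCM offset below are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def glcm_features(gray, levels=8, dx=1, dy=0):
    """Gray Level Co-occurrence Matrix and two Haralick-style features.
    `gray` is a 2-D array of ints in [0, levels)."""
    glcm = np.zeros((levels, levels), dtype=float)
    h, w = gray.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[gray[y, x], gray[y + dy, x + dx]] += 1
    glcm /= glcm.sum()                      # normalise to joint probabilities
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)  # local intensity variation
    energy = np.sum(glcm ** 2)              # texture uniformity
    return np.array([contrast, energy])

def hsv_histogram(hsv, bins=(8, 3, 3)):
    """Quantised HSV colour histogram, flattened and normalised.
    `hsv` holds H, S, V channels scaled to [0, 1)."""
    hist, _ = np.histogramdd(hsv.reshape(-1, 3), bins=bins,
                             range=((0, 1), (0, 1), (0, 1)))
    return hist.ravel() / hist.sum()

def feature_distance(f1, f2):
    return float(np.linalg.norm(f1 - f2))   # Euclidean distance

rng = np.random.default_rng(0)
gray = rng.integers(0, 8, size=(32, 32))
hsv = rng.random((32, 32, 3))
vec = np.concatenate([hsv_histogram(hsv), glcm_features(gray)])
print(vec.shape)  # 72 colour bins + 2 texture features -> (74,)
```

Concatenating the normalised colour histogram with the GLCM features gives one vector per region, which regions of the query and target images can then be compared with under the Euclidean distance.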
Abstract:
This paper proposes a content based image retrieval (CBIR) system using the local colour and texture features of selected image sub-blocks and the global colour and shape features of the image. The image sub-blocks are roughly identified by segmenting the image into partitions of different configuration, finding the edge density in each partition using edge thresholding and morphological dilation. The colour and texture features of the identified regions are computed from the histograms of the quantized HSV colour space and the Gray Level Co-occurrence Matrix (GLCM) respectively. A combined colour and texture feature vector is computed for each region. The shape features are computed from the Edge Histogram Descriptor (EHD). A modified Integrated Region Matching (IRM) algorithm is used for finding the minimum distance between the sub-blocks of the query and target images. Experimental results show that the proposed method provides better retrieval results than several existing methods.
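The abstract does not specify how the IRM algorithm is modified; a generic greedy IRM-style matcher (most-similar region pair matched first, significance weights consumed as regions are matched) can be sketched as follows, assuming each region carries a feature vector and a positive significance weight summing to one per image:

```python
import numpy as np

def irm_distance(q_feats, q_wts, t_feats, t_wts):
    """Greedy Integrated-Region-Matching-style distance between two
    region sets.  Regions with the smallest pairwise distance are
    matched first; each match consumes significance weight."""
    d = np.linalg.norm(q_feats[:, None, :] - t_feats[None, :, :], axis=2)
    qw, tw = q_wts.astype(float).copy(), t_wts.astype(float).copy()
    d[qw <= 0, :] = np.inf                  # ignore empty regions
    d[:, tw <= 0] = np.inf
    total = 0.0
    while qw.sum() > 1e-12 and tw.sum() > 1e-12:
        i, j = np.unravel_index(np.argmin(d), d.shape)
        s = min(qw[i], tw[j])               # transferable significance
        total += s * d[i, j]
        qw[i] -= s
        tw[j] -= s
        if qw[i] <= 1e-12:
            d[i, :] = np.inf                # region i fully matched
        if tw[j] <= 1e-12:
            d[:, j] = np.inf
    return total
```

Identical region sets yield a distance of zero, and the measure grows as region features drift apart, which is the behaviour a retrieval ranking needs.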
Abstract:
This thesis, although framed within the theory of Molecular Quantum Similarity Measures (MQSM), branches into three clearly defined areas: the construction of Molecular IsoDensity COntours (MIDCOs) from fitted electron densities; the development of a molecular superposition method as an alternative to the maximum-similarity rule; and Quantitative Structure-Activity Relationships (QSAR). The aim in the MIDCO field is to apply fitted density functions, originally devised to reduce the cost of MQSM calculations, to the generation of MIDCOs. A comparative graphical study is carried out between density functions fitted to different basis sets and densities obtained from ab initio calculations. The visual agreement between the fitted and ab initio functions across the range of density representations, together with the fully comparable similarity measures obtained previously, justifies the use of these fitted functions. Beyond this initial purpose, two studies complementary to the simple representation of densities were performed: a curvature analysis and an extension to macromolecules. The first verifies not only the resemblance of the MIDCOs but also the coherence of their curvature behaviour, making it possible to observe inflection points in the density representation and to see graphically the regions where the density is concave or convex. This study reveals that the fitted densities behave entirely analogously to those computed at the ab initio level. In the second part of this work the method was extended to larger molecules of up to about 2500 atoms. Finally, part of the MEDLA philosophy is applied.
Knowing that the electron density decays rapidly away from the nuclei, its calculation can be omitted at large distances from them. The space is therefore partitioned, and the fitted functions of each atom are evaluated only within a small region surrounding the atom in question. This procedure reduces the computation time, and the process becomes linear in the number of atoms in the molecule under study. The chapter devoted to molecular superposition deals with the creation of an algorithm, and its implementation as a program christened the Topo-Geometrical Superposition Algorithm (TGSA), designed to provide those alignments that agree with chemical intuition. The result is a computer program, coded in Fortran 90, which aligns molecules in pairs considering only atomic numbers and distances. The complete absence of theoretical parameters allows a general molecular superposition method that yields an intuitive alignment quickly and with little user intervention. TGSA has mainly been used to compute similarities for later use in QSAR; these values generally do not coincide with the value that would be obtained by applying the maximum-similarity rule, especially when heavy atoms are involved. Finally, the last chapter, devoted to Quantum Similarity within the QSAR framework, addresses three different aspects: (1) the use of similarity matrices. Here the so-called similarity matrix, computed from the pairwise similarities within a set of molecules, comes into play; suitably treated, it is then used as a source of molecular descriptors for QSAR studies. Within this field, several correlation studies of pharmacological and toxicological interest, as well as of several physical properties, have been carried out. (2) The application of the electron-electron repulsion energy, treated as a form of self-similarity.
This modest contribution consists, briefly, in taking the value of this quantity and, by analogy with the notation of molecular quantum self-similarity, regarding it as a particular case of that measure. This interaction energy is easily obtained from quantum-chemical software and is ideal for a first preliminary correlation study in which it is used as the only descriptor. (3) The calculation of self-similarities in which the density has been modified to enhance the role of a substituent. Previous work with fragment densities, although giving very good results, lacks a certain conceptual rigour in isolating a fragment, supposedly responsible for the molecular activity, from the rest of the molecular structure, even though the densities associated with this fragment already differ because they belong to skeletons with different substitutions. A procedure that fills the gap left by simply separating the fragment, thereby considering the molecule as a whole (computing its self-similarity) while avoiding unwanted self-similarity values caused by heavy atoms, is the use of Fermi hole densities defined around the fragment of interest. This procedure modifies the density so that it is mostly concentrated in the region of interest, while still yielding a density function that behaves mathematically like the regular electron density and can therefore be incorporated into the molecular similarity framework. Self-similarities computed with this methodology have led to good correlations for substituted aromatic acids, thus providing an explanation of their behaviour. From another point of view, conceptual contributions have also been made.
A new similarity measure, based on kinetic energy, has been implemented: it takes the recently developed kinetic energy density function which, behaving mathematically like the regular electron density, has been incorporated into the similarity framework. Satisfactory QSAR models have been obtained from this measure for several molecular sets. Regarding the treatment of similarity matrices, the so-called stochastic transformation has been implemented as an alternative to the use of the Carbó index. This transformation of the similarity matrix yields a new non-symmetric matrix, which can subsequently be treated to build QSAR models.
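For reference, the Carbó index mentioned above is commonly defined in the molecular quantum similarity literature as the cosine-like normalisation of the overlap similarity measure between two electron densities:

```latex
C_{AB} \;=\; \frac{Z_{AB}}{\sqrt{Z_{AA}\,Z_{BB}}},
\qquad
Z_{AB} \;=\; \int \rho_A(\mathbf{r})\,\rho_B(\mathbf{r})\,d\mathbf{r},
```

so that C_AB lies in (0, 1] and equals 1 only for proportional densities; the stochastic transformation replaces this symmetric normalisation with a row-wise one, which is why the resulting matrix is no longer symmetric.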
Abstract:
If the potential field due to the nuclei in the methane molecule is expanded in terms of a set of spherical harmonics about the carbon nucleus, only the terms involving s, f, and higher harmonic functions differ from zero in the equilibrium configuration. Wave functions have been calculated for the equilibrium configuration, first including only the spherically symmetric s term in the potential, and secondly including both the s and the f terms. In the first calculation the complete Hartree-Fock S.C.F. wave functions were determined; in the second calculation a variation method was used to determine the best form of the wave function involving f harmonics. The resulting wave functions and electron density functions are presented and discussed.
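The expansion referred to here has the standard spherical-harmonic form; under the tetrahedral symmetry of CH4 at equilibrium, the l = 1 and l = 2 terms vanish, leaving the spherically symmetric s (l = 0) term, the f (l = 3) term, and higher harmonics, consistent with the statement above:

```latex
V(\mathbf{r}) \;=\; \sum_{l,m} v_{lm}(r)\, Y_{lm}(\theta,\varphi),
\qquad
v_{lm} \equiv 0 \quad \text{for } l = 1, 2 \text{ (symmetry-forbidden at equilibrium)}.
```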
Abstract:
A statistical–dynamical downscaling (SDD) approach for the regionalization of wind energy output (Eout) over Europe with special focus on Germany is proposed. SDD uses an extended circulation weather type (CWT) analysis on global daily mean sea level pressure fields with the central point being located over Germany. Seventy-seven weather classes based on the associated CWT and the intensity of the geostrophic flow are identified. Representatives of these classes are dynamically downscaled with the regional climate model COSMO-CLM. By using weather class frequencies of different data sets, the simulated representatives are recombined into probability density functions (PDFs) of near-surface wind speed and finally into Eout of a sample wind turbine for present and future climate. This is performed for reanalysis, decadal hindcasts and long-term future projections. For evaluation purposes, results of SDD are compared to wind observations and to simulated Eout of purely dynamical downscaling (DD) methods. For the present climate, SDD is able to simulate realistic PDFs of 10-m wind speed for most stations in Germany. The resulting spatial Eout patterns are similar to DD-simulated Eout. In terms of decadal hindcasts, results of SDD are similar to DD-simulated Eout over Germany, Poland, the Czech Republic, and Benelux, for which high correlations between annual Eout time series of SDD and DD are detected for selected hindcasts. Lower correlation is found for other European countries. It is demonstrated that SDD can be used to downscale the full ensemble of the Earth System Model of the Max Planck Institute (MPI-ESM) decadal prediction system. Long-term climate change projections in Special Report on Emission Scenarios of ECHAM5/MPI-OM as obtained by SDD agree well with the results of other studies using DD methods, with increasing Eout over northern Europe and a negative trend over southern Europe.
Despite some biases, it is concluded that SDD is an adequate tool to assess regional wind energy changes in large model ensembles.
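The recombination step can be illustrated with a toy calculation: given a PDF of near-surface wind speed (here a single Weibull density with illustrative parameters, standing in for the class-weighted PDF of the CWT analysis) and an idealised power curve of a sample turbine, the expected output follows by numerical integration. All turbine parameters below are assumptions for the sketch, not values from the study:

```python
import numpy as np

def weibull_pdf(v, k=2.0, lam=8.0):
    """Weibull probability density of 10-m wind speed (shape k, scale lam)."""
    return (k / lam) * (v / lam) ** (k - 1) * np.exp(-(v / lam) ** k)

def power_curve(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Idealised turbine power curve in MW: cubic ramp between cut-in and
    rated speed, flat until cut-out (illustrative, not a real turbine)."""
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v ** 3 - v_in ** 3) / (v_rated ** 3 - v_in ** 3),
                 0.0)
    return np.where((v >= v_rated) & (v < v_out), p_rated, p)

v = np.linspace(0.0, 30.0, 3001)
dv = v[1] - v[0]
pdf = weibull_pdf(v)
mean_power = float(np.sum(pdf * power_curve(v)) * dv)  # expected output, MW
eout = mean_power * 8760.0                             # annual yield, MWh
print(round(mean_power, 2))
```

Swapping the single Weibull density for a frequency-weighted mixture over the 77 weather classes is exactly the recombination the abstract describes.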
Abstract:
The Atlantic Rain Forest, an important biodiversity hot spot, has faced severe habitat loss since the last century which has resulted in a highly fragmented landscape with a large number of small forest patches (<100 ha). For conservation planning it is essential to understand how current and future forest regeneration depends on ecological processes, fragment size and the connection to the regional seed pool. We have investigated the following questions by applying the forest growth simulation model FORMIND to the situation of the Atlantic Forest in the state of Sao Paulo, SE Brazil: (1) which set of parameters describing the local regeneration and level of density regulation can reproduce the biomass distribution and stem density of an old growth forest in a reserve? (2) Which additional processes apart from those describing the dynamics of an old growth forest, drive forest succession of small isolated fragments? (3) Which role does external seed input play during succession? To this end, more than 300 tree species have been classified into nine plant functional types (PFTs), which are characterized by maximum potential height and shade tolerance. We differentiate between two seed dispersal modes: (i) local dispersal, i.e. all seedlings originated from fertile trees within the simulated area and (ii) external seed rain. Local seed dispersal has been parameterized following the pattern oriented approach, using biomass estimates of old growth forest. We have found that moderate density regulation is essential to achieve coexistence for a broad range of regeneration parameters. Considering the expected uncertainty and variability in the regeneration processes it is important that the forest dynamics are robust to variations in the regeneration parameters. Furthermore, edge effects such as increased mortality at the border and external seed rain have been necessary to reproduce the patterns for small isolated fragments.
Overall, simulated biomass is much lower in the fragments compared to the continuous forest, whereas shade tolerant species are affected most strongly by fragmentation. Our simulations can supplement empirical studies by extrapolating local knowledge on edge effects of fragments to larger temporal and spatial scales. In particular our results show the importance of external seed rain and therefore highlight the importance of structural connectivity between regenerating fragments and mature forest stands.
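The contrast between the two dispersal modes can be caricatured in a few lines. This is a toy Poisson recruitment model, not FORMIND; all rates are illustrative:

```python
import numpy as np

def recruits(n_fertile, local_rate, external_rain, rng):
    """Seedling establishment in a patch: local seeds scale with the
    number of fertile trees inside the patch, external seed rain is a
    constant influx from the surrounding landscape (toy model)."""
    return rng.poisson(n_fertile * local_rate + external_rain)

rng = np.random.default_rng(1)
# A small isolated fragment that lost its fertile trees regenerates only
# through external seed rain; a connected patch also recruits locally.
isolated = [recruits(0, 0.5, 2.0, rng) for _ in range(1000)]
connected = [recruits(20, 0.5, 2.0, rng) for _ in range(1000)]
print(np.mean(isolated), np.mean(connected))   # roughly 2 vs 12
```

Even this caricature reproduces the qualitative result above: cutting the external influx to zero would stall regeneration in the isolated patch entirely.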
Abstract:
Bisphosphonates are currently used in the treatment of many diseases involving increased bone resorption such as osteoporosis. Statins have been widely used for the treatment of hypercholesterolemia and recent studies have shown that these drugs are also capable of stimulating bone formation. The purpose of this study was to evaluate the influence of an estrogen-deficient state and the effects of simvastatin and sodium alendronate therapies on alveolar bone in female rats. Fifty-four rats were either ovariectomized (OVX) or sham operated. A month later, the animals began to receive a daily dose of simvastatin (SIN - 25 mg/kg), sodium alendronate (ALN - 2 mg/kg) or water (control) orally. Thirty-five days after the beginning of the treatment, the rats were sacrificed and their left hemimandibles were removed and radiographed using digital X-ray equipment. The alveolar radiographic density under the first molar was determined with gray-level scaling and the values were submitted to analysis of variance (α = 5%). Ovariectomized rats gained more weight (mean ± standard deviation: 20.06 ± 6.68%) than did the sham operated animals (12.13 ± 5.63%). Alveolar radiographic density values, expressed as gray levels, were lowest in the OVX-water group (183.49 ± 6.47), and differed significantly from those observed for the groups receiving alendronate (sham-ALN: 193.85 ± 3.81; OVX-ALN: 196.06 ± 5.11) and from those of the sham-water group (193.66 ± 4.36). Other comparisons between groups did not show significant differences. It was concluded that the ovariectomy reduced alveolar bone density and that alendronate was efficient for the treatment of this condition.
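The statistical comparison reported here is a standard one-way ANOVA over group gray-level values; a numpy-only sketch follows, with synthetic samples loosely mimicking the reported group means (the data below are simulated, not the study's measurements, and the group size is an assumption):

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA over arbitrary groups of
    gray-level measurements (between-group MS / within-group MS)."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    grand = np.concatenate(groups).mean()
    k = len(groups)
    n = sum(g.size for g in groups)
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(0)
# Synthetic gray levels loosely mimicking the reported group statistics.
ovx_water = rng.normal(183.5, 6.5, 9)
ovx_aln = rng.normal(196.1, 5.1, 9)
sham_water = rng.normal(193.7, 4.4, 9)
f = one_way_anova_f(ovx_water, ovx_aln, sham_water)
print(f > 1.0)
```

A large F relative to the F distribution with (k-1, n-k) degrees of freedom is what licenses the between-group comparisons reported above.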
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Abstract Background To establish the correlation between quantitative analysis based on B-mode ultrasound images of vulnerable carotid plaque and histological examination of the surgically removed plaque, on the basis of a videodensitometric digital texture characterization. Methods Twenty-five patients (18 males, mean age 67 ± 6.9 years) admitted for carotid endarterectomy for extracranial high-grade internal carotid artery stenosis (≥ 70% luminal narrowing) underwent quantitative ultrasonic tissue characterization of carotid plaque before surgery. A computer software package (Carotid Plaque Analysis Software) was developed to perform the videodensitometric analysis. The patients were divided into 2 groups according to symptomatology (group I, 15 symptomatic patients; group II, 10 asymptomatic patients). Tissue specimens were analysed for lipid, fibromuscular tissue and calcium. Results The first-order statistical parameter mean gray level was able to distinguish groups I and II (p = 0.04). The second-order parameter energy was also able to distinguish the groups (p = 0.02). A histological correlation showed a tendency of mean gray level to have progressively greater values from specimens with <50% fibrosis to those with >75% fibrosis. Conclusion Videodensitometric computer analysis of scan images may be used to identify vulnerable and potentially unstable lipid-rich carotid plaques, which are less echogenic than stable or asymptomatic, more densely fibrotic plaques.
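The two reported parameters have standard definitions: mean gray level is a first-order statistic of the image histogram, and energy (angular second moment) is a second-order statistic of the gray-level co-occurrence probabilities. A generic implementation is sketched below; the 256-level quantisation and horizontal pixel offset are assumptions, not the software's documented settings:

```python
import numpy as np

def first_order_mean(img):
    """First-order statistic: mean gray level from the image histogram."""
    hist = np.bincount(img.ravel(), minlength=256) / img.size
    return float(np.sum(np.arange(256) * hist))

def second_order_energy(img, dx=1):
    """Second-order statistic: energy (angular second moment) of the
    horizontal gray-level co-occurrence probabilities."""
    pairs = np.stack([img[:, :-dx].ravel(), img[:, dx:].ravel()])
    glcm = np.zeros((256, 256))
    np.add.at(glcm, (pairs[0], pairs[1]), 1.0)
    glcm /= glcm.sum()
    return float(np.sum(glcm ** 2))

rng = np.random.default_rng(0)
plaque = rng.integers(0, 256, size=(64, 64))   # stand-in for a plaque ROI
print(first_order_mean(plaque), second_order_energy(plaque))
```

Energy is 1 for a perfectly uniform region and approaches 0 for highly heterogeneous texture, which is why it separates echolucent lipid-rich from densely fibrotic plaques.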
Abstract:
Pliocene and Pleistocene sediments of the Oman margin and Owen Ridge are characterized by continuous alternation of light and dark layers of nannofossil ooze and marly nannofossil ooze and cyclic variation of wet-bulk density. Origin of the wet-bulk density and color cycles was examined at Ocean Drilling Program Site 722 on the Owen Ridge and Site 728 on the Oman margin using 3.4-m.y.-long GRAPE (gamma ray attenuation) wet-bulk density records and records of sediment color represented as changes in gray level on black-and-white core photographs. At Sites 722 and 728 sediments display a weak correlation of decreasing wet-bulk density with increasing darkness of sediment color. Wet-bulk density is inversely related to organic carbon concentration and displays little relation to calcium carbonate concentration, which varies inversely with the abundance of terrigenous sediment components. Sediment color darkens with increasing terrigenous sediment abundance (decreasing carbonate content) and with increasing organic carbon concentration. Upper Pleistocene sediments at Site 722 display a regular pattern of dark colored intervals coinciding with glacial periods, whereas at Site 728 the pattern of color variation is more irregular. There is not a consistent relationship between the dark intervals and their relative wet-bulk density in the upper Pleistocene sections at Sites 722 and 728, suggesting that dominance of organic matter or terrigenous sediment as primary coloring agents varies. Spectra of wet-bulk density and optical density time series display concentration of variance at orbital periodicities of 100, 41, 23, and 19 k.y. A strong 41-k.y. periodicity characterizes wet-bulk density and optical density variation at both sites throughout most of the past 3.4 m.y. Cyclicity at the 41-k.y. 
periodicity is characterized by a lack of coherence between wet-bulk density and optical density suggesting that the bulk density and color cycles reflect the mixed influence of varying abundance of terrigenous sediments and organic matter. The 23-k.y. periodicity in wet-bulk density and sediment color cycles is generally characterized by significant coherence between wet-bulk density and optical density, which reflects an inverse relationship between these parameters. Varying organic matter abundance, associated with changes in productivity or preservation, is inferred to more strongly influence changes in wet-bulk density and sediment color at this periodicity.
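The kind of spectral analysis described can be sketched on a synthetic record (the series below is simulated, with illustrative amplitudes and a 1-k.y. sampling interval): a periodogram of a 3.4-m.y. series containing 41-k.y. and 23-k.y. cycles recovers the dominant obliquity period.

```python
import numpy as np

# Toy wet-bulk density series: 3400 kyr sampled every 1 kyr, with
# obliquity (41 kyr) and precession (23 kyr) cycles plus noise.
rng = np.random.default_rng(0)
t = np.arange(3400.0)                       # age in kyr
series = (1.0 * np.sin(2 * np.pi * t / 41.0)
          + 0.5 * np.sin(2 * np.pi * t / 23.0)
          + 0.2 * rng.standard_normal(t.size))

power = np.abs(np.fft.rfft(series - series.mean())) ** 2
freq = np.fft.rfftfreq(t.size, d=1.0)       # cycles per kyr
peak_period = 1.0 / freq[np.argmax(power[1:]) + 1]
print(round(peak_period, 1))                # dominant period, ~41 kyr
```

Cross-spectral coherence between two such series (wet-bulk density and optical density) at a given frequency band is what distinguishes the 41-k.y. and 23-k.y. behaviour reported above.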
Abstract:
We propose a level set based variational approach that incorporates shape priors into edge-based and region-based models. The evolution of the active contour depends on local and global information. It has been implemented using an efficient narrow band technique. For each boundary pixel we calculate its dynamics according to its gray level, the neighborhood and geometric properties established by training shapes. We also propose a criterion for shape alignment based on affine transformation using an image normalization procedure. Finally, we illustrate the benefits of our approach on liver segmentation from CT images.
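At its core, level set evolution updates an embedding function φ whose zero level set is the active contour. A bare-bones explicit step of front propagation is sketched below; the paper's shape prior, narrow band and gray-level speed terms are omitted, and the uniform speed map is an illustrative assumption:

```python
import numpy as np

def evolve_step(phi, g, dt=0.5):
    """One explicit step of phi_t = g(x) |grad phi|.  With phi < 0 inside
    the contour, a positive speed g shrinks the enclosed region; in an
    edge-based model g would be small near image edges, stopping the front."""
    gy, gx = np.gradient(phi)
    return phi + dt * g * np.hypot(gx, gy)

# Signed distance to a circle of radius 10 on a 64 x 64 grid.
y, x = np.mgrid[0:64, 0:64]
phi = np.hypot(x - 32.0, y - 32.0) - 10.0
g = np.ones_like(phi)                 # uniform speed (no edge map here)
area0 = int((phi < 0).sum())
phi1 = evolve_step(phi, g)
print(area0, int((phi1 < 0).sum()))   # enclosed area shrinks
```

Restricting the update to pixels with |φ| below a small threshold is the narrow band idea: only cells near the contour are touched each iteration.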
Abstract:
The selection of predefined analytic grids (partitions of the numeric ranges) to represent input and output functions as histograms has been proposed as a mechanism of approximation in order to control the tradeoff between accuracy and computation times in several áreas ranging from simulation to constraint solving. In particular, the application of interval methods for probabilistic function characterization has been shown to have advantages over other methods based on the simulation of random samples. However, standard interval arithmetic has always been used for the computation steps. In this paper, we introduce an alternative approximate arithmetic aimed at controlling the cost of the interval operations. Its distinctive feature is that grids are taken into account by the operators. We apply the technique in the context of probability density functions in order to improve the accuracy of the probability estimates. Results show that this approach has advantages over existing approaches in some particular situations, although computation times tend to increase significantly when analyzing large functions.
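The flavour of grid-based PDF arithmetic can be shown with the simplest operator, the sum of two independent variables: when both densities live on uniform grids, the density of the sum is a discrete convolution of the histograms. This is a generic sketch of histogram arithmetic, not the paper's grid-aware interval operators:

```python
import numpy as np

def hist_sum_pdf(p, q, dx):
    """Density of X + Y for independent X, Y given as histogram densities
    on uniform grids with spacing dx: a discrete convolution."""
    return np.convolve(p, q) * dx

dx = 0.01
x = np.arange(0.0, 1.0, dx)
uniform = np.ones_like(x)                  # U(0,1) density on its grid
tri = hist_sum_pdf(uniform, uniform, dx)   # density of U+U on [0, 2)
print(abs(tri.sum() * dx - 1.0) < 1e-6)    # still integrates to ~1
```

The result is the expected triangular density on [0, 2); the grid resolution dx directly controls the accuracy-versus-cost tradeoff the abstract describes.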
Abstract:
A conventional neural network approach to regression problems approximates the conditional mean of the output vector. For mappings which are multi-valued this approach breaks down, since the average of two solutions is not necessarily a valid solution. In this article mixture density networks, a principled method to model conditional probability density functions, are applied to retrieving Cartesian wind vector components from satellite scatterometer data. A hybrid mixture density network is implemented to incorporate prior knowledge of the predominantly bimodal function branches. An advantage of a fully probabilistic model is that more sophisticated and principled methods can be used to resolve ambiguities.
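In a mixture density network, the network outputs parametrise a Gaussian mixture over the target; the conditional density is then a weighted sum of kernels. A numpy evaluation of that density is sketched below, with illustrative parameter values standing in for actual network outputs (the two modes mimic a bimodal ambiguity such as opposite wind directions):

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def mdn_density(t, alpha_logits, mu, sigma):
    """Conditional density p(t|x) of a 1-D mixture density network:
    mixing coefficients from a softmax over network outputs, one
    spherical Gaussian kernel per mixture component."""
    alpha = softmax(alpha_logits)
    norm = np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(np.sum(alpha * norm))

# Two equally likely branches of a bimodal inverse mapping;
# all parameter values here are illustrative, not trained outputs.
alpha_logits = np.array([0.0, 0.0])
mu = np.array([-1.0, 1.0])
sigma = np.array([0.3, 0.3])
p = mdn_density(1.0, alpha_logits, mu, sigma)
print(p)
```

Because the full density is available, ambiguity resolution can pick the mode consistent with prior knowledge instead of averaging the two branches, which is exactly where the conditional-mean regression breaks down.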
Abstract:
Mixture Density Networks are a principled method to model conditional probability density functions which are non-Gaussian. This is achieved by modelling the conditional distribution for each pattern with a Gaussian Mixture Model for which the parameters are generated by a neural network. This thesis presents a novel method to introduce regularisation in this context for the special case where the mean and variance of the spherical Gaussian kernels in the mixtures are fixed to predetermined values. Guidelines for how these parameters can be initialised are given, and it is shown how to apply the evidence framework to mixture density networks to achieve regularisation. This also provides an objective stopping criterion that can replace the 'early stopping' methods that have previously been used. If the neural network used is an RBF network with fixed centres this opens up new opportunities for improved initialisation of the network weights, which are exploited to start training relatively close to the optimum. The new method is demonstrated on two data sets. The first is a simple synthetic data set while the second is a real life data set, namely satellite scatterometer data used to infer the wind speed and wind direction near the ocean surface. For both data sets the regularisation method performs well in comparison with earlier published results. Ideas on how the constraint on the kernels may be relaxed to allow fully adaptable kernels are presented.