951 results for vector quantization based Gaussian modeling


Relevance:

100.00%

Publisher:

Abstract:

This work proposes the development of an intelligent system for the analysis of digital mammograms, capable of detecting and classifying masses and microcalcifications. The digital mammograms are pre-processed with digital image processing techniques in order to adapt the images to the system for detection and automatic classification of the calcifications present in the breasts. The model adopted for the detection and classification of the mammograms uses the Kohonen neural network through the Self-Organizing Map (SOM) algorithm. The K-means vector quantization algorithm is also used for the same purpose as the SOM, and an analysis of the performance of the two algorithms in the automatic classification of digital mammograms is carried out. The resulting system is intended to aid the radiologist in diagnosing abnormalities and monitoring their development.
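The K-means codebook step used alongside the SOM can be sketched as follows; this is a generic illustration on toy 2-D vectors (illustrative data, not the mammogram features used in the work):

```python
import numpy as np

def kmeans_vq(vectors, k, iters=20, seed=0):
    """Plain K-means vector quantization: learn a k-entry codebook and
    return it together with each vector's codeword index."""
    rng = np.random.default_rng(seed)
    codebook = vectors[rng.choice(len(vectors), size=k, replace=False)].copy()
    labels = np.zeros(len(vectors), dtype=int)
    for _ in range(iters):
        # assign every vector to its nearest codeword (Euclidean distance)
        d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each non-empty codeword to the centroid of its vectors
        for j in range(k):
            if np.any(labels == j):
                codebook[j] = vectors[labels == j].mean(axis=0)
    return codebook, labels

# toy data: two tight clusters of 2-D "feature vectors"
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)), rng.normal(5.0, 0.1, (50, 2))])
codebook, labels = kmeans_vq(data, k=2)
```

For classification, each input is then represented by (or compared against) its nearest codeword, which is the role the SOM plays in the paper's other configuration.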

Relevance:

100.00%

Publisher:

Abstract:

In this work the implementation of the SOM (Self-Organizing Map) algorithm, or Kohonen neural network, in the form of hierarchical structures applied to image compression is presented. The main objective of this approach is to develop a hierarchical SOM algorithm with a static structure, and another with a dynamic structure, to generate codebooks for image Vector Quantization (VQ), reducing processing time and obtaining a good image compression rate with minimal degradation of quality relative to the original image. The two self-organizing neural networks developed here were named HSOM, for the static case, and DHSOM, for the dynamic case. In the first, the hierarchical structure is defined beforehand; in the latter, the structure grows automatically according to heuristic rules that explore the training data without the use of external parameters. For this network, the heuristic rules determine the growth dynamics, the branch-pruning criteria, the flexibility and the size of the child maps. The LBG (Linde-Buzo-Gray) algorithm, or K-means, one of the algorithms most often used to build codebooks for Vector Quantization, was used together with the Kohonen algorithm in its basic (non-hierarchical) form as a reference for comparing the performance of the algorithms proposed here. A performance analysis between the two hierarchical structures is also carried out. The efficiency of the proposed processing is verified by the reduction in computational complexity compared to the traditional algorithms, as well as through quantitative analysis of the reconstructed images in terms of the peak signal-to-noise ratio (PSNR) and the mean squared error (MSE).
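The PSNR and MSE quality measures used above are straightforward to compute; a minimal sketch on a toy 8-bit image (illustrative values, not from the paper):

```python
import numpy as np

def mse(original, reconstructed):
    """Mean squared error between two equal-shaped images."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less degradation."""
    err = mse(original, reconstructed)
    if err == 0.0:
        return float("inf")          # identical images
    return 10.0 * np.log10(peak ** 2 / err)

# toy 8-bit "image" and a reconstruction that is off by 1 at every pixel
img = np.full((8, 8), 100, dtype=np.uint8)
rec = img + 1
print(round(psnr(img, rec), 2))  # MSE = 1 -> 10*log10(255^2) ≈ 48.13 dB
```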

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

The VSS X̄ chart is known to perform better than the traditional X̄ control chart in detecting small to moderate mean shifts in the process. Many researchers have used this chart to detect a process mean shift under the assumption of known parameters. In practice, however, the process parameters are rarely known and are usually estimated from an in-control Phase I data set. In this paper, we evaluate the run-length performance of the VSS X̄ control chart when the process parameters are estimated and compare it with the case where the process parameters are assumed known. We conclude that these performances are quite different when the shift and the number of samples used during Phase I are small. ©2010 IEEE.
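The known-versus-estimated-parameter comparison can be illustrated with a basic run-length simulation; this sketch uses a fixed-sample-size 3-sigma X̄ chart (not the paper's VSS scheme) and hypothetical Phase I sizes:

```python
import numpy as np

rng = np.random.default_rng(42)

def run_length(proc_mu, proc_sigma, chart_mu, chart_sigma, n=5, L=3.0):
    """Number of subgroup means until an X-bar chart signals. The process
    runs at (proc_mu, proc_sigma); the control limits are built from
    (chart_mu, chart_sigma), which may be Phase I estimates."""
    se = chart_sigma / np.sqrt(n)
    ucl, lcl = chart_mu + L * se, chart_mu - L * se
    count = 0
    while True:
        # draw subgroup means in batches for speed
        xbars = rng.normal(proc_mu, proc_sigma / np.sqrt(n), 1000)
        out = np.flatnonzero((xbars > ucl) | (xbars < lcl))
        if out.size:
            return count + int(out[0]) + 1
        count += 1000

# known parameters: in-control ARL of a 3-sigma X-bar chart is ~370
arl_known = np.mean([run_length(0.0, 1.0, 0.0, 1.0) for _ in range(2000)])

# limits estimated from a small Phase I data set (m subgroups of size n);
# with few subgroups the realized run-length behaviour differs noticeably
m, n = 10, 5
phase1 = rng.normal(0.0, 1.0, (m, n))
arl_est = np.mean([run_length(0.0, 1.0, phase1.mean(), phase1.std(ddof=1))
                   for _ in range(2000)])
```

Repeating the estimated-parameter experiment over many Phase I data sets would show the spread of attained ARLs that motivates the paper's conclusion.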

Relevance:

100.00%

Publisher:

Abstract:

This paper deals with a system in which a flexible rod, subjected to magnetic forces that can bend it while it is simultaneously subjected to external excitations, exhibits complex and nonlinear dynamic behavior that may present different types of solutions for its different movement-related responses. This fact motivated us to analyze such a mechanical system through modeling and numerical simulation involving both integer-order calculus (IOC) and fractional-order calculus (FOC) approaches. The time responses, pseudo phase portraits and Fourier spectra are presented. The results obtained can serve as a basis for conducting experiments in order to obtain more realistic and more accurate results from fractional-order models as compared to the integer-order models. © Published under licence by IOP Publishing Ltd.
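A common numerical route to fractional-order models of this kind is the Grünwald-Letnikov approximation of the fractional derivative; a minimal sketch of the generic method (not the paper's rod model) is:

```python
import numpy as np

def gl_fractional_derivative(f, t, alpha, h=1e-3):
    """Grünwald-Letnikov approximation of the fractional derivative
    D^alpha f at t, with f taken as zero for negative arguments."""
    n = int(round(t / h))
    coeffs = np.empty(n + 1)
    coeffs[0] = 1.0
    for j in range(1, n + 1):
        # recursion for (-1)^j * binom(alpha, j)
        coeffs[j] = coeffs[j - 1] * (j - 1 - alpha) / j
    tj = t - h * np.arange(n + 1)
    return h ** (-alpha) * np.sum(coeffs * f(tj))

f = lambda t: np.where(t > 0, t, 0.0)              # f(t) = t for t >= 0
d1 = gl_fractional_derivative(f, 1.0, 1.0)         # classical derivative: 1
d_half = gl_fractional_derivative(f, 1.0, 0.5)     # half-derivative: 2*sqrt(t/pi)
```

Setting alpha = 1 recovers the classical (IOC) derivative, while non-integer alpha gives the FOC behaviour, which is how the two approaches can be compared within one numerical scheme.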

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a methodology for evaluating the performance of broadband access networks. Network performance evaluation is a way of identifying and analyzing how certain characteristics, such as different traffic types or usage patterns, can influence the behavior of the network under study, making it possible to predict how that network will behave in future situations. The methodology comprises two approaches: one based on measurements and another based on modeling via Markovian processes. The networks analyzed cover the two basic types of access architecture: ADSL2+ (Asymmetric Digital Subscriber Line 2+) networks, which are wired networks using twisted-pair metallic cables, and FBWN (Fixed Broadband Wireless Network) networks, which are wireless networks based on the IEEE 802.16 standard. The measurement approach focuses on how the analyzed network behaves in three situations: transmission of generic traffic; the impact of non-stationary noise on the system; and use of the network as a transmission medium for real-time multimedia traffic. The modeling approach, in turn, is based on predicting the behavior of the analyzed networks using a mathematical formulation grounded in Markovian processes. The results presented indicate the viability of applying this methodology as a means of performance evaluation. They also make it possible to extend the methodology to other types of broadband access networks, such as optical fiber networks, microwave link networks, VDSL/VDSL2 (Very-high-data-rate DSL) networks, etc.
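The Markovian modeling approach can be illustrated with its simplest instance, an M/M/1 queue in steady state; the rates below are hypothetical, not drawn from the thesis measurements:

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue: utilization rho, mean number
    of customers in the system L, and mean sojourn time W (Little's law)."""
    if lam >= mu:
        raise ValueError("unstable: arrival rate must be below service rate")
    rho = lam / mu
    L = rho / (1.0 - rho)
    W = L / lam
    return rho, L, W

# hypothetical link: 80 packets/s offered to a server draining 100 packets/s
rho, L, W = mm1_metrics(80.0, 100.0)
print(rho, L, W)  # 0.8 utilization, ~4 packets in system, ~0.05 s mean delay
```

Richer access-network models replace this birth-death chain with larger Markov chains, but the workflow (solve for the stationary distribution, then read off performance metrics) is the same.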

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Nonlinear programming based on predicted broiler performance is a simple, useful and practical modeling technique for calculating the optimum nutrient density that maximizes profit margins. To demonstrate how broiler price can interact with nutrient density, the experiment aimed to define quadratic equations for feed intake and weight gain, based on modeling, to be applied in nonlinear programming, according to sex (male and female) in the starter (1 to 21 days), grower (22 to 42 days) and finisher (43 to 56 days) phases. The experimental design was randomized, totaling 6 treatments [energy levels of 2800, 2900, 3000, 3100, 3200 and 3300 kcal AME/kg with a constant nutrient : AME (Apparent Metabolizable Energy) ratio] with 4 replicates and 10 birds per plot, using the freely downloadable PPFR Excel workbook for feed formulation (http://www.foa.unesp.br/downloads/file_detalhes.asp?CatCod=4&SubCatCod=138&FileCod=1677). Data from this trial confirmed a significant relationship between feed intake and the total energy consumption of the diet, in which feed intake increased or decreased simply to keep the amount of energy constant, given the fixed nutrient : AME ratio. The data therefore support that, if the essential dietary nutrients are kept in proportion to the energy density of the diet according to the appropriate requirements of male and female broilers, weight gain and feed conversion are significantly (P<0.05) favored by increasing the energy density of the diet. This enables the application of maximum-profit (nonlinear formulation) models to estimate the most appropriate weight gain according to the price paid by the market.
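The idea of maximizing margin over quadratic response curves can be sketched as follows; all coefficients and prices are illustrative placeholders, not the equations fitted from the trial:

```python
import numpy as np

# Hypothetical quadratic response curves (illustrative coefficients,
# not the equations fitted from the trial data):
def weight_gain(e):
    """Weight gain (kg/bird) as a function of energy density e (kcal AME/kg)."""
    return -2.0e-7 * e ** 2 + 1.3e-3 * e - 0.5

def feed_cost(e):
    """Feed cost (currency/bird); denser diets are assumed linearly dearer."""
    return 0.8 + 1.5e-4 * (e - 2800)

def profit(e, meat_price=2.5):
    """Margin per bird: revenue from predicted gain minus feed cost."""
    return meat_price * weight_gain(e) - feed_cost(e)

# the nonlinear formulation, solved by brute-force search over the tested range
grid = np.linspace(2800, 3300, 501)
best_e = grid[np.argmax(profit(grid))]
print(best_e)  # analytic optimum of these toy curves is 3100 kcal AME/kg
```

Changing `meat_price` shifts the optimum energy density, which is the price/nutrient-density interaction the trial was designed to capture.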

Relevance:

100.00%

Publisher:

Abstract:

Motivation: A current issue of great interest, from both a theoretical and an applicative perspective, is the analysis of biological sequences to disclose the information they encode. The development of new genome sequencing technologies in recent years has opened new fundamental problems, since huge amounts of biological data still await interpretation. Indeed, sequencing is only the first step of the genome annotation process, which consists in the assignment of biological information to each sequence. Hence, given the large amount of available data, in silico methods have become useful and necessary for extracting relevant information from sequences. The availability of data from Genome Projects gave rise to new strategies for tackling basic problems of computational biology, such as the determination of the three-dimensional structures of proteins, their biological functions and their reciprocal interactions. Results: The aim of this work has been the implementation of predictive methods that allow the extraction of information on the properties of genomes and proteins starting from the nucleotide and amino acid sequences, by taking advantage of the information provided by the comparison of genome sequences from different species. In the first part of the work a comprehensive large-scale genome comparison of 599 organisms is described. 2.6 million sequences from 551 prokaryotic and 48 eukaryotic genomes were aligned and clustered on the basis of their sequence identity. This procedure led to the identification of classes of proteins that are peculiar to the different groups of organisms. Moreover, the adopted similarity threshold produced clusters that are homogeneous from a structural point of view and that can be used for the structural annotation of uncharacterized sequences.
The second part of the work focuses on the characterization of thermostable proteins and on the development of tools able to predict the thermostability of a protein starting from its sequence. By means of Principal Component Analysis, the codon composition of a non-redundant database comprising 116 prokaryotic genomes was analyzed, and it was shown that a cross-genomic approach allows the extraction of common determinants of thermostability at the genome level, leading to an overall accuracy of 95% in discriminating thermophilic coding sequences. This result outperforms those obtained in previous studies. Moreover, we investigated the effect of multiple mutations on protein thermostability. This issue is of great importance in the field of protein engineering, since thermostable proteins are generally more suitable than their mesostable counterparts in technological applications. A Support Vector Machine based method was trained to predict whether a set of mutations can enhance the thermostability of a given protein sequence. The developed predictor achieves 88% accuracy.
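A Support Vector Machine classifier of the kind described above can be sketched with a minimal linear SVM trained by stochastic sub-gradient descent on the hinge loss (Pegasos-style); the data below are synthetic stand-ins for sequence-derived feature vectors, not the thesis data:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Minimal linear SVM: stochastic sub-gradient descent on the
    L2-regularized hinge loss (Pegasos-style). Labels must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w, b, t = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)                 # decreasing step size
            margin = y[i] * (X[i] @ w + b)
            w *= 1.0 - eta * lam                  # shrink from the regularizer
            if margin < 1.0:                      # hinge active: push towards margin
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# synthetic stand-ins for two classes of sequence-derived feature vectors
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.4, (100, 5)), rng.normal(1.0, 0.4, (100, 5))])
y = np.array([-1] * 100 + [1] * 100)
w, b = train_linear_svm(X, y)
acc = float(np.mean(np.sign(X @ w + b) == y))
```

A production predictor would use a kernelized SVM and cross-validated accuracy, but the decision function sign(w·x + b) is the same object.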

Relevance:

100.00%

Publisher:

Abstract:

In recent decades the Common Agricultural Policy (CAP) has undergone several revisions, more or less planned, which have modified its operational objectives and the instruments for pursuing them. In the agricultural economics literature, several studies have carried out ex-ante analyses of the possible impacts of policy reforms, decoupling in particular, on the allocation of land among crops and on the adoption of more efficient cultivation techniques. Despite its importance, however, this topic has not been addressed as thoroughly as other themes in agriculture. The main gaps lie in the scarcity of ex-ante analyses and of models that include farmers' preferences and expectations. This study evaluates the land investment choices of a farm facing possible post-2013 CAP scenarios, under uncertainty about the specific conditions in which each scenario would materialize. The objective is to obtain useful insights into the farmer's investment choices in the presence of uncertainty about the future. The most innovative element of the research is the application of a real options approach and the interaction between the presence of different scenarios for the post-2013 agricultural sector and the component of uncertainty that weighs on them. The methodology adopted in this work is based on modeling a farm, simulating its behavior in reaction to CAP reforms and to variations in product prices under uncertainty. A real options model is used to evaluate the optimal timing for investing in the purchase of land (an investment characterized by uncertainty and irreversibility).
The results show that, under uncertainty, it is advantageous for the farmer to postpone the decision until after 2013 and, on the basis of the greater information then available, to make the investment only under favorable conditions. Variations in product prices influence the choices more than uncertainty about CAP payments does. The real options approach seems to interpret the farmer's behavior better than the classical Net Present Value approach.
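The value of deferring an irreversible investment can be shown with a stylized one-period example; all figures are illustrative, not the thesis's scenario values:

```python
# Stylized one-period deferral option (illustrative figures, not the
# thesis's scenarios): an irreversible land purchase costing K yields a
# payoff that, once the post-2013 policy is known, is either V_up or V_down.
K = 100.0
V_up, V_down = 140.0, 80.0
p, r = 0.5, 0.05            # scenario probability and discount rate

# invest immediately: expected payoff minus cost (classical NPV rule)
npv_now = p * V_up + (1 - p) * V_down - K

# wait one period, observe the scenario, invest only if it is favourable
option_value = (p * max(V_up - K, 0.0) + (1 - p) * max(V_down - K, 0.0)) / (1 + r)

print(npv_now, round(option_value, 2))  # 10.0 vs 19.05: waiting dominates
```

Even though the NPV of investing now is positive, the option to wait is worth more because the downside scenario can be avoided, which is exactly the mechanism behind postponing the land purchase until after 2013.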

Relevance:

100.00%

Publisher:

Abstract:

It has long been known that trypanosomes regulate mitochondrial biogenesis during the life cycle of the parasite; however, the mitochondrial protein inventory (MitoCarta) and its regulation remain unknown. We present a novel computational method for genome-wide prediction of mitochondrial proteins using a support vector machine-based classifier with approximately 90% prediction accuracy. Using this method, we predicted the mitochondrial localization of 468 proteins with high confidence and have experimentally verified the localization of a subset of these proteins. We then applied a recently developed parallel sequencing technology to determine the expression profiles and splicing patterns of a total of 1065 predicted MitoCarta transcripts during the development of the parasite, and showed that 435 of the transcripts significantly changed their expression while 630 remained unchanged in any of the three life stages analyzed. Furthermore, we identified 298 alternative splicing events, a small subset of which could lead to dual localization of the corresponding proteins.

Relevance:

100.00%

Publisher:

Abstract:

Aims: Phenotypic optimality models neglect genetics. However, especially when heterozygous genotypes are fittest, evolving allele, genotype and phenotype frequencies may not correspond to predicted optima. This had not previously been addressed for organisms with complex life histories. Methods: We therefore modelled the evolution of a fitness-relevant trait of clonal plants, stolon internode length. We explored the likely case of an asymmetric unimodal fitness profile with three model types. In constant selection models (CSMs), which are gametic but not spatially explicit, evolving allele frequencies in the one-locus and five-loci cases did not correspond to the optimum stolon internode length predicted by the spatially explicit, but not gametic, phenotypic model. This deviation was due to the asymmetry of the fitness profile. Gametic, spatially explicit individual-based (SEIB) modeling allowed us to relax the CSM assumptions of constant selection with exclusively sexual reproduction. Important findings: For entirely vegetative or entirely sexual reproduction, the predictions of the gametic SEIB model were close to those of the spatially explicit, but not gametic, phenotypic models, but for mixed modes of reproduction they approximated those of the gametic, not spatially explicit CSMs. Thus, in contrast to gametic SEIB models, phenotypic models and, especially for few loci, also CSMs can be very misleading. We conclude that the evolution of traits governed by few quantitative trait loci appears hardly predictable by simple models, that genetic algorithms aiming at technical optimization may actually miss the optimum, and that selection may lead to loci with smaller effects in derived compared with ancestral lines.
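The one-locus constant-selection case with heterozygote advantage can be sketched with the standard deterministic recursion (a generic population-genetics model, not the authors' CSM code); it shows why allele frequencies need not settle where a purely phenotypic optimum would predict:

```python
def next_p(p, w_AA, w_Aa, w_aa):
    """One generation of deterministic selection at a single diallelic locus
    under random mating; returns the new frequency of allele A."""
    q = 1.0 - p
    w_bar = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa  # mean fitness
    return p * (p * w_AA + q * w_Aa) / w_bar

# heterozygote advantage: fitnesses 1 - s_A, 1, 1 - s_a
w_AA, w_Aa, w_aa = 0.9, 1.0, 0.8      # s_A = 0.1, s_a = 0.2
p = 0.1
for _ in range(500):
    p = next_p(p, w_AA, w_Aa, w_aa)
print(round(p, 4))  # converges to the interior equilibrium s_a/(s_A+s_a) = 2/3
```

With the heterozygote fittest, the population is pinned at a polymorphic equilibrium: the fittest genotype cannot breed true, so the evolved phenotype distribution need not match the single-phenotype optimum.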

Relevance:

100.00%

Publisher:

Abstract:

Several proxy-based and modeling studies have investigated long-term changes in Caribbean climate during the Holocene; however, very little is known about its variability on short timescales. Here we reconstruct seasonality and interannual to multidecadal variability of the sea surface hydrology of the southern Caribbean Sea by applying paired coral Sr/Ca and d18O measurements to fossil annually banded Diploria strigosa corals from Bonaire. This allows for a better understanding of seasonal to multidecadal variability of the Caribbean hydrological cycle during the mid- to late Holocene. The monthly resolved coral Delta d18O records are used as a proxy for the oxygen isotopic composition of seawater (d18Osw) of the southern Caribbean Sea. Consistent with modern-day conditions, annual d18Osw cycles reconstructed from three modern corals reveal that the freshwater budget at the study site is influenced by both net precipitation and the advection of tropical freshwater brought by wind-driven surface currents. In contrast, the annual d18Osw cycle reconstructed from a mid-Holocene coral shows a sharp peak towards more negative values in summer, suggesting intense summer precipitation at 6 ka BP (before present). In line with this, our model simulations indicate that the increased seasonality of the hydrological cycle at 6 ka BP results from enhanced precipitation in summertime. On interannual to multidecadal timescales, the systematic positive correlation observed between reconstructed sea surface temperature and salinity suggests that freshwater discharged from the Orinoco and Amazon rivers and transported into the Caribbean by wind-driven surface currents is a critical component influencing sea surface hydrology on these timescales.

Relevance:

100.00%

Publisher:

Abstract:

The geometries of a catchment constitute the basis for distributed, physically based numerical modeling in different geoscientific disciplines. In this paper, results from ground-penetrating radar (GPR) measurements in a periglacial catchment in western Greenland are presented in the form of a 3D model of total sediment thickness and active layer thickness. Using the topography, the thickness and distribution of sediments are calculated. Vegetation classification and GPR measurements are used to scale active layer thickness from local measurements to catchment-scale models. Annual maximum active layer thickness varies from 0.3 m in wetlands to 2.0 m in barren areas and areas of exposed bedrock. Maximum sediment thickness is estimated to be 12.3 m in the major valleys of the catchment. A method to correlate surface vegetation with active layer thickness is also presented. By using relatively simple methods, such as probing and vegetation classification, it is possible to upscale local point measurements to catchment-scale models in areas where the upper subsurface is relatively homogeneous. The resulting spatial model of active layer thickness can be used, in combination with the sediment model, as a geometrical input to further studies of subsurface mass transport and hydrological flow paths in the periglacial catchment through numerical modelling.

Relevance:

100.00%

Publisher:

Abstract:

The Self-Organizing Map (SOM) is a neural network model that performs an ordered projection of a high-dimensional input space onto a low-dimensional topological structure. The process by which such a mapping is formed is defined by the SOM algorithm, which is a competitive, unsupervised and nonparametric method, since it does not make any assumption about the input data distribution. The feature maps provided by this algorithm have been successfully applied to vector quantization, clustering and high-dimensional data visualization. However, the initialization of the network topology and the selection of the SOM training parameters are two difficult tasks, owing to the unknown distribution of the input signals. A misconfiguration of these parameters can generate a low-quality feature map, so it is necessary to have some measure of the degree of adaptation of the SOM network to the input data model. Topology preservation is the most common concept used to implement this measure. Several qualitative and quantitative methods have been proposed for measuring the degree of SOM topology preservation, particularly for Kohonen's model. In this work, two methods for measuring the topology preservation of the Growing Cell Structures (GCS) model are proposed: the topographic function and the topology preserving map.
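The topology-preservation idea can be illustrated with the closely related topographic error: the fraction of inputs whose two best-matching units are not neighbours on the map grid. A minimal sketch on a toy 3x3 map (generic Kohonen-style grid, not the GCS model of the work):

```python
import numpy as np

def topographic_error(data, weights, grid):
    """Fraction of inputs whose best and second-best matching units are not
    neighbours on the map grid; 0 means perfect topology preservation."""
    errors = 0
    for x in data:
        d = np.linalg.norm(weights - x, axis=1)
        b1, b2 = np.argsort(d)[:2]                   # two best-matching units
        if np.abs(grid[b1] - grid[b2]).max() > 1:    # not adjacent on the grid
            errors += 1
    return errors / len(data)

# a tiny 3x3 map whose weight vectors already follow the grid layout
grid = np.array([(i, j) for i in range(3) for j in range(3)])
weights = grid.astype(float)      # perfectly ordered map
data = weights + 0.1              # inputs close to the unit positions
print(topographic_error(data, weights, grid))  # -> 0.0 (topology preserved)
```

A disordered map, where neighbouring codewords sit far apart on the grid, yields an error above zero, which is the kind of degradation the topographic function is designed to quantify.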