884 results for "algoritmos de agrupamento" (clustering algorithms)


Relevance: 20.00%

Abstract:

This material contains the handout "Construção de algoritmos" (Algorithm Construction) for the course Algoritmos e Programação I of the Information Systems program. The syllabus comprises 11 units: Unit 1, basic concepts of algorithms; Unit 2, basic concepts for algorithm development; Unit 3, algorithmic expressions; Unit 4, algorithm development strategy; Unit 5, conditional statements; Unit 6, repetition statements; Unit 7, pointers; Unit 8, heterogeneous composite structures: records; Unit 9, subroutines; Unit 10, homogeneous composite structures: arrays; Unit 11, mixed composite structures: homogeneous and heterogeneous. The material includes illustrative figures, example algorithms, and tables.

Relevance: 20.00%

Abstract:

Introductory video on the topic of algorithm analysis. The video presents the main objectives of algorithm analysis, showing the student what algorithm analysis is and what asymptotic analysis is. It also presents the purpose of algorithm analysis, namely to compare two or more algorithms that perform the same task and decide which one is better. The relevant mathematical definitions are presented, along with some visual examples.
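
A minimal sketch, not taken from the video: two algorithms that solve the same task (membership search) are compared to illustrate why asymptotic analysis is useful. Function names and the sample data are illustrative only.

```python
# Comparing two algorithms for the same task: linear vs. binary search.
import bisect

def linear_search(data, target):
    """O(n): inspects elements one by one."""
    for i, value in enumerate(data):
        if value == target:
            return i
    return -1

def binary_search(sorted_data, target):
    """O(log n): repeatedly halves the search interval (requires sorted input)."""
    i = bisect.bisect_left(sorted_data, target)
    if i < len(sorted_data) and sorted_data[i] == target:
        return i
    return -1

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Both return the same answer; asymptotic analysis tells us the second one
    # does about 20 comparisons here, while the first may do up to 1,000,000.
    print(linear_search(data, 999_999), binary_search(data, 999_999))
```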

Relevance: 20.00%

Abstract:

Accessible version of the video, with audio description.

Relevance: 20.00%

Abstract:

This video lecture gives an overview of computational algorithms, their concepts and main characteristics. An algorithm is a finite sequence of actions that, when executed, leads to the solution of a problem in finite time: starting from a problem, a sequence of actions is applied and, at the end, the problem is solved. Among the characteristics of algorithms are the sequential execution of instructions, each instruction being executed completely before the next one starts, and the fact that instructions are unambiguous and do not depend on interpretation.
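
A minimal sketch, not part of the video, of the properties listed above: a finite, ordered, unambiguous sequence of steps that terminates with a solution. The grade-averaging task is invented for illustration.

```python
# An algorithm as a finite, ordered sequence of unambiguous steps.
def average(grades):
    total = 0.0                      # step 1: initialize the accumulator
    for g in grades:                 # step 2: add each grade, one instruction at a time
        total += g
    return total / len(grades)       # step 3: produce the result and stop

print(average([7.0, 8.5, 6.0]))      # 7.166...
```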

Relevance: 20.00%

Abstract:

This video lecture introduces algorithm analysis and asymptotic analysis. Algorithm analysis makes it possible to understand how an algorithm behaves when there is a large amount of data to process, and to compare different algorithms that solve the same problem. The analysis is usually carried out with respect to running time, although it is also possible to analyze the space required.
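
A minimal sketch, not from the video, of the time/space trade-off mentioned above: two algorithms for the same problem (detecting duplicates) with different time and space costs. The function names are illustrative.

```python
# Same problem, two cost profiles: time vs. extra space.
def has_duplicates_quadratic(items):
    """O(n^2) time, O(1) extra space."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """O(n) time, O(n) extra space for the auxiliary set."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

print(has_duplicates_quadratic([1, 2, 3, 2]), has_duplicates_linear([1, 2, 3]))
```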

Relevance: 20.00%

Abstract:

This video lecture presents examples of algorithm analysis, covering the analysis of constant-time code fragments, of loops whose control variable is incremented by a constant, and of loops whose control variable is multiplied or divided at each iteration.
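
A minimal sketch, not from the video, of the three kinds of code fragments cited: a constant-time fragment, a loop with a constant increment of the control variable, and a loop whose control variable is multiplied at each step.

```python
# Three canonical fragments and their costs as a function of n.
def examples(n):
    # Constant-time fragment: cost does not depend on n -> O(1).
    x = n * 2 + 1

    # Loop with constant increment of the control variable -> O(n).
    total = 0
    for i in range(0, n, 1):
        total += i

    # Loop that multiplies the control variable at each step -> O(log n).
    i, steps = 1, 0
    while i < n:
        i *= 2
        steps += 1
    return x, total, steps

print(examples(1024))
```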

Relevance: 20.00%

Abstract:

In this dissertation we propose a new way of measuring the software product, based on measures used in the theory of complex systems. We consider the use of these measures advantageous compared with traditional software engineering metrics. The novelty of this dissertation lies in treating the software product as a complex system, endowed with a structure comprising several levels, and in proposing long-range correlation as a measure of the structural complexity of source programs. This measure, which is invariant with respect to the scale of each level of the structure, can be computed automatically. In the dissertation we first describe the software development process and the existing measures for that process and product, and we introduce the theory of complex systems. We conclude that the process has the characteristics of a complex system and propose that it be measured as such. Next, we study the structure of the product and the dynamics of its development process. We present an experimental study of algorithms coded in C, which we use to validate hypotheses about the complexity of the product's structure. We propose long-range correlation as the measure of structural complexity and extend this measure to a sample coded in Java. We conclude by pointing out the limitations and potential of this measure and its application in software engineering.
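
A hedged sketch, not the dissertation's code, of one way such a scale-invariant long-range-correlation measure can be computed automatically: detrended fluctuation analysis (DFA) applied to a numeric series derived from a source file (for example, the length of each line). The choice of series and the scales are assumptions made for illustration.

```python
# Estimating a long-range-correlation (scaling) exponent with DFA.
import numpy as np

def dfa_exponent(series, scales=(4, 8, 16, 32, 64)):
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())                # integrated, mean-removed series
    fluctuations = []
    for s in scales:
        rms = []
        for w in range(len(profile) // s):
            segment = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, segment, 1), t)   # local linear trend
            rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    # Slope of log F(s) vs. log s: ~0.5 for uncorrelated noise,
    # larger values indicate long-range correlation.
    slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return slope

if __name__ == "__main__":
    # The series could be, e.g., the per-line lengths of a C source file;
    # synthetic white noise is used here so the example is self-contained.
    rng = np.random.default_rng(0)
    print(dfa_exponent(rng.standard_normal(2048)))
```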

Relevance: 20.00%

Abstract:

This master's dissertation presents the study and implementation of intelligent algorithms to monitor sensor measurements in natural gas custody transfer processes. Artificial neural networks are investigated for these algorithms because of properties such as learning, adaptation and prediction. A neural predictor is developed to reproduce the dynamic behavior of the sensor output, so that its output can be compared with the real sensor output. A recurrent neural network is used for this purpose because of its ability to handle dynamic information. The real sensor output and the estimated predictor output form the basis for possible sensor fault detection and diagnosis strategies. Two competitive neural network architectures are also investigated and used to classify different kinds of faults. The prediction algorithm and the fault detection and classification strategies are presented, together with the results obtained.
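
A hedged illustration, not the dissertation's model, of the residual-based strategy described above: a predictor tracks the sensor output, and a large gap between prediction and measurement is flagged as a possible fault. The dissertation uses a recurrent neural network as the predictor; a trivial autoregressive stand-in is used here to keep the example short, and the threshold and signal are invented.

```python
# Residual-based fault detection: flag samples where |measurement - prediction| is large.
import numpy as np

def detect_faults(measurements, threshold=3.0, alpha=0.9):
    pred = measurements[0]
    flags = []
    for y in measurements:
        residual = abs(y - pred)
        flags.append(residual > threshold)        # candidate fault
        pred = alpha * pred + (1 - alpha) * y     # update the one-step predictor
    return flags

signal = np.sin(np.linspace(0, 10, 200))
signal[120] += 8.0                                # injected sensor spike
print(np.flatnonzero(detect_faults(signal, threshold=1.0)))
```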

Relevance: 20.00%

Abstract:

In recent years, DFA, introduced by Peng, has become an important tool for detecting long-range autocorrelation in non-stationary time series. The technique has been successfully applied in areas such as econophysics, biophysics, medicine, physics and climatology. In this study we use DFA to obtain the Hurst exponent (H) of the density log (RHOB) of 53 wells from the Namorados Field School. We want to know whether H can be used to characterize the field spatially. Two cases arise. In the first, the set of H values reflects the local geology, with geographically closer wells showing similar H, so H can be used in geostatistical procedures. In the second, each well has its own H and the information from the wells is uncorrelated; the profiles show only random fluctuations in H with no spatial structure. Cluster analysis is a widely used statistical method; here we use the non-hierarchical k-means method. To verify whether a clustering produced by k-means shows spatial patterns, we define the parameter Ω (neighborhood index): high Ω indicates more aggregated data, low Ω indicates dispersed data or data without spatial correlation. Using the Ω index together with a Monte Carlo procedure, we verify that randomly clustered data show a distribution of Ω lower than the Ω of the actual clusters. We therefore conclude that the H values obtained from the 53 wells are spatially grouped and can be used to characterize spatial patterns. The analysis of contour maps confirmed the k-means results.
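
A hedged sketch, not the thesis code, of the testing idea described above: cluster the per-well Hurst exponents with k-means, then compare a spatial-aggregation statistic of the real labels against label permutations (Monte Carlo). The aggregation statistic used here, the mean share of same-label nearest neighbors, is a simple stand-in for the paper's Ω index, and the well coordinates and H values are random placeholders.

```python
# k-means on H plus a Monte Carlo test of spatial aggregation of the cluster labels.
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial import cKDTree

def neighbor_agreement(xy, labels, k=3):
    _, idx = cKDTree(xy).query(xy, k=k + 1)          # k nearest neighbors (first column is self)
    return np.mean(labels[idx[:, 1:]] == labels[:, None])

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, size=(53, 2))                 # placeholder well coordinates
hurst = rng.uniform(0.5, 0.9, size=53)               # placeholder H per well
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(hurst.reshape(-1, 1))

observed = neighbor_agreement(xy, labels)
null = [neighbor_agreement(xy, rng.permutation(labels)) for _ in range(999)]
# Fraction of permutations at least as aggregated as the observed labels;
# a small fraction indicates genuine spatial structure.
print(observed, np.mean(observed <= np.array(null)))
```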

Relevance: 20.00%

Abstract:

In this paper, artificial neural networks (ANNs) based on supervised and unsupervised algorithms were investigated for the study of rheological parameters of solid pharmaceutical excipients, with the aim of developing computational tools for manufacturing solid dosage forms. Among the four supervised neural networks investigated, the best learning performance was achieved by a feedforward multilayer perceptron whose architecture comprised eight neurons in the input layer, sixteen in the hidden layer and one in the output layer. Learning and predictive performance with respect to the angle of repose was poor, whereas the Carr index and Hausner ratio (CI and HR, respectively) showed very good fitting and learning capacity; HR and CI were therefore considered suitable descriptors for the next stage of development of the supervised ANNs. Clustering capacity was evaluated for five unsupervised strategies. Networks based on purely competitive strategies, the classic "Winner-Take-All", "Frequency-Sensitive Competitive Learning" and "Rival-Penalized Competitive Learning" (WTA, FSCL and RPCL, respectively), were able to cluster the database, but the classification was very poor, with severe errors such as grouping data with conflicting properties into the same cluster or even the same neuron; moreover, the criterion adopted by the network for these clusterings could not be established. Self-Organizing Maps (SOM) and Neural Gas (NG) networks showed better clustering capacity. Both recognized the two major groupings of data, corresponding to lactose (LAC) and cellulose (CEL). SOM, however, made some errors in classifying data from the minority excipients magnesium stearate (EMG), talc (TLC) and attapulgite (ATP). The NG network, in turn, performed a very consistent classification and resolved the misclassifications of SOM, making it the most appropriate network for classifying the data in this study. The use of NG networks in pharmaceutical technology had not been reported before. NG therefore has great potential for use in software for automated classification of pharmaceutical powders and as a new tool for mining and clustering data in drug development.
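
A hedged sketch, not the paper's code, of the best-performing supervised architecture described above: a feedforward multilayer perceptron with 8 inputs, 16 hidden neurons and 1 output (for example, predicting the Carr index from rheological descriptors). The training data here are random placeholders.

```python
# An 8-16-1 feedforward multilayer perceptron fitted on placeholder data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((120, 8))                      # 8 rheological input descriptors (placeholder)
y = X @ rng.random(8)                         # placeholder target (e.g. Carr index)

mlp = MLPRegressor(hidden_layer_sizes=(16,), activation="logistic",
                   max_iter=5000, random_state=0).fit(X, y)
print(mlp.score(X, y))                        # fit quality on the training data
```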

Relevance: 20.00%

Abstract:

The objective in the facility location problem with limited distances is to minimize the sum of distance functions from the facility to the customers, but with a limit on each distance, beyond which the corresponding function becomes constant. The problem has applications in situations where the service provided by the facility becomes insensitive beyond a given threshold distance (e.g., fire station location). In this work, we propose a global optimization algorithm for the case in which there are lower and upper limits on the number of customers that can be served.
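
A hedged sketch, not the paper's algorithm, of the objective described above: each customer i contributes min(d_i(x), limit_i), so the cost stops growing once a customer is farther away than its threshold. A crude grid search stands in for the paper's global optimization method, and the lower/upper limits on the number of served customers are omitted for brevity.

```python
# Limited-distance facility location objective, minimized by brute-force grid search.
import numpy as np

def limited_distance_cost(facility, customers, limits):
    d = np.linalg.norm(customers - facility, axis=1)
    return np.sum(np.minimum(d, limits))      # each distance is capped at its limit

rng = np.random.default_rng(0)
customers = rng.uniform(0, 10, size=(30, 2))  # placeholder customer locations
limits = np.full(30, 3.0)                     # distance beyond which service is insensitive

grid = np.linspace(0, 10, 101)
best = min((limited_distance_cost(np.array([gx, gy]), customers, limits), gx, gy)
           for gx in grid for gy in grid)
print(best)                                   # (minimum cost, best x, best y)
```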

Relevance: 20.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance: 20.00%

Abstract:

The two-dimensional periodic structures called frequency selective surfaces (FSS) have been widely investigated because of their filtering properties. Like filters operating in the traditional radio-frequency bands, such structures can behave as band-stop or band-pass filters, depending on the elements of the array (patch or aperture, respectively), and can be used in a variety of applications, such as radomes, dichroic reflectors, waveguide filters, artificial magnetic conductors and microwave absorbers. To provide high-performance filtering at microwave frequencies, electromagnetic engineers have investigated various types of periodic structures: reconfigurable frequency selective screens, multilayered selective filters, and periodic arrays printed on anisotropic dielectric substrates or composed of fractal elements. In general, there is no closed-form solution that maps a desired frequency response directly to a corresponding device, so the analysis of its scattering characteristics requires rigorous full-wave techniques. Moreover, because of the computational cost of evaluating the FSS scattering variables with a full-wave simulator, many electromagnetic engineers still resort to trial and error until a given design criterion is met. As this procedure is laborious and highly dependent on the designer, optimization techniques are required to design practical periodic structures with the desired filter specifications. Some authors have employed neural networks and natural optimization algorithms, such as genetic algorithms and particle swarm optimization, for FSS design and optimization. The objective of this work is a rigorous study of the electromagnetic behavior of periodic structures, enabling the design of efficient devices for the microwave band. To this end, artificial neural networks are used together with natural optimization techniques, allowing various types of frequency selective surfaces to be investigated accurately and efficiently, in a simple and fast manner, and providing a powerful tool for the design and optimization of such structures.
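
A hedged sketch, not the thesis code, of the design loop described above: a cheap surrogate, standing in for a trained neural-network model of the FSS frequency response, is explored by particle swarm optimization instead of calling a full-wave solver for every candidate geometry. The surrogate, the design parameters and the target values are invented for illustration.

```python
# Particle swarm optimization over a surrogate model of the FSS response.
import numpy as np

def surrogate_cost(params, target=np.array([10.0, 0.5])):
    # Placeholder surrogate: maps design parameters to a predicted
    # (resonant frequency in GHz, bandwidth) pair; a real surrogate would be a trained ANN.
    predicted = np.array([params[0] * 2.0, params[1] * 0.1])
    return np.sum((predicted - target) ** 2)

def pso(cost, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # velocity update
        x = np.clip(x + v, lo, hi)                                 # keep particles in bounds
        vals = np.array([cost(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Hypothetical design parameters: (patch length in mm, substrate thickness in mm).
print(pso(surrogate_cost, (np.array([1.0, 0.1]), np.array([10.0, 5.0]))))
```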