882 results for "algoritmos de confiabilidade" (reliability algorithms)


Relevance:

20.00%

Publisher:

Abstract:

This material contains the handout “Construção de algoritmos” (Construction of Algorithms) for the course Algorithms and Programming I of the Information Systems program. The syllabus comprises 11 units: “Unit 1: Basic concepts of algorithms”; “Unit 2: Basic concepts for the development of algorithms”; “Unit 3: Algorithmic expressions”; “Unit 4: Strategy for developing algorithms”; “Unit 5: Conditional statements”; “Unit 6: Repetition statements”; “Unit 7: Pointers”; “Unit 8: Heterogeneous compound structures: records”; “Unit 9: Subroutines”; “Unit 10: Homogeneous compound structures: arrays”; “Unit 11: Mixed compound structures: homogeneous and heterogeneous”. The material includes illustrative figures, algorithms used as examples, and tables.

Relevance:

20.00%

Publisher:

Abstract:

Introductory video on the topic of algorithm analysis. The video states the main goals of algorithm analysis, presenting to the student what algorithm analysis is and what asymptotic analysis is. It also presents the purpose of algorithm analysis, namely, to compare two or more algorithms that perform the same task and decide which one is better. The mathematical definition related to the subject is presented, along with some visual examples.
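
The abstract does not reproduce the mathematical definition; for reference, the standard Big-O definition used in asymptotic analysis reads as follows (our addition, not a quotation from the video):

% f is asymptotically bounded above by g, up to a constant factor:
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \in \mathbb{N} :\ 0 \le f(n) \le c\, g(n) \ \text{ for all } n \ge n_0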

Relevance:

20.00%

Publisher:

Abstract:

Accessible version of the video, with audio description.

Relevance:

20.00%

Publisher:

Abstract:

This video lesson gives an overview of computational algorithms, their concepts and main characteristics. An algorithm is a finite sequence of actions that, when executed, leads to the solution of a problem in finite time: starting from a problem, a sequence of actions is applied, and at the end the problem is solved. Algorithms are characterized by the sequential execution of instructions; each instruction is executed completely before proceeding to the next, and instructions are unambiguous and do not depend on interpretation.
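
As a concrete illustration of this definition (our example, not one taken from the video), Euclid's algorithm is a finite sequence of unambiguous actions that halts in finite time:

def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite sequence of unambiguous steps.

    Each instruction runs to completion before the next one starts,
    and the loop is guaranteed to stop because b strictly decreases.
    """
    while b != 0:
        a, b = b, a % b   # one fully executed, unambiguous action per iteration
    return a

print(gcd(48, 18))  # -> 6: the problem is solved in finite time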

Relevance:

20.00%

Publisher:

Abstract:

This video lesson introduces algorithm analysis and asymptotic analysis. Algorithm analysis makes it possible to understand how an algorithm behaves when there is a large amount of data to process, and to compare different algorithms that solve the same problem. The analysis is usually driven by execution time, although an analysis of the required space is also possible.
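
As a minimal sketch of such a comparison (our example, not the video's), here are two algorithms that solve the same membership problem, timed as the input grows:

import time, bisect

def linear_search(xs, target):          # O(n): scans every element in the worst case
    return target in xs

def binary_search(xs, target):          # O(log n): requires xs to be sorted
    i = bisect.bisect_left(xs, target)
    return i < len(xs) and xs[i] == target

for n in (10_000, 100_000, 1_000_000):
    xs = list(range(n))
    t0 = time.perf_counter()
    linear_search(xs, -1)
    t1 = time.perf_counter()
    binary_search(xs, -1)
    t2 = time.perf_counter()
    print(f"n={n:>9}: linear {t1 - t0:.6f}s  binary {t2 - t1:.6f}s")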

Relevance:

20.00%

Publisher:

Abstract:

This video lesson presents examples of algorithm analysis, covering the analysis of constant-time code segments, of loops whose control variable is incremented by a constant, and of loops whose control variable is multiplied or divided by a constant.
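
The three cases mentioned can be sketched as follows (function names and demo values are ours, for illustration):

def constant_time(x):
    # Constant-time segment: a fixed number of operations -> O(1).
    return 2 * x + 1

def constant_increment(n):
    # Loop control incremented by a constant -> n iterations -> O(n).
    total = 0
    i = 0
    while i < n:
        total += i
        i += 1          # constant increment
    return total

def multiplicative_control(n):
    # Loop control multiplied (or divided) by a constant -> O(log n).
    count = 0
    i = 1
    while i < n:
        count += 1
        i *= 2          # control variable doubles each iteration
    return count

print(constant_time(10), constant_increment(10), multiplicative_control(10))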

Relevance:

20.00%

Publisher:

Abstract:

In this dissertation we advocate a new way of measuring the software product, based on measures used in the theory of complex systems. We consider the use of these measures advantageous compared with the traditional measures of software engineering. The novelty of this dissertation lies in treating the software product as a complex system, endowed with a multi-level structure, and in proposing long-range correlation as a measure of the structural complexity of source programs. This measure, which is invariant with respect to the scale of each level of the structure, can be computed automatically. In the dissertation, we first describe the software development process and the existing measures for that process and product, and introduce the theory of complex systems. We conclude that the process has the characteristics of a complex system and propose that it be measured as such. Next, we study the structure of the product and the dynamics of its development process. We present an experimental study of algorithms coded in C, which we use to validate hypotheses about the complexity of the product's structure. We propose long-range correlation as the measure of structural complexity and extend this measure to a sample coded in Java. We conclude by highlighting the limitations and potential of this measure and its application in software engineering.
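
The abstract does not show how the measure is computed; as a loose sketch of the underlying idea (our simplification, with a stand-in series rather than real source code), a long-range correlation analysis starts from a numeric series extracted from the program text and examines how its autocorrelation decays with lag:

import numpy as np

def autocorrelation(series, max_lag):
    """Autocorrelation at lags 1..max_lag.

    Slowly (power-law) decaying autocorrelation is the signature of
    long-range correlation; fast (exponential) decay indicates only
    short-range structure.
    """
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x) / len(x)
    return [np.dot(x[:-k], x[k:]) / ((len(x) - k) * var)
            for k in range(1, max_lag + 1)]

# Hypothetical stand-in for a series extracted from a program,
# e.g. the sequence of line lengths of a C source file.
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(4096))   # a strongly correlated walk
print(autocorrelation(series, max_lag=5))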

Relevance:

20.00%

Publisher:

Abstract:

This master's dissertation presents the study and implementation of intelligent algorithms to monitor the measurements of sensors involved in natural gas custody transfer processes. To create these algorithms, artificial neural networks are investigated because of their particular properties, such as learning, adaptation, and prediction. A neural predictor is developed to reproduce the dynamic behavior of the sensor output, so that its output can be compared with the real sensor output. A recurrent neural network is used for this purpose because of its ability to deal with dynamic information. The real sensor output and the estimated predictor output form the basis for possible sensor fault detection and diagnosis strategies. Two competitive neural network architectures are investigated, and their capabilities are used to classify different kinds of faults. The prediction algorithm and the fault detection and classification strategies, as well as the obtained results, are presented.
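
A minimal sketch of the residual-based detection idea described above (our simplification: a generic one-step-ahead predictor stands in for the trained recurrent network, and the threshold and data are hypothetical):

import numpy as np

def detect_faults(real_output, predicted_output, threshold=0.1):
    """Flag samples where the sensor disagrees with the neural predictor.

    real_output      : measured sensor values
    predicted_output : one-step-ahead estimates from the trained predictor
    threshold        : residual magnitude treated as a fault (hypothetical value)
    """
    residual = np.abs(np.asarray(real_output) - np.asarray(predicted_output))
    return residual > threshold

# Hypothetical usage: the predictor has already been trained to reproduce
# the sensor's dynamic behavior.
real = np.array([1.00, 1.02, 0.99, 1.75, 1.01])
pred = np.array([1.01, 1.00, 1.00, 1.02, 1.00])
print(detect_faults(real, pred))   # -> only the fourth sample is flagged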

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to assess the interobserver reliability of the Brazilian-Portuguese version of the Berg Balance Scale (BBS), as applied by physiotherapists with extensive or little clinical experience to non-institutionalized elderly individuals. Participants comprised 12 elderly subjects (10 women and 2 men) with a mean age of 75.8 ± 8.4 years (range 63-87) and 18 physiotherapists with varying clinical experience. The inter-examiner reliability obtained for each scale item yielded a weighted kappa value > 0.75 in 11 of the 14 items (values ranging from 0.37 to 1.0). The intraclass correlation coefficient (ICC) for the total BBS score between the two groups of physiotherapists was 0.996 (95% confidence interval 0.987-0.999), with a Cronbach's alpha coefficient of 0.996. We found no difference between the rater groups when the mean total scores were compared with Student's t-test (p = 0.86). Although some items had low reliability values, overall our results suggest that the Brazilian version of the BBS shows good levels of inter-rater reliability and agreement when used, without previous training, by physiotherapists with different levels of clinical practice on non-institutionalized elderly patients. We conclude that the BBS can be a useful evaluation instrument in rehabilitation clinic protocols. It may be used by various health professionals, such as physicians, physical therapists, physical educators, occupational therapists, nurses, and speech-language therapists, confirming the interdisciplinary character of this study.
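
For readers who want to reproduce this kind of agreement analysis, a sketch with standard tools (scikit-learn's weighted kappa and a hand-rolled Cronbach's alpha; the scores are illustrative, not the study's data):

import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical item scores (0-4, as on the BBS) from two raters.
rater_a = [4, 3, 4, 2, 4, 3, 1, 4, 3, 2, 4, 4]
rater_b = [4, 3, 3, 2, 4, 3, 2, 4, 3, 2, 4, 4]

# Weighted kappa penalizes larger disagreements more heavily.
print(cohen_kappa_score(rater_a, rater_b, weights="quadratic"))

def cronbach_alpha(scores):
    """Cronbach's alpha for a (subjects x raters) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

print(cronbach_alpha(np.column_stack([rater_a, rater_b])))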

Relevance:

20.00%

Publisher:

Abstract:

The objective in the facility location problem with limited distances is to minimize the sum of distance functions from the facility to the customers, but with a limit on each distance, after which the corresponding function becomes constant. The problem has applications in situations where the service provided by the facility is insensitive after a given threshold distance (eg. fire station location). In this work, we propose a global optimization algorithm for the case in which there are lower and upper limits on the numbers of customers that can be served
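
Under one standard reading of this problem (our notation; the abstract itself gives no formulas), with customer locations a_i, distance limits \lambda_i, and facility location x, the objective and the cardinality constraints can be written as:

% Each customer contributes its distance to the facility only up to the
% threshold \lambda_i, after which the term becomes constant:
\min_{x \in \mathbb{R}^2} \sum_{i=1}^{n} \min\{\lVert x - a_i \rVert,\ \lambda_i\}
\quad \text{s.t.} \quad \ell \le \bigl|\{\, i : \lVert x - a_i \rVert \le \lambda_i \,\}\bigr| \le u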

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Publisher:

Abstract:

The two-dimensional periodic structures called frequency selective surfaces have been widely investigated because of their filtering properties. Similar to filters that operate in the traditional radiofrequency bands, such structures can behave as band-stop or band-pass filters, depending on the array elements (patch or aperture, respectively), and can be used in a variety of applications, such as radomes, dichroic reflectors, waveguide filters, artificial magnetic conductors, and microwave absorbers. To provide high-performance filtering at microwave bands, electromagnetic engineers have investigated various types of periodic structures: reconfigurable frequency selective screens, multilayered selective filters, and periodic arrays printed on anisotropic dielectric substrates or composed of fractal elements. In general, there is no closed-form path from a desired frequency response to a corresponding device; the analysis of its scattering characteristics requires the application of rigorous full-wave techniques. Moreover, because of the computational cost of evaluating the scattering variables of a frequency selective surface with a full-wave simulator, many electromagnetic engineers still rely on a trial-and-error process until a given design criterion is achieved. As this procedure is laborious and highly dependent on the designer, optimization techniques are required to design practical periodic structures with the desired filter specifications. Some authors have employed neural networks and natural optimization algorithms, such as genetic algorithms and particle swarm optimization, for frequency selective surface design and optimization. The objective of this work is a rigorous study of the electromagnetic behavior of periodic structures, enabling the design of efficient devices for the microwave band. To this end, artificial neural networks are used together with natural optimization techniques, allowing various types of frequency selective surfaces to be investigated accurately and efficiently, in a simple and fast manner, and becoming a powerful tool for the design and optimization of such structures.
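
A sketch of how such a combination typically fits together (a generic particle swarm loop around a stand-in for the trained neural surrogate; the names and toy objective are ours, not the thesis's):

import numpy as np

def surrogate_cost(params):
    # Stand-in for the trained neural-network model of the FSS response:
    # in the real design flow this would map geometry parameters to the
    # deviation from the desired frequency response.
    target = np.array([0.4, 0.7])            # hypothetical optimum
    return np.sum((params - target) ** 2)

def pso(cost, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Basic particle swarm optimization over [0, 1]^dim."""
    rng = np.random.default_rng(0)
    x = rng.random((n_particles, dim))        # positions
    v = np.zeros_like(x)                      # velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)
        costs = np.array([cost(p) for p in x])
        better = costs < pbest_cost
        pbest[better] = x[better]
        pbest_cost[better] = costs[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

print(pso(surrogate_cost))   # converges near the hypothetical optimum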

Relevance:

20.00%

Publisher:

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since these problems are generally accepted to be unsolvable in polynomial time. Initially, such solutions were based on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called the "Operon" for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology, namely cluster analysis and principal component analysis; and the use of statistical analyses appropriate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search of the solution space. The applications target the Traveling Salesman Problem (TSP), using a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by a minimum limit on the coefficient of variation of the fitness function of the individuals, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a simulated annealing algorithm. Three performance analyses are proposed. The first uses logistic regression, based on the probability that the algorithm under test finds an optimal solution for a TSP instance. The second uses survival analysis, based on the probability distribution of the time observed until an optimal solution is reached. The third uses a non-parametric analysis of variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments deal with the tuning of four parameters of the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through the PES, the average PES obtained with ProtoG was below 1% in almost half of the instances, with the largest average, 3.52%, reached on an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find reported average PES values greater than 10% for instances of this size.
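
Two of the quantitative devices mentioned above are easy to make concrete (our notation; the actual ProtoG internals are not reproduced in the abstract):

import numpy as np

def pes(found_cost, best_known_cost):
    """Percent Error of the Solution: by how much (in %) the solution
    found exceeds the best solution available in the literature."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

def needs_renewal(fitness_values, cv_min=0.05):
    """Population-renewal trigger: renew part of the population when the
    coefficient of variation of the fitness drops below a minimum limit
    (cv_min is a hypothetical setting), i.e. when the population has
    become too homogeneous to keep searching effectively."""
    f = np.asarray(fitness_values, dtype=float)
    return f.std() / f.mean() < cv_min

print(pes(found_cost=21_550, best_known_cost=21_282))  # ~1.26% on a TSP instance
print(needs_renewal([100.0, 101.0, 99.5, 100.2]))      # True: population converged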

Relevance:

20.00%

Publisher:

Abstract:

This work presents a set of intelligent algorithms whose purpose is to correct calibration errors in sensors and to reduce the frequency of their calibrations. These algorithms were designed using artificial neural networks because of their great capacity for learning, adaptation, and function approximation. Two approaches are shown. The first one uses multilayer perceptron networks to approximate the various shapes of the calibration curve of a sensor that drifts out of calibration at different points in time. This approach requires knowledge of the sensor's operating time, but this information is not always available. To overcome this need, another approach using recurrent neural networks was proposed. Recurrent neural networks have a great capacity to learn the dynamics of the system on which they were trained, so they can learn the dynamics of a sensor's calibration drift. Knowing the sensor's operating time or its drift dynamics, it is possible to determine how far a sensor has drifted out of calibration and to correct its measured value, thus providing a more accurate measurement. The algorithms proposed in this work can be implemented in a Foundation Fieldbus industrial network environment, which offers good device programmability through its function blocks, making it possible to apply them to the measurement process.
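
A minimal sketch of the first approach (our simplification: a small scikit-learn network stands in for the dissertation's own implementation, and the drift model is hypothetical):

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical training data: (raw reading, operating time) -> true value.
# The simulated sensor gains a drift that grows with operating time.
rng = np.random.default_rng(1)
true_vals = rng.uniform(0.0, 10.0, size=2000)
op_time = rng.uniform(0.0, 1000.0, size=2000)
raw = true_vals * (1.0 + 0.002 * op_time)        # hypothetical drift model

X = np.column_stack([raw, op_time])
net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0),
)
net.fit(X, true_vals)                            # learn to invert the drift

# Correct a drifted reading: raw value 6.0 after 800 hours of operation.
print(net.predict([[6.0, 800.0]]))               # should approach 6.0 / 2.6 ≈ 2.31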