50 results for "Algoritmos de consulta" (query algorithms)
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
This master's dissertation presents the study and implementation of intelligent algorithms to monitor sensor measurements involved in natural gas custody transfer processes. To create these algorithms, Artificial Neural Networks are investigated because they have particular properties such as learning, adaptation and prediction. A neural predictor is developed to reproduce the dynamic behavior of the sensor output, in such a way that its output can be compared to the real sensor output. A recurrent neural network is used for this purpose because of its ability to deal with dynamic information. The real sensor output and the estimated predictor output serve as the basis for the creation of possible sensor fault detection and diagnosis strategies. Two competitive neural network architectures are investigated and their capabilities are used to classify different kinds of faults. The prediction algorithm and the fault detection and classification strategies, as well as the obtained results, are presented.
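A minimal sketch of the residual-based monitoring idea described above: a predictor trained on fault-free data reproduces the expected sensor output, and a large residual flags a possible fault. The recurrent network of the dissertation is replaced here by a simple autoregressive stand-in, and the signals and threshold are illustrative assumptions.

```python
# Sketch of residual-based sensor fault detection: a predictor reproduces the
# expected sensor output and the residual (real - predicted) flags faults.
# The recurrent network of the dissertation is replaced by an autoregressive
# stand-in; the threshold and simulated signal are illustrative assumptions.
import numpy as np

def fit_ar_predictor(signal, order=3):
    """Least-squares autoregressive model y[t] ~ y[t-order..t-1] (stand-in predictor)."""
    X = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
    y = signal[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def detect_faults(signal, coef, threshold=0.5):
    """Flag samples whose prediction residual exceeds the threshold."""
    order = len(coef)
    faults = []
    for t in range(order, len(signal)):
        predicted = signal[t - order:t] @ coef
        faults.append(abs(signal[t] - predicted) > threshold)
    return np.array(faults)

# Usage: a smooth process signal with an injected bias fault after sample 150.
t = np.linspace(0, 20, 200)
healthy = np.sin(t)
faulty = healthy.copy()
faulty[150:] += 2.0                      # simulated bias fault
coef = fit_ar_predictor(healthy)         # train on fault-free data
flags = detect_faults(faulty, coef)
print("first flagged sample:", np.argmax(flags) + len(coef))
```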
Abstract:
The objective of this exploratory, descriptive, quantitative study was to analyze the behaviors in the detection, treatment and follow-up of pregnant women with syphilis by the health professionals who conduct prenatal consultations in the Family Health Strategy, regarding adherence to the recommendations of the Ministry of Health. The study was conducted in nine municipalities of the Trairi region in the state of Rio Grande do Norte. Data were collected from July through September 2007 by means of a questionnaire applied to a population of 53 health professionals: 30 nurses and 23 physicians. Data were analyzed by descriptive statistics. The results were organized into three major items: knowledge of the health professionals about the symptomatology of syphilis and their actions in the detection of the disease; actions in the treatment of the pregnant woman with syphilis; and actions in the follow-up of the disease. We identified that 81.2% of the professionals have knowledge about the symptomatology of syphilis in pregnant women; 79.2% request the VDRL exam at the adequate intervals; and approximately 50% conduct the treatment in conformity with the recommendations of the Ministry of Health. Regarding the follow-up care of the infected woman, 79.2% request a monthly VDRL examination, 69.8% explain the disease to the pregnant woman, and 20.7% affirm that they provide a proper reception to the woman. We conclude that the majority of the health professionals have knowledge of the detection, treatment and follow-up of the pregnant woman with syphilis. However, the actions of some professionals diverge from the procedures recommended by the Ministry of Health regarding the requesting of examinations, medication prescription and notification of the disease. This indicates the need for improvement.
Abstract:
This study aimed to validate the contents of an Instrument for Nursing Consultation in the Home Visit of people with Spinal Cord Injury (INCEVDOP-LM), based on the Self-Care Deficit Theory. The methodological development study was conducted with spinal cord injured (SCI) people registered in the Family Health Units of the city of Natal/RN/Brazil, and with the nurses of these institutions. The study was conducted from January 2012 to January 2013 in two phases: the first aimed to identify the self-care needs of persons with SCI, and the second to develop and validate the INCEVDOP-LM. The first phase consisted of a census study of people with SCI living in Natal/RN. In the second phase, a non-probabilistic convenience sample of subjects was selected to form two groups: Group 1 comprised 73 adults with SCI diagnosed with paraplegia or tetraplegia, with preserved cognitive function, who were registered with a family health unit; Group 2 was composed of six experts, nurses with doctoral degrees, scientific experience in the area of technology development or care of persons with SCI, and publications in Qualis A2 journals. Data collection for the first phase was conducted through home visits to people with SCI, who responded to three instruments: Questionnaire I (comprising demographic and socioeconomic variables), the Competency Rating Scale for Self-care (ASA) and the Barthel Index (an instrument for the evaluation of functional capacity). The research for the second phase was conducted in two stages: I - construction of the INCEVDOP-LM; II - validation of the INCEVDOP-LM. The instrument and an evaluation form were forwarded to the experts for validation. Agreement between the responses was analyzed by the Kappa test, with accepted values > 0.75. The evaluation criteria were: organization, clarity, simplicity, readability, appropriateness of vocabulary, objectivity, accuracy, reliability and suitability; positive responses with frequencies ≥ 90% were considered excellent. The chi-square test was used to investigate differences between proportions. The study complied with the human research ethics principles of CNS Resolution 196/96. Results were reported by means of four articles derived from the study. The findings indicate that the items showing disagreement among the experts (k = 0.02) were the nursing diagnoses, interventions and evaluation items pertaining to the Nutrition, Hygiene, Elimination, Physical, Social and Psychological domains, and to the ability to perform work activities. Agreement among the experts was reported for the other items, with kappa ranging from 0.72 to 1. After removing the items with disagreement, all criteria achieved excellent rates and no significant differences were observed between the proportions of the experts' evaluation responses (p > 0.05). We conclude that the instrument is valid to serve as a guide for nurses to conduct a systematic consultation during the home visit to people with spinal cord injury, with emphasis on self-care. The instrument must go through further levels of validation when applied in the clinical setting.
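As an aside on the agreement analysis, the sketch below shows a pairwise kappa computation against the > 0.75 acceptance threshold cited above; the expert ratings are invented for illustration and are not the study's data.

```python
# Illustrative check of inter-expert agreement with Cohen's kappa, using the
# > 0.75 acceptance threshold mentioned above. The ratings below are made-up
# binary "adequate / not adequate" judgments, not the study's data.
from sklearn.metrics import cohen_kappa_score

expert_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
expert_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]

kappa = cohen_kappa_score(expert_a, expert_b)
print(f"kappa = {kappa:.2f} -> {'accepted' if kappa > 0.75 else 'needs revision'}")
```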
Abstract:
This research aimed to develop and validate an instrument to systematize nursing care for postpartum women in primary health care. The document was built on Horta's Theory of Basic Human Needs, on the standardization of an international nursing minimum data set, and on the nomenclature of nursing diagnoses and interventions developed from the results of the International Classification for Nursing Practice. This is a methodological study developed in five stages: identification of the empirical indicators related to postpartum women through an integrative literature review; evaluation of the empirical indicators and their relation to basic human needs by a focus group with five specialist nurses; structuring of the instrument through categorization of the indicators; validation of the form and content of the instrument by the experts, using the Delphi technique; and application and development of the nursing diagnosis statements and interventions. Data collection for the first stage took place from January to March 2013 in the Scopus, Cinahl, Pubmed and Cochrane databases and in the Journal of Midwifery and Women's Health. The second, third and fourth stages were carried out from May to October 2013. Twelve and seven experts participated in the first and second evaluation rounds, respectively. Experts were selected through the Plataforma Lattes according to the following inclusion criteria: being a nurse, a faculty member and a specialist in obstetric nursing. These professionals were contacted by email and, upon agreeing to participate in the research, signed an informed consent form. The research was approved by the Research Ethics Committee of the Universidade Federal do Rio Grande do Norte, under protocol no. 184.241 and Certificate of Presentation for Ethical Appraisal no. 11674112.3.0000.5537. Data from the first stage were analyzed using descriptive statistics, with the results presented in tables. In this stage, 97 empirical indicators were identified; when related to the basic human needs, 46 of them fell under psychobiological needs, 51 under psychosocial needs and 1 under psychospiritual needs. Regarding the second and third stages, the data went through a process of categorization and analysis using the Content Validity Index. The indicators obtained a validation index of 100%. In the postpartum assessment section, the items that were not validated were excluded from the instrument. The remaining items obtained indexes above 70%, and the instrument was therefore validated. The nursing consultation instrument consists of identification data of the postpartum woman, data for assessing her human needs, and nursing care items. In the final version, 73 nursing diagnoses and 155 nursing interventions were selected from the categorization of the empirical indicators validated in the second and third stages of the study. With the conclusion of the study, nurses will have an instrument for systematizing care for postpartum women in primary care. In addition, the document will serve as a tool for teaching and research in obstetric nursing.
Abstract:
The objective of the facility location problem with limited distances is to minimize the sum of distance functions from the facility to the customers, but with a limit on each distance, after which the corresponding function becomes constant. The problem has applications in situations where the service provided by the facility is insensitive beyond a given threshold distance (e.g., fire station location). In this work, we propose a global optimization algorithm for the case in which there are lower and upper limits on the number of customers that can be served.
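A small sketch of the limited-distance objective, under assumed data: each customer contributes its Euclidean distance to the facility only up to its own limit, after which the contribution is constant. The coarse grid search stands in for the global optimization algorithm proposed in the work, and the lower/upper limits on the number of served customers are not modeled here.

```python
# Objective for facility location with limited distances: each customer i with
# limit lambda_i contributes min(d(x, a_i), lambda_i) to the cost.
# The data and the brute-force grid minimization below are illustrative only.
import numpy as np

customers = np.array([[1.0, 2.0], [4.0, 0.5], [3.0, 3.0], [0.0, 0.0]])
limits = np.array([2.0, 2.0, 1.5, 2.5])      # per-customer distance limits

def limited_distance_cost(x, customers, limits):
    d = np.linalg.norm(customers - x, axis=1)
    return np.minimum(d, limits).sum()

# Coarse grid search as a stand-in for the global optimization algorithm.
grid = np.linspace(-1, 5, 121)
best = min(((limited_distance_cost(np.array([gx, gy]), customers, limits), gx, gy)
            for gx in grid for gy in grid))
print(f"approx. minimum cost {best[0]:.3f} at ({best[1]:.2f}, {best[2]:.2f})")
```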
Abstract:
This work presents a study on quality of health care, with a focus on appointment scheduling for consultations. The main purpose is to define a statistical model and propose a quality grade for the consultation waiting time, measured from the day the patient schedules the appointment to the day the consultation takes place. Reliability techniques and functions are used, whose main characteristic is the analysis of data regarding the time until the occurrence of a certain event. A random sample of 1,743 patients was gathered from the appointment system of a university hospital - the Hospital Universitário Onofre Lopes of the Federal University of Rio Grande do Norte, Brazil. The sample is randomly stratified by clinical specialty. The data were analyzed with parametric reliability methods, and the fitting of the regression model showed that the Weibull distribution best fits the data. The proposed quality grade is based on the PAHO criteria for a consultation appointment, and the results show that no clinic reached the PAHO quality grade. The proposed quality grade can be used to define priorities for improvement and as a criterion for quality control.
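A brief sketch of the reliability-style analysis described above, using synthetic waiting times in place of the hospital data: a Weibull distribution is fitted by maximum likelihood and its survival function gives the probability of waiting beyond a given number of days. The sample, parameters and the 30-day cut-off are illustrative assumptions.

```python
# Sketch of the reliability analysis: fit a Weibull distribution to appointment
# waiting times (days) and read off survival probabilities.
# The waiting times below are synthetic; the real study used 1,743 patients.
from scipy import stats

waiting_days = stats.weibull_min.rvs(c=1.5, scale=30, size=500, random_state=42)

# Maximum-likelihood fit with the location fixed at zero (waits start at 0 days).
shape, loc, scale = stats.weibull_min.fit(waiting_days, floc=0)
print(f"shape = {shape:.2f}, scale = {scale:.1f} days")

# Probability that a patient waits more than 30 days (survival function).
print(f"P(wait > 30 days) = {stats.weibull_min.sf(30, shape, loc, scale):.2f}")
```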
Abstract:
Knowledge management has received major attention from product designers because many of the activities within this process have to be creative and, therefore, depend basically on the knowledge of the people involved in the process. Moreover, the Product Development Process (PDP) is one of the activities in which knowledge management manifests itself most critically, given the intense application of knowledge it requires. As a consequence, this thesis analyzes knowledge management aiming to improve the PDP and also proposes a theoretical model of knowledge management. This model uses five steps (creation, maintenance, dissemination, utilization and discarding) and verifies the occurrence of four types of knowledge conversion (socialization, externalization, combination and internalization) in order to improve knowledge management in this process. The intellectual capital of Small and Medium Enterprises (SMEs), when managed efficiently and with the participation of all employees, becomes the mechanism of the knowledge creation and transfer processes, supporting and consequently improving the PDP. The expected result is an effective and efficient application of the proposed model for the creation of a knowledge base within an organization (organizational memory), aiming at a better performance of the PDP. To this end, an extensive analysis of knowledge management (an instrument of qualitative and subjective evaluation) was carried out within the Design department of a Brazilian organization (SEBRAE/RN). This analysis aimed to establish the state of the art of the Design department regarding the use of knowledge management, and was important in order to evaluate the department's level of evolution in the practical use of knowledge management before implementing the proposed theoretical model and its methodology. At the end of this work, based on the results of the diagnosis, a knowledge management system is suggested to facilitate knowledge sharing within the organization, in other words, the Design department.
Abstract:
The two-dimensional periodic structures called frequency selective surfaces (FSSs) have been well investigated because of their filtering properties. Similar to filters that work in the traditional radio-frequency bands, such structures can behave as band-stop or band-pass filters, depending on the elements of the array (patches or apertures, respectively), and can be used in a variety of applications, such as radomes, dichroic reflectors, waveguide filters, artificial magnetic conductors and microwave absorbers. To provide high-performance filtering properties at microwave bands, electromagnetic engineers have investigated various types of periodic structures: reconfigurable frequency selective screens, multilayered selective filters, as well as periodic arrays printed on anisotropic dielectric substrates and composed of fractal elements. In general, there is no closed-form solution leading directly from a given desired frequency response to a corresponding device; thus, the analysis of its scattering characteristics requires the application of rigorous full-wave techniques. Besides that, due to the computational complexity of using a full-wave simulator to evaluate the frequency selective surface scattering variables, many electromagnetic engineers still use a trial-and-error process until a given design criterion is achieved. As this procedure is very laborious and human dependent, optimization techniques are required to design practical periodic structures with the desired filter specifications. Some authors have employed neural networks and natural optimization algorithms, such as genetic algorithms and particle swarm optimization, for frequency selective surface design and optimization. The objective of this work is a rigorous study of the electromagnetic behavior of periodic structures, enabling the design of efficient devices for the microwave band. For this, artificial neural networks are used together with natural optimization techniques, allowing the accurate and efficient investigation of various types of frequency selective surfaces in a simple and fast manner, becoming a powerful tool for the design and optimization of such structures.
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that such problems cannot be solved in polynomial time. Initially, these solutions were focused on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an 'Operon' heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology - Cluster Analysis and Principal Component Analysis; and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an 'intelligent' search in the space of solutions. The approach is applied to the Traveling Salesman Problem (TSP) through a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the individuals' fitness function, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is accomplished through Logistic Regression, based on the probability of the tested algorithm finding an optimal solution for a TSP instance. The second is accomplished through Survival Analysis, based on the probability of the execution time observed until an optimal solution is achieved. The third is accomplished by means of a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), given by the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted, applied to sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison to the three algorithms adopted. For these sixty-one instances, it was concluded on the grounds of statistical tests that there is evidence that ProtoG performs better than these three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which the performance of the algorithms was evaluated through the PES, it was observed that the average PES obtained with ProtoG was less than 1% in almost half of these instances, reaching its greatest average, equal to 3.52%, for one instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
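For reference, the Percent Error of the Solution (PES) used in the third analysis can be computed as sketched below; the tour lengths and best-known value are made-up numbers for illustration.

```python
# Percent Error of the Solution (PES): how far (in %) a tour found by an
# algorithm exceeds the best known solution. Values below are illustrative.
def pes(found_cost, best_known_cost):
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

runs = [21294, 21315, 21282, 21401]       # hypothetical ProtoG tour lengths
best_known = 21282                        # hypothetical best value in the literature
average_pes = sum(pes(c, best_known) for c in runs) / len(runs)
print(f"average PES = {average_pes:.2f}%")
```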
Abstract:
This work presents a set of intelligent algorithms with the purpose of correcting calibration errors in sensors and reducing the periodicity of their calibrations. Such algorithms were designed using Artificial Neural Networks due to their great capacity for learning, adaptation and function approximation. Two approaches are presented. The first one uses Multilayer Perceptron networks to approximate the various shapes of the calibration curve of a sensor that drifts out of calibration at different points in time. This approach requires knowledge of the sensor's operating time, but this information is not always available. To overcome this need, another approach using Recurrent Neural Networks was proposed. Recurrent Neural Networks have a great capacity for learning the dynamics of the system on which they were trained, so they can learn the dynamics of a sensor's calibration drift. Knowing the sensor's operating time or its drift dynamics, it is possible to determine how far out of calibration a sensor is and to correct its measured value, thus providing a more accurate measurement. The algorithms proposed in this work can be implemented in a Foundation Fieldbus industrial network environment, which offers good device programming capabilities through its function blocks, making it possible to apply them to the measurement process.
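A minimal sketch of the first approach, under assumed data: a Multilayer Perceptron learns the mapping from raw reading and operating time to the true value, so that drifted measurements can be corrected. The drift model, network size and training data are illustrative assumptions, not the sensors or configuration of the work.

```python
# Sketch of the first approach: an MLP learns the mapping from (raw reading,
# operating time) to the true value, so drifted readings can be corrected.
# The drift model (gain decaying with time) and network size are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
true_value = rng.uniform(0, 10, 2000)
op_time = rng.uniform(0, 1000, 2000)                 # hours since last calibration
raw_reading = true_value * (1 - 0.0002 * op_time)    # simulated calibration drift

X = np.column_stack([raw_reading, op_time])
model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
model.fit(X, true_value)

# Correct a drifted measurement taken 800 hours after calibration.
corrected = model.predict([[5.0 * (1 - 0.0002 * 800), 800.0]])
print(f"corrected reading: {corrected[0]:.2f} (expected ~5.0)")
```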
Abstract:
In recent years, the predictive control technique has attracted a growing number of adherents, owing to the ease of tuning its parameters, the extension of its concepts to multi-input/multi-output (MIMO) systems, the fact that nonlinear process models can be linearized around an operating point and then used directly in the controller, and, mainly, to its being the only methodology that can take into account, during controller design, the limitations of the control signals and of the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is yet another alternative among the existing predictive controllers. It is a modification of the generalized predictive control (GPC) that uses a reference model, calculated according to design parameters previously established by the designer, and a new criterion function which, when minimized, yields the best parameters for the controller. Genetic algorithms are used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through the application of performance, stability and robustness criteria. To compare the results achieved by the TGPC controller, the GPC and proportional-integral-derivative (PID) controllers are used, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out using MATLAB. It is verified that the modifications implemented in the TGPC demonstrate the efficiency of this algorithm.
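As a rough illustration of the criterion-minimization step, the sketch below tunes PI controller gains by a small evolutionary search on a simulated first-order plant; the plant, criterion and search settings are assumptions, and this is not the TGPC formulation itself.

```python
# Sketch of the criterion-minimization step: a small evolutionary search (a
# stand-in for the genetic algorithm used in the work) tunes PI controller
# gains by minimizing a squared-error criterion on a simulated first-order
# plant. Plant model, horizon and search settings are illustrative.
import numpy as np

rng = np.random.default_rng(5)

def criterion(gains, steps=200, dt=0.05):
    """Integral of squared tracking error for a PI loop on the plant y' = -y + u."""
    kp, ki = gains
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = 1.0 - y                 # unit step reference
        integ += err * dt
        u = kp * err + ki * integ
        y += dt * (-y + u)            # Euler step of the plant
        cost += err ** 2 * dt
    return cost

# Tiny (mu + lambda) evolutionary loop over the two gains.
pop = rng.uniform(0, 5, (20, 2))
for _ in range(40):
    children = np.clip(pop + rng.normal(0, 0.3, pop.shape), 0, 5)
    both = np.vstack([pop, children])
    pop = both[np.argsort([criterion(g) for g in both])][:20]   # keep the best 20
print("best gains (kp, ki):", np.round(pop[0], 2), "criterion:", round(criterion(pop[0]), 4))
```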
Abstract:
Metaheuristic techniques are known for solving optimization problems classified as NP-complete and are successful in obtaining good-quality solutions. They use non-deterministic approaches to generate solutions that are close to the optimum, without the guarantee of finding the global optimum. Motivated by the difficulties in solving these problems, this work proposes the development of parallel hybrid methods using reinforcement learning and the metaheuristics GRASP and Genetic Algorithms. With the use of these techniques, we aim to contribute to improved efficiency in obtaining good solutions. Instead of using the Q-learning reinforcement learning algorithm merely as a technique for generating the initial solutions of the metaheuristics, we use it in a cooperative and competitive approach with the Genetic Algorithm and GRASP, in a parallel implementation. In this context, it was possible to verify that the implementations in this study showed satisfactory results in both strategies, that is, in cooperation and competition between the algorithms and in cooperation and competition between groups. In some instances the global optimum was found; in others, these implementations came close to it. A performance analysis of the proposed approach was carried out, showing good results for the requirements that demonstrate the efficiency and speedup (gain in speed with parallel processing) of the implementations.
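A compact sketch of the kind of role Q-learning plays in this scheme: learning, by reinforcement, which city to visit next in a small TSP so that the resulting tours can seed, cooperate with, or compete against GRASP and Genetic Algorithm solutions. The instance, parameters and greedy policy extraction are illustrative assumptions.

```python
# Minimal sketch of Q-learning constructing TSP tours that could seed (or
# compete with) GRASP and Genetic Algorithm populations. The instance and
# parameters below are illustrative, not the configurations of the work.
import numpy as np

rng = np.random.default_rng(1)
n = 6
coords = rng.uniform(0, 10, (n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)

Q = np.zeros((n, n))
alpha, gamma, epsilon = 0.1, 0.9, 0.2

for _ in range(2000):                      # Q-learning episodes
    current, visited = 0, {0}
    while len(visited) < n:
        choices = [c for c in range(n) if c not in visited]
        if rng.random() < epsilon:
            nxt = rng.choice(choices)
        else:
            nxt = max(choices, key=lambda c: Q[current, c])
        reward = -dist[current, nxt]       # shorter hops -> larger reward
        Q[current, nxt] += alpha * (reward + gamma * Q[nxt].max() - Q[current, nxt])
        current, visited = nxt, visited | {nxt}

# Greedy tour from the learned Q-table, usable as an initial solution.
tour, visited = [0], {0}
while len(visited) < n:
    nxt = max((c for c in range(n) if c not in visited), key=lambda c: Q[tour[-1], c])
    tour.append(nxt)
    visited.add(nxt)
print("Q-learning tour:", tour, "length:", round(dist[tour, tour[1:] + [0]].sum(), 2))
```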
Abstract:
In this work, the Markov chain is the tool used to model and analyze the convergence of the genetic algorithm, both in its standard version and in the other versions the genetic algorithm allows. In addition, we intend to compare the performance of the standard version with the fuzzy version, believing that the latter gives the genetic algorithm the strong ability to find a global optimum that is characteristic of global optimization algorithms. The choice of this algorithm is due to the fact that, over the past thirty years, it has become one of the most important tools used to find solutions to optimization problems, and to its effectiveness in finding good-quality solutions, which are acceptable given that there may be no other algorithm able to obtain the optimal solution for many of these problems. The algorithm can, however, be configured in many ways: its behavior depends not only on how the problem is represented and how some of its operators are defined, but also ranges from the standard version, in which the parameters are kept fixed, to versions with variable parameters. Therefore, to achieve good performance with this algorithm, an adequate criterion is needed for choosing its parameters, especially the mutation rate, the crossover rate and even the population size. It is important to remember that, in implementations in which the parameters are kept fixed throughout the execution, modeling the algorithm by a Markov chain results in a homogeneous chain, whereas when the parameters are allowed to vary during the execution, the modeling Markov chain becomes non-homogeneous. Therefore, in an attempt to improve the algorithm's performance, a few studies have tried to set the parameters through strategies that capture the intrinsic characteristics of the problem. These characteristics are extracted from the current state of the execution, in order to identify and preserve patterns related to good-quality solutions while discarding low-quality patterns. Strategies for feature extraction can use either precise or fuzzy techniques, the latter implemented through a fuzzy controller. A Markov chain is used for the modeling and convergence analysis of the algorithm, both in its standard version and in the others. In order to evaluate the performance of the non-homogeneous algorithm, tests are applied comparing the standard genetic algorithm with the fuzzy genetic algorithm, in which the mutation rate is adjusted by a fuzzy controller. To do so, optimization problems whose number of solutions grows exponentially with the number of variables are chosen.
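A minimal sketch of the non-homogeneous idea, under assumed settings: a genetic algorithm on a toy binary problem whose mutation rate is adjusted during the run from the population's state. A crude diversity rule stands in here for the fuzzy controller; the problem, population size and adaptation rule are illustrative.

```python
# Sketch of a non-homogeneous genetic algorithm: the mutation rate is adapted
# during the run from the population's state (a crude diversity rule stands in
# for the fuzzy controller). Problem (OneMax) and settings are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n_bits, pop_size, generations = 40, 30, 100
pop = rng.integers(0, 2, (pop_size, n_bits))

def fitness(pop):
    return pop.sum(axis=1)                    # OneMax: count of ones

for g in range(generations):
    fit = fitness(pop)
    # "Fuzzy-like" adaptation: low diversity -> raise mutation, high -> lower it.
    diversity = pop.std(axis=0).mean()
    mutation_rate = 0.01 if diversity > 0.3 else 0.05

    # Tournament selection, one-point crossover, bit-flip mutation.
    parents = pop[[max(rng.choice(pop_size, 2, replace=False), key=lambda i: fit[i])
                   for _ in range(pop_size)]]
    cut = rng.integers(1, n_bits, pop_size // 2)
    children = parents.copy()
    for k, c in enumerate(cut):
        children[2 * k, c:], children[2 * k + 1, c:] = \
            parents[2 * k + 1, c:].copy(), parents[2 * k, c:].copy()
    flips = rng.random(children.shape) < mutation_rate
    pop = np.where(flips, 1 - children, children)

print("best fitness after run:", fitness(pop).max(), "of", n_bits)
```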
Abstract:
This thesis describes design methodologies for frequency selective surfaces (FSSs) composed of periodic arrays of pre-fractal metallic patches on single-layer dielectrics (FR4, RT/duroid). Shapes presented by Sierpinski island and T fractal geometries are exploited for the simple design of efficient band-stop spatial filters with applications in the microwave range. Initial results are discussed in terms of the electromagnetic effect resulting from the variation of parameters such as fractal iteration number (or fractal level), fractal iteration factor, and FSS periodicity, depending on the pre-fractal element used (Sierpinski island or T fractal). The transmission properties of the proposed periodic arrays are investigated through simulations performed with the Ansoft Designer™ and Ansoft HFSS™ commercial software packages, which run full-wave methods. To validate the employed methodology, FSS prototypes are selected for fabrication and measurement. The obtained results point to interesting features of FSS spatial filters: compactness, with high values of the frequency compression factor, as well as stable frequency responses under oblique incidence of plane waves. This thesis also approaches, as its main focus, the application of an alternative electromagnetic (EM) optimization technique for the analysis and synthesis of FSSs with fractal motifs. In application examples of this technique, Vicsek and Sierpinski pre-fractal elements are used in the optimal design of FSS structures. Based on computational intelligence tools, the proposed technique overcomes the high computational cost associated with full-wave parametric analyses. To this end, fast and accurate multilayer perceptron (MLP) neural network models are developed using different parameters as design input variables. These neural network models are used to calculate the cost function in the iterations of population-based search algorithms. A continuous genetic algorithm (GA), particle swarm optimization (PSO), and the bees algorithm (BA) are used for FSS optimization with specified resonant frequency and bandwidth. The performance of these algorithms is compared in terms of computational cost and numerical convergence. Consistent results are verified by the excellent agreement obtained between simulations and measurements of FSS prototypes built with a given fractal iteration.
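A small sketch of the surrogate-assisted optimization described above, under assumed data: an MLP model trained on (design parameters, resonant frequency) pairs replaces the costly full-wave simulation inside a population-based search. The toy formula generating the data, the parameter ranges, and the use of differential evolution (standing in for the GA/PSO/BA of the thesis) are all assumptions.

```python
# Sketch of surrogate-assisted FSS optimization: an MLP model replaces costly
# full-wave simulations inside a population-based search. The "dataset" below
# is generated from a toy formula (patch length and FSS periodicity -> resonant
# frequency); real data would come from full-wave runs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import differential_evolution

rng = np.random.default_rng(3)
length_mm = rng.uniform(5, 15, 400)        # patch length
period_mm = rng.uniform(10, 25, 400)       # array periodicity
f_res_ghz = 150.0 / length_mm + 0.05 * period_mm   # toy stand-in for full-wave data

X = np.column_stack([length_mm, period_mm])
surrogate = MLPRegressor(hidden_layer_sizes=(30, 30), max_iter=3000, random_state=3)
surrogate.fit(X, f_res_ghz)

target_ghz = 12.0
def cost(params):
    # The population-based search evaluates the cheap surrogate, not the simulator.
    return abs(surrogate.predict([params])[0] - target_ghz)

result = differential_evolution(cost, bounds=[(5, 15), (10, 25)], seed=3, maxiter=50)
print("candidate design (length, period):", np.round(result.x, 2),
      "predicted f_res:", round(surrogate.predict([result.x])[0], 2), "GHz")
```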