1000 results for Diagnostic algorithms


Relevance:

20.00%

Publisher:

Abstract:

To analyze the statistical association between nursing diagnoses identified in kidney transplant patients and their socioeconomic factors. Cross-sectional study carried out at a university hospital in Fortaleza, Ceará, Brazil, between December 2004 and April 2005. Thirty-eight nursing diagnoses were identified. Ten diagnoses had frequencies above the 75th percentile: Risk for infection; Disturbed sensory perception: visual; Disturbed sleep pattern; Imbalanced nutrition: more than body requirements; Fatigue; Sexual dysfunction; Disturbed sensory perception: auditory; Acute pain; Ineffective sexuality patterns; Risk for imbalanced nutrition: more than body requirements. The study of nursing diagnoses helped to expand knowledge of the patients' reality, which is necessary for establishing nursing care for kidney transplant recipients, as well as for addressing the outcomes to be developed with the aim of improving quality of life.

Relevance:

20.00%

Publisher:

Abstract:

This work supports the improvement of the formal education of Brazilian architects and urban designers through a better application and systematization of the teaching potential of computer science. The research analyzes the introduction of the discipline Computer Science Applied to Architecture and Urbanism (InfoAU) in the Architecture and Urbanism Courses of Brazil (CAUs), from the point at which the discipline became mandatory under MEC Regulation no. 1,770 of 1994 until CNE/CES Resolution no. 6 of 2006, covering its implementation, development and current context. For greater detail, the research examines three federal universities in the Northeast of the country: UFRN, UFPB and UFPE. Once the historical educational needs of the CAUs are identified, the study focuses on the growing potential of computer science as an agent of integration, communication, development and knowledge/data management. It points out new social perspectives for a better use of this tool, which, when adequately structured and integrated, creates favorable conditions for education and for professional practice and retraining, and becomes a driving instrument for research and extension activities. The work suggests aggregating elements complementary to the reorganization of the InfoAU discipline through a computerization plan for the CAUs, extended to professional training, generating a virtuous cycle across academic, administrative, research and extension activities. In summary, InfoAU was analyzed in the context of the Brazilian CAUs; the main problems and solutions found were systematized; the possibilities for the use of computer science within Architecture and Urbanism were structured; and an improvement plan for the InfoAU discipline was defined, together with strategies for implementing the computerization plan for the CAUs that would guarantee its maintenance over time.

Relevance:

20.00%

Publisher:

Abstract:

This master's dissertation presents the study and implementation of intelligent algorithms to monitor the measurements of sensors involved in natural gas custody transfer processes. To create these algorithms, Artificial Neural Networks are investigated because of their particular properties, such as learning, adaptation and prediction. A neural predictor is developed to reproduce the dynamic behavior of the sensor output, so that its output can be compared with the real sensor output. A recurrent neural network is used for this purpose because of its ability to deal with dynamic information. The real sensor output and the estimated predictor output form the basis for possible sensor fault detection and diagnosis strategies. Two competitive neural network architectures are investigated and their capabilities are used to classify different kinds of faults. The prediction algorithm and the fault detection and classification strategies, as well as the obtained results, are presented.
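
The abstract describes comparing a predictor's estimate of the sensor output with the real output as the basis for fault detection. The sketch below illustrates only that residual-comparison idea under simplifying assumptions: the recurrent neural network of the dissertation is replaced by a plain linear autoregressive predictor fitted by least squares on synthetic healthy data, and the fault is an injected bias.

```python
import numpy as np

def fit_ar_predictor(signal, order=5):
    """Least-squares fit of an AR(order) one-step-ahead predictor."""
    X = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def residual_alarms(signal, coeffs, threshold):
    """Flag samples whose one-step prediction residual exceeds the threshold."""
    order = len(coeffs)
    X = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
    predicted = X @ coeffs
    residual = np.abs(signal[order:] - predicted)
    return residual > threshold

# Train on healthy data, then monitor a stream with an injected bias fault.
rng = np.random.default_rng(0)
t = np.linspace(0, 20, 2000)
healthy = np.sin(t) + 0.02 * rng.standard_normal(t.size)
coeffs = fit_ar_predictor(healthy)

faulty = healthy.copy()
faulty[1200:] += 0.5                      # abrupt bias fault at sample 1200
alarms = residual_alarms(faulty, coeffs, threshold=0.2)
print("first alarm at sample", int(np.argmax(alarms)) + len(coeffs))
```

In the dissertation's setting, the predictor would be the trained recurrent network and the classification of the fault type would follow from the competitive networks; here only the detection threshold on the residual is shown.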

Relevance:

20.00%

Publisher:

Abstract:

The metabolic syndrome (MetS) involves a group of risk factors and is associated with a significantly higher risk of developing cardiovascular diseases (CVD) and type 2 diabetes. Recent studies have shown the importance of preventing CVD through early diagnosis and treatment of patients with MetS. The objective of our study was to determine the prevalence of MetS according to different diagnostic criteria in postmenopausal women and to analyze the influence of socioeconomic factors on cardiovascular risk in this sample of the population. A cross-sectional study involving 127 postmenopausal women (45 to 64 years) from Natal and Mossoró, Brazil. The study was approved by the Research Ethics Committee of the Federal University of Rio Grande do Norte. The experimental protocol consisted of a structured interview, clinical examination and laboratory blood tests. The diagnosis of MetS was based on the NCEP-ATP III (National Cholesterol Education Program-Adult Treatment Panel III) and IDF (International Diabetes Federation) criteria. The research was carried out with the participation of an interdisciplinary team in its several phases. The sample studied had a mean age of 53.9 ± 4.6 years and a per capita income of 54.5 dollars. The prevalence of MetS, according to the NCEP-ATP III and IDF criteria, was 52.8% and 61.4%, respectively. The agreement rate between the NCEP-ATP III and IDF criteria was 81.9%, with a kappa value of 0.63 (95% CI, 0.49-0.76), indicating good agreement between the two definitions. The most prevalent cardiovascular risk factor was HDL < 50 mg/dl, observed in 96.1% of the women analyzed, followed by increased waist circumference (≥ 80 cm) in 78.0%, elevated blood pressure in 51.2%, triglycerides ≥ 150 mg/dl in 40.9% and glycemia ≥ 100 mg/dl in 37.0% of the women. The occurrence of MetS was significantly associated with schooling and body mass index (BMI). High blood pressure was significantly associated with low family income, low schooling and weight gain. There was no significant association between the intensity of climacteric symptoms and the occurrence of MetS. The conclusions of the research were that MetS and its individual components show a high prevalence in postmenopausal Brazilian women, with significant associations with weight gain and low socioeconomic indicators. The data point to the need for an interdisciplinary approach at the primary health care level, directed toward the early identification of risk factors and the promotion of cardiovascular health in climacteric women.
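
The abstract reports an 81.9% agreement rate and a kappa of 0.63 between the NCEP-ATP III and IDF criteria. The short sketch below shows how raw agreement and Cohen's kappa are computed from a generic 2x2 classification table; the counts used are purely illustrative, not the study's data.

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 table:
                 criterion B +   criterion B -
    criterion A +     a               b
    criterion A -     c               d
    """
    n = a + b + c + d
    observed = (a + d) / n                      # raw agreement rate
    p_yes = ((a + b) / n) * ((a + c) / n)       # chance agreement on "MetS"
    p_no = ((c + d) / n) * ((b + d) / n)        # chance agreement on "no MetS"
    expected = p_yes + p_no
    return observed, (observed - expected) / (1 - expected)

# Purely illustrative counts (not the study's data).
obs, kappa = cohens_kappa(a=40, b=10, c=8, d=42)
print(f"agreement = {obs:.1%}, kappa = {kappa:.2f}")
```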

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

The characterization of nursing diagnoses in prostatectomized patients is important to provide a unified nursing language, facilitating communication between professionals and patients. The objective of this study was to analyze the nursing diagnoses of patients in the immediate postoperative period of prostatectomy. This is a cross-sectional, descriptive study, developed at the surgical clinic of the Onofre Lopes University Hospital, in Natal, RN, Brazil. The sample comprised 50 patients included according to the following criteria: diagnosis of benign prostatic hyperplasia or prostate cancer, having undergone prostate surgery at the mentioned hospital, and being in the immediate postoperative period at the moment of data collection. The exclusion criteria were: not being in an adequate physical and mental condition, or presenting cerebrovascular disease, lung disease, advanced liver disease, heart disease or extensive coronary artery disease. The data collection instruments were an interview script and physical examination. Data were collected between November 2010 and April 2011 and organized in two phases: the diagnostic process and the construction of the database. The project was approved by the Ethics Committee of the Federal University of Rio Grande do Norte. The results showed that most patients came from the countryside, lived with partners, had a mean age of 67.78 years, were pensioners, had low schooling, were Catholic and often did not undergo preventive examinations for prostatic disease. The patients showed an average of 9.48 nursing diagnoses, 21.70 defining characteristics and 20.72 related or risk factors per patient. We identified 30 nursing diagnoses, of which 7 were above the 75th percentile: Risk for falls; Impaired walking; Risk for infection; Bathing/hygiene and dressing self-care deficit; and Risk for deficient fluid volume. The top six nursing diagnoses were present in all patients, so no statistical test could be applied to them. The other nursing diagnoses were associated with their defining characteristics and related or risk factors. We conclude that the nursing diagnoses identified in this study contribute to the advancement of nursing care for prostatectomized patients in the postoperative period, allowing the deployment of nursing actions for the effective resolution of the identified problems.

Relevance:

20.00%

Publisher:

Abstract:

Nurses in hemodialysis units play an important role in implementing the nursing process within a theoretical framework. Among the nursing theories, Roy's adaptation model stands out: it considers the person as a holistic adaptive system and aims to adapt clients to different living conditions. It is therefore believed that Roy's nursing process can guide nursing care for patients on dialysis. The study aimed to analyze the nursing diagnoses present in patients with chronic kidney disease on hemodialysis, based on Roy's theoretical model and NANDA-International. Descriptive, cross-sectional study, performed at a dialysis center in a city in northeastern Brazil, with a sample of 178 patients obtained by consecutive convenience sampling. Data collection occurred from October 2011 to February 2012, through interview and physical examination forms. Data analysis began with clinical reasoning, diagnostic judgment and similarity relations. The data were then entered into SPSS, version 16.0, generating descriptive statistics. The project was approved by the Research Ethics Committee (protocol no. 115/11; Certificate of Presentation for Ethical Appraisal no. 0139.0.051.000-111) and was funded by the MCT/CNPq Universal Call 14/2010. The results revealed that most patients were male (52.2%), married (62.9%) and residents of the metropolitan region of Natal (54.5%). The mean age was 46.6 years and the mean schooling 8.5 years. Regarding nursing diagnoses, there was an average of 6.6 per patient, notably: Risk for infection (100%), Excess fluid volume (99.4%) and Hypothermia (61.8%). The adaptive problems averaged 6.4 per patient, the most common being intracellular fluid retention (99.4%), hyperkalemia (64.6%), hypothermia (61.8%) and edema (53.9%). Twenty similarity relations were established between the NANDA-International nursing diagnoses and Roy's adaptive problems, namely: Risk for falls / Risk for injury and potential for injury; Impaired physical mobility and restricted mobility and/or coordination in walking; Dressing self-care deficit and loss of self-care ability; Hypothermia and hypothermia; Impaired skin integrity and impaired skin integrity; Excess fluid volume and intracellular fluid retention / hyperkalemia / hypocalcemia / edema; Imbalanced nutrition: less than body requirements and nutrition less than the body's needs; Constipation and constipation; Acute pain and acute pain; Chronic pain and chronic pain; Disturbed sensory perception: visual, tactile and auditory and deficit of a primary sense: sight, hearing and touch; Sleep deprivation and insomnia; Fatigue and activity intolerance; Ineffective self-health management and failure in role performance; Sexual dysfunction and sexual dysfunction; Situational low self-esteem and low self-esteem; and Diarrhea and diarrhea. We conclude that there is similarity between the typologies, although an analysis of the models was required, because they establish nursing diagnoses in different ways. Moreover, using the nursing process within the context of a theory and a classification system supports care and contributes to the strengthening of nursing science.

Relevance:

20.00%

Publisher:

Abstract:

The objective in the facility location problem with limited distances is to minimize the sum of distance functions from the facility to the customers, but with a limit on each distance, after which the corresponding function becomes constant. The problem has applications in situations where the service provided by the facility is insensitive beyond a given threshold distance (e.g., fire station location). In this work, we propose a global optimization algorithm for the case in which there are lower and upper limits on the number of customers that can be served.
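
The objective described here, in which each customer's contribution saturates once its threshold distance is exceeded, can be written as the sum over customers of min(d(x, a_i), λ_i). The sketch below only evaluates that objective and runs a naive multistart random search; it does not implement the thesis's global optimization algorithm or the lower/upper limits on the number of served customers, and all coordinates and thresholds are illustrative.

```python
import numpy as np

def limited_distance_cost(x, customers, limits):
    """Sum of min(||x - a_i||, lambda_i): cost saturates beyond each threshold."""
    d = np.linalg.norm(customers - x, axis=1)
    return np.sum(np.minimum(d, limits))

def multistart_search(customers, limits, n_starts=200, seed=0):
    """Naive baseline: evaluate the objective at random candidate locations."""
    rng = np.random.default_rng(seed)
    lo, hi = customers.min(axis=0), customers.max(axis=0)
    best_x, best_c = None, np.inf
    for _ in range(n_starts):
        x = rng.uniform(lo, hi)
        c = limited_distance_cost(x, customers, limits)
        if c < best_c:
            best_x, best_c = x, c
    return best_x, best_c

rng = np.random.default_rng(1)
customers = rng.uniform(0, 10, size=(30, 2))     # 30 illustrative customer locations
limits = np.full(30, 4.0)                        # threshold distance per customer
x, cost = multistart_search(customers, limits)
print("best facility location:", x, "cost:", cost)
```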

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Publisher:

Abstract:

Bidimensional periodic structures called frequency selective surfaces have been widely investigated because of their filtering properties. Similarly to filters operating in the traditional radiofrequency bands, such structures can behave as band-stop or band-pass filters, depending on the elements of the array (patch or aperture, respectively), and can be used in a variety of applications, such as radomes, dichroic reflectors, waveguide filters, artificial magnetic conductors and microwave absorbers. To provide high-performance filtering at microwave bands, electromagnetic engineers have investigated various types of periodic structures: reconfigurable frequency selective screens, multilayered selective filters, and periodic arrays printed on anisotropic dielectric substrates or composed of fractal elements. In general, there is no closed-form path from a desired frequency response to the corresponding device; the analysis of its scattering characteristics requires rigorous full-wave techniques. Moreover, because of the computational cost of evaluating the scattering variables of a frequency selective surface with a full-wave simulator, many electromagnetic engineers still resort to trial and error until a given design criterion is achieved. As this procedure is laborious and highly dependent on the designer, optimization techniques are required to design practical periodic structures with the desired filter specifications. Some authors have employed neural networks and natural optimization algorithms, such as genetic algorithms and particle swarm optimization, for frequency selective surface design and optimization. The objective of this work is a rigorous study of the electromagnetic behavior of periodic structures, enabling the design of efficient devices for microwave bands. To this end, artificial neural networks are used together with natural optimization techniques, allowing the accurate and efficient investigation of various types of frequency selective surfaces in a simple and fast manner, becoming a powerful tool for the design and optimization of such structures.
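
The abstract describes combining artificial neural networks with natural optimization (genetic algorithms, particle swarm optimization) to design frequency selective surfaces. The sketch below shows only the optimization loop under stated assumptions: a placeholder function stands in for the ANN surrogate that would map geometry to frequency response, and a basic particle swarm searches for a patch length and period matching a target resonance. All dimensions and the surrogate relationship are fictitious.

```python
import numpy as np

def surrogate_resonance(design):
    # Placeholder for the ANN surrogate (purely illustrative relationship).
    patch_len, period = design
    return 30.0 / patch_len + 2.0 / period      # fictitious resonance in "GHz"

def objective(design, target_ghz=10.0):
    """Squared mismatch between predicted and target resonant frequency."""
    return (surrogate_resonance(design) - target_ghz) ** 2

def particle_swarm(objective, bounds, n_particles=30, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, objective(gbest)

bounds = (np.array([2.0, 5.0]), np.array([8.0, 15.0]))   # mm, illustrative ranges
best, err = particle_swarm(objective, bounds)
print("patch length, period:", best, "squared error:", err)
```

In the thesis, the surrogate would be a neural network trained on full-wave simulation data, which is what makes the repeated objective evaluations inside the swarm loop cheap.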

Relevance:

20.00%

Publisher:

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they cannot be solved in polynomial time. Initially, these solutions were based on heuristics; currently, metaheuristics are used more often for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of an "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, relying mainly on statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses that are adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search in the space of solutions. The Traveling Salesman Problem (TSP) is the target application of a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered when the coefficient of variation of the individuals' fitness function, computed over the population, falls below a minimum threshold. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the distribution of the time observed until an optimal solution is reached. The third uses non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm, in an attempt to improve its performance. The last four evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through PES, the average PES obtained with ProtoG was less than 1% in almost half of the instances, reaching its largest average, 3.52%, for an instance with 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare in the literature to find reported average PES values greater than 10% for instances of this size.
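
The abstract defines PES as the percentage by which the solution found exceeds the best solution in the literature. The short sketch below just encodes that definition; the numbers are illustrative values, not results from the thesis.

```python
def percent_error_of_solution(found_cost, best_known_cost):
    """PES: percentage by which the found tour exceeds the best known tour."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

# Illustrative values only (not results from the thesis).
print(percent_error_of_solution(found_cost=21650.0, best_known_cost=21282.0))
# -> about 1.73, i.e. the tour found is 1.73% longer than the best known tour.
```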

Relevance:

20.00%

Publisher:

Abstract:

This work presents a set of intelligent algorithms whose purpose is to correct calibration errors in sensors and to reduce the frequency with which they must be recalibrated. The algorithms were designed using Artificial Neural Networks because of their great capacity for learning, adaptation and function approximation. Two approaches are shown. The first uses Multilayer Perceptron networks to approximate the various shapes of the calibration curve of a sensor that loses calibration at different points in time. This approach requires knowledge of the sensor's operating time, but that information is not always available. To overcome this requirement, another approach using Recurrent Neural Networks is proposed. Recurrent Neural Networks have a great capacity for learning the dynamics of the system on which they are trained, so they can learn the dynamics of a sensor's loss of calibration. Knowing the sensor's operating time or its drift dynamics, it is possible to determine how far out of calibration a sensor is and to correct its measured value, providing a more exact measurement. The algorithms proposed in this work can be implemented in a Foundation Fieldbus industrial network environment, whose function blocks offer good device-programming capabilities, making it possible to apply the algorithms to the measurement process.
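
The abstract describes estimating how far a sensor has drifted (from its operating time, or from its drift dynamics) and correcting the measured value. The sketch below keeps only the correction step under simplifying assumptions: a polynomial fitted to (operating time, observed bias) pairs stands in for the learned neural drift model, and the bias values are illustrative.

```python
import numpy as np

# Illustrative drift record: observed calibration bias at known operating times.
operating_hours = np.array([0, 500, 1000, 1500, 2000, 2500], dtype=float)
observed_bias = np.array([0.00, 0.05, 0.11, 0.18, 0.26, 0.35])

# Stand-in for the neural drift model: a quadratic fitted to the record.
drift_model = np.polynomial.Polynomial.fit(operating_hours, observed_bias, deg=2)

def corrected_reading(raw_value, hours_in_service):
    """Subtract the estimated drift at the sensor's current operating time."""
    return raw_value - drift_model(hours_in_service)

print(corrected_reading(raw_value=10.20, hours_in_service=1800.0))
```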

Relevance:

20.00%

Publisher:

Abstract:

In recent years, the predictive control technique has gained a growing number of adherents because of the ease of adjusting its parameters, the extension of its concepts to multi-input/multi-output (MIMO) systems, the fact that nonlinear process models can be linearized around an operating point and thus used directly in the controller, and, mainly, because it is the only methodology that can take into account, during controller design, the limitations of the control signals and of the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the existing predictive controllers. It is a modification of generalized predictive control (GPC) in which a reference model, calculated according to design parameters previously established by the designer, is used together with a new criterion function which, when minimized, yields the best parameters for the controller. Genetic algorithms are used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through performance, stability and robustness criteria. To compare the results achieved with the TGPC controller, GPC and proportional-integral-derivative (PID) controllers are also used, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out in MATLAB. It is verified that the modifications implemented in the TGPC demonstrate the efficiency of this algorithm.
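
In the thesis, a genetic algorithm minimizes the TGPC criterion function to obtain the controller parameters. The hedged sketch below shows the same mechanism on a deliberately simpler stand-in problem: a real-coded genetic algorithm tunes PI gains for a toy first-order discrete plant by minimizing the sum of squared tracking errors. The plant, cost and operator settings are illustrative assumptions, not the TGPC formulation.

```python
import numpy as np

def tracking_cost(gains, steps=100, setpoint=1.0):
    """Sum of squared tracking errors for a PI loop on a toy first-order plant."""
    kp, ki = gains
    y, integral, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        error = setpoint - y
        integral += error
        u = kp * error + ki * integral          # PI control law
        y = 0.9 * y + 0.1 * u                   # toy first-order plant
        cost += error ** 2
    return cost

def genetic_minimize(cost, bounds, pop_size=40, generations=60, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds[0]), np.array(bounds[1])
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = 0.5 * (a + b)                              # blend crossover
            child += rng.normal(0.0, 0.05 * (hi - lo))         # Gaussian mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, children])
    fitness = np.array([cost(ind) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()

best_gains, best_cost = genetic_minimize(tracking_cost, bounds=([0.0, 0.0], [5.0, 1.0]))
print("Kp, Ki:", best_gains, "cost:", best_cost)
```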

Relevance:

20.00%

Publisher:

Abstract:

Metaheuristic techniques are known for solving optimization problems classified as NP-complete and are successful in obtaining good-quality solutions. They use non-deterministic approaches to generate solutions that are close to the optimum, without any guarantee of finding the global optimum. Motivated by the difficulties in solving these problems, this work proposes the development of parallel hybrid methods using reinforcement learning and the metaheuristics GRASP and Genetic Algorithms. With these techniques, we aim to contribute to improved efficiency in obtaining good solutions. Instead of using the Q-learning reinforcement learning algorithm only as a technique for generating the initial solutions of the metaheuristics, we use it in a cooperative and competitive approach with the Genetic Algorithm and GRASP, in a parallel implementation. In this context, it was possible to verify that the implementations in this study showed satisfactory results under both strategies, that is, in the cooperation and competition between the algorithms and in the cooperation and competition between groups. In some instances the global optimum was found; in others, the implementations came close to it. An analysis of the performance of the proposed approach was carried out and showed good results with respect to the requirements that demonstrate the efficiency and speedup (gain in speed with parallel processing) of the implementations.
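
The abstract mentions Q-learning used alongside GRASP and the Genetic Algorithm, traditionally as a generator of initial solutions. The sketch below illustrates only that ingredient for a small Euclidean TSP: Q-values learned with the negative travel distance as reward are used to build a starting tour that a metaheuristic would then refine. The instance, parameters and tour-construction policy are assumptions made for illustration.

```python
import numpy as np

def learn_tour(dist, episodes=500, alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    """Q-learning over city-to-city moves; reward is the negative edge length."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    q = np.zeros((n, n))
    for _ in range(episodes):
        current, unvisited = 0, set(range(1, n))
        while unvisited:
            choices = list(unvisited)
            if rng.random() < eps:                      # explore
                nxt = choices[rng.integers(len(choices))]
            else:                                       # exploit current Q-values
                nxt = max(choices, key=lambda c: q[current, c])
            reward = -dist[current, nxt]
            future = max((q[nxt, c] for c in unvisited if c != nxt), default=0.0)
            q[current, nxt] += alpha * (reward + gamma * future - q[current, nxt])
            unvisited.remove(nxt)
            current = nxt
    # Greedy tour from the learned Q-values: the metaheuristic's starting point.
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        nxt = max(unvisited, key=lambda c: q[tour[-1], c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

rng = np.random.default_rng(1)
cities = rng.uniform(0, 100, size=(15, 2))
dist = np.linalg.norm(cities[:, None, :] - cities[None, :, :], axis=2)
print("initial tour from Q-learning:", learn_tour(dist))
```

In the cooperative/competitive scheme described in the abstract, tours like this one would be exchanged with, or compete against, the populations maintained by GRASP and the Genetic Algorithm running in parallel.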

Relevance:

20.00%

Publisher:

Abstract:

In this work, Markov chains are the tool used for modeling and analyzing the convergence of the genetic algorithm, both in its standard version and in the other versions the genetic algorithm admits. In addition, we intend to compare the performance of the standard version with a fuzzy version, believing that the latter gives the genetic algorithm a greater ability to find a global optimum, as expected of global optimization algorithms. The choice of this algorithm is due to the fact that, over the past thirty years, it has become one of the most important tools for solving optimization problems. This choice is justified by its effectiveness in finding good-quality solutions, and knowing a good-quality solution is acceptable given that there may be no other algorithm able to obtain the optimal solution for many of these problems. However, the algorithm can be configured in different ways, since it depends not only on how the problem is represented but also on how some of the operators are defined, ranging from the standard version, in which the parameters are kept fixed, to versions with variable parameters. Therefore, to achieve good performance with this algorithm, an adequate criterion for choosing its parameters is necessary, especially the mutation rate, the crossover rate and the population size. It is important to note that in implementations in which the parameters are kept fixed throughout the execution, modeling the algorithm by a Markov chain results in a homogeneous chain, whereas when the parameters are allowed to vary during the execution, the resulting Markov chain is non-homogeneous. Thus, in an attempt to improve the algorithm's performance, some studies have tried to set the parameters through strategies that capture intrinsic characteristics of the problem. These characteristics are extracted from the current state of the execution, in order to identify and preserve patterns related to good-quality solutions while discarding low-quality patterns. Strategies for feature extraction can use either precise or fuzzy techniques, in the latter case through a fuzzy controller. A Markov chain is used for the modeling and convergence analysis of the algorithm, both in its standard version and in the others. In order to evaluate the performance of the non-homogeneous algorithm, tests are applied to compare the standard genetic algorithm with the fuzzy genetic algorithm, in which the mutation rate is adjusted by a fuzzy controller. To do so, we select optimization problems whose number of solutions varies exponentially with the number of variables.
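
The abstract contrasts fixed parameters (homogeneous Markov chain) with parameters adjusted during execution by a fuzzy controller (non-homogeneous chain). The sketch below shows a minimal, assumed version of such an adjustment: a tiny Sugeno-style rule base maps population diversity to a mutation rate ("low diversity, mutate more; high diversity, mutate less"). The membership functions and rule outputs are illustrative, and the Markov-chain convergence analysis itself is not reproduced.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_mutation_rate(diversity):
    """Maps diversity in [0, 1] to a mutation rate via a weighted rule average."""
    rules = [
        (triangular(diversity, -0.01, 0.0, 0.4), 0.30),   # low diversity -> high rate
        (triangular(diversity, 0.2, 0.5, 0.8), 0.10),     # medium -> moderate rate
        (triangular(diversity, 0.6, 1.0, 1.01), 0.01),    # high diversity -> low rate
    ]
    weights = np.array([w for w, _ in rules])
    outputs = np.array([r for _, r in rules])
    return float(np.dot(weights, outputs) / weights.sum())

def population_diversity(pop):
    """Mean per-gene standard deviation for genes encoded in [0, 1]."""
    return float(np.mean(np.std(pop, axis=0)))

pop = np.random.default_rng(0).random((50, 10))   # 50 individuals, 10 genes in [0, 1]
print("mutation rate:", fuzzy_mutation_rate(population_diversity(pop)))
```

Because the rate returned at each generation depends on the current population state, the transition probabilities change over time, which is exactly what makes the Markov chain that models the algorithm non-homogeneous.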