28 results for Bliss Classification
at the Universidade Federal do Rio Grande do Norte (UFRN)
Resumo:
PEREIRA, Edinete do Nascimento et al. Classificação bibliográfica: as diversas contribuições para o tratamento da informação. In: SEMINÁRIO DE PESQUISA DO CCSA, 15., 2009. Anais... Natal: UFRN, 2009.
Resumo:
This master's dissertation presents the study and implementation of intelligent algorithms to monitor the measurements of sensors involved in natural gas custody transfer processes. Artificial Neural Networks are investigated for this purpose because of properties such as learning, adaptation and prediction. A neural predictor is developed to reproduce the dynamic behavior of the sensor output, so that its output can be compared with the real sensor output. A recurrent neural network is used for this task because of its ability to deal with dynamic information. The real sensor output and the estimated predictor output form the basis for possible sensor fault detection and diagnosis strategies. Two competitive neural network architectures are also investigated and used to classify different kinds of faults. The prediction algorithm, the fault detection and classification strategies, and the results obtained are presented.
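A minimal sketch of the residual-based monitoring idea described above: a one-step-ahead predictor reproduces the sensor's dynamic behavior and a fault is flagged when the prediction residual grows. A feed-forward autoregressive model (sklearn's MLPRegressor) stands in for the recurrent network used in the dissertation; the window size, threshold and synthetic data are illustrative assumptions only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, lag=5):
    """Build lagged input windows and one-step-ahead targets."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    return X, series[lag:]

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 2000)
sensor = np.sin(t) + 0.05 * rng.standard_normal(t.size)   # healthy sensor signal

X, y = make_windows(sensor)
predictor = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                         random_state=0).fit(X, y)

# Inject an offset fault and flag samples whose prediction residual exceeds
# a threshold estimated from the healthy part of the signal.
faulty = sensor.copy()
faulty[1500:] += 0.5                      # simulated sensor offset fault
Xf, yf = make_windows(faulty)
residual = np.abs(predictor.predict(Xf) - yf)
threshold = 3 * residual[:1000].std()
print("fault samples flagged:", int((residual > threshold).sum()))
```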
Resumo:
Stroke is the leading cause of disability among adults. Although different professions work together in the treatment of stroke patients, they use different terminologies to describe the patients' problems, which can hinder communication among staff members. Multidisciplinary and interdisciplinary work would therefore be easier with a common reference tool such as the International Classification of Functioning, Disability and Health (ICF). The ICF, however, is very extensive and complex, and this complexity has made it necessary to select subsets of its categories to make it more practical. The aim of this study was to investigate which ICF categories are most suitable for evaluating and describing stroke patients in the view of teachers and municipal public health professionals. It was a descriptive study involving 5 professors and 11 Physiotherapy professionals working in the public health system of Natal/RN. The Delphi Technique was applied in 3 rounds, with a Likert scale used to select categories from the ICF components. As a result, 94 of the 362 ICF categories were selected. The selected categories correspond to the rehabilitative characteristics of stroke patients within the scope of Physiotherapy practice. The methodology was suitable for the object of study, and future studies are needed to validate the chosen categories.
Resumo:
Venous ulcer is an epidemiological problem of high prevalence, causing disability and dependence. Assessing the level of tissue impairment of patients with venous lesions within a nursing framework is relevant to implementing care directed at this specific clientele. This work therefore aims to characterize the health status of the lower-limb skin of patients with venous ulcers according to the indicators of the Tissue Integrity outcome from the Nursing Outcomes Classification. A cross-sectional study was conducted in a university hospital in Natal, Rio Grande do Norte. The sample consisted of 50 participants selected through consecutive sampling. Data were collected from February to June 2012 using an interview and physical examination form and a tool with operational definitions of the Tissue Integrity indicators directed at patients with venous ulcer. Data were analyzed with descriptive statistics and nonparametric tests (Spearman, Kruskal-Wallis and Mann-Whitney). The project was approved by the Research Ethics Committee under protocol 608/11 and Certificate of Presentation for Ethical Consideration No. 0038.0.294.000-11. The results were presented in three scientific articles derived from the research. According to the medians, the indicators showed moderate impairment, light impairment or no impairment. The respondents had a mean age of 59.72 years; 66% were female, 50% retired, 60% had a partner, 44% had arterial hypertension, 26% allergies, 20% diabetes mellitus, 96% were sedentary, 14% drank alcohol and 6% were smokers. There were statistically significant, low-intensity correlations between age and hydration (p=0.032; rs=-0.304) and skin desquamation (p=0.026; rs=-0.316); family income and necrosis (p=0.012; rs=-0.353); Ankle-Brachial Index and tissue perfusion (p=0.044; rs=-0.329); diabetes mellitus and texture (p=0.015) and tissue perfusion (p=0.026); allergy and texture (p=0.034); physical activity and hydration (p=0.034); smoking and thickness (p=0.018); and alcohol consumption and exudate (p=0.045). We conclude that the patients had light to moderate impairment, indicating a good state of health of the lower-limb skin according to the Tissue Integrity indicators of the Nursing Outcomes Classification assessed in this study. The evaluation of tissue impairment using a nursing classification system, and its relation to socioeconomic, clinical and risk factors, is a valuable tool for care planning and wound healing.
Resumo:
The objective of this work is to draw attention to the importance of loss prevention techniques in small retail organizations, analyzing their use in companies and building a classification model based on it. The work identifies the weaknesses and strengths of the companies and classifies them according to their use of loss prevention techniques. The methodology is based on a review of the available literature on loss prevention measures and techniques, analyzing the processes in which techniques need to be adopted to reduce losses and addressing the "pillars" of loss prevention, the life cycle of products in retail, and cycles of continuous improvement in business. Based on the objectives of this work and in light of the techniques surveyed, a case study was carried out through the application of a questionnaire and the researcher's observation in a chain of 16 small supermarkets. From these studies a model for classifying companies was created. The practical implications of this work are useful for pointing out mistakes in retail management that can turn into losses, reducing the profitability of companies or even making them unviable. The academic contribution of this study is the proposal of a new classification model for small supermarkets based on the use of loss prevention techniques. As a result of the research, 14 companies were classified as Companies with Minimum Use of Loss Prevention Techniques (CMULPT) and 2 as Companies with Deficient Use of Loss Prevention Techniques (CDULPT). The research concludes that, on average, the group showed only minimum use of loss prevention techniques, and that the companies should adopt a loss prevention program focused on identifying and quantifying losses and on establishing a culture of loss prevention.
Resumo:
This work studies the existing classification systems for ensuring quality management in the hotel sector, focusing on EMBRATUR's Classification Matrix for Lodging Facilities and on ISO 9000, and examining the benefits these management systems and/or processes can bring to the hotel sector with regard to the quality of its services. To obtain this information, a comparative analysis of the quality management systems was carried out through bibliographic research and questionnaires sent to certified and classified hotel enterprises, and the main results of the survey were organized so as to present clearly the superiority of one system over the other.
Resumo:
The use of maps obtained from remote sensing orbital images submitted to digital processing has become fundamental to optimizing conservation and monitoring actions for coral reefs. However, the accuracy achieved in mapping submerged areas is limited by variations in the water column, which degrade the signal received by the orbital sensor and introduce errors into the final classification. The limited capacity of traditional methods based on conventional statistical techniques to resolve inter-class confusion motivated the search for alternative strategies in the area of Computational Intelligence. In this work, an ensemble of classifiers was built by combining Support Vector Machines and a Minimum Distance Classifier with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages through which the classification is progressively refined: patterns that receive an ambiguous classification in one stage are re-evaluated in the subsequent stage, and an unambiguous prediction for all the data is reached by reducing or eliminating false positives. The images were classified into five bottom types: deep water, underwater corals, intertidal corals, algal bottom and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM with a polynomial kernel. The accuracy of the classified image was compared, by means of an error matrix, with the results obtained by other classification methods based on a single classifier (a neural network and the k-means algorithm). The comparison demonstrated the potential of ensembles of classifiers as a tool for classifying images of submerged areas subject to noise caused by atmospheric effects and the water column.
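A minimal sketch of a two-stage ensemble in the spirit described above: an SVM with a polynomial kernel classifies patterns first, and patterns whose decision confidence is low ("ambiguous") are re-evaluated by a minimum-distance (nearest-centroid) classifier. The synthetic data, confidence threshold and class count are illustrative assumptions, not the dissertation's actual reef imagery or three-stage design.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestCentroid
from sklearn.svm import SVC

# Synthetic 5-class problem standing in for the five bottom types.
X, y = make_classification(n_samples=1000, n_features=6, n_informative=4,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

svm = SVC(kernel="poly", degree=3, probability=True).fit(Xtr, ytr)
mdc = NearestCentroid().fit(Xtr, ytr)          # minimum distance classifier

proba = svm.predict_proba(Xte)
pred = proba.argmax(axis=1)
ambiguous = proba.max(axis=1) < 0.6            # low-confidence patterns
pred[ambiguous] = mdc.predict(Xte[ambiguous])  # second-stage re-evaluation

print("overall accuracy:", (pred == yte).mean())
```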
Resumo:
Skin cancer is the most common of all cancers, and the increase in its incidence is due in part to people's behavior with regard to sun exposure. In Brazil, non-melanoma skin cancer is the most incident in most regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological skin diseases. The use of computational tools to support or follow up the medical diagnosis of dermatological lesions is still very recent, and several methods have been proposed for the automatic classification of skin pathologies from images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape and texture features, using the Wavelet Packet Transform (WPT) and the learning technique known as the Support Vector Machine (SVM). The Wavelet Packet Transform is applied to extract texture features from the images. The WPT consists of a set of basis functions that represent the image in different frequency bands, each with a distinct resolution corresponding to each scale. In addition, color features of the lesion are computed, which depend on the visual context and are influenced by the colors present in its surroundings, and shape attributes are obtained through Fourier descriptors. The Support Vector Machine is used for the classification task; it is based on the principle of structural risk minimization from statistical learning theory and constructs optimal hyperplanes that represent the separation between classes. The generated hyperplane is determined by a subset of the training samples, called support vectors. For the database used in this work, the results showed good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors together with the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions.
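A minimal sketch of wavelet-packet texture features feeding an SVM, in the spirit of the method above. It assumes the PyWavelets (pywt) library is available; the random "images", decomposition level and two-class labels are illustrative only and do not reproduce the color and Fourier shape descriptors also used in the work.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def wpt_energy_features(image, wavelet="db2", level=2):
    """Energy of each wavelet-packet subband, used as a texture descriptor."""
    wp = pywt.WaveletPacket2D(data=image, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    return np.array([np.sum(node.data ** 2) for node in nodes])

rng = np.random.default_rng(0)
smooth = [rng.standard_normal((32, 32)) * 0.1 for _ in range(30)]  # low-energy texture
rough = [rng.standard_normal((32, 32)) * 1.0 for _ in range(30)]   # high-energy texture
X = np.array([wpt_energy_features(im) for im in smooth + rough])
y = np.array([0] * 30 + [1] * 30)                                  # two lesion classes

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```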
Resumo:
Post-dispatch analysis of signals obtained from digital disturbance recorders provides important information for identifying and classifying disturbances in power systems, aiming at a more efficient management of the supply. Digital signal processing techniques can help make this identification and classification automatic. The Wavelet Transform has become a very efficient tool for the analysis of voltage or current signals obtained immediately after the occurrence of a disturbance in the network. This work presents a methodology based on the Discrete Wavelet Transform to implement this process. It compares the energy distribution curves of signals with and without disturbance, at different resolution levels of the decomposition, in order to obtain descriptors that allow classification using artificial neural networks.
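A minimal sketch of the energy-per-resolution-level descriptor described above: a multilevel DWT (PyWavelets assumed) decomposes a voltage waveform and the energy of each detail level of a disturbed signal is compared with that of a clean one. The sampling rate, wavelet, number of levels and simulated sag are illustrative assumptions.

```python
import numpy as np
import pywt

def energy_curve(signal, wavelet="db4", level=5):
    """Energy of each DWT detail level (the energy distribution curve)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail levels only

fs = 3840                                    # samples per second (illustrative)
t = np.arange(0, 0.2, 1 / fs)
clean = np.sin(2 * np.pi * 60 * t)           # 60 Hz fundamental, no disturbance
sag = clean.copy()
sag[(t > 0.05) & (t < 0.12)] *= 0.5          # simulated voltage sag

deviation = energy_curve(sag) - energy_curve(clean)
print("energy deviation per detail level:", np.round(deviation, 3))
# These per-level deviations are the kind of descriptor that could be fed to
# an artificial neural network for disturbance classification.
```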
Resumo:
Precise and fast identification of downhole abnormalities is essential to prevent damage and increase production in the oil industry. This work presents a study of a new automatic approach to detecting and classifying the operating mode of sucker-rod pumping systems from downhole dynamometer cards. The main idea is to recognize the well's production status by processing the image of the downhole dynamometer card (boundary descriptors) together with statistical and similarity tools such as Fourier descriptors, Principal Component Analysis (PCA) and Euclidean distance. Real data from a sucker-rod pumping system are used to validate the proposal.
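A minimal sketch of the card-recognition pipeline outlined above: the dynamometer card is treated as a closed contour, Fourier descriptors summarize its shape, PCA reduces the descriptor vector, and the Euclidean distance to class templates gives the classification. The synthetic contours and the two operating conditions are illustrative assumptions, not real card data.

```python
import numpy as np
from sklearn.decomposition import PCA

def fourier_descriptors(contour_xy, n_coeffs=8):
    """Scale-normalized Fourier descriptor magnitudes of a closed contour."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]       # contour as complex signal
    spectrum = np.fft.fft(z)
    mags = np.abs(spectrum[1:n_coeffs + 1])
    return mags / (np.abs(spectrum[1]) + 1e-12)

def card(shape="normal", n=128, rng=None):
    """Synthetic closed card contour; the anomalous class is a distorted loop."""
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    r = 1.0 if shape == "normal" else 1.0 + 0.3 * np.sin(3 * theta)
    r = r + 0.02 * rng.standard_normal(n)
    return np.column_stack([r * np.cos(theta), 0.5 * r * np.sin(theta)])

rng = np.random.default_rng(0)
labels = ["normal"] * 20 + ["anomalous"] * 20
X = np.array([fourier_descriptors(card(s, rng=rng)) for s in labels])

pca = PCA(n_components=3).fit(X)
Z = pca.transform(X)
templates = {c: Z[np.array(labels) == c].mean(axis=0) for c in set(labels)}

probe = pca.transform([fourier_descriptors(card("anomalous", rng=rng))])[0]
pred = min(templates, key=lambda c: np.linalg.norm(probe - templates[c]))
print("predicted operating condition:", pred)
```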
Resumo:
Brain-Computer Interfaces (BCI) have as their main purpose to establish a communication path with the central nervous system (CNS), independent of the standard pathways (nerves and muscles), aiming to control a device. The main objective of this research is to develop an off-line BCI that separates the different EEG patterns resulting from strictly mental tasks performed by an experimental subject, comparing the effectiveness of different signal preprocessing approaches. We also tested different classification approaches: all versus all, one versus one, and a hierarchical classification approach. No preprocessing technique was found to improve the system performance. Furthermore, the hierarchical approach proved capable of producing results above those expected from the literature.
Resumo:
Reinforcement learning is a machine learning technique that, despite its large number of applications, may not yet have reached its full potential. One of the insufficiently explored possibilities is its use in combination with other methods for solving pattern classification problems. The problems that ensembles of support vector machines face in terms of generalization capacity are well documented in the literature: algorithms such as AdaBoost do not deal appropriately with the imbalances that arise in those situations, and the alternatives proposed so far have met with varying degrees of success. This dissertation presents a new approach to building committees of support vector machines. The proposed algorithm combines the AdaBoost algorithm with a reinforcement learning layer that adjusts the committee parameters so that imbalances among the committee members do not degrade the generalization performance of the final hypothesis. Comparisons were made between ensembles with and without the reinforcement learning layer on benchmark data sets widely used in the pattern classification literature.
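A minimal sketch of an AdaBoost-style committee of SVMs, the baseline that the dissertation augments with a reinforcement learning layer. Only the boosting part is sketched here; the RL adjustment of the committee weights is the work's own contribution and is not reproduced. The data set, number of rounds and SVM hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
y = 2 * y - 1                                  # labels in {-1, +1}

n_rounds, w = 5, np.full(len(y), 1 / len(y))   # boosting rounds, sample weights
members, alphas = [], []
for _ in range(n_rounds):
    clf = SVC(kernel="rbf", C=1.0).fit(X, y, sample_weight=w)
    pred = clf.predict(X)
    err = np.clip(w @ (pred != y), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)      # member weight (what an RL layer could retune)
    w = w * np.exp(-alpha * y * pred)          # re-weight misclassified samples
    w /= w.sum()
    members.append(clf)
    alphas.append(alpha)

committee = np.sign(sum(a * m.predict(X) for a, m in zip(alphas, members)))
print("committee training accuracy:", (committee == y).mean())
```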
Resumo:
Modern wireless systems employ adaptive techniques to provide high throughput while meeting coverage, Quality of Service (QoS) and capacity requirements. An alternative for further enhancing data rates is to apply cognitive radio concepts, in which a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing the unused portions. Techniques like Automatic Modulation Classification (AMC) can help or even be vital in such scenarios. AMC implementations usually rely on some form of signal preprocessing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It extracts similarity measurements between a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
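A minimal sketch of the correntropy coefficient, the ITL similarity measure named above, estimated with a Gaussian kernel. The kernel width and the toy BPSK/AWGN setup are illustrative assumptions; a full AMC pipeline would compare the received signal against candidate modulation templates rather than against the transmitted symbols themselves.

```python
import numpy as np

def centered_correntropy(x, y, sigma):
    """Sample estimate of the centered correntropy with a Gaussian kernel."""
    k = lambda d: np.exp(-d ** 2 / (2 * sigma ** 2))
    joint = np.mean(k(x - y))                       # E[kappa(X - Y)]
    marginal = np.mean(k(x[:, None] - y[None, :]))  # estimate under independence
    return joint - marginal

def correntropy_coefficient(x, y, sigma=1.0):
    num = centered_correntropy(x, y, sigma)
    den = np.sqrt(centered_correntropy(x, x, sigma) *
                  centered_correntropy(y, y, sigma))
    return num / den

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=1000)        # BPSK-like baseband symbols
noisy = symbols + 0.3 * rng.standard_normal(1000)   # AWGN channel
unrelated = rng.choice([-1.0, 1.0], size=1000)      # independent symbol stream

print("same signal under AWGN :", round(correntropy_coefficient(symbols, noisy), 3))
print("unrelated symbol stream:", round(correntropy_coefficient(symbols, unrelated), 3))
```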
Resumo:
Pattern classification is one of the most prominent subareas of machine learning. Among the various approaches to pattern classification problems, Support Vector Machines (SVM) receive great emphasis due to their ease of use and good generalization performance. The Least Squares formulation of the SVM (LS-SVM) finds the solution by solving a set of linear equations instead of the quadratic programming problem solved by the standard SVM. LS-SVMs have free parameters that must be chosen correctly to achieve satisfactory results in a given task. Although LS-SVMs perform well, many tools have been developed to improve them, mainly new classification methods and the use of ensembles, that is, combinations of several classifiers. In this work, we propose to use an ensemble and a Genetic Algorithm (GA), a search algorithm based on the evolution of species, to enhance LS-SVM classification. In the construction of the ensemble, we use a random selection of attributes of the original problem, which splits it into smaller problems on which each classifier acts. A genetic algorithm is then applied to find effective values for the LS-SVM parameters and also a weight vector measuring the importance of each machine in the final classification. The final classification is obtained as a linear combination of the decision values of the LS-SVMs with the weight vector. Several classification problems, taken as benchmarks, were used to evaluate the performance of the algorithm, and the results were compared with those of other classifiers.
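A minimal sketch of the random-attribute-subspace ensemble with a weighted combination of decision values, as described above. Standard sklearn SVMs stand in for LS-SVMs, and a simple random search stands in for the genetic algorithm that tunes the weight vector; the data set, subspace size and search budget are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)
y = 2 * y - 1                                       # labels in {-1, +1}
Xtr, Xval, ytr, yval = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
members = []
for _ in range(5):                                  # each member sees a random attribute subset
    feats = rng.choice(X.shape[1], size=8, replace=False)
    members.append((feats, SVC(kernel="rbf").fit(Xtr[:, feats], ytr)))

def committee_decision(weights, Xs):
    """Linear combination of the members' decision values, then sign."""
    votes = np.array([m.decision_function(Xs[:, f]) for f, m in members])
    return np.sign(weights @ votes)

best_w, best_acc = None, -1.0
for _ in range(200):                                # stand-in for GA weight optimization
    w = rng.random(len(members))
    acc = (committee_decision(w, Xval) == yval).mean()
    if acc > best_acc:
        best_w, best_acc = w, acc

print("validation accuracy of weighted committee:", round(best_acc, 3))
```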