128 results for Lipid Extraction


Relevance:

10.00%

Publisher:

Abstract:

As a contemporary trend, the theme of environmental change, already acknowledged as a concern in international economic and political circles, is also gaining ground in the industrial and business sector. Firms are implementing actions in an attempt to minimize the impact of their own greenhouse gas (GHG) emissions. However, the great majority of these Corporate Social-Environmental Responsibility (CSR) actions address only the direct emissions of the main production systems. Direct emissions are those derived from an isolated process, without considering the emissions of upstream and downstream processes, which account for the majority of the emissions that exist because of the firm's production system. Because the greenhouse effect is global and GHG emissions contribute to environmental change regardless of their origin, the whole productive life cycle of products and systems must be taken into account, from the energy invested in extracting resources and necessary materials through to final disposal. To do so, all relevant steps of a product's or production system's life cycle must be investigated, tracking every activity that emits greenhouse gases, directly or indirectly. This total amount of emissions constitutes the firm's Carbon Footprint. The purpose of this research is to defend the relevance of the Carbon Footprint and the viability of adopting it as an environmental indicator in the measurement and assessment of CSR. A case study was carried out at Petrobras's headquarters unit in Natal, Brazil, assessing part of its Carbon Footprint. The software GEMIS 4.6 was used to quantify the emissions. The items measured were the direct emissions of the unit's own vehicles and the indirect emissions of the offset paper (A4), energy, and disposable plastic cups consumed. For 2009, these emissions amounted to 3,811.94 tCO2eq. We conclude that quantifying the Carbon Footprint is indispensable for knowing the real emissions caused by a production process, and that it should serve as a basis for CSR decisions facing the challenge of reversing environmental change
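The aggregation step described above, summing direct and indirect emissions into a single CO2-equivalent figure, can be sketched in Python. The emission factors and activity values below are illustrative placeholders, not the GEMIS 4.6 factors or the Petrobras unit's actual data.

```python
# Hypothetical activity data and emission factors (illustrative values only;
# not the GEMIS 4.6 factors or the Petrobras unit's actual figures).
EMISSION_FACTORS = {             # kg CO2-eq per unit of activity
    "vehicle_fuel_l": 2.6,       # direct: per litre of fleet fuel burned
    "electricity_kwh": 0.08,     # indirect: per kWh consumed
    "a4_paper_kg": 1.2,          # indirect: per kg of offset paper
    "plastic_cup_kg": 3.1,       # indirect: per kg of disposable cups
}

def carbon_footprint(activity):
    """Sum direct and indirect emissions over all tracked activities,
    returning tonnes of CO2-equivalent."""
    kg = sum(EMISSION_FACTORS[item] * amount
             for item, amount in activity.items())
    return kg / 1000.0           # kg -> tonnes

footprint = carbon_footprint({
    "vehicle_fuel_l": 120000,
    "electricity_kwh": 2500000,
    "a4_paper_kg": 4000,
    "plastic_cup_kg": 900,
})
```

The same structure extends to any number of upstream or downstream activities by adding entries to the factor table.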

Relevance:

10.00%

Publisher:

Abstract:

Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer is the most incident in the majority of regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological diseases of the skin. The field involving computational tools to support medical diagnosis of dermatological lesions is quite recent. Several methods have been proposed for the automatic classification of skin pathologies from images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape, and texture characteristics, using the Wavelet Packet Transform (WPT) and a learning technique called the Support Vector Machine (SVM). The Wavelet Packet Transform is applied to extract texture characteristics from the images. The WPT consists of a set of basis functions that represent the image in different frequency bands, each with a distinct resolution corresponding to each scale. Moreover, the color characteristics of the lesion are computed; these depend on a visual context, influenced by the colors existing in its surroundings, and the shape attributes are obtained through Fourier descriptors. The Support Vector Machine is used for the classification task; it is based on the structural risk minimization principle, which comes from statistical learning theory. The SVM aims to construct optimal hyperplanes that represent the separation between classes. The generated hyperplane is determined by a subset of the training points, called support vectors. For the database used in this work, the results revealed good performance, achieving a global accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors and the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions
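The wavelet-packet texture features can be illustrated with a minimal 1-D sketch (the thesis works on 2-D images; this is the same idea one dimension down, using the Haar filter for brevity): every subband is split at every level, and the energy of each leaf subband serves as a texture feature.

```python
import math

def haar_step(x):
    """One Haar analysis step: (approximation, detail) half-band pair."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_packet_energies(x, levels):
    """Full wavelet-packet tree: unlike the plain wavelet transform, the
    detail bands are also split at each level; the energy of each leaf
    subband is returned as a feature."""
    bands = [x]
    for _ in range(levels):
        nxt = []
        for band in bands:
            a, d = haar_step(band)
            nxt += [a, d]
        bands = nxt
    return [sum(c * c for c in band) for band in bands]
```

Because the Haar step is orthonormal, the leaf energies always sum to the energy of the input signal; what changes between textures is how that energy is distributed across the frequency bands.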

Relevance:

10.00%

Publisher:

Abstract:

The human voice is an important communication tool, and any voice disorder can have profound implications for the social and professional life of an individual. Digital signal processing techniques have been used for the acoustic analysis of vocal disorders caused by laryngeal pathologies, owing to their simplicity and noninvasive nature. This work deals with the acoustic analysis of voice signals affected by pathologies of the larynx, specifically edema and nodules on the vocal folds. Its purpose is to develop a voice classification system to help the pre-diagnosis of laryngeal pathologies, as well as the monitoring of pharmacological treatments and post-surgical recovery. Linear Prediction Coefficients (LPC), Mel-Frequency Cepstral Coefficients (MFCC), and coefficients obtained through the Wavelet Packet Transform (WPT) are applied to extract relevant characteristics of the voice signal. For the classification task, the Support Vector Machine (SVM) is used, which aims to build optimal hyperplanes that maximize the margin of separation between the classes involved. The generated hyperplane is determined by the support vectors, which are subsets of points of these classes. On the database used in this work, the results showed good performance, with a hit rate of 98.46% for the classification of normal versus pathological voices in general, and 98.75% for the classification between the two pathologies, edema and nodules
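Of the three feature sets named above, LPC is the most compact to sketch. The textbook formulation (autocorrelation method with the Levinson-Durbin recursion) is shown below; this is a rough illustration of the feature, not necessarily the exact configuration used in the work.

```python
def lpc(signal, order):
    """Linear Prediction Coefficients via the autocorrelation method and
    the Levinson-Durbin recursion. Returns (coefficients, residual_energy);
    the predictor is x[n] ~= -sum(a[k] * x[n-k])."""
    n = len(signal)
    # Biased autocorrelation estimates r[0..order].
    r = [sum(signal[i] * signal[i + k] for i in range(n - k))
         for k in range(order + 1)]
    a = [1.0] + [0.0] * order      # prediction polynomial, a[0] == 1
    err = r[0]
    for m in range(1, order + 1):
        acc = r[m] + sum(a[j] * r[m - j] for j in range(1, m))
        k = -acc / err             # reflection coefficient
        new_a = a[:]
        for j in range(1, m):
            new_a[j] = a[j] + k * a[m - j]
        new_a[m] = k
        a = new_a
        err *= (1.0 - k * k)       # prediction error shrinks each order
    return a[1:], err
```

On a decaying exponential x[n] = 0.9^n (an ideal first-order recursion), the order-1 coefficient recovered is close to -0.9, as expected.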

Relevance:

10.00%

Publisher:

Abstract:

With the rapid growth of databases of various types (text, multimedia, etc.), there is a need for methods to order, access, and retrieve data in a simple and fast way. Image databases, in addition to these needs, require a representation of the images in which semantic content characteristics are considered. Accordingly, several proposals have been made, such as retrieval based on textual annotations. In the annotation approach, retrieval is based on the comparison between the textual description a user gives of the images and the descriptions of the images stored in the database. Among its drawbacks, the textual description is very dependent on the observer, in addition to the computational effort required to describe all the images in the database. Another approach is content-based image retrieval (CBIR), where each image is represented by low-level features such as color, shape, and texture. Results in the CBIR area have been very promising. However, representing image semantics through low-level features remains an open problem. New feature extraction algorithms, as well as new indexing methods, have been proposed in the literature; these algorithms, however, have become increasingly complex. It is therefore natural to ask: is there a relationship between semantics and the low-level features extracted from an image? If there is, which descriptors best represent the semantics? This leads to a further question: how should descriptors be used to represent the content of the images? The work presented in this thesis proposes a method to analyze the relationship between low-level descriptors and semantics, in an attempt to answer these questions. Furthermore, it was observed that there are three possibilities for indexing images: using composed feature vectors; using parallel and independent index structures (one for each descriptor or set of descriptors); and using feature vectors sorted in sequential order. The first two forms have been widely studied and applied in the literature, but there was no record of the third having been explored. This thesis therefore also proposes indexing with a sequential structure of descriptors, where the order of the descriptors is based on the relationship between each descriptor and the users' semantics. Finally, the index proposed in this thesis proved better than the traditional approaches, and it was shown experimentally that the order in this sequence matters and that there is a direct relationship between this order and the relationship of the low-level descriptors with the users' semantics
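One plausible reading of the third scheme, descriptors applied one after another in an order driven by their relationship to user semantics, can be sketched as a cascade of pruning stages. The descriptor names, feature vectors, and pruning fraction below are all hypothetical illustrations, not the thesis's actual index.

```python
def l1(u, v):
    """City-block distance between two feature vectors."""
    return sum(abs(a - b) for a, b in zip(u, v))

def sequential_search(query, database, descriptor_order, keep=0.5):
    """database: {image_id: {descriptor_name: feature_vector}}.
    Descriptors are applied in the given order (most semantically
    relevant first); each stage keeps only the `keep` fraction of the
    closest candidates, so later, cheaper-to-skip descriptors see
    fewer images."""
    candidates = list(database)
    for name in descriptor_order:
        candidates.sort(key=lambda im: l1(query[name], database[im][name]))
        candidates = candidates[:max(1, int(len(candidates) * keep))]
    return candidates

# Hypothetical two-descriptor database of four images.
db = {
    "img1": {"color": [1, 0], "texture": [0, 1]},
    "img2": {"color": [5, 5], "texture": [9, 9]},
    "img3": {"color": [1, 1], "texture": [8, 0]},
    "img4": {"color": [9, 9], "texture": [0, 2]},
}
q = {"color": [1, 0], "texture": [0, 1]}
result = sequential_search(q, db, ["color", "texture"])
```

Swapping the order of `descriptor_order` changes which candidates survive the first pruning stage, which is exactly why the descriptor-semantics relationship matters for this kind of index.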

Relevance:

10.00%

Publisher:

Abstract:

The area of hospital automation has been the subject of much research, addressing relevant issues that can be automated, such as: management and control (electronic medical records, appointment scheduling, hospitalization, among others); communication (tracking patients, staff, and materials); development of medical, hospital, and laboratory equipment; monitoring (of patients, staff, and materials); and aid to medical diagnosis (according to each specialty). This thesis presents an architecture for patient monitoring and alert systems. The architecture is based on intelligent systems techniques and is applied in hospital automation, specifically in the Intensive Care Unit (ICU), for patient monitoring in the hospital environment. Its main goal is to transform multiparameter monitor data into useful information, through the knowledge of specialists and the normal ranges of vital signs, using fuzzy logic to extract information about the clinical condition of ICU patients and produce a pre-diagnosis. Finally, alerts are dispatched to medical professionals whenever any abnormality is found during monitoring. After the validation of the architecture, the fuzzy inferences were used for the training and validation of an Artificial Neural Network that classifies the cases validated a priori by the fuzzy system
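The fuzzification step can be sketched for a single vital sign. The triangular membership breakpoints below are illustrative assumptions, not the clinical parameters elicited from the specialists in the thesis.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heart_rate_alert(bpm):
    """Fuzzify a heart-rate reading against three linguistic terms and
    report the dominant term with its membership degree; an alert would
    be raised when the dominant term is not 'normal'."""
    terms = {
        "bradycardia": tri(bpm, 20, 40, 60),
        "normal":      tri(bpm, 50, 75, 100),
        "tachycardia": tri(bpm, 90, 130, 200),
    }
    label = max(terms, key=terms.get)
    return label, terms[label]
```

A full system would combine several vital signs with fuzzy rules before deciding on an alert; this shows only the membership stage for one signal.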

Relevance:

10.00%

Publisher:

Abstract:

Breast cancer, despite being one of the leading causes of death among women worldwide, is a disease that can be cured if diagnosed early. One of the main techniques used in the detection of breast cancer is Fine Needle Aspiration (FNA), which, depending on the clinical case, requires the analysis of several medical specialists to develop the diagnosis. However, such diagnoses and second opinions have been hampered by the geographical dispersion of physicians and/or the difficulty of reconciling their schedules to work together. Within this reality, this PhD thesis uses computational intelligence to support medical decision-making in remote diagnosis. For that purpose, it presents a fuzzy method to assist the diagnosis of breast cancer, able to process and classify data extracted from breast tissue obtained by FNA. This method is integrated into a virtual environment for collaborative remote diagnosis, whose model was designed to incorporate pre-diagnosis modules that support medical decision-making. In the development of the fuzzy method, knowledge acquisition was carried out through the extraction and analysis of numerical data from a gold-standard database and through interviews and discussions with medical experts. The method was tested and validated with real cases and, according to the sensitivity and specificity achieved (correct diagnosis of malignant and benign tumors, respectively), the results were satisfactory, considering the opinions of the doctors, the quality standards for breast cancer diagnosis, and comparison with other studies involving breast cancer diagnosis by FNA.
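Sensitivity and specificity, the figures of merit cited above, are computed from the classifier's confusion counts. A small helper makes the definitions concrete; the labels and predictions below are illustrative, not the thesis's data.

```python
def sensitivity_specificity(y_true, y_pred, positive="malignant"):
    """Sensitivity = TP / (TP + FN): malignant cases correctly flagged.
    Specificity = TN / (TN + FP): benign cases correctly cleared."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    tn = sum(t != positive and p != positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative ground truth vs. classifier output for five cases.
truth = ["malignant", "malignant", "benign", "benign", "malignant"]
preds = ["malignant", "benign", "benign", "malignant", "malignant"]
sens, spec = sensitivity_specificity(truth, preds)
```

Reporting both numbers matters in this setting: a method can reach high sensitivity simply by over-calling malignancy, which the specificity figure exposes.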

Relevance:

10.00%

Publisher:

Abstract:

The motivation for this work was the need to implement a software architecture that supports the development of a SCADA supervisory system for monitoring simulated industrial processes, with the flexibility to add intelligent modules and devices such as programmable logic controllers (PLCs) according to the specifications of the problem. In the present study, we developed an intelligent supervisory system on top of a simulation of a distillation column modeled with Unisim. OLE Automation was used for communication between the supervisory and simulation software which, together with the use of the database, yielded an architecture that is both scalable and easy to maintain. Moreover, intelligent modules were developed for preprocessing, feature extraction, and variable inference. These modules were fundamentally based on the Encog software

Relevance:

10.00%

Publisher:

Abstract:

Visual attention is a very important task in autonomous robotics but, because of its complexity, the processing time it requires is significant. We propose an architecture for feature selection using foveated images that is guided by visual attention tasks and that reduces the processing time required to perform them. Our system can be applied to bottom-up or top-down visual attention. The foveated model determines which scales are to be used by the feature extraction algorithm. The system is able to discard features that are not strictly necessary for the tasks, thus reducing the processing time. If the fovea is correctly placed, it is possible to reduce the processing time without compromising the quality of the task outputs. The distance of the fovea from the object is also analyzed, and if the visual system loses tracking in top-down attention, basic fovea placement strategies can be applied. Experiments have shown that this approach can reduce the processing time by up to 60%. To validate the method, we tested it with the feature extraction algorithm known as Speeded-Up Robust Features (SURF), one of the most efficient approaches to feature extraction. With the proposed architecture, we can meet the real-time requirements of robot vision, mainly for application in autonomous robotics
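The scale-selection idea can be sketched as follows: regions far from the fovea are processed at fewer pyramid scales, so extraction cost falls with eccentricity. The scale counts and fovea radius below are illustrative assumptions, not the parameters of the proposed architecture.

```python
def scales_for_region(region_center, fovea, max_scales=4, fovea_radius=50.0):
    """Return which pyramid scales to process for an image region, given
    its distance from the fovea center: full detail inside the fovea,
    one scale dropped per fovea-radius of distance, at least one kept."""
    dx = region_center[0] - fovea[0]
    dy = region_center[1] - fovea[1]
    dist = (dx * dx + dy * dy) ** 0.5
    n = max(1, max_scales - int(dist // fovea_radius))
    return list(range(n))
```

A feature extractor such as SURF would then run only at the returned scales for each region, which is where the processing-time saving comes from.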

Relevance:

10.00%

Publisher:

Abstract:

In this work, Markov chains are the tool used in modeling and analyzing the convergence of the genetic algorithm, both in its standard version and in the other versions the genetic algorithm allows. In addition, we intend to compare the performance of the standard version with a fuzzy version, believing that the fuzzy version gives the genetic algorithm the strong ability to find a global optimum that characterizes global optimization algorithms. The choice of this algorithm is due to the fact that, over the past thirty years, it has become one of the most important tools for solving optimization problems. It is effective in finding a good-quality solution, and a good-quality solution becomes acceptable given that there may be no algorithm able to obtain the optimal solution for many of these problems. The algorithm's behavior depends not only on how the problem is represented but also on how some of its operators are defined, ranging from the standard version, in which the parameters are kept fixed, to versions with variable parameters. Therefore, to achieve good performance with the algorithm, it is necessary to have an adequate criterion for choosing its parameters, especially the mutation rate, the crossover rate, and the population size. It is important to remember that in implementations in which the parameters are kept fixed throughout the execution, modeling the algorithm by a Markov chain results in a homogeneous chain, whereas when the parameters vary during the execution, the modeling Markov chain becomes non-homogeneous. In an attempt to improve the algorithm's performance, some studies have tried to set the parameters through strategies that capture intrinsic characteristics of the problem. These characteristics are extracted from the current state of the execution, in order to identify and preserve patterns related to good-quality solutions while discarding low-quality patterns. Strategies for feature extraction can use either precise techniques or fuzzy techniques, the latter implemented through a fuzzy controller. A Markov chain is used for the modeling and convergence analysis of the algorithm, both in its standard version and in the others. To evaluate the performance of the non-homogeneous algorithm, tests are applied comparing the standard genetic algorithm with the fuzzy genetic algorithm, in which the mutation rate is adjusted by a fuzzy controller. For this purpose, optimization problems whose number of solutions varies exponentially with the number of variables are chosen
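The setup compared above, a GA whose mutation rate is retuned online from the population state, can be sketched in plain Python on a toy problem. The controller below is a crude stand-in for the thesis's fuzzy controller: its membership breakpoints and defuzzified rates are illustrative assumptions, and OneMax is just a convenient test function.

```python
import random

def fuzzy_mutation_rate(diversity):
    """Stand-in fuzzy controller: three overlapping triangular terms on
    population diversity in [0, 1]; low diversity pushes the mutation
    rate up to fight premature convergence. Breakpoints are illustrative."""
    low = max(0.0, 1.0 - 2.0 * diversity)
    high = max(0.0, 2.0 * diversity - 1.0)
    mid = 1.0 - low - high
    # Weighted defuzzification: low -> 0.20, medium -> 0.05, high -> 0.01.
    return 0.20 * low + 0.05 * mid + 0.01 * high

def onemax_ga(bits=12, pop_size=20, gens=40, seed=1):
    """Elitist GA on OneMax (maximize the number of 1-bits), with the
    mutation rate re-tuned every generation by the controller above.
    Returns the best individual and the per-generation best fitness."""
    rng = random.Random(seed)
    fitness = sum
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    history = []
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        history.append(fitness(pop[0]))
        # Diversity: fraction of gene positions not yet fixed in the population.
        diversity = sum(0 < sum(ind[i] for ind in pop) < pop_size
                        for i in range(bits)) / bits
        pm = fuzzy_mutation_rate(diversity)
        children = [pop[0]]                      # elitism: keep the best as-is
        while len(children) < pop_size:
            p1, p2 = rng.sample(pop[:pop_size // 2], 2)   # truncation selection
            cut = rng.randrange(1, bits)                  # one-point crossover
            children.append([1 - g if rng.random() < pm else g   # bit-flip
                             for g in p1[:cut] + p2[cut:]])
        pop = children
    pop.sort(key=fitness, reverse=True)
    return pop[0], history
```

Because the mutation rate changes from generation to generation, the chain that models this version is non-homogeneous, in contrast to the fixed-parameter standard GA.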

Relevance:

10.00%

Publisher:

Abstract:

Electric energy is essential to the development of modern society, and its increasing demand in recent years, an effect of population and economic growth, makes companies more interested in the quality and continuity of supply, factors regulated by ANEEL (Agência Nacional de Energia Elétrica). These factors must be attended to when a permanent fault occurs in the system: the location of the defect that caused the power interruption must be identified quickly, which is not a simple task because of the complexity of current systems. An example occurs in multi-terminal transmission lines, which interconnect existing circuits to feed the demand. Such transmission lines have been adopted as a feasible solution to supply loads whose magnitudes do not economically justify the construction of new substations. This work presents a fault location algorithm for multi-terminal transmission lines with two and three terminals. The location method is based on the fundamental phasors of voltage and current, as well as on the representation of the line through its series impedance. The wavelet transform is an effective mathematical tool for the analysis of signals with discontinuities and is therefore used to synchronize the voltage and current data. The Fourier transform is another tool used in this work, to extract the fundamental phasors of voltage and current. Tests to validate the applicability of the location algorithm used data from fault signals simulated in ATP (Alternative Transients Program), as well as real data obtained from oscillographic recorders installed on CHESF's lines.
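The Fourier-based phasor extraction mentioned above is commonly implemented as a full-cycle DFT over one cycle of samples; a minimal sketch of that standard estimator follows (the exact estimator used in the work is not specified here).

```python
import cmath
import math

def fundamental_phasor(samples, samples_per_cycle):
    """Full-cycle DFT estimate of the fundamental-frequency phasor from
    one cycle of samples; returns (RMS magnitude, phase in radians)."""
    n = samples_per_cycle
    acc = sum(samples[k] * cmath.exp(-2j * math.pi * k / n) for k in range(n))
    phasor = 2 * acc / n                  # complex phasor, peak amplitude
    return abs(phasor) / math.sqrt(2), cmath.phase(phasor)

# One cycle of a 100 V (peak) cosine with a 0.3 rad phase shift.
n = 16
wave = [100 * math.cos(2 * math.pi * k / n + 0.3) for k in range(n)]
mag, phase = fundamental_phasor(wave, n)
```

In the fault-location setting, phasors extracted this way at each terminal feed the series-impedance line equations; harmonics at integer multiples of the fundamental are rejected by the full-cycle window.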

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a method based on the theory of reflected electromagnetic waves to evaluate the behavior of these waves and the level of attenuation caused by bone tissue. For this purpose, two antennas with a microstrip structure and a resonance frequency of 2.44 GHz were built. The problem is relevant because osteometabolic diseases reach a large portion of the population, both men and women. With this method, the signal is classified into two groups: bone tissue with normal bone mass and bone tissue with low bone mass. For this, feature extraction (Wavelet Transform) and pattern recognition (KNN and ANN) techniques were used. The tests were performed on bovine bone and tissue treated with chemicals; the methodology and results are described in the work
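The KNN stage of the classification can be sketched as follows. The feature vectors and class labels are hypothetical stand-ins for the wavelet-domain features of the measured signals.

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """k-nearest-neighbour majority vote with Euclidean distance.
    train: list of (feature_vector, label) pairs."""
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical two-feature training vectors (e.g. wavelet-band energies).
train = [
    ([1.0, 0.2], "normal"), ([1.1, 0.3], "normal"), ([0.9, 0.1], "normal"),
    ([0.2, 1.0], "low bone mass"), ([0.3, 1.1], "low bone mass"),
    ([0.1, 0.9], "low bone mass"),
]
```

KNN needs no training phase beyond storing the labeled vectors, which makes it a natural baseline alongside the ANN mentioned in the text.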

Relevance:

10.00%

Publisher:

Abstract:

This work consists in the use of signal processing techniques and artificial neural networks to identify leaks in pipelines with multiphase flow. With traditional leak detection methods there is great difficulty in building a profile adjusted to the real conditions of oil transport. These difficult conditions range from uneven terrain, which causes liquid columns or vacuum along pipelines, to the presence of multiple phases such as water, gas, and oil, plus other components such as sand, which tend to produce discontinuous flow and diverse variations. To attenuate these difficulties, the wavelet transform was used to map the pressure signal into different resolution planes, allowing the extraction of descriptors that identify leak patterns; a neural network was then trained to classify these patterns and report whenever they characterize a leak. The tests used transient and steady-state signals and pipelines with punctures varying in size from ½" to 1" in diameter to simulate leaks, between Upanema and Estreito B, in Petrobras's UN-RNCE unit, where it was possible to detect leaks. The results show that the proposed descriptors, based on statistical methods applied in the transform domain, are sufficient to identify leak patterns and make it possible to train the neural classifier to indicate the occurrence of pipeline leaks
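The wavelet-domain statistical descriptors can be illustrated with a single-level Haar decomposition of a pressure record. The descriptor set below (mean, standard deviation, and energy of the detail band) is a plausible sketch of the kind of statistics referred to in the text, not the exact set used in the work.

```python
import math

def haar_details(x):
    """Single-level Haar detail coefficients of a pressure record; steady
    flow yields near-zero details, a transient produces large ones."""
    return [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]

def leak_descriptors(pressure):
    """Statistical descriptors of the detail band: (mean, std, energy)."""
    d = haar_details(pressure)
    mean = sum(d) / len(d)
    var = sum((c - mean) ** 2 for c in d) / len(d)
    energy = sum(c * c for c in d)
    return mean, math.sqrt(var), energy
```

A steady pressure record produces all-zero descriptors, while a sudden pressure drop (a leak-like transient) shows up as nonzero detail energy; the neural classifier then separates leak transients from ordinary flow variations using such descriptors.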

Relevance:

10.00%

Publisher:

Abstract:

Digital signal processing (DSP) aims to extract specific information from digital signals. Digital signals are, by definition, physical quantities represented by a sequence of discrete values, and from these sequences it is possible to extract and analyze the desired information. Unevenly sampled data cannot be properly analyzed using standard DSP techniques. This work aimed to adapt a DSP technique, multiresolution analysis, to analyze unevenly sampled data, in support of the studies of the CoRoT laboratory at UFRN. The process is based on re-indexing the wavelet transform so that it handles unevenly sampled data properly. The technique was effective, presenting satisfactory results
