893 results for COMPONENT ANALYSIS


Relevance:

60.00%

Publisher:

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since these problems are generally accepted to be unsolvable in polynomial time. Early solutions focused on heuristics; currently, metaheuristics are favored for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called the "Operon", which constructs the information chains needed to implement transgenetic (evolutionary) algorithms, relying mainly on statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses suited to evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to build good-quality dynamic information chains that promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is adopted as the application, addressed with a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by a minimum limit on the coefficient of variation of the individuals' fitness function, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first uses Logistic Regression, based on the probability that the algorithm under test finds an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability of the execution time observed until an optimal solution is reached. The third uses a non-parametric Analysis of Variance on the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with up to 1,655 cities. The first two experiments deal with tuning four parameters of the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG against the three other algorithms. For these sixty-one instances, statistical tests provide evidence that ProtoG outperforms the three algorithms on fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through the PES, the average PES obtained with ProtoG was below 1% in almost half of the instances, the largest average being 3.52%, for an instance of 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since average PES values greater than 10% are not rare in the literature for instances of this size.
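
For reference, the PES metric used in the third analysis can be written out explicitly (a restatement of the abstract's definition; L_found and L_best are illustrative symbols for the tour length found by the algorithm and the best tour length reported in the literature):

    PES = 100 * (L_found - L_best) / L_best   [%]

so a PES average of 3.52% means the tours found were, on average, 3.52% longer than the best known tour.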

Relevance:

60.00%

Publisher:

Abstract:

The exponential growth in radio frequency (RF) applications is accompanied by great challenges, such as more efficient use of the spectrum and the design of new architectures for multi-standard receivers, or software-defined radio (SDR). The key challenge in designing an SDR architecture is the implementation of a wide-band, reconfigurable receiver with low cost, low power consumption, a high level of integration, and flexibility. As a new solution for SDR design, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, using the five-port as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we evaluate the performance of a blind calibration technique, requiring no training or pilot sequences of the transmitted signal, based on independent component analysis for the I/Q regeneration of the five-port downconverter, exploiting the statistical properties of its three output signals.
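
As a minimal illustration of the kind of blind I/Q regeneration described above (not the authors' implementation; the signal shapes and mixing matrix below are invented for the example), independent component analysis can recover two baseband components from three correlated five-port-style outputs:

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 4000)

    # Hypothetical, statistically independent I and Q baseband components.
    i_comp = np.sign(np.sin(2 * np.pi * 23 * t))   # square-wave-like component
    q_comp = np.sin(2 * np.pi * 7 * t)             # smooth component

    # Three five-port-style outputs: unknown linear combinations plus noise.
    A = np.array([[1.0, 0.6],
                  [0.4, 1.0],
                  [0.8, 0.8]])
    X = np.stack([i_comp, q_comp], axis=1) @ A.T
    X += 0.02 * rng.standard_normal(X.shape)

    # Blind recovery of the two components, up to scale, sign, and order.
    S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)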

Relevance:

60.00%

Publisher:

Abstract:

Nowadays, when market competition demands products of better quality, constant cost savings, and better use of raw materials, the search for more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through the composition of the products. However, chemical composition analysis takes a long time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide better process yield. Natural gas processing is one of the most important activities in the petroleum industry, and the main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. In this work, an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in the LPG and the propane mole fraction in the residual gas. The goal is to provide values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques to monitor LPG quality and to reduce propane loss in the process. To develop this work, an NGPU composed of two distillation columns, a deethanizer and a debutanizer, was simulated in HYSYS software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. A simple strategy is also proposed to correct the inferential system in real time, based on measurements from any chromatographs that may exist in the process under study.
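
A minimal sketch of such a hybrid PCA-plus-neural-network soft sensor (variable counts, dimensions, and the synthetic data are illustrative, not taken from the thesis):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical data: rows are one-minute snapshots of PID controller
    # variables (inputs) paired with chromatograph compositions (targets).
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 20))          # 20 process variables
    y = X[:, :3] @ rng.standard_normal((3, 3))  # 3 mole fractions to infer

    # PCA compresses the 20 correlated inputs before the multilayer network,
    # forming the hybrid inferential system described in the abstract.
    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=5),
        MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=1),
    )
    model.fit(X, y)
    print(model.predict(X[:2]))  # inferred compositions for two snapshots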

Relevance:

60.00%

Publisher:

Abstract:

Blind Source Separation (BSS) refers to the problem of estimating original signals from observed linear mixtures with no knowledge about the sources or the mixing process. Independent Component Analysis (ICA) is a technique mainly applied to the BSS problem, and among the algorithms that implement it, FastICA is a high-performance iterative algorithm of low computational cost that uses non-gaussianity measures based on higher-order statistics to estimate the original sources. The great number of applications in which ICA has been found useful reflects the need to implement this technique in hardware, and the natural parallelism of FastICA favors its implementation on digital hardware. This work proposes the implementation of FastICA on a reconfigurable hardware platform to assess the viability of its use in blind source separation problems, more specifically in a hardware prototype embedded in a Field Programmable Gate Array (FPGA) board for the monitoring of beds in hospital environments. The implementations are carried out as Simulink models and synthesized with the DSP Builder software from Altera Corporation.
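
For orientation, the core fixed-point update that such a hardware design parallelizes can be sketched in a few lines (a textbook one-unit FastICA with g = tanh on whitened data; this is a reference sketch, not the thesis code):

    import numpy as np

    def fastica_one_unit(Z, n_iter=100, tol=1e-6):
        """Estimate one independent component from whitened data Z (dims x samples)."""
        rng = np.random.default_rng(0)
        w = rng.standard_normal(Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wx = w @ Z                                   # projections onto w
            g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
            # Fixed-point update: w <- E{z g(w'z)} - E{g'(w'z)} w
            w_new = (Z * g).mean(axis=1) - g_prime.mean() * w
            w_new /= np.linalg.norm(w_new)
            if abs(abs(w_new @ w) - 1.0) < tol:          # converged up to sign
                return w_new
            w = w_new
        return w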

Relevance:

60.00%

Publisher:

Abstract:

Conventional methods for solving the nonlinear blind source separation problem generally adopt a series of restrictions to obtain the solution, often leading to imperfect separation of the original sources and high computational cost. In this work, we propose an alternative measure of independence based on information theory and use artificial intelligence tools to solve linear and, later, nonlinear blind source separation problems. In the linear model, genetic algorithms and Rényi's negentropy are applied as the measure of independence to find a separation matrix from linear mixtures of waveform, audio, and image signals. A comparison is made with two types of Independent Component Analysis algorithms widespread in the literature. Subsequently, the same measure of independence is used as the cost function in the genetic algorithm to recover source signals that were mixed by nonlinear functions generated by a radial basis function artificial neural network. Genetic algorithms are powerful tools for global search and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
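
A toy sketch of the idea follows: a genetic search over unmixing matrices scored by a non-gaussianity measure. For brevity, a kurtosis-based proxy stands in for the Rényi negentropy used in the work, and the whitened 2x2 linear case is parameterized by a single rotation angle:

    import numpy as np

    rng = np.random.default_rng(2)

    def whiten(X):
        X = X - X.mean(axis=1, keepdims=True)
        d, E = np.linalg.eigh(np.cov(X))
        return E @ np.diag(d ** -0.5) @ E.T @ X

    def fitness(theta, Z):
        """Sum of |excess kurtosis| of the rotated outputs (higher = more independent)."""
        c, s = np.cos(theta), np.sin(theta)
        Y = np.array([[c, -s], [s, c]]) @ Z
        k = (Y ** 4).mean(axis=1) - 3.0
        return np.abs(k).sum()

    # Two non-Gaussian sources and a linear mixture of them.
    t = np.linspace(0, 1, 2000)
    S = np.vstack([np.sign(np.sin(2 * np.pi * 13 * t)),
                   rng.uniform(-1, 1, t.size)])
    Z = whiten(np.array([[1.0, 0.5], [0.3, 1.0]]) @ S)

    # Minimal genetic algorithm: selection of the fittest angles plus mutation.
    pop = rng.uniform(0, np.pi, 40)
    for _ in range(60):
        scores = np.array([fitness(th, Z) for th in pop])
        parents = pop[np.argsort(scores)][-20:]
        children = rng.choice(parents, 20) + rng.normal(0, 0.05, 20)
        pop = np.concatenate([parents, children])
    best = pop[np.argmax([fitness(th, Z) for th in pop])]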

Relevance:

60.00%

Publisher:

Abstract:

Image compression consists of representing an image with a small amount of data without losing visual quality. Data compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, with 8 bits for each of the primary components: red, green, and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reducing transmission, processing, and storage time. Many applications depend on images, which makes image compression important: medical imaging, satellite imaging, sensing, etc. In this work, a new method for compressing color images is proposed. The method is based on a measure of the information in each band. The technique, called Self-Adaptive Compression (SAC), compresses each band of the image with a different threshold in order to preserve information and obtain better results. SAC applies strong compression to highly redundant bands, that is, bands with less information, and mild compression to bands with a larger amount of information. Two image transforms are used in this technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step converts the data into decorrelated bands with PCA; the DCT is then applied to each band. Loss occurs when a threshold discards coefficients. This threshold is calculated from two elements: the PCA result and a user parameter that defines the compression rate. The system produces three different thresholds, one for each band of the image, proportional to its amount of information. For image reconstruction, the inverse DCT and inverse PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (Mean Square Error). Tests showed that SAC achieves better quality under strong compression, with two advantages: (a) being adaptive, it is sensitive to the image type, presenting good results for diverse kinds of images (synthetic, landscapes, people, etc.); and (b) it needs only one user parameter, so little human intervention is required.
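
A rough sketch of that PCA-then-DCT pipeline with per-band thresholds follows. The threshold rule here, scaling a user parameter by each band's share of the variance, is a plausible reading of the abstract, not the thesis's exact formula:

    import numpy as np
    from scipy.fft import dctn, idctn

    def sac_sketch(img, user_q=0.9):
        """img: H x W x 3 float array. Returns the reconstruction after the lossy step."""
        h, w, _ = img.shape
        X = img.reshape(-1, 3)
        mean = X.mean(axis=0)
        # PCA: decorrelate the three color bands.
        eigval, eigvec = np.linalg.eigh(np.cov((X - mean).T))
        order = np.argsort(eigval)[::-1]
        eigval, eigvec = eigval[order], eigvec[:, order]
        bands = ((X - mean) @ eigvec).T.reshape(3, h, w)

        var_ratio = eigval / eigval.sum()
        out = np.empty_like(bands)
        for b in range(3):
            C = dctn(bands[b], norm="ortho")
            # Low-information bands (small variance share) get a harsher threshold.
            thr = np.quantile(np.abs(C), user_q * (1.0 - var_ratio[b]))
            C[np.abs(C) < thr] = 0.0
            out[b] = idctn(C, norm="ortho")

        Y = out.reshape(3, -1).T @ eigvec.T + mean   # inverse PCA
        return Y.reshape(h, w, 3)

    demo = sac_sketch(np.random.default_rng(4).random((64, 64, 3)))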

Relevance:

60.00%

Publisher:

Abstract:

This work considers the development of a filtering system composed of an intelligent algorithm that separates information from noise coming from sensors interconnected by a Foundation Fieldbus (FF) network. The algorithm is implemented through FF standard function blocks, with on-line training through OPC (OLE for Process Control), and embedded in a DSP (Digital Signal Processor) that interacts with the fieldbus devices. The ICA (Independent Component Analysis) technique, which explores the possibility of separating mixed signals based on the fact that they are statistically independent, was chosen for this Blind Source Separation (BSS) process. The algorithm, its implementations, and the results are presented.
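
One way to picture the run-time side of such a filter: the unmixing matrix is estimated off-line (here with FastICA, standing in for the training phase over OPC) and then applied sample by sample, one matrix-vector product per incoming reading, which is cheap enough for an embedded loop. File name and shapes below are illustrative:

    import numpy as np
    from sklearn.decomposition import FastICA

    # Off-line training phase: estimate the unmixing matrix from a batch of
    # recorded fieldbus samples (rows = samples, columns = sensors).
    X_train = np.loadtxt("fieldbus_batch.csv", delimiter=",")  # hypothetical file
    ica = FastICA(n_components=X_train.shape[1], random_state=0)
    ica.fit(X_train)
    W, mean = ica.components_, ica.mean_

    # On-line phase: map one vector of raw sensor readings to separated
    # components (signal vs. noise channels).
    def filter_sample(x):
        return W @ (x - mean)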

Relevance:

60.00%

Publisher:

Abstract:

The oil industry has several segments that can impact the environment. Among these, produced water has been highlighted as an environmental problem because of the great volume generated and its toxic composition; these waters are the major source of waste in the oil industry. The composition of produced water depends strongly on the production field. A good example is the wastewater produced at a Petrobras operating unit in Rio Grande do Norte and Ceará (UO-RNCE). A single effluent treatment station (ETS) of this unit receives effluent from 48 wells (onshore and offshore), which leads to large fluctuations in water quality that can become a complicating factor for subsequent treatment processes. The present work aims to provide a diagnosis of a produced-water sample from the UO-RNCE with respect to certain physical and physico-chemical parameters (chloride concentration, conductivity, dissolved oxygen, pH, TOG (oil & grease), nitrate concentration, turbidity, salinity, and temperature). The effluent is analyzed by means of an MP TROLL 9500 multiparameter probe, a TOG/TPH Infracal from Wilks Enterprise Corp. (Model HATR-T) and an MD-31 conductivity meter from Digimed. Results were examined by univariate and multivariate analysis (principal component analysis) associated with statistical control charts. The multivariate analysis showed a negative correlation between dissolved oxygen and turbidity (-0.55) and positive correlations between salinity and chloride (1.00) and between conductivity and both chloride and salinity (0.70). It also showed that seven principal components explain the variability of the parameters. Salinity, conductivity, and chloride were the most important variables, with the highest sampling variance. The statistical control charts helped to establish a general trend among the physical and chemical parameters evaluated.
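
A compact sketch of that kind of exploratory analysis, a correlation matrix plus PCA on the standardized water-quality parameters (column names follow the parameters listed in the abstract; the data file is hypothetical):

    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    cols = ["chloride", "conductivity", "dissolved_oxygen", "pH", "TOG",
            "nitrate", "turbidity", "salinity", "temperature"]
    df = pd.read_csv("produced_water.csv", usecols=cols)  # hypothetical file

    # Pairwise correlations, e.g. salinity vs. chloride, DO vs. turbidity.
    print(df.corr().round(2))

    # PCA on standardized parameters: variance explained per component.
    Z = StandardScaler().fit_transform(df)
    pca = PCA().fit(Z)
    print(pca.explained_variance_ratio_.cumsum())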

Relevance:

60.00%

Publisher:

Abstract:

This paper reports on a sensor array able to distinguish tastes and used to classify red wines. The array comprises sensing units made from Langmuir-Blodgett (LB) films of conducting polymers and lipids and layer-by-layer (LBL) films of chitosan deposited onto gold interdigitated electrodes. Using impedance spectroscopy as the principle of detection, we show that distinct clusters can be identified in principal component analysis (PCA) plots for six types of red wine. Distinction can be made with regard to vintage, vineyard, and brand of the red wine. Furthermore, if the data are treated with artificial neural networks (ANNs), this artificial tongue can identify wine samples stored under different conditions. This is illustrated by considering 900 wine samples, obtained with 30 measurements for each of five bottles of each of the six wines, which could be recognised with 100% accuracy using the Standard Backpropagation and Backpropagation momentum algorithms in the ANNs. (C) 2003 Elsevier B.V. All rights reserved.
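
The pattern-recognition chain described, impedance features to PCA clusters to neural-network classification, can be prototyped along these lines (a generic MLP stands in for the paper's Standard Backpropagation network; the data shapes and synthetic features are invented):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical impedance-spectroscopy features: 900 measurements x 64
    # features, labeled with one of six wines (150 measurements per wine).
    rng = np.random.default_rng(3)
    y = np.repeat(np.arange(6), 150)
    X = rng.standard_normal((900, 64)) + y[:, None] * 0.5

    # First two principal components: the coordinates behind the PCA plots.
    scores = PCA(n_components=2).fit_transform(X)

    # Backpropagation-trained classifier on the full feature vectors.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=3)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=3)
    clf.fit(X_tr, y_tr)
    print(clf.score(X_te, y_te))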

Relevance:

60.00%

Publisher:

Abstract:

The versatility of sensor arrays made from nanostructured Langmuir-Blodgett (LB) and layer-by-layer (LBL) films is demonstrated in two ways. First, different combinations of sensing units are employed to distinguish the basic tastes, viz. sweet, sour, bitter, and salty, produced, respectively, by small concentrations (down to 0.01 g/mol) of sucrose, HCl, quinine, and NaCl solutions. The sensing units comprise LB and/or LBL films of semiconducting polymers, a ruthenium complex, and sulfonated lignin. Then, sensor arrays are used to identify wines from different sources, with the high distinguishing ability demonstrated in principal component analysis (PCA) plots. Particularly important is the fact that the sensing ability does not depend on specific interactions between analytes and the film materials; a judicious choice of materials is, nevertheless, required for the materials to respond differently to a given sample. It is also shown that interaction with the analyte may affect the morphology of the nanostructured films, as indicated by scanning electron microscopy. In wine analysis, for instance, these changes are reversible and the original film morphology is retrieved if the sensing unit is washed with copious amounts of water, thus allowing the sensor unit to be reused.

Relevance:

60.00%

Publisher:

Abstract:

The synthesis of a poly(azo)urethane by fixing CO2 in a bis-epoxide followed by a polymerization reaction with an azodiamine is presented. Since no isocyanate is used in the process, it is termed a clean method, and the polymers obtained are named NIPUs (non-isocyanate polyurethanes). Langmuir films were formed at the air-water interface and were characterized by surface pressure vs. mean molecular area per mer unit (π-A) isotherms. The Langmuir monolayers were further studied by running stability tests and compression/expansion cycles (possible hysteresis) and by varying the compression speed of the monolayer formation, the subphase temperature, and the solvents used to prepare the spreading polymer solutions. The Langmuir-Blodgett (LB) technique was used to fabricate ultrathin films of a particular polymer (PAzoU). It is possible to grow homogeneous LB films of up to 15 layers, as monitored using UV-vis absorption spectroscopy. A higher number of layers can be deposited when PAzoU is mixed with stearic acid, producing mixed LB films. Fourier transform infrared (FTIR) absorption spectroscopy and Raman scattering showed that the materials do not interact chemically in the mixed LB films. Atomic force microscopy (AFM) and the micro-Raman technique (optical microscopy coupled to a Raman spectrograph) revealed that mixed LB films present a phase separation distinguishable at the micrometer or nanometer scale. Finally, mixed and neat LB films were successfully characterized using impedance spectroscopy at different temperatures, a property that may lead to future application as temperature sensors. Principal component analysis (PCA) was used to correlate the data.

Relevance:

60.00%

Publisher:

Abstract:

Chemical sensors made from nanostructured films of poly(o-ethoxyaniline) (POEA) and poly(sodium 4-styrenesulfonate) (PSS) are produced and used to detect and distinguish four chemicals in solution at 20 mM: sucrose, NaCl, HCl, and caffeine. These substances are used to mimic the four basic tastes recognized by humans, namely sweet, salty, sour, and bitter, respectively. The sensors are produced by depositing POEA/PSS films on top of interdigitated microelectrodes via the layer-by-layer technique, using POEA solutions containing different dopant acids. Besides the different characteristics of the POEA/PSS films investigated by UV-Vis and Raman spectroscopies and by atomic force microscopy, it is observed that their electrical response to the different chemicals in liquid media is very fast, on the order of seconds, as well as systematic, reproducible, and extremely dependent on the type of acid used for film fabrication. The responses of the as-prepared sensors remain reproducible and repetitive after many cycles of operation. Furthermore, the use of an "electronic tongue" composed of an array of these sensors, with principal component analysis as the pattern-recognition tool, allows one to reasonably distinguish test solutions according to their chemical composition. (c) 2007 Published by Elsevier B.V.

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)