17 results for "mean field independent component analysis"

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance: 100.00%

Abstract:

LOPES-DOS-SANTOS, V.; CONDE-OCAZIONEZ, S.; NICOLELIS, M. A. L.; RIBEIRO, S. T.; TORT, A. B. L. Neuronal assembly detection and cell membership specification by principal component analysis. PLoS ONE, v. 6, p. e20996, 2011.

Relevance: 100.00%

Abstract:

Recent progress in the technology for single unit recordings has given the neuroscientific community the opportunity to record the spiking activity of large neuronal populations. At the same pace, statistical and mathematical tools were developed to deal with the high-dimensional datasets typical of such recordings. A major line of research investigates the functional role of subsets of neurons with significant co-firing behavior: the Hebbian cell assemblies. Here we review three linear methods for the detection of cell assemblies in large neuronal populations that rely on principal and independent component analysis. Based on their performance in spike train simulations, we propose a modified framework that incorporates multiple features of these previous methods. We apply the new framework to actual single unit recordings and show the existence of cell assemblies in the rat hippocampus, which typically oscillate at theta frequencies and couple to different phases of the underlying field rhythm.
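The PCA-based detection step described above can be sketched in a few lines; the toy below uses synthetic Poisson spike counts and the Marčenko–Pastur bound as the significance threshold for assembly eigenvalues (all data and parameters are illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated spike counts: 20 neurons x 1000 bins; neurons 0-4 co-fire
spikes = rng.poisson(1.0, size=(20, 1000)).astype(float)
events = rng.random(1000) < 0.2
spikes[:5, events] += rng.poisson(2.0, size=(5, int(events.sum())))

# Z-score each neuron and diagonalize the correlation matrix
z = (spikes - spikes.mean(axis=1, keepdims=True)) / spikes.std(axis=1, keepdims=True)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(z))

# Marcenko-Pastur upper bound: eigenvalues above it signal cell assemblies
n_neurons, n_bins = z.shape
lambda_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
n_assemblies = int(np.sum(eigvals > lambda_max))
print("putative assemblies:", n_assemblies)

# Weights of the leading eigenvector point to the member neurons
leading = eigvecs[:, np.argmax(eigvals)]
members = np.argsort(np.abs(leading))[-5:]
print("top-weight neurons:", sorted(members.tolist()))
```

With the simulated co-firing group, the leading eigenvalue exceeds the Marčenko–Pastur bound and its eigenvector weights single out neurons 0-4.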

Relevance: 100.00%

Abstract:

Blind Source Separation (BSS) refers to the problem of estimating original signals from observed linear mixtures with no knowledge about the sources or the mixing process. Independent Component Analysis (ICA) is a technique mainly applied to the BSS problem, and among the algorithms that implement it, FastICA is a high-performance iterative algorithm of low computational cost that uses non-Gaussianity measures based on higher-order statistics to estimate the original sources. The great number of applications where ICA has been found useful reflects the need to implement the technique in hardware, and the natural parallelism of FastICA favors its implementation on digital hardware. This work proposes the implementation of FastICA on a reconfigurable hardware platform to assess its viability for blind source separation problems, more specifically in a hardware prototype embedded in a Field Programmable Gate Array (FPGA) board for the monitoring of beds in hospital environments. The implementation is carried out with Simulink models and synthesized with the DSP Builder software from Altera Corporation.
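Before moving to hardware, the FastICA iteration itself is compact enough to prototype in floating point. A minimal sketch of symmetric FastICA with the tanh nonlinearity (the sources and mixing matrix are illustrative; a fixed-point FPGA version would replace the linear-algebra calls):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent non-Gaussian sources and a linear mixture of them
t = np.linspace(0, 1, 2000)
s = np.vstack([np.sign(np.sin(13 * 2 * np.pi * t)),   # square wave
               np.sin(7 * 2 * np.pi * t) ** 3])       # non-Gaussian sine
A = np.array([[0.8, 0.3], [0.4, 0.9]])                # unknown mixing matrix
x = A @ s

# Whitening: rotate and scale the mixtures to unit covariance
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ x

# Symmetric FastICA fixed-point iteration with the tanh nonlinearity
W = rng.standard_normal((2, 2))
for _ in range(200):
    g = np.tanh(W @ z)
    g_prime = 1.0 - g ** 2
    W = (g @ z.T) / z.shape[1] - np.diag(g_prime.mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W)   # symmetric decorrelation: W <- (W W^T)^-1/2 W
    W = u @ vt

y = W @ z   # recovered sources (up to permutation and sign)
corr = np.abs(np.corrcoef(np.vstack([y, s]))[:2, 2:])
print(corr.max(axis=1))
```

Each recovered component should correlate almost perfectly with one of the original sources, up to sign and ordering.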

Relevance: 100.00%

Abstract:

The exponential growth in radio frequency (RF) applications is accompanied by great challenges, such as more efficient use of the spectrum and the design of new architectures for multi-standard receivers or software-defined radio (SDR). The key challenge in designing an SDR architecture is the implementation of a wide-band receiver that is reconfigurable, low cost, low power, highly integrated, and flexible. As a new SDR design solution, a direct demodulator architecture based on five-port technology (the multi-port demodulator) has been proposed. However, the use of the five-port circuit as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we evaluate the performance of a blind calibration technique based on independent component analysis for the I/Q regeneration of the five-port downconverter, exploiting the statistical properties of its three output signals and requiring no training or pilot sequences in the transmitted signal.
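The I/Q regeneration idea can be illustrated in simulation. The sketch below assumes a hypothetical 3x2 port-gain matrix and QPSK-like ±1 components, and uses scikit-learn's FastICA as a generic stand-in for the specific algorithm evaluated in the work:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)

# Hypothetical QPSK-like baseband components (+/-1 symbols)
n = 4000
i_sig = rng.choice([-1.0, 1.0], size=n)
q_sig = rng.choice([-1.0, 1.0], size=n)
sources = np.vstack([i_sig, q_sig])

# Three five-port outputs: unknown linear combinations of I and Q, plus noise
M = np.array([[0.9, 0.2], [0.3, 0.8], [0.6, -0.5]])   # assumed port gains
ports = (M @ sources).T + 0.01 * rng.standard_normal((n, 3))

# ICA over the three port voltages regenerates I and Q (up to sign/order)
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
recovered = ica.fit_transform(ports).T

corr = np.abs(np.corrcoef(np.vstack([recovered, sources]))[:2, 2:])
print(corr.max(axis=1))
```

Because the ±1 symbol streams are independent and non-Gaussian, the two independent components align with I and Q without any pilot sequence.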

Relevance: 100.00%

Abstract:

Conventional methods for solving the nonlinear blind source separation problem generally impose a series of restrictions to obtain a solution, often leading to imperfect separation of the original sources and high computational cost. In this work, we propose an alternative independence measure based on information theory and use artificial intelligence tools to solve linear and, subsequently, nonlinear blind source separation problems. In the linear model, we apply genetic algorithms with Rényi negentropy as the independence measure to find a separating matrix from linear mixtures of waveform, audio, and image signals, and compare the results with two Independent Component Analysis algorithms widespread in the literature. Subsequently, we use the same independence measure as the cost function in a genetic algorithm to recover source signals mixed by nonlinear functions, using a radial basis function artificial neural network. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
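For the linear two-source case, the search reduces to a single rotation angle after whitening. The toy below uses a kurtosis-based non-Gaussianity proxy in place of the Rényi negentropy of the abstract, with a minimal mutation-and-selection genetic loop (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Whitened mixture of two independent uniform (sub-Gaussian) sources
s = rng.uniform(-1, 1, size=(2, 3000))
A = np.array([[0.7, 0.5], [0.2, 0.9]])
x = A @ s
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x   # after whitening, unmixing is a rotation

def fitness(theta):
    """Non-Gaussianity proxy: squared excess kurtosis of both rotated outputs."""
    c, s_ = np.cos(theta), np.sin(theta)
    y = np.array([[c, s_], [-s_, c]]) @ z
    kurt = (y ** 4).mean(axis=1) - 3.0   # outputs have unit variance
    return float(np.sum(kurt ** 2))

# Minimal genetic search: elitist selection plus Gaussian mutation
pop = rng.uniform(0, np.pi, size=40)
for _ in range(60):
    scores = np.array([fitness(t) for t in pop])
    parents = pop[np.argsort(scores)[-20:]]            # keep the fittest half
    children = parents + rng.normal(0, 0.05, size=20)  # mutate
    pop = np.concatenate([parents, children])

best = pop[np.argmax([fitness(t) for t in pop])]
c, s_ = np.cos(best), np.sin(best)
y = np.array([[c, s_], [-s_, c]]) @ z
corr = np.abs(np.corrcoef(np.vstack([y, s]))[:2, 2:])
print(corr.max(axis=1))
```

The genetic loop has no crossover; it is the smallest evolutionary search that still illustrates using an independence measure as the fitness function.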

Relevance: 100.00%

Abstract:

This work considers the development of a filtering system composed of an intelligent algorithm that separates information from noise in signals coming from sensors interconnected by a Foundation Fieldbus (FF) network. The algorithm is implemented through FF standard function blocks, with on-line training through OPC (OLE for Process Control), and embedded in a DSP (Digital Signal Processor) that interacts with the fieldbus devices. ICA (Independent Component Analysis), a technique that explores the possibility of separating mixed signals based on the fact that they are statistically independent, was chosen for this Blind Source Separation (BSS) process. The algorithm, its implementations, and the results are presented.

Relevance: 100.00%

Abstract:

On the northern coast of Rio Grande do Norte, sediment transport processes are intensely controlled by wind and sea (waves and currents) action, causing erosion and shoreline morphological instability. Given the importance of this coastal zone, multi-spectral mapping and physical-chemical characterization of mudflats and mangroves were carried out to support mitigating actions related to the containment of the erosive process in the Macau and Serra oil fields installed in the study area. The multi-spectral bands of LANDSAT 5 TM images from 2000 and 2008 were submitted to several digital processing steps and RGB color compositions integrating spectral bands and Principal Components. This processing methodology, together with field work, was important for mapping the different surface units. It was possible to relate the spectral characteristics of the wetlands to vegetated (mangrove) areas, showing the possibility of restoring this area and contributing to the environmental monitoring of the ecosystem. The maps of the several units were integrated in a GIS environment at 1:60,000 scale, including the classification of features according to the presence or absence of vegetation cover. The methodology established that there are at least 10.13 km² of sandy-muddy deposits, of which approximately 0.89 km² could be used for reforestation with the typical mangrove flora. The physical-chemical characterization identified areas with the potential for introducing local mangrove species, with a pH above neutral (mean 8.4). The characteristic particle size is sand in the fine fractions; the high levels of carbonate, organic matter, and major and trace elements are in general concentrated where the sediment is finest, showing the high correlation of these elements with the smaller sediment particles.
The application of this methodological strategy is relevant to a better understanding of the behavior of these features, and the physical-chemical data of the sediment samples collected in the field allow analysis of the suitability of the sandy-muddy deposits for reforestation with local mangrove species to mitigate erosive action and coastal processes in the areas occupied by the oil industry.
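The Principal Components step of such multi-spectral processing can be sketched generically; the synthetic 6-band "scene" below is illustrative and not the LANDSAT data of the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 6-band scene (stand-in for LANDSAT 5 TM bands), 100x100 pixels
shared = rng.random((100, 100))   # structure common to all bands
bands = np.stack([g * shared + 0.05 * rng.standard_normal((100, 100))
                  for g in (0.9, 0.8, 0.7, 0.6, 0.5, 0.4)])

# Flatten to (pixels, bands) and diagonalize the covariance matrix
X = bands.reshape(6, -1).T
X = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

# The PC1 image concentrates the information shared by all bands
pc1 = (X @ eigvecs[:, order[0]]).reshape(100, 100)
print("variance explained by PC1: %.2f" % explained[0])
```

Because the bands are strongly correlated, the first component carries most of the scene variance, which is why PC images are useful inputs for RGB compositions.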

Relevance: 100.00%

Abstract:

The increase in ultraviolet (UV) radiation at the surface, the high incidence of non-melanoma skin cancer (NMSC) on the coast of Northeast Brazil (NEB), and the reduction of total ozone motivated the present study. The overall objective was to identify and understand the variability of UV radiation, or the Ultraviolet Index (UV Index), in the capitals of the east coast of the NEB and to fit stochastic models to UV Index time series in order to make predictions (interpolations) and forecasts/projections (extrapolations), followed by trend analysis. The methodology consisted of applying multivariate analysis (principal component analysis and cluster analysis), the Predictive Mean Matching method for filling gaps in the data, autoregressive distributed lag (ADL) models, and the Mann-Kendall test. Modeling via ADL consisted of parameter estimation, diagnostics, residual analysis, and evaluation of the quality of the predictions and forecasts via mean squared error and the Pearson correlation coefficient. The results indicated that the annual variability of UV in the capital of Rio Grande do Norte (Natal) has a feature in September and October consisting of a stabilization/reduction of the UV Index because of the larger annual concentration of total ozone; the increased amount of aerosol during this period contributes to this event with lesser intensity. The application of cluster analysis to the east coast of the NEB showed that this event also occurs in the capitals of Paraíba (João Pessoa) and Pernambuco (Recife). Extreme UV events in the NEB were analyzed for the city of Natal and were associated with the absence of cloud cover and total ozone levels below the annual average; they do not occur across the entire region because of the uneven spatial distribution of these variables.
The ADL(4,1) model, fitted to UV Index and total ozone data for 2001-2012, produced a projection/extrapolation for the next 30 years (2013-2043) indicating, at the end of that period, an increase of approximately one unit in the UV Index, should total ozone maintain the downward trend observed in the study period.
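An ADL(4,1) fit of the kind described reduces to least squares on a lagged design matrix. A sketch with synthetic monthly series standing in for the UV index and total ozone (all coefficients and noise levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly series standing in for the UV index and total ozone
n = 144  # 12 years (e.g. 2001-2012) of monthly values
ozone = 270 + 10 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 2, n)
uv = 12 - 0.05 * (ozone - 270) + rng.normal(0, 0.1, n)

# ADL(4, 1) design matrix: intercept, 4 lags of UV, ozone at lags 0 and 1
p = 4
rows = [[1.0] + [uv[t - i] for i in range(1, p + 1)] + [ozone[t], ozone[t - 1]]
        for t in range(p, n)]
X = np.array(rows)
y = uv[p:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead in-sample predictions and a simple quality measure
pred = X @ beta
r = np.corrcoef(pred, y)[0, 1]
print("in-sample correlation: %.2f" % r)
```

Extrapolation then iterates the fitted equation forward, feeding predicted UV values back in as lags together with an assumed ozone path.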

Relevance: 100.00%

Abstract:

Dyslipidemia and excess weight in adolescents, when combined, suggest a progression of risk factors for cardiovascular disease (CVD). In addition, dietary habits and lifestyle have also been considered unsuitable, impacting the development of chronic diseases. The objectives of the study were: (1) to estimate the prevalence of lipid profile alterations and correlate them with body mass index (BMI), waist circumference (WC), and waist-to-height ratio (WHR) in adolescents, considering sexual maturation; (2) to determine the sources of variance in the diet and the number of days needed to estimate the usual diet of adolescents; and (3) to describe the dietary and lifestyle patterns of adolescents, family history of CVD, and age, and correlate them with patterns of CVD risk, adjusted for sexual maturation. A cross-sectional study was performed with 432 adolescents aged 10-19 years from public schools of the city of Natal, Brazil. Dyslipidemias were evaluated considering the lipid profile, the Castelli indexes I (TC/HDL) and II (LDL/HDL), and non-HDL cholesterol. The anthropometric indicators were BMI, WC, and WHR. The intake of energy and nutrients, including fiber, fatty acids, and cholesterol, was estimated from two 24-hour recalls (24HR). The lipid profile, anthropometric, and clinical variables were used in Pearson correlation and linear regression models, considering sexual maturation. The variance ratio of the diet was calculated from the within- and between-person variance components, determined by analysis of variance (ANOVA). The number of days needed to estimate the usual intake of each nutrient was obtained by assuming a hypothetical correlation (r) ≥ 0.9 between observed and true nutrient intake. We used principal component analysis (PCA) as a method of extracting factors that accounted for the dependent variables of known cardiovascular risk obtained from the lipid profile, the Castelli indexes I and II, non-HDL cholesterol, BMI, WC, and WHR.
Dietary and lifestyle patterns were obtained from the independent variables, based on the nutrients consumed and weekly physical activity. With PCA, we investigated associations between the patterns of cardiovascular risk factors and the dietary and lifestyle patterns, age, and positive family history of CVD, through bivariate and multiple logistic regression adjusted for sexual maturation. Low HDL-C was the most prevalent dyslipidemia (50.5%) among the adolescents. Significant correlations were observed between hypercholesterolemia and positive family history of CVD (r = 0.19, p < 0.01) and between hypertriglyceridemia and BMI (r = 0.30, p < 0.01), WC (r = 0.32, p < 0.01), and WHR (r = 0.33, p < 0.01). The linear model constructed with sexual maturation, age, and BMI explained about 1 to 10.4% of the variation in the lipid profile. The between-person sources of variance were greater for all nutrients in both sexes. The variance ratios were less than 1 for all nutrients and were higher in females. The results suggest that, to assess the diet of adolescents with greater precision, two days of 24HR would be enough for the intake of energy, carbohydrates, fiber, and saturated and monounsaturated fatty acids, whereas three days would be recommended for protein, lipids, polyunsaturated fatty acids, and cholesterol. Two cardiovascular risk factors were extracted in the PCA for the dependent variables, a lipid profile pattern (HDL-C and non-HDL cholesterol) and an anthropometric pattern (BMI, WC, WHR), together explaining 75% of the variance of the original data. From the independent variables, the extracted factors led to two dietary patterns, a "Western diet pattern" and a "protein diet pattern", and one lifestyle pattern, the "energy balance pattern". Together, these patterns have an explanatory power of 67%.
After adjustment for sexual maturation, the associations that remained significant in males were between puberty and the anthropometric pattern (OR = 3.32, CI 1.34-8.17) and between family history of CVD and the lipid profile pattern (OR = 2.62, CI 1.20-5.72). In female adolescents, associations were identified between age after the first stage of puberty and the anthropometric pattern (OR = 3.59, CI 1.58-8.17) and the lipid profile pattern (OR = 0.33, CI 0.15-0.75). Conclusions: Low HDL-C was the most prevalent dyslipidemia, independent of the sex and nutritional status of the adolescents. Hypercholesterolemia was influenced by family history of CVD and sexual maturation; in turn, hypertriglyceridemia was closely associated with the anthropometric indicators. The between-person variance of the diets was greater for all nutrients, which was reflected in a variance ratio of less than 1 and, consequently, in a smaller number of days required to estimate the usual diet of adolescents, considering sex. Two dietary patterns, considered unhealthy, and one lifestyle pattern, considered healthy, were extracted. Associations were found between the patterns of CVD risk and age and family history of CVD in the studied adolescents.
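The "number of days" calculation referred to above is conventionally based on within- and between-person variance components from a one-way ANOVA, with D = (r²/(1−r²))·(s²_w/s²_b). A sketch with hypothetical recall data (subject counts, means, and variances are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical energy intakes (kcal): 100 adolescents x 4 recall days
n_subj, n_days = 100, 4
subject_mean = rng.normal(2200, 300, size=(n_subj, 1))             # between-person
intake = subject_mean + rng.normal(0, 400, size=(n_subj, n_days))  # within-person

# One-way ANOVA variance components (subjects as groups)
grand = intake.mean()
ms_between = n_days * ((intake.mean(axis=1) - grand) ** 2).sum() / (n_subj - 1)
ms_within = ((intake - intake.mean(axis=1, keepdims=True)) ** 2).sum() \
            / (n_subj * (n_days - 1))
s2_within = ms_within
s2_between = (ms_between - ms_within) / n_days
ratio = s2_within / s2_between

# Days needed so that observed and true intake correlate at r >= 0.9
r = 0.9
days = (r ** 2 / (1 - r ** 2)) * ratio
print("variance ratio (within/between): %.2f" % ratio)
print("days of 24HR needed: %.1f" % days)
```

A variance ratio below 1 (between-person variance dominating) drives the required number of recall days down, which is the pattern the study reports.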

Relevance: 100.00%

Abstract:

Image compression consists of representing an image with a small amount of data without losing visual quality. Data compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, with 8 bits for each of the primary components red, green, and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reduce transmission, processing, and storage time. Many applications depend on images, which makes image compression important: medical imaging, satellite imaging, sensing, etc. In this work a new method for compressing color images is proposed. The method is based on a measure of the information in each band. The technique is called Self-Adaptive Compression (SAC), and each band of the image is compressed with a different threshold in order to preserve information with better results: SAC applies strong compression to highly redundant bands, i.e., those carrying less information, and soft compression to bands with a larger amount of information. Two image transforms are used in the technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step is to convert the data into uncorrelated bands with PCA; the DCT is then applied to each band. Loss occurs when a threshold discards coefficients. This threshold is calculated from two elements: the PCA result and a user parameter that defines the compression rate. The system produces three different thresholds, one for each band of the image, proportional to its amount of information. Image reconstruction is performed with the inverse DCT and inverse PCA. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (Mean Squared Error). Tests showed that SAC has better quality at high compression rates, with two advantages: (a) being adaptive, it is sensitive to the image type, i.e., it presents good results for diverse kinds of images (synthetic, landscapes, people, etc.); and (b) it needs only one user parameter, i.e., very little human intervention is required.
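The PCA-then-DCT pipeline with band-proportional thresholds can be sketched as follows; the threshold rule below (quantile cut scaled by each band's variance share) is an illustrative stand-in for the published SAC thresholds:

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(7)

# Hypothetical smooth 3-band image with strongly correlated bands
yy, xx = np.mgrid[0:64, 0:64] / 64.0
base = 0.5 * np.sin(2 * np.pi * xx) * np.cos(2 * np.pi * yy) + 0.5
img = np.stack([0.9 * base, 0.7 * base, 0.5 * base]) \
      + 0.02 * rng.standard_normal((3, 64, 64))

# Step 1: PCA decorrelates the bands
X = img.reshape(3, -1)
mean = X.mean(axis=1, keepdims=True)
eigvals, eigvecs = np.linalg.eigh(np.cov(X))
order = np.argsort(eigvals)[::-1]
P = eigvecs[:, order].T
pcs = (P @ (X - mean)).reshape(3, 64, 64)

# Step 2: DCT each PC band; cut low-information bands harder
compressed, kept = [], []
for band, ev in zip(pcs, eigvals[order]):
    coef = dctn(band, norm="ortho")
    thr = np.quantile(np.abs(coef), 1.0 - 0.5 * ev / eigvals.sum())
    coef = np.where(np.abs(coef) >= thr, coef, 0.0)
    kept.append(float((coef != 0).mean()))
    compressed.append(coef)

# Reconstruction: inverse DCT, then inverse PCA
rec_pcs = np.stack([idctn(c, norm="ortho") for c in compressed])
rec = (P.T @ rec_pcs.reshape(3, -1) + mean).reshape(3, 64, 64)
mse = float(((img - rec) ** 2).mean())
print("kept fraction per PC band:", np.round(kept, 3))
print("reconstruction MSE: %.5f" % mse)
```

The first PC band, carrying nearly all the information, keeps far more coefficients than the redundant bands, yet the reconstruction error stays small.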

Relevance: 100.00%

Abstract:

The oil industry has several segments that can impact the environment. Among these, produced water stands out as an environmental problem because of the great volume generated and its toxic composition; these waters are the major source of waste in the oil industry. The composition of produced water is strongly dependent on the production field. A good example is the wastewater produced at a Petrobras operating unit in Rio Grande do Norte and Ceará (UO-RNCE), where a single effluent treatment station (ETS) receives effluent from 48 wells (onshore and offshore), leading to large fluctuations in water quality that can become a complicating factor for downstream treatment processes. The present work aims to provide a diagnosis of produced water samples from the UO-RNCE with respect to certain physical and physico-chemical parameters (chloride concentration, conductivity, dissolved oxygen, pH, TOG (oil & grease), nitrate concentration, turbidity, salinity, and temperature). The effluent was analyzed by means of an MP TROLL 9500 multiparameter probe, a TOG/TPH Infracal analyzer from Wilks Enterprise Corp. (Model HATR-T), and a Digimed MD-31 conductivity meter. Results were analyzed by univariate and multivariate analysis (principal component analysis) associated with statistical control charts. The multivariate analysis showed a negative correlation between dissolved oxygen and turbidity (-0.55) and positive correlations between salinity and chloride (1.0) and among conductivity, chloride, and salinity (0.70). It also showed that seven principal components explain the variability of the parameters. Salinity, conductivity, and chloride were the most important variables, with the highest sampling variance. The statistical control charts helped to establish a general trend among the evaluated physical and chemical parameters.
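The control-chart step can be illustrated with a Shewhart individuals chart on a single parameter; the chloride readings below are simulated, and the 3-sigma limits use the sample standard deviation of a stable baseline rather than the usual moving-range estimate:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical chloride readings (mg/L) from the produced-water effluent
chloride = rng.normal(1500, 120, size=60)
chloride[45:] += 400   # simulated shift in the process

# Shewhart individuals chart: 3-sigma limits from the stable baseline phase
baseline = chloride[:30]
center = baseline.mean()
sigma = baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

out_of_control = np.where((chloride > ucl) | (chloride < lcl))[0]
print("limits: %.0f .. %.0f" % (lcl, ucl))
print("out-of-control samples:", out_of_control)
```

Samples after the simulated shift fall outside the limits, which is how such charts flag a change in effluent quality across the 48 feeding wells.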

Relevance: 100.00%

Abstract:

In this work we used chemometric tools to classify and quantify the protein content in samples of powdered milk, applying NIR diffuse reflectance spectroscopy combined with multivariate techniques. First, we carried out an exploratory analysis of the samples by principal component analysis (PCA), followed by classification with soft independent modeling of class analogy (SIMCA), which made it possible to group the samples by similarities in their composition. Finally, partial least squares regression (PLS) and principal component regression (PCR) allowed the quantification of the protein content in the powdered milk samples, compared with the Kjeldahl reference method. A total of 53 samples of powdered milk sold in the metropolitan areas of Natal, Salvador, and Rio de Janeiro were acquired for analysis; after data pre-treatment, four models were built and employed for the classification and quantification of the samples. Once assessed and validated, the methods showed good performance, accuracy, and reliability, showing that NIR spectroscopy can serve as a non-invasive technique that produces no waste and saves time in the analysis of the samples.

Relevance: 100.00%

Abstract:

For consumer safety, the presence of pathogenic contaminants in foods must be monitored, because they are responsible for foodborne outbreaks that, depending on the level of contamination, can ultimately cause the death of those who consume them. In industry, this identification must be fast and cost-effective. This study shows the utility and application of near-infrared (NIR) transflectance spectroscopy as an alternative method for the identification and classification of Escherichia coli and Salmonella Enteritidis in commercial (pineapple) fruit pulp. Principal Component Analysis (PCA), Soft Independent Modeling of Class Analogy (SIMCA), and Partial Least Squares Discriminant Analysis (PLS-DA) were used in the analysis. It was not possible to obtain total separation between samples using PCA and SIMCA. PLS-DA showed good prediction capacity, reaching 87.5% for E. coli and 88.3% for S. Enteritidis. The best models were obtained with PLS-DA on second-derivative spectra, with a sensitivity and specificity of 0.87 and 0.83, respectively. These results suggest that NIR spectroscopy and PLS-DA can be used to discriminate and detect bacteria in fruit pulp.
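The reported sensitivity and specificity follow from a standard confusion-matrix calculation, sketched here with hypothetical predictions (the label vectors are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical PLS-DA predictions vs. reference for contaminated (1) / clean (0)
y_true = np.array([1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0])

tp = np.sum((y_true == 1) & (y_pred == 1))   # contaminated, flagged
tn = np.sum((y_true == 0) & (y_pred == 0))   # clean, cleared
fp = np.sum((y_true == 0) & (y_pred == 1))   # clean, wrongly flagged
fn = np.sum((y_true == 1) & (y_pred == 0))   # contaminated, missed

sensitivity = tp / (tp + fn)   # fraction of contaminated samples detected
specificity = tn / (tn + fp)   # fraction of clean samples correctly cleared
print("sensitivity: %.3f, specificity: %.3f" % (sensitivity, specificity))
```

For food-safety screening, sensitivity (not missing contaminated pulp) is usually the figure of merit to maximize, with specificity controlling the false-alarm cost.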