45 results for Hopfield neural network


Relevance: 80.00%

Abstract:

We use networks composed of three phase-locked loops (PLLs), one of which acts as the master, for recognizing noisy images. The values of the coupling weights among the PLLs determine the noise level up to which the input image can still be successfully identified. Analytical results and numerical tests concerning the scheme's performance are presented. (c) 2008 Elsevier B.V. All rights reserved.
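The dynamics described above can be sketched as a small network of coupled phase oscillators, a common simplification of PLL networks. The weights, natural frequencies, and master-slave topology below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def simulate_plls(weights, omega, steps=5000, dt=0.01):
    """Euler integration of a small PLL network, modelled here as
    coupled phase oscillators: each unit drifts at its natural
    frequency and is pulled toward its neighbours via the weights."""
    n = len(omega)
    theta = np.zeros(n)
    for _ in range(steps):
        dtheta = omega + np.array([
            sum(weights[i][j] * np.sin(theta[j] - theta[i]) for j in range(n))
            for i in range(n)
        ])
        theta = theta + dt * dtheta
    return theta

# hypothetical topology: master (index 0) drives two slaves,
# slaves do not feed back to the master
W = [[0.0, 0.0, 0.0],
     [2.0, 0.0, 0.0],
     [2.0, 0.0, 0.0]]
omega = np.array([1.0, 1.1, 0.9])  # slightly mismatched frequencies
theta = simulate_plls(W, omega)
```

With coupling much stronger than the frequency mismatch, the slaves phase-lock to the master with a small constant offset (sin of the offset equals mismatch divided by coupling strength).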


Accurate price forecasting for agricultural commodities can have significant decision-making implications for suppliers, especially those of biofuels, where the agriculture and energy sectors intersect. Environmental pressures and high oil prices affect demand for biofuels and have reignited the discussion about effects on food prices. Suppliers in the sugar-alcohol sector need to decide the ideal proportion of ethanol and sugar to optimise their financial strategy. Prices can be affected by exogenous factors, such as exchange rates and interest rates, as well as by non-observable variables such as the convenience yield, which is related to supply shortages. The literature generally uses two approaches: artificial neural networks (ANNs), which are recognised as being at the forefront of exogenous-variable analysis, and stochastic models such as the Kalman filter, which can account for non-observable variables. This article proposes a hybrid model for forecasting the prices of agricultural commodities that is built upon both approaches and is applied to forecast the price of sugar. The Kalman filter captures the structure of the stochastic process that describes the evolution of prices. The neural network accommodates variables that can impact asset prices in indirect, nonlinear ways, which cannot easily be incorporated into traditional econometric models.
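The stochastic half of such a hybrid can be illustrated with a minimal scalar Kalman filter tracking a random-walk price. The actual state-space specification, the convenience-yield state, and the ANN component are beyond this sketch, and all parameters here are assumed for illustration.

```python
import numpy as np

def kalman_filter_1d(obs, q=1e-4, r=0.05):
    """Scalar Kalman filter for a random-walk latent price observed
    with noise. q is the process-noise variance, r the
    observation-noise variance (both illustrative)."""
    x, p = obs[0], 1.0
    out = []
    for z in obs:
        p = p + q               # predict: uncertainty grows
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)     # update with the new observation
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

# synthetic data: a flat "true" price plus observation noise
rng = np.random.default_rng(0)
true_price = 10.0
obs = true_price + 0.3 * rng.normal(size=200)
filtered = kalman_filter_1d(obs)
```

After the initial transient, the filtered series sits much closer to the underlying price than the raw observations do.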


Leaf wetness duration (LWD) is a key parameter in agricultural meteorology, since it is related to the epidemiology of many important crops, controlling pathogen infection and development rates. Because LWD is not widely measured, several methods have been developed to estimate it from weather data. Among the models used to estimate LWD, those based on the physical principles of dew formation and dew and/or rain evaporation have shown good portability and sufficiently accurate results, but their complexity is a disadvantage for operational use. Alternatively, empirical models have been used despite their limitations. The simplest empirical models use only relative humidity (RH) data. The objective of this study was to evaluate the performance of three RH-based empirical models for estimating LWD in four regions around the world with different climate conditions. Hourly LWD, air temperature, and relative humidity data were obtained from Ames, IA (USA); Elora, Ontario (Canada); Florence, Tuscany (Italy); and Piracicaba, Sao Paulo State (Brazil). These data were used to evaluate the performance of the following empirical LWD estimation models: constant RH threshold (RH >= 90%); dew point depression (DPD); and extended RH threshold (EXT_RH). The models performed differently across the four locations. In Ames, Elora, and Piracicaba, the RH >= 90% and DPD models underestimated LWD, whereas in Florence these methods overestimated LWD, especially for shorter wet periods. When the EXT_RH model was used, LWD was overestimated at all locations, with a significant increase in the errors. In general, the RH >= 90% model performed best, presenting the highest overall fraction of correct estimates (FC), between 0.87 and 0.92, and the lowest false alarm ratio (FAR), between 0.02 and 0.31.
The use of location-specific thresholds substantially improved the accuracy of the RH model, even when independent data were used; MAE ranged from 1.23 to 1.89 h, very similar to the errors obtained with published physical models for LWD estimation. Based on these results, we conclude that, if calibrated locally, LWD can be estimated with acceptable accuracy from RH above a specific threshold, and that the EXT_RH method was unsuitable for estimating LWD at the locations used in this study. (C) 2007 Elsevier B.V. All rights reserved.
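The simplest of these empirical models, the constant RH threshold, amounts to counting the hours at or above a humidity cutoff. A minimal sketch (the hourly values below are invented):

```python
def lwd_from_rh(hourly_rh, threshold=90.0):
    """Estimate leaf wetness duration (in hours) as the number of
    hours with relative humidity at or above the threshold."""
    return sum(1 for rh in hourly_rh if rh >= threshold)

# five invented hourly RH readings (%)
sample = [95, 92, 89, 90, 70]
hours_wet = lwd_from_rh(sample)          # counts 95, 92, 90
hours_wet_local = lwd_from_rh(sample, threshold=92)  # locally calibrated cutoff
```

Raising or lowering the threshold is exactly the local calibration step the abstract reports as substantially improving accuracy.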


A graph clustering algorithm constructs groups of closely related parts and machines separately. After the two groupings are matched to minimize intercell moves, a refining process runs on the initial cell formation to further reduce the number of intercell moves. A simple modification of this main approach can handle practical constraints, such as the common constraint of bounding the maximum number of machines in a cell. Our approach substantially improves computational time; more importantly, it improves the number of intercell moves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
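The objective being refined, the number of intercell moves, can be computed directly from part routings and a machine-to-cell assignment. A minimal sketch, with hypothetical machine names and routings:

```python
def intercell_moves(part_routes, machine_cell):
    """Count intercell moves: for each part, every transition between
    consecutive machines located in different cells is one move."""
    moves = 0
    for route in part_routes:
        for a, b in zip(route, route[1:]):
            if machine_cell[a] != machine_cell[b]:
                moves += 1
    return moves

# hypothetical data: two cells, two part routings
cells = {"M1": 0, "M2": 0, "M3": 1}
routes = [["M1", "M2", "M3"],  # one move: M2 -> M3 crosses cells
          ["M3", "M1"]]        # one move: M3 -> M1 crosses cells
total = intercell_moves(routes, cells)
```

A refining pass would try reassignments of machines between cells and keep those that lower this count.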


Rats were trained in a Pavlovian serial ambiguous target discrimination, in which a target cue was reinforced if it was preceded by one stimulus (P -> T+) but was not reinforced if it was preceded by another stimulus (N -> T-). Test performance indicated that stimulus control by these features was weaker than that acquired by features trained within separate serial feature positive (P -> T+, T-) and serial feature negative (N -> W-, W+) discriminations. The form of conditioned responding and the patterns of transfer observed suggested that the serial ambiguous target discrimination was solved by occasion setting. The data are discussed in terms of the use of retrospective coding strategies when solving Pavlovian serial conditional discriminations, and the acquisition of special properties by both feature and target stimuli. (C) 2008 Published by Elsevier B.V.


We discuss the expectation propagation (EP) algorithm for approximate Bayesian inference using a factorizing posterior approximation. For neural network models, we use a central limit theorem argument to make EP tractable when the number of parameters is large. For two types of models, we show that EP can achieve optimal generalization performance when data are drawn from a simple distribution.


There is no specific test to diagnose Alzheimer's disease (AD); its diagnosis must be based on clinical history, neuropsychological and laboratory tests, neuroimaging, and electroencephalography (EEG). Therefore, new approaches are necessary to enable earlier and more accurate diagnosis and to follow treatment results. In this study we used a Machine Learning (ML) technique, the Support Vector Machine (SVM), to search for patterns in EEG epochs that differentiate AD patients from controls. As a result, we developed a quantitative EEG (qEEG) processing method for the automatic differentiation of AD patients from normal individuals, as a complement to the diagnosis of probable dementia. We studied EEGs from 19 normal subjects (14 females/5 males, mean age 71.6 years) and 16 patients with probable mild-to-moderate AD (14 females/2 males, mean age 73.4 years). The analysis of EEG epochs yielded 79.9% accuracy and 83.2% sensitivity; the analysis considering the diagnosis of each individual patient reached 87.0% accuracy and 91.7% sensitivity.
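The classification step can be sketched with scikit-learn's SVC on purely synthetic two-dimensional "spectral" features; the study's actual qEEG feature extraction and electrode data are not reproduced here, and the separation between the two synthetic classes is an assumption.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# invented 2-D features for 40 "controls" and 40 "patients",
# drawn from two overlapping Gaussian clusters
controls = rng.normal(loc=[1.0, 0.5], scale=0.2, size=(40, 2))
patients = rng.normal(loc=[0.5, 1.0], scale=0.2, size=(40, 2))
X = np.vstack([controls, patients])
y = np.array([0] * 40 + [1] * 40)

# RBF-kernel SVM, evaluated with 5-fold cross-validation
clf = SVC(kernel="rbf")
scores = cross_val_score(clf, X, y, cv=5)
```

Cross-validated accuracy on held-out epochs, rather than training accuracy, is the honest analogue of the per-epoch figures the abstract reports.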


The mechanisms underlying the effects of antidepressant treatment in patients with Parkinson's disease (PD) are unclear. The neural changes after successful therapy, investigated by neuroimaging methods, can give insights into the mechanisms of action of a specific treatment choice. To study the mechanisms of neural modulation of repetitive transcranial magnetic stimulation (rTMS) and fluoxetine, 21 depressed PD patients were randomized into two active treatment groups for 4 wk: active rTMS over the left dorsolateral prefrontal cortex (DLPFC) (5 Hz rTMS; 120% motor threshold) with placebo pill, or sham rTMS with fluoxetine 20 mg/d. Event-related functional magnetic resonance imaging (fMRI) with emotional stimuli was performed before and after treatment, in two sessions (test and re-test) at each time-point. The two treatment groups showed a significant and similar mood improvement. After rTMS treatment, there were brain activity decreases in the left fusiform gyrus, cerebellum, and right DLPFC, and brain activity increases in the left DLPFC and anterior cingulate gyrus compared to baseline. In contrast, after fluoxetine treatment, there were brain activity increases in the right premotor and right medial prefrontal cortex. There was a significant group-by-time interaction effect in the left medial prefrontal cortex, suggesting that activity in this area changed differently in the two treatment groups. Our findings show that the antidepressant effects of rTMS and fluoxetine in PD are associated with changes in different areas of the depression-related neural network.


The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become a focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROIs) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROIs. In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid the temporal information loss incurred during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI) and may, in turn, lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping.
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
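The Granger-causality core that CGA builds on can be illustrated in its simplest univariate form: past values of one series "Granger-cause" another if including them shrinks the prediction residuals. The lag-1 regression and synthetic driving relationship below are a deliberate simplification of the multivariate, PCA-based CGA method itself.

```python
import numpy as np

def granger_stat(x, y, lag=1):
    """Simple lag-1 Granger statistic: ratio of restricted to full
    residual sum of squares when predicting y. Values well above 1
    indicate that past x helps predict y."""
    yp, xp, yt = y[:-lag], x[:-lag], y[lag:]
    # restricted model: y_t regressed on y_{t-1} only
    A = np.column_stack([np.ones_like(yp), yp])
    res_r = yt - A @ np.linalg.lstsq(A, yt, rcond=None)[0]
    # full model: y_t regressed on y_{t-1} and x_{t-1}
    B = np.column_stack([np.ones_like(yp), yp, xp])
    res_f = yt - B @ np.linalg.lstsq(B, yt, rcond=None)[0]
    return (res_r @ res_r) / (res_f @ res_f)

# synthetic data where x drives y with one step of delay
rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()
```

Here `granger_stat(x, y)` is large while `granger_stat(y, x)` stays near 1, recovering the direction of the simulated influence.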


Objective: To develop a model to predict the bleeding source and identify the cohort amongst patients with acute gastrointestinal bleeding (GIB) who require urgent intervention, including endoscopy. Patients with acute GIB, an unpredictable event, are most commonly evaluated and managed by non-gastroenterologists. Rapid and consistently reliable risk stratification of patients with acute GIB for urgent endoscopy may potentially improve outcomes amongst such patients by targeting scarce health-care resources to those who need them the most. Design and methods: Using ICD-9 codes for acute GIB, 189 patients with acute GIB and all available data variables required to develop and test models were identified from a hospital medical records database. Data on 122 patients were utilized to develop the models and data on 67 patients were utilized for comparative analysis of the models. Clinical data such as presenting signs and symptoms, demographic data, presence of co-morbidities, laboratory data, and the corresponding endoscopic diagnoses and outcomes were collected. The clinical data and endoscopic diagnosis collected for each patient were utilized to retrospectively ascertain optimal management for each patient, and the clinical presentations with their corresponding treatment were utilized as training examples. Eight mathematical models, including an artificial neural network (ANN), support vector machine (SVM), k-nearest neighbor, linear discriminant analysis (LDA), shrunken centroid (SC), random forest (RF), logistic regression, and boosting, were trained and tested. The performance of these models was compared using standard statistical analysis and ROC curves. Results: Overall, the random forest model best predicted the source, need for resuscitation, and disposition, with accuracies of approximately 80% or higher (accuracy for endoscopy was greater than 75%).
The area under the ROC curve for RF was greater than 0.85, indicating excellent performance by the random forest model. Conclusion: While most mathematical models are effective as a decision support system for the evaluation and management of patients with acute GIB, in our testing the RF model consistently demonstrated the best performance. Amongst patients presenting with acute GIB, mathematical models may facilitate the identification of the source of GIB and the need for intervention, and allow optimization of care and health-care resource allocation; these, however, require further validation. (c) 2007 Elsevier B.V. All rights reserved.
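A sketch of the random-forest step, assuming scikit-learn's RandomForestClassifier and invented triage features; the real study's variables, labels, and sample sizes differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
# hypothetical triage features: [heart_rate, systolic_bp, hemoglobin]
# drawn from two invented distributions for upper vs. lower GIB sources
upper = rng.normal(loc=[110.0, 95.0, 9.0], scale=[10.0, 8.0, 1.0], size=(60, 3))
lower = rng.normal(loc=[85.0, 120.0, 12.5], scale=[10.0, 8.0, 1.0], size=(60, 3))
X = np.vstack([upper, lower])
y = np.array(["upper"] * 60 + ["lower"] * 60)

# an ensemble of 100 decision trees, fixed seed for reproducibility
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
train_acc = clf.score(X, y)
prediction = clf.predict([[112.0, 93.0, 8.8]])  # one hypothetical patient
```

In practice, the accuracy figures the abstract reports would come from a held-out test set rather than training data, which is shown here only for brevity.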


Immunological systems have been an abundant source of inspiration for contemporary computer scientists. Problem-solving strategies stemming from known immune-system phenomena have been successfully applied to challenging problems of modern computing. Simulation systems and mathematical modeling are also beginning to be used to answer more complex immunological questions, such as the immune memory process and the duration of vaccines, whose regulation mechanisms are still not sufficiently understood (Lundegaard, Lund, Kesmir, Brunak, & Nielsen, 2007). In this article we studied an in machina approach to simulating the process of antigenic mutation and its implications for the memory process. Our results suggest that the durability of immune memory is affected by the process of antigenic mutation and by the populations of soluble antibodies in the blood. The results also strongly suggest that a decrease in the production of antibodies favors the global maintenance of immune memory.


In temporal lobe epilepsy (TLE) seizures, tonic or clonic motor behaviors (TCB) are commonly associated with automatisms, versions, and vocalizations, and frequently occur during secondary generalization. Dystonias are a common finding and appear to be associated with automatisms and head deviation, but have never been directly linked to generalized tonic or clonic behaviors. The objective of the present study was to assess whether dystonias and TCB are coupled in the same seizure or are associated in an antagonistic and exclusive pattern. Ninety-one seizures in 55 patients with TLE due to mesial temporal sclerosis were analyzed. Only patients with postsurgical seizure outcome of Engel class I or II were included. Presence or absence of dystonia and secondary generalization was recorded. Occurrence of dystonia and occurrence of bilateral tonic or clonic behaviors were negatively correlated. Dystonia and TCB may be implicated in exclusive, non-coincidental, or even antagonistic effects or phenomena in TLE seizures. A neural network related to the expression of one behavioral response (e.g., basal ganglia activation and dystonia) might theoretically "displace" brain activation or disrupt the synchronism of another network implicated in pathological circuit reverberation and seizure expression. The involvement of basal ganglia in the blockade of convulsive seizures has long been observed in animal models. The question is: Do dystonia and underlying basal ganglia activation represent an attempt of the brain to block imminent secondary generalization? (C) 2007 Elsevier Inc. All rights reserved.


The seasonal evolution of daily and hourly values of global and diffuse solar radiation at the surface is compared for the cities of Sao Paulo and Botucatu, both located in Southeast Brazil and representative of urban and rural areas, respectively. The comparisons are based on measurements of global and diffuse solar irradiance carried out at the surface during a six-year simultaneous period in these two cities. Despite the similar latitude and altitude, the seasonal evolution of daily values indicates that Sao Paulo receives, during clear-sky days, 7.8% less global irradiance in August and 5.1% less in June than Botucatu. On the other hand, Sao Paulo receives, during clear-sky days, 3.6% more diffuse irradiance in August and 15.6% more in June than Botucatu. The seasonal variation of the diurnal cycle confirms these differences and indicates that they are more pronounced during the afternoon. The regional differences are related to the distance from the Atlantic Ocean, the systematic penetration of the sea breeze, and the daytime evolution of particulate matter in Sao Paulo. An important mechanism controlling the spatial distribution of solar radiation on a regional scale is the sea-breeze penetration in Sao Paulo, bringing moisture and maritime aerosol that further increase the solar radiation scattering due to pollution and further reduce the intensity of the direct component of solar radiation at the surface. Surprisingly, under clear-sky conditions the atmospheric attenuation of solar radiation in Botucatu during winter - the biomass-burning period due to the sugar cane harvest - is equivalent to that at Sao Paulo City, indicating that the contamination during the sugar cane harvest in Southeast Brazil has a large impact on the solar radiation field at the surface.


We present a catalogue of galaxy photometric redshifts and k-corrections for the Sloan Digital Sky Survey Data Release 7 (SDSS-DR7), available on the World Wide Web. The photometric redshifts were estimated with an artificial neural network using the five ugriz bands, concentration indices, and Petrosian radii in the g and r bands. We explored our redshift estimates with different training sets, concluding that the best choice for improving redshift accuracy comprises the main galaxy sample (MGS), the luminous red galaxies, and the galaxies of active galactic nuclei covering the redshift range 0 < z < 0.3. For the MGS, the photometric redshift estimates agree with the spectroscopic values within rms = 0.0227. The distribution of photometric redshifts derived in the range 0 < z(phot) < 0.6 agrees well with the model predictions. k-corrections were derived by calibrating the k-correct_v4.2 code results for the MGS with the reference-frame (z = 0.1) (g - r) colours. We adopt a linear dependence of k-corrections on redshift and (g - r) colour that provides suitable distributions of luminosity and colours for galaxies up to redshift z(phot) = 0.6, comparable to the results in the literature. Our k-correction estimation procedure is thus a powerful, computationally inexpensive algorithm capable of reproducing suitable results that can be used for testing galaxy properties at intermediate redshifts using the large SDSS database.
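The linear dependence of k-corrections on redshift and (g - r) colour can be written as a simple bilinear form anchored at the reference frame z = 0.1. The coefficients below are placeholders for illustration only, not the calibrated values from the catalogue.

```python
def k_correction(z, g_minus_r, a=0.5, b=1.2, z0=0.1):
    """Toy linear k-correction: proportional to (z - z0) with a slope
    that depends linearly on (g - r) colour. Coefficients a and b are
    illustrative placeholders, not fitted values."""
    return (a + b * g_minus_r) * (z - z0)

# by construction, the correction vanishes at the reference redshift
k_at_ref = k_correction(0.1, 0.8)
# and grows with redshift for a fixed colour
k_mid = k_correction(0.2, 0.8)
k_far = k_correction(0.3, 0.8)
```

The appeal of such a form is exactly what the abstract claims: once calibrated, evaluating it costs almost nothing compared to full template-fitting codes.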


We present preliminary results for the estimation of barium ([Ba/Fe]) and strontium ([Sr/Fe]) abundance ratios using medium-resolution spectra (1-2 angstrom). We established a calibration between the abundance ratios and line indices for Ba and Sr, using multiple regression and artificial neural network techniques. A comparison between the two techniques (showing the advantage of the latter), as well as a discussion of future work, is presented.