745 results for Neo-Fuzzy Neuron
Abstract:
Graduate Program in Mechanical Engineering - FEG
Abstract:
Background and aims: The boundaries between the categories of body composition provided by vectorial analysis of bioimpedance are not well defined. In this paper, fuzzy set theory was used to model this uncertainty. Methods: An Italian database of 179 cases (18-70 years) was divided randomly into a development sample (n = 20) and a testing sample (n = 159). Of the 159 records in the testing sample, 99 had an unequivocal diagnosis. Resistance/height and reactance/height were the input variables of the model. The output variables were the seven categories of body composition of vectorial analysis. For each case, the linguistic model estimated the membership degree of each impedance category. Kappa statistics were used to compare these results with the previously established diagnoses, which required singling out one category from the output set of seven membership degrees. This procedure (the defuzzification rule) established that the category with the highest membership degree is the most likely category for the case. Results: The fuzzy model showed a good fit to the development sample. Excellent agreement was achieved between the defuzzified impedance diagnoses and the clinical diagnoses in the testing sample (Kappa = 0.85, p < 0.001). Conclusions: The fuzzy linguistic model was in good agreement with clinical diagnoses. If the whole model output is considered, information on the extent to which each BIVA category is present better informs clinical practice, with an enlarged nosological framework and diverse therapeutic strategies. (C) 2012 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
Abstract:
There are several variants of the widely used Fuzzy C-Means (FCM) algorithm that support clustering data distributed across different sites. These methods have been studied under different names, such as collaborative and parallel fuzzy clustering. In this study, we augment two FCM-based clustering algorithms used to cluster distributed data by deriving constructive ways of determining the essential parameters of the algorithms (including the number of clusters) and by forming a set of systematically structured guidelines, such as how to select a specific algorithm depending on the nature of the data environment and the assumptions made about the number of clusters. A thorough complexity analysis, covering space, time, and communication aspects, is reported. A series of detailed numeric experiments illustrates the main ideas discussed in the study.
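For reference, the plain single-site FCM loop that these distributed variants build on can be sketched as follows (a minimal sketch of the standard Bezdek iteration, not the collaborative/parallel algorithms studied in the paper):

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, tol=1e-5, seed=0):
    """Plain (single-site) Fuzzy C-Means. X: (n, d) data; c: number of
    clusters; m > 1: fuzzifier. Returns memberships U (c, n), centers V (c, d)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                                  # columns sum to 1
    for _ in range(iters):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)    # membership-weighted centers
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2)  # (c, n) distances
        D = np.fmax(D, 1e-12)                           # avoid division by zero
        inv = D ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=0)                   # u_ki ∝ d_ki^(-2/(m-1))
        if np.abs(U_new - U).max() < tol:               # converged
            return U_new, V
        U = U_new
    return U, V
```

The distributed variants discussed in the abstract exchange summaries (e.g. prototypes or membership proxies) between sites instead of raw data; the update structure above stays the same.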
Abstract:
This paper sets forth a Neo-Kaleckian model of capacity utilization and growth with distribution featuring a profit-sharing arrangement. While a given proportion of firms compensate workers with only a base wage, the remaining proportion do so with a base wage and a share of profits. Consistent with the empirical evidence, workers hired by profit-sharing firms have a higher productivity than their counterparts in base-wage firms. While a higher profit-sharing coefficient raises capacity utilization and growth irrespective of the distribution of compensation strategies across firms, a higher frequency of profit-sharing firms does likewise only if the profit-sharing coefficient is sufficiently high.
Abstract:
A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖_∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms into the realm of graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1, ∞]. Of these, the best known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that the minimization problem for ‖F_P‖_q, q ∈ [1, ∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q.
Thus, any algorithm GC_sum solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1, ∞), so just two algorithms, GC_sum and GC_max, are enough to solve all ‖F_P‖_q minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q minimization problems converge to a solution of the ‖F_P‖_∞ minimization problem (the fact that ‖F_P‖_∞ = lim_{q→∞} ‖F_P‖_q is not enough to deduce that). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. This concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of seeds on the output.
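The weight-substitution argument in this abstract can be spelled out in one line (writing bd(P) for the boundary of the object P):

```latex
\|F_P\|_q^q \;=\; \sum_{e \in \mathrm{bd}(P)} w(e)^q
           \;=\; \sum_{e \in \mathrm{bd}(P)} \tilde w(e), \qquad \tilde w := w^q .
```

Since t ↦ t^{1/q} is strictly increasing, an object P minimizes the ℓq energy under weights w exactly when it minimizes the ℓ1 energy under the substituted weights w̃ = w^q, which is why a single ℓ1 solver (GC_sum) covers every finite q.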
Abstract:
Circadian rhythms in pacemaker cells persist for weeks in constant darkness, while in other types of cells the molecular oscillations that underlie circadian rhythms damp rapidly under the same conditions. Although much progress has been made in understanding the biochemical and cellular basis of circadian rhythms, the mechanisms leading to damped or self-sustained oscillations remain largely unknown. Many mathematical models reproduce the circadian rhythms of a single clock cell of the fly Drosophila; however, not much is known about the mechanisms leading to coherent circadian oscillation in clock neuron networks. In this work we have implemented a model of a network of interacting clock neurons to describe the emergence (or damping) of circadian rhythms in Drosophila in the absence of zeitgebers. Our model consists of an array of pacemakers that interact through the modulation of some parameters by a network feedback. The individual pacemakers are described by a well-known biochemical model for circadian oscillation, to which we have added degradation of PER protein by light and multiplicative noise. The network feedback is the PER protein level averaged over the whole network. In particular, we have investigated the effect of modulating the parameters associated with (i) the control of the net entrance of PER into the nucleus and (ii) the non-photic degradation of PER. Our results indicate that modulation of PER entrance into the nucleus allows the synchronization of clock neurons, leading to coherent circadian oscillations under constant-dark conditions. On the other hand, modulation of non-photic degradation cannot reset the phases of individual clocks subjected to intrinsic biochemical noise.
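The synchronization-by-global-feedback mechanism can be illustrated with a toy model. The sketch below is a noisy Kuramoto-style mean-field system (all names and parameter values are illustrative), not the biochemical PER model the paper actually uses; it only shows how coupling each oscillator to a population average can turn incoherent noisy clocks into a coherent rhythm:

```python
import numpy as np

def coherence(K, N=50, steps=5000, dt=0.01, noise=0.2, seed=1):
    """Noisy phase oscillators coupled through the population mean field,
    a stand-in for clock neurons modulated by the network-averaged PER level.
    Returns the order parameter r in [0, 1]; r near 1 means coherent rhythm."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(1.0, 0.05, N)            # heterogeneous intrinsic rates
    theta = rng.uniform(0.0, 2 * np.pi, N)      # random initial phases
    for _ in range(steps):
        z = np.mean(np.exp(1j * theta))         # mean field (the "network feedback")
        r, psi = np.abs(z), np.angle(z)
        theta += dt * (omega + K * r * np.sin(psi - theta))   # pull toward mean phase
        theta += np.sqrt(dt) * noise * rng.normal(size=N)     # intrinsic noise
    return float(np.abs(np.mean(np.exp(1j * theta))))
```

With the feedback on (K around 1-2) the population locks into a coherent oscillation despite the noise; with K = 0 the phases stay incoherent, mirroring the damped versus self-sustained regimes discussed above.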
Abstract:
Background: A possible viral etiology has been documented in the genesis of motor neuron disorders and acquired peripheral neuropathies, mainly owing to the vulnerability of peripheral nerves and the anterior horn to certain viruses. In recent years, several reports have shown an association of HIV infection with Amyotrophic Lateral Sclerosis syndrome, motor neuron diseases and peripheral neuropathies. Objective: To report a case of an association between motor neuron disease and acquired axonal neuropathy in HIV infection, and to describe the findings of the neurological examination, cerebrospinal fluid, neuroimaging and electrophysiology. Methods: The patient underwent neurological examination. General medical examinations were performed, including specific neuromuscular tests, analysis of cerebrospinal fluid, muscle biopsy and imaging studies. Results and Discussion: The initial clinical presentation of our case was marked by cramps and fasciculations, followed by distal paresis and atrophy in the left arm. Electromyography tracings showed deficits of the anterior horn of the spinal cord and of peripheral nerves. Dysphagia and release of primitive reflexes were also identified. At the same time, the patient was found to be HIV positive with a high viral load. He received antiretroviral therapy, with control of the viral load but no clinical remission. Conclusion: Motor neuron disorders and peripheral neuropathy may occur in association with HIV infection. However, a causal relationship remains uncertain. It is noteworthy that the antiretroviral regimen may be implicated in some cases.
Abstract:
This work proposes the development of an Adaptive Neuro-Fuzzy Inference System (ANFIS) estimator applied to speed control in a three-phase induction motor sensorless drive. Usually, ANFIS is used to replace the traditional PI controller in induction motor drives; evaluating the estimation capability of ANFIS in a sensorless drive is one of the contributions of this work. The ANFIS speed estimator is validated in a magnetizing-flux-oriented control scheme, which constitutes a further contribution. As an open-loop estimator, it is intended for moderate-performance drives; solving the low- and zero-speed estimation problems is outside the scope of this work. Simulations to evaluate the performance of the estimator in the vector drive system were carried out in Matlab/Simulink(R). To determine the benefits of the proposed model, a practical system was implemented using a voltage source inverter (VSI) to drive the motor, with the vector control, including the ANFIS estimator, executed via the Real Time Toolbox for Matlab/Simulink(R) and a National Instruments data acquisition card.
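As background on the architecture (not the authors' trained estimator, whose inputs, rule base and training are specific to the drive), the forward pass of a first-order Sugeno ANFIS with Gaussian membership functions can be sketched as:

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def anfis_forward(x, centers, sigmas, conseq):
    """First-order Sugeno ANFIS forward pass.
    x: (d,) input vector; centers/sigmas: (r, d) Gaussian MF parameters,
    one MF per rule and input; conseq: (r, d+1) linear consequent
    parameters [a_1 .. a_d, b] for each rule."""
    mu = gauss(x, centers, sigmas)           # (r, d) membership degrees
    w = mu.prod(axis=1)                      # rule firing strengths (product T-norm)
    wn = w / w.sum()                         # normalized firing strengths
    f = conseq[:, :-1] @ x + conseq[:, -1]   # per-rule linear outputs
    return float(wn @ f)                     # weighted average = crisp output
```

In the paper's setting the inputs would be drive quantities (e.g. flux/current signals) and the crisp output the estimated speed; training adjusts `centers`, `sigmas` and `conseq` by the usual hybrid least-squares/backpropagation scheme.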
Abstract:
Whilst a fall in neuron numbers seems a common pattern during postnatal development, several authors have nonetheless reported an increase in neuron number, which may be associated with any one of a number of possible processes encompassing either neurogenesis or late maturation and incomplete differentiation. Recent publications have thus added further fuel to the notion that postnatal neurogenesis may indeed exist in sympathetic ganglia. In the light of these uncertainties surrounding the effects exerted by postnatal development on the number of superior cervical ganglion (SCG) neurons, we have used state-of-the-art design-based stereology to investigate the quantitative structure of the SCG at four distinct timepoints after birth, viz., 1-3 days, 1 month, 12 months and 36 months. The main effects exerted by ageing on the SCG structure were: (i) a 77% increase in ganglion volume; (ii) stability (neither increase nor decrease) in the total number of SCG nerve cells during postnatal development; (iii) a higher proportion of uninucleate to binucleate neurons only in newborn animals; (iv) a 130% increase in the volume of uninucleate cell bodies; and (v) the presence of BrdU-positive neurons in animals at all ages. At the time of writing our results support the idea that neurogenesis takes place in the SCG of preás, although this warrants confirmation by further markers. We also hypothesise that a portfolio of other mechanisms (cell repair, maturation, differentiation and death) may be equally intertwined and implicated in the numerical stability of SCG neurons during postnatal development. (C) 2011 ISDN. Published by Elsevier Ltd. All rights reserved.
Abstract:
The analysis of spatial relations among objects in an image is an important vision problem that involves both shape analysis and structural pattern recognition. In this paper, we propose a new approach to characterize the spatial relation "along", an important feature of spatial configurations that has been overlooked in the literature up to now. We propose a mathematical definition of the degree to which an object A is along an object B, based on the region between A and B and on a degree of elongatedness of this region. To better fit the perceptual meaning of the relation, distance information is included as well. To cover a wider range of potential applications, both the crisp and fuzzy cases are considered. In the crisp case, the objects are represented as 2D regions or 1D contours, and the definition of the alongness between them is derived from a visibility notion and from the region between the objects. However, the computational complexity of this approach leads us to propose a new model that computes the between-region using the convex hull of the contours. On the fuzzy side, the region-based approach is extended. Experimental results obtained using synthetic shapes and brain structures in medical imaging corroborate the proposed model and the derived measures of alongness, showing that they agree with common sense. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
ACR is supported by a research grant from CNPq.
Abstract:
The ever-growing production of information and the problematization of Environmental Health have shown the need to apprehend complex realities and to deal with uncertainty using the most diverse instruments, which may even incorporate local aspects and subjectivities by means of qualitative data, while broadening the capacity of the information system. This paper reflects upon some challenges and possible convergences between the ecosystemic approach and fuzzy logic in the process of handling scientific information and decision-making in Environmental Health.
Abstract:
OBJECTIVE: This study proposes a new approach that considers uncertainty in predicting and quantifying the presence and severity of diabetic peripheral neuropathy. METHODS: A rule-based fuzzy expert system was designed by four experts in diabetic neuropathy. The model variables were used to classify neuropathy in diabetic patients, defining it as mild, moderate, or severe. System performance was evaluated by means of the Kappa agreement measure, comparing the results of the model with those generated by the experts in an assessment of 50 patients. Accuracy was evaluated by an ROC curve analysis obtained based on 50 other cases; the results of those clinical assessments were considered to be the gold standard. RESULTS: According to the Kappa analysis, the model was in moderate agreement with expert opinions. The ROC analysis (evaluation of accuracy) determined an area under the curve equal to 0.91, demonstrating very good consistency in classifying patients with diabetic neuropathy. CONCLUSION: The model efficiently classified diabetic patients with different degrees of neuropathy severity. In addition, the model provides a way to quantify diabetic neuropathy severity and allows a more accurate patient condition assessment.
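The abstract does not publish the expert rule base, but the general shape of such a rule-based fuzzy classifier (triangular membership functions, min as the AND operator, max-membership selection of the severity class) can be sketched with entirely hypothetical input variables and thresholds:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    up = (x - a) / (b - a) if b != a else 1.0
    down = (c - x) / (c - b) if c != b else 1.0
    return max(0.0, min(up, down))

def classify(symptom_score, sign_score):
    """Toy two-input fuzzy severity classifier; both scores in [0, 10].
    Rule activation uses min as AND; the output class is the one with the
    highest membership degree (max-membership defuzzification)."""
    low = lambda x: tri(x, 0.0, 0.0, 5.0)
    med = lambda x: tri(x, 2.0, 5.0, 8.0)
    high = lambda x: tri(x, 5.0, 10.0, 10.0)
    rules = {
        "mild": min(low(symptom_score), low(sign_score)),
        "moderate": min(med(symptom_score), med(sign_score)),
        "severe": min(high(symptom_score), high(sign_score)),
    }
    return max(rules, key=rules.get), rules
```

Returning the full `rules` dictionary alongside the crisp label mirrors the point made in the abstract: the membership degrees themselves quantify severity, beyond the single defuzzified class.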
Abstract:
With the increasing production of information from e-government initiatives, there is also a need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, a challenge to be pursued by governments around the world. Our aim is to discuss the context of e-Government Big Data and to present a framework that promotes semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural-language terms and review related work in this area. The results achieved in this study comprise the architectural definition and the major components and requirements of the proposed framework. With this, it is possible to take advantage of the large volume of information generated by e-Government initiatives and use it to benefit society.