Resumo:
The use of non-human primates in scientific research has contributed significantly to the biomedical field and, in the case of Callithrix jacchus, has provided important evidence on physiological mechanisms that help explain its biology, making the species a valuable experimental model for different pathologies. However, keeping non-human primates in captivity for long periods is accompanied by behavioral disorders and chronic diseases, as well as progressive weight loss in most of the animals. The Primatology Center of the Universidade Federal do Rio Grande do Norte (UFRN) has housed a colony of C. jacchus for nearly 30 years, and during this period the animals have been weighed systematically to detect possible alterations in their clinical condition. This procedure has generated a large volume of data on the weight of animals at different ages, which is of great value for studying this variable from different perspectives. Accordingly, this paper presents three studies using weight data collected over 15 years (1985-2000) as a way of assessing the health status and development of the animals. The first study produced the first article, which describes the histopathological findings of animals with a probable diagnosis of wasting marmoset syndrome (WMS). All of these animals carried trematode parasites (Platynosomum spp.) and had obstructions in the hepatobiliary system; it is suggested that this agent is one of the etiological factors of the syndrome. In the second article, the analysis focused on comparing the environmental profile and cortisol levels of animals with a normal weight curve against those with WMS. We observed a marked decrease in locomotion, increased use of the lower cage strata, and hypocortisolemia. The latter is likely associated with an adaptation of the mechanisms that make up the hypothalamic-pituitary-adrenal axis, as observed in other mammals under conditions of chronic malnutrition.
Finally, in the third study, the animals with weight alterations were excluded from the sample and, using computational tools (K-means and SOM) in an unsupervised way, we propose new ontogenetic development classes for C. jacchus. The classification was expanded from five to eight classes: infant I, infant II, infant III, juvenile I, juvenile II, sub-adult, young adult and elderly adult, providing a more suitable classification for detailed studies that require better control over animal development.
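The unsupervised grouping described above can be illustrated with a minimal k-means sketch. Everything here is hypothetical: the synthetic (age, weight) data, the growth-curve shape and the choice of k = 8 merely mimic the setting; this is not the thesis's actual pipeline (which also used SOM).

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal k-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids; keep the old one if a cluster empties
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Hypothetical (age in months, weight in grams) records for healthy animals
rng = np.random.default_rng(1)
ages = rng.uniform(0, 120, 300)
weights = 350 * (1 - np.exp(-ages / 12)) + rng.normal(0, 15, 300)
X = np.column_stack([ages, weights])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize both features

centroids, labels = kmeans(X, k=8)          # eight candidate classes
```

Standardizing age and weight before clustering keeps the grams scale from dominating the months scale in the Euclidean distance.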
Resumo:
There is a known positive effect of nocturnal sleep on brain plasticity and the consolidation of declarative and procedural memories, as well as on the facilitation of insight in problem solving. However, a possible pro-mnemonic effect of daytime naps after learning is yet to be properly characterized. The goal of this project was to evaluate the influence of daytime naps on learning among elementary and middle school students, measuring the one-day (acute) and semester-long (chronic) effects of post-learning naps on performance. In the acute day-nap condition, the elementary students were exposed to a class and then randomly divided into three groups: Nap (N), Game-based English Class (GBEC) and Traditional English Class (TEC). Two multiple-choice follow-up tests evaluated the students' performance in the short and long runs. In the short run, the N group outperformed the other two groups, and this tendency was maintained in the long run. In the chronic condition, the middle school students were randomly separated into two groups, Nap (N) and Class (C), and were observed during one academic term. The N group showed increased school performance in relation to the C group. In the statistical analyses, independent t-tests were applied, with scores considered significant when p < 0.05 and expressed as mean ± standard error of the mean. The results can be interpreted as an indication that a single daytime nap opportunity is not enough to ensure learning benefits. Therefore, more research is needed in order to advocate in favor of daytime naps as a pedagogical means of promoting enhanced school performance.
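The statistical procedure described above (independent t-tests, significance at p < 0.05, results as mean ± standard error of the mean) can be sketched as follows. The scores below are synthetic and the effect size is invented purely for the example.

```python
import numpy as np
from scipy import stats

# Hypothetical test scores (0-10 scale); group means are invented for the example
rng = np.random.default_rng(0)
nap_scores = rng.normal(loc=8.0, scale=1.0, size=30)    # Nap (N) group
class_scores = rng.normal(loc=6.0, scale=1.0, size=30)  # Class (C) group

# Independent-samples t-test; scores considered significant when p < 0.05
t_stat, p_value = stats.ttest_ind(nap_scores, class_scores)
significant = p_value < 0.05

def mean_sem(x):
    """Report a group as mean +/- standard error of the mean."""
    return x.mean(), x.std(ddof=1) / np.sqrt(len(x))
```

With the large invented group difference, the test comes out significant; with real classroom data the outcome would of course depend on the observed scores.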
Resumo:
This work presents a methodology to analyze the transient stability (first oscillation) of electric energy systems using a neural network based on the ART (adaptive resonance theory) architecture, named the fuzzy ART-ARTMAP neural network, for real-time applications. The security margin is used as the stability analysis criterion, considering three-phase short-circuit faults with a transmission line outage. The neural network operates in two fundamental phases: training and analysis. The training phase requires a great deal of processing, while the analysis phase is carried out almost without computational effort. This is the principal reason to use neural networks for solving complex problems that need fast solutions, such as real-time applications. ART neural networks have plasticity and stability as their primordial characteristics, which are essential qualities for training and for an efficient analysis. The fuzzy ART-ARTMAP neural network is proposed seeking superior performance, in terms of precision and speed, compared to the conventional ARTMAP, and even more so compared to neural networks trained with the backpropagation algorithm, which is a benchmark in the neural network area. (c) 2005 Elsevier B.V. All rights reserved.
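As a rough illustration of the ART family's mechanics, the following is a minimal sketch of the unsupervised fuzzy ART module (complement coding, category choice, vigilance test and fast learning), which is a building block of fuzzy ART-ARTMAP. It is not the supervised stability-analysis network of the paper; the input data and parameter values are assumptions.

```python
import numpy as np

def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Minimal unsupervised fuzzy ART.
    inputs: array of rows in [0, 1]^d; complement coding doubles the dimension.
    Returns (category weight matrix, per-input category labels)."""
    I = np.hstack([inputs, 1.0 - inputs])          # complement coding
    weights, labels = [], []
    for x in I:
        if not weights:
            weights.append(x.copy()); labels.append(0); continue
        W = np.array(weights)
        m = np.minimum(x, W).sum(axis=1)           # fuzzy AND match with each category
        T = m / (alpha + W.sum(axis=1))            # category choice function
        for j in np.argsort(-T):                   # try categories by decreasing choice
            if m[j] >= rho * x.sum():              # vigilance (resonance) test
                weights[j] = beta * np.minimum(x, weights[j]) + (1 - beta) * weights[j]
                labels.append(int(j)); break
        else:                                      # no category resonates: create one
            weights.append(x.copy()); labels.append(len(weights) - 1)
    return np.array(weights), labels

# Two tight hypothetical clusters should yield two categories
data = np.array([[0.10, 0.10], [0.12, 0.09], [0.90, 0.90], [0.88, 0.92]])
W, cats = fuzzy_art(data)
```

The stability/plasticity property mentioned in the abstract shows up here directly: learned categories only shrink (fuzzy AND), so old knowledge is never overwritten, while the vigilance parameter rho controls when a new category is created.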
Resumo:
The Commercial Sexual Exploitation of Children and Adolescents (ESCCA) is a phenomenon that has been given priority in the public policy agendas of many democratic Western governments and civil sectors of society, besides being an object of study in different fields of knowledge. Psychology is among these areas and is considered one of the references in the construction of knowledge and actions to confront the phenomenon. However, the epistemological foundations of psychological science are quite diverse, and so are its discourses, the knowledge it produces, and its ways of conceiving humankind and the world. This is evident in psychology publications on ESCCA. This work aims to provide a state of the art of psychology publications in Brazil (in graduate programs, through theses and dissertations, and in journals) on the Commercial Sexual Exploitation of Children and Adolescents. More specifically, it seeks to show (a) which conceptions of commercial sexual exploitation of children and adolescents the authors adopt, (b) what the research and publications aim at, (c) how the research questions are justified, and (d) which theoretical approaches the authors are affiliated with and which methodological strategies are applied to reach the aims proposed in their work. To do so, a survey of the material was conducted in the major index sites (e.g., BVS-Psi, Capes, and the theses and dissertations databases of university libraries), covering the period from 1990 to 2007. Through searches on these sites, we built a database including information on the works retrieved, using descriptors specific to studies in the area of victimization of children and adolescents, with reference to a list provided by Faleiros (2000). After reading the abstracts, 25 productions were retained, including theses, dissertations and articles. Thematic content analysis was used to analyze the material.
Two thematic axes were established to guide the analysis: conceptual elements of commercial sexual exploitation, and the theoretical and methodological strategies employed. The axes take as their analytical reference a chapter built around the concept of commercial sexual exploitation, so that the whole analysis is anchored on it. The analysis points to the persistence of strong conceptual and terminological confusion about ESCCA. Few studies avoided this confusion and maintained a consistent theoretical approach. Regarding theoretical and methodological strategies, there is great diversity in the psychological approaches surrounding the phenomenon of ESCCA, enriching the levels of understanding and action. This diversity reflects the constitutive heterogeneity of psychological science. We highlight the socio-historical psychology perspective, the most frequent among the publications. It is hoped that this research will help advance the qualitative approach to ESCCA, especially in the field of psychology, as well as contribute to new research in the area and to the construction of new means of addressing this human rights violation.
Resumo:
This work presents a neural network based on the ART (adaptive resonance theory) architecture, named the fuzzy ART&ARTMAP neural network, applied to the electric load-forecasting problem. Neural networks based on the ART architecture have two fundamental characteristics that are extremely important for network performance (stability and plasticity), which allow the implementation of continuous training. The fuzzy ART&ARTMAP neural network aims to reduce the imprecision of the forecasting results through a mechanism that separates the analog and binary data and processes them separately. This yields a reduction in processing time and improved quality of results compared to the back-propagation neural network, and even better results compared to classical forecasting techniques (ARIMA, the Box and Jenkins method). Once training is finished, the fuzzy ART&ARTMAP neural network can forecast electrical loads 24 h in advance. To validate the methodology, data from a Brazilian electric company are used. (C) 2004 Elsevier B.V. All rights reserved.
Resumo:
In almost all cases, the goal of the design of automatic control systems is to obtain the parameters of controllers, which are described by differential equations. In general, the controller is artificially built and it is possible to update its initial conditions. In the design of optimal quadratic regulators, the initial conditions of the controller can be changed in an optimal way to improve the performance of the controlled system. Following this idea, an LMI-based design procedure to update the initial conditions of PI controllers, considering nonlinear plants described by Takagi-Sugeno fuzzy models, is presented. An important feature of the proposed method is that it also allows other specifications, such as the decay rate and constraints on the control input and output. An application to the control of an inverted pendulum illustrates the effectiveness of the proposed method.
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Resumo:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Resumo:
In this work, calibration models were constructed to determine the total lipid and moisture content of powdered milk samples. For this, near-infrared diffuse reflectance spectroscopy was used, combined with multivariate calibration. Initially, the spectral data were subjected to multiplicative scatter correction (MSC) and Savitzky-Golay smoothing. Then, the samples were divided into subgroups by applying hierarchical cluster analysis (HCA) with the Ward linkage criterion. This made it possible to build partial least squares (PLS) regression models for the calibration and prediction of total lipid and moisture content, based on the values obtained by the reference methods (Soxhlet extraction and oven drying at 105 °C, respectively). We conclude that NIR performed well for the quantification of powdered milk samples, mainly by minimizing analysis time, not destroying the samples, and generating no waste. The prediction model for total lipids showed a correlation coefficient (R) of 0.9955 and an RMSEP of 0.8952, with an average error between the Soxhlet and NIR results of ±0.70%, while the prediction model for moisture showed an R of 0.9184, an RMSEP of 0.3778 and an error of ±0.76%.
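The spectral pre-treatment steps mentioned above (MSC and Savitzky-Golay smoothing) can be sketched as below. The spectra are synthetic and the window/order parameters are assumptions; the subsequent HCA and PLS regression steps are omitted.

```python
import numpy as np
from scipy.signal import savgol_filter

def msc(spectra, reference=None):
    """Multiplicative scatter correction: regress each spectrum on the
    reference (mean) spectrum and remove the fitted offset and slope."""
    ref = spectra.mean(axis=0) if reference is None else reference
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        b, a = np.polyfit(ref, s, deg=1)   # least-squares fit s ~ a + b * ref
        corrected[i] = (s - a) / b
    return corrected

# Hypothetical NIR spectra: a common band shape plus per-sample scatter
# (random multiplicative slope and additive offset)
rng = np.random.default_rng(0)
wavelengths = np.linspace(1100, 2500, 200)
base = np.exp(-((wavelengths - 1700) / 300) ** 2)
spectra = np.array([rng.uniform(0.8, 1.2) * base + rng.uniform(-0.1, 0.1)
                    for _ in range(10)])

smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)
pretreated = msc(smoothed)
```

Because the simulated scatter is exactly affine, MSC collapses all ten spectra onto the reference; on real diffuse-reflectance data the correction is only approximate but removes much of the particle-size scatter before PLS.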
Resumo:
Intending to understand how the human mind operates, some philosophers and psychologists began to study rationality. Theories were built from those studies, and nowadays that interest has been extended to many other areas, such as computer engineering and computer science, but with a slight difference in goal: to understand the operational process of the mind and apply it to agent modelling, making it possible to implement (in software or hardware) the agent-oriented paradigm, in which agents are able to deliberate their own plans of action. In computer science, the sub-area of multiagent systems has progressed through several works concerning artificial intelligence, computational logic, distributed systems, game theory and even philosophy and psychology. The present work shows how to obtain an extension of the logical formalisation of a rational agent architecture model called BDI (based on Bratman's philosophical theory), in which agents are capable of deliberating actions from their beliefs, desires and intentions. The formalisation of this model is called BDI logic; it is a modal logic (in general, a branching-time logic) with three accessibility relations: B, D and I. Here, we show two possible extensions that transform BDI logic into a modal fuzzy logic, in which the formulae and the accessibility relations can be evaluated by values from the interval [0,1].
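As a toy illustration of evaluating a modal formula over values in [0,1], the sketch below computes the degree of a belief formula in a small fuzzy Kripke model, using the Kleene-Dienes implication for the accessibility relation B. The worlds, the numeric values and the choice of implication are assumptions for illustration, not the thesis's exact semantics.

```python
# A toy fuzzy Kripke model: worlds, a fuzzy belief-accessibility relation B
# with values in [0, 1], and fuzzy valuations of atomic formulas.
worlds = ["w0", "w1", "w2"]

# B[w][u] in [0, 1]: degree to which u is belief-accessible from w (hypothetical)
B = {"w0": {"w0": 1.0, "w1": 0.8, "w2": 0.3},
     "w1": {"w0": 0.0, "w1": 1.0, "w2": 0.5},
     "w2": {"w0": 0.2, "w1": 0.0, "w2": 1.0}}

# v[p][w] in [0, 1]: fuzzy truth value of atom p at world w (hypothetical)
v = {"p": {"w0": 0.9, "w1": 0.6, "w2": 0.1}}

def believes(p, w):
    """Degree of B(p) at w: infimum over worlds of the Kleene-Dienes
    implication B(w, u) -> v(p, u), i.e. max(1 - B(w, u), v(p, u))."""
    return min(max(1.0 - B[w][u], v[p][u]) for u in worlds)
```

At `w0`, for example, the strongly accessible world `w1` (degree 0.8) only supports `p` to degree 0.6, so the agent's belief in `p` is bounded by that weakest relevant world.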
Resumo:
Mainstream programming languages provide built-in exception handling mechanisms to support the robust and maintainable implementation of exception handling in software systems. Most of these modern languages, such as C#, Ruby, Python and many others, are often claimed to have more appropriate exception handling mechanisms. They reduce programming constraints on exception handling in order to favor agile changes in the source code. These languages provide what we call maintenance-driven exception handling mechanisms. It is expected that the adoption of these mechanisms improves software maintainability without hindering software robustness. However, there is still little empirical knowledge about the impact that adopting these mechanisms has on software robustness. This work addresses this gap by conducting an empirical study aimed at understanding the relationship between changes in C# programs and their robustness. In particular, we evaluated how changes in the normal and exceptional code were related to exception handling faults. We applied a change impact analysis and a control flow analysis to 100 versions of 16 C# programs. The results showed that: (i) most of the problems hindering software robustness in those programs are caused by changes in the normal code, (ii) many potential faults were introduced even when improving exception handling in C# code, and (iii) faults are often facilitated by the maintenance-driven flexibility of the exception handling mechanism. Moreover, we present a series of change scenarios that decrease program robustness.
Resumo:
Image segmentation is the process of subdividing an image into constituent regions or objects that have similar features. In video segmentation, beyond subdividing the frames into objects with similar features, there is a consistency requirement among the segmentations of successive frames of the video. Fuzzy segmentation is a region-growing technique that assigns to each element in an image (which may have been corrupted by noise and/or shading) a grade of membership, between 0 and 1, in an object. In this work we present an application that uses a fuzzy segmentation algorithm to identify and select particles in micrographs, and an extension of the algorithm to perform video segmentation. Here, a video shot is treated as a three-dimensional volume, with different z slices occupied by different frames of the shot. The volume is interactively segmented based on selected seed elements, which determine the affinity functions based on their motion and color properties. The color information can be extracted from a specific color space or from three channels of a set of color models selected based on the correlation of the information from all channels. The motion information is provided in the form of dense optical flow maps. Finally, segmentations of real and synthetic videos and their application in a non-photorealistic rendering (NPR) tool are presented.
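The idea of membership as best path strength can be sketched with a simplified, single-seed fuzzy connectedness computation on a static grayscale image: a path's strength is its weakest affinity, and each pixel's membership is the strength of its best path from the seed. The affinity function and the toy image below are assumptions; the thesis's algorithm additionally handles multiple seeds, color and motion.

```python
import heapq
import numpy as np

def fuzzy_connectedness(image, seed):
    """Seeded fuzzy segmentation sketch: each pixel's membership is the best
    path strength from the seed, where a path's strength is the weakest
    affinity along it and affinity decays with intensity difference."""
    h, w = image.shape
    conn = np.zeros((h, w))
    conn[seed] = 1.0
    heap = [(-1.0, seed)]                       # max-heap via negated strength
    while heap:
        neg, (r, c) = heapq.heappop(heap)
        strength = -neg
        if strength < conn[r, c]:               # stale heap entry
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                affinity = 1.0 - abs(image[r, c] - image[nr, nc])
                s = min(strength, affinity)     # path strength = weakest link
                if s > conn[nr, nc]:
                    conn[nr, nc] = s
                    heapq.heappush(heap, (-s, (nr, nc)))
    return conn

# Toy image: two flat regions; the seed is placed in the left region
image = np.full((8, 8), 0.2)
image[:, 4:] = 0.8
conn = fuzzy_connectedness(image, (0, 0))
```

Pixels in the seed's region get membership 1.0, while pixels across the intensity boundary are capped by the weak affinity of the crossing step, which is exactly the behavior a threshold on `conn` would exploit to extract the object.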
Resumo:
O presente estudo teve como objetivo o registro e a apresentação de trabalhos realizados no Brasil nos últimos 40 anos, relacionados com a investigação sobre a deficiência de vitamina A. Esta deficiência tem sido diagnosticada por um ou mais dos seguintes critérios: ingestão deficiente de alimentos fontes de vitamina A, exame clínico, níveis séricos de retinol abaixo dos aceitos como normais, concentração hepática de retinol, teste de adaptação ao escuro e corante de Rosa Bengala. A deficiência foi diagnosticada em grupos populacionais de vários Estados e capitais brasileiras em cidades grandes e pequenas e em zonas rurais. A maioria dos trabalhos foi desenvolvida em grupos populacionais de baixa renda. Quanto às conseqüências clínicas, relataram-se achados de sinais oculares leves, como cegueira noturna, manchas de Bitot e xerose conjuntival, encontrados principalmente na Região Nordeste. Alguns autores observaram, em menor número de casos, lesões graves, como lesões corneanas e ceratomalácia. Trabalhos da última década indicaram associação entre a hipovitaminose A e o aumento da morbidade e mortalidade, principalmente em crianças pré-escolares.
Resumo:
Peng et al. were the first to work with detrended fluctuation analysis (DFA), a technique capable of detecting long-range autocorrelation in non-stationary time series. In this study, the DFA technique is used to obtain the Hurst exponent (H) profile of the neutron porosity logs of 52 oil wells in the Namorado Field, located in the Campos Basin, Brazil. The purpose is to determine whether the Hurst exponent can be used to characterize the spatial distribution of the wells. We verify that wells with close values of H are spatially close together. In this work we used a hierarchical clustering method and a non-hierarchical clustering method (k-means), and compared the two to see which provides the better result. From this, a neighborhood index was defined to check whether a data set, generated by the k-means method or at random, in fact shows spatial patterns. High values of the index indicate that the data are aggregated, while low values indicate that the data are scattered (no spatial correlation). Using the Monte Carlo method, we showed that randomly combined data yield a neighborhood index below the empirical value. Thus the empirical values of H obtained from the 52 wells are geographically grouped. Comparing the data from standard curves with the results obtained by k-means confirms that H is effective for correlating wells in terms of spatial distribution.
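A minimal DFA sketch, under the usual formulation (integrate the series, detrend fixed-size windows with a local linear fit, and fit the log-log slope of fluctuation versus window size): for fractional Gaussian noise the DFA exponent alpha approximates the Hurst exponent H, and white noise should give alpha near 0.5. The series and the scale choices below are illustrative.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns the scaling exponent alpha
    from a log-log fit of fluctuation F(n) versus window size n."""
    y = np.cumsum(x - x.mean())                 # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        f2 = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)        # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))          # RMS detrended fluctuation
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha, np.array(F)

rng = np.random.default_rng(0)
white = rng.normal(size=4096)                   # synthetic uncorrelated series
scales = np.array([8, 16, 32, 64, 128, 256])
alpha, F = dfa(white, scales)                   # expect alpha near 0.5
```

Applied to a well log instead of synthetic noise, the same `alpha` would serve as the per-well H value fed into the clustering step described above.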