29 results for "Classificação decimal"
Abstract:
This work presents an auxiliary method of bone density measurement based on the attenuation of electromagnetic waves. For this purpose, an arrangement of two rectangular microstrip antennas was used, operating at a frequency of 2.49 GHz and fed by a microstrip line on a fiberglass substrate with a relative permittivity of 4.4 and a height of 0.9 cm. Simulations were carried out with samples of silica, bone meal, and silica and gypsum blocks to demonstrate the variation in attenuation level for different combinations. Samples of bovine bone were used because they reproduce aspects of human bone anatomy well. They were subjected to weighing, measurement and microwave radiation. The samples had their masses altered after decharacterization, and the process was repeated. The data obtained were fed into a neural network, whose training produced the best results, with correct classification of 100% of the samples. It is concluded that, using a single non-ionizing wave in the 2.49 GHz range, it is possible to evaluate the attenuation level in bone tissue, and that a neural network fed with the characteristics obtained in the experiment can classify a sample as having low or high bone density.
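As a rough illustration of the classification step described above, the sketch below trains a small feed-forward neural network to label samples as low or high bone density; the feature set (attenuation in dB and sample mass), the network size and the synthetic data are illustrative assumptions, not the authors' actual configuration.
```python
# Sketch: low/high bone-density classification from attenuation-derived
# features with a small feed-forward neural network (scikit-learn).
# The features, class statistics and network size are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical features per sample: [attenuation_dB, sample_mass_g]
low = np.column_stack([rng.normal(-18, 2, 40), rng.normal(12, 1, 40)])
high = np.column_stack([rng.normal(-27, 2, 40), rng.normal(20, 1, 40)])
X = np.vstack([low, high])
y = np.array([0] * 40 + [1] * 40)          # 0 = low density, 1 = high density

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```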
Abstract:
The increasing demand for high-performance wireless communication systems has shown the inefficiency of the current model of fixed allocation of the radio spectrum. In this context, cognitive radio appears as a more efficient alternative by providing opportunistic spectrum access with the maximum possible bandwidth. To meet these requirements, the transmitter must identify transmission opportunities and the receiver must recognize the parameters defined for the communication signal. Techniques based on cyclostationary analysis can be applied to both spectrum sensing and modulation classification problems, even in low signal-to-noise ratio (SNR) environments. However, despite this robustness, one of the main disadvantages of cyclostationarity is the high computational cost of calculating its functions. This work proposes efficient architectures for obtaining cyclostationary features to be employed in both spectrum sensing and automatic modulation classification (AMC). In the context of spectrum sensing, a parallelized algorithm for extracting cyclostationary features of communication signals is presented. The performance of this parallelized feature extractor is evaluated by speedup and parallel efficiency metrics. The architecture for spectrum sensing is analyzed for several configurations of false alarm probability, SNR levels and observation times for BPSK and QPSK modulations. In the context of AMC, the reduced alpha-profile is proposed as a cyclostationary signature calculated over a reduced set of cyclic frequencies. This signature is validated by a modulation classification architecture based on pattern matching. The architecture for AMC is investigated in terms of correct classification rates for AM, BPSK, QPSK, MSK and FSK modulations, considering several scenarios of observation length and SNR levels. The numerical performance results obtained in this work show the efficiency of the proposed architectures.
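For context on the features involved, the following minimal sketch estimates the cyclic autocorrelation of a noisy BPSK signal and locates its strongest cyclic frequency; the symbol rate, lag, cyclic-frequency grid and signal model are illustrative assumptions, and this serial reference implementation is not the parallel architecture proposed in the work.
```python
# Sketch: cyclic autocorrelation estimate for a noisy BPSK signal.
# Serial reference implementation; signal model and parameters are illustrative.
import numpy as np

def cyclic_autocorrelation(x, alpha, tau, fs=1.0):
    """Estimate R_x^alpha(tau) = <x[n+tau] x*[n] exp(-j 2 pi alpha n / fs)>."""
    n = np.arange(len(x) - tau)
    lagged = x[tau:] * np.conj(x[:len(x) - tau])
    return np.mean(lagged * np.exp(-2j * np.pi * alpha * n / fs))

rng = np.random.default_rng(1)
fs, sps, n_sym = 8.0, 8, 512                    # sample rate (Hz), samples/symbol, symbols
bits = rng.integers(0, 2, n_sym) * 2 - 1        # BPSK symbols in {-1, +1}
x = np.repeat(bits, sps).astype(complex)        # rectangular-pulse BPSK at 1 symbol/s
x += 0.5 * (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size))

alphas = np.linspace(0.0, 2.0, 65)              # candidate cyclic frequencies (Hz)
caf = np.array([abs(cyclic_autocorrelation(x, a, tau=sps // 2, fs=fs)) for a in alphas])
peak = alphas[1:][np.argmax(caf[1:])]           # skip alpha = 0 (plain average power)
print(f"strongest cyclic feature at alpha = {peak:.3f} Hz (symbol rate is 1 Hz)")
```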
Abstract:
The central question of the present study is to identify the epistemological knowledge that teacher trainees possess regarding the characteristics (properties) of the decimal numbering system; its purpose is to offer a contribution to the pedagogical practice of teachers who work within the Basic Literacy Cycle, in terms of both the acquisition of content and the development of the knowledge that helps them elaborate adequate strategies for working with the Decimal Numbering System in the classroom. The study is based on the constructivist, socio-interactionist approach to teaching Mathematics and constitutes, in itself, a methodological intervention with the teacher trainees engaged in the Professional Qualification Program in Basic Education of the Federal University of Rio Grande do Norte. The study is grounded in investigations by researchers who have studied the construction of numerical writing, showing, for instance, that the process of constructing the ideas and procedures involved in grouping and exchanging in base 10 takes much longer to accomplish than one might imagine. A set of activities was then elaborated that could not only contribute to the acquisition of content but also make the teacher trainees reflect upon their teaching practices in the classroom, so that they will be able to elaborate more consistent didactic approaches, taking into consideration the previous knowledge of the students as well as the obstacles that often appear along the way. Even when teachers have access to the most appropriate didactic resources, the lack of knowledge of the content and of its real meaning means that the Decimal Numbering System, a subject of fundamental importance, is most often taught in a mechanical way. The analysis of the discussions and behaviours of the teacher trainees during the activities revealed that the activities made them reflect upon their current classroom practices and that, as a whole, the aims of each of the activities carried out with the teacher trainees were reached.
Abstract:
The use of non-human primates in scientific research has contributed significantly to the biomedical area and, in the case of Callithrix jacchus, has provided important evidence on physiological mechanisms that help explain its biology, making the species a valuable experimental model for different pathologies. However, raising non-human primates in captivity for long periods is accompanied by behavioral disorders and chronic diseases, as well as progressive weight loss in most of the animals. The Primatology Center of the Universidade Federal do Rio Grande do Norte (UFRN) has housed a colony of C. jacchus for nearly 30 years, and during this period the animals have been weighed systematically to detect possible alterations in their clinical condition. This procedure has generated a large volume of data on the weight of animals at different age ranges, which is of great importance for studying this variable from different perspectives. Accordingly, this work presents three studies using weight data collected over 15 years (1985-2000) as a way of verifying the health status and development of the animals. The first study produced the first article, which describes the histopathological findings of animals with a probable diagnosis of wasting marmoset syndrome (WMS). All these animals were carriers of trematode parasites (Platynosomum spp.) and had obstructions in the hepatobiliary system; it is suggested that this agent is one of the etiological factors of the syndrome. In the second article, the analysis focused on comparing the environmental profile and cortisol levels of animals with normal weight-curve evolution and those with WMS. We observed a marked decrease in locomotion, increased use of the lower strata of the cage and hypocortisolemia; the latter is likely associated with an adaptation of the mechanisms that make up the hypothalamus-hypophysis-adrenal axis, as observed in other mammals under conditions of chronic malnutrition. Finally, in the third study, the animals with weight alterations were excluded from the sample and, using computational tools (K-means and SOM) in an unsupervised way, we suggest new ontogenetic development classes for C. jacchus. These were redimensioned from five to eight classes: infant I, infant II, infant III, juvenile I, juvenile II, sub-adult, young adult and elderly adult, in order to provide a more suitable classification for detailed studies that require better control over animal development.
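As an illustration of the unsupervised step mentioned above, the sketch below clusters hypothetical (age, weight) records with K-means; the growth curve, feature choice and number of clusters are assumptions for demonstration only, and the SOM stage is not shown.
```python
# Sketch: K-means clustering of hypothetical (age, weight) records into
# candidate ontogenetic classes. Data and cluster count are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
ages = rng.uniform(0, 4000, 300)                               # age in days
weights = 350 / (1 + np.exp(-(ages - 300) / 120)) + rng.normal(0, 20, 300)
X = StandardScaler().fit_transform(np.column_stack([ages, weights]))

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)    # eight candidate classes
for k in range(8):
    members = km.labels_ == k
    print(f"class {k}: {members.sum():3d} animals, "
          f"mean age {ages[members].mean():6.0f} days")
```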
Abstract:
Recently, Brazilian scientific production has increased greatly, owing to the productivity demands of scientific agencies. However, this sharp increase requires a more qualified production, since it is essential that publications be relevant and original. In the field of Psychology, the assessment of scientific journals by the CAPES/ANPEPP Commission had a strong effect on the scientific community and raised questions about the chosen evaluation method. Considering this impact, the aim of this research is a meta-analysis of the assessment of Psychology journals carried out by CAPES to update the Qualis database. For this research, scientific editors in Psychology were consulted (38 questionnaires administered by e-mail), as well as 5 librarians who work with scientific journal assessment (semi-structured interviews) and 8 members who acted as referees on the CAPES/ANPEPP Commission (open questions sent by e-mail). The results are presented through three analyses: the general evaluation of the Qualis process (including the constitution of the Assessment Committee), the evaluation criteria used in the process, and the effect of the evaluation on the scientific community (including changes in the editing scene). Some important points emerged: disagreement among the different actors about the suitability of this evaluation model; recognition of the improvement of scientific journals, mainly regarding normalization and diffusion; the finding that the model does not capture the quality of the journal, i.e., the content of the scientific articles it publishes; and disagreement with the criteria used, which seemed necessary and useful but needed to be discussed and clarified with the scientific community. Despite these points, the evaluation of scientific journals is still the main method of assuring quality in Psychology publications.
Abstract:
In this work we used chemometric tools to classify and quantify the protein content in samples of milk powder, applying NIR diffuse reflectance spectroscopy combined with multivariate techniques. First, we carried out an exploratory analysis of the samples by principal component analysis (PCA), followed by classification with soft independent modeling of class analogy (SIMCA). In this way it became possible to classify samples that were grouped by similarities in their composition. Finally, the techniques of partial least squares regression (PLS) and principal component regression (PCR) allowed the quantification of the protein content in the milk powder samples, compared with the Kjeldahl reference method. A total of 53 samples of milk powder sold in the metropolitan areas of Natal, Salvador and Rio de Janeiro were acquired for analysis; after data pre-treatment, four models were built and employed for the classification and quantification of the samples. Once assessed and validated, the methods showed good performance, accuracy and reliability, showing that NIR can serve as a non-invasive technique, since it produces no waste and saves time in analyzing the samples.
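A minimal sketch of this kind of chemometric pipeline is shown below, using PCA for exploratory analysis and PLS regression to predict protein content from synthetic spectra; the spectra, reference values and number of latent variables are illustrative assumptions, and SIMCA and PCR are omitted.
```python
# Sketch: PCA for exploratory analysis and PLS regression for protein
# quantification from synthetic "NIR spectra". All data are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 53, 200
protein = rng.uniform(20, 36, n_samples)              # % protein (reference values)
band = np.sin(np.linspace(0, 6, n_wavelengths))       # a fake absorption band shape
spectra = np.outer(protein, band) + rng.normal(0, 0.5, (n_samples, n_wavelengths))

scores = PCA(n_components=2).fit_transform(spectra)   # exploratory projection
print("PCA score matrix shape:", scores.shape)

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, protein, cv=5, scoring="r2")
print("PLS cross-validated R^2:", round(r2.mean(), 3))
```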
Abstract:
Machine Learning techniques are applied to classification tasks in order to acquire knowledge from a set of data or information. Some of the learning methods proposed in the literature are based on semi-supervised learning, which combines a small percentage of labeled data (supervised learning) with a quantity of unlabeled examples (unsupervised learning) during the training phase, thereby reducing the need for a large quantity of labeled instances when only a small labeled dataset is available for training. A common problem in semi-supervised learning is the selection of instances, since most papers use a random selection technique, which can have a negative impact. Most machine learning methods address single-label problems, that is, problems in which each example is associated with a single class; however, in many domains there is a need to associate data with more than one class, and this is called multi-label classification. This work presents an experimental analysis of the results obtained by applying semi-supervised learning to multi-label classification problems, using a reliability parameter as an aid in classifying the data. Thus, the use of semi-supervised learning techniques, together with multi-label classification methods, was essential to obtain the results.
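One common way to realize this idea is self-training with a confidence (reliability) threshold; the sketch below applies it to a synthetic multi-label problem. The threshold, base classifier and data are assumptions for illustration, not the exact method evaluated in the dissertation.
```python
# Sketch: semi-supervised multi-label self-training. A one-vs-rest classifier
# is fitted on the labeled pool; unlabeled examples on which every per-label
# probability is confidently high or low are pseudo-labeled and added.
# Threshold, base classifier and data are illustrative.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

X, Y = make_multilabel_classification(n_samples=600, n_classes=4, random_state=0)
labeled = np.zeros(len(X), dtype=bool)
labeled[:60] = True                                   # only 10% starts labeled

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
for _ in range(5):                                    # a few self-training rounds
    clf.fit(X[labeled], Y[labeled])
    proba = clf.predict_proba(X[~labeled])            # per-label probabilities
    reliable = np.all((proba > 0.9) | (proba < 0.1), axis=1)
    if not reliable.any():
        break
    idx = np.flatnonzero(~labeled)[reliable]
    Y[idx] = (proba[reliable] > 0.5).astype(int)      # accept pseudo-labels
    labeled[idx] = True

print("labeled pool after self-training:", int(labeled.sum()), "of", len(X))
```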
Abstract:
Data classification is a task with high applicability in many areas. Most methods for treating classification problems found in the literature deal with single-label, or traditional, problems. In recent years, a series of classification tasks has been identified in which samples can be assigned to more than one class simultaneously (multi-label classification). Additionally, these classes can be hierarchically organized (hierarchical classification and hierarchical multi-label classification). On the other hand, a new category of learning has also been studied, called semi-supervised learning, which combines labeled data (supervised learning) and unlabeled data (unsupervised learning) during the training phase, thus reducing the need for a large amount of labeled data when only a small set of labeled samples is available. Since both multi-label and hierarchical multi-label classification techniques and semi-supervised learning have shown favorable results, this work proposes applying semi-supervised learning to hierarchical multi-label classification tasks, so as to efficiently take advantage of the main benefits of the two areas. An experimental analysis found that the use of semi-supervised learning in hierarchical multi-label methods presented satisfactory results, since the two approaches produced statistically similar results.
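One ingredient specific to the hierarchical multi-label setting is keeping predictions consistent with the class hierarchy; the sketch below closes a predicted label set under the ancestor relation. The toy hierarchy and labels are illustrative assumptions, not the taxonomies used in the experiments.
```python
# Sketch: enforcing hierarchical consistency on a flat multi-label prediction
# by propagating every predicted label up to its ancestors. The toy hierarchy
# is hypothetical.
parent = {                                   # child -> parent (None marks a root)
    "sports": None, "football": "sports", "tennis": "sports",
    "politics": None, "elections": "politics",
}

def hierarchically_consistent(predicted):
    """Return the label set closed under the ancestor relation."""
    closed = set(predicted)
    for label in predicted:
        p = parent.get(label)
        while p is not None:                 # walk up to the root, adding ancestors
            closed.add(p)
            p = parent.get(p)
    return closed

print(hierarchically_consistent({"football", "elections"}))
# includes the ancestors "sports" and "politics" as well
```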
Abstract:
To increase the feasibility of using the International Classification of Functioning, Disability and Health (ICF), core sets began to be developed; they aim to establish a selection of categories adapted to represent the multiprofessional assessment standards of specific groups of patients. With the objective of proposing an ICF core set for classifying the physical health of elderly people, a committee of specialists was formed to judge the instrument by means of the Delphi technique, which reflects the multidisciplinary character of the project. Once the committee's participation was concluded, the core set was proposed with 30 categories. After being applied to a sample of 340 elderly people from the municipalities of Natal/RN and Santa Cruz/RN, the core set was submitted to factor analysis and was reduced to 19 categories. The analysis also made it possible to generate a score for each elderly person by means of the factor score, which proved to be a trustworthy and reliable way of scoring a core set.
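A minimal sketch of the scoring idea is given below: a single-factor model is fitted to core-set category ratings and the latent factor score is used as a summary score per respondent; the synthetic ratings and the single-factor choice are illustrative assumptions, not the actual factor structure reported in the study.
```python
# Sketch: fitting a single-factor model to core-set category ratings and using
# the latent factor score as one summary score per respondent. Ratings are
# simulated; the real core set has its own factor structure.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_people, n_categories = 340, 19
latent = rng.normal(size=(n_people, 1))                  # underlying health level
loadings = rng.uniform(0.5, 1.0, size=(1, n_categories))
ratings = latent @ loadings + rng.normal(0, 0.5, (n_people, n_categories))

fa = FactorAnalysis(n_components=1, random_state=0)
scores = fa.fit_transform(ratings)                       # one factor score per person
print("factor scores of the first five respondents:", scores[:5, 0].round(2))
```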
Abstract:
Automatic detection of blood components is an important topic in the field of hematology. Segmentation is an important stage because it allows components to be grouped into common regions and processed separately, while leukocyte differential classification enables each type to be analyzed individually. Through auto-segmentation and differential classification, this work contributes to the analysis of blood components by providing tools that reduce manual labor and increase accuracy and efficiency. Using digital image processing techniques associated with a generic and automatic fuzzy approach, this work proposes two Fuzzy Inference Systems, referred to as I and II, for the auto-segmentation of blood components and the differential classification of leukocytes, respectively, in microscopic images of blood smears. Using Fuzzy Inference System I, the proposed technique segments the image into four regions: leukocyte nucleus, leukocyte cytoplasm, erythrocytes and plasma; using Fuzzy Inference System II and the segmented leukocyte (nucleus and cytoplasm), it differentially classifies leukocytes into five types: basophils, eosinophils, lymphocytes, monocytes and neutrophils. For testing, 530 images containing microscopic samples of blood smears prepared with different methods were used. The images were processed, and their accuracy indices and gold standards were calculated and compared with manual results and with other results found in the literature for the same problems. Regarding segmentation, the developed technique achieved accuracies of 97.31% for leukocytes, 95.39% for erythrocytes and 95.06% for blood plasma. As for the differential classification, accuracy varied between 92.98% and 98.39% for the different leukocyte types. In addition to providing auto-segmentation and differential classification, the proposed technique also contributes to the definition of new descriptors and to the construction of an image database using various hematological staining processes.
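To give a flavor of how a fuzzy inference system can assign pixels to regions, the sketch below uses triangular membership functions over per-pixel hue and saturation and a few min-rule productions; the features, membership parameters and rules are illustrative assumptions and do not reproduce the systems defined in the work.
```python
# Sketch: a tiny fuzzy rule base that assigns a pixel to one of four regions
# from its hue and saturation. Membership parameters and rules are hypothetical.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def classify_pixel(hue, saturation):
    # Hypothetical fuzzy sets over hue (0-180) and saturation (0-255)
    purple = trimf(hue, 110, 140, 170)
    red = trimf(hue, -1, 10, 30)
    low_sat = trimf(saturation, -1, 0, 80)
    mid_sat = trimf(saturation, 60, 120, 180)
    high_sat = trimf(saturation, 150, 255, 256)
    # Rule strength via the min t-norm; the strongest rule labels the pixel.
    rules = {
        "nucleus":     min(purple, high_sat),
        "cytoplasm":   min(purple, mid_sat),
        "erythrocyte": min(red, high_sat),
        "plasma":      low_sat,
    }
    return max(rules, key=rules.get)

print(classify_pixel(hue=135, saturation=200))   # -> nucleus
```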
Abstract:
Navigation, in both virtual and real environments, is the process of deliberate movement to a specific place that is usually away from the origin point and cannot be perceived from it. Navigation aid techniques (TANs) have as their main objective helping users find a path through a virtual environment to a desired location, and they are widely used because they ease navigation in unknown environments. Tools such as maps, GPS (Global Positioning System) or even oral instructions are real-world examples of TAN usage. Most works that propose new TANs for virtual environments aim to analyze their impact on efficiency gains in navigation tasks from a known place to an unknown place. However, such papers tend to ignore the effect of TAN usage on the route knowledge acquisition process, which is important in virtual-to-real training transfer, for example. Based on a user study, it was possible to confirm that TANs with different strategies affect the performance of search tasks differently, and that the efficiency of the help provided by a TAN is not inversely related to the cognitive load of the technique's aids. A technique classification formula was created that uses three factors instead of efficiency alone. Applying the experiment's data to the formula yielded a finer-grained assessment of the level of help provided by TANs.
Abstract:
This work aims to identify and classify extended techniques on the acoustic bass. The main purpose of this classification is to provide the bass player with a better understanding and practice of these techniques. To this end, a literature review was carried out, and it was noted that, in academia, this topic is new and its understanding remains complex and nebulous for some musicians. To assist the bass player with these difficulties, the extended techniques were divided into two groups, each with a different approach. The first proposes the experimentation and exploration, on the acoustic bass, of conventional and unconventional techniques from other instruments. The second seeks to identify and classify the extended techniques, showing excerpts from compositions in the instrument's solo repertoire. For each technique, authors are also referenced along with their contributions to its instrumental practice. In both categories of the classification, procedures to be adopted by the bass player for executing these techniques were presented, as well as suggestions for studying them. Among the main results, it was found that, through the proposed classification of these techniques, new resources can be incorporated into the technical and timbral universe of the instrument. Lastly, a composition was created through interaction between composer and performer, containing new extended techniques or extended techniques not found in the instrument's contemporary solo repertoire.
Abstract:
This master's dissertation presents the study and implementation of intelligent algorithms to monitor the measurements of sensors involved in natural gas custody transfer processes. To create these algorithms, artificial neural networks are investigated because of particular properties such as learning, adaptation and prediction. A neural predictor is developed to reproduce the dynamic behavior of the sensor output, in such a way that its output can be compared with the real sensor output. A recurrent neural network is used for this purpose because of its ability to deal with dynamic information. The real sensor output and the estimated predictor output serve as the basis for the creation of possible sensor fault detection and diagnosis strategies. Two competitive neural network architectures are investigated, and their capabilities are used to classify different kinds of faults. The prediction algorithm and the fault detection and classification strategies, as well as the results obtained, are presented.
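A minimal sketch of the residual-based monitoring idea follows: a predictor trained on fault-free data reproduces the sensor signal from its recent past, and a fault is flagged when the prediction residual exceeds a threshold. A sliding-window MLP stands in for the recurrent network used in the dissertation, and the signal, window size, injected fault and threshold are illustrative assumptions.
```python
# Sketch: residual-based sensor fault detection. A sliding-window MLP predictor
# is trained on fault-free data; a fault is flagged when the prediction residual
# exceeds a threshold. Signal, fault and threshold are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(2000)
signal = np.sin(2 * np.pi * t / 200) + 0.02 * rng.standard_normal(t.size)

window = 10                                        # number of past samples used
X = np.array([signal[i:i + window] for i in range(len(signal) - window)])
y = signal[window:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:1500], y[:1500])                      # train on fault-free data only

faulty = signal.copy()
faulty[1800:] += 0.8                               # inject an abrupt offset fault
Xf = np.array([faulty[i:i + window] for i in range(len(faulty) - window)])
residual = np.abs(model.predict(Xf) - faulty[window:])
alarm = residual > 0.4                             # illustrative fixed threshold
print("first flagged sample index:", window + int(np.argmax(alarm)))
```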